The classic essay on "worse is better" is either misunderstood or wrong. And in citing it in our design arguments, we short-change both our customers and our craft. I revisit this essay, and reflect...
Recently, I was interviewed by a reporter who was doing a story on the difference between east coast engineers and west coast engineers (yes, that old chestnut is being revisited yet again). This in turn got me thinking about Dick Gabriel's classic note "Worse is Better", which is often credited with first articulating the distinction between the MIT (or east coast; after all, what else is there on the east coast?) approach to engineering (do the right thing, no matter how complex it makes the code) and the Berkeley (or west coast) approach (make the code simple, even if it makes the user do more work).
The notion that worse is better has become something of a truism in the programming industry. The usual examples are the C language (worse than Lisp, but it dominated anyway), Unix (or more recently, Windows) as opposed to Multics or VMS, and (in a completely different arena) VHS tape over Beta. Each of the dominant technologies, it is pointed out, was worse than the alternative, but the worse technology became the standard anyway. The moral of the story, and the reason people bring the principle up in an argument, is to convince whoever is on the other side that we should set our sights on the quick-and-dirty, less elegant solution to a problem, because "worse is better."
Of course, this received wisdom is just so much crap. The arguments simply don't hold up. But the damage that this principle has done to the technology industry is real, and we should start pushing back before it does any more harm than it already has.
First, the arguments. Gabriel's original essay (which, like everything else, can still be found through Google) makes an interesting read and a number of good points. Reading it again not only reminds one of how well it is written, but also debunks a lot of the common knowledge about it. For example, the piece never contrasts east coast and west coast engineering styles; the two camps it discusses are the MIT/Stanford style of design and the New Jersey style. Not nearly as interesting these days as the notion that the contrast in styles is between west coast and east coast.
The rest of the paper is an excellent analysis of why Lisp lost out to C as a programming language, even though Lisp was the superior language. Or at least superior on the grounds that Gabriel found most important. But this doesn't necessarily show that Lisp was in fact superior to C; it can just as easily be taken to show that the metrics cited in the article were not the ones considered most important by those choosing a programming language. The facts that C produced faster code, was easier to master, was easier to use in groups, and ran well on less expensive hardware were not considerations that Gabriel found important. But others did. On those metrics, the dominance of C as a programming language was an example of better is better, not worse is better.
The old chestnut of Beta and VHS is open to the same sort of alternate interpretation. On the "worse is better" interpretation, the superior-quality Beta was beaten out by the clearly inferior VHS tape format because of some inexplicable perversity of human nature (or the machinations of clever marketing people, or the hubris of Sony, who owned the Beta brand). But VHS tapes could record longer programs on a single cassette, could be played on cheaper recorders, and had a number of different suppliers. Thus there was a set of metrics on which VHS was the superior technology, and these metrics seemed to be the ones that most of the market found important. VHS beat out Beta not because worse is better, but because better in some dimensions (cost, recording time) beat out better in other dimensions (picture quality).
Even the case of Unix vs. Multics misses an important point. While Multics may have been a better operating system, Unix had the twin advantages of actually existing and running on a wide variety of fairly cheap hardware. Windows (and DOS) ran on even cheaper hardware, and though it was easy to argue on any technical grounds you wanted that Unix was a better OS than DOS, the property of existing on really cheap hardware turned out to be the metric of goodness that appealed to the customer. The emergence of Linux as a real choice is beginning to give us more evidence in this particular contest; perhaps a better OS that also runs on cheap hardware can win after all.
In all of these cases, there is an alternate interpretation of the choices that were made, one that leads us to the conclusion that worse is not better. Instead, what we see is that better is a complicated notion, one that can depend on a variety of different metrics. It may be disappointing to find out that what we geeks think of as better may not be what our customers think is better. But finding this out shouldn't surprise us too much.
Of course, "worse is better" is a much catchier slogan than "better depends on your goodness metric," or some other equivalent phrase that would actually reflect what was going on. And there is wisdom and insight in the original article that designers can still use, even under the catchier (but less historically accurate) slogan.
My problem with the slogan is that it has become a catch phrase for those who either haven't read the original article, or have read it and have either forgotten what it really said or never understood it in the first place. As a catch phrase, it is often used to justify shoddy design, or following the crowd rather than doing what is right, or as shorthand for the real claim that our customers are too stupid to either appreciate or deserve high-quality products. Why spend the time doing things right, this line of reasoning goes, when we all know that worse is better? You are far better off giving the customer something that you know is less than what you could produce, because they (those simple customers) will think it is better.
The end result of this thinking is sloppy products that don't work, are hard to use, or are unreliable (or all of the above). We try to convince our customers that this is the way software has to be, and then turn around and convince ourselves that they won't pay for anything better. But we short-change our customers, and we cheapen our craft, when we put up with this sort of thinking. I don't think this is what the author had in mind when the original article was written; even if it was, it is time for us to reject the simple-minded interpretation of the slogan, and start putting out software that really is better (on the dimension of goodness that our customers have, not necessarily our own).
Jim Waldo is a Distinguished Engineer with Sun Microsystems, where he is the lead architect for Jini, a distributed programming system based on Java. Prior to Jini, Jim worked in JavaSoft and Sun Microsystems Laboratories, where he did research in the areas of object-oriented programming and systems, distributed computing, and user environments. Before joining Sun, Jim spent eight years at Apollo Computer and Hewlett-Packard working in the areas of distributed object systems, user interfaces, class libraries, text, and internationalization. While at HP, he led the design and development of the first Object Request Broker, and was instrumental in getting that technology incorporated into the first OMG CORBA specification.