The Rise of Worse Is Better

• Completeness—the design must cover as many important situations as is practical. All reasonably expected cases must be covered. Simplicity is not allowed to overly reduce completeness.

I believe most people would agree that these are all good characteristics. I will call the use of this philosophy of design the “MIT approach.” Common Lisp (with CLOS) and Scheme represent the MIT approach to design and implementation.

The worse-is-better philosophy is only slightly different:

• Simplicity—the design must be simple, both in implementation and interface. It is more important for the implementation to be simple than the interface. Simplicity is the most important consideration in a design.

• Correctness—the design must be correct in all observable aspects. It is slightly better to be simple than correct.

• Consistency—the design must not be overly inconsistent. Consistency can be sacrificed for simplicity in some cases, but it is better to drop those parts of the design that deal with less common circumstances than to introduce either implementational complexity or inconsistency.

• Completeness—the design must cover as many important situations as is practical. All reasonably expected cases should be covered. Completeness can be sacrificed in favor of any other quality. In fact, completeness must be sacrificed whenever implementation simplicity is jeopardized. Consistency can be sacrificed to achieve completeness if simplicity is retained; especially worthless is consistency of interface.

Unix and C are examples of the use of this school of design, and I will call the use of this design strategy the “New Jersey approach.” I have intentionally caricatured the worse-is-better philosophy to convince you that it is obviously a bad philosophy and that the New Jersey approach is a bad approach.
However, I believe that worse-is-better, even in its strawman form, has better survival characteristics than the-right-thing, and that the New Jersey approach when used for software is a better approach than the MIT approach.

Let me start out by retelling a story that shows that the MIT/New Jersey distinction is valid and that proponents of each philosophy actually believe their philosophy is better.
Two famous people, one from MIT and another from Berkeley (but working on Unix), once met to discuss operating system issues. The person from MIT was knowledgeable about ITS (the MIT AI Lab operating system) and had been reading the Unix sources. He was interested in how Unix solved the PC² loser-ing problem. The PC loser-ing problem occurs when a user program invokes a system routine to perform a lengthy operation that might have significant state, such as an input/output operation involving I/O buffers. If an interrupt occurs during the operation, the state of the user program must be saved. Because the invocation of the system routine is usually a single instruction, the PC of the user program does not adequately capture the state of the process. The system routine must either back out or press forward. The right thing is to back out and restore the user program PC to the instruction that invoked the system routine so that resumption of the user program after the interrupt, for example, reenters the system routine. It is called “PC loser-ing” because the PC is being coerced into “loser mode,” where “loser” is the affectionate name for “user” at MIT.

The MIT guy did not see any code that handled this case and asked the New Jersey guy how the problem was handled. The New Jersey guy said that the Unix folks were aware of the problem; the solution was for the system routine to always finish, but sometimes an error code would be returned that signaled that the system routine had failed to complete its action. A correct user program, then, had to check the error code to determine whether to simply try the system routine again. The MIT guy did not like this solution because it was not the right thing. The New Jersey guy said that the Unix solution was right because the design philosophy of Unix was simplicity and that the right thing was too complex. Besides, programmers could easily insert this extra test and loop.
The MIT guy pointed out that the implementation was simple but the interface to the functionality was complex. The New Jersey guy said that the right trade-off had been selected in Unix—namely, implementation simplicity was more important than interface simplicity. The MIT guy then muttered that sometimes it takes a tough man to make a tender chicken, but the New Jersey guy didn’t understand (I’m not sure I do either).

Now I want to argue that worse-is-better is better. C is a programming language designed for writing Unix, and it was designed using the New Jersey approach. C is therefore a language for which it is easy to write a decent

²Program Counter. The PC is a register inside the computer’s central processing unit that keeps track of the current execution point inside a running program.