
NOTE TO TOOL OWNERS: In this blog I will occasionally make statements about products that you will take exception to. My intent is always to be factual and accurate. If I have made a statement that you consider to be incorrect or inaccurate, please bring it to my attention and, once I have verified my error, I will post the appropriate correction.

And before you get too exercised, please read the post, dated 9 Feb 2006, titled "All Tools Suck".

Monday, August 09, 2010

Worse is Better, or Is It?

At the just-concluded Balisage conference, Michael Sperberg-McQueen brought up the (apparently) famous "worse is better" essay by Richard P. Gabriel (Wikipedia entry here, original paper here). I had never heard of it (or at least had no memory of ever hearing of it), even though it is directly relevant to my experience as a standards developer and engineer, where I've done things in both the "MIT" way (correctness is most important) and, more or less, the "New Jersey" way (simplicity is most important). I was actually very surprised that nobody had ever pointed me to it before.

Gabriel's original argument is essentially that software that chooses simplicity over correctness and completeness has better survivability, for a number of reasons. He cites Unix and C as prime examples: they spread precisely because they were simple (and thus easy to port), despite being neither functionally complete nor consistent in their interfaces (user or programming). Gabriel then goes on, over the years, to argue against his own original assertion that worse is better, and essentially falls into a state of oscillation between "yes it is" and "no it isn't" (see his history of his thought here).

The concept of "worse is better" certainly resonated with me because I have, for most of my career, fought against it at every turn, insisting on correctness and completeness as the primary concerns. That is in part because of my work in standards, where correctness is of course important, in part because I'm an idealist by inclination, and in part because I grew up in IBM in the 80's, when a company like IBM could still afford the time and cost of correctness over simplicity (or thought it could).

XML largely broke me of that. I was very humbled by XML and the general "80% is good enough" approach of the W3C and the Web at large. It took me a long time to get over my anger at the fact that they were right, because I didn't want to live in that world: a world where <a href/> was the height of hyperlinking sophistication.

I got over it.

Around 1999 I started working as part of a pure Extreme Programming team implementing a content management system based on a simple but powerful abstract model (the SnapCM model I've posted about here in the past), built using iterative, requirements-driven processes. We were very successful, in that we implemented exactly what we wanted to, in a timely fashion, with all the performance characteristics we needed, and without sacrificing any essential aspects of the design for the sake of simplicity of implementation or any other form of expediency.

That experience convinced me that agile methods, as typified by Extreme Programming, are very effective, if not the most effective, engineering approach. But it also taught me the value of good abstract models: they ensure consistency of purpose and implementation, and they let you have both simplicity of implementation and consistency of interface. One need not be sacrificed for the other if you can do a bit of advance planning (but not too much--that's another lesson of agile methods).

Thinking about "worse is better", and about Gabriel's inability to decide conclusively whether it is actually better, led me to a conclusion: the reason Gabriel can't decide is that both sides of his dichotomy are in fact wrong.

Extreme Programming says "start with the simplest thing that could possibly work" (emphasis mine, on "start"). This is not the same as saying "simplicity trumps correctness"; it just says "start simple". You then iterate until your tests pass, and the tests reflect documented and verified user requirements.
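
To make that distinction concrete, here is a minimal sketch of the test-first rhythm in Python. It is purely illustrative: the requirement, the function name document_title, and the test names are invented for the example, not taken from any real project.

    import unittest

    # Hypothetical requirement, written down as a test before any implementation exists:
    # "A document's title is the text of its first heading; a document with no
    # headings reports 'Untitled'."

    def document_title(headings):
        # The simplest thing that could possibly work: no configuration options,
        # no title-override mechanism, nothing the stated requirement doesn't ask for.
        return headings[0] if headings else "Untitled"

    class DocumentTitleRequirement(unittest.TestCase):
        def test_first_heading_is_the_title(self):
            self.assertEqual(document_title(["Worse Is Better", "Background"]),
                             "Worse Is Better")

        def test_document_with_no_headings(self):
            self.assertEqual(document_title([]), "Untitled")

    if __name__ == "__main__":
        unittest.main()

When a new, validated requirement arrives (say, an explicit title override), it shows up first as a new failing test, and the implementation grows just enough to make that test pass.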

The "worse is better" approach as defined by Gabriel is similar in that it also involves iteration but it largely ignores requirements. That is, in the New Jersey approach, "finished" is defined by the implementors with no obvious reference to any objective test of whether they are in fact finished.

At the same time, the MIT approach falls into the trap that agile methods are explicitly designed to avoid, namely overplanning and implementing features that may never be used.

That is, it is easy, as an engineer or analyst who has thought deeply about a particular problem domain, to think of all the things that could be needed or useful, design a system that will provide them, and then proceed to implement it. In this model, "done" is defined by "all aspects of the previously-specified design are implemented", again with no direct reference to actual validated requirements (except to the degree that the designer asserts, on her own authority, that her analysis is correct). [The HyTime standard is an example of this approach to system design. I am proud of HyTime as an exercise in design that is mathematically complete and correct with respect to its problem domain. I am not proud of it as an example of survivable design. The fact that the existence of XML and the rise of the Web largely made HyTime irrelevant does not bother me particularly, because I see now that it could never have survived. It was a dinosaur: well adapted to its original environment, large and powerful, and completely ill adapted to a rapidly changing environment. I learned and moved on. I am gratified only to the degree that no new hyperlinking standard, with the possible exception of DITA 1.2+, has come anywhere close to providing the level of standardization of hyperlinking that HyTime provided. It's a hard problem, one where even the simplest design that satisfies the base requirements is still dauntingly complex.]

Thus both the MIT and New Jersey approaches ultimately fail because they are not directly requirements-driven in the way that agile methods are and must be.

Or put another way, the MIT approach reflects the failure of overplanning and the New Jersey approach reflects the failure of underplanning.

Agile methods, as typified by Extreme Programming, attempt to solve the problem by doing just the right amount of planning and no more, and that planning is primarily a function of requirements gathering and validation in support of iteration.

To that degree, agile engineering is much closer to the worse-is-better approach, in that it necessarily prefers simplicity over completeness, and it tends, by its start-small-and-iterate approach, to produce smaller solutions faster than a planning-heavy approach will.

Because of the way projects tend to go--budgets get exhausted, users get bogged down in just getting the usual stuff done, or the technology or the business changes in the meantime--it often happens that more sophisticated or forward-looking requirements never get implemented because the project simply never gets that far. This has the effect of making agile projects look, after the fact, very much like worse-is-better projects, simply because informed observers can see obvious features that haven't been implemented. Without knowing the project history, you can't tell whether the feature holes are there because the implementors refused to implement them on the grounds of preserving simplicity or because they simply fell off the bottom of the last iteration plan.

Whether an agile project ends with a greater degree of consistency in its interfaces is entirely a function of engineering quality, but it is at least the case that agile projects need not sacrifice consistency, as long as the appropriate amount of planning is done and, in particular, a solid, universally-understood data or system model is defined as part of the initial implementation activity.
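
To make "a solid, universally-understood data or system model" slightly more concrete, here is a minimal sketch in Python of what writing such a model down as code might look like. It is purely illustrative--the names Repository and Version and their operations are invented for this sketch and are not a rendering of the actual SnapCM model.

    from abc import ABC, abstractmethod
    from typing import Iterable

    # Illustrative abstract model for a versioned repository; the names and
    # operations are invented for this sketch.

    class Version(ABC):
        """One immutable state of a resource in the repository."""

        @abstractmethod
        def resource_id(self) -> str: ...

        @abstractmethod
        def content(self) -> bytes: ...

    class Repository(ABC):
        """The one interface every subsystem codes against."""

        @abstractmethod
        def latest(self, resource_id: str) -> Version: ...

        @abstractmethod
        def history(self, resource_id: str) -> Iterable[Version]: ...

        @abstractmethod
        def commit(self, resource_id: str, content: bytes) -> Version: ...

The point is not these particular operations but that the model is small, explicit, and written down before the iterations begin, so that every iteration implements against the same vocabulary.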

At the time Unix was implemented, the practice of software and data modeling was still nascent at best, and implementation was hard enough. Today we have a deep, established practice of software modeling, well-established design patterns, and useful tools for capturing and publishing designs, so there is no excuse for not having a model for any non-trivial project.

To that degree, I would hope that the "worse is better" engineering practice typified by Unix and C is a thing of the past. We now have enough counterexamples of good design with the simplest possible implementation and very consistent interfaces (Python, Groovy, Java, XSLT, and XQuery all come to mind, although I'm sure there are many, many more).

But Michael's purpose in presenting worse-is-better was primarily to relate it to standards, and I think the point is still well taken--standards have value only to the degree they are adopted, that is, to the degree they survive in the Darwinian sense. "Worse is better" definitely tells us that simplicity is a powerful survival characteristic--we saw that with XML relative to SGML and with XSLT relative to DSSSL. Of course, simplicity is not the only survival characteristic and is not sufficient, by itself, to ensure survival. But it is a very important one.

As somebody involved in the development of the DITA standard, I certainly take it to heart.

My thanks to Michael for helping me to think again about the value of simplicity.