All Tools Suck
Or: why do I hate everything?
The motto of this blog (and my professional life) is "all tools suck (some suck less than others)"
That's a pretty harsh statement. After all, there's lots of useful software out there (and a lot more heinous piles of crap, but that's just the human condition).
So what do I really mean by "all tools suck"?
A better question might be "what makes a tool a good tool?" The trite answer is "if it meets requirements" but in fact this is the only really useful answer--the key is defining your requirements clearly. If one of your requirements is "doesn't crash all the time" then you've just eliminated a huge swath of tools. If another one is "doesn't take proprietary ownership of my data" then you've eliminated a whole other swath.
My problem, and in fact the problem of anyone who has chosen to use standards for their data in order to get the value that standards provide because they are standard (and not, for example, simply because that's the format the tool you like happens to use), is that standards are explicitly statements of requirement and generally very demanding statements of requirement.
For example, the XSL Formatting Objects (XSL-FO) recommendation, the standard I've worked most closely with the last few years, defines a very sophisticated mechanism for doing automatic composition of printable pages. As such it defines lots of requirements for what XSL-FO should and should not do.
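To give a sense of what the recommendation demands, here is a hand-written, illustrative sketch of a minimal XSL-FO document; even this trivial single page exercises page masters, page sequences, flows, and block layout, each of which carries its own required behaviors a conforming formatter must implement:

```xml
<!-- A minimal "hello world" XSL-FO document (illustrative sketch) -->
<fo:root xmlns:fo="http://www.w3.org/1999/XSL/Format">
  <fo:layout-master-set>
    <!-- Defines the page geometry the formatter must honor -->
    <fo:simple-page-master master-name="page"
        page-width="8.5in" page-height="11in" margin="1in">
      <fo:region-body/>
    </fo:simple-page-master>
  </fo:layout-master-set>
  <fo:page-sequence master-reference="page">
    <fo:flow flow-name="xsl-region-body">
      <fo:block font-family="serif" font-size="12pt">Hello, world.</fo:block>
    </fo:flow>
  </fo:page-sequence>
</fo:root>
```

Every property on every object here (and the hundreds of others the specification defines) is a requirement a tool either meets or fails to meet.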
The rub is that for any non-trivial standard few tools will completely satisfy the requirements inherent in that standard, whether it's failure to correctly implement a required feature or just not implementing an optional, but useful, feature.
Therefore, any tool that doesn't fully implement the standard at hand inherently sucks, by definition, because it doesn't meet requirements.
Of course, a given user of the standard may not require all the things the standard requires (that's why most standards have optional features), in which case, any tool that implements those features the user does require and otherwise meets the user's requirements (reliability, performance, cost, etc.) doesn't suck for that user.
But I'm an integrator. That means the set of requirements I care about is the union of all the requirements of my clients and prospects which, since I don't always know what my clients' and prospects' requirements are, is easiest to define as "everything the standard does that isn't prima facie useless".
Plus I make a point of trying to explore the practical boundaries of standards and their supporting technologies, so I tend to try to do things that aren't workaday requirements (but that would still be useful in real use cases).
As most engineers implementing tools are focused either on the requirements they understand or what the marketing department tells them is important or the features they can quickly implement, they tend not to focus on the edge cases. This is just how software development works. It's the very rare engineer who has the time, inclination, and luxury of implementing a standard completely just for the sake of completeness. In fact I'm not sure it's ever happened, at least not within the XML family of standards (the Saxon family of XSLT implementations may be the exception here--Mike Kay is one wicked mofo when it comes to standards implementation and software engineering).
So this means that my starting point when coming to a new tool that purports to support some set of standards is that it fails to support some set of requirements that I have (because it doesn't implement the whole standard [because no tool ever supports the whole standard]). So right off it's at a serious disadvantage. Then it still has to satisfy all the other requirements that any piece of useful software has to satisfy: cost, performance, correct operation, ease of use, ease of integration, etc. These requirements, by themselves, are hard enough for most software to satisfy (because most software just isn't engineered that well, so, on average, most tools you encounter will be pretty weak, just by the law of averages).
So to review:
Given that:
- By definition, all tools will fail to meet all the requirements inherent in the standards they claim to support to one degree or another, and
- Most tools are buggy, slow, bloated examples of poor software engineering
It is clear that:
- All tools suck, to one degree or another
The degree to which a tool doesn't suck is then a function of two factors:
- The number of requirements in the standard it does support and the value of those requirements to the likely users of the tool (implementing parts of standards that nobody wants or uses or should use doesn't reduce your suckiness score). [For example, the XSL-FO specification includes a bunch of features for aural presentation of rendered documents. These features are essentially useless for print applications, so few, if any, FO implementations support them. That does not count against those implementations because the features are not useful to most users of XSL-FO. However, failing to support a very useful feature like block-containers does count against you.]
- The overall software engineering quality with regard to performance, bugginess, value (price relative to features provided), support, documentation, ease of integration, and so on.
For most purposes I give these two factors roughly equal weight, although in practice I give engineering quality somewhat greater weight, assuming that the critical features of the standard are otherwise implemented. But sometimes you can tolerate a slower or buggier or more bloated tool because it implements more of the critical features.
Finally, as an integrator I don't care just about the raw functionality of the tool but about its features that support integration, such as APIs, command-line options, packaging as plug-ins, platform support, documentation for APIs, and so on. Many tools that are otherwise quite strong often fall down here because this is stuff that relatively few users of the tool care about. So it tends to get no love (I'm talking to you, Documentum).
So on top of the usual frustrations with poor, incomplete, and incorrect implementation of standards and typically buggy and poorly-supported programs, add my frustration with trying to integrate these tools with other similarly joyful tools and you can see that my job is a recipe for bitterness and pain.
Oh yeah, and one more thing: I am freakin' Eliot Kimber! I've been doing this more years than most of you snot nosed kids with your IDEs and your AJAX hacks and your iPods have been alive so don't be telling me that my requirements are somehow less compelling than what you've figured out by reading XML for Dummies! Listen to me: implement the features I want or your software will be forever cursed! You have been warned!
Now do you understand why all tools suck?
15 Comments:
Mr. Bray linked me here.
It occurs to me that given the fundamentally requirements-based view of tool suckage you present here (and that I agree with, don't get me wrong!) there's an obvious corollary to make:
If you want to make a tool that Sucks Less, implement a simple set of requirements.
Which, when you apply it to the standards world, speaks quite clearly as to why it is that all WS-* tools currently suck a lot, and why most XML tools do too.
I think you make a good point: implementing features that are not required is almost as bad as failing to implement features that are required. This is a complaint I have with a number of (if not most) content management/document management tools: they implement features I don't want (or don't want them to implement) and fail to implement features I do need.
As far as implementing a "simple set of requirements" I think that mostly is a function of the users--that is, it's incumbent on users to keep their requirement sets as small as they can and still satisfy their operational goals.
In the context of one-off projects, this means not asking for whacky stuff that you won't actually use (which is why use-cases and test-driven development are so important in software engineering).
In the context of standards development, it means working hard to keep the feature set as small as possible while at the same time not making the standard too limiting. In my experience this is usually the hardest part of doing good standards: deciding what's in and what's out, especially as the standard becomes less about data storage (XML) and more about semantics and processing (XSLT, XSL-FO, XQuery, etc.). And note that the XML recommendation was entirely an exercise in leaving things out of SGML--there was no invention in the XML 1.0 specification.
I think the W3C is very wise, for example, to require that all recommendations demonstrate at least two implementations for every feature (requirement) of the specification--this helps ensure that the resulting standard is both implementable and more likely to be a close match to the real requirements of the target users. Of course it doesn't require that the implementations be good or even publicly available....
Note that the W3C borrowed that requirement from the IETF, which has always insisted on at least two compatible implementations of any protocol before it can become a full standard.
As for universal tool suckage, I think this is a projection from your admitted role as a big-tools big-iron big-standards guy. I use small tools and small standards, and in general the tools meet the standards to perfection. As an obvious example, I have a choice of three free implementations of most Posix.2 tools (GNU, BSD, and Solaris) and it's hardly surprising that they meet the standard, which was pretty much written after the fact (as standards should be, IMHO).
In the XML world, I point to James Clark's jing schema validator and trang schema-language translator as sterling examples of strong implementations of small standards. It's true that trang doesn't always generate optimal or even error-free W3C XML Schemas in extremely complicated cases, but I'm willing to overlook that for the sake of not having to write them myself.
I certainly recognize that a number of standards have included features that may have been either not useful or not very well thought out or beyond implementability.
However, I must strongly disagree with the notion of "spirit rather than letter". The whole point of a standard is that it defines testable conditions for conformance. You either conform to the standard or you don't. An implementation either conforms or it doesn't and, in most cases, there's a clear test for the case (of course, as the thing a standard describes becomes more abstract, or the rules more difficult to define declaratively, or the language appears to be self-contradictory, it can become hard to know whether a specific behavior does or does not conform--this is a problem with some aspects of the XSL-FO specification, for example).
Remember too that I put the power in the hands of users not implementors. Users have the right to choose which standards they want their tools to support and which aspects of those standards are or are not important. It is the responsibility of implementors to then implement those standards and parts of standards correctly and accurately per the specification.
The thing that enrages me most is implementors telling me that "well we just implemented the spirit of the standard" or "that feature isn't really useful". It's not their place to decide.
If you're paying close attention to my posts here you might notice that I do things for use-by-reference that are "consistent with" the XInclude standard but that are not strictly conformant with it. I am doing this as a user, not as a tools developer. I have recognized that the XInclude specification as written doesn't completely satisfy my requirements as a user and therefore I do something different. But I still expect my tools that do support XInclude to do it correctly (or at least to match my interpretation, as a user, of what "correct" means in the case of XInclude).
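For readers less familiar with XInclude, here is a sketch of what conformant use-by-reference markup looks like (the file names are hypothetical); the xi:include and xi:fallback elements and the XInclude namespace are defined by the specification:

```xml
<!-- A document that pulls in a shared chapter by reference (illustrative) -->
<book xmlns:xi="http://www.w3.org/2001/XInclude">
  <title>User Guide</title>
  <!-- The processor replaces this element with the content of the target -->
  <xi:include href="chapters/install.xml">
    <!-- Used only if the target cannot be resolved -->
    <xi:fallback>
      <para>Installation chapter unavailable.</para>
    </xi:fallback>
  </xi:include>
</book>
```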
Finally, I will re-iterate that there is a fundamental difference between pure data standards, like XML, and application standards, like XInclude or XSL-FO. In the case of data standards, the standard defines the rules for storing data irrespective of how that data might be used, ensuring usability of that data over a wide scope of applications and indefinite time. Data standards must be implemented exactly or they have no value.
By contrast, application standards, because they define processing rather than data representation, are both inherently fuzzier (because there are always things that have to be left open to implementations) and less critical to implement 100% correctly, because at the end of the day what's important is what you, the user, get done with your data.
That's why we can not only tolerate but expect proprietary XSL-FO extensions but would never tolerate proprietary XML syntax extensions.
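The usual mechanism for such extensions is attributes or elements in a vendor-owned namespace, which a conforming formatter can honor while other processors safely ignore them; this sketch uses a made-up vendor namespace and a made-up extension attribute purely for illustration:

```xml
<!-- A proprietary extension attribute layered on standard XSL-FO markup.
     The "vnd" namespace URI and the avoid-widows attribute are hypothetical. -->
<fo:block xmlns:fo="http://www.w3.org/1999/XSL/Format"
          xmlns:vnd="http://vendor.example/extensions"
          vnd:avoid-widows="true">
  The XSL-FO vocabulary itself stays intact; only the extra
  vendor-namespaced attribute is non-standard.
</fo:block>
```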
Also, I'll mention that in my experience implementors who claim "spirit rather than letter" often simply didn't get what the standard was trying to tell them and instead of trying to figure it out just did something that was close. To my mind that's just poor engineering.
Of course, other spirit rather than letter instances are sincere attempts to address a shortcoming in the standard or adapt a poorly-thought-out standard to a use case for which it is not well suited (authoring vs. delivery, for example).
And I'm not excusing standards writers either. The creators of standards have a responsibility to apply the same level of engineering rigor to their specifications that I expect engineers to apply to implementations of those standards.
This is why I have come to really appreciate the implementation requirement of W3C specifications: it helps ensure that specifications are at least minimally coherent and implementable.
This is a simple problem that we find in all walks of our human life.
In truth, all hardware tools suck as well – because no single tool can do EVERYTHING you want. A Phillips screwdriver will not work on a flat-head screw. You need a flat-head screwdriver.
However, is that the tool's fault? Or the user's? This is the question we have to consider even in the aspect of coding. If you're trying to use a tool for something that it wasn't designed for – it will suck.
The problem is that this arena is highly-subjective, and some of us will find certain tools more helpful than others. Interesting, that.
The first time I saw your blog I thought you were against all software developers, or perhaps had had some bad experiences with certain applications or software. I'm glad that it's not what it seems like it is.
Have you ever heard of an article which discusses the original sin that goes with all tools? This original sin is conceived with every tool or piece of software made, and it will always be there. Perhaps this so-called sin is what makes every tool imperfect. But at least they're more helpful than imperfect, right?
Hi Eliot,
Thanks for your detailed analysis of DocBook versus DITA. My colleague and I have been ordered to enter the 21st century (what, abandon FrameMaker? Abandon PDFs??? Yes!) so we are deep in the DocBook versus DITA research right now. Richard's post and your reply are both very detailed and helpful. Based on your arguments, I am leaning much more toward DITA.
What a fantastic blog, well I should say "rant" haha. Quite true, I must agree: no software engineer quite fits any particular requirements, and if required, only designs to the client's specifications.
Quite often, nothing fits requirements! Instead you are charged with multiple licenses for each piece of software you need to write a script. Microsoft wins again.
Standards are very definitely important, and I think that a lot of guys really don't comply with any standards at all or even understand the advantages of standards. But, I must agree with you, except to say that MOST tools suck. To say that ALL tools suck is taking it a little far – maybe you just don't know where to find the right tools that work seamlessly. I guess it may be true to say that this is a very subjective thing.
Well ranted. :)
It seems that a series of negative comments is understood as the power to highlight weaknesses in the standards system. But remember, better standards evolve with a larger set of positive people and a very few balance-minded negative people. This article is not able to do justice in the form of rightly balanced negative comments. All negative comments from technologists should have the odour of bringing positive changes. This article doesn't smell like that.
To SJG: I think you have misunderstood the point of my post, which is not that there are things wrong with standards but that standards tend to define a set of requirements larger than tool vendors tend to want to address, which leads unavoidably to the tools not fully satisfying requirements (as defined by the standards the tools implement).
I have spent my career developing standards and have, as much as possible, tried to do so in the most positive and constructive manner I could.