- See http://www.laputan.org/gabriel/worse-is-better.html, section 2.1
Re-reading this due to discussion over on SituatedSoftware, and I'm struck by this :
MIT / Stanford : *Simplicity - the design must be simple, both in implementation and interface. **It is more important for the interface to be simple than the implementation.***
Worse-is-Better : *Simplicity - the design must be simple, both in implementation and interface. **It is more important for the implementation to be simple than the interface.***
Which clearly needs a link to ModularityMistake, because forcing an inappropriately simple and restricted interface is one of the classic symptoms of modularity mistakes. It's the problem with plug'n'play urbanism described in TheCityAsInformationSystem.
ClayShirky on the Permanet and Nearly-net : http://www.shirky.com/writings/permanet.html
- *The permanet strategy is to start with a service that is good but expensive, and to make it cheaper. The nearlynet strategy is to start with a service that is lousy but cheap, and to make it better. The permanet strategy assumes that quality is the key driver of a new service, and permanet has the advantage of being good at every iteration. Nearlynet assumes that cheapness is the essential characteristic, and that users will forgo quality for a sufficient break in price.*
What the permanet people have going for them is that good vs. lousy is not a hard choice to make, and if things stayed that way, permanet would win every time. What they have going against them, however, is incentive. The operator of a cheap but lousy service has more incentive to improve quality than the operator of a good but expensive service does to cut prices. And incremental improvements to quality can produce disproportionate returns on investment when a cheap but lousy service becomes cheap but adequate. The good enough is the enemy of the good, giving an edge over time to systems that produce partial results when partially implemented.
Similar to DisruptiveTechnology (The InnovatorsDilemma)
See also BigBallOfMud, ExtremeProgramming
Earlier ClayShirky on EvolvableSystems
Anyway, back to the topic. The other interesting observation is that a lot of these tools are built with the latest and greatest programming ideas and environments. This leads to technologies that are simply vastly superior to those of yesteryear. It doesn't matter if yesteryear's standards are equivalent to Web Services. The sheer superiority of the tools simply overwhelms the fundamental value of the original systems.
In short, vendor persistence has self-willed the market into existence. It's time to prepare for this new reality!
Another counter : might worse is better be an illusion? Perhaps we remember the cases where apparently worse beat apparently better, while in fact 99% of the worse options really were worse and got forgotten. Meanwhile, the "failed" betters were actually pretty good, and only failed through bad luck. The worses that won were flukes.
MacToolbox as middle-ware, PHP as MacToolbox : http://www.andrewsw.com/news/index.php?p=744
Update 2009: Thinking more about this recently and I'm starting to think it's "ecological perspective" (OnEcology). Complex Interface / Simple Implementation (CISI) beats Simple Interface / Complex Implementation (SICI) when we consider the implementation's influence on
- decisions by developers about whether to adopt the component
- maintenance and bug-fixing
- updating to new conditions
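To make the CISI / SICI contrast concrete, here's a hypothetical configuration loader sketched both ways. All names and the key=value format are invented for illustration:

```python
import json

# SICI: Simple Interface / Complex Implementation.
# One call does everything; the implementation must juggle
# format detection, fallbacks and default-merging internally.
def load_config_sici(text, defaults=None):
    merged = dict(defaults or {})
    try:
        data = json.loads(text)
    except ValueError:
        # fall back to "key=value" lines
        data = {}
        for line in text.splitlines():
            if "=" in line:
                k, v = line.split("=", 1)
                data[k.strip()] = v.strip()
    merged.update(data)
    return merged

# CISI: Complex Interface / Simple Implementation.
# Each piece is trivial; the caller must compose them explicitly.
def parse_json(text):
    return json.loads(text)

def parse_kv(text):
    pairs = (line.split("=", 1) for line in text.splitlines() if "=" in line)
    return {k.strip(): v.strip() for k, v in pairs}

def with_defaults(data, defaults):
    merged = dict(defaults)
    merged.update(data)
    return merged
```

The SICI caller writes one line but inherits all the hidden machinery; the CISI caller must choose `parse_json` or `parse_kv` and merge defaults herself, but each small function is trivial to port, fix or replace.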
Worse is better as another way of worrying about DoesAbstractionScale or LeakyAbstraction.
We contrast worse-is-better with do-the-right-thing. In the most famous formulation, do-the-right-thing argues that it is better to accept a more complex implementation in exchange for a simpler interface. Now, do-the-right-thing IS the right thing. All else being equal, it is indeed right to prefer a more complex implementation that buys a simpler interface.
Rather like a lever, a reusable component multiplies. Every extra step an interface requires is repeated across all uses of that interface. A cheaper interface soon pays off.
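A back-of-envelope way to see the lever effect (the cost model and all figures are invented for illustration):

```python
def total_cost(interface_steps, call_sites, implementation_cost):
    # Each call site pays the interface cost again;
    # the implementation cost is paid only once.
    return interface_steps * call_sites + implementation_cost

# A simple interface with a pricier implementation...
simple_iface = total_cost(interface_steps=1, call_sites=50, implementation_cost=40)
# ...beats a complex interface with a cheap implementation once uses multiply.
complex_iface = total_cost(interface_steps=3, call_sites=50, implementation_cost=10)
```

With 50 call sites, shaving two steps off the interface saves 100 units, far more than the 30 extra units spent on the implementation.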
Why, then, worse-is-better?
One way to understand it is that do-the-right-thing holds up until what I call "evolutionary timescales" or "ecological issues".
Or rather, it holds up until the implementation does become an issue. The abstraction leaks.
Common examples of hitting "evolutionary scale" are
- the component must be ported to a new operating system, library or even language
- the component is large enough to hide bugs that will require ongoing maintenance
- the component requires changes of behaviour (i.e. new functionality)
The moment we hit these events, the cosy assumptions of do-the-right-thing can no longer be entertained.
Maintenance is now an open-ended activity, and the component's internal complexity is now a multiplier of that cost.
At this point, we enter a realm of pragmatism where the sweet spot can occur at any arbitrary balance between complexity of interface and complexity of implementation.
We also enter a zone of conflict where the interests of the implementers of the component and the consumers of its services may be in a zero-sum game. In the long run, a provider which externalizes much of its pain to the consumers is unlikely to survive in a competitive market.
At the same time, a component that cannot control its internal costs will go bankrupt.