If you think wiki should be redundant, with the same data represented in different places, then you open yourself up to the usual problem of distributing information: when you want to update it, it's in several places. So you:
- a) need to update it in several places
- b) risk the different places becoming inconsistent
One response is to find this so disturbing that you try to artificially restrict the number of places the information can be kept. (In database terminology, this is normalization.)
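The contrast between duplicated and normalized data can be sketched in a few lines. (This is just an illustration; the page names and the email "fact" are hypothetical, not from any real wiki.)

```python
# The redundancy problem: the same fact stored on two wiki pages.
pages = {
    "MarketsAndInformation": {"author_email": "alice@example.com"},
    "TradeOffBetweenReadingAndWriting": {"author_email": "alice@example.com"},
}

# Update in only one place, and the copies drift apart (inconsistency).
pages["MarketsAndInformation"]["author_email"] = "alice@new-host.example"
assert (pages["MarketsAndInformation"]["author_email"]
        != pages["TradeOffBetweenReadingAndWriting"]["author_email"])

# The "normalized" alternative: keep each fact in exactly one place,
# and have every page refer to it instead of copying it.
facts = {"alice_email": "alice@new-host.example"}
normalized_pages = {
    "MarketsAndInformation": {"author_email_ref": "alice_email"},
    "TradeOffBetweenReadingAndWriting": {"author_email_ref": "alice_email"},
}

# Now a single update reaches every page that references the fact.
facts["alice_email"] = "alice@final.example"
assert all(facts[p["author_email_ref"]] == "alice@final.example"
           for p in normalized_pages.values())
```

The cost, of course, is that every read now goes through an extra indirection, which is exactly the trade-off the next paragraph accepts.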
An alternative is to accept it as a fair TradeOff (TradeOffBetweenReadingAndWriting) for the other virtues. And one way to cope with the problem is that readers have a "responsibility" to check several pages.
This is analogous to markets, which also spread information (i.e. the relative value of things) across multiple loci or vendors. There, too, buyers have a "responsibility" to check several alternatives / models (i.e. bundles of functionality). In doing so, and in choosing the correct price, they help to unify the information about relative value.
You might also see linking two pages which contain similar content as a simple form of arbitrage. (Or maybe that comes when you ReFactor and merge them into one.)
Hmm, I don't think of the average wiki as containing 'data', but rather fuzzy thoughtstreams :) - so they can't be redundant unless the exact text is copied to 2 places (and even then you could argue about context affecting blahblahblah). But that just reinforces your point that you want to ease the opportunity to see multiple pages "about" the same "thought" (as encapsulated by its WikiName). So InterWiki and SisterSites are cool. –BillSeitz
See also:
- PredictiveMarkets (more on markets processing information)
- MetaMarkets (more on arbitrage)
- (WarpLink) OnTesting ... another way to aggregate information from several sources.