AutomaticWiki
For me wiki is a connection machine. An aid to creativity. (OnCreativity) A place to bisociate ideas, BangTheRocksTogether. WikiIsACauldron ... etc. (Maybe compare InfovoreWiki now)
I can make pages about or representing any idea, however vague or concrete, however half-abstracted and refactored from others.
But I jealously guard the role of humans to animate this fantastic machine. Not for me the slavery of the SemanticWeb where humans are reduced to humbly marking up documents in RDF so that some bloody scutter can have the fun of surfing around, feeling the exquisite buzz of popping neurons in its cognitive innards, as it discovers fascinating and productive connections.
But why, actually, not?
I'm left cold by attempts to "find similar" texts. Why do I want similar? I want new and exciting. And I'm not going to slog through putting things into bad categories, when the whole point is fluid refactoring and recreation of categories by the minute.
But there is, surely, a role for more, higher level automation.
Ever since I thought of searching for WarpLink (ThingsToSearchForOnThoughtStorms) I've realized that my informal TypedLinks, once again combined with grep, have created a new and exciting level of organization and understanding here. And this is partly to do with the automation of searching.
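For instance, a minimal sketch of that kind of typed-link grep, assuming pages are stored as plain-text files in a pages/ directory (the directory layout and file naming are my assumptions here, not how ThoughtStorms is actually stored):

```python
# Minimal sketch: grep-style search for an informal typed link
# across a directory of plain-text wiki pages.
# Assumptions: pages live as *.txt files under pages/, and typed
# links are just literal strings like "WarpLink" in the page body.
from pathlib import Path

def pages_linking_with(link_type, page_dir="pages"):
    """Return names of pages whose text contains the given typed link."""
    hits = []
    for page in Path(page_dir).glob("*.txt"):
        if link_type in page.read_text(encoding="utf-8"):
            hits.append(page.stem)
    return hits

if __name__ == "__main__":
    for name in pages_linking_with("WarpLink"):
        print(name)
```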
What other automations and tropisms might take us to higher level discovery on wiki?
Another way of thinking about this: wiki is a BlackBoardArchitecture, the sort of thing agents use to pass messages to each other in DistributedArtificialIntelligence and ContextSensitiveSoftware like DashBoard. It's a blackboard where humans pass messages to each other. Or could be, between humans and software.
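As a toy sketch of that pattern (the names and the page-store layout below are hypothetical illustrations, not real ThoughtStorms code): humans and bots read and write the same shared page store, and each bot watches for some condition before posting its own contribution.

```python
# Toy blackboard: a shared page store that both humans and bots write to.
# Everything here (PageStore, the bot, the page names) is an illustration
# of the pattern, not actual ThoughtStorms code.
class PageStore:
    def __init__(self):
        self.pages = {}          # page name -> text

    def read(self, name):
        return self.pages.get(name, "")

    def append(self, name, text):
        self.pages[name] = self.read(name) + text + "\n"

def question_answering_bot(store):
    """A bot 'agent' that watches the blackboard and posts back to it."""
    for name, text in list(store.pages.items()):
        if "OPEN QUESTION:" in text and "BOT ANSWER:" not in text:
            store.append(name, "BOT ANSWER: (links to related pages go here)")

store = PageStore()
store.append("AutomaticWiki", "OPEN QUESTION: what could bots do here?")  # a human edit
question_answering_bot(store)                                             # a bot pass
print(store.read("AutomaticWiki"))
```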
IRC is full of bots posting useful information and providing services like asynchronous message passing. What might a swarm of automated bots do for a, primarily human, wiki?
Of course, in the discussion on SpammingThoughtStorms I talked about making it ObfusticatedVariableGeometry HTML in order to confuse SpamBots. This would also make things impossible for third-party bot providers.
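Purely as an illustration of that idea, and not the actual scheme discussed on SpammingThoughtStorms: serve the same content wrapped in HTML whose nesting and class names change on every request, so a scraper relying on fixed selectors keeps breaking.

```python
# Illustrative only: wrap the same content in HTML whose element
# nesting and class names vary per request, so a bot scraping by
# fixed tags or selectors keeps breaking. Not the real ThoughtStorms
# scheme, just one way "variable geometry" could work.
import random
import string

def random_class(n=8):
    return "".join(random.choices(string.ascii_lowercase, k=n))

def variable_geometry_html(content):
    depth = random.randint(1, 4)                       # random nesting depth
    tags = random.choices(["div", "section"], k=depth)
    opening = "".join(f'<{t} class="{random_class()}">' for t in tags)
    closing = "".join(f"</{t}>" for t in reversed(tags))
    return opening + content + closing

print(variable_geometry_html("Hello, human reader."))
```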
: Hmmm. Maybe there is scope for a WikiImmuneSystem like my EmailImmuneSystem. A bot which notices when the same address has added the same link to two or more pages and deletes the change?
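A minimal sketch of that rule, assuming we can get at a log of recent edits as (address, page, added_link) records (the record format is my assumption):

```python
# Sketch of the WikiImmuneSystem rule described above: flag an edit
# for reversion when the same address has added the same link to
# two or more pages. The edit-log format is an assumption.
from collections import defaultdict

def edits_to_revert(edit_log):
    """edit_log: iterable of (address, page, added_link) tuples.
    Returns the set of (address, added_link) pairs seen on 2+ pages."""
    pages_hit = defaultdict(set)
    for address, page, link in edit_log:
        pages_hit[(address, link)].add(page)
    return {key for key, pages in pages_hit.items() if len(pages) >= 2}

log = [
    ("10.0.0.1", "AutomaticWiki", "http://spam.example.com"),
    ("10.0.0.1", "OnCreativity", "http://spam.example.com"),
    ("10.0.0.2", "TypedLinks", "http://fine.example.org"),
]
print(edits_to_revert(log))   # {('10.0.0.1', 'http://spam.example.com')}
```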
I guess my (very simple) Python code generator on BEACH is an example of a script which interacts with wiki. But given that it's triggered by a human it's not really an agent.
I'm pretty sure there's a role for wiki-bots ...
- spelling correctors?
- broken link detectors? (see the sketch after this list)
- visitor trackers ... remember the sequence in which certain logged on visitors update pages. (Or if suitably connected to logs, track readers' trails.)
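For the broken-link detector, a minimal sketch, again assuming pages are plain-text files named after their WikiWords in a pages/ directory (my guess at a layout):

```python
# Sketch of a broken-link detector bot: find WikiWords referenced in
# page text that have no corresponding page file. Assumes pages are
# stored as pages/<WikiWord>.txt, which is my guess at a layout.
import re
from pathlib import Path

WIKIWORD = re.compile(r"\b(?:[A-Z][a-z0-9]+){2,}\b")   # crude CamelCase matcher

def broken_links(page_dir="pages"):
    pages = {p.stem: p.read_text(encoding="utf-8") for p in Path(page_dir).glob("*.txt")}
    existing = set(pages)
    missing = {}
    for name, text in pages.items():
        dangling = set(WIKIWORD.findall(text)) - existing
        if dangling:
            missing[name] = sorted(dangling)
    return missing

if __name__ == "__main__":
    for page, links in broken_links().items():
        print(f"{page}: {', '.join(links)}")
```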