SingularityThinking (ThoughtStorms)

I don't get the excitement about the singularity, i.e. the point at which machines become more "intelligent" than humans, after which we stop being able to predict the future.

I can see it's a possibility. But there are plenty of other, less spectacular events which make the future pretty unpredictable too.

: The tribe on Meta Brain Growth (TribeNet) is about a counter-process of augmenting human brains with machine elements, so that humans always stay ahead of pure computers. -- ZbigniewLukasiak

: so see also CybOrgs / ProductivityOfKnowledgeWork

Singularity Investor :

JonUdell :

Actually, I think that the "singularity" is a pretty dumb idea.

It assumes that IQ == power, which is patently nonsense: most of the leading decision makers in the world, in government and business, are demonstrably stupider (in IQ terms) than many academics and engineers who have far less influence.

It ignores the fact that computers already outperform humans at various tasks, such as arithmetic, pattern matching, and remembering large quantities of data.

It ignores the fact that plenty of humans are already disempowered and left helpless by technologies they don't understand, and have been for hundreds of years.

Undoubtedly, there are many dramatic transitions and step-changes coming to society over the next 100 years or so, as new technologies are invented and new communities become connected. But it's ludicrous to imagine that we'll ever identify one single moment which corresponds to "computers becoming more intelligent than humans".

Strip the singularity of that idea and it's no different from "progress as usual", just wrapped in overblown hype.

See also :

Compare :