SingularityThinking

ThoughtStorms Wiki

"Teh singularity" is just such bollocks!

Every big technological change has "unforeseen" (and therefore "unforeseeable") effects. Could anyone in the 13th century predict the effects of the printing press? Or could people in the 19th century predict electronics as an application of electricity? Of course not. Having effects we can't accurately foresee is a pretty low bar for any technology to clear.

ArtificialIntelligence will massively change society. Sure. But how will it change society?

Everything I've heard Kurzweil predict has either been boringly "business as usual" or some version of "ineffable". In other words, like every science fiction prediction, it's a mix of a) extrapolations, and b) some attempt to capture the idea of "stuff we can't imagine" by just using a lot of words to say "unimaginable" and "amazing".

I don't really mean to throw shade on Kurzweil here, because that failure is more or less inevitable. Our culture can't imagine stuff that's significantly different from what it's seen before.

Take social media as a comparison. We could predict a bunch of amazing things about social media before it took off. But not the really weird stuff. For obvious reasons. The things we predicted were just extrapolations of what we knew. We imagined it would bring greater knowledge and education to people because we made analogies with our historical understanding of the printing press and other kinds of education systems.

We didn't predict that the internet would destroy knowledge through TheEndOfConsensus, because we'd never seen that happen before. We'd never seen InformationOverload at that scale, or its wide-scale social effects. The weird effects of social media: the balkanization into mutually incomprehensible and recriminatory belief bubbles, TheAlgorithm driving SocialMediaStrife, the Rohingya genocide driven by Facebook. None of that could be foreseen because it was so different from any experience we'd had previously. (I'm reminded of SuperStruct, in which we did predict some negative consequences, e.g. from "griefers" on social media, but nothing at anything like the scale that has actually appeared.)

So if unpredictability is already "normal" for any big technological change, and the so-called "visionaries" only spout incremental, extrapolated predictions rather than the discontinuous ones and sudden reversals, then what does "singularity" actually do for us, as an idea?

It ends up just being ineffable woo.

Another version of the singularity is just "super-intelligence". Again, so what? Computers have been outperforming humans in some tasks since they were invented. Otherwise we wouldn't use them. Super-intelligence is just computers doing more of that in more domains. I'm happy to say that AI is going to give us more and cheaper "intelligence" than we've ever had before. (MyFearsAboutAI) But again, 99% of the predictions of what this entails are so, so fucking banal and obvious, and so obviously mere "extrapolation", that it's embarrassing to have to listen to people making them. I have yet to hear one prediction about superintelligence or "the singularity" that is as interestingly surprising and counter to expectation as the, now commonplace, observation that the internet has been better at disseminating and encouraging disinformation than information. (I call this AllLiesSurvive, but to be clear, I don't claim I predicted this, OR TheEndOfConsensus. Those were just observations of what was already happening. But they are absolutely radically different from the predictions that we all did make about the spread of the internet and social software / social media.)

Is super-intelligence going to magically solve all our problems? Revolutionize medicine and make us immortal? No, of course not. There are constraints hardwired into physics. However smart we are, we aren't going to find a metal lighter than lithium. Or accelerate a starship faster than light. There's an outside chance we might tame NuclearFusion and therefore solve our energy and environmental problems. And AI might help with that. But if you start drilling down, asking for good reasons to predict, let alone assume, that this will happen, then you'll find that the reasons don't go far beyond wishful thinking. You're basically left with "well, we haven't been smart enough to figure out fusion by ourselves, but maybe smarter-than-human AIs might".

Previously

Quora Answer : What could go REALLY wrong with the Singularity?

Apr 29, 2011

You upload your brain to the computer. You find it is you. Has all your memories, skills, creativity. Feels the way you do. It even has your Facebook password and hangs out with your friends.

And yet ... and yet ... your perspective. Your "view from somewhere". The unity that makes you, you, is still stuck in your existing, now redundant, still mortal and soon to die body.

Bet that's going to suck.

No, I really just don't get the excitement about the singularity, i.e. the point at which machines become more "intelligent" than humans, after which we stop being able to predict the future.

I can see it's a possibility. But there are plenty of other, less spectacular events which make the future pretty unpredictable too.

Phil: Computers have been smarter than us in certain dimensions ever since they were invented. That's the point. No one is upset that they do sums faster.

You: But AGI ... general intelligence, right? THAT will be new.

Phil: Not sure I believe in it. We're all just an evolved grab-bag of heuristics.

Actually, I think that the "singularity" is a pretty dumb idea.

It assumes that IQ == power, which is patently nonsense: most of the leading decision makers in the world, in government and business, are demonstrably stupider (in IQ terms) than many academics or engineers who have far less influence.

It ignores the fact that computers are outperforming humans all the time in various tasks, such as maths, pattern matching, and remembering large quantities of data.

It ignores the fact that plenty of humans are already disempowered and left incapable by technologies they don't understand. And have been for hundreds of years.

Undoubtedly, there are many dramatic transitions and step-changes coming to society over the next 100 years or so, as new technologies are invented and new communities become connected. But it's ludicrous to imagine that we'll ever find one single moment which corresponds to "computers becoming more intelligent than humans".

Strip the singularity of that idea and it's no different from "progress as usual" ... just wrapped in overblown hype.
