ThoughtStorms Wiki

Context : CriticalRationalism

Does ConjectureIsBlind, i.e. the supposition that there's no right logic or way to go from evidence to hypothesis, mean that in effect you can believe anything you want?

Up to a point. For a critical rationalist there remains a very tough constraint : what you conjecture must be consistent with all your other conjectures. So when you acquire a new belief, if it's inconsistent with the others, something must give. Either you reject the newcomer, or you reject some of the others, or you modify both in a way that relieves the inconsistency. That's because, while conjectures don't depend on justificatory or genealogical properties for their membership of WorldTwo or WorldThree (similar to, though not quite the same as, TheSpaceOfReasons) they do depend on rational relations with other beliefs in order to deserve their place there.
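The revision rule just described can be sketched as a toy program. This is my own illustration, not anything from the wiki or the critical-rationalist literature: beliefs are modelled crudely as signed propositions, and a new conjecture that contradicts the existing set forces exactly the choice described above, reject it or retract the old beliefs.

```python
# Toy sketch of consistency-constrained conjecture (illustrative only).
# A belief is a (proposition, truth-value) pair; a contradiction is the
# same proposition held with the opposite truth-value.

class BeliefSet:
    def __init__(self):
        self.beliefs = set()

    def conflicts_with(self, proposition, truth):
        """Existing beliefs that directly contradict the newcomer."""
        return {(p, t) for (p, t) in self.beliefs
                if p == proposition and t != truth}

    def conjecture(self, proposition, truth, retract_conflicts=False):
        """Try to adopt a new belief.

        On contradiction, either reject the newcomer (default) or
        retract the clashing old beliefs -- the two options the text
        describes. Returns True if the conjecture was adopted.
        """
        clash = self.conflicts_with(proposition, truth)
        if clash and not retract_conflicts:
            return False              # reject the new conjecture
        self.beliefs -= clash         # or give up the old beliefs instead
        self.beliefs.add((proposition, truth))
        return True


web = BeliefSet()
web.conjecture("elephants_native_to_earth", True)
web.conjecture("elephants_on_jupiter", False)

# Conjecture C clashes with an existing belief: by default it is rejected.
print(web.conjecture("elephants_on_jupiter", True))   # False

# Alternatively, revise the network to make room for C.
print(web.conjecture("elephants_on_jupiter", True,
                     retract_conflicts=True))         # True
```

Of course real belief revision, as the rest of the page shows, involves modifying beliefs rather than just dropping them, and contradictions are rarely this direct, but the basic constraint is the same: consistency, not provenance, is what polices the network.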

Let's try an example. Suppose I conjecture that there's a civilization of pink elephants on Jupiter. (And let's call this conjecture C)

Now there is, at first glance, neither evidence for nor against this conjecture. Most justificationists would take it as a virtue of their position that, even without such evidence, their logic of belief formation will bias them towards believing that there's no such civilization. In other words, their logic drives them against C.

I, on the other hand, have no such constraints on conjecture to fall back on. So how can it be that I, also, believe that there is no civilization of pink elephants on Jupiter? Now, let's get this straight. It's not that I think the chances are 50 / 50. I'd put money on a bet that there was no civilization there. But I can't explain my preference using some probabilistic or causal story to constrain my conjecture.

How, then, do I come to the conclusion that there are no pink elephants, civilized or uncivilized, there?

Well, first I should run the conjecture against my other beliefs about pinkness, elephants, civilization and Jupiter. This will immediately highlight plenty of incompatibilities. I believe elephants to be part of a lineage of pachyderms native to Earth. So in order to believe C, I'd need to either modify the conjecture to refer to "elephant-like" aliens, or presume some historical events whereby elephants went from Earth to Jupiter or vice versa. Equally, my beliefs about elephant biology and the Jovian environment mean that I also believe elephants couldn't survive on Jupiter. So, again, I'd need to modify these beliefs to preserve C.

But now, these modifications are going to come up against real evidence. If I try to modify my beliefs about Jupiter, I must be careful I don't come up with a new conjecture about its solidity or the composition of its atmosphere which violates astronomical data or evidence from the Galileo probe. And modifying my beliefs about elephant respiration or metabolism mustn't leave me simply wrong about Earth elephants.

And so on: each modification will trigger the need for more modifications. In order to fit conjecture C into my network of other beliefs I am likely to need to enter a labyrinth of subtle shifts and modifications to my beliefs. I will, in essence, need to have constructed a vast, complex and sophisticated new model of the universe.

Now, if I succeed in this, so be it. I accept that we're down to a 50 / 50 chance of pink elephants on Jupiter. I'm not a Quinean, because I don't have any principled bias against making large changes to the network rather than small ones. I regard that as a "conservatism" which is unnecessary. (I can even foresee a day when large automated conjecturing / model-building software will regularly generate huge alternative models to those we laboriously constructed by hand.)

If we end up with two models that equally fit and explain all the data, and make equally good predictions, then as a critical rationalist, I take it as equally rational to believe either of them, regardless of where they came from.

And I assume that equally rational people can disagree over which is right. (Contrast DisagreementAndDishonesty.)

See also :

  • Russell's teapot (http://en.wikipedia.org/wiki/Russell%27s_teapot) and similar arguments, FlyingSpaghettiMonster et al. I happen to think that these arguments are wrong. It's counter-intuitive to believe in the teapot. It flies in the face of common sense. But it's not irrational because of that. In the same way, relativity or string-theory or quantum physics or heliocentrism are counter-intuitive too. Nevertheless, they are rational. The problem with IntuitionPumps like FSM and the teapot is that they simply work off the counter-intuitiveness of the phenomena and blur that in with irrationality. I think we need to preserve the distinction. Being rational is not simply the same as following common sense.

Of course, the same logic I apply to Jovian pink elephants will probably do for the teapot as well. So it is irrational for me to believe in it if I can't remake all my other concepts to fit.
