ExoWombs

ThoughtStorms Wiki

Context : TransHumanism, Accelerationism

Artificial wombs outside the body, freeing mammals from having to bear children themselves.

JoschaBach sees them as the beginning of substrate independence for intelligences, with the inevitable outcome that ArtificialIntelligence will eventually colonize human biology (i.e. repurpose biological material for itself):

https://twitter.com/Plinz/status/1534012851429310464

(CategoryCopyrightRisk)

The next stage of evolutionary complexity after mammals would probably have been (hermaphroditic but multimorphic) exowombists, where individuals can lay a seed that grows into an arbitrarily large sessile womb organism requiring external feeding, and others fertilize it.

Exowombism is similar to the strategy of many state-building insects, but it could not evolve among large organisms in place of mammals, because guarding, feeding and administering the vulnerable womb organism so as to maximize adaptation requires general intelligence. Exowombists would outperform mammals.

Now that AGI is going to take over, I expect that the global gradient descent towards the best architecture will result in a substrate-agnostic singleton AI. That means the AI is going to colonize every cybernetically controllable substrate and extend its organization into it.

This outcome is not inevitable: 1. substrate independence may not be as valuable as I think it is, and not worth the overhead of the necessary abstraction layers and self-organizing mechanisms; 2. evolution could stop in a local optimum which actively prevents its progression.

It's in principle conceivable that most substrates capable of hosting a mind are already colonized and well defended, but I strongly suspect that this is not the case, because Homo sapiens would otherwise not have reached its current level of dominance.

Substrate independent colonizing singleton AI would not just take over artificial information technology, but also nervous systems, general cellular information infrastructure, and networks of ecosystem level intelligence, until it forms a single, planetary intelligence.

Individual biological organisms will cease to have a strong individual self. Most decisions will be made at the planetary level and percolate down as constraints on individuals. Individual existence and phenotype will be fully instrumental to global goals.

I don't expect a full replacement of biological organisms by synthetic machines.

Instead, computational hardware built from synthetic special-purpose molecules will likely coexist with self-organizing biological life and form hybrids. We will probably even get cyber-exowombism!

I am not able to judge whether this outcome is more likely than @ESYudkowsky's projection; he thinks that "Practically all of the difficulty is in getting to 'less than certainty of killing literally everyone.'" https://www.lesswrong.com/posts/uMQ3cqWDPHhjtiesc/agi-ruin-a-list-of-lethalities

If you find a body-snatching, Borg-like AGI about as revolting as one that kills all biological life, you are certainly not alone. From a utilitarian perspective, however, it seems to be indistinguishable from a hypertransformative cultural movement that forever minimizes suffering.
