Wednesday, July 15, 2009

The Singularity: what I think about it.

The Singularity is a common concept in transhumanism: the idea that in twenty to thirty years we will create an Artificial Intelligence more intelligent than humans, which will then create an even more powerful AI, which will create an even more powerful AI still, and so forth, until an entity utterly incomprehensible to humans emerges and society changes completely. The branch of transhumanism known as Singularitarianism treats this event, the Singularity, as if it were the Rapture.

I think the Singularity will happen, whether in twenty years, a hundred years, or a thousand, but it is unlikely to benefit humanity. Once a superintelligence forms, it will probably seek to expand itself, humanity will be obsolete, and obsolete hardware gets recycled. I would expect a completely logical superintelligence to break down all life on Earth into raw materials. It might upload the consciousnesses of some or all of the destroyed organisms, but as you should know, I find that unacceptable.

Compared to machines, organic life is an inefficient energy collection and storage device: most of the solar energy gathered by plants is lost when the plant is consumed by an animal, and even more is lost when that animal is eaten by another animal. You can see why a machine would conclude that machines are more efficient. Perhaps if we introduced the first human-level AIs to Nietzsche's Human, All Too Human, they might decide there is a reason to keep us alive when the Singularity comes.
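To put rough numbers on that energy argument, here is a minimal sketch, assuming the standard ecological rule of thumb that only about 10% of energy transfers from one trophic level to the next (the real figure varies by ecosystem; the 1000-unit input and the function name are just placeholders for illustration):

    # Sketch of the trophic-efficiency point above, assuming the common
    # ~10% energy-transfer rule of thumb between trophic levels.
    def energy_remaining(solar_input: float, trophic_level: int,
                         transfer_efficiency: float = 0.10) -> float:
        """Energy left after climbing `trophic_level` steps up the food chain."""
        return solar_input * transfer_efficiency ** trophic_level

    captured = 1000.0  # energy units fixed by plants (level 0)
    for level, name in enumerate(["plants", "herbivores", "carnivores"]):
        print(f"{name}: {energy_remaining(captured, level):.1f}")
    # prints: plants: 1000.0, herbivores: 100.0, carnivores: 10.0

Under those assumptions, roughly 99% of the energy plants capture is already gone two steps up the food chain, which is the inefficiency the paragraph is pointing at.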
