Thursday, September 13, 2007

Who are we?


Within the story of Contact is a framework for understanding our place in the grand scheme of cosmic evolution. But in this particular work I don't think Sagan took some of his previously established ideas far enough. I'm talking specifically about our impending 'singularity.' Now before you get that look on your face, I have no idea if such a thing is even possible. But I think the concept could be very useful in this story. 

The term singularity is borrowed from astrophysics, where it describes the center of a black hole: a point where general relativity and quantum mechanics collide and our understanding of physics breaks down. But the newer definition I'm referring to comes from the context of the evolution of intelligence.

The singularity is a massive discontinuity in history, a point in our near future where prediction breaks down due to the acceleration of change in our world. In other words, as the rate of technological evolution accelerates toward infinity, our ability to predict the future drops to zero. Superintelligence is one possible result. Death is another.
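To make that "accelerates toward infinity" idea concrete, here is a minimal toy sketch in Python (my own illustration, not anything from Sagan or Vinge; the function name, constants, and growth law are all assumptions made up for the example). If capability grows in proportion to its own square, so that every improvement speeds up the next one, the math has a hard edge built in: the curve reaches infinity at a finite time, and past that point extrapolation simply stops making sense.

# Toy model, purely illustrative: capability that feeds back into its own growth rate.
# dx/dt = k * x**2 has the closed-form solution x(t) = x0 / (1 - k*x0*t),
# which blows up at the finite time t = 1 / (k * x0).

def capability(t, x0=1.0, k=0.1):
    """Closed-form solution of dx/dt = k*x**2 with x(0) = x0."""
    blowup = 1.0 / (k * x0)              # the model's "singularity"
    if t >= blowup:
        raise ValueError("extrapolation breaks down at t = %s" % blowup)
    return x0 / (1.0 - k * x0 * t)

for t in (0, 5, 9, 9.9, 9.99):
    print(t, capability(t))              # growth explodes as t nears 10

Nothing says real technological change has to follow this particular curve, of course; the point is only that "faster and faster" can carry a built-in wall beyond which prediction gives you nothing.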

It amazes me that even before there was a word for it, Dr. Sagan sensed intuitively what is now termed the singularity. The word was coined (applied?) in 1983 by Vernor Vinge, now a retired San Diego State University professor.

"Here I had tried a straightforward extrapolation of technology, and found myself precipitated over an abyss. It's a problem we face every time we consider the creation of intelligences greater than our own. When this happens, human history will have reached a kind of singularity - a place where extrapolation breaks down and new models must be applied - and the world will pass beyond our understanding."
-- Vernor Vinge, True Names and Other Dangers, p. 47.

It's a great idea, one ready for exploration in science fiction. Sagan touched upon the notion of the emergence of a global consciousness in The Persistence of Memory (Cosmos, episode XI), so he was certainly aware of this idea when he wrote Contact. But he left it out. The concept of ultimate life vs. death was a recurring theme in his public work, especially in the context of nuclear war. But maybe the specific idea of the singularity hadn't quite congealed in his mind, and so he couldn't connect it to his story. At any rate, I think this is a loss that can be corrected posthumously. Singularity theory provides a motive for the aliens to contact us. (More later...)

For a far better explanation of the singularity, I refer you to Staring Into the Singularity by Eliezer Yudkowsky, and to The Singularity Institute for Artificial Intelligence (click on Overview). These guys are bent on making it happen as soon as possible.

OK, back to us...

If you consider singularity theory carefully, you will understand that humans are the end product of natural selection, but not of evolution as a whole. Natural selection has taken life as far as it can: to intelligence. We can probably evolve further through natural selection, but the point is we don't have to.

Agent Smith from The Matrix was right to compare us to a virus. But that demeaning insult doesn't take into account our macroscopic (compared to a virus) brains. Sure, we reproduce like crazy and run riot over our environment, just like a virus. But our brains, not our sex drive, allowed this to happen.

Because our brains give us a huge advantage, other large organisms simply cannot compete with us. Micro-organisms are still a legitimate threat, but we're gaining on them fast. A virulent outbreak of Ebola may yet have a chance to get us, but not a pack of wolves. In fact the only large animal we still have to fear is ourselves.

So we are not in equilibrium with the rest of life on Earth. We are killing off other species and whole environments at a pace on a par with the greatest mass extinctions. And any means we possess of destroying ourselves will almost certainly take a big chunk of the biosphere with us. Natural selection would never, by itself, allow a species to proliferate to the point of endangering all life. So what did?

Right now on Earth, 'Externally Self-Optimizing Selection' (eSOS) is the name of the game. Working through our large brains, eSOS has created a global civilization. Our brain, and its ability to manipulate the world outside our bodies, has no equal on Earth. Wagon wheels, fishing poles, computers (especially computers), cars, airplanes, frozen foods, nukes, mousetraps: all are evolving at a previously unheard-of rate. The key point is that none of them are having sex.

Of course eSOS raises the question... What happens when 'external' becomes 'internal?' When Internal Self-Optimizing Selection (iSOS) begins to recursively self-improve our genes and our very minds (or the surrogate minds we create), the singularity may be knocking on our door.

In the real world, as in the fictional world of Contact, we humans occupy an extremely narrow and highly volatile zone between the invention of radio astronomy and superintelligence. The way I see it, there are only two possible outcomes... Give me singularity or give me death.

Does anyone else see the potential for a great movie? 
