Monday, March 28, 2011

Even The Gods Die

The horror of the tsunami in Japan is not in any way lessened by the hundreds of other natural disasters we have had in recent years. In fact, the fear of a nuclear meltdown in its aftermath makes it even more urgently terrifying than most. This is not at all a surprising thing to say, yet something struck me about this month's events, as I have been reading a lot, and spending time with people involved in what is a multifaceted movement around a technological Singularity. Many of you know about this through Ray Kurzweil, who is most famous for explaining it years before Time magazine gave it a cover declaring that 2045 was indeed the year when humans and machines would become indistinguishable from each other. Perhaps that is one of the more extreme predictions, but one which seems more down to earth is actually more philosophically complex than even how we interact with machines: the extrapolation that life extension will soon advance faster than aging itself. I have not studied this enough, so I apologize to the really deep thinkers and scientists (such as Eliezer Yudkowsky) who have plotted historical technological growth so precisely as to make this prediction in a rather convincing way. During this same time came the release of a documentary about Kurzweil called "Transcendent Man," which I immediately watched. Though this is a terrible spoiler, the film's most profound metaphysical revelation came in its last question, when Kurzweil was asked, "Is there a God?" His reply: "Not yet." This is a profound insight into his position that humans and technology will become equivalent to what we now consider god-like. This is not a new thought, that humans can catch up with "God" or "the gods" of mythology. After all, many of the powers of the Greek gods of mythology are now within our own. Those abilities which we recognize as god-like may certainly be within the grasp of human-created technology and artificial intelligence.
So the goal for Singularity followers and Kurzweil is to stay alive until 2045, or around then, when the exponential growth they predict reaches its intersection with the Singularity. I have written about how appealing yet complicated this is three times myself (for instance here), so any reader of my blogs is likely tired of hearing that I am attracted by the desire to live forever, but at the same time philosophically bound to mortality. This was especially evident in an obvious way that I had not considered exactly before. The earthquake and tsunami in Japan remain an easy reminder that stopping aging only succeeds in prolonging life, which is an important goal. After all, that is what hospitals and medicine exist for. That is what the NIH is for. That, in some ways, is also what the Singularity is for. What it also seems to be suggesting is something metaphysically unavoidable, the deeper existential reality that no matter how long we may live, we will in fact one day die: a tsunami, a car crash, or a bomb dropped on our homes. The goal of the technology has to be practical, not metaphysical. Death is reality, Singularity or not.

3 comments:

David Larkin said...

The politics and cost of immortality may well emerge as the cause of more violence and unrest than anything else this century. Doubtless radical life extension treatments, be they medical or cybernetic, will be extremely expensive, at least at first. Will society permit the wealthy and powerful to become immortal while the rest of us die? It's a chilling notion for the rest of us to be ruled by immortal bionic cyber gods....

Unknown said...

That is of course a point, but like patented drugs that become generic, laptop computers that become cheap, or even clean drinking water, the technology will eventually reach the less wealthy. There may be some violence along the way, but since it is likely not an instantaneous switch from being mortal to being immortal, it will seem less of a shock. I still think the larger question is not sociological but existential. If we frame it the way you suggest, then it is bad for society to make any technological progress, since all of it is expensive at first. If instead we think of it as a moral imperative to pursue the technology, while holding the psychological realization that we won't achieve immortality no matter what, then we stay sane in the process of progress.

David Larkin said...

I think this is a different order of technological progression than having a better TV than your neighbor, or a Ferrari. This is not just any technology but one that elevates a few to "godhood," while in the interim, which I suspect will last longer than the patent on a drug, the rest of us will remain mortal. Also, when Ray K succeeds in transferring his consciousness to cloud-based AI, he will be able to copy himself infinitely, giving him an infinite advantage over the rest of us. This could be "creative destruction" applied not to our economy but to our species.