One of the heroes of modern democratic society is the whistleblower: the brave individual who trades in a lifetime of insider status for a brief (selfless?) moment in the spotlight. If he is lucky, the public takes seriously the information he brings to the front page. If he is very lucky, entire structures fall at the sound of his bugle.
There is an increasingly high-profile variation on the traditional whistleblower these days. He comes from the future. He does not bring with him dishes of dirt on this or that institution, but rather awakes from his research-based visions pale-faced and full of dread. His predecessor is not the CIA rat, but the laboratory seer. The template for his role was cut by the first generation of atomic scientists, who went from revolutionaries to reactionaries after witnessing the Blast and reflecting upon its meaning.
The latest in a long line of defectors from the bullet train of modern technology is Bill Joy, renowned scientist, presidential appointee and, by all accounts, smart and serious man. Mr. Joy of Sun Microsystems recently warned in Wired magazine, of all places, that current research in the fields of robotics and nanotechnology will lead to the potential for the "destructive self-replication" of artificial intelligence. Rapid advances in the field of molecular electronics in particular could allow computers to match the capacity of the human brain as early as 2030. Because these thinking machines could then reproduce themselves and enslave the human race, Mr. Joy considers further research in this direction a bad idea. He even expressed muted sympathies for the Unabomber.
Happier futurists have since told Joy to relax. They argue that what is needed is simply more technology to overcome whatever speedbumps emerge on the bright road ahead. Channeling Timothy Leary from beyond inner space, Alvin Toffler reassures us that people will grow smarter along with technology in the future, thus maintaining mastery.
I prefer the skeptics to the boosters, but the argument is almost certainly an academic one. Technology has a nasty habit of being developed whether we wish it or not. The best that can be hoped for is that sensible policies prevail in controlling technology as it advances; that some measure of public control is exerted over blind commercial forces.
But we mustn't wait for killer komputers to step out of science fiction and into our living rooms to find targets upon which to train our human concern. The fact that nuclear missiles are currently wired to complex systems is enough for me. That there was a chance, however small, that the Y2K bug could have triggered Doomsday leads me to think that Bill Joy is a little late to the game.
And one can get much more mundane in criticizing our silicon-coated culture than ICBMs. Witness what 'computer time' has done to our collective attention span and our shared memory. Communities, even global ones, are built and sustained by connections to a common past, and computer culture has no need for a past. This can be seen in the near total lack of history on the internet, as well as the transitory nature of what appears on the web on any given day. Should the internet come to replace the printed word, we will commence writing the first blank spot in human history. It is hard enough finding documents from the Bush administration online, let alone the Federalist Papers.
But if the political consequences of a culture deprived of the lessons of History await us, the social lessons do not. Psychologists have been reporting for years the punishment inflicted on those who spend lives ordered not by the minute but by the nanosecond, not by the rhythms of human interaction but by the protocol of the porn site. It is now a well documented fact that long-term computer use creates irritable and uninformed hyperconsumers of bits and bytes. A technology-dependent society is a dead end. We needn't wait for HAL, or Bill Joy, to tell us that.