Singularity IV Section F: Will Computers Become Super Human
Jaron Lanier takes on the "cybernetic totalists"
Jaron Lanier, a pioneer of virtual reality, a musician, and currently the lead scientist for the National Tele-Immersion Initiative, finally decides to take on Kurzweil, Drexler and others, whom he calls the "cybernetic totalists".
"There is a real chance that evolutionary psychology, artificial intelligence, Moore’s Law fetishzing, and the rest of the package will catch on in a big way, as big as Freud or Marx did in their times. Or bigger, since these ideas might end up essentially built into the software that runs our society and our lives." Lanier warns, "If that happens, the ideology of cybernetic totalist intellectuals will be amplified from novelty into a force that could cause suffering for millions of people."
In other words, we could be misled into shaping our future according to this dreadful vision that will become a self-fulfilling prophecy.
Lanier refers in particular to the "eschatologies" of Kurzweil, Moravec and Drexler, which seem to follow directly and inevitably from an understanding of the world that has been most sharply articulated by biologist Richard Dawkins and philosopher Daniel Dennett.
Richard Dawkins is responsible for the idea that evolution can be understood in terms of competition between ‘selfish’ genes, whose only mission is to replicate. The organism is merely the gene’s way of begetting another gene. Ideas are said to spread in an analogous way as bit-ideas or ‘memes’. Dennett subscribes wholeheartedly to this view, and sees humans as simply specialized computers.
So, cybernetic totalists look at culture and see ‘memes’ that compete for brain space in humans, rather like viruses. And once we have reduced ideas to meaningless bits, then "any particular reshuffling of its bits seems unimportant". "We kid ourselves when we think we understand something, even a computer, merely because we can model or digitise it," Lanier says.
There’s a plethora of recent theories in which the brain is said to produce a random distribution of subconscious ideas that compete with one another, Darwinian fashion, until only the best survives. But do these theories really fit with what people do?
Over a decade of work worldwide on Darwinian approaches to generating software has produced nothing that would make software in general any better, Lanier points out.
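To make concrete what such "Darwinian approaches" amount to in practice, here is a minimal sketch (my own illustration, not Lanier's) of a genetic algorithm: candidate strings are randomly mutated and the fittest are kept until a predefined target emerges. The target string, mutation rate and population size are arbitrary choices for the example.

```python
# Toy illustration of a "Darwinian" approach to generating software:
# evolve candidate strings toward a fixed target by mutation and selection.
import random
import string

TARGET = "hello world"
ALPHABET = string.ascii_lowercase + " "

def fitness(candidate: str) -> int:
    """Number of characters that already match the target."""
    return sum(c == t for c, t in zip(candidate, TARGET))

def mutate(candidate: str, rate: float = 0.05) -> str:
    """Randomly replace each character with probability `rate`."""
    return "".join(random.choice(ALPHABET) if random.random() < rate else c
                   for c in candidate)

def evolve(pop_size: int = 200, generations: int = 1000) -> str:
    population = ["".join(random.choice(ALPHABET) for _ in TARGET)
                  for _ in range(pop_size)]
    for gen in range(generations):
        population.sort(key=fitness, reverse=True)
        if population[0] == TARGET:
            return f"solved in generation {gen}"
        # Keep the fittest half, refill the rest with mutated copies.
        survivors = population[:pop_size // 2]
        population = survivors + [mutate(random.choice(survivors))
                                  for _ in range(pop_size - len(survivors))]
    return f"best after {generations} generations: {population[0]!r}"

if __name__ == "__main__":
    print(evolve())
```

Note that the method only works because a fitness function spelling out the goal is supplied in advance; this is why such techniques solve toy problems yet, as Lanier observes, have not made software in general any better.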
People have confused ideal computers with real computers. Although ‘self-reproducing automata’ are possible in theory, someone’s got to write the software that gets the process going, and humans have given absolutely no evidence of being able to write such software.
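The "possible in theory" part can be made concrete with a quine, a program that prints its own source code (again my illustration, not the article's); the point remains that a human has to write the seed program in the first place.

```python
# The two lines below form a minimal self-reproducing program (a "quine"):
# when run on their own, they print themselves exactly. Self-reproduction is
# possible in principle, but a human still has to write the seed by hand.
s = 's = %r\nprint(s %% s)'
print(s % s)
```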
Just because a computer gets faster doesn’t mean it gets smarter. Moore’s Law in hardware development must be starkly contrasted with "the Great Shame of computer science," says Lanier, "which is that we don’t seem to be able to write software much better as computers get much faster".
What worries Lanier is decidedly not that we shall be enslaved by computers. His own vision of terror is this: the biotech industry is setting itself up for decades of expensive software trouble. While all sorts of useful databases and modelling packages have been developed by biotech firms and labs, they all exist in isolated bubbles, so vast resources will be expended getting data from one bubble to another.
"If Moore’s Law is upheld for another 20 to 30 years, there will not only be a vast amount of computation going on planet earth but also the maintenance of that computation will consume the efforts of almost every living person." We may end up with a planetful of help desks!
I have argued that this is precisely why it is time to abandon the mausoleums of useless genomic information and employ scientists in more imaginative, less mind-numbing research.
The solution is to keep all genomic data in the public GenBank database, where they can be accessed freely by everyone, everywhere.
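As a sketch of what such free access already looks like in practice, the snippet below fetches a record from GenBank through NCBI’s Entrez service using the Biopython library; the e-mail address and accession number are placeholders, not values taken from the article.

```python
# Minimal sketch: download a GenBank record via NCBI Entrez with Biopython.
from Bio import Entrez

Entrez.email = "your.name@example.org"  # NCBI asks for a contact address

def fetch_genbank_record(accession: str) -> str:
    """Return a single GenBank record as plain text."""
    handle = Entrez.efetch(db="nucleotide", id=accession,
                           rettype="gb", retmode="text")
    try:
        return handle.read()
    finally:
        handle.close()

if __name__ == "__main__":
    # Replace the placeholder with a real accession number before running.
    print(fetch_genbank_record("REPLACE_WITH_ACCESSION")[:500])
```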
Lanier concludes: "Treating technology as if it were autonomous is the ultimate self-fulfilling prophecy. There is no difference between machine autonomy and the abdication of human responsibility."