A Battle of Determinisms: Kelly vs Kaczynski
A central, and perhaps the most revealing, part of Kevin Kelly's recent book What Technology Wants is his discussion of the argument of Theodore J. Kaczynski, a.k.a. The Unabomber. Kaczynski's collected writings were published almost simultaneously with Kelly's book, and the critical reader can now compare the two arguments in their "definitive" book form. Here I will explore what is wrong with Kelly's reasoning in particular. Kaczynski and Kelly do seem to share one basic conviction, however, and it can be illuminating to analyze the differences within this apparent similarity.
Kaczynski is known for having become so enraged by the impact of modern technology on human society that he decided to kill to make his point. This tactic, I'm sorry to say, was quite effective. I can think of no other critic of modern technological society who has come even close to the media coverage bestowed on "The Unabomber" and his ideas. Kelly is also very well known, being a founder and the first editor of Wired, for example, but his way of going about things has been more peaceful and genial. His ideas are also, at least superficially, contrarian in their own way, but still decidedly more mainstream.
This mainstream quality comes out clearly in statements like this: "[If the] state of happy poverty is so desirable and good for the soul, why do none of the anticivilizationists live like this?" (p 211). This, in effect, is all that Kelly has to say against Kaczynski's argument as presented in "Industrial Society and Its Future" (now in its final form in the book Technological Slavery). Kelly's reasoning on this point is really incongruous, because the title of his critique of Kaczynski is "The Unabomber Was Right". What is going on here?
Pretending to Know
Kelly first. Why does he think that Kaczynski is "right", when he obviously does not share his standpoint?
Ted Kaczynski [...] was right about one thing: Technology has its own agenda. [...] technology is a dynamic, holistic system. [...] technology seeks and grabs resources for its own expansion. It [...] transcends human action and desires. [What Technology Wants, p 198]
Consequently, we must believe that Kelly means what he says when he writes:
As the most powerful force in the world, technology tends to dominate our thinking. Because of its ubiquity, it monopolizes any activity and questions any nontechnological solution as unreliable and impotent. Because of its power to augment us, we give precedence to the made over the born. Which do we expect to be more effective, a wild herb or an engineered drug? [...] We have been imprisoned in the technological framework of what the poet William Blake called "the mind-forg'd manacles". [p 194]
This force, which according to Kelly is unstoppable, he calls the Technium. Not incidentally, it is interesting to compare the last quote with a passage in Jaron Lanier's recent You Are Not a Gadget. In connection with a critique of materialistic computationalism, Lanier makes this important point:
I acknowledge that there are dangers when you allow for the legitimacy of a metaphysical idea (like the potential for consciousness to be something beyond computation). No matter how careful you are not to "fill in" the mystery with superstitions, you might encourage some fundamentalists or new-age romantics to cling to weird beliefs. "Some dreadlocked computer scientist says consciousness must be more than a computer? Then my food supplement [a wild herb perhaps] must work!"
But the danger of an engineer pretending to know more than he really does is the greater danger, especially when he can reinforce the illusion through the use of computation. The cybernetic totalists [that is Kelly, for one] awaiting the Singularity are nuttier than the folks with the food supplements. [p 43]
It seems that Lanier is also in at least partial agreement with Kaczynski, but he sees technology, in the sense of Kelly's Technium, not as an unstoppable force but as a religion, a matter of (blind) faith.
I will return to this core matter shortly. First, let us recall what it is that Kelly pretends to know. He is not an engineer, but he very much thinks like one, of the kind that writes starry-eyed science fiction, and then fatally mistakes it for non-fiction. (I sincerely love science fiction, but not on the basis of a blind faith in science.) He states his fundamental premise like this:
The technium [is] the seventh kingdom of life. It extends a process begun four billion years ago. Just as the evolutionary tree of Sapiens branched off from its animal precursors long ago, the technium now branches off from its precursor, the mind of the human animal. Outward from this common root flow new species of hammers, wheels, screws, refined metal, and domesticated crops, as well as rarefied species like quantum computers, genetic engineering, jet planes, and the World Wide Web. [p 49]
In "fact", modern technology, according to Kelly, is the natural and inexorable continuation of not only life on earth, but of the origin of the universe itself: "In a very real sense our entry into a service- and idea-based economy is a continuation of a trend that began at the big bang" (p 68). This is the essence of Kelly's stance, the consequences of which seem, simultaneously, to make him quite uncomfortable, as when he has to concede that Kaczynski is "right".
"If the system survives, the consequences will be inevitable"
Laying aside the genial prose, and meandering personal ambivalence, of Kevin Kelly, it is a shock of icy cold to enter the precise and starkly consequential thought of T.J. Kaczynski. What follows from his premises, as far as the course of technological development is concerned, is no less inexorable than what follows from Kelly's, but Kaczynski entertains no illusions of cosmological significance. He sees the process, rather, as a matter of human stupidity and a collective inability to see what is going on. And that is why he not only philosophized about it, but acted against it, albeit in a misguided way.
While it is quite easy to see through the sillier sides of Kelly's book, it is a decidedly more demanding task to find the holes in Kaczynski's "manifesto". And that is also why not only Kelly, but other technophiles as well, like Bill Joy, are so taken by it; after all, they appreciate a rational argument when they see one.
Kaczynski's basic premises are:
Industrialization and its consequences have "greatly increased the life expectancy of those of us who live in 'advanced' countries, but they have destabilized societies, have made life unfulfilling, have subjected human beings to indignities, have led to widespread [...] suffering [...] and have inflicted severe damage on the natural world. The continued development of technology will worsen the situation." [p 38 in the book version]
"if the system survives, the consequences will be inevitable: There is no way of reforming or modifying the system so as to prevent it from depriving people of dignity and autonomy." [ibid.]
"the bigger the system grows, the more disastrous the results of its breakdown will be" [ibid.].
However, the fact that there are systemic consequences of industrialization and technological development does not mean that its multi-faceted dynamics are coherent. There exists an economic model which we, with Douglas Rushkoff, may call corporatism, and there exist strong political incentives to implement this model in the real world, but the actual processes of technological innovation and development do not, in themselves, obey the rationality of that model. Rather, the political implementation of the model filters or channels the manifold technological implementations that continually crop up, most of them disappearing in the process. Thus, Kaczynski's argument can be construed as being more a critique of an economic model requiring certain kinds of technology than a critique of technology per se, with its myriad possibilities.
Kelly misses this rather obvious point, because he is so taken with the premise that "technology" is an entity, and not some kind of indeterminate ecosystem which is being politically and economically manipulated to certain ends rather than others. Even more than Kaczynski, Kelly is, in this regard, guilty of essentialism, and does not really appreciate the Darwinian nature of ecological systems, be they natural or artificial. Rushkoff's corporatism is precisely an attempt to tame this basically Darwinian dynamic, to construct the entity that Kelly thinks of as something cosmologically given. And since it is constructed, it can be destroyed, which is what Kaczynski wants.
So the reason Kelly feels obliged to agree with Kaczynski seems to be his essentialist take on technology, which he recognizes in Kaczynski's all-encompassing condemnation. But what Kaczynski still, in effect, sees as a matter of (violent) politics, Kelly sees as a cosmological force. To that extent Kelly is, in Lanier's terms, "nuttier" than Kaczynski, because according to Kelly's faith the Technium really is unstoppable, and therefore it must be good: "The great difficulty of the anticivilizationists is that a sustainable, desirable alternative to civilization is unimaginable" [p 210]. "We willingly choose technology, with its great defects and obvious detriments, because we unconsciously calculate its virtues" [p 215]. Kelly does not explain how it can be that we "choose" something which is as inexorable as gravity. The choice then can, in fact, only be which subjective attitude one adopts toward technological determinism; according to Kelly, it is basically misguided not to feel good about it.
Of course, there is more to the matter than this, especially in Kaczynski's case. If we disregard the conceptualization of technology/industrialization in terms of an "entity", and view this as just a shorthand way of expressing something quite complex, albeit with a certain more or less contingent direction, we still have to grapple with the moral force of Kaczynski's reasoning. Consider this:
Suppose the system survives the crisis of the next several decades. By that time it will have to have solved, or at least brought under control, the principal problems that confront it, in particular that of "socializing" human beings; that is, making people sufficiently docile so that their behavior no longer threatens the system. That being accomplished, it does not appear that there would be any further obstacle to the development of technology, and it would presumably advance toward its logical conclusion, which is complete control of everything on Earth, including human beings and all other important organisms. [Technological Slavery, p 89]
From this point of view Kelly's stance can be read as designed to further our collective docility.