I’m finding it difficult to reply to Jaron Lanier’s essay, because I’m finding it difficult to extract an actual point from the text.
His essay starts off with a factual howler about biology — nobody who has ever seen the effect of a point mutation in the homeobox genes of a fruit fly maintains any illusion that in genetics there is “smoothness in the relationship of change in information to change in physicality.” Jaron wants us to see software as uniquely brittle compared to biological systems, but genetic defects turn out to be a very poor argument for that proposition. DNA and digital media both rely on error-correcting codes, and both codes can fail. The actual reason we don’t see more of the failures is not any putative robustness of biology but the unpleasant fact that most victims die out of sight in their mothers’ wombs.
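To make the parallel concrete, here is a minimal sketch of the classic Hamming(7,4) error-correcting code; the 4-bit test word and the bit positions flipped below are arbitrary choices for the demonstration. A single flipped bit is repaired transparently, but a double flip is silently "corrected" to the wrong answer. Error-correcting codes, in silicon or in cells, degrade gracefully up to their design limit and catastrophically beyond it.

```python
# Minimal Hamming(7,4) sketch. The 4-bit word and the flipped positions
# below are arbitrary choices for the demonstration.

def encode(d1, d2, d3, d4):
    p1 = d1 ^ d2 ^ d4                       # parity over positions 3, 5, 7
    p2 = d1 ^ d3 ^ d4                       # parity over positions 3, 6, 7
    p3 = d2 ^ d3 ^ d4                       # parity over positions 5, 6, 7
    return [p1, p2, d1, p3, d2, d3, d4]     # codeword positions 1..7

def decode(c):
    p1, p2, d1, p3, d2, d3, d4 = c
    s1 = p1 ^ d1 ^ d2 ^ d4                  # recompute the three checks
    s2 = p2 ^ d1 ^ d3 ^ d4
    s3 = p3 ^ d2 ^ d3 ^ d4
    syndrome = s1 + 2 * s2 + 4 * s3         # position of a single-bit error
    if syndrome:
        c = c[:]
        c[syndrome - 1] ^= 1                # "repair" the indicated bit
    return [c[2], c[4], c[5], c[6]]         # recovered data bits

word = [1, 0, 1, 1]
code = encode(*word)

one_flip = code[:]
one_flip[4] ^= 1
assert decode(one_flip) == word             # single error: fixed, invisibly

two_flips = code[:]
two_flips[1] ^= 1
two_flips[5] ^= 1
assert decode(two_flips) != word            # double error: miscorrected, silently
```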
The essay continues with a vulgar error about technology lock-in effects. I yield to few in my detestation of Microsoft and all its works, but S.J. Liebowitz and Stephen E. Margolis exploded the lock-in myth quite definitively in “The Fable of the Keys” and their follow-up book Winners, Losers, and Microsoft. Vendor “lock-in” cannot survive the end-stage of lock-in success, in which the monopolist, having achieved market saturation, must push prices forever upwards on its fixed-size customer base to maintain the rising returns that Wall Street demands. Eventually the expected cost to customers will exceed their cost to transition out of the technology, and the monopoly will melt down. This is why TCP/IP is king today and proprietary networking technologies only fading memories. It has already happened to Microsoft in the financial-services sector and the movie industry, and the handwriting is on the wall elsewhere.
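The arithmetic of that meltdown is easy to exhibit. The following toy model is mine, not Liebowitz and Margolis’s, and every number in it is invented for illustration; the point is only that compounding price increases levied on a captive customer base must eventually cross any fixed one-time cost of escape.

```python
# Toy lock-in model; all numbers invented for illustration.

license_fee = 100.0     # hypothetical per-seat fee in year zero
growth = 0.15           # fees compound 15%/year to satisfy Wall Street
migration_cost = 3000.0 # hypothetical one-time cost of switching away
horizon = 10            # years a customer plans over

for year in range(1, 15):
    # expected cost of staying locked in for the next `horizon` years
    stay = sum(license_fee * (1 + growth) ** (year + t) for t in range(horizon))
    # cost of leaving: pay the migration once, then (say) nothing
    if migration_cost < stay:
        print(f"year {year}: staying costs {stay:.0f}, "
              f"leaving costs {migration_cost:.0f} -> customers defect")
        break
```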
Jaron then takes a swing at the computer-science concept of a “file” without acknowledging a simple fact — information has to be transported. It’s all very well to speak of linked digital content and seamless webs of information, but at some point, these lovely ramified entities have to be moved between the hothouse environments in which they can flourish. At which point, willy-nilly, you will find yourself packing all their manifold complexities into a block of data with a name that you can push around as a unit. In other words, a file.
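The point can be demonstrated in a few lines. In this sketch Python’s pickle stands in for any serializer, and the two cross-linked documents are my own contrivance: even a cyclic web of linked content ends up flattened into a named block of bytes before it can go anywhere.

```python
# Two documents that link to each other: a tiny "seamless web" of content.
import pickle

doc_a = {"title": "A", "links": []}
doc_b = {"title": "B", "links": [doc_a]}
doc_a["links"].append(doc_b)        # cycle: A <-> B

with open("web.bin", "wb") as f:    # a name you can push around as a unit
    pickle.dump(doc_a, f)           # the whole linked structure, serialized

with open("web.bin", "rb") as f:
    restored = pickle.load(f)

# the cycle survives the round trip through the named block of bytes
assert restored["links"][0]["links"][0] is restored
```

However seamless the web of links, what actually travels over the wire is that named unit.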
Jaron’s sally at the Unix command line is scarcely less naive. My own The Art of Unix Programming makes the technical case that this supposedly primitive form of interface retains some major advantages over GUIs and “virtual reality”. For a more humanist argument, see Neal Stephenson’s brilliant essay In the Beginning was the Command Line. It is no accident that towards the end of his life, the grand old man of the GUI (Jef Raskin) rejected icons and moved back towards text-centered gestures as the center of his work on “humane interfaces”.
By the time Jaron gets to claiming that the Web and video games are “based on an underlying logic that reflects the command line”, this assertion has been reduced almost to meaninglessness. Jaron wants to blame our inability to get virtual reality beyond the toy-and-demo stage on fixed ideas, but the real problem with VR is far more fundamental. It’s what flight students call simulator sickness — people get nauseated and disoriented when their eyeballs and their inner-ear attitude-and-motion sensors keep sending them conflicting messages. Jaron invented the label and concept of “virtual reality”; his ire at the command line seems to me to be a pure displacement of an understandable frustration that VR just won’t work on humans.
Jaron then goes on to confuse partial openness with ambiguity. In fact, partial openness is quite easy to achieve in software; many websites, for example, have both public and passworded content. Ambiguity is a little more difficult, but nowadays fuzzy logic and satisficing algorithms are so well established that they’re used in the firmware for washing machines. It isn’t that we don’t know how to do the things Jaron points at; it’s that there is not enough market demand for them to stimulate big deployments. Or, to put it slightly differently, most human beings don’t actually want them enough to pay for them.
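For the skeptical, here is a toy fuzzy controller of the washing-machine sort; the scales, membership functions, and rule table are all invented for illustration. Crisp sensor readings come in, partially-true rules blend, and one definite wash time comes out. Ambiguity, handled in two dozen lines.

```python
# Toy fuzzy controller; scales, memberships, and rules invented for illustration.

def tri(x, a, b, c):
    """Triangular membership: 0 at a and c, rising to 1 at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def wash_minutes(dirt, load):
    """dirt and load on a 0..10 scale; returns a wash time in minutes."""
    # fuzzify the crisp sensor inputs
    dirt_low, dirt_high = tri(dirt, -1, 0, 6), tri(dirt, 4, 10, 11)
    load_small, load_big = tri(load, -1, 0, 6), tri(load, 4, 10, 11)

    # rule strengths (min = fuzzy AND), each rule endorsing a wash time
    rules = [
        (min(dirt_low,  load_small), 20),   # lightly soiled, small load
        (min(dirt_low,  load_big),   35),
        (min(dirt_high, load_small), 45),
        (min(dirt_high, load_big),   60),   # filthy and full
    ]
    # defuzzify: weighted average of the endorsed times
    total = sum(w for w, _ in rules)
    return sum(w * t for w, t in rules) / total if total else 30.0

print(wash_minutes(dirt=8, load=3))   # mostly the "45 minute" rule fires
```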
I think there is considerable value in Jaron’s concept of an “antigora”, but by the time I got to that part of the essay I had nettle marks all over me from the preceding thicket of errors. And, alas, they continue; when he talks about computer users spending “astonishingly huge and increasing amounts of time updating software patches, visiting help desks, and performing other frustratingly tedious, ubiquitous tasks” he is mistaking a contingent property of Microsoft Windows for an essential property of software. Users of Linux and MacOS know that it doesn’t have to be this way.
I’m a Linux fan myself, and experience orders of magnitude less in the way of software pain than my Windows-using neighbors. And I observe that MacOS users experience significantly less pain than I do, if at significant cost in lost flexibility and options. These successes show that good user interfaces and robust software are not unattainable mysteries, they’re just engineering problems, albeit rather difficult ones. Thus, we should be wary of drawing larger conclusions from Microsoft’s incompetence.
We should also be wary of drawing too hard a distinction between antigoras and agoras. Human beings being what they are, they subvert antigoras to their own purposes and frequently turn them into agoras. Because I helped invent it, I know that the open-source culture Jaron uses as his example of an agora didn’t arise out of a vacuum; the space Linux needed to grow was wrestled out of vendor antigoras one step at a time over the two decades before 1991.
But the blurriness of the boundary between agoras and antigoras isn’t just a contingent historical fact. When Jaron talks about the “gift economy” of agoras, he’s using concepts and terminology that I introduced into the discourse of the Internet around 1999. He seems not to have noticed, unfortunately, how my analysis also shows that his “antigoras” are actually reputational agoras (Michael Goldhaber and others have since popularized this idea under the rubric of the “attention economy”).
In the other direction, agoras morph into antigoras when they need capital concentrations to keep functioning; one good example of this is IMDB. Wikipedia may be beginning a similar transition right now. This isn’t to be feared, it’s just an adaptive response — nobody, after all, is actually forced to “slave” in an antigora. I think one of the consequences of communications costs being driven towards zero is that social organizations are more likely to undergo such phase changes during their lifetimes.
Thus, what Jaron writes up as “farce” I think is a real and sober possibility, and a very hopeful one. When he says “My argument in brief is that the gift economy aspect is so good that we put up with the slave economy aspect,” I largely agree, but I would add that this is so mainly because on the Internet it is easier to go from “slave” to participant in a gift exchange than Jaron admits—or, perhaps, allows himself to understand.
Or, perhaps, Jaron does understand, but hasn’t connected his understanding with his questions yet. When he says “The Web is neither an emergent intelligence that transcends humanity,[…] nor a lifeless industrial machine. It is a conduit of expression between people,” he is absolutely right. It’s nice to be able to agree with this formulation, even if Jaron’s rhetorical path to it seems to me to have been riddled with obvious mistakes.
If this were in fact the point of Jaron’s essay, we could stop there in agreement. But the actual point seems to be to maintain an opposition between capitalism and (gift) culture that I think is again mistaken. As I pointed out years ago in Homesteading the Noosphere, gift cultures rely on a hefty wealth surplus to keep them afloat. While there are many ways to concentrate such a surplus (patronage by one tyrant or a group of aristocrats can do it), capitalism is the only way to do it that scales up well. Capitalism is every gift culture’s best hope for sustainability.