Science or Fiction

Bruce Sterling at the American Center for Design ``Living Surfaces'' Conference, San Francisco, October 1994

[Note: I received this on a mailing-list and have not verified the text, nor has it appeared at Mr. Sterling's gopher site yet. It seems, however, quite authentic, and as such I presume the usual strictures apply, i.e., that this is copyrighted ``literary freeware.'' CRS]

Hello ladies and gentlemen, my name is Bruce Sterling, and I'm a science fiction writer from Austin, Texas. I want to talk a bit today about the limits of the computer revolution. I doubt that it comes as news to anybody in this building that we're in the midst of a computer revolution. Personally, I've spent absurd amounts of time contemplating this spectacle. In fact, you might say that for my generation in science fiction writing, the computer revolution is our premier cross to bear.

To be true to our historic enterprise, we science fiction writers must comprehend the nature of the computer revolution. Or, rather, attempt to comprehend it. Or rather, to be completely frank with you, we must publicly pretend to attempt to comprehend it. I hope that I can publicly share that pretense of mine with you this afternoon.

The power of the computer revolution is obvious. No sooner has computation saturated one area of our lives than it floods into another. I'm reminded of the famous apocryphal story of a midwestern mayor who was shown a telephone for the first time back in the 1880s. The mayor was enormously impressed by Alexander Graham Bell's high-tech device and declared that he could foresee a time when every town would have its own telephone.

Similar slap-happy prognostications were made about the demand for computers, some thirty years ago. Now there are personal computers in thirty percent of American households, and the chips themselves infest practically every household object of any commercial significance: cars, microwaves, refrigerators, stereos. Even wristwatches, even doorknobs.

So we've witnessed this enormous tide surging inexorably through society --- starting maybe in Alan Turing's espionage establishment in the 1940s, through the military of World War II and after, through academia, through government, through the entire vast corpus of business from the multinational supercorporate high end, all the way down to the individual desktop mouse-potato freelancer with his Mac and his bathrobe. Now we see it gusting and surging through the world of entertainment, so that it seems a movie can't get released in the 1990s unless somebody morphs in the third reel. Computation surges through television with its profusion of marble and chrome flying logos, through publishing where magazines maintain an online presence and authors submit articles via Internet.

We see computation subtly infiltrating the texture of basic human social relations, so that online singles can chat over computer networks with like-minded souls from all over the planet. What a joy to have a modem and be so thoroughly in touch! Meanwhile, the hall outside the typist's lonely bedroom fills with the reek of gently rotting flesh from that little old lady at the end of the hall who died two weeks ago, utterly alone and unnoticed, while her PC has continued to pay the rent and the bills without any human interference of any kind.

Examples of the power of computation are becoming commonplaces now; they no longer seem particularly wondrous, and in very short order we ourselves will seem ridiculously antiquated for even bothering to marvel at them. So this afternoon, instead of marveling at length about what we can do with computers and what computers are stealthily doing to us while we're too busy to notice, I'd like to try another tack and stretch our imaginations from a different direction. I'd like to speculate a bit, in science fictional fashion, about where this all might stop. Where are the limits? This has been a century of dead revolutions --- no revolution continues forever --- and the Computer Revolution must be of a piece with this phenomenon, or at least one would think so.

So let's consider the matter of limits. Where are they? What do they look like? Where are the bottlenecks and cul-de-sacs and ambush points? Where are the dams and the levees and the folks stacking sandbags?

How about, say, physical limits? After all, computers are physical machines. They take up space, they use energy. They use plenty of energy, too; it's been estimated that computers in the United States alone consume about as much electrical energy as the entire nation of Brazil. But is this a genuine limit? Not really. Chips will become more energy-efficient, so will screens, so will drives and modems, or whatever the future mechanical equivalents may be for those devices.

We do know something useful about the speed of chip development. Thanks to Gordon Moore, the famous chip designer, we know that the computational power available on a state-of-the-art chip will roughly double about every eighteen months. Chip designers have been getting very arty with silicon lately; the substance is increasingly well understood; and the pace of technical improvement has recently been slightly outpacing Moore's Law. We can do molecular beam epitaxy, and micrometer-sized semiconductor lasers, and all kinds of neat stackable layered chip geometries: sandwiches, posts, gridirons, thumbtacks, honeycombs and cavities and concentric rings, all in silicon geometries too small to see.
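Taken at face value (and it's only a rule of thumb, not a law of physics), that doubling is an exponential: after $t$ years, a state-of-the-art chip delivers computational power of roughly

$$P(t) \approx P_0 \cdot 2^{t/1.5},$$

where $P_0$ is today's figure. Let it compound for a decade and you're looking at $2^{10/1.5} \approx 100$ times today's horsepower.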

What's more, as Richard Feynman once said, there's plenty of room at the bottom. We're not yet approaching the limits of what can be inscribed on silicon with X-ray lithography, but there are probably nano-fabrication techniques coming that will make lithography look elephantine. After silicon: organic computation, really cool, really sexy, really powerful and really, really dangerous technologies.

Computation is a technology whose physical limits are receding faster than we can pursue them. There is, however, one essential limit: the speed of light. The speed of light is a reassuringly hard limit. In a nanosecond a pulse of electricity can only travel so far. Quite a short distance. About this far, actually. Admiral Grace Hopper used to carry a bundle of nanoseconds with her, little severed lengths of stiff wire, and when she had to explain satellite communications lag to some crusty fellow admiral who lacked her high-tech expertise she would say, ``You see, Admiral, the reason it takes so long is because our satellites are many, many nanoseconds away.''
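Her wires weren't a stage prop; the arithmetic checks out. In one nanosecond, light in a vacuum covers

$$d = c \, t \approx (3 \times 10^{8}\ \mathrm{m/s}) \times (10^{-9}\ \mathrm{s}) = 0.3\ \mathrm{m},$$

just under a foot, and an electrical pulse in actual wire travels somewhat slower than that. Hence the length of the Admiral's nanoseconds.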

But even though these are hard limits, they aren't that hard to work around, practically speaking. If you can't build a chip a hundred times as big, you can build a hundred chips and link them up. Or link up a million chips, for that matter. Or, when you come right down to it, why not just do the problem slowly? If the problem's doable, you can have machines chew away at it in their spare time, and computers tend to have lots of spare time. Whole legions of nanoseconds zip by while they're waiting for people to type.

So maybe we should search elsewhere for a more genuine and pressing set of limits --- the limits within human beings. We've luckily been spared the advent of genuine artificial intelligence, at least so far. So far, all the orders and the initiative in the world of computation are coming from wet mushy human brains built by unskilled labor. But many of those human brains don't seem particularly suited to the labor of computation. This is often considered a grave social problem.

Many people violently hate computers. Some of this sentiment is common or garden future shock, but when you look at the work dispassionately, you can see that there are plenty of excellent reasons to hate computers and computer work. A hefty fraction of the population just doesn't have it in their heads to sit and court carpal tunnel syndrome for days and weeks and years at a time, while pausing periodically to hunt for arcane hints in thick ugly wire-bound manuals. The work is difficult and often unrewarding. Computers are fragile and treacherous and they crash all the time, and using them is fraught with frustration and even deep personal humiliation.

There doesn't seem to be much in the way of a true comfort zone in computer work. If you're a neophyte, you are, of course, very much at a loss. But real expertise doesn't help much either. The better you are with computers, the more likely it is that you'll be working with advanced hardware and software that is way out at the crumbly bleeding edge, where the hookups are dicey and the code is profoundly unstable.

And you rarely experience much genuine improvement in your working conditions, either, because when the programs become more solid and more predictable, then you're already well past that into the next generation of the Release 1.0 bug-riddled screamer-chip. There aren't many people around who can cheerfully put up with that kind of crap, which is why good computer people are a valued minority and tend to get paid a lot. They also act really crazy and behave in peculiar fashions in their social life and have drug problems and divorce rates like you wouldn't believe.

Back in the early 1980s, in the first fine rush of the computer revolution, a lot of educators were very anxious about teaching the entire next generation to program in BASIC. They wanted computer literacy across the board, you see --- no one left out, no one dropped off the edge, no one left behind. No backsliders, no survivors. But how many of us program at all, let alone in BASIC? How many of us need to program, even those of us who use computers every day? Computer literacy, virtue or not, is a moving target. Computer interactivity is also a moving target. There are many ways to affect the activities of a computer without even being aware that you are doing it.

So where are the human limits? What are we supposed to do with these peculiar twin minorities: the tiny minority who can program from the silicon up and who genuinely understand computation, and the cyber-dyslexic minority who won't have any truck with computers under any circumstances? If I were a eugenicist, I would suggest that maybe we ought to interbreed these populations for the safety of the rest of society. But that's just a conceit.

More practically, I would suggest instead that the problem itself is a phantom problem. Human intellectual limits, although very much there, don't really matter all that much. There are, what, 5.7 billion people on the planet right now? Let's assume that one percent of the population can really hack. One percent of that figure would be 57 million people. This is a huge pool of creative talent; it must be as big as the entire population of Europe at the height of the Renaissance. If we can't coax a few decent multimedia programs out of that group, I would suggest that perhaps the fault lies elsewhere.

And if that makes the market smaller, so what? We can just do what Microsoft does. Instead of selling an easy workable program to a vast popular audience of 20 million people, we can sell a difficult, treacherous program to an elite audience of two million people, only we'll sell them the very same program ten times over in different upgrades.

I have often heard people in computing fretting over the purported fact that their mental inferiors can't keep up with the deep technical skills needed for computation. It's odd that I've never heard this said about television (except for VCRs, that is). I've only rarely heard it said about automobiles. Most of us can't fix or understand our televisions, and we can't fix or understand our automobiles either, but this vast ignorance about television and automobiles doesn't seem to bother anybody. We'll let most anybody get behind the wheel of a two-ton vehicle which can travel a hundred miles an hour and kill a dozen people in the blink of an eye. We never demand that they learn anything about the chemistry of oil refining, or about internal combustion. We just let 'em drive the car, and if they're no good at it and kill somebody, well, that's just tough luck!

I think it might be possible to design a computer that's as easy to drive as an automobile. Where you just rent one and sit in the seat and turn the key and get going, without getting enmeshed in the barbed wire of extensions and shells and bell-and-whistle hotkeys and all the rest of it.

I think the extremes of complexity in the human-computer interface may be a passing phase. You shouldn't have to become a portly UNIX freak in order to manage a computer. I suspect, in fact, that it ought to be possible to design computers simple enough for animals to use. After all, do you really need a cellphone? Your cat, that's who needs a cellphone. Who knows where your cat is right now, anyway? Your cat needs a beeper. We already have gophers and lynxes on the Internet; on the Internet nobody knows you're a dog; is there any real technical reason why I can't put my dog on the Internet? I suspect this might be genuinely possible.

I suspect the ultimate Internet link is going to look and act a lot like a make-up case. You won't see any command-line prompts when you use it. It will be a social device, a social-relations technology just like a make-up case is. When you pull it out of your purse and open it and talk face to face to your friends on the other side of the planet, you will feel just about the same kind of glamorous intimate pleasure you feel when you are pulling out and using your compact mirror. The engineers will no longer be in control. Or at least, the engineers won't be trying to one-up one another by building and selling each other macho power-user desktop dragsters full of smoke and burnt rubber and oil fumes.

This brings us to the interesting topic of commercial limits on the progress of computation. The commerce of computation. Computers cost a lot of money and they are out of the financial reach of large sections of the populace that might have real use for computers. But they're not all that expensive. Used ones are quite cheap. It would be very nice if computers were cheaper, obviously, but computers are somewhere in price between a TV and a car. That's not an unreasonable price range considering what computers can do.

The high cost of computers does limit their spread through society. I've often heard it said that we are creating two social castes, the information rich and the information poor. The computer literate and the computer illiterate. I've never heard it said that we're creating two castes of the television rich and the television poor, although we certainly are doing that thanks to the high cost of cable access and video rentals. And what about the automobile rich and the automobile poor? Lack of a car will restrict your life and your possibilities and your most basic freedoms far more effectively than lack of a computer.

Anyway, if being information-rich were all it were cracked up to be, then librarians would be the most powerful people on the planet. Clearly librarians are not, so there's something besides information governing the distribution of power. My experience of power politics has been pretty limited, but I would suggest that great whopping heaps of money have a lot to do with it.

The methods by which we sell software are extremely peculiar. I find it a bit odd that we sell software at all, that there even exists a market per se for this stuff. The Software Publishers Association pretends to find the software market a very natural and sensible enterprise, but the market's sheer peculiarity is why Bill Gates is really rich and why his former hardware overlords at IBM are a hollowed-out corporate shell of their former selves.

Intellectual property doctrine is extremely odd. It seems to bear almost no rational relationship to the physical realities of computation. Software patents are very odd things. Look-and-feel patents are a joke. After all, software is just ones and zeroes. The Internet, in fact all computer networks, even the commercial ones, are all basically big stews of ones and zeroes. You might think that if you wanted some software for your computer then you could just take your naked infant computer, and carry it to the baptismal font of the Internet, and kind of dip it into the broth of ones and zeroes, and it would come out all blessed and inoculated and ready to go.

That's the way we deal with language, after all. When you go to learn the English language, you're not sold English 1.0 in first grade, and sold English 1.1 in second grade, and sold English 2.0 when you go into junior high school. If you can learn your mother tongue at your mother's knee, why can't your dad just give you an operating system? You could say that your dad didn't invent Windows, but hey, your mom didn't invent English, either. It's no more unnatural to do it that way than the other way.

In any case, when you study the economics of the software industry, you discover that the real revenue stream often isn't in the jealously guarded intellectual property per se. People will sell you the program for nothing, just so that they can hook you on the upgrades. The upgrades are a major racket in the whole enterprise. That's extremely odd. You don't find movies doing this --- they will do sequels, but they won't re-release the same movie ten times with new bells and whistles grafted into the plot.

There are probably a lot of other possible ways to arrange the commerce of computation. We see some of them around already --- shareware, freeware, the way people develop Internet protocols like the World Wide Web. There are ways to develop and sell software without the polite pretense that software is a physical retail commodity like a chair or a table. The methods by which we extract money out of computing are somewhat counterproductive. They probably eat the seed-corn, to an extent.

Software patents are a major limitation on the development of software. They are sometimes great revenue generators, but they generate revenue by siphoning energy away from the development of the field. They make life harder for designers and engineers. Of course, you could argue that by making software engineers more easily employable, patents actually advance knowledge. However, if you study the patents on computer techniques and algorithms as those patents currently exist, you'll soon see that these patents are all over the map, more like an anarchic virtual land-grab than a planned development.

The League for Programming Freedom has studied this issue at length, and I recommend their writings to skeptics. I think the League makes a pretty good case for the essential uselessness and counterproductivity of most software patents. Unfortunately the League for Programming Freedom is bravely waving their deck-punching hands in front of the onrushing bus of contemporary capitalism, so that even when they avoid becoming some mere smear on the information highway, people still resent them and more or less think they're nuts.

The Free Software Foundation, which is basically the League for Programming Freedom by another name, has some swell ideas about what ought to be done with software. They think people ought to give software away for nothing, allow software to be copied and distributed by anyone, while the authors make money by selling their expertise in the program. Selling their expertise as a service, in other words. The opposite of what we have now, where companies are desperately eager to shove unstable software into your hands and empty your wallet, and then profess callous indifference when the damn stuff doesn't work.

I think this universal freeware scheme might have worked at one time, but it had a major strike against it, and the major strike was computer viruses. I sometimes like to think of viruses as being a kind of biological limit for computation. A predatory limit, a disease limit, an ecological limit, a plague.

There is a small but vigorous group of people who are prepared to write and give away quite good software for nothing. Unfortunately their numbers are matched by a small but vigorous group of mean-spirited sociopaths ready to write and distribute viruses, for no good reason. Wherever software changes hands outside the standard Western G7 market parameters, viruses also change hands. Without viruses, there might have been a good chance for a freeware paradigm to take hold worldwide, but with viruses, it's very tough. If I'm an Information Wants to Be Free zealot (which I am), I simply can't stand on a streetcorner anywhere in the world with my stack of floppies saying, ``Hey, try this!'' People will shudder away as if I were offering free pints of contaminated blood. It's a great pity.

I don't think this war is over quite yet, however. The boundaries of this limit are still under challenge. If you think that raw commerce has triumphed entirely over the free exchange of data, then try spamming the Internet. The Internet may yet force the software industry to come to a different set of terms with the practices they call software piracy. This is still a very young industry. Those who live by technological obsolescence can die by technological obsolescence. You know what archival storage is for computer graphics? Movie film. Movie film is a hundred-year-old analog medium, and that's the archive for digital graphics. Is that weird, or what? You don't want to store computer graphics on disks --- what, those big fat floppies? Too small, too slow, you can't even read those now. Might as well take 'em to one of those NASA data cemeteries where they keep data from 1960s interplanetary probes on reel-to-reel mag-tapes in a format now lost to humankind.

Multimedia as a means of expression is deeply afflicted by a savage lack of stability. You can't really build monuments while surfing. These limits of obsolescence are basically crippling computer-assisted art. They reduce its cultural scope so dreadfully that most computer art reeks at its core of cheesy disposability. Computer-assisted art is still, in some deep and very painful sense, not art. It doesn't share the set of core values that help other forms of art to persist over generations and embed themselves permanently in the culture. Our society deals with computer art in the same way that it deals with a burnt-out lightbulb; while it glows, we clap our hands and dance; and the moment it stops glowing we grimace in distaste and climb a ladder and unscrew the dead husk and fit in a brand-new one and dump the old one with never a look back.

The work of computer artists is still a minor art, like the work of regional novelists --- very unlucky regional novelists who are working in some regional language which is not only small in scope and hard to comprehend but doomed to vanish forever within a matter of months. There's money around, I'll grant you that. People spend millions of dollars developing computer games, but they last about as long as a comic book lasts. Less long, because the people who do comix have wised up and now they get themselves bound in hard covers.

Comics have a lot in common with computer art. Computer art is like the crazy rich younger sister of comix. Both comix and computer art are niche popular enterprises with a narrowly defined youthful, very male demographic and a relative lack of theoretical underpinning, critical validity, and high-art museum and culturati backing. Comics artists do have one great advantage over computer artists, however, and that is their deep sense of historical continuity. Go into the home of a comix person and you'll see stacks of Winsor McCay's Little Nemo, the Yellow Kid, Burne Hogarth, Frazetta, old MAD magazines, maybe even the etchings of Daumier and Rowlandson, graphic narrative art which is ancient, downright pre-industrial, but all re-issued in modern editions. Go into the home of a CD-ROM developer, though, and you'll see something entirely different: vast stacks of speculative writing on what may become possible someday with virtual reality technology. The difference here is the difference between being rooted in a sense of artistic heritage and being sucked headlong into the future by a massive black hole vacuuming up everything in our society that is most loosely attached.

If computer art is really going to transcend this artistic limit, it is going to have to gain the same kinds of social support system that other art-forms have. It's got money already. What it needs is vision and continuity. Critics. Museum curators. Translators, especially translators --- people who make it their business to port dead programs into modern contexts. But this limit is a major problem. Planned obsolescence is programmed into the very heart of the computer industry. The field is moving so fast, and its zeitgeist is so set on revolutionary transformation, that I question whether a sense of real continuity is even possible.

I take this problem very seriously. I rather suspect that even the best and cleverest contemporary computer artists from the 1990s may end up as tragically marginalized as the Russian Constructivists from the 1910s and 1920s. I recommend learning something about the work and the fate of the Russian Constructivists; I know that as a science fiction writer I've found the lives of these fanatical romantic techno-devotees to be a real object lesson. These were visionary people so wrapped up in the transcendent power of technology and their own imaginings that they could not perceive that their best efforts were serving a very bad end. And they themselves didn't end at all well.

Of course, you may not have time to study artistic case histories from seventy years ago and a continent away; you may be working eighty hour weeks developing content. You miss too much these days if you stop to think. Anyway, maybe it's just gloomy hype to say that those who don't learn history are doomed to repeat it. You may think you're getting along fine without knowing anything about your spiritual ancestors. These are revolutionary times. Revolutionaries never think they have any ancestors. They never believe in their parents. Revolutionaries always think that they are jump-starting history from year zero and reinventing culture, society and government from wire-bound manuals and spit. This century is littered with the corpses of dead revolutions.

The subject of revolution brings me to my final set of limits: the political limits. Politicians don't understand technicians very well. They understand technicians, basically, when technicians are giving them guns. And, politicians don't understand artists very well, either. Politicians understand artists best when artists are writing national anthems. Computer artists are particularly problematic because they combine irrepressible artistic inspiration with unpredictable technical power.

Politicians have no idea in hell what to do about cyberpunk riot-nerd artist technicians. So far, this hasn't become that pressing a problem. It could, though. There are some things that artists and technicians don't understand about politicians. They don't understand the possession and use of political power. People put politicians in power so that they will use power. If a politician is in power and just sort of sits there with a friendly smile and a nice word for everybody, this arouses contempt.

Every once in a while you'll get the state-building kind of politician who actually wants to improve things, but this is a rare speciality for politicians. What politicians are really into is panic. Crises. War is the health of the State. And the computer revolution has always been fraught with hysteria. Ninety percent hype. But it isn't all hype.

Someday, something really weird is going to happen. Governments are going to be confronted with the fact that some aspect of the computer revolution has jumped completely out of governmental control. I'm not sure what that would be. It could be encryption, it could be underground black-market banking, it could be a crisis in software piracy, it could be a lot of things, quite likely something we've never heard of yet. The politicians are going to go to their technical experts for a technical solution to what they think is a technical problem. There won't be a technical solution. The technical experts will raise their hands and shrug and say, ``I'm sorry, but it's out of our hands, it's a fait accompli! If you didn't want this to happen, you should have arrested Jobs and Wozniak back in 1977!''

And then the politicians are going to poll their panic-stricken constituents, who are going to say, ``Look, we didn't give you power so you could tell us that we are all helpless. Are you saying that as a society we have no democratic control over our own technological destiny? We don't want to hear that even if it's the truth! Go arrest somebody, declare war, do something!''

And then we will probably have the War on Drugs Release 2.0: the cyber version. How this works out in the future --- I dunno, it depends on the circumstances. I can imagine the Gingrich-Deukmejian Regime with Jesse Helms as Minister of Culture, going on a door-to-door, block by block modem hunt to make sure you can't download pornographic GIF files from Holland. I can even imagine something along the lines of an Eastern Europe 1989 situation where people just pour into the streets and tell the government, ``Look, we're going to see what we want, when we want, whenever we want and from wherever we want, and you can't stop us. Burn the censors, burn the PMRC, burn the movie ratings councils, burn the US Customs Service, unplug all the FBI's taps, burn down the NSA's microwave listening dishes, just give it up, it's over, the Net has won and the Net is the government now.''

Stranger things have happened. But you know, ladies and gentlemen, even a revolutionary advent like that is not going to solve humanity's political problems with the computer revolution. Because after that advent, comes an even bigger festering stack of absolutely crazy stuff cooked up by visionary people like, for instance, Hans Moravec. And the only question, really, is how far you are pushed before you push the panic button. Like, we all know how swell Virtual Reality is, but what happens when we learn that David Koresh has got VR and he's locking his followers inside the gloves and helmets for hours, days, weeks at a time, with intravenous feeds and catheters?

Or maybe he's got direct brain implants. What kind of brain implants would you prefer, anyway? None for anybody? Some for rich people and none for you? Obligatory brain implants at birth? None in the US, but offshore while-you-wait brain installations at a former abortion mill in the Dominican Republic? Front brain, hind brain, sideways brain? When your brain-plug blows up, who you gonna sue? If you get ripped off, who you gonna call --- a civil libertarian? Or are you gonna call the Federal Brain Police? And when the federal brain police have you under suspicion for illicit brain activity, do you mind if they examine your brain? How about if they just examine your computer? Is there a difference any more? Who decides if there's a difference? I don't think there are ``market solutions'' to any of these questions. These are all political questions.

Everybody loves the Revolution, until it ruins your life and puts you in the streets with no career and no credit and no past and no heritage and no way to get any. Then suddenly everybody loves authority. Personal computers give you the power to be your best, and that's really great until one of these empowered individuals decides to rob you and your grandma of every cent you own, and then all of a sudden you get this loud vengeful baying from the populace. It's just a question of when you are willing to go to the cops. And what's left of the cops when you go to them. And how desperate the cops feel and how willing they are to take extreme measures. They aren't stupid people, cops. Lawyers aren't stupid people either, and being a federal prosecutor, that's a phase that a lot of lawyers go through. Believe me, any time power surfaces, politicians will arise to manage that power. Computer power isn't any different. And there aren't very many good ways to manage power, but there is a vast galaxy of really bad ways.

These are peculiar times. It's odd to see people embrace their own obsolescence with such frantic eagerness. When I look at my contemporaries in the 1990s I feel a kind of nostalgic sorrow for us, the kind of nostalgic sorrow you feel when you see brown tintypes of derby-hatted spats-wearing people from the 1890s. We mean well, some of us, but everything we know is melting into air, all our bustle-skirts, all our mutton-sleeves. It's an era of information overload, of pastiche and appropriation, of trendiness turned into a basic survival mechanism. It's an era of frenzy and charlatanism and inspired amateurism, where technology itself becomes liquid and amorphous, and the shallow splashy knowledge of info-surfing is the only kind of knowledge we're allowed. An era where megabytes of dataflow are cheaper than air and memory is cheaper than dirt, but context and meaning have become desperately rare luxuries.

Predicting the future is an enterprise with its own peculiar heritage. You can learn some useful things from past efforts to predict the future. One is that people always overestimate what will happen in five years. In five years, everything looks very hot and happening. This is known as hype. Actual developments can never match their hype.

Similarly, people always underestimate the long term. In fifty years you can transform the world. Events and ideas that seemed absolute certainties become utterly forgotten. History continues, but the context of the debates changes. Today's solutions are tomorrow's problems.

Most of the people in this room will have seven or eight more careers before they die. Some of us are going to have seven different careers before the end of next week. We are as leaves in the wind. It's a funny matter to speculate on what we ought to do about this. What kind of virtues might be proper to a leaf in the wind? What kind of moral mottos can one offer for the guidance of a leaf in the wind? Stay alert, for instance. Travel light, there's an obvious piece of advice. Learn to talk like a team player with your fellow leaves while simultaneously preparing to be dumped without warning at any time. Business consultants write whole books about this now.

But if I chose some virtues that I would prize above all others at a time like this, they would be patience, persistence and some sense of perspective. Not because these are particularly wonderful virtues. In most epochs virtues like that would be a recipe for utter dullness. But they're not dull virtues now, because they are so incredibly difficult to manifest in the circumstances of a revolution.

If we leave anything behind us that will compel the respect of those who will follow us, it won't be because we're clever, because they'll be more clever. And it won't be because we were well informed, because they'll be better informed. And it won't be because we had really cool high-tech gear, because their high-tech gear will be better and cooler by whole orders of magnitude. If they prize early multimedia, it'll probably be for some of the same reasons that we prize very early silent cinema: that sense of brio, of the naive pleasure they took in their powerful new means of expression. That certain wacky inventiveness they had, and that we have also, because we don't yet really know how to use our tools effectively, and we don't yet realize what a mess these new technologies are going to make of our lives. The primitive wonder of a silent black-and-white Méliès spectacle.

But I like to think that the stuff they will really remember us for is the stuff that we deliberately built to last.

So I challenge you, then: let's try and show each other something really built to last. That's a real challenge. Given our circumstances, it may be the greatest challenge we face.

That's all I have to say to you, ladies and gentlemen. Thanks for your patience and persistence in hearing me out.

