In the Beginning...Was the Command Line

Now the first job that any coder needs to do when writing a new piece of software is to figure out how to take the information that is being worked with (in a graphics program, an image; in a spreadsheet, a grid of numbers) and turn it into a linear string of bytes. These strings of bytes are commonly called files or (somewhat more hiply) streams. They are to telegrams what modern humans are to Cro-Magnon man, which is to say, the same thing under a different name. All that you see on your computer screen—your Tomb Raider, your digitized voice mail messages, faxes, and word-processing documents written in thirty-seven different typefaces—is still, from the computer’s point of view, just like telegrams, except much longer and demanding of more arithmetic.
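
To see the trick in miniature, consider the following sketch (Python is my choice here, since the text names no language, and every name in it is invented for illustration): a little grid of numbers, the raw material of a spreadsheet or a bitmapped image, flattened into one linear string of bytes and recovered again.

# A tiny 3-by-3 "image": a grid of brightness values from 0 to 255.
grid = [
    [0, 128, 255],
    [34, 56, 78],
    [255, 255, 0],
]

def to_stream(grid):
    """Flatten the grid, row by row, into one linear string of bytes."""
    return bytes(value for row in grid for value in row)

def from_stream(stream, width):
    """Rebuild the grid from the stream, given how wide each row was."""
    return [list(stream[i:i + width]) for i in range(0, len(stream), width)]

stream = to_stream(grid)               # nine bytes, all structure flattened away
assert from_stream(stream, 3) == grid  # and recoverable, if you know the width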

The quickest way to get a taste of this is to fire up your web browser, visit a site on the Net, and then select the View/Document Source menu item. You will get a bunch of computer code that looks something like this:

<HTML>
<HEAD>
<TITLE>C R Y P T O N O M I C O N</TITLE>
</HEAD>
<BODY BGCOLOR="#FFFFFF" TEXT="#000000">
<CENTER>
<H1>C R Y P T O N O M I C O N</H1>
</CENTER>
</BODY>
</HTML>

This crud is called HTML (HyperText Markup Language) and it is basically a very simple programming language instructing your web browser how to draw a page on a screen. Anyone can learn HTML and many people do. The important thing is that no matter what splendid multimedia web pages they might represent, HTML files are just telegrams.

When Ronald Reagan was a radio announcer, he used to call baseball games that he did not physically attend by reading the terse descriptions that trickled in over the telegraph wire and were printed out on a paper tape. He would sit there, all by himself in a padded room with a microphone, and the paper tape, printed with cryptic abbreviations, would creep out of the machine and crawl over the palm of his hand. If the count went to three and two, Reagan would describe the scene as he saw it in his mind’s eye: “The brawny left-hander steps out of the batter’s box to wipe the sweat from his brow. The umpire steps forward to sweep the dirt from home plate,” and so on. When the cryptogram on the paper tape announced a base hit, he would whack the edge of the table with a pencil, creating a little sound effect, and describe the arc of the ball as if he could actually see it. His listeners, many of whom presumably thought that Reagan was actually at the ballpark watching the game, would reconstruct the scene in their minds according to his descriptions.

This is exactly how the World Wide Web works: the HTML files are the pithy description on the paper tape, and your web browser is Ronald Reagan. The same is true of graphical user interfaces in general.

So an OS is a stack of metaphors and abstractions that stands between you and the telegrams, embodying various tricks the programmer used to convert the information you’re working with—be it images, e-mail messages, movies, or word-processing documents—into the necklaces of bytes that are the only things computers know how to work with. When we used actual telegraph equipment (teletypes) or their higher-tech substitutes (“glass teletypes,” or the MS-DOS command line) to work with our computers, we were very close to the bottom of that stack. When we use most modern operating systems, though, our interaction with the machine is heavily mediated. Everything we do is interpreted and translated time and again as it works its way down through all of the metaphors and abstractions.
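
One crude way to picture that descent, offered as an illustrative sketch rather than anything a real OS literally does (Python again, with invented names): each layer translates the document into something a little closer to the telegram.

import json

# The application's idea of a "document": structured, human-shaped.
document = {"title": "Memo", "body": "Lunch at noon."}

# One layer down: the structure is serialized into plain text.
text = json.dumps(document)

# Another layer down: the text is encoded as a linear stream of bytes.
stream = text.encode("utf-8")

# At the bottom, the OS hands those bytes to the disk, which knows
# nothing of memos, titles, or typefaces. Just the telegram.
with open("memo.json", "wb") as f:
    f.write(stream)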

The Macintosh OS was a revolution in both the good and bad senses of that word. Obviously it was true that command line interfaces were not for everyone, and that it would be a good thing to make computers more accessible to a less technical audience—if not for altruistic reasons, then because those sorts of people constituted an incomparably vaster market. It was clear that the Mac’s engineers saw a whole new country stretching out before them; you could almost hear them muttering, “Wow! We don’t have to be bound by files as linear streams of bytes anymore, vive la revolution, let’s see how far we can take this!” No command line interface was available on the Macintosh; you talked to it with the mouse, or not at all. This was a statement of sorts, a credential of revolutionary purity. It seemed that the designers of the Mac intended to sweep command line interfaces into the dustbin of history.

My own personal love affair with the Macintosh began in the spring of 1984 in a computer store in Cedar Rapids, Iowa, when a friend of mine—coincidentally, the son of the MGB owner—showed me a Macintosh running MacPaint, the revolutionary drawing program. It ended in July of 1995 when I tried to save a big important file on my Macintosh PowerBook and instead of doing so, it annihilated the data so thoroughly that two different disk crash utility programs were unable to find any trace that it had ever existed. During the intervening ten years, I had a passion for the MacOS that seemed righteous and reasonable at the time but in retrospect strikes me as being exactly the same sort of goofy infatuation that my friend’s dad had with his car.

The introduction of the Mac triggered a sort of holy war in the computer world. Were GUIs a brilliant design innovation that made computers more human-centered and therefore accessible to the masses, leading us toward an unprecedented revolution in human society, or an insulting bit of audiovisual gimcrackery dreamed up by flaky Bay Area hacker types that stripped computers of their power and flexibility and turned the noble and serious work of computing into a childish video game?

This debate actually seems more interesting to me today than it did in the mid-1980s. But people more or less stopped debating it when Microsoft endorsed the idea of GUIs by coming out with the first Windows system. At this point, command-line partisans were relegated to the status of silly old grouches, and a new conflict was touched off: between users of MacOS and users of Windows.

There was plenty to argue about. The first Macintoshes looked different from other PCs even when they were turned off: they consisted of one box containing both CPU (the part of the computer that does arithmetic on bits) and monitor screen. This was billed, at the time, as a philosophical statement of sorts: Apple wanted to make the personal computer into an appliance, like a toaster. But it also reflected the purely technical demands of running a graphical user interface. In a GUI machine, the chips that draw things on the screen have to be integrated with the computer’s central processing unit, or CPU, to a far greater extent than is the case with command line interfaces, which until recently didn’t even know that they weren’t just talking to teletypes.

This distinction was of a technical and abstract nature, but it became clearer when the machine crashed. (It is commonly the case with technologies that you can get the best insight about how they work by watching them fail.) When everything went to hell and the CPU began spewing out random bits, the result, on a CLI machine, was lines and lines of perfectly formed but random characters on the screen—known to cognoscenti as “going Cyrillic.” But to the MacOS, the screen was not a teletype but a place to put graphics; the image on the screen was a bitmap, a literal rendering of the contents of a particular portion of the computer’s memory. When the computer crashed and wrote gibberish into the bitmap, the result was something that looked vaguely like static on a broken television set—a “snow crash.”
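
The two failure modes are easy enough to fake, as in this toy sketch (Python, with made-up sizes; it is emphatically not how either machine actually crashed), which feeds the same random junk first to a teletype-style screen and then to a bitmapped one:

import random

random.seed(0)
junk = bytes(random.randrange(256) for _ in range(240))

# CLI machine: the screen is a character device, so random bytes show up
# as lines of perfectly formed but meaningless characters ("going Cyrillic").
chars = "".join(chr(32 + b % 95) for b in junk)
for i in range(0, len(chars), 60):
    print(chars[i:i + 60])

# Mac: the screen is a bitmap, one bit of memory per pixel, so the same
# junk renders as random dots, like static on a broken TV (a "snow crash").
for i in range(0, len(junk), 10):
    row = junk[i:i + 10]
    print("".join("#" if (byte >> bit) & 1 else " "
                  for byte in row for bit in range(8)))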

And even after the introduction of Windows, the underlying differences endured; when a Windows machine got into trouble, the old command line interface would fall down over the GUI like an asbestos fire curtain sealing off the proscenium of a burning opera. When a Macintosh got into trouble, it presented you with a cartoon of a bomb, which was funny the first time you saw it.

These were by no means superficial differences. The reversion of Windows to a CLI when it was in distress proved to Mac partisans that Windows was nothing more than a cheap facade, like a garish afghan flung over a rotted-out sofa. They were disturbed and annoyed by the sense that lurking underneath Windows’ ostensibly user-friendly interface was—literally—a subtext.

For their part, Windows fans might have made the sour observation that all computers, even Macintoshes, were built on that same subtext, and that the refusal of Mac owners to admit that fact to themselves seemed to signal a willingness, almost an eagerness, to be duped.

Anyway, a Macintosh had to switch individual bits in the memory chips on the video card, and it had to do it very fast and in arbitrarily complicated patterns. Nowadays this is cheap and easy, but in the technological regime that prevailed in the early 1980s, the only realistic way to do it was to build the motherboard (which contained the CPU) and the video system (which contained the memory that was mapped onto the screen) as a tightly integrated whole—hence the single, hermetically sealed case that made the Macintosh so distinctive.
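
Mechanically, switching those bits comes down to arithmetic like the following sketch (Python standing in for what the Mac did in silicon; the 512-by-342 one-bit screen is historical, the rest is invented): the screen is one long run of bytes, and lighting the pixel at (x, y) means finding the right byte and flipping the right bit.

WIDTH, HEIGHT = 512, 342                  # the original Mac's one-bit display
STRIDE = WIDTH // 8                       # bytes per row: eight pixels per byte
framebuffer = bytearray(STRIDE * HEIGHT)  # the memory that is mapped to the screen

def set_pixel(x, y):
    """Turn on the pixel at (x, y) by flipping one bit of screen memory."""
    index = y * STRIDE + x // 8            # which byte holds this pixel
    framebuffer[index] |= 0x80 >> (x % 8)  # which bit within that byte

# Even one short diagonal line is hundreds of such bit flips, and dragging
# a window around means redoing them many times a second.
for i in range(300):
    set_pixel(i, i)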

When Windows came out, it was conspicuous for its ugliness, and its current successors, Windows 95, 98, and Windows NT, are not things that people would pay money to look at either. Microsoft’s complete disregard for aesthetics gave all of us Mac-lovers plenty of opportunities to look down our noses at them. That Windows looked an awful lot like a direct ripoff of MacOS gave us a burning sense of moral outrage to go with it. Among people who really knew and appreciated computers (hackers, in Steven Levy’s nonpejorative sense of that word), and in a few other niches such as professional musicians, graphic artists, and schoolteachers, the Macintosh, for a while, was simply the computer. It was seen as not only a superb piece of engineering, but an embodiment of certain ideals about the use of technology to benefit mankind, while Windows was seen as both a pathetically clumsy imitation and a sinister world domination plot rolled into one. So, very early, a pattern had been established that endures to this day: people dislike Microsoft, which is okay; but they dislike it for reasons that are poorly considered, and in the end, self-defeating.

Now that the Third Rail has been firmly grasped, it is worth reviewing some basic facts here. Like any other publicly traded, for-profit corporation, Microsoft has, in effect, borrowed a bunch of money from some people (its stockholders) in order to be in the bit business. As an officer of that corporation, Bill Gates has only one responsibility, which is to maximize return on investment. He has done this incredibly well. Any actions taken in the world by Microsoft—any software released by them, for example—are basically epiphenomena, which can’t be interpreted or understood except insofar as they reflect Bill Gates’s execution of his one and only responsibility.

It follows that if Microsoft sells goods that are aesthetically unappealing, or that don’t work very well, it does not mean that they are (respectively) philistines or half-wits. It is because Microsoft’s excellent management has figured out that they can make more money for their stockholders by releasing stuff with obvious, known imperfections than they can by making it beautiful or bug-free. This is annoying, but (in the end) not half so annoying as watching Apple inscrutably and relentlessly destroy itself.

Hostility toward Microsoft is not difficult to find on the Net, and it blends two strains: resentful people who feel Microsoft is too powerful, and disdainful people who think it’s tacky. This is all strongly reminiscent of the heyday of Communism and Socialism, when the bourgeoisie were hated from both ends: by the proles, because they had all the money, and by the intelligentsia, because of their tendency to spend it on lawn ornaments. Microsoft is the very embodiment of modern high-tech prosperity—it is, in a word, bourgeois—and so it attracts all of the same gripes.

The opening “splash screen” for Microsoft Word 6.0 summed it up pretty neatly: when you started up the program you were treated to a picture of an expensive enamel pen lying across a couple of sheets of fancy-looking handmade writing paper. It was obviously a bid to make the software look classy, and it might have worked for some, but it failed for me, because the pen was a ballpoint, and I’m a fountain pen man. If Apple had done it, they would’ve used a Mont Blanc fountain pen, or maybe a Chinese calligraphy brush. And I doubt that this was an accident. Recently I spent a while reinstalling Windows NT on one of my home computers, and many times had to double-click on the “Control Panel” icon. For reasons that are difficult to fathom, this icon consists of a picture of a clawhammer and a chisel or screwdriver resting on top of a file folder.

These aesthetic gaffes give one an almost uncontrollable urge to make fun of Microsoft, but again, it is all beside the point—if Microsoft had done focus group testing of possible alternative graphics, they probably would have found that the average mid-level office worker associated fountain pens with effete upper management toffs and was more comfortable with ballpoints. Likewise, the regular guys, the balding dads of the world who probably bear the brunt of setting up and maintaining home computers, can probably relate best to a picture of a clawhammer—while perhaps harboring fantasies of taking a real one to their balky computers.

This is the only way I can explain certain peculiar facts about the current market for operating systems, such as that ninety percent of all customers continue to buy station wagons off the Microsoft lot while free tanks are there for the taking, right across the street.

 

A string of ones and zeroes was not a difficult thing for Bill Gates to distribute, once he’d thought of the idea. The hard part was selling it—reassuring customers that they were actually getting something in return for their money.

Anyone who has ever bought a piece of software in a store has had the curiously deflating experience of taking the bright shrink-wrapped box home, tearing it open, finding that it’s ninety-five percent air, throwing away all the little cards, party favors, and bits of trash, and loading the disk into the computer. The end result (after you’ve lost the disk) is nothing except some images on a computer screen, and some capabilities that weren’t there before. Sometimes you don’t even have that—you have a string of error messages instead. But your money is definitely gone. Now we are almost accustomed to this, but twenty years ago it was a very dicey business proposition.

Bill Gates made it work anyway. He didn’t make it work by selling the best software or offering the cheapest price. Instead he somehow got people to believe that they were receiving something valuable in exchange for their money. The streets of every city in the world are filled with those hulking, rattling station wagons. Anyone who doesn’t own one feels a little weird, and wonders, in spite of himself, whether it might not be time to cease resistance and buy one; anyone who does, feels confident that he has acquired some meaningful possession, even on those days when the vehicle is up on a lift in a repair shop.

All of this is perfectly congruent with membership in the bourgeoisie, which is as much a mental as a material state. And it explains why Microsoft is regularly attacked, on the Net and elsewhere, from both sides. People who are inclined to feel poor and oppressed construe everything Microsoft does as some sinister Orwellian plot. People who like to think of themselves as intelligent and informed technology users are driven crazy by the clunkiness of Windows.

Nothing is more annoying to sophisticated people than to see someone who is rich enough to know better being tacky—unless it is to realize, a moment later, that they probably know they are tacky and they simply don’t care and they are going to go on being tacky, and rich, and happy, forever. Microsoft therefore bears the same relationship to the Silicon Valley elite as the Beverly Hillbillies did to their fussy banker, Mr. Drysdale—who is irritated not so much by the fact that the Clampetts moved to his neighborhood as by the knowledge that when Jethro is seventy years old, he’s still going to be talking like a hillbilly and wearing bib overalls, and he’s still going to be a lot richer than Mr. Drysdale.

Even the hardware that Windows ran on, when compared to the machines put out by Apple, looked like white-trash stuff, and still mostly does. The reason was that Apple was and is a hardware company, while Microsoft was and is a software company. Apple therefore had a monopoly on hardware that could run MacOS, whereas Windows-compatible hardware came out of a free market. The free market seems to have decided that people will not pay for cool-looking computers; PC hardware makers who hire designers to make their stuff look distinctive get their clocks cleaned by Taiwanese clone makers punching out boxes that look as if they belong on cinderblocks in front of someone’s trailer. Apple, on the other hand, could make their hardware as pretty as they wanted to and simply pass the higher prices on to their besotted consumers, like me. Only last week (I am writing this sentence in early January 1999) the technology sections of all the newspapers were filled with adulatory press coverage of how Apple had released the iMac in several happenin’ new colors like Blueberry and Tangerine.

Apple has always insisted on having a hardware monopoly, except for a brief period in the mid-1990s when they allowed clone-makers to compete with them, before subsequently putting them out of business. Macintosh hardware was, consequently, expensive. You didn’t open it up and fool around with it because doing so would void the warranty. In fact, the first Mac was specifically designed to be difficult to open—you needed a kit of exotic tools, which you could buy through little ads that began to appear in the back pages of magazines a few months after the Mac came out on the market. These ads always had a certain disreputable air about them, like pitches for lock-picking tools in the backs of lurid detective magazines.

This monopolistic policy can be explained in at least three different ways.

 

The charitable explanation is that the hardware monopoly policy reflected a drive on Apple’s part to provide a seamless, unified blending of hardware, operating system, and software. There is something to this. It is hard enough to make an OS that works well on one specific piece of hardware, designed and tested by engineers who work down the hallway from you, in the same company. Making an OS to work on arbitrary pieces of hardware, cranked out by rabidly entrepreneurial clonemakers on the other side of the international date line, is very difficult and accounts for much of the trouble people have using Windows.

 

The financial explanation is that Apple, unlike Microsoft, is and always has been a hardware company. It simply depends on revenue from selling hardware, and cannot exist without it.

 

The not-so-charitable explanation has to do with Apple’s corporate culture, which is rooted in Bay Area Baby Boomdom.

Now, since I’m going to talk for a moment about culture, full disclosure is probably in order, to protect myself against allegations of conflict of interest and ethical turpitude: (1) Geographically I am a Seattleite, of a Saturnine temperament, and inclined to take a sour view of the Dionysian Bay Area, just as they tend to be annoyed and appalled by us. (2) Chronologically I am post-Baby Boom. I feel that way, at least, because I never experienced the fun and exciting parts of the whole Boomer scene—just spent a lot of time dutifully chuckling at Boomers’ maddeningly pointless anecdotes about just how stoned they got on various occasions, and politely fielding their assertions about how great their music was. But even from this remove it was possible to glean certain patterns. One that recurred as regularly as an urban legend was about how someone would move into a commune populated by sandal-wearing, peace-sign-flashing flower children and eventually discover that, underneath this facade, the guys who ran it were actually control freaks; and that, because living in a commune, where much lip service was paid to ideals of peace, love, and harmony, had deprived them of normal, socially approved outlets for their control-freakdom, it tended to come out in other, invariably more sinister, ways.

Applying this to the case of Apple Computer will be left as an exercise for the reader, and not a very difficult exercise.

It is a bit unsettling, at first, to think of Apple as a control freak, because it is completely at odds with their corporate image. Weren’t these the guys who aired the famous Super Bowl ads showing suited, blindfolded executives marching like lemmings off a cliff? Isn’t this the company that even now runs ads picturing the Dalai Lama (except in Hong Kong) and Einstein and other offbeat rebels?

It is indeed the same company, and the fact that they have been able to plant this image of themselves as creative and rebellious freethinkers in the minds of so many intelligent and media-hardened skeptics really gives one pause. It is testimony to the insidious power of expensive slick ad campaigns and, perhaps, to a certain amount of wishful thinking in the minds of people who fall for them. It also raises the question of why Microsoft is so bad at PR, when the history of Apple demonstrates that by writing large checks to good ad agencies, you can plant a corporate image in the minds of intelligent people that is completely at odds with reality. (The answer, for people who don’t like Damoclean questions, is that since Microsoft has won the hearts and minds of the silent majority—the bourgeoisie—they don’t give a damn about having a slick image, any more than Dick Nixon did. “I want to believe”—the mantra that Fox Mulder has pinned to his office wall in The X-Files—applies in different ways to these two companies: Mac partisans want to believe in the image of Apple purveyed in those ads, and in the notion that Macs are somehow fundamentally different from other computers, while Windows people want to believe that they are getting something for their money, engaging in a respectable business transaction.)

In any event, as of 1987, both MacOS and Windows were out on the market, running on hardware platforms that were radically different from each other, not only in the sense that MacOS used Motorola CPU chips while Windows used Intel, but in the sense—then overlooked, but in the long run, vastly more significant—that the Apple hardware business was a rigid monopoly and the Windows side was a churning free-for-all.

But the full ramifications of this did not become clear until very recently—in fact, they are still unfolding, in remarkably strange ways, as I’ll explain when we get to Linux. The upshot is that millions of people got accustomed to using GUIs in one form or another. By doing so, they made Apple/Microsoft a lot of money. The fortunes of many people have become bound up with the ability of these companies to continue selling products whose salability is very much open to question.
