Back at Apple, Jobs quickly focused on revitalizing the business. He killed the foundering Apple Newton, a clunky handheld device that was widely lampooned, including in a hilarious sequence in the popular comic strip Doonesbury.
In August 1998, Jobs introduced the iMac, an all-in-one unit encased in a translucent turquoise plastic shell that harked back to the days of the original Macintosh. Critics called it technically unimpressive and predicted it would be hampered by its overreliance on the universal serial bus (USB) for connectivity to peripherals. Once again, the traditionalist critics were wrong. Though a nascent technology at the time, USB would become truly universal, allowing standardized connections for keyboards, mice, printers, and portable memory across PCs and Macs alike. The look of the iMac itself would become a design icon of the late 1990s.
At $1,299 a pop, the iMac received over 150,000 preorders and went on to sell 278,000 units in the following six weeks.42
Strong sales were reported among both first-time computer buyers and those switching from Windows-based PCs. In October, Jobs reported Apple's first profitable fiscal year since 1995. As one Wall Street analyst remarked, “Steve pulled the rabbit out of the hat over the past year. Apple was in disarray. It was at the gate of extinction. Now we have a company turnaround.” The results would catapult Apple back into the mainstream computer market, rescuing it from perishing roadside as an also-ran.
With the company stabilized and on the road to recovery, some CEOs might rest on their laurels and collect a fat bonus. But not Steve Jobs. He sees himself not so much as a business executive as an artist who is always taking new creative risks. In a Fortune interview he explained, “If you look at the artists, if they get really good, it always occurs to them at some point that they can do this one thing for the rest of their lives, and they can be really successful to the outside world but not really be successful to themselves. That's the moment that an artist really decides who he or she is. If they keep on risking failure, they're still artists. Dylan and Picasso were always risking failure.”43
By the end of the decade, music companies were struggling to confront a changing technological landscape. Clinging to old-school models of physical distribution channels, they were helpless in the face of a burgeoning network of Internet connectivity. In the past, music buyers might dub a copy or two of their favorite songs to give to friends. Now the same music buyers could “rip” a CD into a digital file and share it with a worldwide network of millions with the click of a mouse. Why buy a CD when you could get the music for free through file-sharing services like Napster?
For Jobs, music held a special place in his heart, along with respect for intellectual property. He could see the problems emerging in the music industry and was appalled at the spastic response by the record companies. On one hand, they attempted to crack down on criminal pirates: often a kid in a dorm room who was just enthusiastic about music and was listening to emerging artists. On the other hand, they offered restrictive subscription services on a pay-by-the-month model. Jobs saw a middle path and set out to change the landscape.
As he explained to Rolling Stone, he set up meetings with record executives. First, he made it clear that he respected the primacy of intellectual property rights; what individualist wouldn't? “If copyright dies, if patents die, if the protection of intellectual property is eroded, then people will stop investing. That hurts everyone. People need to have the incentive so that if they invest and succeed, they can make a fair profit. But on another level entirely, it's just wrong to steal. Or let's put it this way: It is corrosive to one's character to steal. We want to provide a legal alternative.”44
Next, he demolished their digital business model. “We told them the music subscription services they were pushing were going to fail. MusicNet was gonna fail, Pressplay was gonna fail,” Jobs would say. “Here's why: People don't want to buy their music as a subscription. They bought 45s, then they bought LPs, they bought cassettes, they bought 8-tracks, then they bought CDs. They're going to want to buy downloads. The subscription model of buying music is bankrupt. I think you could make available the Second Coming in a subscription model, and it might not be successful.”45
Finally, Jobs described the middle path. He would offer an Apple music store. It would be safe from viruses; it would be fast; and it would be high-quality, inexpensive, flexible, and, best of all, legal. In a way only Jobs's mind could synthesize, he struck an elegant balance between artists' rights and customer usability. Once you bought a song, you owned it. You could burn it onto a CD; you could play it directly from your computer or portable device. You could even share it with a few friends. But the embedded technology would prevent mass distribution and wide-scale pirating. At $0.99 per song, it was affordable (an impulse item), yet artists were compensated for their work. It was brilliant.
And in some ways it took a figure as big and trusted as Jobs to move an industry seized in paralysis as it faced the technological future. Only he had the clout, the appreciation, and the respect to pull an entire industry toward a visionary future. By the end of the decade, Apple's iTunes would be selling over a quarter of all music in the United States. Jobs again had rescued and transformed a moribund industry, just because it was cool.
To Infinity and Beyond
What does the future hold for Steve Jobs? His health problems are well known, but as of this writing he's been able to cheat death as brilliantly as he's overcome technology and business challenges throughout his life.
Someday death will come to him, as it must to all of us. What he's built for the world will make him an immortal figure in the history of technology and business. But he's immortal in another sense, in the way that all self-motivated and self-consistent people are: that they don't die a little bit every day by compromising themselves, that during their lifetimes they truly live.
What a fellow artist said of Howard Roark in The Fountainhead might have been said of Steve Jobs: “I often think he's the only one of us to achieve immortality. I don't mean in the sense of fame, and I don't mean he won't die someday. But he's living it. I think he is what the conception really means.”
Chapter 2
The Mad Collectivist
Paul Krugman as Ellsworth Toohey, the man who preaches socialism from the pages of America's newspaper of record
“We've fixed the coin. Heads - collectivism, and tails - collectivism. . . . Give up your soul to a council - or give it up to a leader. But give it up, give it up, give it up. My technique . . . don't forget the only purpose you have to accomplish. Kill the individual. Kill man's soul. The rest will happen automatically. Observe the state of the world at the present moment. Do you still think I'm crazy . . . ?”
- The Fountainhead
Who is Ellsworth Toohey?
In The Fountainhead, the villain Ellsworth Toohey symbolizes the collectivist, in contrast to the hero, Howard Roark, who symbolizes the individualist.
Toohey was a brilliant and articulate but sickly and puny child, raised in an impoverished household by a weak father and an overprotective mother. He was envious of his wealthier and stronger classmates, and he used his sharp mind and even sharper tongue to undermine them.
After going through a brief religious phase as a teenager, Toohey becomes an avowed socialist. In adulthood, he drops any overt association with socialist politics, but dedicates his life to promoting collectivism gradually, through his growing influence as a public intellectual.
He writes a book on the history of architecture, and improbably it becomes a best seller. Toohey parlays that into a regular column in New York's leading newspaper, the New York Banner, ostensibly for architectural criticism but quickly evolving into a personal soapbox from which he promotes all manner of collectivist causes.
Beyond obvious advocacy, Toohey's strategy with the column is to promote architects and other artists (authors, composers, playwrights) of no ability, to enshrine mediocrities as superstars. His goal is to corrupt the culture: to advance collectivism by default, by eliminating from the culture any great individuals who could have offered an alternative.
Toohey singles out the brilliant architect Howard Roark as his most dangerous opponent: an exemplar of the greatness of the individual, not the collective. To defeat Roark, he masterminds a series of complicated plots aimed at discrediting Roark and economically ruining him, at one point causing him to abandon architecture and work as a laborer in a quarry to survive.
Throughout, Roark never lifts a finger to either fight Toohey or defend himself against him. Toohey is simply beneath Roark's notice, demonstrating Rand's belief that evil is small and impotent, and best simply ignored.
Christiane Amanpour's eyes darted back and forth in fear and her mouth twisted in disgust, because she could see where this was going. A guest on her Sunday morning political talk show, ABC's This Week, was getting dangerously overexcited, and something very regrettable was about to happen.
She could see that he was winding himself up as he talked about how a recent deficit-reduction panel hadn't been “brave enough,” because it failed to endorse the idea of expert panels that would determine which medical services government-funded care wouldn't pay for.1
When ObamaCare was still being debated in Congress, conservative spokeswoman Sarah Palin had created a media sensation by calling such panels “death panels,” causing most liberals who supported ObamaCare to quickly distance themselves from any idea of rationing care as tantamount to murder.
Cut to Amanpour's horrified face. Cut back to the guest. Then it happened.
The guest said, “Some years down the pike, we're going to get the real solution, which is going to be a combination of death panels and sales taxes.”
It was all the more horrifying because the guest was not a conservative, not an opponent of ObamaCare. This guest was an avid liberal, a partisan Democrat, and an enthusiastic supporter of government-run health care. He was endorsing death panels, not warning about them. He was saying death panels are a good thing.
Apparently it didn't bother him that his choice of words, “real solution,” had historical parallels that are very disturbing in any conversation about government control over who will live and who will die.
And it was even more horrifying because of who this guest was. This was no fringe lefty wearing a tinfoil hat, churning out underground newspapers in his parents' basement. This was an economics professor at Princeton, one of the country's most prestigious universities. This was the winner of the Nobel Prize in economics, the highest honor the profession can bestow. This was a columnist for the New York Times, the most influential newspaper in the world.
This was Paul Krugman, live, on national television, endorsing government control over life and death. And while we're at it, let's raise taxes on those who are permitted to live.
The Abysmal Pseudo-Scientist
Who does Paul Krugman think he is to think such things, never mind say them on television?
He'd like to think he's John Maynard Keynes, the venerated British economist who created the intellectual framework for modern government intervention in the economy. Keynes is something of a cult figure for modern liberal economists like Krugman, who read his texts with all the exegetical fervor with which Scientologists read the pulp fiction of L. Ron Hubbard. But Krugman will never live up to Keynes. However politicized his economic theories, Keynes's predictions were so astute that he made himself wealthy as a speculator. Economics is called “the dismal science,” but as we'll see, Krugman's predictions are so laughably bad that his economics should be called the abysmal pseudo-science.
If Krugman is not Keynes, maybe he's John Nash, the mathematician portrayed in the film A Beautiful Mind. They're both Princeton economists. They've both won the Nobel Prize in economics. And they're both bonkers. In Krugman's own words, “My economic theories have no doubt been influenced by my relationship with my cats.”2
But so far there's been no movie about Krugman, just a cameo appearance as himself in the lowbrow comedy Get Him to the Greek, in which his lines consist of “Yeah,” “Thank you,” and “Oh boy.”
As a boy, Krugman says, his “secret fantasy” was to be Hari Seldon, the “psychohistorian” from science fiction author Isaac Asimov's Foundation trilogy, who used what we would now call econometrics to secretly control the progress of human civilization.3
This inspiration is what drew Krugman to study economics, as he has revealed more than once, as though he were proud of it.4
Maybe in practice he's more like Dr. Strangelove, the dark Hari Seldon, the cold war madman of Stanley Kubrick's film masterpiece. They both have near-genocidal notions of how government should determine who lives and who dies, especially about what kind of experts should be consulted in the decision. Krugman is nostalgic for the cold war era when “the U.S. government employed experts in game theory to analyze strategies of nuclear deterrence. Men with Ph.D.s in economics, like Daniel Ellsberg.”5
Maybe you thought real men don't have PhDs in economics. But Krugman does.
But Paul Krugman isn't Keynes, Nash, Seldon, or Strangelove, as much as he'd like to be. The indisputable truth is that he is the living embodiment of Ellsworth Toohey, the villain from Ayn Rand's first great novel, The Fountainhead.
Krugman mocks people who have been inspired by Rand,6 but he himself is living Rand with every breath he takes. Truly, the parallels between Krugman and Toohey are downright eerie.
The Economist Who Couldn't Shoot Straight
Before we proceed, let's get something important out on the table. Most critiques of Krugman as a public intellectual begin with what is apparently an obligatory disclaimer, usually in the very first sentence: something to the effect that Krugman is a very accomplished and well-respected economist. After all, he won the Nobel Prize. Then comes the “But . . .” and the critique proceeds in earnest, often scathingly.
Why concede any honor at all to Krugman?
So what if he won the Nobel Prize? There are plenty of left-leaning political icons (Jimmy Carter, Al Gore, Barack Obama); witch doctors (Egas Moniz, the doctor who pioneered the frontal lobotomy); and even the odd terrorist (Yasser Arafat) who've been handed Nobel Prizes of their own.
Of what real value or distinction is Krugman's work as an academic economist? There was a time, a decade or more ago, when his work on international trade and currencies was frequently cited in the economics literature, but it is cited far less now, and some in the profession have come to regard it as rather trivial. Nowadays Krugman's work on trade and currencies is limited to rants in his Times column directed at China, advocating protectionism that most serious economists regard as retrograde and naive, if not downright dangerous.17
The real test of Krugman's mettle as an economist is the accuracy of his economic forecasting. The fact is that, with about two decades of evidence now in, Krugman's track record, to use a technical term favored by economists, sucks.
As we'll see, he's not always candid about this. But once, under the pressure of a televised debate with conservative talk-show host Bill O'Reilly, Krugman blurted out an understated if truthful self-evaluation: “Compare me . . . compare me, uh, with anyone else, and I think you'll see that my forecasting record is not great.”18
The most egregious example of “not great” is Krugman's utterly incorrect 1982 prediction that inflation would soar. He made this prediction from no less lofty a perch than the White House, as a staff member of the Council of Economic Advisers (CEA) in the first Reagan administration. Here's how Krugman remembers that time: