Authors: Erik Brynjolfsson, Andrew McAfee
The productivity slowdown in the 1970s, and the subsequent speed-up twenty years later, had an interesting precedent. In the late 1890s, electricity was being introduced to American factories. But the “productivity paradox” of that era was that labor productivity growth did not take off for over twenty years. While the technologies involved were very different, many of the underlying dynamics were quite similar.
University of Chicago economist Chad Syverson looked closely at the underlying productivity data and showed how eerily close this analogy is.
8
As shown in figure 7.2, the slow start and subsequent acceleration of productivity growth in the electricity era matches well with the speed-up that began in the 1990s. The key to understanding this pattern is the realization that, as discussed in chapter 5, GPTs always need complements. Coming up with those can take years, or even decades, and this creates lags between the introduction of a technology and the productivity benefits. We’ve clearly seen this with both electrification and computerization.
FIGURE 7.2
Labor Productivity in Two Eras
Perhaps the most important complementary innovations are the business process changes and organizational coinventions that new technologies make possible. Paul David, an economic historian at Stanford University and the University of Oxford, examined the records of American factories when they first electrified and found that they often retained a similar layout and organization to those that were powered by steam engines.
9
In a steam engine–driven plant, power was transmitted via a large central axle, which in turn drove a series of pulleys, gears, and smaller crankshafts. If the axle was too long, the torsion involved would break it, so machines needed to be clustered near the main power source, with those requiring the most power positioned closest. Exploiting all three dimensions, industrial engineers put equipment on floors above and below the central steam engines to minimize the distances involved.
Years later, when that hallowed GPT electricity replaced the steam engine, engineers simply bought the largest electric motors they could find and stuck them where the steam engines used to be. Even when brand-new factories were built, they followed the same design. Perhaps unsurprisingly, records show that the electric motors did not lead to much of an improvement in performance. There might have been less smoke and a little less noise, but the new technology was not always reliable. Overall, productivity barely budged.
Only after thirty years—long enough for the original managers to retire and be replaced by a new generation—did factory layouts change. The new factories looked much like those we see today: a single story spread out over an acre or more. Instead of a single massive engine, each piece of equipment had its own small electric motor. Instead of putting the machines needing the most power closest to the power source, the layout was based on a simple and powerful new principle: the natural workflow of materials.
Productivity didn’t merely inch upward on the resulting assembly lines; it doubled or even tripled. What’s more, for most of the subsequent century, additional complementary innovations, from lean manufacturing and steel minimills to Total Quality Management and Six Sigma principles, continued to boost manufacturing productivity.
As with earlier GPTs, significant organizational innovation is required to capture the full benefit of second machine age technologies. Tim Berners-Lee’s invention of the World Wide Web in 1989, to take an obvious example, initially benefited only a small group of particle physicists. But due in part to the power of digitization and networks to speed the diffusion of ideas, complementary innovations are happening faster than they did in the first machine age. Less than ten years after its introduction, entrepreneurs were finding ways to use the Web to reinvent publishing and retailing.
While less visible, the large enterprise-wide IT systems that companies rolled out in the 1990s have had an even bigger impact on productivity.
10
They did this mainly by making possible a wave of business process redesign. For example, Walmart drove remarkable efficiencies in retailing by introducing systems that shared point-of-sale data with their suppliers. The real key was the introduction of complementary process innovations like vendor-managed inventory, cross-docking, and efficient consumer response, which have become staple business-school case studies. These innovations not only made it possible to increase sales from $1 billion a week in 1993 to $1 billion every thirty-six hours in 2001, but also helped drive dramatic productivity gains throughout the retailing and distribution industries, accounting for much of the additional productivity growth nationwide during this period.
11
IT investment soared in the 1990s, peaking with a surge of investment in the latter half of the decade as many companies upgraded their systems to take advantage of the Internet, implement large enterprise systems, and avoid the much-hyped Y2K bug. At the same time, innovation in semiconductors took gigantic leaps, so the surging spending on IT delivered even more rapidly increasing levels of computer power. A decade after the computer productivity paradox was popularized, Harvard’s Dale Jorgenson, working with Kevin Stiroh at the New York Federal Reserve Bank, did a careful growth accounting and concluded, “A consensus has emerged that a large portion of the acceleration through 2000 can be traced to the sectors of the economy that produce information technology or use IT equipment and software most intensively.”
12
But it’s not just the computer-producing sectors that are doing well. Kevin Stiroh of the New York Federal Reserve Bank found that industries that were heavier users of IT tended to be more productive throughout the 1990s. This pattern was even more evident in recent years, according to a careful study by Harvard’s Dale Jorgenson and two coauthors. They found that total factor productivity growth increased more between the 1990s and 2000s in IT-using industries, while it fell slightly in those sectors of the economy that did not use IT extensively.
13
It’s important to note that the correlation between computers and productivity is not just evident at the industry level; it occurs at the level of individual firms as well. In work Erik did with Lorin Hitt of the University of Pennsylvania Wharton School, he found that firms that use more IT tend to have higher levels of productivity and faster productivity growth than their industry competitors.
14
The first five years of the twenty-first century saw a renewed wave of innovation and investment, this time less focused on computer hardware and more focused on a diversified set of applications and process innovations. For instance, as Andy described in a case study he did for Harvard Business School, CVS found that their prescription drug ordering process was a source of customer frustration, so they redesigned and simplified it.
15
By embedding the steps in an enterprise-wide software system, they were able to replicate the drug ordering process in over four thousand locations, dramatically boosting customer satisfaction and ultimately profits. CVS was not atypical. In a statistical analysis of over six hundred firms that Erik did with Lorin Hitt, he found that it takes an average of five to seven years before the full productivity benefits of computers are visible in the productivity of the firms making the investments. This reflects the time and effort required to make the other complementary investments that bring a computerization effort success. In fact, for every dollar of investment in computer hardware, companies need to invest up to another nine dollars in software, training, and business process redesign.
16
The effects of organizational changes like these became increasingly visible in the industry-level productivity statistics.
17
The productivity surge in the 1990s was most visible in computer-producing industries, but overall productivity grew even faster in the early years of the twenty-first century, when a much broader set of industries saw significant productivity gains. As with earlier GPTs, the power of computers lay in their ability to boost productivity far beyond their ‘home’ industry.
Overall, American productivity growth in the decade following the year 2000 exceeded even the high growth rates of the roaring 1990s, which in turn were higher than growth rates in the 1970s or 1980s had been.
18
Today American workers are more productive than they’ve ever been, but a closer look at recent numbers tells a more nuanced story. The good performance since the year 2000 was clustered in the early years of the decade. Since 2005, productivity growth has not been as strong. As noted in chapter 5, this has led to a new wave of worries about the “end of growth” by economists, journalists, and bloggers. We are not convinced by the pessimists. The productivity lull after the introduction of electricity did not mean the end of growth, nor did the lull in the 1970s.
Part of the recent slowdown simply reflects the Great Recession and its aftermath. Recessions are always times of pessimism, which is understandable, and the pessimism invariably spills over into predictions about technology and the future. The financial crisis and the bursting of the housing bubble led to a collapse of consumer confidence and wealth, which translated into dramatically lower demand and GDP. While the recession technically ended in June 2009, as we write this in 2013 the U.S. economy is still operating well below its potential, with unemployment at 7.6 percent and capacity utilization at 78 percent. During such a slump, any metric that includes output in the numerator, such as labor productivity, will often be at least temporarily depressed. In fact, when you look at history, you see that in the early years of the Great Depression, in the 1930s, productivity didn’t just slow but actually fell for two years in a row, something it never did in the recent slump. Growth pessimists had even more company in the 1930s than they do today, but the following three decades proved to be the best ones of the twentieth century. Go back to figure 7.2 and look closely at the dashed line charting the years following the dip in productivity in the early 1930s. You’ll see the biggest wave of growth and bounty that the first machine age ever delivered.
The explanation for this productivity surge lies in the lags that we always see when GPTs are installed. The benefits of electrification stretched for nearly a century as more and more complementary innovations were implemented. The digital GPTs of the second machine age are no less profound. Even if Moore’s Law ground to a halt today, we could expect decades of complementary innovations to unfold and continue to boost productivity. However, unlike the steam engine or electricity, second machine age technologies continue to improve at a remarkably rapid exponential pace, replicating their power with digital perfection and creating even more opportunities for combinatorial innovation. The path won’t be smooth; for one thing, we haven’t banished the business cycle. But the fundamentals are in place for bounty that vastly exceeds anything we’ve ever seen before.
*
The Rule of 70 (or, more precisely, the rule of 69.3 percent) is based on the following equation: (1 + x)^y = 2, where x is the rate of growth and y is the number of years. Taking the natural logarithm of both sides gives y ln(1 + x) = ln 2. Since ln(2) is 0.693 and, for small x, ln(1 + x) is roughly equal to x, the equation simplifies to xy ≈ 70 percent.
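As a quick numerical check (our own illustration, not part of the original footnote), the approximation can be compared against the exact doubling time ln(2) / ln(1 + x); the growth rates in the sketch below are hypothetical and chosen only for illustration.

```python
import math

def doubling_time_exact(growth_rate: float) -> float:
    """Exact number of years for a quantity to double at a constant annual growth rate."""
    return math.log(2) / math.log(1 + growth_rate)

def doubling_time_rule_of_70(growth_rate: float) -> float:
    """Rule-of-70 approximation: years to double is roughly 0.70 divided by the growth rate."""
    return 0.70 / growth_rate

# Hypothetical growth rates, purely for illustration.
for x in (0.01, 0.02, 0.05):
    print(f"growth {x:.0%}: exact {doubling_time_exact(x):5.1f} years, "
          f"rule of 70 {doubling_time_rule_of_70(x):5.1f} years")
```

At a 2 percent growth rate, both methods give roughly thirty-five years to double; the gap between them widens only as growth rates get large.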
*
One can also measure capital productivity, which is output per unit of capital input; or multifactor productivity, which is output divided by a weighted average of both capital and labor inputs. Economists sometimes use another term for multifactor productivity, the “Solow Residual,” which better reflects the fact that we don’t necessarily know its origins. Robert Solow himself noted that it was less a concrete measure of technological progress than a “measure of our ignorance.”
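To make that definition concrete, here is a minimal growth-accounting sketch (our own illustration, with hypothetical numbers): multifactor productivity growth is computed as a residual, output growth minus the income-share-weighted growth of capital and labor inputs.

```python
def solow_residual(output_growth: float, capital_growth: float,
                   labor_growth: float, capital_share: float = 0.3) -> float:
    """Multifactor productivity growth measured as a residual:
    output growth minus the share-weighted growth of capital and labor inputs.
    capital_share is capital's share of income; labor receives the rest."""
    labor_share = 1.0 - capital_share
    return output_growth - capital_share * capital_growth - labor_share * labor_growth

# Hypothetical example: 3% output growth, 4% capital growth, 1% labor growth.
print(f"Multifactor productivity growth: {solow_residual(0.03, 0.04, 0.01):.1%}")  # about 1.1%
```

Whatever part of output growth the measured inputs cannot explain lands in this residual, which is why Solow called it a “measure of our ignorance.”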
†
That’s a good thing, because there are natural limits to how much we can increase inputs, especially labor. They’re subject to diminishing returns: no one is going to work more than twenty-four hours a day, or employ more than 100 percent of the labor force. In contrast, productivity growth reflects our ability to innovate; it’s limited only by our imaginations.
*
Output divided by labor and physical capital inputs is often more ambitiously called ‘total factor productivity.’ However, that term can be a bit misleading, because there are other inputs to production. For instance, companies can make major investments in intangible organizational capital. The more kinds of inputs we are able to measure, the better we can account for overall output growth. As a result, the residual that we label “productivity” (not explained by growth of inputs) will get smaller.