Who Stole the American Dream?
Hedrick Smith
Having a college education helped—but it didn’t generate as much of a gain as people imagine. The typical college graduate today makes only about $1,000 a year more than in 1980, adjusting for inflation. In the past decade, entry-level college graduate salaries actually went backward. Their annual pay in 2010 was about $2,000 below their pay in 2000. Young men were averaging $45,000 in 2010 and women about $38,000. Not bad for starters, but that means typical college graduates, like high school graduates, have been falling further and further behind the executive elite, such as Carol Bartz, CEO of Yahoo!, or Leslie Moonves, CEO of CBS, who were making about $150,000 a day.
The enormity of the wealth gap between the top and the middle, Harvard economist Larry Summers said in late 2008, raises “a critical problem of legitimacy” for American capitalism.
It didn’t have to be this way. Economists have calculated that if the laws and the social contract widely accepted by Corporate America during the middle-class boom of the 1960s and ’70s had continued, average Americans would be far better off today. Sharing the gains from America’s economic growth from 1979 to 2006 in the same way they were shared from 1945 to 1979 would have given the typical middle-class family $12,000 more per year. Overall, 80 percent of Americans, from the bottom through the entire middle class, would have earned $743 billion more a year; the richest 1 percent would have made $673 billion less; and the next 4 percent down from the top would have made $140 billion less.
So it was the changes in our laws and in the way American business decided to divide its revenues that cost average Americans roughly three-quarters of a trillion dollars a year since the late 1970s. All that money went to the richest 5 percent of Americans.
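For readers who want to check the internal arithmetic of those estimates, here is a minimal sketch (Python, for illustration only); the figure for the 81st through 95th percentiles is an inference from the quoted numbers, not one the economists report:

    # All figures are annual, in billions of dollars, as quoted above.
    bottom_80_gain = 743   # bottom 80 percent: would have earned this much more
    top_1_loss = 673       # richest 1 percent: would have made this much less
    next_4_loss = 140      # next 4 percent down: would have made this much less

    top_5_shift = top_1_loss + next_4_loss   # 813: total shifted away from the top 5 percent
    residual = top_5_shift - bottom_80_gain  # 70: implied gain for the 81st-95th percentiles
    print(f"Shifted from the top 5 percent: ${top_5_shift} billion a year")
    print(f"Implied gain, 81st-95th percentiles: ${residual} billion a year")

The two sides balance once the upper-middle group between the 80th and 95th percentiles is counted, which is consistent with “roughly three-quarters of a trillion dollars” going to the broad bottom 80 percent.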
Of course, hard times are not new to ordinary Americans. Cycles of boom and bust have periodically wreaked havoc with our economy and disrupted the lives of average families for many decades. Unemployment took its cyclical toll, and people tightened their belts. But after the downturns ended and the recovery came, people got their jobs back, the economy expanded, and the middle class got back on the up escalator.
Today, mass layoffs are no longer a cyclical convulsion during hard times, but a permanent grinding reality even in good times. Firings and job cuts, antiseptically clothed in the corporate euphemisms of “restructuring” and “downsizing,” have become a chronic economic malignancy for average Americans. In a survey of one thousand companies, the American Management Association found rising numbers of companies reporting big job cuts even during the boom years of the late 1990s. When times got tough, from 2001 to 2003, roughly 5.4 million people were thrown out of work, mostly for reasons unrelated to their work performance. When they were surveyed in 2004, one-third had failed to find new jobs, and more than half of those who had found work were making less than before—a pattern repeated in the latest recession.
Overall, more than fifty-nine thousand factories and production facilities were shut down all across America over the last decade, and employment in the core manufacturing sector fell from 17.1 million to 11.8 million from January 2001 to December 2011, a punishing toll for what historically had been the best sector for steady, good-paying middle-class jobs. By pursuing a deliberate strategy of continual layoffs and by holding down wages, both of which yielded higher profits for investors, business leaders were not only squeezing their employees, they were slowly strangling the middle-class consumer demand that the nation needed for the next economic expansion.
This trend has made it far harder for the private sector to pull the country out of a slump, and it has increased the need for government action to stimulate the economy. The evidence is clear. With each recession since 1990, it has taken longer and longer for the U.S. economy to dig out of the hole and to regain the jobs lost in recession.
After the 1990 downturn, economists coined the term “jobless recovery” because it took much longer than usual—twenty-one months—to gain back the lost jobs. It was twice as bad after the 2001 recession. Getting the jobs back took forty-six months.
Stephen Roach, as chief economist for Morgan Stanley, called the painfully slow 2002–03 recovery “the weakest hiring cycle in modern history.” Roach was especially alarmed that even when jobs did come back, they paid less, offered fewer benefits, and provided less security. As he said, 97 percent of the new hiring from the economic bottom in 2002 through mid-2004 was for part-time work. Millions of the better-paying, full-time jobs were gone for good—sent offshore to increase corporate profits. Unemployment was increasingly a long-term structural cancer rather than a cyclical headache from which the middle class could more readily recover.
During the most recent recession, that highly profitable but job-crushing trend accelerated, so that by early 2011 The Wall Street Journal ran a front-page story about Corporate America sitting on idle capital amid high unemployment. “No Rush to Hire Even as Profits Soar,” the headline read. Corporations were reporting year-end profits of more than $1 trillion—up 28 percent from a year earlier—and promising dividend increases to affluent shareholders. The Dow Jones Industrial Average ran up above 12,000, while roughly twenty-nine million Americans were either unemployed, involuntarily working part-time, or dropping out of the labor market in despair. The rich had recovered from the recession, the middle was wounded and in pain, and Corporate America was hoarding $1.9 trillion in cash and expanding its overseas operations. That put a crimp on America’s recovery.
The numbers confirmed the pattern of the past three decades—the toll on average middle-class employees was heavy, while Corporate America was enjoying high profits. The old social contract had withered away.
The burden shift has turned the traditional definition of the American dream “on its ear.”
—“The MetLife Study of the American Dream”

More and more economic risk has been offloaded by government and corporations onto the increasingly fragile balance sheets of workers and their families. This … is at the root of Americans’ rising anxiety about their economic standing and future.
—Jacob Hacker, The Great Risk Shift
When Paul Taylor and Rich Morin of the Pew Research Center did a poll on how people were faring during the Great Recession, they put a face on America—actually, two faces. They described a revealing dichotomy in the public mood—a schizophrenia in which 55 percent of Americans reported they were in deep trouble, but 45 percent claimed to be holding their own. Taylor was so struck by these two different portraits that he titled their report “One Recession, Two Americas.” That dichotomy in attitudes—and experience—helps explain the nation’s sharp political divisions on such contentious issues as President Obama’s economic stimulus package and raising taxes on the wealthy.
The “Two Americas” report explained the dissonance in people’s experience, such as my own puzzlement at reading newspaper accounts of 15 million Americans being unemployed and 6.7 million families being foreclosed out of their homes, then seeing suburban restaurants jammed with people on a night out, spending as if the economy were strong.
We are literally Two Americas, remarkably out of touch with each other—the fortunate living the American Dream but lacking any practical comprehension of how the other half are suffering, month in and month out, unaware of the enervating toll of economic despair on the unfortunate half, many of whom just two or three years before had counted themselves among the fortunate.
The Pew survey documented a class split in America. Among the losers, the picture was bleak: Two-thirds said their family’s overall financial condition had worsened; 60 percent said they had to dig into savings or retirement funds to take care of current costs; 42 percent had to borrow money from family and friends to pay their bills; 48 percent had trouble finding medical care or paying for it. The psychological toll was heavy. By contrast, the other half, the relative winners, admitted to some problems such as stock market losses but described their woes as modest and manageable.
The fault lines dividing losers and winners were income and age. Nearly two-thirds of those earning $75,000 or more said they were holding their own, while nearly 70 percent of those making under $50,000 were losing ground. Most seniors over sixty-five, buttressed by Social Security and traditional lifetime pensions, were doing all right. But 60 percent of the people of working age, between eighteen and sixty-four, gloomily reported that they were falling behind.
The one thing the two groups had in common was their verdict that the ten-year period from 2000 through 2009 was the worst decade in more than half a century—the first in that span in which people had more negative than positive feelings. “The single most common word or phrase used to characterize the past 10 years,” the Pew Center reported, “is downhill, and other bleak terms such as poor, decline, chaotic, disaster, scary, and depressing are common.”
That language tells how average Americans feel.
The numbers describe the damage. In just one three-month period, the final quarter of 2008, American households lost $5.1 trillion of their wealth through plunging home values and steep stock market losses—the most ever in a single quarter in the fifty-seven years that the Federal Reserve has kept records. During the full year of 2008, American households lost $11.1 trillion, close to one-fifth of their total accumulated private wealth.
More and more trillions evaporated in 2009, 2010, and into 2011. With housing prices falling steadily for five straight years, unemployment stuck at stubbornly high levels, and the stock market bouncing up and down, periodically spooked by fear of a second dip into recession, those astronomical losses became permanently etched into the lives of millions of middle-class families. Their personal safety nets had been shredded.
Translating cold numbers into a graphic picture of the hard economic realities in the lives of ordinary people is a challenge. In the 1990s, economist Edward Hyman of the ISI Group devised the Misery Index to capture the stress put on average families by costly, unavoidable items that take a big bite out of family budgets and crimp what families have left to live on.
The Misery Index tracked four items—income taxes, Social Security taxes, medical costs, and interest payments. In 1960, these four items took 24 percent of family budgets; but by the 1990s, they were taking more than 42 percent. Income taxes were lower, but Social Security payroll taxes had risen along with medical costs and interest payments on mortgages and debt. In sum, necessities, not lavish spending habits, were eating up family income.
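To make the mechanics of the index concrete, here is a minimal sketch in Python; the dollar amounts are hypothetical placeholders, and only the idea of summing the four items as a share of income comes from Hyman’s description above:

    def misery_index(income, income_tax, payroll_tax, medical, interest):
        """Return the percentage of income consumed by the four tracked items."""
        return 100 * (income_tax + payroll_tax + medical + interest) / income

    # A hypothetical 1990s-style family budget (placeholder numbers):
    print(round(misery_index(50_000, 6_000, 3_800, 6_200, 5_000), 1))  # ~42.0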
More recently, Yale University political economist Jacob Hacker and his research team developed the Economic Insecurity Index, which logs the harshest economic blows a family can face—an income loss of 25 percent or more in a single year; superheavy medical expenses; or the exhaustion of a family’s financial reserves. Using this index, Hacker found that in 1985, roughly 10 percent of all Americans had suffered an acute financial trauma that year. By July 2010, the proportion had jumped to 20 percent—one in five American families suffering from an economic tornado ripping through their lives.
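The trigger logic of Hacker’s index can be sketched the same way; the 25 percent income-loss threshold comes from his definition above, while the medical-expense cutoff and the sample numbers are hypothetical, and his actual methodology is more detailed:

    def is_insecure(prev_income, income, medical_costs, reserves):
        """Flag a family-year as insecure if any one of the three blows occurred."""
        income_shock = income <= 0.75 * prev_income     # drop of 25 percent or more
        heavy_medical = medical_costs >= 0.25 * income  # hypothetical threshold
        exhausted = reserves <= 0                       # financial reserves gone
        return income_shock or heavy_medical or exhausted

    # Two hypothetical family-years:
    families = [
        (60_000, 40_000, 2_000, 5_000),  # 33 percent income drop -> insecure
        (60_000, 58_000, 1_000, 8_000),  # stable -> secure
    ]
    share = sum(is_insecure(*f) for f in families) / len(families)
    print(f"Insecure share: {share:.0%}")  # 50%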
As that Pew Center poll discovered, even middle-class families who avoided the most acute distress have experienced rising economic anxiety in the past two decades.
There is good reason for pervasive middle-class angst. Financial insecurity has been written into the DNA of the New Economy. Not only has the New Economy been more volatile and the economic gains been distributed more unequally than during the era of middle-class prosperity, but Corporate America has rewritten the social contract that once underpinned the security of most average Americans. The company-provided welfare safety net that rank-and-file employees enjoyed from the 1940s into the 1970s has been sharply cut back, and a huge share of the cost burden has been shifted from companies to their employees.
In 1980, for example, 70 percent of Americans who worked at companies with one hundred or more employees got health insurance coverage fully paid for by their employers. But from the 1980s onward, employers began requiring their employees to cover an increasing portion of the health costs. Other employers dropped company-financed health plans entirely, saying they could not afford them. Many small businesses made employees pay for all, or most, of the health insurance costs. As union membership declined in various industries, this trend gained momentum.