American Experiment
Authors: James MacGregor Burns
Out of the “Enormous Laboratory,” as Max Lerner called it, poured not only new machines and gadgets but the makings of wholly new or immensely enlarged industries—television, antibiotics, electronics, jet aircraft, rocketry. But the actual laboratories that produced this cornucopia of hardware were also the scenes of quiet encounters in one of the oldest intellectual conflicts in America—between the ideal of pure science and the practices of applied science.
Many Americans still venerated the ideal of pure, disinterested science, of free, undirected research, of idle speculation and inspired hunch, of lack of pressure for immediate “practical” results, of a clear separation from the cash nexus—all the more because they could claim only one American in the past century who was comparable to such luminaries of European science as Darwin, Mendel, and Faraday. This was Josiah Willard Gibbs, the Yale mathematician whose work in thermodynamics, vector analysis, and statistical mechanics had belatedly won him an international reputation and whose laws of chemical energetics had enormous impact on processes as varied as the refining of oil, the synthesizing of rubber, and the separation of metals from their ores.
Of scientific eminences the postwar United States had its share—Isidor Isaac Rabi and J. Robert Oppenheimer in physics, Hermann Joseph Muller in genetics, George Gaylord Simpson in evolutionary biology, Harlow Shapley in astrophysics, and scores of others. Yet most of these scientists depended largely on the theoretical work of Europeans. Most notably, it was the transformation of theoretical physics undertaken by Einstein, Heisenberg, and others in Germany that had laid the groundwork for atomic fission. Now, as the United States basked in its world economic supremacy, had the time and occasion come for Americans to make great theoretical contributions to pure science?
A century before, Karl Marx had warned that science could not for long be autonomous, that it was a social activity, that the nature of the demand for science was even more important than the quality of its supply. In America, science had to pay the piper. Giant corporations were eager to put vast sums of money into research, but of a special kind, really research and development.
While the firms varied in their toleration of free research, sooner or later they expected a payoff in new inventions, patents, profits. The R&D departments emphasized team research, committee decisions, pooled facilities, narrowly focused investigation. There was little encouragement of idle curiosity, messing around, just looking out the window. “The underlying principle, rarely formulated precisely but ever present,” a study concluded, “has been that originality can be organized; that, provided more people can be equipped with technical knowledge and brought together in larger groups, more new ideas must emerge; that mass production will produce originality just as it can produce sausages.” Military needs created even heavier demands for scientific group-think and the organization man.
Politicians and scientists alike attacked the restrictions on Soviet science, but Americans could hardly be complacent. Aside from confronting seductive commercial and military demands on R&D, scientists had to contend with a popular double impulse to worship them and to fear them—the worship leading to unduly high popular expectations followed by disappointments, the fear leading to suspicion of their unorthodoxy and associations, as witness the classification of Robert Oppenheimer as a “security risk.” Pleased by statements such as that of Harvard’s president, James Conant—subsidies should go to persons, not projects—some scientists sought to protect their freedom of inquiry and communication by remaining in the universities. But scholars in the groves of academe were not free from political and press attacks, outside pressures for directed research, the temptations to undertake team projects and group authorship, the enticements of big corporate and military money.
Perhaps the major obstacle to “free science,” however, was the empirical tradition in American scientific thought. The heroes of American popular science were the Thomas Edisons who disdained formal abstract knowledge or theorizing and preferred to tinker “by guess and by God” in their labs. It was this feet-on-the-ground compulsion that had channeled American genius into technology and engineering. If the nation were now to make a truly substantial contribution to pure science as well, greater freedom to reflect and to brood, freer play for the creative imagination, were crucial.
Possibly some of the applied scientists, ensconced in their big laboratories and snug in their teams, recalled the lot of Professor Gibbs. He had worked at Yale almost alone and undisturbed. He had no team. He had few close friends and few students. He had no wife or children. He had no pay from Yale for a decade or so, until Johns Hopkins in 1880 offered him a professorship with salary, at $3,000 a year. Only then did Yale put him on its payroll, at $2,000, “with prospects of an early increase.”
One controversial application of “science” related to the men and women who in turn related to machines. Initially called “scientific management,” it was first popularized by Frederick W. Taylor. After brilliant inventions of automatic grinding, forging, and tool-feeding mechanisms, Taylor had moved on at the turn of the century to time-and-motion studies designed to fit workers more closely to the imperatives of the machines and thereby increase industrial efficiency. The production process was functionalized and standardized by dividing it into measurable and controllable units of time and motion. Under Taylor’s leadership the idea was picked up by a host of large corporations, including American Locomotive, Brighton Mills, Yale and Towne Lock. Machines, however, proved more easily manageable than men. Most workers preferred to follow their own motivations, rhythms, craft routines, group standards. A strike of molders in 1911 at the huge Watertown Arsenal near Boston led to a government investigation and later a ban on Taylorism in government arsenals. A young assistant secretary, Franklin D. Roosevelt, imposed the ban in navy yards.
Turning away from Taylorism as a system of managerial dictation—Taylor himself declared each worker must become “one of a train of gearwheels”—some “industrial scientists” tried to civilize the production process by “human engineering” or “human relations.” Psychologists and other social scientists were enlisted in this cause. Often benign in intent while manipulative in technique, “humanizing” turned out to be an effort to motivate workers through their own psychological processes rather than through managerial controls. Advocates of the method said that it promoted better communication, involved workers in at least minor decisions, enhanced “group feeling” and a sense of teamwork, fostered “leadership” as opposed to “control.” During and after World War II, the idea of human relations in industry flourished.
Still the workers resisted. When Henry Ford II said that solving “the problem of human relations” would immensely speed up “progress toward lower costs,” men and women on the line could wonder whether their welfare or lower costs and higher profits were the goal. Union heads spoke sarcastically of foremen receiving training in the art of convincing workers “that they really are deeply beloved by the boss,” of employers “trooping to the special classes at Harvard” to learn that while the bosses were in business for a fast buck, workers reported to the plant each morning “for love, affection, and small friendly attentions.”
A thirty-seven-year-old worker, interviewed at home, described what real life was like “on the line.” His job was to spot-weld the front cowling onto an automobile underbody.
“I take a jig off the bench, put it in place and weld the parts together.” The jig was all made up in advance. “Takes me one minute and fifty-two seconds for each job. I walk along the line as it moves. Then I snap the jig off, walk back down the line, throw it on the bench, grab another just in time to start on the next car.”
He did this eight hours a day, with a breather in the morning and afternoon and a half-hour for lunch. “Sometimes the line breaks down. When it does we all yell ‘Whoopee!’ ”
He hated his work. “I like a job where you feel like you’re accomplishing something and doing it right.” But everything was laid out for him. “The big thing is that steady push of the conveyor—a gigantic machine which I can’t control.” He had ideas for improvements but no one asked him. “You go by the bible.”
Why not quit? “I’ll tell you honest. I’m scared to leave.” He was getting good pay, was on the pension plan, the lighting and ventilation were good, he could use the plant hospital. “Sorta trapped—you get what I mean?”
So how did he cope? By sharing the “misery” with his partner. “We gripe about the job 90 percent of the time.” By walking out with the others when something intolerable happened—like when a guy was “bounced” because he was slow on the line. By snapping at his family when he got home, his wife added. The people who ran the plant, the worker said finally, were “pretty good guys themselves.” But “you’re just a number to them. They number the stock and they number you.” He was just so much horsepower. “You’re just a cog in the wheel.”
His wife often wished he’d get another job. “He comes home at night, plops down in a chair and just sits.…”
If workers were not happy with their machines, applied scientists could invent a new machine that had less need of workers. This was automation. Mushrooming during the 1950s, the automatic equipment industry reached annual sales of over $6 billion by the end of the decade. World War II needs had hastened the development of electrical servomechanisms that operated on the principle of input-output flow and feedback in a continuously self-correcting control loop. Stimulated by such advances, the industry took off after the war and was soon integrating digital computers, sophisticated programming techniques, and vast data and memory banks into elaborate remote-control systems, including the automation of whole factories. By 1951 a Ford engine plant was feeding castings, already produced in an automated foundry, into precision broachers that machined the top and bottom of a cylinder block in thirteen seconds. Exclaimed an observer, “It just goes ‘whoosh’ and it is done.”
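The feedback principle behind those wartime servomechanisms, in which a system continuously measures its own output, compares it to the desired value, and feeds the error back as a correction, can be sketched in a few lines. The function name, gain, and numbers below are illustrative assumptions, not anything from the text; this is the simplest proportional form of the self-correcting loop described above.

```python
def run_servo(target, output=0.0, gain=0.5, steps=20):
    """Proportional feedback: each cycle measures the error between the
    desired value and the actual output, then corrects a fraction of it."""
    for _ in range(steps):
        error = target - output   # compare output to the desired input
        output += gain * error    # self-correcting adjustment shrinks the error
    return output

# Each pass halves the remaining error, so the output converges on the target.
print(round(run_servo(10.0), 3))  # → 10.0
```

With a gain of 0.5 the error shrinks geometrically each cycle, which is why such a loop, unlike an open-loop machine, keeps correcting itself without a human tender.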
“Automation is a magical key of creation,” proclaimed the National Association of Manufacturers. “Guided by electronics, powered by atomic energy, geared to the smooth, effortless workings of automation, the magic carpet of our free economy heads for distant and undreamed of horizons.” Others were less euphoric but argued that automation would shrink the number of boring and degrading repetitive tasks, raise educable workers to higher levels of skill and pay, lessen worker fatigue, depression, and unrest.
Still others were not at all enchanted by the “whoosh.” Union leaders stood en garde.
The problem was not whether unions were for or against automation, said James B. Carey, president of the International Union of Electrical Workers. “The problem is whether or not the American people and our free society will be subjected to vast dislocations during the coming ten to twenty years, when the automatic operation of many industrial and clerical processes will be introduced.”
Fortune had published a photograph of the “old production line”—a vast room full of workers individually tending their machines—followed by drawings of the proposed “automatic factory.” Not a worker was to be seen in the drawings—not even the ornery old “parts inspector.” A photoelectric scanning device would do his job.
At a congressional hearing late in 1955 President Walter Reuther of the Automobile Workers roundly denounced the NAM’s portrayal of automation as part of industrialization’s “Second American Revolution.” Had the NAM forgotten the misery that accompanied the first? Reuther asked. Displaced workers would not give up family ties, local roots, and neighborhood belongingness to go off to new jobs, even if they could find them and were young enough to take them. “Will automation mean the creation of whole new communities in some areas, while others are turned into ghost towns? How can we increase the market for goods and services sufficiently, and quickly enough, to match greatly accelerated increases in productivity?” Industry replied that displaced workers could find better jobs under automation, indeed that automation would create a bigger pie and “everybody’s slice will be larger.”
While the argument waxed, so did automation. Ford helped lead the way, with its partially automated cylinder-block line and automated production of crankshafts and small parts. As the number of workers “on the line” increased and the number doing more skilled “bench work” on parts and subassemblies dropped, auto worker militancy fell. It had been the more skilled workers, such as metal trimmers and torch welders, with their comradeship and critical production role, who had sparked the great strikes and demonstrations. “Automated” workers appeared to be psychologically atomized.
It was this wider impact of automation and of the tendencies that accompanied it—toward bigness, bureaucratization, even bondage—that concerned a wide array of social observers. Deep concern over such tendencies was almost as old as the trends themselves. From the rampaging machine wreckers at the dawn of the industrial revolution to the latest walkout in protest against automation, human beings had feared the machine as a threat to their status, income, security, and pride. Marx had seen that productive forces rising from technological-social change both reinforced the social order and undermined it. A century before Ford’s automation William Morris fought to preserve handicrafts against the ravaging advance of the machine.
The Englishman Samuel Butler wrote in his 1872 anti-utopian novel, Erewhon, that man is supposed to be the master and the machine the servant, but “the servant glides by imperceptible approaches into the master,” and now man is overly dependent on his “servant,” and his very soul is becoming a machine-made thing. Man is in bondage; he can only “hope that the machines will use us kindly.”
By the 1950s there was less concern over the economic and industrial effects of automation and other technological developments than over the psychological and social. Sociologists feared that the obsessive focus on production, combined with the fragmentation of workers’ lives into numbing pressure on the job and emptiness outside it, in the long run would impair both efficiency and the health of the whole culture. Daniel Bell noted Freud’s observation that work was the chief means of binding an individual to reality. “What will happen, then, when not only the worker but work itself is displaced by the machine?” Many social scientists were influenced by the work of Lewis Mumford, who in Technics and Civilization and other writings had graphically pictured the machine as part of a system of power, superfluous production as “purposeless materialism,” and technology as increasingly the master of man. Two technologies existed side by side, Mumford wrote in the wake of the 1950s, “one authoritarian, the other democratic, the first system-centered, immensely powerful, but inherently unstable, the other man-centered, relatively weak, but resourceful and durable.” It was time for human interventions in behalf of human alternatives.