
Ours may be a time of material comfort and technological wonder, but it’s also a time of aimlessness and gloom. During the first decade of this century, the number of Americans taking prescription drugs to treat depression or anxiety rose by nearly a quarter. One in five adults now regularly takes such medications. [19] The suicide rate among middle-aged Americans increased by nearly 30 percent over the same ten years, according to a report from the Centers for Disease Control and Prevention. [20] More than 10 percent of American schoolchildren, and nearly 20 percent of high-school-age boys, have been given a diagnosis of attention deficit hyperactivity disorder, and two-thirds of that group take drugs like Ritalin and Adderall to treat the condition. [21] The reasons for our discontent are many and far from understood. But one of them may be that through the pursuit of a frictionless existence, we’ve succeeded in turning what Merleau-Ponty termed the ground of our lives into a barren place. Drugs that numb the nervous system provide a way to rein in our vital, animal sensorium, to shrink our being to a size that better suits our constricted environs.

Frost’s sonnet also contains, as one of its many whispers, a warning about technology’s ethical hazards. There’s a brutality to the mower’s scythe. It indiscriminately cuts down flowers—those tender, pale orchises—along with the stalks of grass.* It frightens innocent animals, like the bright green snake. If technology embodies our dreams, it also embodies other, less benign qualities in our makeup, such as our will to power and the arrogance and insensitivity that accompany it. Frost returns to this theme a little later in A Boy’s Will, in a second lyric about cutting hay, “The Tuft of Flowers.” The poem’s narrator comes upon a freshly mown field and, while following the flight of a passing butterfly with his eyes, discovers in the midst of the cut grass a small cluster of flowers, “a leaping tongue of bloom” that “the scythe had spared”:

The mower in the dew had loved them thus,
By leaving them to flourish, not for us,
Nor yet to draw one thought of us to him,
But from sheer morning gladness to the brim. [22]

Working with a tool is never just a practical matter, Frost is telling us, with characteristic delicacy. It always entails moral choices and has moral consequences. It’s up to us, as users and makers of tools, to humanize technology, to aim its cold blade wisely. That requires vigilance and care.

The scythe is still employed in subsistence farming in many parts of the world. But it has no place on the modern farm, the development of which, like the development of the modern factory, office, and home, has required ever more complex and efficient equipment. The threshing machine was invented in the 1780s, the mechanical reaper appeared around 1835, the baler came a few years after that, and the combine harvester began to be produced commercially toward the end of the nineteenth century. The pace of technological advance has only accelerated in the decades since, and today the trend is reaching its logical conclusion with the computerization of agriculture. The working of the soil, which Thomas Jefferson saw as the most vigorous and virtuous of occupations, is being offloaded almost entirely to machines. Farmhands are being replaced by “drone tractors” and other robotic systems that, using sensors, satellite signals, and software, plant seeds, fertilize and weed fields, harvest and package crops, and milk cows and tend other livestock. [23] In development are robo-shepherds that guide flocks through pastures. Even if scythes still whispered in the fields of the industrial farm, no one would be around to hear them.

The congeniality of hand tools encourages us to take responsibility for their use. Because we sense the tools as extensions of our bodies, parts of ourselves, we have little choice but to be intimately involved in the ethical choices they present. The scythe doesn’t choose to slash or spare the flowers; the mower does. As we become more expert in the use of a tool, our sense of responsibility for it naturally strengthens. To the novice mower, a scythe may feel like a foreign object in the hands; to the accomplished mower, hands and scythe become one thing. Talent tightens the bond between an instrument and its user. This feeling of physical and ethical entanglement doesn’t have to go away as technologies become more complex. In reporting on his historic solo flight across the Atlantic in 1927, Charles Lindbergh spoke of his plane and himself as if they were a single being: “We have made this flight across the ocean, not I or it.” [24] The airplane was a complicated system encompassing many components, but to a skilled pilot it still had the intimate quality of a hand tool. The love that lays the swale in rows is also the love that parts the clouds for the stick-and-rudder man.

Automation weakens the bond between tool and user not because computer-controlled systems are complex but because they ask so little of us. They hide their workings in secret code. They resist any involvement of the operator beyond the bare minimum. They discourage the development of skillfulness in their use. Automation ends up having an anesthetizing effect. We no longer feel our tools as parts of ourselves. In a seminal 1960 paper called “Man-Computer Symbiosis,” the psychologist and engineer J. C. R. Licklider described the shift in our relation to technology well. “In the man-machine systems of the past,” he wrote, “the human operator supplied the initiative, the direction, the integration, and the criterion. The mechanical parts of the systems were mere extensions, first of the human arm, then of the human eye.” The introduction of the computer changed all that. “ ‘Mechanical extension’ has given way to replacement of men, to automation, and the men who remain are there more to help than to be helped.” [25] The more automated everything gets, the easier it becomes to see technology as a kind of implacable, alien force that lies beyond our control and influence. Attempting to alter the path of its development seems futile. We press the on switch and follow the programmed routine.

To adopt such a submissive posture, however understandable it may be, is to shirk our responsibility for managing progress. A robotic harvesting machine may have no one in the driver’s seat, but it is every bit as much a product of conscious human thought as a humble scythe is. We may not incorporate the machine into our brain maps, as we do the hand tool, but on an ethical level the machine still operates as an extension of our will. Its intentions are our intentions. If a robot scares a bright green snake (or worse), we’re still to blame. We shirk a deeper responsibility as well: that of overseeing the conditions for the construction of the self. As computer systems and software applications come to play an ever larger role in shaping our lives and the world, we have an obligation to be more, not less, involved in decisions about their design and use—before technological momentum forecloses our options. We should be careful about what we make.

If that sounds naive or hopeless, it’s because we have been misled by a metaphor. We’ve defined our relation with technology not as that of body and limb or even that of sibling and sibling but as that of master and slave. The idea goes way back. It took hold at the dawn of Western philosophical thought, emerging first, as Langdon Winner has described, with the ancient Athenians. [26] Aristotle, in discussing the operation of households at the beginning of his Politics, argued that slaves and tools are essentially equivalent, the former acting as “animate instruments” and the latter as “inanimate instruments” in the service of the master of the house. If tools could somehow become animate, Aristotle posited, they would be able to substitute directly for the labor of slaves. “There is only one condition on which we can imagine managers not needing subordinates, and masters not needing slaves,” he mused, anticipating the arrival of computer automation and even machine learning. “This condition would be that each [inanimate] instrument could do its own work, at the word of command or by intelligent anticipation.” It would be “as if a shuttle should weave itself, and a plectrum should do its own harp-playing.” [27]

The conception of tools as slaves has colored our thinking ever since. It informs society’s recurring dream of emancipation from toil, the one that was voiced by Marx and Wilde and Keynes and that continues to find expression in the works of technophiles and technophobes alike. “Wilde was right,” Evgeny Morozov, the technology critic, wrote in his 2013 book To Save Everything, Click Here: “mechanical slavery is the enabler of human liberation.” [28] We’ll all soon have “personal workbots” at our “beck and call,” Kevin Kelly, the technology enthusiast, proclaimed in a Wired essay that same year. “They will do jobs we have been doing, and do them much better than we can.” More than that, they will free us to discover “new tasks that expand who we are. They will let us focus on becoming more human than we were.” [29] Mother Jones’s Kevin Drum, also writing in 2013, declared that “a robotic paradise of leisure and contemplation eventually awaits us.” By 2040, he predicted, our super-smart, super-reliable, super-compliant computer slaves—“they never get tired, they’re never ill-tempered, they never make mistakes”—will have rescued us from labor and delivered us into an upgraded Eden. “Our days are spent however we please, perhaps in study, perhaps playing video games. It’s up to us.” [30]

With its roles reversed, the metaphor also informs society’s nightmares about technology. As we become dependent on our technological slaves, the thinking goes, we turn into slaves ourselves. From the eighteenth century on, social critics have routinely portrayed factory machinery as forcing workers into bondage. “Masses of labourers,” wrote Marx and Engels in their Communist Manifesto, “are daily and hourly enslaved by the machine.” [31] Today, people complain all the time about feeling like slaves to their appliances and gadgets. “Smart devices are sometimes empowering,” observed The Economist in “Slaves to the Smartphone,” an article published in 2012. “But for most people the servant has become the master.” [32] More dramatically still, the idea of a robot uprising, in which computers with artificial intelligence transform themselves from our slaves to our masters, has for a century been a central theme in dystopian fantasies about the future. The very word robot, coined by a science-fiction writer in 1920, comes from robota, a Czech term for servitude.

The master-slave metaphor, in addition to being morally fraught, distorts the way we look at technology. It reinforces the sense that our tools are separate from ourselves, that our instruments have an agency independent of our own. We start to judge our technologies not on what they enable us to do but rather on their intrinsic qualities as products—their cleverness, their efficiency, their novelty, their style. We choose a tool because it’s new or it’s cool or it’s fast, not because it brings us more fully into the world and expands the ground of our experiences and perceptions. We become mere consumers of technology.

More broadly, the metaphor encourages society to take a simplistic and fatalistic view of technology and progress. If we assume that our tools act as slaves on our behalf, always working in our best interest, then any attempt to place limits on technology becomes hard to defend. Each advance grants us greater freedom and takes us a stride closer to, if not utopia, then at least the best of all possible worlds. Any misstep, we tell ourselves, will be quickly corrected by subsequent innovations. If we just let progress do its thing, it will find remedies for the problems it creates. “Technology is not neutral but serves as an overwhelming positive force in human culture,” writes Kelly, expressing the self-serving Silicon Valley ideology that in recent years has gained wide currency. “We have a moral obligation to increase technology because it increases opportunities.” [33] The sense of moral obligation strengthens with the advance of automation, which, after all, provides us with the most animate of instruments, the slaves that, as Aristotle anticipated, are most capable of releasing us from our labors.

The belief in technology as a benevolent, self-healing, autonomous force is seductive. It allows us to feel optimistic about the future while relieving us of responsibility for that future. It particularly suits the interests of those who have become extraordinarily wealthy through the labor-saving, profit-concentrating effects of automated systems and the computers that control them. It provides our new plutocrats with a heroic narrative in which they play starring roles: recent job losses may be unfortunate, but they’re a necessary evil on the path to the human race’s eventual emancipation by the computerized slaves that our benevolent enterprises are creating. Peter Thiel, a successful entrepreneur and investor who has become one of Silicon Valley’s most prominent thinkers, grants that “a robotics revolution would basically have the effect of people losing their jobs.” But, he hastens to add, “it would have the benefit of freeing people up to do many other things.” [34] Being freed up sounds a lot more pleasant than being fired.
