The Glass Cage: Automation and Us

Social networks push us to present ourselves in ways that conform to the interests and prejudices of the companies that run them. Facebook, through its Timeline and other documentary features, encourages its members to think of their public image as indistinguishable from their identity. It wants to lock them into a single, uniform “self” that persists throughout their lives, unfolding in a coherent narrative beginning in childhood and ending, one presumes, with death. This fits with its founder’s narrow conception of the self and its possibilities. “You have one identity,” Mark Zuckerberg has said. “The days of you having a different image for your work friends or co-workers and for the other people you know are probably coming to an end pretty quickly.” He even argues that “having two identities for yourself is an example of a lack of integrity.” 35
That view, not surprisingly, dovetails with Facebook’s desire to package its members as neat and coherent sets of data for advertisers. It has the added benefit, for the company, of making concerns about personal privacy seem less valid. If having more than one identity indicates a lack of integrity, then a yearning to keep certain thoughts or activities out of public view suggests a weakness of character. But the conception of selfhood that Facebook imposes through its software can be stifling. The self is rarely fixed. It has a protean quality. It emerges through personal exploration, and it shifts with circumstances. That’s especially true in youth, when a person’s self-conception is fluid, subject to testing, experimentation, and revision. To be locked into an identity, particularly early in one’s life, may foreclose opportunities for personal growth and fulfillment.

Every piece of software contains such hidden assumptions. Search engines, in automating intellectual inquiry, give precedence to popularity and recency over diversity of opinion, rigor of argument, or quality of expression. Like all analytical programs, they have a bias toward criteria that lend themselves to statistical analysis, downplaying those that entail the exercise of taste or other subjective judgments. Automated essay-grading algorithms encourage in students a rote mastery of the mechanics of writing. The programs are deaf to tone, uninterested in knowledge’s nuances, and actively resistant to creative expression. The deliberate breaking of a grammatical rule may delight a reader, but it’s anathema to a computer. Recommendation engines, whether suggesting a movie or a potential love interest, cater to our established desires rather than challenging us with the new and unexpected. They assume we prefer custom to adventure, predictability to whimsy. The technologies of home automation, which allow things like lighting, heating, cooking, and entertainment to be meticulously programmed, impose a Taylorist mentality on domestic life. They subtly encourage people to adapt themselves to established routines and schedules, making homes more like workplaces.

The biases in software can distort societal decisions as well as personal ones. In promoting its self-driving cars, Google has suggested that the vehicles will dramatically reduce the number of crashes, if not eliminate them entirely. “Do you know that driving accidents are the number one cause of death for young people?” Sebastian Thrun said in a 2011 speech. “And do you realize that almost all of those are due to human error and not machine error, and can therefore be prevented by machines?” 36
Thrun’s argument is compelling. In regulating hazardous activities like driving, society has long given safety a high priority, and everyone appreciates the role technological innovation can play in reducing the risk of mishaps and injuries. Even here, though, things aren’t as black-and-white as Thrun implies. The ability of autonomous cars to prevent accidents and deaths remains theoretical at this point. As we’ve seen, the relationship between machinery and human error is complicated; it rarely plays out as expected. Society’s goals, moreover, are never one-dimensional. Even the desire for safety requires interrogation. We’ve always recognized that laws and behavioral norms entail trade-offs between safety and liberty, between protecting ourselves and putting ourselves at risk. We allow and sometimes encourage people to engage in dangerous hobbies, sports, and other pursuits. A full life, we know, is not a perfectly insulated life. Even when it comes to setting speed limits on highways, we balance the goal of safety with other aims.

Difficult and often politically contentious, such trade-offs shape the kind of society we live in. The question is, do we want to cede the choices to software companies? When we look to automation as a panacea for human failings, we foreclose other options. A rush to embrace autonomous cars might do more than curtail personal freedom and responsibility; it might preclude us from exploring alternative ways to reduce the probability of traffic accidents, such as strengthening driver education or promoting mass transit.

It’s worth noting that Silicon Valley’s concern with highway safety, though no doubt sincere, has been selective. The distractions caused by cell phones and smartphones have in recent years become a major factor in car crashes. An analysis by the National Safety Council implicated phone use in one-fourth of all accidents on U.S. roads in 2012. 37
Yet Google and other top tech firms have made little or no effort to develop software to prevent people from calling, texting, or using apps while driving—surely a modest undertaking compared with building a car that can drive itself. Google has even sent its lobbyists into state capitals to block bills that would ban drivers from wearing Glass and other distracting eyewear. We should welcome the important contributions computer companies can make to society’s well-being, but we shouldn’t confuse those companies’ interests with our own.

If we don’t understand the commercial, political, intellectual, and ethical motivations of the people writing our software, or the limitations inherent in automated data processing, we open ourselves to manipulation. We risk, as Latour suggests, replacing our own intentions with those of others, without even realizing that the swap has occurred. The more we habituate ourselves to the technology, the greater the risk grows.

It’s one thing for indoor plumbing to become invisible, to fade from our view as we adapt ourselves, happily, to its presence. Even if we’re incapable of fixing a leaky faucet or troubleshooting a balky toilet, we tend to have a pretty good sense of what the pipes in our homes do—and why. Most technologies that have become invisible to us through their ubiquity are like that. Their workings, and the assumptions and interests underlying their workings, are self-evident, or at least discernible. The technologies may have unintended effects—indoor plumbing changed the way people think about hygiene and privacy 38 —but they rarely have hidden agendas.

It’s a very different thing for information technologies to become invisible. Even when we’re conscious of their presence in our lives, computer systems are opaque to us. Software codes are hidden from our eyes, legally protected as trade secrets in many cases. Even if we could see them, few of us would be able to make sense of them. They’re written in languages we don’t understand. The data fed into algorithms is also concealed from us, often stored in distant, tightly guarded data centers. We have little knowledge of how the data is collected, what it’s used for, or who has access to it. Now that software and data are stored in the cloud, rather than on personal hard drives, we can’t even be sure when the workings of systems have changed. Revisions to popular programs are made all the time without our awareness. The application we used yesterday is probably not the application we use today.

The modern world has always been complicated. Fragmented into specialized domains of skill and knowledge, coiled with economic and other systems, it rebuffs any attempt to comprehend it in its entirety. But now, to a degree far beyond anything we’ve experienced before, the complexity itself is hidden from us. It’s veiled behind the artfully contrived simplicity of the screen, the user-friendly, frictionless interface. We’re surrounded by what the political scientist Langdon Winner has termed “concealed electronic complexity.” The “relationships and connections” that were “once part of mundane experience,” manifest in direct interactions among people and between people and things, have become “enshrouded in abstraction.” 39
When an inscrutable technology becomes an invisible technology, we would be wise to be concerned. At that point, the technology’s assumptions and intentions have infiltrated our own desires and actions. We no longer know whether the software is aiding us or controlling us. We’re behind the wheel, but we can’t be sure who’s driving.

THE LOVE THAT LAYS THE SWALE IN ROWS

There’s a line of verse I’m always coming back to, and it’s been on my mind even more than usual as I’ve worked my way through the manuscript of this book:

The fact is the sweetest dream that labor knows.

It’s the second to last line of one of Robert Frost’s earliest and best poems, a sonnet called “Mowing.” He wrote it just after the turn of the twentieth century, when he was a young man, in his twenties, with a young family. He was working as a farmer, raising chickens and tending a few apple trees on a small plot of land his grandfather had bought for him in Derry, New Hampshire. It was a difficult time in his life. He had little money and few prospects. He had dropped out of two colleges, Dartmouth and Harvard, without earning a degree. He had been unsuccessful in a succession of petty jobs. He was sickly. He had nightmares. His firstborn child, a son, had died of cholera at the age of three. His marriage was troubled. “Life was peremptory,” Frost would later recall, “and threw me into confusion.” 1

But it was during those lonely years in Derry that he came into his own as a writer and an artist. Something about farming—the long, repetitive days, the solitary work, the closeness to nature’s beauty and carelessness—inspired him. The burden of labor eased the burden of life. “If I feel timeless and immortal it is from having lost track of time for five or six years there,” he would write of his stay in Derry. “We gave up winding clocks. Our ideas got untimely from not taking newspapers for a long period. It couldn’t have been more perfect if we had planned it or foreseen what we were getting into.” 2
In the breaks between chores on the farm, Frost somehow managed to write most of the poems for his first book, A Boy’s Will; about half the poems for his second book, North of Boston; and a good number of other poems that would find their way into subsequent volumes.

“Mowing,” from A Boy’s Will, was the greatest of his Derry lyrics. It was the poem in which he found his distinctive voice: plainspoken and conversational, but also sly and dissembling. (To really understand Frost—to really understand anything, including yourself—requires as much mistrust as trust.) As with many of his best works, “Mowing” has an enigmatic, almost hallucinatory quality that belies the simple and homely picture it paints—in this case of a man cutting a field of grass for hay. The more you read the poem, the deeper and stranger it becomes:

There was never a sound beside the wood but one,

And that was my long scythe whispering to the ground.

What was it it whispered? I knew not well myself;

Perhaps it was something about the heat of the sun,

Something, perhaps, about the lack of sound—

And that was why it whispered and did not speak.

It was no dream of the gift of idle hours,

Or easy gold at the hand of fay or elf:

Anything more than the truth would have seemed too weak

To the earnest love that laid the swale in rows,

Not without feeble-pointed spikes of flowers

(Pale orchises), and scared a bright green snake.

The fact is the sweetest dream that labor knows.

My long scythe whispered and left the hay to make. 3

We rarely look to poetry for instruction anymore, but here we see how a poet’s scrutiny of the world can be more subtle and discerning than a scientist’s. Frost understood the meaning of what we now call “flow” and the essence of what we now call “embodied cognition” long before psychologists and neurobiologists delivered the empirical evidence. His mower is not an airbrushed peasant, a romantic caricature. He’s a farmer, a man doing a hard job on a still, hot summer day. He’s not dreaming of “idle hours” or “easy gold.” His mind is on his work—the bodily rhythm of the cutting, the weight of the tool in his hands, the stalks piling up around him. He’s not seeking some greater truth beyond the work. The work is the truth.

The fact is the sweetest dream that labor knows.

There are mysteries in that line. Its power lies in its refusal to mean anything more or less than what it says. But it seems clear that what Frost is getting at, in the line and in the poem, is the centrality of action to both living and knowing. Only through work that brings us into the world do we approach a true understanding of existence, of “the fact.” It’s not an understanding that can be put into words. It can’t be made explicit. It’s nothing more than a whisper. To hear it, you need to get very near its source. Labor, whether of the body or the mind, is more than a way of getting things done. It’s a form of contemplation, a way of seeing the world face-to-face rather than through a glass. Action un-mediates perception, gets us close to the thing itself. It binds us to the earth, Frost implies, as love binds us to one another. The antithesis of transcendence, work puts us in our place.

