
He’s thinking about the Bible.

Earlier in the day, an unusually temperate February morning, one of Dr. Atchley’s cognitive neuroscience students had asked him about the scientific validity of intelligent design. That’s the idea that the existence of the earth and its people is better explained by divine inspiration than by evolution. Dr. Atchley’s initial response was that intelligent design qualifies as “pseudoscience.” But he brought himself here, to this walk, in the open and the silence, to clear his head and ponder the question.

This is a big part of every day for Dr. Atchley, if not the walk, the retreat. He lives here, on twenty-six acres, about a ten-minute drive from campus—in an underground house that he designed a few years ago. It’s around 2,600 square feet, and covered fully by hardscrabble brown dirt on one side, with nearly eight feet of land above the structure of the roof. One side, facing south, is window-lined, inviting light and heat.

His wife, Ruthann, is chair of the psychology department at Kansas. The Drs. Atchley think of this place as their quiet, protective cocoon, and also as an architectural novelty; they still seem bemused that the bank ever lent them the money to build it.

Underground, inside the house, it’s cool in the winter, and mostly in the summer, too. Year-round, there is no cell phone service, at least not inside. They do get Internet access. Outside: owls, frogs, deer, the occasional bobcat. Dr. Atchley also keeps a small colony of bees.

In the garage, he parks a Subaru with a personalized license plate. It reads: ATTEND. When asked to explain, he jokes: “Because ‘Turn off your fucking cell phone’ is too long for a license plate.”

ON THE DAY OF the conversation about intelligent design, Dr. Atchley walked and thought, and later that night, he crafted an email back to his student in which he quoted Hebrews 11:1.

“Now faith is the substance of things hoped for, the evidence of things not seen.”

And then Dr. Atchley continued his note in his own words: “If you try to play the game of using a method that relies on testing observable evidence, it seems to me you ignore the message of faith, which is that faith does not require evidence and should be strong in the face of evidence to the contrary (reread Job for a better lesson on faith despite contrary evidence.)”

In other words: Don’t expect science to prove your faith.

Religion doesn’t come up much for Dr. Atchley, and when it does, he can call on the years he spent at a Jesuit high school. Usually, he’s exploring people’s dedication to a different idol, that of technology.

Why are we so drawn to our devices?

What makes us check them all the time? When sitting at dinner? When behind the wheel of a car?

“Are these devices so attractive that, despite our best intentions, we cannot help ourselves?”

He believes that to be true, but is not relying on his gut. He wants to prove it. “Some of the questions appear to be difficult, even impossible to answer. But they might not be impossible to answer. We may not be able to point to the exact mechanism in the brain, but we can infer it with the right kinds of experiments.”

For Dr. Atchley, who’s doing some of the foremost research in the field, the questions suggest he’s awoken from his own blind faith in technology.

BEFORE THERE WERE COMPUTERS in Silicon Valley, before the land gave way to industrial design, there was fruit. Oranges, pomegranates, and avocados. Acre after acre. Trees and dust under a hot sun. Perfect farming conditions. Before Silicon Valley was Silicon Valley, it was one big farm, the Valley of the Heart’s Delight.

Talk about innovation. The fruit cup was invented in the Valley of the Heart’s Delight, by Del Monte.

Then World War II came along. And the Varian brothers and Hewlett and Packard arrived. They got their early funding from the federal government. Defense contractors with a high-tech bent.

This intersection of computers, telecommunications, and the military would yield a change arguably as significant and characteristic of modern life as anything in medicine and the industrialization of food. It was the birth of the Internet, the product of a research program initiated in 1973 by a branch of the military called the Defense Advanced Research Projects Agency (DARPA). The aim was to create a communications system that would span multiple networks, making it less vulnerable to attack or instability. It was, if nothing else, very hardy.

Silicon Valley became an engine for its growth, serving it and feeding from it. Still, there were orchards left, a dwindling handful. Computers and communications technology commingling with open spaces.

A perfect place to be a thirteen-year-old. Particularly one with a bike, an innate curiosity, and a latchkey.

A younger Dr. Atchley, an only child with working parents, had to entertain himself. His mother, a hippie by orientation, worked as a legal secretary; his stepfather was a physicist and engineer who designed machines that made silicon wafers, which computer microprocessors are built on.

Left to himself, Paul, a slight boy with dark hair, spent hours trucking through the patchwork of fields and residences. He looked for rocks to turn over and ditches to explore. “I can still smell what it smelled like in those summertime dried-up algae-frog-filled catching places,” he says.

He was also fascinated by computers. He vividly remembers the day he went with his dad (actually his stepfather, who had legally adopted him) to the local Byte Shop to look at the first Apple computer. It cost more than $1,000. Way out of reach. Paul’s own at-home technology was limited to a black-and-white television he had in his room, and on which he watched local channels and monster movies. He loved fantasy and sci-fi, and read books about what to do to survive a nuclear attack; he imagined he’d live in an underground house.

Finally, he got a computer. It was called a TI-99, made by Texas Instruments, and it was one of the first home computers. Kids used them to play games. Paul did that. But his interest went beyond games.

“It wasn’t that it gave me the ability to play games. It was that it gave me the ability to do anything I wanted with it—program my own games, use the tape drive to store secret information. What it really represented was limitless potential.

“At that point in my life, I was convinced I’d become a botanist on a space station.”

He was serious. Technology had been moving so quickly, doubling, tripling, quadrupling in power. “We were expanding beyond ourselves, beyond our own planet, pushing back frontiers, bettering ourselves.”

And communicating across geographic barriers. From his room, he could reach out across the city, the country, the globe.

He didn’t realize it, but he was in the midst of an extraordinary time.

ONE REASON THINGS WERE changing so much owed to a familiar technology maxim called Moore’s law. It essentially says computing power doubles every eighteen months to two years.

But there is another key technology axiom, one that describes a different kind of change to the world and our lives, and that would indirectly drive Paul’s eventual research: Metcalfe’s law. It holds that the value of a telecommunications network, say, the Internet, is proportional to the square of the number of users. The more people, the more valuable the network.

It was named after Robert Metcalfe, an electrical engineer and innovator who helped develop the Ethernet networking standard used to connect computers over short distances. According to a history published by Princeton University, Metcalfe’s law was officially christened in 1993, but the principle was first identified in 1980, when Paul was getting his TI-99 and dividing his time between it and his bike. It’s not that the concepts eloquently captured by Metcalfe’s law were entirely new; networks had been developing, and their potential significance was recognized, through the latter half of the century. But Metcalfe put a fine point on what had become a core attribute of media by the end of the twentieth century.
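To make the two maxims concrete, here is a minimal numeric sketch in Python. The starting values, time spans, and user counts are illustrative assumptions, not figures from the book; the point is only the shape of the growth each law describes.

```python
# Illustrative sketch of the two growth laws described above.
# All starting values and time spans below are made up for demonstration.

def moores_law(initial_power: float, years: float, doubling_years: float = 2.0) -> float:
    """Processing power after `years`, assuming it doubles every `doubling_years`."""
    return initial_power * 2 ** (years / doubling_years)

def metcalfes_law(users: int) -> float:
    """Network value, taken as proportional to the square of the user count."""
    return users ** 2

if __name__ == "__main__":
    # Hypothetical chip: at a two-year doubling cadence, power grows ~32x in a decade.
    print(moores_law(1.0, years=10))                    # 32.0

    # Hypothetical network: ten times the users yields a hundred times the value.
    print(metcalfes_law(1_000) / metcalfes_law(100))    # 100.0
```

Moore’s curve compounds the machine’s raw power; Metcalfe’s curve compounds the pull of the network the machine connects to, the pairing the chapter returns to below.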

One way to grasp how much had changed, how much power had come to personal communications, is a simple comparison. In World War II, an extraordinary calculating machine commissioned by the U.S. military, the Electronic Numerical Integrator and Computer (ENIAC), could, each second, perform around 350 multiplications or 5,000 simple additions. By 2012, the iPhone 4 made by Apple could execute two billion instructions per second. The iPhone 5, released that year, even more.

The ENIAC weighed thirty tons. The iPhone 5 weighs less than four ounces. It carries voice communications and the Internet, a crystallization of all the wondrous powers of the previous millennia, a machine in our pockets that, on its face, works fully in people’s service, the ultimate entertainment and productivity machine.
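Taking the quoted figures at face value, a rough calculation (a loose comparison, since “simple additions” and “instructions” are not the same unit of work) puts the gap at several hundred thousand fold:

```python
# Rough ratio of the figures quoted above; the comparison is loose by design,
# since ENIAC "additions" and iPhone "instructions" are not equivalent units.
eniac_additions_per_second = 5_000
iphone_instructions_per_second = 2_000_000_000

speedup = iphone_instructions_per_second / eniac_additions_per_second
print(f"roughly {speedup:,.0f}x")   # roughly 400,000x
```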

By the standard of the iPhone, when Dr. Atchley was just teenage Paul, the pace of communication and the amount and variety of information, whether print, voice, or video, were relatively limited, certainly by computer or mobile phone if not by landline.

As Paul was growing up, a half generation before Reggie came of age, there was a coming together of these two fundamental computing principles, Moore’s law and Metcalfe’s law. One defined the acceleration of computer processing power, which allowed not just speed, but so many capabilities, including, at its core, interactivity; the other captured the rapid expansion of the communications network and its value.

In combination, they provided unprecedented service to humans. But they were also putting a new kind of pressure on the human brain: Moore bringing increased information, ever faster, and Metcalfe making the information so personal as to make the gadgets extraordinarily seductive, even addictive.

A FEW MONTHS BEFORE Dr. Atchley took that February-morning walk with his dogs, he attended a first-of-its-kind conference in Southern California. About two hundred neuroscientists gathered with support from the National Academy of Sciences to confront a new question: What is technology doing to our brains?

The introductory lecture was given by Clifford Nass, the provocative Stanford University sociologist who two years earlier had gotten Dr. Strayer and Dr. Gazzaley together to think about the science of multitasking. Now, Dr. Nass was pushing scientists to go beyond the existing science and ask hard questions about whether the ubiquity of constantly connected mobile devices could, ultimately, hamper the things that make us most human: empathy, conflict resolution, deep thinking, and, in a way, progress itself.

Near the front sat Dr. Gazzaley, mentally preparing for his own talk to be given shortly. Seated in the upper right, Dr. Strayer wore glasses, his neck slightly hunched.

And in the back row, Dr. Atchley. His wire-rimmed glasses were perched on his nose and his Macintosh laptop was shut beneath him. That was somewhat noteworthy; many in the crowd had laptops open, including a guy just in front of Dr. Atchley who had four windows open, checking email, the news, and a shopping site.

In explaining why he doesn’t open his laptop, Dr. Atchley calls upon a phrase from his Jesuit training. “Lead me not into temptation but deliver me from evil,” he says, paraphrasing Matthew 6:13.

He thinks that if he opens his laptop, he’ll start checking things. Get distracted from the lecture, and his own analysis of it. He doesn’t trust himself to be disciplined, and he says some fundamental neuroscience has emerged to support his fears. There’s plenty of anecdotal evidence, too.

At the University of Kansas, the journalism school does a periodic “media fast,” in which students aren’t allowed to use their devices for twenty-four hours. When the fast took place in the fall of 2011, students reflected afterward about their experience.

To wit:

“How could I abandon my closest friend, my iPhone?”

“My media fast lasted fifteen minutes before I forgot that I was fasting and checked my phone.”

“The withdrawals were too much for me to handle.”

“Five minutes without checking a text message is like the end of the world.”

“I don’t want to do this assignment again.”

Why? Why is this stuff so compelling?

Dr. Atchley says one thing that makes the question fascinating to him is that when people multitask, they often do so in situations that defy common sense, say, trying to concentrate on an in-person conversation while checking a sports score, or attempting to drive while dialing a phone. He believes there are impulses driving these multitaskers that aren’t apparent on the surface. In fact, he says that technology is increasingly appealing to, and preying upon, deep primitive instincts, parts of us that existed aeons before the phone.

For one: the power of social connection, the need to stay in touch with friends, family, and business contacts. Simple, irresistible. “It’s a brain hijack machine,” he says. He’s trying to prove it.

This is what comes next in the study of the science of attention, the latest wave. Are there some things that can so overtake our attention systems as to be addicting? Is one of those things personal communications technology?

A trip to his lab, he says, will help illustrate his quest.

CHAPTER 17

TERRYL

TOWARD THE END OF 2006, when Terryl first approached Jackie outside of gymnastics and asked how she might help, Jackie showed typical stoicism and said: “I think we’re good.”

Terryl, respecting Jackie’s strength and privacy, let it go. Besides, she still thought the accident had happened in the adjoining county, Box Elder. So she settled into being a friend.

Just before Christmas that year, the Furfaros and the Warners went to a gymnastics meet in Park City. Jackie piled the girls into the Saturn. Terryl and Alan drove their van, carrying their brood of four: Jayme, the oldest, then twelve years old; Taylor, the boy, age ten at the time; Allyssa, just shy of five years old; and Katie, who was three and suffered from cystic fibrosis and autism.

At the meet, and over meals, talk between the Warners and Furfaros was about the kids and gymnastics, not the accident.
