
A number of researchers feel that the attempt to break language down with thesauri and grammatical rules is simply not going to crack the translation problem. A new approach abandons these strategies, more or less entirely. For instance, the 2006 NIST machine translation competition was convincingly won by a team from Google, stunning a number of machine translation experts: not a single human on the Google team knew the languages (Arabic and Chinese) used in the competition. And, you might say, neither did the software itself, which didn’t give a whit about meaning or about grammar rules. It simply drew from a massive database of high-quality human translation [17] (mostly from the United Nations minutes, which are proving to be the twenty-first century’s digital Rosetta stone), and patched phrases together according to what had been done in the past. Five years later, these kinds of “statistical” techniques are still imperfect, but they have left the rule-based systems pretty firmly in the dust.
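To make the flavor of this approach concrete, here is a minimal Python sketch of the idea, assuming an invented toy phrase table rather than any real corpus or anything from Google’s actual system: it knows no grammar and no meanings, only which human translation of each phrase it has seen most often.

    # A toy sketch of the "statistical" approach described above: no grammar
    # rules, no dictionary definitions, just phrase pairs as if harvested from
    # past human translations. The Spanish-English pairs below are invented
    # for illustration; real systems mine millions of aligned sentences.
    from collections import Counter

    ALIGNED_PHRASES = [
        ("la sesión", "the meeting"),
        ("la sesión", "the session"),
        ("la sesión", "the session"),
        ("queda abierta", "is open"),
        ("queda abierta", "is now open"),
        ("queda abierta", "is open"),
    ]

    # Count how often each human translation was used for each source phrase.
    phrase_table = {}
    for src, tgt in ALIGNED_PHRASES:
        phrase_table.setdefault(src, Counter())[tgt] += 1

    def translate(phrases):
        """Patch together the most frequently seen translation of each phrase."""
        out = []
        for phrase in phrases:
            seen = phrase_table.get(phrase)
            out.append(seen.most_common(1)[0][0] if seen else phrase)
        return " ".join(out)

    print(translate(["la sesión", "queda abierta"]))  # -> "the session is open"

Real systems add a statistical language model to choose among candidate phrasings, but the spirit is the same: imitation of past human work, at scale.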

Among the other problems in which statistical, as opposed to rule-based, systems are triumphing? One of our right-hemisphere paragons: object recognition.

UX

Another place where we’re seeing the left-hemisphere, totally deliberative, analytical approach erode is with respect to a concept called UX, short for User Experience—it refers to the experience a given user has using a piece of software or technology, rather than the purely technical capacities of that device. The beginnings of computer science were dominated by concerns for the technical capacities, and the exponential growth in processing power [18] during the twentieth century made the 1990s, for instance, an exciting time. Still, it wasn’t a beautiful time. My schoolmate brought us over to show us the new machine he’d bought—it kept overheating, so he’d opened the case up and let the processor and motherboard dangle off the edge of the table by the wires, where he’d set up his room fan to blow the hot air out the window. The keyboard keys stuck when you pressed them. The mouse required a cramped, T. rex–claw grip. The monitor was small and tinted its colors slightly. But computationally, the thing could scream.

This seemed the prevailing aesthetic of the day. My first summer job, in eighth grade—rejected as a busboy at the diner, rejected as a caddy at the golf course, rejected as a summer camp counselor—was at a web design firm, where I was the youngest employee by at least a decade, and the lowest paid by a factor of 500 percent, and where my responsibilities in a given day would range from “Brian, why don’t you restock the toilet paper and paper towels in the bathrooms” to “Brian, why don’t you perform some security testing on the new e-commerce intranet platform for Canon.” I remember my mentor figure at the web design company saying, in no uncertain terms, “function over form.”

The industry as a whole seemed to take this mantra so far that function began trumping function: for a while, an arms race between hardware and software created the odd situation that computers were getting exponentially faster but no faster at all to use, as software made ever-larger demands on system resources, at a rate that matched and sometimes outpaced hardware improvements. (For instance, Office 2007 running on Windows Vista uses twelve times as much memory and three times as much processing power as Office 2000 running on Windows 2000, with nearly twice as many execution threads as the immediately previous version.) This is sometimes called “Andy and Bill’s Law,” referring to Andy Grove of Intel and Bill Gates of Microsoft: “What Andy giveth, Bill taketh away.” Users were being subjected to the very same lags and lurches on their new machines, despite exponentially increasing computing power, all of which was getting sopped up by new “features.” Two massive companies pouring untold billions of dollars and thousands of man-years into advancing the cutting edge of hardware and software, yet the advances effectively canceled out. The user experience went nowhere.
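As a rough back-of-envelope sketch of that cancellation (assuming the idealized two-year doubling of note 18 as a loose proxy for usable hardware gains, together with the Office memory figure quoted above):

    # Back-of-envelope arithmetic for "What Andy giveth, Bill taketh away."
    # Assumptions: the idealized two-year doubling of note 18 stands in for
    # hardware improvement (a loose proxy at best), and the Office 2000 ->
    # Office 2007 memory figure quoted above stands in for software demand.
    hardware_gain = 2 ** ((2007 - 2000) / 2)      # ~11.3x over seven years
    memory_demand_growth = 12                     # Office 2007 vs. Office 2000
    net_gain_felt_by_user = hardware_gain / memory_demand_growth
    print(round(hardware_gain, 1), round(net_gain_felt_by_user, 2))  # 11.3 0.94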

I think we’re just in the past few years seeing the consumer and corporate attitude changing. Apple’s first product, the Apple I, did not include a keyboard or a monitor—it didn’t even include a case to hold the circuit boards. But it wasn’t long before they began to position themselves as prioritizing user experience ahead of power—and ahead of pricing. Now they’re known, by admirers and deriders alike, for machines that manage something which seemed either impossible or irrelevant, or both, until a few years ago—elegance.

Likewise, as computing technology moves increasingly toward mobile devices, product development becomes less about the raw computing horsepower and more about the overall design of the product and its fluidity, reactivity, and ease of use. This fascinating shift in computing emphasis may be the cause, effect, or correlative of a healthier view of human intelligence—not so much that it is complex and powerful, per se, as that it is reactive, responsive, sensitive, nimble. The computers of the twentieth century helped us to see that.

Centering Ourselves

We are computer tacked onto creature, as Sacks puts it. And the point isn’t to denigrate one, or the other, any more than a catamaran ought to become a canoe. The point isn’t that we’re half lifted out of beastliness by reason and can try to get even further through force of will. The tension is the point. Or, perhaps to put it better, the collaboration, the dialogue, the duet.

The word games Scattergories and Boggle are played differently but scored the same way. Players, each with a list of words they’ve come up with, compare lists and cross off every word that appears on more than one list. The player with the most words remaining on her sheet wins. I’ve always fancied this a rather cruel way of keeping score. Imagine a player who comes up with four words, and each of her four opponents only comes up with one of them. The round is a draw, but it hardly feels like one … As the line of human uniqueness pulls back ever more, we put the eggs of our identity into fewer and fewer baskets; then the computer comes along and takes that final basket, crosses off that final word. And we realize that uniqueness, per se, never had anything to do with it. The ramparts we built to keep other species and other mechanisms out also kept us in. In breaking down that last door, computers have let us out. And back into the light.

Who would have imagined that the computer’s earliest achievements would be in the domain of logical analysis, a capacity held to be what made us most different from everything else on the planet? That it could drive a car and guide a missile before it could ride a bike? That it could make plausible preludes in the style of Bach before it could make plausible small talk? That it could translate before it could paraphrase? That it could spin half-plausible postmodern theory essays [19] before it could be shown a chair and say, as any toddler can, “chair”? We forget what the impressive things are. Computers are reminding us.

One of my best friends was a barista in high school: over the course of the day she would make countless subtle adjustments to the espresso being made, to account for everything from the freshness of the beans to the temperature of the machine to the barometric pressure’s effect on the steam volume, meanwhile manipulating the machine with octopus-like dexterity and bantering with all manner of customers on whatever topics came up. Then she goes to college, and lands her first “real” job—rigidly procedural data entry. She thinks longingly back to her barista days—a job that actually made demands of her intelligence.

I think the odd fetishization of analytical thinking, and the concomitant denigration of the creatural—that is, animal—and embodied aspect of life is something we’d do well to leave behind. Perhaps we are finally, in the beginnings of an age of AI, starting to be able to center ourselves again, after generations of living “slightly to one side.”

Besides, we know, in our capitalist workforce and precapitalist-workforce education system, that specialization and differentiation are important. There are countless examples, but I think, for instance, of the 2005 book Blue Ocean Strategy: How to Create Uncontested Market Space and Make the Competition Irrelevant, whose main idea is to avoid the bloody “red oceans” of strident competition and head for “blue oceans” of uncharted market territory. In a world of only humans and animals, biasing ourselves in favor of the left hemisphere might make some sense. But the arrival of computers on the scene changes that dramatically. The bluest waters aren’t where they used to be.

Add to this that humans’ contempt for “soulless” animals, their unwillingness to think of themselves as descended from their fellow “beasts,” is now cut back on all kinds of fronts: growing secularism and empiricism, growing appreciation for the cognitive and behavioral abilities of organisms other than ourselves, and, not coincidentally, the entrance onto the scene of a being far more soulless than any common chimpanzee or bonobo—in this sense AI may even turn out to be a boon for animal rights.

Indeed, it’s entirely possible that we’ve seen the high-water mark of the left-hemisphere bias. I think the return of a more balanced view of the brain and mind—and of human identity—is a good thing, one that brings with it a changing perspective on the sophistication of various tasks.

It’s my belief that only experiencing and understanding truly disembodied cognition, only seeing the coldness and deadness and disconnectedness of something that truly does deal in pure abstraction, divorced from sensory reality, only this can snap us out of it. Only this can bring us, quite literally, back to our senses.

One of my graduate school advisers, poet Richard Kenney, describes poetry as “the mongrel art—speech on song,” an art he likens to lichen: that organism which is actually not an organism at all but a cooperation between fungi and algae so common that the cooperation itself seemed a species. When, in 1867, the Swiss botanist Simon Schwendener first proposed the idea that lichen was in fact two organisms, Europe’s leading lichenologists ridiculed him—including Finnish botanist William Nylander, who had taken to making allusions to “stultitia Schwendeneriana,” fake botanist-Latin for “Schwendener the simpleton.” Of course, Schwendener happened to be completely right. The lichen is an odd “species” to feel kinship with, but there’s something fitting about it.

What appeals to me about this notion—the mongrel art, the lichen, the monkey and robot holding hands—is that it seems to describe the human condition too. Our very essence is a kind of mongrelism. It strikes me that some of the best and most human emotions come from this lichen state of computer/creature interface, the admixture, the estuary of desire and reason in a system aware enough to apprehend its own limits, and to push at them: curiosity, intrigue, enlightenment, wonder, awe.

Ramachandran: “One patient I saw—a neurologist from New York—suddenly at the age of sixty started experiencing epileptic seizures arising from his right temporal lobe. The seizures were alarming, of course, but to his amazement and delight he found himself becoming fascinated by poetry, for the first time in his life. In fact, he began thinking in verse, producing a voluminous outflow of rhyme. He said that such a poetic view gave him a new lease on life, a fresh start just when he was starting to feel a bit jaded.”

Artificial intelligence may very well be such a seizure.

1.
When I first read about this as an undergraduate, it seemed ridiculous, the notion that a nonphysical, nonspatial entity like the soul would somehow deign to physicality/locality in order to “attach” itself to the physical, spatial brain at any specific point—it just seemed ridiculous to try to locate something non-localized. But later that semester, jamming an external wireless card into my old laptop and hopping online, I realized that the idea of accessing something vague, indefinite, all surrounding, and un-locatable—my first reaction to my father explaining how he could “go to the World Wide Web” was to say, “Where’s that?”—through a specific physical component or “access point” was maybe not so prima facie laughable after all.

2.
Depending on your scientific and religious perspectives, the soul/body interface might have to be a special place where normal, deterministic cause-and-effect physics breaks down. This is metaphysically awkward, and so it makes sense that Descartes wants to shrink that physics-violation zone down as much as possible.

3.
!

4.
The word “psyche” has, itself, entered English as a related, but not synonymous, term to “soul”—one of the many quirks of history that make philology and etymology so convoluted and frustrating and interesting.

5.
Fine indeed. “A piece of your brain the size of a grain of sand would contain one hundred thousand neurons, two million axons, and one billion synapses, all ‘talking to’ each other.”

6.
Philolaus’s different but related view was that the soul is a kind of “attunement” of the body.

7.
The Stoics had another interesting theory, which foreshadows nicely some of the computer science developments of the 1990s. Plato’s theory of the tripartite soul could make sense of situations where you feel ambivalent or “of two minds” about something—he could describe it as a clash between two different parts of the soul. But the Stoics only had one soul with one set of functions, and they took pains to describe it as “indivisible.” How, then, to explain ambivalence? In Plutarch’s words, it is “a turning of the single reason in both directions, which we do not notice owing to the sharpness and speed of the change.” In the ’90s, I recall seeing an ad on television for Windows 95, where four different animations were playing, one after the other, as a mouse pointer clicked from one to the other. This represented old operating systems. All of a sudden all four animations began running simultaneously: this represented Windows 95, with multitasking. Until around 2007, when multiprocessor machines became increasingly standard, multitasking was simply—Stoic-style—switching back and forth between processes, just as with the old operating systems the ad disparages, except doing so automatically, and really fast.
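A minimal Python sketch of that kind of switching, with two invented “animations” standing in for the ad’s four: a single worker runs the tasks in rapid alternation, and from the outside the alternation looks like simultaneity.

    # A toy sketch of single-processor, Stoic-style multitasking: one worker
    # switching between tasks so quickly they appear simultaneous. Generators
    # play the role of interruptible processes; the animations are invented.
    def animation(name, frames):
        for i in range(frames):
            yield f"{name}: frame {i}"   # yield = pause here, let another task run

    def round_robin(tasks):
        """One thread of control, sliced among all the tasks in turn."""
        while tasks:
            task = tasks.pop(0)
            try:
                print(next(task))
                tasks.append(task)       # not finished: back to the end of the line
            except StopIteration:
                pass                     # this task is done; drop it

    round_robin([animation("spinning logo", 2), animation("bouncing ball", 2)])
    # spinning logo: frame 0
    # bouncing ball: frame 0
    # spinning logo: frame 1
    # bouncing ball: frame 1

Swap the explicit queue for a hardware timer interrupt and you have, in caricature, preemptive multitasking; add a second processor and the simultaneity stops being an illusion.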

8.
This is an interesting nuance, because of how crucial the subjective/objective distinction has been to modern philosophy of mind. In fact, subjective experience seems to be the linchpin, the critical defensive piece, in a number of arguments against things like machine intelligence. The Greeks didn’t seem too concerned with it.

9.
In Hendrik Lorenz’s words: “When the soul makes use of the senses and attends to perceptibles, ‘it strays and is confused and dizzy, as if it were drunk.’ By contrast, when it remains ‘itself by itself’ and investigates intelligibles, its straying comes to an end, and it achieves stability and wisdom.”

10.
The word “or” in English is ambiguous—“Do you want sugar or cream with your coffee?” and “Do you want fries or salad with your burger?” are actually two different types of questions. (In the first, “Yes”—meaning “both”—and “No”—meaning “neither”—are perfectly suitable answers, but in the second it’s understood that you will choose one and exactly one of the options.) We respond differently, and appropriately, to each without often consciously noticing the difference. Logicians, to be more precise, use the terms “inclusive or” and “exclusive or” for these two types of questions, respectively. In Boolean logic, “OR” refers to the inclusive or, which means “either one or the other, or both.” The exclusive or—“either one or the other, but not both”—is written “XOR.”
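In code the distinction is a one-character affair. A minimal Python truth table (the variable names simply echo the coffee example above):

    # The two senses of "or," in Boolean terms: Python's `or` gives the
    # inclusive or, while `!=` on two booleans gives the exclusive or (XOR).
    for sugar in (False, True):
        for cream in (False, True):
            inclusive = sugar or cream   # one, the other, or both
            exclusive = sugar != cream   # one or the other, but not both
            print(f"sugar={sugar!s:5} cream={cream!s:5}  OR={inclusive!s:5}  XOR={exclusive!s:5}")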

11.
The heart needs the brain just as much as the brain needs the heart. But the heart—with all due respect—is fungible.

12.
The book’s sequel, The Upside of Irrationality, is much more sanguine about “irrationality” in its title, if somewhat less so in the text itself.

13.
Neurologist Antonio Damasio showed him a series of extremely emotionally charged pictures—a severed foot, a naked woman, a burning home—to which he barely reacted. Fans of Blade Runner or Philip K. Dick will recall this as almost the spitting image of the fictitious “Voigt-Kampff test.” Good thing he didn’t live in the Blade Runner universe: Harrison Ford would have decided this man was a “replicant”—and killed him.

14.
The ultimate Turing test victory, you might say.

15.
John Mauchly and J. Presper Eckert, of the University of Pennsylvania. ENIAC (Electronic Numerical Integrator and Computer), built in 1946 and initially used in the calculations for the hydrogen bomb, was the first fully electronic and fully general-purpose computing machine.

16.
Recall Turing: “The idea behind digital computers may be explained by saying that these machines are intended to carry out any operations which could be done by a human computer.”

17.
Interestingly, this means that paraphrasing is actually harder for computers than translation, because there aren’t huge paraphrase corpora lying around ready to become statistical fodder. The only examples I can think of off the top of my head would be, ironically, competing translations: of famous works of literature and religious texts.

18.
This trend is described by what’s called Moore’s Law: the prediction of Intel co-founder Gordon Moore, first made in 1965 and refined in 1975, that the number of transistors in a processor would double roughly every two years.

19.
“If one examines capitalist discourse, one is faced with a choice: either reject nihilism or conclude that the goal of the writer is social comment, given that the premise of Foucaultist power relations is valid.” Or, “Thus, the subject is interpolated into a nihilism that includes consciousness as a paradox.” Two sentences of infinitely many at www.elsewhere.org/pomo.
