Darwin Among the Machines
George B. Dyson

Von Neumann was laying the groundwork for a unified theory of information dynamics, applicable to free-market economies, self-reproducing organisms, neural networks, and, ultimately, the relations between mind and brain. The confluence of the theory of games with the theory of information and communication invites the construction of such a bridge. In his notes for a series of lectures that were preempted by his death, von Neumann drew a number of parallels—and emphasized a greater number of differences—between the computer and the brain. He said little about mind. He leaves us with an impression, but no exact understanding, of how the evolution of languages (the result of increasing economy in the use of symbols) refines a flow of information into successively more meaningful forms—a hierarchy leading to levels of interpretation manifested as visual perception, natural language, mathematics, and semantic phenomena beyond. Von Neumann was deeply interested in mind. But he wasn't ready to dismantle a concept that could not be reconstructed with the tools available at the time.

Von Neumann's view of the operation of the human nervous system bears more resemblance to the statistically determined behavior of an economic system than to the precisely logical behavior of a digital computer, whether of the 1950s or of today. “The message-system used in the nervous system . . . is of an essentially statistical character,” he wrote in his Silliman lecture notes, published posthumously in 1958. “In other words, what matters are not the precise positions of definite markers, digits, but the statistical characteristics of their occurrence. . . . Thus the nervous system appears to be using a radically different system of notation from the ones we are familiar with in ordinary arithmetics and mathematics: instead of the precise systems of markers where the position—and presence or absence—of every marker counts decisively in determining the meaning of the message, we have here a system of notations in which the meaning is conveyed by the statistical properties of the message. . . . Clearly, other traits of the (statistical) message could also be used: indeed, the frequency referred to is a property of a single train of pulses whereas every one of the relevant nerves consists of a large number of fibers, each of which transmits numerous trains of pulses. It is, therefore, perfectly plausible that certain (statistical) relationships between such trains of pulses should also transmit information. . . . Whatever language the central nervous system is using, it is characterized by less logical and arithmetical depth than what we are normally used to [and] must structurally be essentially different from those languages to which our common experience refers.”9

Despite the advances of neurobiology and cognitive science over the past forty years, this fundamental picture of the brain as a mechanism for evolving meaning from statistics has not changed. Higher levels of language produce a coherent residue as this underlying flow of statistical information is processed and refined. Information flow in the brain is pulse-frequency coded, rather than digitally coded as in a computer. The resulting tolerance for error is essential for reliable computation by a web of electrically noisy and chemically sensitive neurons bathed in a saline fluid (or, perhaps, a web of microprocessors bathed in the distractions of the real world). Whether a particular signal is accounted for as excitation or inhibition depends on the individual nature of the synapses that mediate its journey through the net. A two-valued logic, to assume the simplest of possible models, is inherent in the details of the neural architecture—a more robust mechanism than a two-valued code.
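A minimal sketch in Python, offered purely as an illustration (the value, the pulse counts, and the function names are all invented), makes the contrast concrete: the same quantity is stored once as a positional binary word, in which every digit counts decisively, and once as the mean frequency of a long pulse train, in which only the statistics matter. Under the same low level of noise, the statistical reading barely moves, while a single corrupted high-order bit can throw the positional reading far off.

    # Illustrative sketch only: compares positional (digital) coding with
    # statistical (pulse-frequency) coding under noise. All names are invented.
    import random

    def encode_digital(value, bits=8):
        """Represent an integer 0-255 as a list of bits; every position counts."""
        return [(value >> i) & 1 for i in reversed(range(bits))]

    def decode_digital(word):
        return sum(b << i for i, b in enumerate(reversed(word)))

    def encode_rate(value, pulses=1000):
        """Represent 0-255 as the firing probability of a long pulse train."""
        p = value / 255.0
        return [1 if random.random() < p else 0 for _ in range(pulses)]

    def decode_rate(train):
        return round(255.0 * sum(train) / len(train))

    def corrupt(signal, flip_prob=0.01):
        """Flip each symbol with a small probability (a noisy channel)."""
        return [1 - s if random.random() < flip_prob else s for s in signal]

    value = 200
    digital = decode_digital(corrupt(encode_digital(value)))
    rate = decode_rate(corrupt(encode_rate(value)))
    print(f"original {value}, digital read-back {digital}, rate read-back {rate}")
    # A single flipped high-order bit can shift the digital reading by as much
    # as 128; the same noise level barely moves the statistical (rate) reading.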

Von Neumann's name remains synonymous with serial processing, now implemented by microprocessors adhering to a logical architecture unchanged from that developed at the Institute for Advanced Study in 1946. He was, however, deeply interested in information-processing architectures of a different kind. In 1943, Warren McCulloch and Walter Pitts demonstrated that any computation performed by a network of (idealized) neurons is formally equivalent to some Turing-machine computation that can be performed one step at a time. Von Neumann recognized that in actual practice (of either electronics or biology) combinatorial complexity makes it prohibitively expensive, if not impossible, to keep this correspondence two-way. “Obviously, there is on this level no more profit in the McCulloch-Pitts result,” he noted in 1948, discussing the behavior of complicated neural nets. “There is an equivalence between logical principles and their embodiment in a neural network, and while in the simpler cases the principles might furnish a simplified expression of the network, it is quite possible that in cases of extreme complexity the reverse is true.”10
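To make the “simpler cases” side of that equivalence concrete, here is a brief Python sketch (an illustration added here, not anything in von Neumann's notes): an idealized McCulloch-Pitts neuron reduced to a weighted threshold unit, with a handful of such units wired into the elementary logic gates. The force of the caveat above is that this tidy correspondence between network and logical expression stops paying for itself once the net grows complicated.

    # Illustrative sketch of idealized McCulloch-Pitts neurons as threshold units.
    # Small nets map neatly onto Boolean logic; the mapping is practical only in
    # the "simpler cases" von Neumann refers to.

    def mcp_neuron(weights, threshold):
        """Return a unit that fires (1) when the weighted input sum meets the threshold."""
        def fire(*inputs):
            return 1 if sum(w * x for w, x in zip(weights, inputs)) >= threshold else 0
        return fire

    AND = mcp_neuron([1, 1], threshold=2)   # fires only if both inputs fire
    OR  = mcp_neuron([1, 1], threshold=1)   # fires if either input fires
    NOT = mcp_neuron([-1], threshold=0)     # an inhibitory weight inverts the input

    def XOR(a, b):
        # A tiny two-layer net: (a OR b) AND NOT (a AND b)
        return AND(OR(a, b), NOT(AND(a, b)))

    for a in (0, 1):
        for b in (0, 1):
            print(a, b, "->", XOR(a, b))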

Von Neumann believed that a complex network formed its own simplest behavioral description; to attempt to describe its behavior using formal logic might be an intractable problem, no matter how much computational horsepower was available for the job. Many years—and many, many millions of artificial-intelligence research dollars—later, Stan Ulam asked Gian-Carlo Rota, “What makes you so sure that mathematical logic corresponds to the way we think?”11 Ulam's question echoed what von Neumann had concluded thirty years earlier, that “a new, essentially logical, theory is called for in order to understand high-complication automata and, in particular, the central nervous system. It may be, however, that in this process logic will have to undergo a pseudomorphosis to neurology to a much greater extent than the reverse.”12 Computers, by the 1980s, had evolved perfect memories, but the memory of the computer industry was short. “If your friends in AI persist in ignoring their past, they will be condemned to repeat it, at a high cost that will be borne by the taxpayers,” warned Ulam, who turned out to be right.13

For a neural network to perform useful computation, pattern recognition, associative memory, or other functions, a system of value must be established, assigning the raw material of meaning on an equitable basis to the individual units of information—whether conveyed by marbles, pulses of electricity, hydraulic fluid, charged ions, or whatever else is communicated among the components of the net. This process corresponds to defining a utility function in game theory or mathematical economics, a problem to which von Neumann and Morgenstern devoted a large portion of their book. Only by such a uniform valuation of internal signals can those configurations that represent solutions to external problems be recognized when some characteristic maximum, minimum, or otherwise identifiable value is evolved. These fundamental tokens coalesce into successively more complex structures conveying more and more information at every stage of the game. In the seventeenth century, Thomas Hobbes referred to these mental particles as “parcels,” believing their physical existence to be as demonstrable as that of atoms, using the same logic by which we lead ourselves to believe in the physical existence of “bits” of information today.
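One standard modern illustration of such a valuation, sketched below in Python (a Hopfield-style associative memory, chosen for concreteness rather than anything the passage itself specifies, with the patterns and sizes invented), assigns a single scalar “value” to every configuration of the network; the stored patterns sit at minima of that value, and a noisy configuration is recognized by letting the net settle downhill until it reaches one of those identifiable extremes.

    # Sketch: a tiny Hopfield-style net in which a single scalar "value" (energy)
    # is assigned to every configuration of units, and stored patterns are
    # recognized because they sit at minima of that value.
    import numpy as np

    patterns = np.array([[1, -1, 1, -1, 1, -1],
                         [1, 1, 1, -1, -1, -1]])        # two stored configurations
    W = sum(np.outer(p, p) for p in patterns).astype(float)
    np.fill_diagonal(W, 0.0)                            # Hebbian weights, no self-connections

    def energy(state):
        """The network's 'utility' with the sign flipped: lower is better."""
        return -0.5 * state @ W @ state

    def settle(state, steps=50, rng=np.random.default_rng(0)):
        state = state.copy()
        for _ in range(steps):
            i = rng.integers(len(state))                # asynchronous unit update
            state[i] = 1 if W[i] @ state >= 0 else -1   # move downhill in energy
        return state

    noisy = np.array([1, -1, 1, 1, 1, -1])              # corrupted version of pattern 0
    recovered = settle(noisy)
    print("energy before:", energy(noisy), "after:", energy(recovered))
    print("recovered:", recovered)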

Higher-level representations, symbols, abstractions, and perceptions are constructed in a neural network not from solutions arrived at by algorithmic (step-by-step) processing, as in a digital computer, but from the relations between dynamic local maxima and minima generated by a real-time, incomprehensibly complex version of one of von Neumann's games. It is what is known as an n-person game, involving, in our case, a subset of the more than 100 billion neurons, interlaced by trillions of synapses, that populate the brain. Von Neumann and Morgenstern demonstrated how to arrive at reasonable solutions among otherwise hopeless combinatorics by means of a finite but unbounded series of coalitions that progressively simplify the search. A successful, if fleeting, coalition, in our mental universe, may surface to be perceived—and perhaps communicated, via recourse to whatever symbolic channels are open at the time—as an idea. It is a dynamic, relational process, and the notion of a discrete idea or mental object possessed of absolute meaning is fundamentally contradictory, just as the notion of a bit having independent existence is contradictory. Each bit represents the difference between two alternatives, not any one thing at one time.

In a neural network, the flow of information behaves like the flow of currency in an economy. Signals do not convey meaning through encoded symbols; they generate meaning depending on where they come from, where they go, and how frequently they arrive. A dollar is a dollar, whether coming in or going out, and you can choose to spend that same dollar on either gasoline or milk. The output of one neuron can be either debited or credited to another neuron's account, depending on the type of synapse at which it connects. The faint pulses of electric current that flow through a nervous system and the pulses of currency that flow through an economy share a common etymology and a common destiny that continues to unfold. The metaphor has been used both ways. “The currency of our systems is not symbols, but excitation and inhibition,” noted D. E. Rumelhart and J. L. McClelland in their introduction to Parallel Distributed Processing: Explorations in the Microstructure of Cognition, a collection of papers that focused a revival of neural network research ten years ago.14 “Each of those little ‘wires' in the optic nerve sends messages akin to a bank statement where it tells you how much interest was paid this month,” wrote neurophysiologist William Calvin in The Cerebral Symphony, a recent tour inside the human mind and brain. “You have to imagine, instead of an eye, a giant bank that is busily mailing out a million statements every second. What maximizes the payout on a single wire? That depends on the bank's rules, and how you play the game.”15

Raw data generated by the first layer of retinal photoreceptors is refined, through ingenious statistical transformations, into a condensed flow of information conveyed by the optic nerve. This flow of information is then refined, over longer intervals of time, into a representation perceived as vision by the brain. There is no coherent encoding of the image, as generated by a television camera, just a stream of statistics, dealt out, like cards, to the brain. Vision is a game in which the brain bids a continuous series of models and the model that is most successful in matching the next hand wins. Finally, vision is refined into knowledge, and if all goes well, knowledge is condensed into wisdom, over a lifetime, by the mind. These elements of economy associated with the workings of our intelligence are mirrored by elements of intelligence associated with the workings of an economy—a convergence growing more visible as economic processes take electronic form.
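The bidding metaphor can be caricatured in a few lines of Python (everything here, from the hidden brightness to the candidate models, is invented for illustration): a stream of noisy samples is dealt out one at a time, several rival models each bid a prediction for the next sample, and the model with the lowest running error is the one “perceived” at any given moment.

    # Illustrative sketch: perception as a running competition among candidate
    # models, each bidding a prediction for the next noisy sample; the
    # best-scoring bid at any moment is what gets "perceived".
    import random

    random.seed(1)
    hidden_brightness = 0.7                      # the state of the world, never seen directly
    candidate_models = [0.2, 0.5, 0.7, 0.9]      # rival hypotheses the brain might "bid"
    running_error = {m: 0.0 for m in candidate_models}

    for t in range(1, 501):
        sample = hidden_brightness + random.gauss(0, 0.3)   # one noisy "card" dealt by the retina
        for m in candidate_models:
            running_error[m] += (sample - m) ** 2            # score each bid against the hand
        if t % 100 == 0:
            winner = min(running_error, key=running_error.get)
            print(f"after {t} samples the winning model is {winner}")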

This convergence has its origins in the seventeenth century, just as the foundations of electronic logic date back to Hobbes's observation that, given the existence of addition and subtraction, otherwise mindless accounting leads to everything else. “Monies are the sinews of War, and Peace,” observed Hobbes in 1642.16 In his Leviathan of 1651 he elaborated: “Mony passeth from Man to Man, within the Commonwealth; and goes round about, Nourishing (as it passeth) every part thereof. . . . Conduits, and Wayes by which it is conveyed to the Publique use, are of two sorts; One, that Conveyeth it to the Publique Coffers; The other, that Issueth the same out againe for publique payments. . . . And in this also, the Artificiall Man maintains his resemblance with the Naturall; whose Veins receiving the Bloud from the severall Parts of the Body, carry it to the Heart; where being made Vitall, the Heart by the Arteries sends it out again.”17

Hobbes drew his analogy with the circulation of the blood, made vital by the heart, not the circulation of an electric fluid in the nerves, made vital by the brain. The galvanic fluid had yet to be identified, and monetary currency had yet to take the leap to abstract, fluid forms. In Hobbes's time, wealth conveyed by the transfer of information rather than substance was rarer than gold. The evolution of financial instruments resembles the evolution of the hierarchies of languages embodied by digital computers; both these developments parallel the evolution, 600 million years earlier, of genetic programs controlling the morphogenesis of multicellular forms of life. A surge of complexity followed. The cellular-programming revolution began in the Cambrian era; the computer software revolution began in the era of von Neumann; the monetary revolution began in the time of Hobbes.

When Hobbes was in Paris during the 1640s, he was joined by a young William Petty, who later founded the science of political economy, relieving Hobbes's ideas of religious controversy and inserting them into the mainstream of economic thought. The son of a Hampshire clothier, Petty (1623–1687) went to sea at age thirteen with only sixpence to his name. After breaking his leg and being put ashore in France, Petty lived by his wits, securing his own education and an introduction to Hobbes, who was keeping a safe distance between himself and the civil war in England until things settled down. Petty assisted Hobbes (“who loved his company”) with his treatise on optics, Tractatus Opticus, 1644, drawing “Mr. Hobbes Opticall schemes for him,” reported Aubrey, “which he was pleased to like.”18 After returning to England, Petty earned a degree in medicine at Oxford in 1649, where, with John Wilkins, he hosted the predecessor of the Royal Society, known as the Philosophical Club. Sir Robert Southwell, matching wits until the end of Petty's life, encouraged his cousin to exercise his mind: “For Intuition of truth may not Relish soe much as Truth that is hunted downe.”19

Petty became famous in 1650 for saving the life of Anne Green, a servant maid charged with the murder of her premature and evidently stillborn child and sentenced to hang. “After she had suffere'd the law,” wrote Aubrey, “she was cut downe, and carried away in order to be anatomiz'd by some yong physitians, but Dr. William Petty finding life in her, would not venter upon her, only so farr as to recover her life.”20 Assisted legally and financially by Petty and his colleagues, Anne Green was married, bore several healthy children, and enjoyed fifteen additional years of life.
