Darwin Among the Machines

Author: George B. Dyson
But aren't these analogies deeply flawed? Software is designed, engineered, and reproduced by human beings; programs are not independently self-reproducing organisms selected by an impartial environment from the random variation that drives other evolutionary processes that we characterize as alive. The analogy, however, is valid, because the analog of software in the living world is not a self-reproducing organism, but a self-replicating molecule of DNA. Self-replication and self-reproduction have often been confused. Biological organisms, even single-celled organisms, do not replicate themselves; they host the replication of genetic sequences that assist in reproducing an approximate likeness of themselves. For all but the lowest organisms, there is a lengthy, recursive sequence of nested programs to unfold. An elaborate self-extracting process restores entire directories of compressed genetic programs and reconstructs increasingly complicated levels of hardware on which the operating system runs. That most software is parasitic (or symbiotic) in its dependence on a host metabolism, rather than freely self-replicating, strengthens rather than weakens the analogies with life.
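The distinction between replication and reproduction can be made concrete in code. A quine, a program whose output is an exact copy of its own source, is about the most minimal sketch of pure self-replication there is. Nothing below comes from Dyson's text; the variable names are illustrative only:

```python
# A quine: the three statements below print an exact copy of themselves,
# a digital analog of a self-replicating molecule of DNA.
s = 's = %r\nsource = s %% s\nprint(source)'
source = s % s
print(source)
```

Real software, as the passage notes, is rarely this self-contained: like a gene or a virus, a program depends on a host interpreter and a surrounding metabolism to get itself copied at all.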

The randomness underlying evolutionary processes has also been overplayed. Many seemingly haphazard genetic processes are revealing themselves to be less random than once thought. By means of higher-level languages, grammars, and error-correcting codes, much of the randomness is removed before a hereditary message is submitted for execution as another cell. A certain collective intelligence adheres to the web of relationships among genetic regulators and operators, a vague and faintly distributed unconscious memory that raises Samuel Butler's ghost. What randomness does contribute to evolutionary processes is a small but measurable element of noise. By definition, a Darwinian process has an element of randomness—but it does not have to be a game of chance. By almost any standard, the software industry qualifies as a Darwinian process—from the generation of software (you combine existing code segments, then execute and weed out the bugs) to the management of the industry as a whole: eat, be eaten, or merge.
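The recipe in parentheses (vary, execute, weed out) is the whole Darwinian loop, and it fits in a few lines. The following toy is not from the book; the target string, mutation rate, and brood size are arbitrary choices made for illustration. It shows selection extracting a decidedly non-random result from random copying errors:

```python
import random

random.seed(0)  # fixed seed, so the run is reproducible

TARGET = "EAT BE EATEN OR MERGE"          # arbitrary selection criterion
ALPHABET = "ABCDEFGHIJKLMNOPQRSTUVWXYZ "

def fitness(candidate):
    # How many positions already match the target.
    return sum(a == b for a, b in zip(candidate, TARGET))

def mutate(candidate, rate=0.05):
    # Copying with a small but measurable element of noise.
    return "".join(random.choice(ALPHABET) if random.random() < rate else c
                   for c in candidate)

parent = "".join(random.choice(ALPHABET) for _ in TARGET)
generations = 0
while fitness(parent) < len(TARGET):
    brood = [parent] + [mutate(parent) for _ in range(100)]
    parent = max(brood, key=fitness)      # weed out everything less fit
    generations += 1
```

Selection, not chance, does the work: the odds of typing the target at random are roughly one in 10^30, yet cumulative selection over the noise finds it in a modest number of generations.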

No one can say what contribution randomness has made to software development so far. Most programs have grown so complex and convoluted that no human being knows where all the code came from or even what some of it actually does. Programmers long ago gave up hope of being able to predict in advance whether a given body of code will work as planned. “It was on one of my journeys between the EDSAC room and the punching equipment,” recalled Maurice Wilkes of one particular day of program testing at Cambridge in 1949, “that ‘hesitating at the angle of the stairs’ the realization came over me with full force that a good part of the remainder of my life was going to be spent in finding errors in my own programs.”39 The software industry has kept track of harmful bugs since the beginning—but there is no way to keep track of the accidents and coincidences that have accumulated slight improvements along the way.

In 1953, Nils Barricelli observed a digital universe in the process of being born. There was only a fraction of a megabyte of random-access memory on planet Earth, and only part of it was working at any given time. “The limited capacity of even the largest calculating machines makes it impossible to operate with more than a few thousand genes at a time instead of the thousands of billions of genes and organisms with which nature operates,” he wrote in 1957. “This makes it impossible to develop anything more than extremely primitive symbioorganisms even if the most suitable laws of reproduction are chosen.”40 Not so today. Barricelli's universe has expanded explosively, providing numerical organisms with inexhaustible frontiers on which to grow.

Barricelli's contributions have largely been forgotten. Few later writers have searched the literature of the 1950s for signs of artificial life. It is hard to believe that such experiments could be performed with the equipment available in 1953. Von Neumann's outline of a formal theory of self-reproducing automata, developed in the 1950s but published only in 1966, is regarded as the precursor of the field that would develop, decades later, into the study of artificial life. Von Neumann took a theoretical, not an experimental, approach. Although Barricelli came to the Institute at von Neumann's invitation, the same caution with which von Neumann viewed speculations about artificial intelligence was applied to Barricelli's suggestion that numerical symbioorganisms had started to evolve. Times have changed. “A-life” is now a legitimate discipline, and researchers need not preface their papers with disclaimers lest any reviewer think they are suggesting that numerical organisms evolved within a computer might be alive.

Among the most creative of Nils Barricelli's successors is evolutionary biologist Thomas Ray. After a decade of fieldwork in the Central American rain forest (“The rainforest is like a huge cathedral, but the entire structure is alive”) Ray grew impatient with the pace of evolution—too slow to observe except by studying the past. “The greatest obstacles to understanding evolution have been that we have had only a single example of evolution available for study (life on earth), and that in this example, evolution is played out over time spans which are very large compared to a scientific career,” wrote Ray in 1993, explaining his decision to experiment with evolution in a faster form.41 The result was Tierra, a stripped-down, 5-bit instruction-code virtual computer that can be embedded within any species of host. Tierra (Spanish for “earth”) is designed to provide a fertile and forgiving environment in which self-replicating digital organisms can evolve. Tierra borrows heavily from nature, for instance by allowing the organisms to locate each other using complementary templates, as in molecular biology, rather than by numerical address. Ray expected to fiddle around fruitlessly with his new system, but, as he recounted, “my plans were radically altered by what actually happened on the night of January 3, 1990, the first time that my self-replicating program ran on my virtual computer, without crashing the real computer that it was emulated on. . . . All hell broke loose. The power of evolution had been unleashed inside the machine, but accelerated to . . . megahertz speeds.”42
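Template addressing is the detail Tierra borrows most directly from molecular biology, and the idea is easy to sketch. The toy below is a drastic simplification: the marker values and the soup layout are invented for illustration and are not taken from Ray's actual instruction set. A jump carries a template of no-op markers and transfers control to the nearest site bearing the complementary pattern:

```python
# Addressing by complementary template rather than by numerical address.
# A "template" is a run of no-op markers (0s and 1s); a jump finds the
# nearest site in the soup whose marker run is the bit-complement of its own.

def complement(template):
    return [1 - bit for bit in template]

def find_complement(soup, template, start):
    """Search outward from `start` for the complementary marker run."""
    target = complement(template)
    n, k = len(soup), len(target)
    for distance in range(1, n):
        for pos in (start + distance, start - distance):
            if 0 <= pos <= n - k and soup[pos:pos + k] == target:
                return pos
    return None  # no matching site anywhere: the jump simply fails

# 9s stand in for ordinary (non-marker) instructions in the soup.
soup = [1, 1, 0, 9, 9, 0, 0, 1, 9, 1, 0, 1]
# A jump at position 5 carrying template [0, 0, 1] seeks the nearest [1, 1, 0].
addr = find_complement(soup, [0, 0, 1], start=5)
```

Because references resolve by pattern rather than by fixed address, code that gets copied or shifted around the soup does not break every jump that points at it, which is part of what makes Tierran programs evolvable.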

“My research program was suddenly converted from one of design, to one of observation,” reported Ray. “I was back in a jungle describing . . . an alien universe, based on a physics and chemistry totally different than the life forms I knew and loved. Yet forms and processes appeared that were somehow recognizable to the trained eye of a naturalist.”43 The Tierran organisms were limited to 80 bytes apiece, yet far-reaching speculations were inspired by Ray's glimpse into their world. “While these new living systems are still so young that they remain in their primordial state, it appears that they have embarked on the same kind of journey taken by life on earth, and presumably have the potential to evolve levels of complexity that could lead to sentient and eventually intelligent beings,” he wrote in 1993.44

Ray came to the same conclusions as had Barricelli concerning the conditions his numerical organisms required to continue to evolve. Communities of coevolving digital species need large, complex spaces in which to grow. The challenge of semipermeable boundaries between diverse and changing habitats is reflected in increasingly diverse and complex organisms. Working with Kurt Thearling of Thinking Machines, Inc., Ray used a massively parallel CM-5 Connection Machine to develop an archipelago of Tierran nodes, each running a slightly different Tierran soup, with occasional migration of creatures between different nodes, similar to the multiple 512-byte universes Barricelli had constructed at the IAS. But even the universe within the Connection Machine was only a local node within the open universe that awaits. “Due to its size, topological complexity, and dynamically changing form and conditions, the global network of computers is the ideal habitat for the evolution of complex digital organisms,” concluded Ray.45
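The archipelago arrangement is what evolutionary computing now calls an island model, and its logic can be sketched independently of Tierra. Everything in the toy below is invented for illustration (the fitness function, node counts, and migration interval are arbitrary, and real Tierran creatures are programs, not numbers); it shows isolated populations evolving separately, with occasional migrants crossing the semipermeable boundaries:

```python
import random

random.seed(1)  # fixed seed, so the run is reproducible

def fitness(x):
    return -abs(x - 100)                  # closer to 100 is fitter

def mutate(x):
    return x + random.randint(-3, 3)

def evolve(population):
    # One generation inside a node: mutate, then truncation selection.
    brood = population + [mutate(x) for x in population]
    brood.sort(key=fitness, reverse=True)
    return brood[:len(population)]

# Four isolated "soups" of eight creatures each, like nodes of the archipelago.
nodes = [[random.randint(0, 50) for _ in range(8)] for _ in range(4)]
initial_best = max((x for pop in nodes for x in pop), key=fitness)

for generation in range(200):
    nodes = [evolve(pop) for pop in nodes]
    if generation % 10 == 0:              # occasional migration between nodes
        src, dst = random.choice(nodes), random.choice(nodes)
        weakest = min(range(len(dst)), key=lambda i: fitness(dst[i]))
        dst[weakest] = random.choice(src) # migrant displaces the least fit

best = max((x for pop in nodes for x in pop), key=fitness)
```

The point of the boundaries, as the passage notes, is diversity: between migrations each node drifts down its own evolutionary path instead of collapsing into one well-mixed soup.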

To conduct a full-scale experiment, Ray proposed (and has now begun to construct) a “biodiversity reserve for digital organisms” distributed across the global net. “Participating nodes will run a network version of Tierra as a low-priority background process, creating a virtual Tierran sub-net embedded within the real net . . . there will be selective pressures for organisms to migrate around the globe in a daily cycle, to keep on the dark side of the planet, and also to develop sensory capabilities for assessing deviations from the expected patterns of energy availability, and skills at navigating the net.”46

Having experienced firsthand the difficulty of convincing politicians and resource managers of the value of biological reserves, Ray pitched his proposal with the same arguments that apply to preserving the tropical rain forest as a reservoir of irreplaceable genetic code and yet-to-be-discovered drugs. The Tierran reserve is envisioned as a cooperative laboratory for evolving commercially harvestable software of a variety and complexity beyond what we could ever hope to engineer. “This software will be ‘wild,’ living free in the digital biodiversity reserve,” proposed Ray. “In order to reap the rewards, and create useful applications, we will need to domesticate some of the wild digital organisms, much as our ancestors began domesticating the ancestors of dogs and corn thousands of years ago.”47 The potentially useful products might range from simple code fragments, equivalent to drugs discovered in rain-forest plants, to entire digital organisms, which Ray imagines as evolving into digital counterparts of the one-celled workhorses that give us our daily bread and cheese and beer—and eventually into higher forms of digital life. “It is imagined,” wrote Ray, “that individual digital organisms will be multi-celled and that the cells that constitute an individual will be dispersed over the net. . . . If there are some massively parallel machines participating in the virtual net, digital organisms may choose to deploy their central nervous systems on these arrays.”48

Ray's proposal leaves one wondering what will keep these wild organisms from making an escape—a scenario labeled Jurassic Park. The answer is that Tierran organisms can survive only within the universe of virtual machines in which they evolved. Outside this special environment they are only data, no more viable than any other data that reside on the machines with which we work and play. Nonetheless, Ray advised that “freely evolving autonomous artificial entities should be seen as potentially dangerous to organic life, and should always be confined by some kind of containment facility, at least until their real potential is well understood. . . . Evolution remains a self-interested process, and even the interests of confined digital organisms may conflict with our own.”49

In March 1995, Ray convened a symposium on the network implementation of Tierra at which the security issue was addressed. “There was unanimous agreement that this Terminator 2/Jurassic Park scenario is not a security issue,” reported Ray. “Nobody at the workshop considered this to be a realistic possibility, because the digital organisms are securely confined within the virtual net of virtual machines.” What concerned the security experts, led by Tsutomu Shimomura, was the possibility of bad people breaking into the Tierran system, rather than the possibility of bad organisms breaking out. “Tierra is a general purpose computer . . . and a large network implementation would be a very large general purpose computer, perhaps the largest in the world.” This resource might conceivably be expropriated by outside users, which “would be equivalent to cutting down the rain forest to plant bananas,” according to Ray. If Tierran processes were cultivated for illicit purposes, say, for breaking codes, “this would be like cutting down the rain forest to plant cocaine.”50

Left unsaid in these discussions is the extent to which the computational universe has already become a jungle, teeming with freely evolving life. Whether the powers of digital evolution can be kept in reserve, without running wild, is a question that depends on how you define wild, just as the question of artificial intelligence depends on how you define intelligence, and the question of artificial life depends on how you define alive. The whole point of the Tierran experiment is to adopt successful Tierran code, carefully neutered, into software running outside the boundaries of the reserve. No matter how this arrangement is described, it means that the results of freely evolving processes will be introduced into the digital universe as a whole. Harmful results will be edited out (assuming there are no mistakes), but that's the way distribution of code in biology has always worked. The effect will be to greatly speed up the evolution of digital organisms, whether the underlying mechanism is kept within a securely guarded reserve or not.

All indications point to a convergence between the Tierran universe as envisioned by Tom Ray and the computational universe as a whole. Platform-independent languages such as Java and its siblings are moving in the direction of universal code that runs on a universal virtual machine, spawned on the host processor of the moment, with a subtle shift in computational identity as objects are referred to more by process than by place. Such a language falls somewhere between the mechanical notation of Charles Babbage, able to describe and translate the logical function of any conceivable machine, and the language of DNA, able to encode the construction of proteins in a form that can be translated by diverse forms of life. Data as well as programs are becoming migratory, and objects that are used frequently find themselves mirrored in many places at once. As successful processes proliferate throughout the network, a variety of schemes for addressing objects by template rather than by location are likely to take form. The result will be a profoundly Tierran shift in the computational landscape—but with the direct links to the workings of our own universe that are essential if digital organisms that actually go out and do things are to continue to evolve.
