Darwin Among the Machines
George B. Dyson

Finally, Turing showed that his definition of computability was equivalent to the effective calculability of Alonzo Church and the general recursiveness of Stephen Kleene—convincing evidence that these seemingly diverse formalizations of an intuitive concept represent a common and inescapable truth. “By a kind of miracle,” as Gödel himself later referred to Turing's definition, the concept of computability transcends the formalism in which it is expressed.
6

For the Turing machine this was both good news and bad. The good news was that in principle all digital computers are equivalent; any machine that can count, take notes, and follow instructions can compute any computable function, given an unlimited supply of scratch paper and an unlimited length of time. Software (coding) can always be substituted for hardware (switching), enabling rapid adaptation through software while the underlying hardware slowly evolves. The bad news was the existence of mathematical functions that no machine can ever compute in any amount of time.
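
As a minimal sketch of the idea (an illustration only; neither the function names nor the rule table come from Turing or from this book), such a machine can be written in a few lines of Python: a finite table of rules, an unbounded tape of cells, and a head that reads, writes, moves, and changes state.

```python
# A minimal Turing machine sketch: a finite rule table, an unbounded tape of
# cells, and a head that reads a symbol, writes a symbol, moves, and changes state.
# The machine below and its rule table are illustrative inventions.

def run_turing_machine(table, tape, state="start", blank=" "):
    """Run a machine given as {(state, symbol): (write, move, next_state)}."""
    cells = dict(enumerate(tape))        # the "unlimited supply of scratch paper"
    head = 0
    while state != "halt":
        symbol = cells.get(head, blank)
        write, move, state = table[(state, symbol)]
        cells[head] = write
        head += 1 if move == "R" else -1
    low, high = min(cells), max(cells)
    return "".join(cells.get(i, blank) for i in range(low, high + 1)).strip(blank)

# Rule table for a machine that inverts a string of binary digits, halting
# when it reaches the blank cell beyond the end of the input.
INVERT = {
    ("start", "0"): ("1", "R", "start"),
    ("start", "1"): ("0", "R", "start"),
    ("start", " "): (" ", "R", "halt"),
}

if __name__ == "__main__":
    print(run_turing_machine(INVERT, "010110"))   # prints 101001
```

The example computes only one trivial function, but any computable function can in principle be expressed as a rule table of this kind; the table is the software, the stepping machinery the hardware.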

It is surprising that noncomputable functions, which outnumber computable ones (there are only countably many possible Turing machines, but uncountably many functions), are so hard to find. It is not just that noncomputable functions are difficult to recognize or awkward to define. We either inhabit a largely computable world or have gravitated toward a computable frame of mind. The big questions—Is human intelligence a computable function? Are there algorithms for life?—may never be answered. But computable functions appear to be doing most of the work. “Non-computable functions may be the most common type of function in theory, but in practice they hardly ever come up,” Danny Hillis has explained. “In fact, it is difficult to find a well-defined example of a non-computable function that anybody wants to compute. This suggests that there is some deep connection between computability and the physical world and/or the human mind.”
7

“On Computable Numbers” secured a Proctor Fellowship for its author, who went to Princeton University in 1937 to complete his doctoral thesis under Alonzo Church. Princeton had become a leading center for mathematical logic, with Church, von Neumann, and Gödel presiding over a steady flow of distinguished visitors and a growing circle of permanent refugees from abroad. Turing's theoretical device shifted the foundations of mathematics; the implications of general-purpose computers reached far beyond. During his stay in Princeton, Turing grew impatient and set about building parts of a computer with his own hands. “Turing actually designed an electric multiplier and built the first three or four stages to see if it could be made to work,” related Malcolm MacPhail, who lent Turing his key to the small machine shop within the Palmer Physics Laboratory, next to the mathematics department at Fine Hall. “He needed relay-operated switches which, not being commercially available at that time, he built himself. . . . And so, he machined and wound the relays; and to our surprise and delight the calculator worked.”
8

Princeton is a quiet central New Jersey town whose attention still lingers on its importance in the Revolutionary War. Princetonians had rehearsed for a role in the digital revolution more than once. Turing's experiments in 1938 had been preceded, more than fifty years earlier, by the construction of a mechanical binary logic machine by Allan Marquand (1853–1924). “The new machine was constructed in Princeton during the winter of 1881–82,” Marquand reported in 1885, and was “made from the wood of a red-cedar post, which once formed part of the enclosure of Princeton's oldest homestead.”
9
Marquand was a professor of art history whose foray into mechanical logic attracted the attention of logician Charles Sanders Peirce (1839–1914). In 1886, Peirce wrote to Marquand, “I think you ought to return to the problem, especially as it is by no means hopeless to expect to make a machine for really very difficult mathematical problems. But you would have to proceed step by step. I think electricity would be the best thing to rely on.”
10

Marquand seems to have lost interest in the project, although a search of his manuscripts by Alonzo Church in the early 1950s turned
up a pen-and-ink schematic that “is probably the first circuit of an electrical logic-machine.”
11
It is not known whether the machine was ever built, but George W. Patterson, who believed that “this is very likely the first design of an electric data processor of any kind,” performed a logical and magnetological analysis and was willing “to give the designer the benefit of the doubt” that the machine would have worked, if not exactly as planned.
12

Marquand's logic machine was the successor to the “logical piano” developed in the 1860s by the English economist and logician William Stanley Jevons. The appearance of Marquand's “vastly more clear-headed contrivance” prompted Peirce to compose a short paper titled “Logical Machines” (1887), in which he considered “precisely how much of the business of thinking a machine could possibly be made to perform, and what part of it must be left for the living mind.”
13
Despite the primitive abilities evidenced so far, Peirce advised consideration of “how to pass from such a machine as that to one corresponding to a Jacquard loom.”
14
Ill at ease in more academic surroundings, Peirce spent thirty years working for the U.S. Coast Survey, where his duties included serving as a (human) computer on the Nautical Almanac and conducting gravitational research. He compiled large sections of the eight-thousand-page Century Dictionary and was both physically and mentally ambidextrous, able to write out a question and its answer using both hands at the same time.

“Every machine is a reasoning machine, in so much as there are certain relations between its parts, which relations involve other relations that were not expressly intended,” observed Peirce.
15
Independently of Babbage and ahead of Turing, he considered the implications of more advanced reasoning by machines. “The machine would be utterly devoid of original initiative, and would only do the special kind of thing it had been calculated to do. This, however, is no defect in a machine; we do not want it to do its own business, but ours. . . . We no more want an original machine, than a house-builder would want an original journeyman, or an American board of college trustees would hire an original professor.” (Peirce had been subjected to this judgment firsthand.) “The logical machines that have thus far been devised can deal with but a limited number of different letters,” continued Peirce. “The unaided mind is also limited in this as in other respects; but, the mind working with a pencil and plenty of paper has no such limitation . . . whatever limits can be assigned to its capacity to-day, may be over-stepped to-morrow.”
16
Peirce recognized the power of unbounded storage—the principle underlying Babbage's analytical engine and later formalized by Turing's universal machine.

Which came first, Turing machine or digital computer? It depends on whether precedence is assigned to the chicken or to the egg. Turing's analysis transcended architectural and genealogical particulars to reveal a universal fellowship among digital machines. At the time of “On Computable Numbers,” large numbers of Turing machines already existed in the form of punched-card tabulating, calculating, and data-processing machines. These devices mimicked their theoretical archetype by reading a mark on a piece of paper, shifting internal state accordingly, and making another mark somewhere else. Punched-card machines formed extensive systems whose components differentiated into the essential functions (input, output, storage, and central processing) that would characterize the vital organs of all computers in the years ahead.

The punched-card information-processing industry was developed by Herman Hollerith (1860–1929), who was hired in 1879 as a special agent for the tenth U.S. census. The 1880 census took almost seven years to count completely. If methods were not improved, the 1890 census would not be completed by the time the 1900 census began. Hollerith's supervisor, Dr. John S. Billings, encouraged his protégé to tabulate data by means of perforated cards, citing the precedent of railway tickets but not Babbage's engine or the Jacquard loom. Once a card was punched, the data could be read, sorted, and tabulated by machine. As a demonstration project, Billings arranged for Hollerith to tabulate vital statistics for the Baltimore Department of Health. Hollerith made the most of this opportunity, although, as his mother-in-law wrote in 1889, “he is completely tired out. He has been punching cards at the rate of 1,000 per day—and each card has at least a dozen holes. He has done it all with a hand punch and his arm was aching and paining dreadfully. He really looked quite badly.”
17
But the system worked.

Hollerith won the contract to tabulate the eleventh U.S. census of 1890, enumerating sixty-two million people using some fifty-six million cards. There were 288 punch positions, storing the equivalent of up to thirty-six 8-bit bytes of information per card. Results more detailed than those of any previous census were completed within two years. Punched-card equipment proliferated relentlessly as techniques developed to meet the ten-year cycle of the census were adapted to other purposes in between. Hollerith incorporated the Tabulating Machine Company in 1896, which was consolidated into the Computing-Tabulating-Recording Company (CTR) in 1911, and then renamed International Business Machines, or IBM, in 1924. Punched cards and perforated tapes not only served to convey and process information, but began to exercise control. In a 1922 Scientific American article subtitled “How Strips of Paper Can Endow Inanimate Machines with Brains of Their Own,” Emmanuel Scheyer predicted that “in some uncanny way, things will seem to be running themselves.”
18

By the time Turing awakened us to their powers, the age of discrete-state machines was well under way. “There is a great deal more arithmetic and better arithmetic in the world than there used to be,” reported Vannevar Bush in October 1936. “This is indicated by the fact that 10,000 tons of cards are used per year, a total of four billion cards. . . . The end of the development is not in sight.”
19
Each card was functionally equivalent to one or more cells of a Turing machine tape. A machine might be programmed to scan the whole card at once (a pattern of holes punched among ten rows and eighty columns representing an alphabet of 2^800 possible symbols); a single location on the card (the presence or absence of a hole representing an alphabet of 2 possible symbols); or some intermediate configuration that divided the data into fields. It was characteristic of the data-processing industry at that time (and closer to the original concept of a Turing machine than to the state of data processing today) that most of the complexity was represented by the tape (or card sequence) itself, rather than by the machine's internal state of mind. By repeated sorting and other iterated functions, primitive punched-card machines could perform complex operations, but, like the original Turing machine, they had only a small number of possible states.

The fundamental unit of information was the bit; its explicit definition as the contraction of “binary digit” was first noted in an internal Bell Laboratories memo written by John W. Tukey on 9 January 1947,
20
and first published in Claude Shannon's Mathematical Theory of Communication in 1948.
21
Shannon's definition was foreshadowed by Vannevar Bush's analysis, in 1936, of the number of “bits of information” that could be stored on a punched card. In those days bits were assigned only fleetingly to electrical or electronic form. Most bits, most of the time, were bits of paper (or bits of missing paper, represented by the chad that was carted off to landfills by the ton). “There is still a great deal of carrying cards from one machine to another, and each problem is unique,” wrote Bush, best known for his differential analyzer, the granddaddy of analog computers, and less known for his prediction, in the 1930s, of an inevitable shift toward digital machines. “A master control is here conceivable. This process should also be automatic and performed entirely by machine. It should be sufficient, having punched a stack of cards, then to punch a master card, dictating with complete flexibility the operations to be performed. . . . Such an arrangement would no doubt be soon worked out if there were sufficient commercial demand. . . . Quite a lot can be accomplished by arithmetic alone, if its operations can be performed rapidly enough, and combined. . . . It is time that a numerical machine were built for which the sequence of operations might be varied at will to cover a large field of utility, but just as fully automatic once the sequence is assigned.”
22
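
Bush's “master card” amounts to a program punched on the same medium as the data. A toy sketch of the idea (a hypothetical illustration, not Bush's design; the operation codes are invented): one control card carries a sequence of operation codes that is applied, in order, to a deck of data cards.

```python
# A deck of data cards plus one "master card" whose sequence of operation
# codes dictates what the machine does with the deck. Purely illustrative.

OPERATIONS = {
    "SORT":   lambda deck: sorted(deck),
    "DOUBLE": lambda deck: [2 * value for value in deck],
    "SUM":    lambda deck: [sum(deck)],
}

def run_master_card(master_card, data_deck):
    """Apply the operations punched on the master card, in order, to the deck."""
    deck = list(data_deck)
    for op_code in master_card:
        deck = OPERATIONS[op_code](deck)
    return deck

if __name__ == "__main__":
    # "Punch" a stack of data cards, then a master card dictating the sequence.
    print(run_master_card(["SORT", "DOUBLE", "SUM"], [7, 3, 5]))   # -> [30]
```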

Hollerith equipment was used throughout business, industry, and science to tabulate, store, sort, and condense large amounts of disordered information into a more ordered state. Punched-card routines assisted in searching for underlying patterns and analyzing accumulated results. By discovering correspondences between computable functions and incoming data the machines could be used to model relationships and even predict future sequences of events.

A Turing machine can also be configured to run the other way: instead of finding a pattern in an incoming sequence and producing a comprehensible result, it can transform an understandable message into an arbitrarily complicated sequence, producing an incomprehensible result. If a message is scrambled by a computable function (and thereby encoded as a computable number), someone with knowledge of that function can reverse the process to reconstruct the original message at the other end. At the close of World War I, such a cryptographic machine was invented by the German electrical engineer Arthur Scherbius (1878–1929), who proposed it to the German navy, an offer that was declined. Scherbius then founded the Chiffriermaschinen Aktien-Gesellschaft to manufacture the machine, christened the Enigma, for enciphering commercial communications, such as transfers between banks. The machine attracted a modest following, but sales were limited until the German navy changed its mind. Modified versions of the Enigma machine were adopted by the German navy in 1926, the German army in 1928, and the German air force in 1935.
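
A drastically simplified stand-in for such a machine (a hypothetical sketch, bearing no relation to the actual Enigma rotor wiring) makes the reversibility explicit: a key-dependent, position-dependent substitution scrambles the text, and the same function run backward with the same key recovers it.

```python
# A crude "stepping" substitution cipher: shift each letter by the key plus
# its position in the message. Anyone who knows the function and the key can
# run it in reverse; anyone who does not sees only a scrambled sequence.

ALPHABET = "ABCDEFGHIJKLMNOPQRSTUVWXYZ"

def scramble(message, key, decrypt=False):
    """Encrypt (or, with decrypt=True, invert) an uppercase-letter message."""
    out = []
    for position, letter in enumerate(message):
        shift = key + position
        if decrypt:
            shift = -shift
        out.append(ALPHABET[(ALPHABET.index(letter) + shift) % 26])
    return "".join(out)

if __name__ == "__main__":
    ciphertext = scramble("ATTACKATDAWN", key=3)
    print(ciphertext)                                 # incomprehensible without the key
    print(scramble(ciphertext, key=3, decrypt=True))  # -> ATTACKATDAWN
```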
