It is in this sense that Wheeler's and Gill's dissertations document a scientific style. Its fundamental trait was operationalism – the search for rules, procedures, operations, methods – but its product was operational knowledge. If a computer science was slowly emerging, the production or generation of operational knowledge about humans and machines cooperating in support of automatic computing was certainly one of its first manifestations.
1. D. J. Wheeler. (1951). Automatic computing with the EDSAC. PhD dissertation, University of Cambridge.
2. S. Gill. (1952). The application of an electronic digital computer to problems in mathematics and physics. PhD dissertation, University of Cambridge.
3. S. H. Lavington. (1998). A history of Manchester computers. London: The British Computer Society (original work published 1976); S. H. Lavington & C. Burton. (2012). The Manchester machines; S. H. Lavington (Ed.). (2012). Alan Turing and his contemporaries (chapter 4). London: British Computer Society.
4. A. Newell, A. J. Perlis, & H. A. Simon. (1967). What is computer science? Science, 157, 1373–1374.
5. P. Wegner. (1970). Three computer cultures: Computer technology, computer mathematics, and computer science. In F. L. Alt (Ed.), Advances in computers (Vol. 10, pp. 7–78). New York: Academic Press.
6. P. S. Rosenbloom. (2013). On computing. Cambridge, MA: MIT Press; P. J. Denning. (2007). Computing is a natural science. Communications of the ACM, 50, 13–18; P. J. Denning & P. A. Freeman. (2009). Computing's paradigm. Communications of the ACM, 52, 28–30.
7. Wheeler, op. cit., Preface.
8. Gill, op. cit., Preface.
9. Ibid.
10. Wheeler, op. cit., p. 25.
11. Ibid., p. 26.
12. Ibid., p. 49.
13. S. Dasgupta. (1996). Technology and creativity (pp. 33–34). New York: Oxford University Press.
14. M. Polanyi. (1962). Personal knowledge (p. 176). Chicago, IL: University of Chicago Press.
15. Dasgupta, op. cit., pp. 157–158.
16. Gill, op. cit., p. 40.
17. Ibid., p. 41.
18. Ibid., p. 203.
19. Ibid.
20. Ibid., p. 204.
21. Ibid., p. 49.
22. Ibid., pp. 62–87.
23. S. J. Gould. (1977). Ontogeny and phylogeny (p. 483). Cambridge, MA: Belknap Press of Harvard University Press.
24. See, for example, J. Piaget. (1976). The child & reality. Harmondsworth, UK: Penguin Books; M. Donaldson. (1992). Human minds: An exploration (p. 190). Harmondsworth, UK: Penguin Books.
25. Gill, op. cit., p. 63.
26. Ibid.
27. Ibid.
28. Ibid., p. 67.
29. Ibid., p. 71.
30. Ibid., p. 72.
31. Ibid., p. 77.
32. Ibid., p. 78.
33. The term was apparently coined by another member of the EDSAC group, an Australian, John Bennett (1921–2010), who was, in fact, the first research student to join the Mathematical Laboratory in Cambridge. See M. V. Wilkes. (1985). Memoirs of a computer pioneer (p. 140). Cambridge, MA: MIT Press. See also Gill, op. cit., p. 78.
34. F. P. Brooks, Jr. & K. E. Iverson. (1969). Automatic data processing: System/360 edition (pp. 365 ff.). New York: Wiley.
35. Gill, op. cit., p. 80.
36. Ibid., p. 87.
37. H. Wölfflin. (1932). Principles of art history. New York: Dover Publications.
38. N. Pevsner. (1962). An outline of European architecture. Harmondsworth, UK: Penguin Books.
39. R. Wollheim. (1984). Painting as an art (p. 26 et seq.). Princeton, NJ: Princeton University Press.
40. See, for example, S. Dasgupta. (2003). Multidisciplinary creativity: The case of Herbert A. Simon. Cognitive Science, 27, 683–707.
41. Ibid.
THE 1940S WITNESSED the appearance of a handful of scientists who, defying the specialism characteristic of most of 20th-century science, strode easily across borders erected to protect disciplinary territories. They were people who, had they been familiar with the poetry of the Nobel laureate Indian poet-philosopher Rabindranath Tagore (1861–1941), would have shared his vision of a “heaven of freedom”:

Where the world has not been broken up into
fragments by narrow domestic walls.1
Norbert Wiener (1894–1964), logician, mathematician, and prodigy, who was awarded a PhD by Harvard at age 17, certainly yearned for this heaven of freedom in the realm of science as the war-weary first half of the 20th century came to an end. He would write that he and his fellow scientist and collaborator Arturo Rosenblueth (1900–1970) had long shared a belief that, although during the past two centuries scientific investigations had become increasingly specialized, the most “fruitful” arenas lay in the “no-man's land” between the established fields of science.2 There were scientific fields, Wiener remarked, that had been studied from different sides, each side bestowing its own name on the field, each ignorant of what the others had discovered, thus creating work that was “triplicated or quadruplicated” because of mutual ignorance or incomprehension.3
Wiener, no respecter of “narrow domestic walls,” would inhabit such “boundary regions” between mathematics, engineering, biology, and sociology, and create cybernetics, a science devoted to the study of feedback systems common to living organisms, machines, and social systems. Here was a science that straddled the no-man's land between the traditionally separate domains of the natural and the artificial. Wiener's invention of cybernetics after the end of World War II was a marker of a certain spirit of the times when, in the manner in which Wiener expressed his yearning, scientists began to forge serious links between nature and artifact.
It is inevitable that this no-man's land between the natural and the artificial should be part of this story. Ingenious automata – devices that replicated, under their own steam (so to speak), certain kinds of actions performed by living things, including humans – had been known since antiquity (see Chapter 3, Section IX). However, the computer was an entirely new genus of automata, for it seemed to replicate not action but human thought.
Ada, Countess of Lovelace, had cautioned her reader not to mistake the Analytical Engine for anything but a machine. It had no power to initiate anything; it could only do what humans had “ordered” it to do (see Chapter 2, Section VIII). However, by the early 1940s, even before a stored-program digital computer of any kind had been conceived, but stimulated by such analog machines as the differential analyzer, human imagination had already stepped into the boundary region separating man from machine, the natural from the artificial – had straddled and bridged the chasm. The year 1943 was noteworthy in this respect on both sides of the Atlantic.
That year, in Cambridge, England, Kenneth Craik (1914–1945), trained as a philosopher and psychologist and, like his contemporary Maurice Wilkes, a fellow of St. John's College, published a short book called The Nature of Explanation. In a chapter titled “Hypothesis on the Nature of Thought,” he explored the neural basis of thought. He suggested that the essence of the thought process is symbol processing, of a kind similar to what we are familiar with in mechanical calculating devices.4 He drew this analogy, let us remember, at a time when, in Cambridge, the digital computer was still a few years away, when the archetypal calculating machine that he knew was the model differential analyzer,5 which he may have seen in use in the department of physical chemistry at the university (see Chapter 8, Section XI).
Indeed, Craik argued, it was not merely that thought uses symbol processing, but that all of thought is symbol processing.6 The process of thinking, as Craik conceived it, involved “the organism” carrying in the head symbolic representations of aspects of the external world, along with symbolic representations of the organism's actions. Thought, then, entails manipulating these symbolic models by the represented actions – that is, by symbolically simulating actions and their effects on external reality. Such symbolic simulation parallels the way analog computers (such as the differential analyzer) represent a system analogically and compute on the representation.
Craik, as Wilkes recalled in his memoir, was perhaps unusual among philosophers and psychologists because he was seriously interested in gadgets. Apparently, he made pocket-size models of objects like steam engines7 – thus, perhaps, the analogy between thinking and mechanical calculation. Unfortunately, he had no chance to pursue his hypothesis, for he was hit and killed by a car while bicycling on a Cambridge street on May 7, 1945, the eve of VE Day.8
Craik's insight that thinking involves symbolic representations, in the nervous system, of things in the world, and the processing of such symbolic representations, speculative though it was, makes him one of the earliest figures in the emergence of what much later came to be named cognitive science – the study of the mental processes by which humans (and some animals) make meaning of their experiences in the world.9 However, although his ideas were widely discussed by neurophysiologists and psychologists in Britain,10 he had no apparent impact on the other side of the Atlantic. But then, America had its own first explorers of the relationship between cerebration and computation, who advanced their own, very different, and somewhat more precise views of this relationship. By coincidence, these explorers, too, published their first work on this relationship in 1943.
That year, an article titled “A Logical Calculus of the Ideas Immanent in Nervous Activity” was published by Warren McCulloch (1898–1968), a physician-turned-neurophysiologist who, like John von Neumann, was a polymath of the kind that would have warmed (and no doubt did warm) the cockles of Wiener's heart, and Walter Pitts (1923–1969), a mathematical logician. The journal in which the article appeared, the Bulletin of Mathematical Biophysics, suggests that the target reader was a theoretical biologist. This, despite the fact that the paper cited only three references, all authored by world-renowned logicians.11
These authors were interested in constructing a formalism for describing neural activity. According to the thinking of the time in theoretical neurophysiology, the nervous system comprised a network of nerve cells, or neurons. A neuron connects to others through nerve fibers called axons, which branch out through finer structures called dendrites, and these end on the surfaces of other neurons in the form of entities called synapses.
A neuron generally has more than one synapse impinging on it. At any instant, a neuron is activated when the sum of the activities of its “input” synapses reaches a certain threshold. Such synapses are called excitatory synapses. If the threshold is not reached, the neuron remains quiescent. However, there are also inhibitory synapses, which inhibit the excitation of the neurons on which they impinge regardless of the excitation level of the excitatory synapses connecting to those same neurons.
A neuron, according to the general understanding of the time, has an “all-or-none” nature; it is either active or inactive. It is, thus, a binary digital device. The activity of a network of neurons can be determined from the pattern of binary activity of its constituent neurons. It was this apparent binary character of neurons that prompted McCulloch and Pitts to draw on Boolean (or propositional) logic to describe neural activity – just as, 5 years earlier, Claude Shannon had analyzed relay switching circuits using Boolean logic (see Chapter 5, Section IV). They imagined the behavior of a neuron or a neuron network in terms of the language of Boolean propositions,12 and they represented the neuron accordingly:
1. The activity of the neuron is an “all-or-none” process.
2. A certain fixed number of synapses must be excited … in order to excite a neuron at any time.
3. The only significant delay within the nervous system is synaptic delay.
4. The activity of an inhibitory synapse absolutely prevents excitation of the neuron at that time.
5. The structure of the net does not change with time.13
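Read operationally, these postulates describe a simple threshold device with an absolute veto. The short Python sketch below is one possible rendering of that rule for a single synaptic delay; the function and parameter names are illustrative choices for this sketch, not terminology from the 1943 paper.

    # A minimal sketch of the rule implied by the five postulates above.
    # Names are illustrative only, not taken from McCulloch and Pitts (1943).
    def mp_neuron(excitatory, inhibitory, threshold):
        """Return the neuron's activity (1 or 0) after one synaptic delay.

        excitatory -- 0/1 activities on the excitatory synapses
        inhibitory -- 0/1 activities on the inhibitory synapses
        threshold  -- fixed number of active excitatory synapses needed to fire
        """
        if any(inhibitory):        # postulate 4: absolute inhibition
            return 0
        # postulates 1 and 2: all-or-none firing once the threshold is reached
        return 1 if sum(excitatory) >= threshold else 0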
An example of a McCulloch–Pitts neuron is shown in Figure 11.1. Here, A and B signify synaptic “inputs” to the neuron, and C denotes the “output.” The number in the “neuron” shows the threshold of activation. Suppose at the start of some fixed time interval t both A and B are inactive. Then, at the start of the next time interval, t + 1, the neuron is inactive. If either A or B is active and the other inactive, the neuron also remains inactive because the threshold of activation has not been reached. Only if both A and B are active at the start of time interval t will the neuron be activated at the start of the t + 1 time interval. Here, the neuron functions as a Boolean AND device satisfying the proposition C = A AND B.
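Using the illustrative mp_neuron sketch given after the five postulates, the neuron of Figure 11.1 (two excitatory inputs A and B, no inhibitory input, threshold 2) can be exercised over all four input combinations; it fires only when both inputs are active, which is exactly the Boolean AND.

    # The Figure 11.1 neuron: threshold 2, inputs A and B, no inhibitory synapses.
    for A in (0, 1):
        for B in (0, 1):
            C = mp_neuron([A, B], [], threshold=2)
            print(A, B, C)   # C is 1 only when A and B are both 1: C = A AND B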