Authors: Michael Smith
Colossus could be used to provide the original 41×31 excesses that were written into the rectangle. Wrens did convergences and became very good at it. Once a convincing pair of patterns was produced, Colossus could be used to get partial patterns for the other chi-wheels, and eventually to give complete patterns. At first this involved Wrens or (much worse) cryptanalysts going behind the machine and fiddling with the gadgets that set up the wheel-patterns. The engineers hated anything
like that and soon provided us with panels at the front of some of the Colossi on which we could do the job without risk to the machines.
The method worked well enough and we sent reliable de-chis to the Testery; but we wondered at first whether they could expand their method of ψ-setting to the much harder job of ψ-breaking. We need not have wondered. In February 1944, they had their first such success. The method was to fit a longish crib, say thirty characters long, that gave a plausible stretch of ψ'; to contract this to a stretch of ψ, perhaps twenty long; and to cycle the twenty-long stretches of ψ1, ψ2, … through one cycle each. The shortest cycle-length is forty-three and the longest fifty-nine, so that there would be four consecutive complete ψ characters.
The analyst could guess whereabouts they would appear, probably extended, and place them by recognizing a fragment of plain-text. From then on it was a question of extending the plaintext and building up the complete psi-patterns.
This first success was rightly hailed as a triumph, but it was repeated confidently on almost all occasions on which we sent a correct de-chi (and even sometimes when it wasn’t entirely correct). In one very favourable case, the ψs were recovered in 35 minutes.
So by early 1944, we could break the patterns and decipher transmissions on any link/month that had one long enough transmission or a readable depth. But soon we had an addition to our armoury. Sixta discovered that some messages on perforated tape were sent on more than one link. For this to be fruitful three things had to go right: a pair of such transmissions had to be spotted; one of them had to be deciphered; and the plain-text had to be matched against the correct stretch in the other and the resulting key broken. Sixta had useful clues to spotting a pair. Each message in a transmission carried a serial number, which would, of course, be enciphered; when a transmission had been deciphered by its recipient, he often sent, as a receipt, the last two digits of the serial numbers of each message in the transmission. When Sixta found among the receipts of different links the same pair of digits at similar times there was a chance that they included the same message. Since links generally passed more than a hundred messages each day, there were plenty of coincidences that had not arisen from retransmission. Sixta became familiar with pairs of links liable to be sending the same message, and they could often make use of priority signals and other indications.
They would submit to the Testery crib section daily predictions of retransmissions. This crib section read all deciphered messages and could often pick out a message that was likely to have been retransmitted on another link. They would pass such ideas to Sixta and could include the enciphered serial number, from which Sixta could often pick out a likely pair. Most of the successful pairings were spotted in this way.
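Sixta's pairing heuristic, spotting the same two-digit receipt on different links at similar times, can be sketched in outline. The data layout and the time window below are illustrative assumptions of mine, not the section's actual working practice:

```python
from collections import defaultdict

def candidate_pairs(receipts, window=6):
    """Flag receipts on *different* links that quote the same two-digit
    serial fragment at similar times.

    receipts: list of (link, hour, digits) tuples.
    Returns (digits, link1, link2) candidates for a retransmitted message.
    """
    by_digits = defaultdict(list)
    for link, hour, digits in receipts:
        by_digits[digits].append((link, hour))
    pairs = []
    for digits, items in by_digits.items():
        for i, (l1, h1) in enumerate(items):
            for l2, h2 in items[i + 1:]:
                # same digits on two different links, close in time
                if l1 != l2 and abs(h1 - h2) <= window:
                    pairs.append((digits, l1, l2))
    return pairs
```

As the section notes, most coincidences were not retransmissions at all, so output like this could only ever be a list of candidates for closer inspection.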
The plain-text to be used was provided by the Testery and sent with the supposed match to the Newmanry crib section. Robinson was the ideal machine to subtract the plain-text at all likely offsets from stretches of the cipher text, and count for some suitable property of key. Many such properties were proposed and used. When a convincing match was found, the resulting stretch of key was broken by standard methods.
All this called for close liaison with Knockholt, painstaking checking both of plain and cipher text, and ingenuity and mathematics to devise the most efficient tests. The Newmanry crib section worked on 250 cases out of nearly 900 suggested by Sixta. Of these 72 were successful.
Throughout the whole Tunny operation, research and experience led to continual improvement. Experience was responsible for one improvement. The Testery deciphered the transmissions that had been set, and they often found that the decipherment went wrong before the end. They soon discovered that the cipher tape sent by Knockholt sometimes left out or inserted a character or two, generally because of poor radio reception. We called that a ‘slide’. A slide, of course, would weaken any statistical effect being used to set patterns. We therefore asked for subsequent Colossi to have the facility to ‘span’, that is, to restrict the count to any stretch of the cipher tape that we could specify by giving the positions at which to start and to end. A good count could then be repeated on (say) the first two-thirds and again on the last two-thirds of the tape. Often one was strong and the other weak, suggesting a slide; subsequent spans could home in on the approximate position of the slide and later runs could be limited to the largest span available that seemed to be slide-free.
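The homing-in procedure can be idealized as a bisection on the prefix count: the count stays at full strength until the span reaches past the slide. Real counts were noisy and the work was done by judgement over successive runs, so this is a sketch of the logic only, with a made-up scoring interface:

```python
def find_slide(span_score, length, strong_rate, min_span=8):
    """Home in on the approximate position of a slide by bisection.
    span_score(a, b) is the statistical count on the span [a, b);
    strong_rate is the expected per-character count on clean tape."""
    lo, hi = 0, length
    while hi - lo > min_span:
        mid = (lo + hi) // 2
        if span_score(0, mid) >= strong_rate * mid:
            lo = mid   # prefix still clean: the slide lies further on
        else:
            hi = mid   # prefix weakened: the slide is before mid
    return lo, hi
```

Once the slide is bracketed, later runs can be limited to the largest clean span, just as the text describes.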
In the Testery, they sharpened their methods and came to know more and more about the cribs useful for the various links. They commissioned and received a machine called Dragon. This was used on a de-chi; a likely crib was ‘dragged’ against it; it was subtracted at a range of plausible positions to give a stretch of putative ψ'. Dragon removed any repeated characters to give putative unextended ψ, and checked against the known psi-patterns, only recording a possible hit when it could fit all five.
They also refined Turingery by a process (I think Peter Hilton invented it) called Devil Exorcism. In Turingery, characters of Δψ' that look likely to be / are assumed to be so, giving consequent Δχ bits that get cycled throughout. When the assumption is wrong it can proliferate wrong bits of Δχ. Devil exorcism was a technique that limited the damage done by such wrong assumptions.
Both sections kept research books for anyone to record bright ideas. The Newmanry produced over forty of these, containing suggestions of new routines, calculations of significance tests, bits of mathematics that seemed relevant, reports of plain-text statistics and much else. In the Newmanry, tea parties were started; anyone could call one, ideas were bandied about, you came if you could and you brought your own tea. Tea parties didn’t decide things but they led to action on all fronts. Newman ran a comfortably democratic and friendly section. I was not the only one to persuade one of the Wrens to marry him. Mathematicians had occasional weeks off for research or secondment. I remember with particular pleasure a secondment to the Testery where Hilton guided me through a successful Turingery.
Both sections grew steadily. The Testery recruited from the military, the Newmanry from the Wrens, the universities and from Dollis Hill, which also supplied us with one new Colossus each month, as well as most of the other machines. We depended totally on our engineers, who installed incoming machines and maintained the battery already installed. It was only because of them that we could keep on increasing our output.
The Wrens were housed in the splendour they deserved in Woburn Abbey. We eventually had over 250 of them in the Newmanry, and their contribution (not just socially) was outstanding. They controlled the flow of tapes and ordered the runs to be made, with occasional advice from the analyst who was duty officer. They operated the machines, including the Colossi, with aplomb. Some of them would
be in charge of Colossi on their own, moving expertly from run to run. For wheel-breaking, there was always an analyst in charge, but he would be helpless without a Wren to wind the tapes onto the bedstead and plug up the runs.
Several analysts were seconded to us from the US Army and one from the US Navy; we also had highly professional advice at our tea parties from a US liaison officer. Unquestionably the Americans had generously sent us some of their best men. Four of them later held, as civilians, highly important posts in the postwar National Security Agency. They also sent us, near the end of the war, a photographic team. They had developed efficient ways of comparing two streams at all offsets by sliding photographic strips over each other and measuring the amount of light that shone through – instantaneous counting. It had the advantage of speed over Robinson, but came too late to make the impact the equipment deserved.
The Germans saluted the Normandy landings by changing wheel-patterns not monthly but daily. That meant that we had to break the patterns every day for all the links of intelligence interest. Fortunately by then our techniques of converging rectangles and completing the χ-breaking on Colossus were well established; we just had more of it to do. The Newmanry had by then spread into Block H, where the wheel-breaking and the crib section were sited. Block H housed a large roomful of Wrens converging rectangles, and the Colossi that had been adapted for wheel-breaking. The Testery had more de-chis from which to break the psi- and μ-patterns; and, as the Germans progressively abandoned the autoclave limitations, more depths to disentangle. The job of setting known psi- and μ-patterns on de-chis was taken back into the Newmanry; Colossus was very well adapted for the job and we then no longer had to make de-chi tapes. That meant that, once a day’s patterns had been broken, all the setting was done in the Newmanry (as it had been in September 1943) and the settings were sent to the Testery for them to decipher the transmission.
This period lacked incident except for the fire that we had in one of the Block F rooms. The fire was put out, and on that day, as it so happened, we solved more transmissions than ever before. All-round
success persisted; technical advances and refinements continued. We were still getting better at it when the German war came to an end. Soon after that we were happy to be sent a mobile German Tunny machine; it was probably of more detailed interest to the engineers and telecommunication experts than to the rest of us.
After the war some from both sections joined GCHQ. One Newmanry analyst was rumoured to have set up a bogus university that issued impressive degree documents to anyone willing to buy. Most of the mathematicians ended up teaching at real universities. Two Testery analysts were bold enough to start new organizations: Roy Jenkins later initiated and led the Social Democratic Party; and, most remarkably, Peter Benenson started Amnesty, which later became the highly influential Amnesty International. I never lose an opportunity to claim them as wartime colleagues. Whatever we did after the war – ATS, Wrens, engineers, analysts – we all knew that we had been part of a very successful joint venture, and that it had significantly helped the Allied armed forces to win the German war.
Historians describe intelligence as ‘the missing dimension’. Without an understanding of the intelligence those in power were receiving and its influence on the decisions they made, any history will be incomplete. It is now easy to see how important the breaking of the Axis codes and ciphers at Bletchley Park was in influencing the Allies’ conduct during the Second World War. But while details of many of the Cold War intelligence operations have made their way into the public domain, very little is still known about codebreaking successes during that period. We do not as yet know quite how important Sigint was in the decisions taken by Western politicians during the critical moments of the Cold War such as the Cuban missile crisis, the Soviet invasions of Hungary and Czechoslovakia, and the Solidarity-led industrial unrest in Poland. Sigint remains the ‘missing dimension’ to any history of the second half of the twentieth century.
But it is not the only ‘missing dimension’. The second half of the twentieth century was dominated by the computer age. The tentative first steps taken during the war led initially to widespread use of computers within industry, later to the advent of the personal computer and finally to the construction of the Internet. Computers now dominate virtually every aspect of our lives but until only recently there was a ‘missing dimension’ to the history of the computer age, and one in which, like the intelligence resulting from the breaking of the Axis codes and ciphers, Bletchley Park played a groundbreaking role.
There were many remarkable people at Bletchley Park. But in early 1943, two extraordinarily gifted men came together to create Colossus, the world’s first electronic digital computer. Alan Turing did not, as is sometimes assumed, have any role in the construction of Colossus. A telephone engineer called Tommy Flowers built the computer, working on specifications laid down by Max Newman, a Cambridge mathematician. Told that it might affect the course of the war, Flowers had Colossus up and running by the end of 1943. The full details of that story remained secret until only very recently. Even now, Colossus remains something of a ‘missing dimension’ in computing history. Books claiming that the American ENIAC computer was the first electronic digital computer continue to be published. This chapter, by the distinguished computer historian Professor Jack Copeland, tells the true story of the birth of the modern computer.
MS
Colossus was the world’s first large-scale electronic digital computer. In the hands of the Bletchley Park codebreakers, it gave the Allies access to the most secret German radio communications, including messages from Hitler to his front-line generals. It was built during 1943 by Thomas (Tommy) Flowers and his team of engineers and wiremen, a ‘band of brothers’ who worked in utmost secrecy and at terrific speed. The construction of the machine took them ten months, working day and night, pushing themselves until (as Flowers said) their ‘eyes dropped out’. In January 1944, the racks of electronic components were transferred from Flowers’ workshops at the Post Office Research Station at Dollis Hill, north London, to Bletchley
Park, where Colossus was assembled by Flowers’ engineers. Despite its complexity, and the fact that no such machine had previously been attempted, the computer was in working order almost straight away – testimony to the quality of Flowers’ design and the accuracy of the engineering work carried out at Dollis Hill. The name ‘Colossus’ was certainly apt: the machine was the size of a small room and weighed approximately a ton. By February 1944 Colossus was in use by the codebreakers of the Newmanry, a Bletchley section named after its head and founder, Cambridge mathematician Maxwell (Max) Newman.
By the end of the war in Europe, ten Colossi were working in the Newmanry, all built by Flowers’ team at Dollis Hill, and an eleventh was partly assembled. Most Colossi were broken up once hostilities ceased. Some of the electronic panels – counters, shift-registers, and so forth – were taken to Newman’s newly created Computing Machine Laboratory at Manchester University, once all traces of their original use had been removed. Two intact Colossi were retained by GCHQ at Cheltenham. The last Colossus is believed to have stopped running in 1960. During its later years it was used for training purposes.
Those who knew of Colossus were forbidden by the Official Secrets Act from sharing this knowledge. A long time passed before the secret came out that Flowers had built the first working electronic computer. During his lifetime (he died in 1998) he never received the full recognition he deserved. Many history books, even recently published ones, claim that the first electronic digital computer was an American machine built by J. Presper Eckert and John Mauchly – the ENIAC. Started in 1943, the ENIAC was designed to calculate the numerical tables used when aiming artillery in the field. However, the ENIAC was not operational until 1945. Before the 1970s, few had any idea that electronic computation had been used successfully during the Second World War. In 1975, the British Government released a set of captioned photographs of Colossus. Also during the 1970s, articles appeared by two former Newmanry codebreakers, Jack Good and Donald Michie, giving the barest outlines of Colossus, and historian Brian Randell published material derived from interviews with Flowers, Newman, Good, Michie, and others. By 1983, Flowers had received clearance from the British Government to publish an account of the hardware of Colossus I. Details of the later Colossi, and of how Flowers’ computing machinery was actually used by the
cryptanalysts, remained secret. There matters stood, more or less, until 1996, when the US Government released numerous wartime documents concerning Colossus and the cryptanalytical processes in which it figured. Even then, however, one historically vital document, the ‘General Report on Tunny’, remained classified. This detailed two-volume account of the cryptanalytical work involving Colossus was written in 1945 by Good, Michie, and their colleague Geoffrey Timms. Thanks largely to Michie’s tireless campaigning, this report was declassified in 2000, finally ending the secrecy and enabling the full story of Colossus to be told for the first time.
In the original sense of the word, a computer was not a machine at all, but a human being – a mathematical assistant whose task was to calculate by rote, in accordance with a systematic method supplied by an overseer prior to the calculation. The computer, like a filing clerk, might have little detailed knowledge of the end to which his or her work was directed. Many thousands of human computers were employed in business, government, and research establishments, doing some of the sorts of calculating work that nowadays is performed by electronic computers.
The term ‘computing machine’ was used increasingly from the 1920s to refer to small calculating machines which mechanized elements of the human computer’s work. For a complex calculation, several dozen human computers might be required, each equipped with a desk-top computing machine. By the 1940s, however, the scale of some calculations required by physicists and engineers had become so great that the work could not easily be done in a reasonable time by even a roomful of human computers with desk-top computing machines. The need to develop high-speed large-scale computing machinery was pressing.
During the late 1940s and early 1950s, with the advent of electronic computing machines, the phrase ‘computing machine’ gave way gradually to ‘computer’. As Turing stated, the new machines were ‘intended to carry out any definite rule of thumb process which could have been done by a human operator working in a disciplined but unintelligent manner’. During the brief period in which the old and new meanings of ‘computer’ co-existed, the prefix ‘electronic’ or ‘digital’ would usually be used to distinguish machine from human.
A computer, in the later sense of the word, is any machine able to do work that could, in principle, be done by a human computer.
Mainframes, laptops, pocket calculators, palm-pilots – all are computing machines, carrying out work that a human rote-worker could do, if he or she worked long enough, and had a plentiful enough supply of paper and pencils.
The Victorian Charles Babbage, many decades ahead of his time, was one of the first to grasp the huge potential of the idea of using machinery to compute. Babbage was Lucasian Professor of Mathematics at the University of Cambridge from 1828 to 1839. Babbage’s long-time collaborator was Ada, Countess of Lovelace (daughter of the poet Byron). Her vision of the potential of computing machines was perhaps in some respects more far-reaching even than Babbage’s own. Lovelace envisaged computing machines going beyond pure number-crunching, suggesting that one of Babbage’s planned ‘Engines’ might be capable of composing elaborate pieces of music.
In about 1820 Babbage proposed an ‘Engine’ for the automatic production of mathematical tables (such as logarithm tables, tide tables, and astronomical tables). He called it the ‘Difference Engine’. This was, of course, the age of the steam engine, and Babbage’s Engine was to consist of more accurately machined versions of the types of components then found in railway locomotives and the like – brass gear wheels, rods, ratchets, pinions, and so forth. Decimal numbers were represented by the positions of 10-toothed metal wheels mounted in columns. Babbage exhibited a small working model of the Engine in 1822. He never built the full-scale machine he had designed, but did complete several parts of it. The largest of these – roughly 10 per cent of the planned machine – is on display in the London Science Museum. Babbage used it successfully to calculate various mathematical tables. In 1991, his ‘Difference Engine No. 2’ was finally built from the original design and is also on display at the London Science Museum – a glorious machine of gleaming brass.
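The principle behind the Difference Engine, tabulating a polynomial by additions alone, rests on the fact that the n-th differences of a degree-n polynomial are constant. It is easy to demonstrate in a few lines:

```python
def difference_engine(initial_diffs, steps):
    """Tabulate a polynomial by repeated addition, as the Difference
    Engine did: for a degree-n polynomial the n-th differences are
    constant, so each new table entry costs only n additions."""
    diffs = list(initial_diffs)          # [f(0), Δf(0), Δ²f(0), ...]
    table = [diffs[0]]
    for _ in range(steps):
        for i in range(len(diffs) - 1):  # ripple the additions down
            diffs[i] += diffs[i + 1]
        table.append(diffs[0])
    return table
```

Seeded with f(0)=0, Δf(0)=1, Δ²f(0)=2, it tabulates the squares 0, 1, 4, 9, 16, 25 using nothing but addition, which is exactly what Babbage’s columns of 10-toothed wheels mechanized.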
Babbage also proposed the ‘Analytical Engine’, considerably more ambitious than the Difference Engine. Had it been completed, the Analytical Engine would have been an all-purpose mechanical digital computer (whereas the Difference Engine could perform only a few of the tasks that human computers carried out). A large model of the Analytical Engine was under construction at the time of Babbage’s death in 1871, but a full-scale version was never built. The Analytical Engine was to have a memory, or ‘store’ as Babbage called it, and a central processing unit, or ‘mill’. The behaviour of the Analytical
Engine would have been controlled by a program of instructions contained on punched cards connected together by ribbons (an idea Babbage adopted from the Jacquard weaving loom). The Analytical Engine would have been able, like Colossus, to select from alternative actions on the basis of the outcomes of its previous actions – a facility nowadays known as ‘conditional branching’.
Babbage’s idea of a general-purpose calculating engine was well known to some modern pioneers of automatic calculation. In the US during the 1930s, Vannevar Bush and Howard Aiken, both of whom built successful computing machines in the pre-electronic era, spoke of accomplishing what Babbage set out to do. Babbage’s ideas were remembered in Britain also, and his proposed computing machinery was on occasion a topic of lively mealtime discussion at Bletchley.
Two key concepts in the development of the modern computer are the ‘stored program’ and the ‘universal computing machine’. As everyone who can operate a personal computer knows, the way to make the machine perform the task you want – word-processing, say – is to open the appropriate program stored in the computer’s memory. Life was not always so simple. The Analytical Engine did not store programs in its memory. Nor did Colossus. To set up Colossus for a different job, it was necessary to modify by hand some of the machine’s wiring. This modification was done by means of switches and plugs. The larger ENIAC was also programmed in this way. It was something of a nightmare: in order to set up the ENIAC for a fresh job, the operators would have to spend a day or more rerouting cables and setting switches. Colossus, ENIAC, and their like are called ‘program-controlled’ computers, in order to distinguish them from the modern ‘stored-program’ computer.
This basic principle of the modern computer, i.e. controlling the machine’s operations by means of a program of coded instructions stored in the computer’s memory, was thought of by Alan Turing. At the time, Turing was a shy, eccentric student at Cambridge University. He went up to King’s College in October 1931 to read Mathematics and was elected a Fellow of King’s in the spring of 1935, at the age of only twenty-two. In 1936 he left England to study for a Ph.D. in the United States, at Princeton University, returning in 1938. Turing’s ‘universal computing machine’, as he called it – it would soon be known simply as the universal Turing machine – emerged from research that no one would have guessed could have any practical
application. In 1936, Turing was working on an obscure problem in mathematical logic, the so-called ‘decision problem’, which he learned of from lectures on the foundations of mathematics and logic given by Newman. While thinking about this problem, Turing dreamed up an abstract digital computing machine which, as he said, could compute ‘all numbers which could naturally be regarded as computable’.
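The essentials of such an abstract machine, a tape of symbols, a head, and a finite table of rules, can be captured in a short simulator. The rule-table format below is a modern convenience, not Turing's own notation:

```python
def run_tm(program, tape, state='start', steps=1000):
    """A minimal Turing-machine simulator.  `program` maps
    (state, symbol) -> (write, move, next_state); the machine halts
    when no rule applies.  '_' is the blank symbol."""
    cells = dict(enumerate(tape))
    head = 0
    for _ in range(steps):
        key = (state, cells.get(head, '_'))
        if key not in program:
            break
        write, move, state = program[key]
        cells[head] = write
        head += 1 if move == 'R' else -1
    if not cells:
        return ''
    lo, hi = min(cells), max(cells)
    return ''.join(cells.get(i, '_') for i in range(lo, hi + 1))

# a toy rule table: the unary successor, which appends one '1'
successor = {
    ('start', '1'): ('1', 'R', 'start'),   # skip over the existing 1s
    ('start', '_'): ('1', 'R', 'done'),    # write one more 1 and halt
}
```

Trivial as the example is, machines of exactly this form are enough, Turing showed, to compute every number that ‘could naturally be regarded as computable’.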