April 16, 2012, Issue

The Great Numbers Crunch

Turing’s Cathedral: The Origins of the Digital Universe, by George Dyson (Pantheon, 432 pp., $29.95)

This year marks the centenary of British mathematician Alan Turing, whose researches in the unlikely and very abstruse field of mathematical logic did much to create the world in which we now live. In 1936, Turing published a paper titled “On Computable Numbers” in the Proceedings of the London Mathematical Society. The paper received almost no attention. “Only two requests for reprints came in,” George Dyson tells us. The reason for this is interesting — is, in fact, one of the main themes in Dyson’s book.

It is an odd thing that, in 1936, digital technologies were old hat. The coming age looked to be entirely analog. (Digital phenomena are staccato, stepping from one value directly to another; analog is legato, gliding smoothly through all intermediate points.) If you had asked a well-informed person of that date to point to some digital technologies, he would have cited the Western Union man with his green eyeshade and sleeve garters, tapping out Morse code on the telegraph key, or perhaps the Asian shopkeeper working his abacus. The spiffiest new devices were all analog: radio, movies, vinyl discs, and soon TV and radar.

So were the latest grand scientific theories. The spacetime of general relativity flexed analogically to accommodate mass and charge. Quantum mechanics contained some irritatingly digital elements — you can’t have half a quantum — but the underlying equations were written in the comfortingly analogic language of traditional calculus. Only biology had been through a modest digital revolution. The notion of “blending inheritance” (the trait of the offspring falls halfway between the corresponding trait in the parents) had caused much vexation to 19th-century biologists, including Darwin, as it led logically to a population of clones; but no one could come up with an alternative. The 1900 rediscovery of Mendel’s more digital theory resolved the issue.

Genetics aside, the late 1930s was thus a time of analog triumphalism. Across the following decades, everything changed. We now live in a thoroughly digital world. Digital gadgets twitter and beep all around us. At the deepest level, there are serious speculations that spacetime itself may be digital: Scientific American magazine recently did a cover story on the topic. Analog principles and gadgets survive only in a few pockets of the deepest reaction. My own house, for example, contains an analog TV set and a slide rule.

George Dyson tells the story of this great conceptual and technological transformation in Turing’s Cathedral, concentrating on the key years from 1936 to 1958. It was in the latter year that the computer at the Institute for Advanced Study in Princeton was decommissioned after seven years of operation. The IAS computer, which had no formal name (MANIAC, with which it is confused, was a clone machine at Los Alamos), was the brainchild of John von Neumann, one of the most tremendous geniuses who ever lived. He had been one of the first to notice Turing’s 1936 paper — the two shared office space in Princeton. Turing studied for his Ph.D. at the university, from 1936 to 1938; von Neumann had come to the university in 1930, then been given a professorship at the new IAS in 1933.

They shared much else. Though both were brilliant pure mathematicians, neither disdained physical gadgetry. When researching a book on a famous conjecture in pure mathematics, I was surprised to learn that Turing had conceived the idea of a mechanical computing device to disprove the conjecture, and had even cut some of the gear wheels himself in his college’s engineering workshop.

There lay the reason for the lack of interest in Turing’s 1936 paper. In it he had conceived the idea of a universal computing machine: an imaginary device that could duplicate the behavior of any other you might think up. The paper was founded in the purest of pure mathematics, drawing from work by the previous generation of mathematical logicians, who themselves had built on work by David Hilbert, Whitehead and Russell, and earlier enquirers all the way back to Leibniz. The centerpiece of it, though, was that machine. Dyson: “Engineers avoided Turing’s paper because it appeared entirely theoretical,” while “theoreticians avoided it because of the references to paper tape and machines.”

John von Neumann’s career at the Institute for Advanced Study hit the same fault line. The IAS had been conceived as a place where the greatest minds might think their lofty thoughts without the distraction of students, publication schedules, or academic politics — the purest of pure-research institutes. Though not a tinkerer like Turing (“He would have made a lousy engineer,” testified his colleague Herman Goldstine), von Neumann was free of intellectual snobbery. He was in fact a worldly man, a bon vivant even — he never drove anything but Cadillacs — and thus quite opposite to the popular image of a math professor. He believed, he told J. Robert Oppenheimer, that mathematics grew best when nourished by “a certain contact with the strivings and problems of the world.”

Holding such an opinion, von Neumann certainly lived at the right time. Never were there such world-wide strivings and problems as in the middle years of the 20th century; never were the contributions of mathematicians more essential. Job One was of course to win the greatest, most technologically sophisticated war ever fought. Typical problems were the accurate aiming of large artillery pieces and the understanding of the effects of powerful explosions. Both involve large arrays of complex mathematical expressions — differential equations — which must be reduced to arithmetical algorithms to be cranked through by relays of computers.

Hence von Neumann, writing to his wife on his arrival at Los Alamos in September 1943: “Computers are, as you suspected, quite in demand here, too.” This was of interest to Mrs. von Neumann, as she was herself a capable computer. Until well into the 1950s, you see, the word “computer” meant “skilled human calculator.” It was thus defined in the dictionaries of my own childhood. These human computers, making use of a superior kind of electro-mechanical adding machine, had come into their own in support of gunnery projects in World War I, and in the interwar years had proved indispensable in other kinds of research: demography, weather forecasting, and the new science of operations research.

Turing’s 1936 paper had opened up the possibility that these rooms full of human computers might be replaced by a single machine, with a single way of remembering both its data (the particular numbers to be operated on) and its algorithms (the sequences of instructions that define the operations). By the early 1940s this possibility had sunk in with researchers in Britain, Germany, and the U.S. Early prototypes of electronic computers were already in action, though too late for the Manhattan Project, whose numbers were crunched by human computers — initially the wives of physicists, then draftees from the Army’s Special Engineer Detachment.

With the hot war won, all this work passed into the shadow of the H-bomb. The computations here were at a yet higher level, but now true computers were coming into their own, with an assist from lots of cheap war-surplus equipment. Techniques, too, had advanced. One especially dear to von Neumann’s heart was the Monte Carlo method, to which Dyson devotes a whole chapter. Suppose you want to know the likelihood of some event — say, that a stick of given length, cast down onto a planked floor whose boards are a given width, will end by lying across one of the floorboard-cracks. You could do the appropriate math — it’s a straightforward problem in measure theory — or you could just try the thing a few thousand times, throwing the stick at random angles and speeds, and average out the results. The latter is the Monte Carlo method, sometimes unkindly disparaged as: what to do until the mathematician arrives.
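Readers who want to see the method in action can try it themselves, so to speak, on any modern machine. The short Python sketch below (the stick length, board width, and number of throws are my own illustrative choices, not anything specified in Dyson's book) simulates the floorboard experiment and checks the tally against the exact measure-theoretic answer, which for a stick no longer than a board is wide works out to 2l/(πd):

```python
import math
import random

def crossing_probability(stick_length, board_width, trials=1_000_000):
    """Monte Carlo estimate of the chance that a randomly thrown stick
    ends up lying across a crack between floorboards of a given width.
    (An illustrative sketch of the method, not code from the book.)"""
    hits = 0
    for _ in range(trials):
        # Distance from the stick's midpoint to the nearest crack,
        # and the acute angle the stick makes with the cracks,
        # both chosen at random.
        midpoint = random.uniform(0.0, board_width / 2)
        angle = random.uniform(0.0, math.pi / 2)
        # The stick crosses a crack if its half-length, projected
        # across the boards, reaches past that nearest crack.
        if midpoint <= (stick_length / 2) * math.sin(angle):
            hits += 1
    return hits / trials

if __name__ == "__main__":
    estimate = crossing_probability(stick_length=1.0, board_width=2.0)
    exact = 2 * 1.0 / (math.pi * 2.0)  # the measure-theory answer, 2l/(pi*d)
    print(f"Monte Carlo estimate: {estimate:.4f}   exact: {exact:.4f}")
```

A million throws typically land within a fraction of a percent of the exact figure — which is the whole point of the method: when the mathematician has not yet arrived, brute repetition will do.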

Not surprisingly, von Neumann was a keen gambler: He met his wife in the casino at Monte Carlo. The method had great appeal to him, in spite of some knotty conceptual issues around the definition of “random.” It was also well suited to the new devices, and was used to model the paths of neutrons through fissile material. This work all came to triumph on November 1, 1952, with the successful detonation of the first H-bomb. Just six months later James Watson and Francis Crick published their landmark paper on the structure of DNA, and modern science, of both the “dry” and the “wet” variety, entered into digital adulthood.

The true hero of Dyson’s book, it can be seen, is not Alan Turing, though Turing’s momentous contributions are properly described and appreciated. It is John von Neumann who holds the story together.

In July 1955, aged just 51, von Neumann suddenly collapsed, and was diagnosed with advanced cancer. Nineteen agonizing months later, this colossal intellect left the world he had done so much to transform. With his influence decisively removed — it had already been weakened in 1954, when Eisenhower had appointed him to the Atomic Energy Commission — and with the pacifistic, world-government flim-flam favored by Einstein and others in the ascendant, the purists at the IAS made their comeback. IAS physicist Freeman Dyson, the author’s father, said: “When von Neumann tragically died, the snobs took their revenge and got rid of the computer project root and branch.” It would be 22 years before the IAS next had a computer.
