Turing's Cathedral

by George Dyson

First reviewed January 2013

This book is wrapped around New Hampshire for me, because I read it up there. Not super relevant, but some real Old World New England vibes. Regardless of the cobwebs of my memory, I think the review is way too long. Way too long. I am confused why I tried to use HTML paragraph breaks. And without an editor, it’s pretty challenging to ape the New Yorker vibe that I clearly shot for (and missed).

But—over ten years of writing “reviews” has truly helped me feel comfortable in my writing.

Gotta start somewhere. I would like to tell myself from 10 years ago to get an editor (which will not help me from 2023, because…too long).


I was surprised to see a dramatis personae in the opening of Turing’s Cathedral, but it is both apt and necessary. This is a dense character study of the dozens of mathematicians, engineers, and scientists who built the first electronic computing machine; the circumstances of their lives are rendered in lurid and occasionally ponderous detail. It is not a brisk read and it certainly takes some work to access. “Budapest, the city of bridges, produced a string of geniuses who bridged artistic and scientific gaps.” I find this line so pedestrian it borders on insulting, blog fodder that an editor should have culled in the first round. I do not fault the writer, who was likely quite pleased; it falls into the seductive “too cute” territory—a witty rejoinder during a casual conversation, maybe, but a strained metaphor on the written page. I picture a publisher wearily sighing as he or she opts not to edit a math-centric author on this sophomoric literary faux pas.

But something happened about a hundred pages in: as I became used to the style, meeting each member of the cast, complete with details of how—and why—they came to Princeton, radiated a sense of purpose. The pace is dictated by the nature of the material. These were some of the best and brightest of a generation; modernization as well as two world wars sent people from all over the globe to New Jersey. It gives the sense that the subject requires a deep read, and if that sets a meandering pace, then so be it; the background of each player is required for comprehension of the how and why of our digital age. Sections and paragraphs and pages are devoted to the minutiae of travel or familial background:

The problem was how to squeeze displaced scholars into a shrinking job market without provoking the very anti-Semitism those scholars were trying to escape. The United States offered non-quota visas to teachers and professors, but with insufficient openings for American candidates, finding positions for the refugees, especially in Princeton, was a difficult sell. An invitation to the Institute for Advanced Study allowed Princeton University, historically resistant to Jewish students and faculty, to reap the benefits of the refugee scientists without incurring any of the associated costs. The arrival of Einstein helped open the door. Princeton had become one of the more conservative enclaves in the United States, “a quaint and ceremonious little village of puny demigods on stilts,” as Einstein described it to the Queen of Belgium in 1933.

What you get, then, is not a book about the creation of digital computers, but a book about the people who happened to create computers.

Gödel proved that within any formal system sufficiently powerful to include ordinary arithmetic, there will always be undecidable statements that cannot be proved true, yet cannot be proved false. Turing proved that within any formal (or mechanical) system, not only are there functions that can be given a finite description yet cannot be computed by any finite machine in a finite amount of time, but there is no definite method to distinguish computable from noncomputable functions in advance. That’s the bad news. The good news is that, as Leibniz suggested, we appear to live in the best of all possible worlds, where the computable functions make life predictable enough to be survivable, while the noncomputable functions make life (and mathematical truth) unpredictable enough to remain interesting, no matter how far computers continue to advance.
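(An aside of my own, not the book’s: Turing’s half of that passage compresses into a few lines. The halts oracle and the contrary routine below are my hypothetical placeholders, nothing more; assuming such an oracle could be written is exactly what produces the contradiction.)

    # A sketch of Turing's diagonal argument; the names `halts` and
    # `contrary` are my own placeholders, not anything from the book.

    def halts(program, argument):
        """Pretend decider: True if program(argument) eventually halts."""
        raise NotImplementedError("Turing showed no general procedure exists")

    def contrary(program):
        # Do the opposite of whatever the oracle predicts about a
        # program run on its own source code.
        if halts(program, program):
            while True:       # loop forever if the oracle says "halts"
                pass
        return "halted"       # halt promptly if the oracle says "loops"

    # contrary(contrary) would halt if and only if it does not halt,
    # so no finite machine can ever play the part of `halts`.

Gödel’s half takes more machinery, but the self-referential flavor is the same.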

The mathematical theory runs throughout, and you will pick up bits of atomic-age etymology along the way, like how a “flip-flop” became a “toggle.” “‘Flip-flop’ is not the right word for a bi-stable circuit which stays in whatever state you put it in. ‘Toggle’ constituted a far more secure representation of binary data than an element whose state is represented by simply being on or off—where failure is indistinguishable from one of the operational states.” Turing’s Cathedral is the genesis chapter of the electronic computer bible, written now, before apocrypha finds a way to alter the message. And the message is that computers, however they may be used now, were born of warfare:

The behavior of both high-explosive detonations and supersonic projectiles depended on the effects of shock waves whose behavior was nonlinear and poorly understood. What happens when a discontinuity is propagated faster than the local speed of information ahead of the disturbance (for pressure waves, this being the speed of sound)? What happens when two (or more) shockwaves collide?

Von Neumann’s theory of reflected shock waves could then be used to maximize a bomb’s effects. “If you had an explosion a little above the ground and you wanted to know how the original wave would hit the ground, form a reflected wave, then combine near the ground with the original wave, and have an extra strong blast wave go out near the ground, that was a problem involving highly non-linear hydrodynamics,” recalls Martin Schwarzschild.

Dozens of human computers—people with paper, pencil, and patience—would have to work for hundreds of hours to complete the math required for shock wave analysis and blast prediction. Anti-aircraft guns, during World War Two, required a gunner to estimate how far to lead the target and how long to set the shell’s fuse before firing. They were not very accurate. Mathematical tables were brought in; creating the tables took time. Non-human computers made this work practical. They also opened the door, down the road, to the creation of weapons so powerful they defied then-current imagining:

After the Soviet explosion of a nuclear weapon on August 29, 1949, the General Advisory Committee of the Atomic Energy Commission was asked for their opinion on whether the United States should undertake the development of the hydrogen bomb. The answer was no. “It is not a weapon which can be used exclusively for the destruction of material installations of military or semi-military purposes,” Oppenheimer explained in the introduction to the committee’s report. “Its use therefore carries much further than the atomic bomb itself the policy of exterminating civilian populations. We all hope that by one means or another, the development of these weapons can be avoided.”

“Its use would involve a decision to slaughter a vast number of civilians,” the majority, including James Conant as well as Oppenheimer, concurred. “We believe that the psychological effect of the weapon in our hands would be adverse to our interest…. In determining not to proceed to develop the Super bomb, we see a unique opportunity of providing by example some limitations on the totality of war and thus of limiting the fear and arousing the hopes of mankind.”

Before the terrifying spectacles at Bikini Atoll could take place, electronic computers had to be up to the task:

If we devote in this manner several years to experimentation with such a machine, without a need for immediate applications, we shall be much better off at the end of that period in every respect, including the applications. The importance of accelerating, approximating, and computing mathematics by factors like 10,000 or more, lies not only in that one might thereby do in 10,000 times less time problems which one is now doing, or say 100 times more of them in 100 times less time—but rather that one will be able to handle problems which are considered completely unassailable at present.

That is a quote from John von Neumann, who “...left Europe with an unforgiving hatred for the Nazis, a growing distrust of the Russians, and a determination never again to let the free world fall into a position of military weakness that would force the compromises that had been made with Hitler while the German war machine was gaining strength.” He was the driving force behind the digital computing age and a staunch advocate for using the power of computers for weapons testing. Without his ability to secure funding and allow mathematicians and engineers space for creative and non-practical experimentation—all while still providing classified government sponsors with usable and politically leverageable wartime and “preventative” weapon capabilities—electronic computers and our current information age would likely have been stymied for decades or denied outright.

Yet even during the darkness of World War Two, the newfound precision that computing power allowed was not used solely for destruction:

While von Neumann was looking for targets that should be bombed, the Institute’s humanists were enlisted (by the American Commission for the Protection and Salvage of Artistic and Historic Monuments in War Areas) to help identify targets that should not be bombed. Erwin Panofsky, the art historian, was responsible for identifying culturally important resources in Germany, while the Institute’s classicists and archaeologists helped supply similar intelligence for the Mediterranean and Middle East. Even Einstein was debriefed.

It is these small details, rather than the breathtaking scope of world-altering events, that elucidate just how much research and effort went into Turing’s Cathedral. Bits and quotes from the major players are dropped liberally throughout the text so the reader can form their own unadulterated view and not be forced into the image of events as seen through the lens of the narrator. “The ENIAC was limited by storage, not by speed.” “Imagine that you take 20 people, lock them up in a room for 3 years, provide them with 20 desk multipliers, and institute this rule: during the entire proceedings all of them together may never have more than one page written full,” von Neumann observed. “They can erase any amount, put it back again, but they are entitled at any given time only to one page. It’s clear where the bottleneck…lies.” It helps that the author has familial ties to the times and places being discussed, and a reader gets the feeling he is transcribing events as close to accurately as those of us who were not there are likely to get.

As time marches forward and electronic computers become ubiquitous, it borders on imperative to understand how, and by whom, the expanding digital universe was created:

In the real world, most of the time, finding an answer is easier than defining the question. It is easier to draw something that looks like a cat than to define what, exactly, makes something look like a cat. A child scribbles indiscriminately, and eventually something appears that resembles a cat. An answer finds a question, not the other way around. The world starts to make sense, and the meaningless scribbles (the unused neural connections) are left behind. “I agree with you about ‘thinking in analogies,’ but I do not think of the brain as ‘searching for analogies’ so much as having analogies forced upon it by its own limitations,” Turing wrote in 1948.

Turing’s Cathedral imparts a near-biblical understanding of the dawn of the Information Age. "The information in our genes turned out to be more digital, more sequential, and more logical than expected, and the information in our brains turned out to be less digital, less sequential, and less logical than expected." Computers and humanity have become symbiotically intertwined and both sides grow more similar as they take cues from each other:

Search engines and social networks are analog computers of unprecedented scale. Information is being encoded (and operated upon) as continuous (and noise-tolerant) variables such as frequencies (of connection or occurrence) and the topology of what connects where, with location being increasingly defined by a fault-tolerant template rather than by an unforgiving numerical address. Pulse-frequency coding for the Internet is one way to describe the working architecture of a search engine, and PageRank for neurons is one way to describe the working architecture of the brain. These computational structures use digital components, but the analog computing being performed by the systems as a whole exceeds the complexity of the digital code on which it runs.
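(A second aside of mine, not the book’s: “PageRank for neurons” lands better once you see how little machinery PageRank itself needs. The four-page web below is an invented toy, and the code is a generic power-iteration sketch, not anything a real search engine ships.)

    # Toy PageRank by power iteration over an invented four-page web.
    import numpy as np

    links = {0: [1, 2], 1: [2], 2: [0], 3: [2]}   # page -> pages it links to
    n, damping = len(links), 0.85

    # Column-stochastic matrix: column j spreads page j's rank evenly
    # across the pages it links to.
    M = np.zeros((n, n))
    for src, outs in links.items():
        for dst in outs:
            M[dst, src] = 1.0 / len(outs)

    rank = np.full(n, 1.0 / n)
    for _ in range(100):
        rank = (1 - damping) / n + damping * M @ rank

    print(np.round(rank, 3))   # page 2, the most linked-to, ends up on top

The ranking emerges from the topology of what connects where, which is exactly the kind of continuous, fault-tolerant variable the passage above is pointing at.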

It is difficult to keep the obvious truth—that computers were recently created by people—in plain view. George Dyson is Hesiod for the Prometheus tale of the men and women of Princeton’s Institute for Advanced Study. Access to accurate documentation of a creation myth—the origin of electronic computers, a technology as simultaneously useful and dangerous as fire—makes Turing’s Cathedral a strong recommendation for anyone.

David Dinaburg