
The best books on The Origins of Computing

recommended by George Dyson

Turing's Cathedral: The Origins of the Digital Universe by George Dyson

As we approach the centenary of Alan Turing’s birth, science historian George Dyson looks back at the achievements of wartime code breaking and the “human computers” who enabled our modern age of iPhones and laptops.

Interview by Alec Ash

Your new book, Turing’s Cathedral, reveals “the origins of the digital universe”. Can we trace the birth moment of modern computing back to the early 20th century British logician Alan Turing?

Well, there was no birth moment. My book is not about the “first computer” – there was no such identifiable thing. The book is titled after Alan Turing because we owe the concept of the modern computer to Turing’s paper of 1936 [“On Computable Numbers”].

But that was more of a conception than a birth. The birth itself was extremely complicated. There were computing machines before Turing, but Turing came up with the concept of universal computation – that if you built one machine in hardware you could do everything else with software. That was in many ways the key concept. It can be implemented in different ways.

Could you explain that concept of Turing’s universal machine for us, in lay terms?

A Turing machine is very simply a black box that reads and writes symbols along an unbounded length of tape. A universal Turing machine can imitate any other Turing machine by being given a coded description of that machine. When we say Turing machine now, we generally mean his universal machine.

Turing established the equivalence between hardware and code – that if you build a computer that can follow elementary instructions, then you can feed it increasingly complex code and it can do pretty much anything you want it to.

That’s the world we live in now. You buy your iPhone and download the apps that do what you want. Today, your smartphone is a universal Turing machine.

I will value it all the more for that. And what is the famous Turing Test, posing the question of whether machines can think?

The question, rather, is not whether machines can think, but if they do think, how do we know it? That’s a very profound question, and the test is whether we can tell it is a computer talking to us. We’re not there yet, but we’re getting there year by year. But I have a different theory. I believe that if there were a true artificial intelligence, it would be smart enough not to reveal itself to us.

Tell us more about the story your own book is telling.

My book is not so much about Turing as about the realisation of his ideas by a group of people [in America in the late 1940s and 50s] who built one of the more successful universal machines. I am especially interested in how we got to the digital matrix we live in today, where every bit in every computer, and now every computer on the Internet, has a numerical address.

“I believe that if there were a true artificial intelligence, it would be smart enough not to reveal itself to us.”

We live in a world where everything is referenced by numbers. It’s like building a city. Somebody set up the street grid that we’re all stuck with now, for better or worse. I became obsessed with the question of how that happened.

Who were the chief city planners? Was it John von Neumann, the US mathematician who is at the heart of your story?

In my version of the mythology it was John von Neumann, explicitly following Alan Turing’s blueprint. Turing gave us a one-dimensional model of an endless tape running through a black box of a machine, and von Neumann made it two-dimensional with the address matrix that we live with today. In Turing’s machine, if you want to get to seven miles down the tape you have to go through 6.99 miles of tape first. That can take an absurdly long time. Von Neumann turned it into squares on a two-dimensional grid, like a chessboard. That was much more efficient, and became the model that almost all computers followed.
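
[Editor’s note: a toy contrast, in Python with invented numbers, of the two access patterns described here – stepping along a one-dimensional tape versus decoding an address into row and column on a two-dimensional grid.]

```python
# Sequential tape versus an address matrix. To read cell n on a tape,
# the head must pass through every cell before it; with an address,
# any cell is one lookup away, wherever it sits on the grid.

ROWS, COLS = 32, 32                    # a 32 x 32 grid: 1,024 cells (illustrative)
grid = [[0] * COLS for _ in range(ROWS)]
tape = [0] * (ROWS * COLS)             # the same 1,024 cells laid out in a line

def tape_read(tape, n):
    head = 0
    while head < n:                    # O(n): walk past every intermediate cell
        head += 1
    return tape[head]

def matrix_read(address):
    row, col = divmod(address, COLS)   # O(1): decode the address into coordinates
    return grid[row][col]

print(tape_read(tape, 1023))           # 1,023 steps down the tape
print(matrix_read(1023))               # the same cell, reached in one decoded lookup
```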

And our iPads, smartphones and PCs are the direct descendants of this work?

Yes. They can be directly traced back to the choice of that address matrix and how it was structured. It’s like the genetic language in our cells – once you choose an alphabet you’re stuck with it.

So it’s not an ideal alphabet?

Not necessarily. If those first engineers looked at the world today, I think they would be quite astonished that we are doing it more or less exactly the way they did. They all thought something better would come along.

Let’s look further back at the road to Turing, with your first selection. Will you introduce this book to us?

Martin Davis is a brilliant mathematical logician who was working with von Neumann at the beginning of the 1950s and came up with some of our best interpretations of Turing’s work. This is a brilliant, accessible and not overlong book looking at the whole evolution of the idea of computing, going back to Gottfried Leibniz – whom I would credit as the grandfather of it all. Davis explains how Leibniz’s ideas became real and important, and he’s fair about the genealogy of whose ideas led to other ideas.

So the book is a history, and an exploration of the theory as well?

Yes, it’s a very good explanation of what computation really is. Because Martin Davis is a mathematician, it’s exactly correct – whereas with a lot of other books, including my own, there are sacrifices to technical accuracy for the sake of a story. Davis is rigorous about what Turing did and didn’t prove, and why it’s important today.

In a nutshell, what is the notion of computation and what is a computer?

Computation is essentially a mapping between sequences of code and memory. It’s a back-and-forth between memory and code – a profoundly simple thing, yet it has powerful consequences. Everything we do, from talking on Skype to watching digital movies, is at heart this extremely simple process. And computers are the engines of that logic.
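
[Editor’s note: a sketch of that back-and-forth as a toy stored-program machine in Python. The four-instruction set is invented for illustration; the point is that code and data share one memory, and computation is the loop of fetching from it and writing back to it.]

```python
# A toy stored-program machine: instructions and data live in the same
# memory, and computation is the cycle of fetching an instruction and
# acting on memory.

def execute(memory):
    pc, acc = 0, 0                          # program counter and accumulator
    while True:
        op, arg = memory[pc]                # fetch an instruction from memory...
        pc += 1
        if op == "LOAD":
            acc = memory[arg]               # ...then read memory,
        elif op == "ADD":
            acc += memory[arg]
        elif op == "STORE":
            memory[arg] = acc               # ...or write the result back
        elif op == "HALT":
            return memory

# Program: memory[8] = memory[6] + memory[7]. Cells 0-3 hold the code,
# cells 6-8 the data, all in one numerically addressed space.
memory = [("LOAD", 6), ("ADD", 7), ("STORE", 8), ("HALT", 0),
          None, None, 2, 3, 0]
print(execute(memory)[8])                   # prints 5
```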

How was Leibniz the grandfather of these ideas?

Leibniz envisaged that the entire universe could be represented as digital code. That sounded absolutely crazy in the 17th century, but it’s the world we live in today. There’s almost nothing left that isn’t being digitised. Leibniz also built computers. He even designed – in an unpublished manuscript of 1679 – a digital computer using black and white marbles running down tracks, which behaved in exactly the same way as our computers work today, running electrons through wires. So there’s no doubt in my mind that Leibniz was the original prophet, Turing was the later prophet, and then you can argue over who actually built what.
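
[Editor’s note: a small illustration of the point, with the marble convention invented here – a black marble for a one, a white marble for a zero. Encoding a number this way is ordinary binary notation, the same representation a wire carries as the presence or absence of current.]

```python
# Leibniz's marbles as bits: black for 1, white for 0.

def to_marbles(n):
    bits = bin(n)[2:]                  # e.g. 13 -> "1101"
    return "".join("●" if b == "1" else "○" for b in bits)

for n in range(8):
    print(n, to_marbles(n))            # 5 prints as ●○●, i.e. binary 101
```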

Leibniz’s theories were also profoundly philosophical, drawing on Aristotle’s notions of perfection as the complete fulfilment of a function – as of course a computer does.

Yes, the philosophical implications of all this are very interesting.

Let’s move on to Andrew Hodges’s biography of Turing.

Alan Turing was born exactly 100 years ago [editor’s note: this interview was done in 2012] and died aged 41. In those 41 years he led an amazing life that is covered with extraordinary grace, complexity and completeness by Andrew Hodges in this biography. It was first published in 1983 and remains in print.

No one could do a better job than Andrew Hodges, who is himself a mathematician – it is truly a masterpiece. Although it’s not a book about the history of computing per se, it’s a must-read if you want to understand how we got to our modern world of computing, and it gives you a great picture of the life and times of Alan Turing at this critical period.

The other score on which Hodges understands Alan Turing is that, like Turing, he is gay.

Turing was convicted of gross indecency in 1952 [for homosexual relations, then illegal in Britain] and sentenced to either imprisonment or oestrogen injections [to reduce libido]. He chose the oestrogen injections, which have an effect on personality and were a brutal treatment. He died in 1954 in what people assume was a suicide, although we don’t know for sure.

This book is also very good on the details of what Alan Turing did, and what happened more generally, at Bletchley Park. Although when Hodges wrote the book, in 1983, much of the Bletchley Park material was still secret.

Tell us more about the breaking of the Enigma cipher by the British at Bletchley Park, now that we know more about it.

During the Second World War, the Germans were using a machine called Enigma. The British, thanks to work by Turing and his innumerable colleagues, broke the Enigma cipher at Bletchley Park through a series of clever mathematical and human tricks. It’s unbelievable how much they did with so few computational resources, largely with human ingenuity.

“It’s unbelievable how much they did with so few computational resources, largely with human ingenuity.”

The Germans broke the cryptographic rule of never repeating yourself, and occasionally began messages with the same string of code. The other amazing thing is why this story was kept secret for so long, and not released after the war. It was one of the great achievements of the 20th century, and is finally out in the open.

Did the revolution of computing innovation, which occurred during and immediately after the Second World War, happen because of the war?

Yes. It was all happening during the war, but in secret. The group at Bletchley Park was moving very far ahead in code breaking. And in America [in the mid-1940s] we had a computer called the ENIAC, which made huge progress, but also in secret. After the war, the secrecy was lifted to some extent, so suddenly these developments that had been incubating came out into the open. I would be the first to say that Great Britain was ahead. In a way, the British invented digital computing and the Americans took the credit.

As ever is the case.

Indeed!

This was, of course, also the time that huge destructive powers were being engineered. John von Neumann himself worked on the atom bomb. Is there a connection?

There’s a very direct connection. The push that allowed von Neumann to build his universal machine was the need to solve hydrodynamic questions – to decide whether a hydrogen bomb was possible or not. So to an extent it was a story of cryptography on the British side and nuclear weapons design on the American side, with of course some overlap.

Let’s continue this wartime thread with a discussion of Jack Copeland’s book Colossus.

After Enigma, the Germans developed more powerful encryption with higher-speed digital equipment that was much harder to break. The British side, led by [the engineer] Tommy Flowers, built a vacuum tube machine called Colossus – an extremely powerful and sophisticated digital computer which helped the British to break these even stronger codes. By the end of the war there were at least 10 of these Colossus machines.

Essentially, it was the birth of the computing industry at a time when no one else was building 10 copies of the same machine. But it was all kept under wraps at the end of the war.

Why the cloak and dagger?

My suspicion is that it just didn’t go with the heroic history of the war to publicise that it was won partly by breaking codes and not purely by heroism in battle. Also, it may be a valid argument that we still depend on breaking codes and don’t want a new enemy to know how we are doing it. That’s certainly true in the US today. The NSA is a huge organisation that still keeps its secrets. But in the case of Bletchley Park and Colossus, I don’t think it’s done any harm to finally publicise what really happened.

Which Copeland’s book does, taking us through the history and key characters.

Copeland’s book is another masterpiece. I wish I had been able to read it 20 years ago, when I first became interested in this. It reveals in a very technically correct way how the German codes got broken. It’s a marvellous collection of first-person documents, memories and editorial glue to hold it all together.

You mentioned the NSA. Presumably the giant codebreaking computers they have under wraps there are the direct descendants of Colossus?

Yes they are. America sent people to Britain during and after the war, to learn what had been done, and Alan Turing came over to America to debrief the people who became the NSA.

And the security of the American nation is in the hands – or equivalent parts – of these computers.

Very much so. There is still a cat-and-mouse arms race, with computers making it easier both to write stronger codes and to break them.

Tell us about your fourth pick, When Computers Were Human.

This book is about the period in which Turing worked, the 1930s. It’s important, and often ignored, that the world of electronic and mechanical computing didn’t come out of thin air. It came out of a world in which we were doing a large amount of computation, but with people.

In America, during the Great Depression, we had something called the Works Progress Administration. One of the things they did to create jobs was to set up vast engines of human computation to make mathematical tables and suchlike. Much of what later became electronic was first done by these people – before it was mechanised one step at a time.

David Alan Grier’s grandmother was one of these human computers, so he had a very strong interest in digging up their stories. It’s a marvellous book, rich in the detail of what those times were like.

And these people were actually called “computers”?

Yes, they were called computers. Grier himself is an electronic engineer, chairman of the IEEE [Institute of Electrical and Electronics Engineers], and he delved into how today’s algorithms came from what was worked out by human beings.

This is also a valuable book on the history of women in science. To be a computer was the only way his grandmother could apply her knowledge in a scientific world which excluded women.

It’s a very rich subject. Exactly the same process was going on in England and elsewhere, with accounting laboratories and scientific computing centres where problems were fed to large groups of people. In computation, we are moving numbers around. Whether you move them between people with pencil and paper, or on silicon chips between gates at the speed of light, it’s the same process.

What is the final book you are recommending?

Core Memory is completely different from the others, and in no way makes any attempt to be historically complete or chronologically correct. It’s an art book of absolutely stunning, high-resolution photographs of computers in the Computer History Museum, which moved from Boston to Silicon Valley.

The photos are utterly gorgeous, and give you a visceral sense of the hand craftsmanship that went into these machines. Hand-soldered wires, massive disc drives five feet in diameter – things like that.

So, how colossal was Colossus?

Colossus was pretty big, and ENIAC was so huge you could go inside it. This book gets into the innards of some of these old machines and turns them into works of art. It’s like a children’s book of dinosaurs – if you’re interested you will go through page after page, and if not you will look at three pages and put it aside.

What are the most recent computers featured in the book?

It goes into the early age of personal computing. It has the first Macintosh, and the first computers you could buy for under a thousand dollars. There are a few of them in there, as a nod to the world we’re in today. But it’s primarily about the age of what we call “big iron” – the huge mainframe computers that had to be moved around with forklifts and trucks.

What computer do you use, Mr Dyson?

I use an absolutely modern Mac laptop. In fact, I just got a new one. My last one was six years old. My new laptop, quite miraculously, has a solid state drive – it no longer has a magnetic disc spinning around and waiting to crash. That’s a fabulous step ahead. But my boatbuilding business at home still runs on an ancient Mac – not the earliest generation but a completely extinct operating system. And it’s amazing how many companies’ accounting systems still run on punch cards.

Looking to the future, where are we heading? Do you believe in Moore’s law of exponentially increasing computing power?

I think Moore’s law is going to keep going. The question is, what are we going to do with it? Every episode of every bad show that’s ever been on television is on YouTube for free, and we still can’t use all our bits [of processing power]. So what is going to happen next is a very interesting question.

What was Turing’s answer? What did he prophesy for the 21st century?

Turing is not at all a dead prophet who is of historical interest only. Almost every word he wrote can be read today and speak to the future. He believed in true artificial intelligence, and I think he was right. Things like Google are the fruition of his vision, and we’re going to have to wait to see where that goes. The way Google is doing it is to keep everybody happy, make sure everything is free and keep everyone on their side. I don’t subscribe to the Terminator scenario [of computers becoming self-aware and enslaving mankind]. Human beings are a part of this, and are not going to be extinguished by it.

But what, in your view, are the human consequences of the ever-increasing computing power of our times?

I think we need to worry less about whether machines are becoming more intelligent, and more about whether humans are becoming less intelligent. The jury is out on that. You could make the argument that because of smartphones we are losing the ability to visualise maps in our brains. That frees up part of our brain, but what do we use it for?

Do you agree with Nicholas Carr’s theory that we are becoming shallower thinkers with the Internet?

I think he’s partly right, and his concerns are definitely worth taking seriously. It’s not clear which way it’s going to go. We have to be very cautious and watch what’s happening with the next generation very carefully – because it is entirely possible that we could start losing some of the intelligence that has evolved over such a long period of time.

As well as being a science historian, you have spent a lot of your life kayaking and in the wilds of British Columbia. At one point in your youth, you lived in a treehouse for three years. Do you worry that, obsessed as we are with technology, we are losing touch with the physical world?

Very much so. We are losing a lot of our craftsmanship, our ability to do things with our hands. That’s sad and a mistake, but it’s happening and we have to make the best of it. I think we should try to preserve human knowledge that will be very hard to reconstruct, like how you rebuild a carburettor. Things that we take for granted are being lost left and right. You don’t want to preserve them as artefacts, you want to preserve a working knowledge of them as much as you can, while leaving space for new skills to develop. One thing is for certain – we’re in a very transitional period.

Interview by Alec Ash

March 14, 2012

George Dyson

George Dyson is a historian of technology whose publications broadly cover the evolution of technology in relation to the physical environment and the direction of society. He has written on a wide range of topics, including the history of computing, the development of algorithms and intelligence, communication systems, space exploration, and the design of water craft.
