What We Call the Past Is Built On Bits

It’s impossible to talk about information theory without talking about Claude Shannon. In 1948, the same year that Bell Labs introduced the transistor, Shannon published a monograph in The Bell System Technical Journal titled “A Mathematical Theory of Communication.” In it, the 32-year-old, then part of the Bell Labs mathematical research group, introduced the word bit (a coinage he credited to his colleague John W. Tukey), declaring it a unit for measuring information.

Information theory began as a bridge from mathematics to electrical engineering, and from there to computing. It’s a transformation that is chronicled by James Gleick in The Information: A History, A Theory, A Flood. This ambitious book traces the history of communications through the centuries to teach us about the language of drum beats, the decline of oral histories and the persistence of the word, alphabets, patterns, the printing press, electricity, Morse code and telegraphs, telephone switchboards, wires, networks, computers, algorithms, and modern day, social-sharing apps.

We learn that logic descended from the written word, that mathematics followed the invention of writing, and that information is physical. We learn that the first English dictionary was made by Robert Cawdrey, a village schoolmaster and priest, in 1604. That the first Oxford English Dictionary was published in 1933. And that the book of tables by Regiomontanus that Christopher Columbus carried as an aid to navigation was printed in Nuremberg two decades after the invention of movable type in Europe.

We meet Charles Babbage, Ada Byron, Norbert Wiener, Richard Feynman, Albert Einstein, Stephen Hawking, Claude Shannon, Alan Turing, John von Neumann, and Edgar Allan Poe, the American writer who helped popularize cryptography. As Gleick crisscrosses the disciplines of mathematics, physics, and computing, we begin to appreciate just how strong the bonds of science and technology really are. And that is probably the point.

One of my favorite stories comes from 1943. Claude Shannon routinely met Alan Turing at teatime in the Bell Labs cafeteria, but they couldn’t discuss their work because it was secret. It was the height of World War II and they were both cryptanalysts. Instead, Turing showed Shannon a paper he had written seven years earlier, “On Computable Numbers,” about the powers and limitations of computing machines. They talked about the possibility of machines learning to think at a time before transistors and electronic computers even existed. It wasn’t exactly a chance encounter, and it gave rise to Turing’s now famous question, “Can machines think?”

Turing’s machine never really existed. It was a thought experiment in the early days of information theory and the vision that Shannon and Turing shared actually had more to do with logic than electronics. Gleick explains that what Alan Turing and Claude Shannon had in common was codes.

“Turing encoded instructions as numbers. He encoded decimal numbers as zeroes and ones. Shannon made codes for genes and chromosomes and relays and switches. Both men applied their ingenuity to mapping one set of objects onto another: logical operators and electric circuits; algebraic functions and machine instructions. The play of symbols and the idea of mapping, in the sense of finding rigorous correspondence between two sets, had a prominent place in their mental arsenals.”
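To make the kind of mapping Gleick describes concrete, here is a minimal sketch of my own (not from the book) that encodes a decimal number as zeros and ones, the way Turing did:

```python
def to_binary(n):
    """Encode a non-negative decimal integer as a string of zeros and ones."""
    if n == 0:
        return "0"
    bits = []
    while n > 0:
        bits.append(str(n % 2))  # remainder gives the lowest-order bit
        n //= 2
    return "".join(reversed(bits))

print(to_binary(19))  # 10011
```

The decimal number 19 and the string “10011” are rigorous correspondents: two representations of one object, which is exactly the play of symbols Gleick has in mind.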

These conversations helped fuel Shannon’s theory of information and the idea that once information became digital, it could be transmitted over a noisy channel essentially without error. It became a unifying theory for all sorts of communications and quickly distinguished him as the father of information theory.
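Shannon’s bit is not just a digit but a unit of measure. A small illustrative sketch (mine, not the book’s) of his entropy formula, which measures information in bits:

```python
import math

def entropy_bits(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p)) over the outcomes."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin toss carries exactly one bit of information.
print(entropy_bits([0.5, 0.5]))  # 1.0

# A heavily biased coin carries less, because its outcome is more predictable.
print(entropy_bits([0.9, 0.1]))  # about 0.469
```

The more surprising a message is, the more bits it carries; a perfectly predictable source carries none.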

Gleick also tackles the complex subject of quantum computing, which deals in quantum bits, or qubits, rather than bits. With roots in quantum physics, it’s easy to get lost in this area, but he does a pretty good job of making the concepts understandable for the layman. And he offers some insights into why this kind of computing matters.

“Whereas an object in classical physics, typically composed of billions of particles, can be intercepted, monitored, observed, and passed along, a quantum object cannot. Nor can it be copied or cloned.” And this is precisely why quantum computers—in theory—can solve certain classes of problems that were previously considered infeasible, he says. Cryptography is one common answer to “why quantum computers?” But artificial intelligence is often cited as well.

This book covers a lot of ground. But the common thread throughout is that information is the vital principle. Gleick says it pervades the sciences from top to bottom, transforming every branch of knowledge.

In 1990, the American physicist John Archibald Wheeler suggested that information is fundamental to the physics of the universe. “It from Bit,” he said. “Every particle, every field of force, even the space-time continuum itself, derives its function, its meaning, its very existence entirely from answers to yes-or-no questions, binary choices, bits.”

Shannon introduced us to the science of information theory. Wheeler taught that what we call the past is based on bits. And Gleick connects the dots.

— DJ

