What We Call the Past Is Built On Bits

It’s impossible to talk about information theory without talking about Claude Shannon. In 1948, the same year that Bell Labs introduced the transistor, Shannon published a paper in The Bell System Technical Journal titled “A Mathematical Theory of Communication.” In it, the 32-year-old, then part of the Bell Labs mathematical research group, introduced the word bit, declaring it a unit for measuring information.
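For anyone who wants to see the new unit at work, Shannon’s measure of information is worth writing down (this is the standard textbook form, not a quotation from the 1948 paper):

    H(X) = -\sum_i p_i \log_2 p_i \quad \text{bits}

A fair coin flip works out to log2(2) = 1 bit, while a letter drawn at random from a 26-letter alphabet carries about log2(26) ≈ 4.7 bits.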

Information theory began as a bridge from mathematics to electrical engineering, and from there to computing. It’s a transformation chronicled by James Gleick in The Information: A History, A Theory, A Flood. This ambitious book traces the history of communications through the centuries to teach us about the language of drum beats, the decline of oral histories and the persistence of the word, alphabets, patterns, the printing press, electricity, Morse code and telegraphs, telephone switchboards, wires, networks, computers, algorithms, and modern-day social-sharing apps.

We learn that logic descended from the written word, that mathematics followed the invention of writing, and that information is physical. We learn that the first English dictionary was made by Robert Cawdrey, a village schoolmaster and priest, in 1604. That the first Oxford English Dictionary was published in 1933. And that the book of tables by Regiomontanus that Christopher Columbus carried as an aid to navigation was printed in Nuremberg two decades after the invention of moveable type in Europe.

We meet Charles Babbage, Ada Byron, Norbert Wiener, Richard Feynman, Albert Einstein, Stephen Hawking, Claude Shannon, Alan Turing, John von Neumann, and Edgar Allan Poe, the American writer who helped popularize cryptography. As Gleick crisscrosses the disciplines of mathematics, physics, and computing, we begin to appreciate just how strong the bonds of science and technology really are. And that is probably the point.

One of my favorite stories comes from 1943. Claude Shannon routinely met Alan Turing at teatime in the Bell Labs cafeteria, but they couldn’t discuss their work because it was secret. It was the height of World War II and they were both cryptanalysts. Instead, Turing showed Shannon a paper he had written seven years earlier, “On Computable Numbers,” about the powers and limitations of computing machines. They talked about the possibility of machines learning to think at a time before transistors and electronic computers even existed. It wasn’t exactly a chance encounter, and it gave rise to Turing’s now famous question, “Can machines think?”

Turing’s machine never really existed. It was a thought experiment from the early days of information theory, and the vision that Shannon and Turing shared had more to do with logic than electronics. Gleick explains that what Alan Turing and Claude Shannon had in common was codes.

“Turing encoded instructions as numbers. He encoded decimal numbers as zeroes and ones. Shannon made codes for genes and chromosomes and relays and switches. Both men applied their ingenuity to mapping one set of objects onto another: logical operators and electric circuits; algebraic functions and machine instructions. The play of symbols and the idea of mapping, in the sense of finding rigorous correspondence between two sets, had a prominent place in their mental arsenals.”
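To make that idea of mapping concrete, here is a small Python sketch of my own (not anything from the book): the first function encodes a decimal number as zeros and ones, the way Turing did, and the other two model relays wired in series and in parallel, the correspondence Shannon drew between switching circuits and Boolean algebra in his 1937 master’s thesis.

    # Encode a decimal number as a string of zeros and ones.
    def to_binary(n):
        bits = ""
        while n > 0:
            bits = str(n % 2) + bits
            n //= 2
        return bits or "0"

    # Two relays in series behave like AND: both must be closed for current to flow.
    def series(a, b):
        return a and b

    # Two relays in parallel behave like OR: current flows if either is closed.
    def parallel(a, b):
        return a or b

    print(to_binary(19))          # 10011
    print(series(True, False))    # False
    print(parallel(True, False))  # True

Nothing deep is happening in the code; the point is simply that numbers, logic, and circuits can all be made to stand in for one another.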

These conversations helped Shannon fuel his theory of information and the idea that once information was encoded digitally, it could be transmitted essentially without error, no matter how noisy the channel. It became a unifying theory for all sorts of communications and quickly distinguished him as the father of information theory.
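The precise claim behind “essentially without error” is Shannon’s channel capacity theorem: every noisy channel has a maximum rate, and below that rate the probability of error can be made as small as you like. For the familiar case of a bandwidth-limited channel with Gaussian noise, the capacity is (a standard result, not a formula quoted by Gleick):

    C = B \log_2\left(1 + \frac{S}{N}\right) \quad \text{bits per second}

A 3 kHz telephone line with a signal-to-noise ratio of 1,000 therefore tops out around 3000 × log2(1001) ≈ 30,000 bits per second, no matter how clever the engineering.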

Gleick also tackles the complex subject of quantum computing, which deals in quantum bits, or qubits, rather than bits. With its roots in quantum physics, it’s easy to get lost in this area, but he does a pretty good job of making the concepts understandable for the layman. And he offers some insights into why this kind of computing matters.

“Whereas an object in classical physics, typically composed of billions of particles, can be intercepted, monitored, observed, and passed along, a quantum object cannot. Nor can it be copied or cloned.” And this is precisely why quantum computers—in theory—can solve certain classes of problems that were previously considered infeasible, he says. Cryptography is one common answer to “why quantum computers?” But artificial intelligence is often cited as well.
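A little notation makes the point clearer (a textbook sketch, not Gleick’s wording). A qubit sits in a superposition of the two classical values,

    |\psi\rangle = \alpha|0\rangle + \beta|1\rangle, \qquad |\alpha|^2 + |\beta|^2 = 1

Measuring it forces the answer to 0 or 1 and destroys the amplitudes, and the no-cloning theorem rules out copying an unknown state beforehand. That is why a quantum message cannot be quietly intercepted, read, and passed along, and it is the property that quantum cryptography leans on.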

This book covers a lot of ground. But the common thread throughout is that information is the vital principle. Gleick says it pervades the sciences from top to bottom, transforming every branch of knowledge.

In 1990, the American physicist John Archibald Wheeler suggested that information is fundamental to the physics of the universe. “It from Bit,” he said. “Every particle, every field of force, even the space-time continuum itself, derives its function, its meaning, its very existence entirely from answers to yes-or-no questions, binary choices, bits.”

Shannon introduced us to the science of information theory. Wheeler taught that what we call the past is based on bits. And Gleick connects the dots.

— DJ

Can Machines Think?

That’s a pretty big question, and it’s been kicking around since the middle of the last century. The answer, of course, depends on how you define thinking and machines, and whom you ask.

Professor Kevin Warwick, a visiting professor at the University of Reading and Deputy Vice-Chancellor for Research at Coventry University, thinks the answer is yes. At least that was his conclusion when “Eugene Goostman,” one of five computer programs, won the Turing Test 2014 Prize at an event held at the Royal Society in London in June.

Described as a Ukrainian teenager with a quirky sense of humor and a pet guinea pig, Eugene managed to convince 33 percent of the Turing test judges that it was human. Organized by the University of Reading’s School of Systems Engineering, the event was sponsored in part by RoboLaw, an EU-funded project that is examining the regulation of emerging robotic technologies.

But the news that Eugene passed the Turing test quickly sparked a debate.

The Guardian reported that Stevan Harnad, professor of cognitive sciences at the University of Quebec in Montreal, said that whatever had happened at the Royal Society, it did not amount to passing the Turing test. “It’s nonsense, complete nonsense,” he said. “We have not passed the Turing test. We are not even close.”

The Turing Test Doesn’t Matter

Then there is Massimo Pigliucci, editor-in-chief of Scientia Salon, who isn’t even arguing about the test results, because he says that The Turing Test Doesn’t Matter.

Turing proposed his famous test back in 1951, calling it “the imitation game.” The idea stemmed out of his famous work on what is now known as the Church-Turing hypothesis, the idea that “computers” (very broadly defined) can carry out any task that can be encoded by an algorithm. Turing was interested in the question of whether machines can think, and he was likely influenced by the then cutting edge research approach in psychology, behaviorism, whose rejection of the idea of internal mental states as either fictional or not accessible scientifically led psychologists for a while to study human behavior from a strictly externalist standpoint.

— Massimo Pigliucci
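To see what “encoded by an algorithm” means in Turing’s own terms, here is a toy Turing-machine simulator in Python (my illustration, with a made-up rule table that simply flips every bit on the tape). The whole “machine” is nothing more than a finite table of read-write-move rules.

    # A tiny Turing machine: the table maps (state, symbol) -> (write, move, next state).
    # This particular table flips every bit on the tape, then halts at the first blank.
    RULES = {
        ("scan", "0"): ("1", +1, "scan"),
        ("scan", "1"): ("0", +1, "scan"),
        ("scan", "_"): ("_", 0, "halt"),
    }

    def run(tape):
        cells = list(tape)
        head, state = 0, "scan"
        while state != "halt":
            symbol = cells[head] if head < len(cells) else "_"
            write, move, state = RULES[(state, symbol)]
            if head < len(cells):
                cells[head] = write
            head += move
        return "".join(cells)

    print(run("10011"))  # prints 01100

The Church-Turing hypothesis is the claim that anything we would recognize as an effective procedure can, in principle, be boiled down to a table like this one.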

Pigliucci asks: “When we talk about AI, do we mean intelligence (as the ‘I’ deceptively seems to stand for), computation, self-awareness, all of the above? Without first agreeing at the least on what it is we are trying to do we cannot possibly even conceive of a test to see whether we’ve gotten there.”

He’s got a point.

Turing’s Cathedral

All of this leads us back to the man himself, Alan Turing, who in 1950 predicted that sometime in the next 50 years we would have computers that could trick us into believing they were human at least 30 percent of the time. He introduced us to the Turing test in his seminal paper on artificial intelligence, “Computing Machinery and Intelligence.”

A British mathematician and cryptographer, and one of the most influential computer scientists of the last century, Turing is still best known for the Turing test, the famous question-and-answer game that seeks to answer the question Can Machines Think? His ideas are at the heart of Turing’s Cathedral: The Origins of the Digital Universe, by George Dyson.

And the question of whether machines can think? It remains open. But it sure makes for fascinating reading.

— DJ