Ways Physics and Cosmologists Can Baffle You


Night

I dreamed that I was looking at the universe, the Milky Way perhaps, on a screen as large as the side of a building. In the picture the stars are in motion, moving in a way that seems remarkably predictable for an expanding cosmic landscape. There is a character standing to the left of the big screen, waiting for me. He has gray hair and looks a little bit like Colonel Sanders, but he is actually the teacher. And he is patient. He is waiting for me to ask him about the scene, but I have been avoiding it.

Day

I have been trying to understand the basic principles of physics for a few years now, mainly because of my interest in quantum computing. But no matter how many books I read on the subject, I am still baffled by most of it. Sometimes my questions seem too big to ask, and yet my interest continues. Thankfully there are a few encouraging teachers, like Leonard Susskind, director of the Stanford Institute for Theoretical Physics, who say it’s perfectly reasonable to feel challenged here.

From Leonard Susskind to Everyone:

“A number of years ago I became aware of the large number of physics enthusiasts out there who have no venue to learn modern physics and cosmology. Fat advanced textbooks are not suitable to people who have no teacher to ask questions of, and the popular literature does not go deeply enough to satisfy these curious people. So I started a series of courses on modern physics at Stanford University, where I am a professor of physics. The courses are specifically aimed at people who know, or once knew, a bit of algebra and calculus, but are more or less beginners.”

The response was overwhelming and it was suggested that Stanford put them up on the internet. You can find them at: http://www.learnoutloud.com/Catalog/Science/Physics/Modern-Theoretical-Physics/23022

Susskind On Why Physics Is So Hard

Susskind says that with physics, you have to go through the initiation rites of learning mathematics, which is why physics is so hard for the general public.

“Look, the process of modern physics has been very much the process of physicists rewiring their brains with abstract mathematics. Nobody, including me or anybody else, can visualize and see four dimensions in their head. But it’s very easy to add to x, y, and z, another letter of the alphabet and simply do with the four letters what you used to do with the three letters.

So you make an abstract mathematical visualization that you can’t see in your head, but through the process of rewiring you learn new ways to think about things. Well, I can’t use that when I talk to people who are not mathematically trained, so you use analogies, metaphors. So it is effective, but always at some level it is wrong. It doesn’t capture everything correctly.”
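To make the “four letters” point concrete, here is a minimal sketch (my own illustration, not Susskind’s): the ordinary distance formula, written once for any number of coordinates, works the same whether you hand it three letters or four.

```python
import math

def distance(p, q):
    """Euclidean distance between two points of any dimension."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

# Three letters: x, y, z
print(distance((1.0, 2.0, 3.0), (4.0, 6.0, 3.0)))            # 5.0

# Four letters: x, y, z, w -- nothing to visualize, same arithmetic
print(distance((1.0, 2.0, 3.0, 0.0), (4.0, 6.0, 3.0, 2.0)))  # ~5.39
```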

For more, watch The Cosmic Landscape: Leonard Susskind at the Commonwealth Club of California.


— DJ

Night and Day is an online journal that contrasts my dreams with my daytime activities. I refer to these posts as episodes because I only recall my dreams sporadically, and because they are at best loosely connected to my days.

What We Call the Past Is Built On Bits

It’s impossible to talk about information theory without talking about Claude Shannon. In 1948, the same year that Bell Labs introduced the transistor, Shannon published a monograph in The Bell System Technical Journal titled “A Mathematical Theory of Communication.” In it, the 32-year-old, who was then part of the Bell Labs mathematical research group, introduced the word bit, declaring it a unit for measuring information.
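To give a sense of what that unit measures, here is a minimal sketch in Python (my own illustration, using Shannon’s standard entropy formula rather than anything quoted from the book): a fair coin flip carries exactly one bit of information, and a loaded coin carries less.

```python
import math

def entropy_bits(probabilities):
    """Shannon entropy H = -sum(p * log2(p)), in bits per symbol."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

print(entropy_bits([0.5, 0.5]))  # 1.0 bit   -- a fair coin
print(entropy_bits([0.9, 0.1]))  # ~0.47 bits -- a loaded coin tells us less
```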

Information theory began as a bridge from mathematics to electrical engineering, and from there to computing. It’s a transformation that is chronicled by James Gleick in The Information: A History, A Theory, A Flood. This ambitious book traces the history of communications through the centuries to teach us about the language of drum beats, the decline of oral histories and the persistence of the word, alphabets, patterns, the printing press, electricity, Morse code and telegraphs, telephone switchboards, wires, networks, computers, algorithms, and modern-day social-sharing apps.

We learn that logic descended from the written word, that mathematics followed the invention of writing, and that information is physical. We learn that the first English dictionary was made by Robert Cawdrey, a village schoolmaster and priest, in 1604. That the first Oxford English Dictionary was published in 1933. And that the book of tables by Regiomontanus that Christopher Columbus carried as an aid to navigation was printed in Nuremberg two decades after the invention of movable type in Europe.

We meet Charles Babbage, Ada Byron, Norbert Wiener, Richard Feynman, Albert Einstein, Stephen Hawking, Claude Shannon, Alan Turing, John von Neumann, and Edgar Allan Poe, the American writer who helped popularize cryptography. As Gleick crisscrosses the disciplines of mathematics, physics, and computing we begin to appreciate just how strong the bonds of science and technology really are. And that is probably the point.

One of my favorite stories comes from 1943. Claude Shannon routinely met Alan Turing at teatime in the Bell Labs cafeteria, but they couldn’t discuss their work because it was a secret. It was the height of World War II and they were both cryptanalysts. Instead, Turing showed Shannon a paper he had written seven years earlier, “On Computable Numbers,” about the powers and limitations of computing machines. They talked about the possibility of machines learning to think at a time before transistors and electronic computers even existed. It wasn’t exactly a chance encounter, and it gave rise to Turing’s now famous question, “Can machines think?”

Turing’s machine never really existed. It was a thought experiment in the early days of information theory and the vision that Shannon and Turing shared actually had more to do with logic than electronics. Gleick explains that what Alan Turing and Claude Shannon had in common was codes.

“Turing encoded instructions as numbers. He encoded decimal numbers as zeroes and ones. Shannon made codes for genes and chromosomes and relays and switches. Both men applied their ingenuity to mapping one set of objects onto another: logical operators and electric circuits; algebraic functions and machine instructions. The play of symbols and the idea of mapping, in the sense of finding rigorous correspondence between two sets, had a prominent place in their mental arsenals.”
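A toy version of that kind of mapping (my own sketch, not an example from the book): decimal numbers rendered as zeroes and ones, and a handful of named instructions mapped onto numbers.

```python
def to_binary(n: int) -> str:
    """Encode a non-negative decimal integer as a string of 0s and 1s."""
    return bin(n)[2:]

def from_binary(bits: str) -> int:
    """Decode a string of 0s and 1s back into a decimal integer."""
    return int(bits, 2)

# Instructions as numbers: the correspondence is arbitrary, but rigorous.
opcodes = {"LOAD": 1, "ADD": 2, "STORE": 3, "HALT": 4}

program = ["LOAD", "ADD", "STORE", "HALT"]
encoded = [to_binary(opcodes[op]) for op in program]
print(encoded)                                  # ['1', '10', '11', '100']
print([from_binary(bits) for bits in encoded])  # [1, 2, 3, 4]
```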

These conversations helped fuel Shannon’s theory of information and the idea that once information became digital, it could be transmitted essentially without error. It became a unifying theory for all sorts of communications and quickly distinguished him as the father of information theory.

Gleick also tackles the complex subject of quantum computing, which deals in quantum bits, or qubits, rather than bits. With roots in quantum physics, it’s easy to get lost in this area, but he does a pretty good job of making the concepts understandable for the layman. And he offers some insights into why this kind of computing matters.

“Whereas an object in classical physics, typically composed of billions of particles, can be intercepted, monitored, observed, and passed along, a quantum object cannot. Nor can it be copied or cloned.” And this is precisely why quantum computers—in theory—can solve certain classes of problems that were previously considered infeasible, he says. Cryptography is one common answer to “why quantum computers?” But artificial intelligence is often cited as well.
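For readers who want the contrast spelled out, the standard textbook notation for a qubit (my addition, not a quotation from the book) shows why copying fails: a qubit is a weighted combination of the two classical values, and the no-cloning theorem says an unknown combination of this kind cannot be duplicated.

```latex
% A qubit state in the usual Dirac notation: the weights alpha and beta
% are complex numbers whose squared magnitudes sum to one.
|\psi\rangle = \alpha\,|0\rangle + \beta\,|1\rangle,
\qquad |\alpha|^{2} + |\beta|^{2} = 1
% No-cloning: there is no single physical operation U with
%   U(|\psi\rangle \otimes |0\rangle) = |\psi\rangle \otimes |\psi\rangle
% for every state |\psi\rangle, which is why an unknown qubit cannot be
% intercepted, copied, and passed along unnoticed.
```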

This book covers a lot of ground. But the common thread throughout is that information is the vital principle. Gleick says it pervades the sciences from top to bottom, transforming every branch of knowledge.

In 1990, the American physicist John Archibald Wheeler suggested that information is fundamental to the physics of the universe. “It from Bit,” he said. “Every particle, every field of force, even the space-time continuum itself, derives its function, its meaning, its very existence entirely from answers to yes-or-no questions, binary choices, bits.”

Shannon introduced us to the science of information theory. Wheeler taught that what we call the past is based on bits. And Gleick connects the dots.

— DJ

What Does My Art Mean?


Marin Headlands: Charcoal on Paper, 18 x 24

It doesn’t mean anything. It’s not a political platform. There is no message or social commentary. I’m not trying to provoke anyone’s thinking process or make a statement. It is simply a form of reflection and play. Serious play.

I write for a living so visual activities are the way I relax. It’s like meditation or a form of focused attention for me. There is no inner dialog when I am painting. Simply observations and actions. I don’t use words in my head to mix paints. I see and mix. I pick up colors and put them down. Sometimes I mix paints on the palette, other times I mix them on the canvas or paper. I don’t even worry too much about whether it is the “right” color or the “right” line, because when I am in it, I just want to keep playing with the experience. Which is both good and bad when you are in flow and probably explains why my art can be so hit or miss.

Every time I start a new drawing or painting it begins with the same question: I wonder if I can capture that? My goal is always to make a really good painting, and by good I mean that I like it. But I have lots of failed paintings, and I accept that as part of the process. In fact I have closets full of unfinished and unsatisfactory paintings. For every good painting there are probably five or more failed paintings. But I keep painting because it is fun, and it is a challenge and because I love to play.

— DJ

 

Just So Stories


Sometimes I make art because it is a form of play. And sometimes I make art to evoke a memory or feeling, or to capture a place in time. When I was a child, I fell in love with the children’s book, Just So Stories by Rudyard Kipling. I was so taken with the stories and the pictures that I would ask my grandmother to read them to me over and over again. Yes, I loved the stories about the animals and all the strange words and sentences. But I think it was actually the illustrations that stole my heart, because many years later as an adult, tattered book in hand, I pulled out my paints and made this painting of the book’s cover. It was a gift that was returned to its rightful owner this very day. Because I love this painting and the memories it holds for me of a little girl, curled up next to her loving grandmother, dreaming of faraway places and magical tales from the kingdom of the wild.

— DJ

“Hear and attend and listen; for this befell and behappened and became and was, O my Best Beloved, when the Tame animals were wild. The Dog was wild, and the Horse was wild, and the Cow was wild, and the Sheep was wild, and the Pig was wild — as wild as wild could be — and they walked in the Wet Wild Wood by their wild lones. But the wildest of all the wild animals was the Cat. He walked by himself, and all places were alike to him.” — From The Cat that Walked by Himself, by Rudyard Kipling

Silicon Valley’s Creation Myth

On Wednesday, Oct. 1, the New York Times published an interesting piece by Nick Bilton that introduced us to Walter Isaacson’s new book, The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution.

A recurring theme of the book is how women in technology and their contributions to computer science have been dismissed or forgotten. According to Mr. Isaacson, this happens because the creation myth seeks to make heroes out of individuals rather than the group. And when the contribution of the collective is ignored, he says, it is usually a man who gets the credit.

Bilton writes that, “The exclusion of these women has not only reinforced stereotypes about women and technology, but has arguably had a self-fulfilling effect. In 1985, 37 percent of computer science undergraduate degrees were earned by women. By 2010, that number had fallen by half to 18 percent. Now just 0.4 percent of all female college freshmen say they plan to major in computer science.”

When I was a programmer back in the 1980s it seemed like computer science was a great option for women because it was so new that it lacked the male-dominated culture that was already entrenched in so many other fields. These were the early days of automation and we were custom-coding applications at the time to move companies off of their old, manual bookkeeping systems, which meant writing code for everything from accounts payable to online inventory systems. I actually wrote the first accounts receivable program for Dale Carnegie and Associates in Garden City, NY. We worked hard and had a lot of fun.

So it is troubling to me to see such a small percentage of women pursuing computer science these days. I live and work in Silicon Valley, so I have a personal interest in technology and in seeing women thrive in their careers here. I also get to ask the people I work with why there seems to be such a big decline in the number of women taking computer science in school. And the answer I routinely hear is the same: “They are not interested in it.” When I ask why, the young men that I know say they aren’t sure.

I have always believed that the technical opportunities were here for women. And in Silicon Valley most companies operate with a lot of flexibility, which is exactly what women and families need. But at the end of the day, it comes down to a choice. And what I really want to know is why so few women are choosing to pursue a career in computer science.

It certainly doesn’t help that the New York Times editors decided to publish The Women Tech Forgot in the Fashion and Style section of the paper. That simply reinforces the old stereotype about women being more interested in fashion than anything else.

Where is Jill Abramson when we need her?

— DJ

That Crafty Feeling

Zadie Smith on the Psychology of the Two Types of Writers

If you are a writer who is trying to create a novel, don’t miss this post on Zadie Smith’s two types of writer personalities. Smith delivered a lecture at Columbia University’s Writing Program in which she described writers as “macro-planners” or “micro-managers.” That and more on the mysteries of writing is captured in this morning’s dispatch from Maria Popova in Brain Pickings, her “cross-disciplinary LEGO treasure chest, full of pieces spanning art, science, psychology, design, philosophy, history, politics, anthropology, and more.”

It’s an insightful and fun read that made me laugh out loud when I realized that I fit the micro-manager profile.

Zadie Smith’s 10 Rules of Writing

  1. When still a child, make sure you read a lot of books. Spend more time doing this than anything else.
  2. When an adult, try to read your own work as a stranger would read it, or even better, as an enemy would.
  3. Don’t romanticise your ‘vocation’. You can either write good sentences or you can’t. There is no ‘writer’s lifestyle’. All that matters is what you leave on the page.
  4. Avoid your weaknesses. But do this without telling yourself that the things you can’t do aren’t worth doing. Don’t mask self-doubt with contempt.
  5. Leave a decent space of time between writing something and editing it.
  6. Avoid cliques, gangs, groups. The presence of a crowd won’t make your writing any better than it is.
  7. Work on a computer that is disconnected from the internet.
  8. Protect the time and space in which you write. Keep everybody away from it, even the people who are most important to you.
  9. Don’t confuse honours with achievement.
 10. Tell the truth through whichever veil comes to hand — but tell it. Resign yourself to the lifelong sadness that comes from never being satisfied.

— Zadie Smith

— DJ

Memory: When Books Become Paintings


Marc Chagall

If you asked me to tell you about a book that I read, I would most likely start to remember it with a picture in my mind. I’m not really sure why that is. I think most people remember books verbally, by telling you about the story’s characters and the details of what happened to them. But I tend to remember books by the paintings they leave in my head.

The details of the painting might change depending on the part of the story I am trying to recall, but the primary residence of the story is usually pretty fixed in my mind. Which is to say that stories, for me, usually have a visual home in my memory.

So if a book is well written and the author has provided the material I need—not too much, not too little—to let my mind put me into a story and have a sense of place, then my imagination will get to work. And the first thing I will remember, months or even years from now, is the book’s painting, because it left a print in my memory.

And like a painting, or a dream, the composition naturally expands and contracts as I think about different parts of the book. Characters arrive and disappear as my memory moves around the landscape. It all happens so quickly that I hardly notice it. I will see the painting and within seconds recall what the point of the book was, how it felt, and why I loved the characters—even if I can’t recall their names. And then, as if waking from a dream, the painting will slip away as the words start to arrive and I say something like, “I loved this book.”

— DJ

Can Machines Think?

That’s a pretty big question and it’s been kicking around since the middle of the last century. The answer, of course, depends on how you define thinking and machines, and whom you ask.

Professor Kevin Warwick, a visiting professor at the University of Reading and Deputy Vice-Chancellor for Research at Coventry University, thinks the answer is yes. At least that was his conclusion when “Eugene Goostman,” one of five computer programs, won the Turing Test 2014 Prize at an event held at the Royal Society in London in June.

Described as a Ukrainian teenager with a quirky sense of humor and a pet guinea pig, Eugene managed to convince 33 percent of the Turing test judges that it was human. Organized by the University of Reading’s School of Systems Engineering, the event was sponsored in part by RoboLaw, an EU-funded organization that is examining the regulation of emerging robotic technologies.

But the news that Eugene passed the Turing test quickly sparked a debate.

The Guardian reported that Stevan Harnad, professor of cognitive sciences at the University of Quebec in Montreal, said that whatever had happened at the Royal Society, it did not amount to passing the Turing test. “It’s nonsense, complete nonsense,” he said. “We have not passed the Turing test. We are not even close.”

The Turing Test Doesn’t Matter

Then there is Massimo Pigliucci, editor-in-chief of Scientia Salon, who isn’t even arguing about the test results, because he says that The Turing Test Doesn’t Matter.

Turing proposed his famous test back in 1951, calling it “the imitation game.” The idea stemmed out of his famous work on what is now known as the Church-Turing hypothesis, the idea that “computers” (very broadly defined) can carry out any task that can be encoded by an algorithm. Turing was interested in the question of whether machines can think, and he was likely influenced by the then cutting edge research approach in psychology, behaviorism, whose rejection of the idea of internal mental states as either fictional or not accessible scientifically led psychologists for a while to study human behavior from a strictly externalist standpoint.

— Massimo Pigliucci

Pigliucci asks: “When we talk about AI, do we mean intelligence (as the “I” deceptively seems to stand for), computation, self-awareness, all of the above? Without first agreeing at the least on what it is we are trying to do we cannot possibly even conceive of a test to see whether we’ve gotten there.”

He’s got a point.

Turing’s Cathedral

All of this leads us back to the man, Alan Turing, who in 1950 predicted that some time in the next 50 years we would have computers that could trick us into believing they were human at least 30 percent of the time. He introduced us to the Turing test in his seminal work on artificial intelligence, Computing Machinery and Intelligence.

As a British mathematician and cryptographer, and one of the most influential computer scientists of the last century, Turing is still best known for the Turing test—the famous question and answer game that seeks to answer the question Can Machines Think? His remarkable story is the subject of Turing’s Cathedral: The Origins of the Digital Universe, by George Dyson.

And the question of whether machines can think? It remains open. But it sure makes for fascinating reading.

— DJ

The Price of Books, the Value of Civilization

Toby Mundy, former CEO of Atlantic Books, shares his thoughts on the current disputes between Amazon and its suppliers, and the unique value that books provide as “the only medium for thick descriptions of the world that human beings possess.” — DJ

Pandaemonium

George Peabody Library

Toby Mundy is one of Britain’s leading publishers and was, until June, CEO of Atlantic Books. He has a new blog; his first post is a superb essay that takes the current fraught struggle between Amazon and publishers as a starting point for a meditation on the significance of books to human life. I am delighted to repost that essay here.


Toby Mundy
Amazon knows the price of everything.
But does it care about its value?

In his recent piece about Amazon on Techcrunch, John Biggs argued that ‘Books are about to go the way of magazines and newspapers. The value of a large hunk of text in prose form is diminishing and the price people will pay for it is also falling.’ This echoes the views of Russell Grandinetti, Amazon’s senior vice president for Kindle, and now one of the most influential people in world publishing. Here he is…


The Tiger’s Wife


Set in an imaginary town somewhere in the Balkans, The Tiger’s Wife, by Téa Obreht, is a beautifully crafted tale that probes the mysteries of life, death, and war. The story begins with a memory. In it, the four-year-old Natalia is following her grandfather to the zoo, a ritual that includes a trolley ride, a blue bag packed with treats for the zoo animals, and an afternoon spent reading passages from The Jungle Book.

In the first pages you learn three important things about her grandfather: He is a doctor, he loves tigers, and he doesn’t want Natalia to look away when bad things happen to people.

Using a rich and descriptive narrative style, Obreht mixes story lines that carry the reader across newly drawn borders in the war-torn Balkans and deep into the past of her grandfather’s childhood hometown, Galina. Natalia becomes a doctor too, and it is while she is on a mission to bring vaccines to the orphans of war across the border that she learns that her grandfather has died.

Natalia’s personal story of loss becomes the overlay for the author’s portrayal of a tragic civil war that has changed borders, separated families, and created deep divisions among previously tolerant religious groups. And there are questions: What happened to Natalia’s grandfather? And why did he never refer to the tiger’s wife by name?

As the novel progresses it becomes clear that the author is modeling much of her storytelling around the kind of oral tradition that is familiar in the Balkans—where facts and gossip trade places and everyday events are elevated, with just a little embellishment, to the status of local mythology. Some of the most unbelievable characters are so convincing that you don’t even notice it after a while. It’s like, “Oh, here is the Deathless Man again. I wonder what he is up to now?”

This is a thoughtfully composed and beautifully written novel that works on so many different levels. If blending small-town superstitions and beliefs with the modern practice of medicine and rational thinking in a war-torn part of the world doesn’t appeal to you, then this is probably not your kind of book. But if you are like me, and you want to read a story that transports you to another place and time and that speaks to the heart in the way only magical stories can, then this book is for you.

— DJ