American Spectator, Sep/Oct 2001, Vol. 34 Issue 7, p68
Once upon a time, Caltech's Richard Feynman, Nobel Laureate leader of the last great generation of physicists, threw down the gauntlet to anyone rash enough to doubt the fundamental weirdness, the quark-boson-muon-strewn amusement park landscape of late 20th-century quantum physics. "Things on a very small scale behave like nothing you have direct experience about. They do not behave like waves. They do not behave like particles... or like anything you have ever seen. Get used to it."
Carver Mead never has.
As Gordon and Betty Moore Professor of Engineering and Applied Science at Caltech, Mead was Feynman's student, colleague and collaborator, as well as Silicon Valley's physicist in residence and leading intellectual. He picks up Feynman's challenge in a new book, Collective Electrodynamics (MIT Press), declaring that a physics that does not make sense, that defies human intuition, is obscurantist: It balks thought and intellectual progress. It blocks the light of the age.
In a career of nearly half a century that has made him the microchip industry's most influential and creative academic, Mead is best known as inventor of a crucial high-frequency transistor, author of dominant chip design techniques, progenitor of the movement toward dynamically programmable logic chips, and most recently developer of radical advances in machine-aided perception. In 1999, he won the half-million-dollar Lemelson-MIT Prize for innovation. But any list of accomplishments underrates Mead's role as the most important practical scientist of the late twentieth century. He is now emerging as the boldest theoretical physicist of the twenty-first.
Perhaps more than any other man, Mead has spent his professional life working on intimate terms with matter at the atomic and subatomic levels. He spent ten years exploring the intricacies of quantum tunneling and tunnel diodes, the first electronic devices based on an exclusively quantum process. Unlike most analysts, Mead does not regard tunneling as a mysterious movement of particles through impenetrable barriers. He sees it as an intelligible wave phenomenon, resembling on the microcosmic level the movement of radio waves through walls.
While pursuing these researches, Mead responded to a query from Intel founder Gordon Moore about the possible size of microelectronic devices. Mead provided the empirical analysis behind Moore's law (predicting a doubling of computer power every 18 months). When single chips held only tens of transistors, he showed that in due course tens of millions would be feasible. In collaboration with Feynman, Mead also developed a definitive course on the physics of computation that has yielded a minor industry of books and tapes and imitators. After a year in Coblenz with Nobel Prize-winning physicist-turned-biologist Max Delbruck, Mead pursued a lifelong multi-disciplinary interest in the physics of neural systems. His researches on the human retina led to his invention of the revolutionary Foveon camera, which achieves resolution and verisimilitude in cheap silicon superior to the best silver halide films. His study of the cochlea has informed the creation of unique directional hearing aids, produced by Sonic Innovations of Salt Lake City.
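The arithmetic behind that projection is simple exponential growth. As a rough editorial sketch (using the popular 18-month doubling period quoted above, not a figure from Mead's own analysis): if transistor counts follow

\[ N(t) = N_0 \cdot 2^{\,t/T}, \qquad T \approx 18\ \text{months}, \]

then twenty doublings take about thirty years, and tens of transistors become tens of millions, since \( 10 \times 2^{20} \approx 10^{7} \).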
Now, in the opening years of the new millennium, Mead believes that it is time to clear up the philosophical and practical confusion of contemporary physics. He revisits the debate between the Copenhagen interpreters of quantum physics--Niels Bohr, Werner Heisenberg, John von Neumann, Richard Feynman--and the skeptics, principally Albert Einstein and Erwin Schrodinger. Pointing to a series of experiments from the world of microelectronic and photonic technology that still lay in the future when Bohr prevailed in his debates with Einstein, Mead rectifies an injustice and awards a posthumous victory to Einstein.
During a lifetime in the trenches of the semiconductor industry, Mead developed a growing uneasiness about the "standard model" that supposedly governed his field. Mead did not see his electrons and photons as random or incoherent. He regarded the concept of the "point particle" as an otiose legacy from the classical era. Early photodetectors or Geiger counters may have provided both visual and auditory testimony that photons were point particles, but the particulate click coarsely concealed a measurable wave.
Central to Mead's rescue project is a series of discoveries inconsistent with the prevailing conceptions of quantum mechanics. One was the laser, together with its microwave forerunner, the maser. As late as 1956, Bohr and von Neumann, the paragons of quantum theory, arrived at the Columbia laboratories of Charles Townes, who was in the process of describing his invention. Along with the transistor, the laser is one of the most important inventions of the twentieth century. Designed into every CD player and long-distance telephone connection, lasers today are manufactured by the billions. At the heart of laser action is the perfect alignment of the crests and troughs of myriad waves of light. Their location and momentum must be theoretically knowable. But this violates the holiest canon of Copenhagen theory: Heisenberg uncertainty. Bohr and von Neumann proved to be true believers in Heisenberg's rule. Both denied that such a device was possible. When Townes showed them one in operation, they retreated artfully.
In Collective Electrodynamics, Mead cites nine other experimental discoveries, from superconductive currents to masers, to Bose-Einstein condensates predicted by Einstein but not demonstrated until 1995. These discoveries of large-scale, coherent quantum phenomena all occurred after Bohr's triumph over Einstein.
Mead does not banish the mystery from science. He declares that physics is vastly farther away from a fundamental grasp of nature than many of the current exponents of a grand unified theory imagine. But he believes he can explain the nature of the famous mysteries of quantum science, from the two-slit experiment, where "particles" go through two holes at once, to the perplexities of "entanglement," where action on a quantum entity at one point of the universe can affect entities at other remote points at speeds faster than the speed of light. In his new interpretation, quantum physics is united with electromagnetism, and the venerable Maxwell equations are found to be dispensable.
But Mead does not bow humbly before all of Einstein's conceptions. He dismisses the photoelectric effect as an artifact of early twentieth-century apparatus. He also believes that General Relativity conceals more than it illuminates about gravitation: "All the important details are smoothed over by Einstein's curvature of space-time." Gravity remains shrouded in mystery.
We arrived at Mead's house in Woodside, high above Silicon Valley. It is a modernistic aerie with hardwood floors and cathedral ceilings, perched on the precipitous slopes of the Los Altos Hills. The dense stands of surrounding redwood trees, concealing the valley below, make for a cathedral outside as well as in. We found him eager to discuss his theories and his Promethean book. A short lithe man with a small beard and a taste for undulatory rainbow shirts, Carver speaks with quiet authority, quirky humor and a gentle but inexorable persuasiveness. He conveys the sense that during his fifty years of immersion in technology he has made electrons and photons his friends, and he knows they would never indulge in the outrageous, irrational behavior ascribed to them by physicists. In the process, he is also implicitly coming to the defense of reason, science, history, culture, human dignity and free will.
——
THE AMERICAN SPECTATOR: You open your new book with a dramatic statement. "It is my firm belief that the last seven decades of the twentieth century will be characterized in history as the dark ages of theoretical physics." Can you explain that?
CARVER MEAD: Modern science began with mechanics, and in some ways we are still captive to its ideas and images. Newton's success in deriving the planetary orbits from his law of gravitation became the paradigm. To Niels Bohr early in this century, when the quantum theory was invented, the atom was thought of as a miniature solar system, with a nucleus as the sun and electrons as planets. Then, out of the struggle to understand the atom came quantum mechanics. Bohr gathered the early contributors into a clan in Copenhagen, and he encouraged them to believe that they were developing the ultimate theory of nature. He argued vigorously against any opponents.
Among whom was Albert Einstein. He had already scored a triumph with relativity theory by that time. But the history books tell us that he lost the argument with Bohr. Can you explain their dispute? And why do you now award the verdict to Einstein?
Bohr insisted that the laws of physics, at the most fundamental level, are statistical in nature. Physical reality consisted at its base of statistical probabilities governed by Heisenberg uncertainty. Bohr saw these uncertainties as intrinsic to reality itself, and he and his followers enshrined that belief in what came to be known as the "Copenhagen interpretation" of quantum theory. By contrast Einstein famously argued that "the Lord does not throw dice." He believed that electrons were real and he wrote, in 1949, that he was "firmly convinced that the essentially statistical character of contemporary quantum theory is solely to be ascribed to the fact that this [theory] operates with an incomplete description of physical systems."
So how did Bohr and the others come to think of nature as ultimately random, discontinuous?
They took the limitations of their cumbersome experiments as evidence for the nature of reality. Using the crude equipment of the early twentieth century, it's amazing that physicists could get any significant results at all. So I have enormous respect for the people who were able to discern anything profound from these experiments. If they had known about the coherent quantum systems that are commonplace today, they wouldn't have thought of using statistics as the foundation for physics.
Statistics in this sense means what?
That an electron is either here, or there, or some other place, and all you can know is the probability that it is in one place or the other. Bohr ended up saying that the only statements you can make at the fundamental level are statistical. You cannot grasp the reality itself, only probabilities related to it. They really, really wanted to have the last word, and the only word they had was statistical. So they made their limitations the last word, saying, "Okay, the only knowledge that there is down deep is statistical knowledge. That's all we can know." That's a very dangerous thing to say. It is always possible to gain a deeper understanding as time progresses. But they carried the day.
What about Schrodinger? Back in the 1920s, didn't he say something like what you are saying now?
That's right. He felt that he could develop a wave theory of the electron that could explain how all this worked. But Bohr was more into "principles": the uncertainty principle, the exclusion principle--this, that, and the other. He was very much into the postulational mode. But Schrodinger thought that a continuum theory of the electron could be successful. So he went to Copenhagen to work with Bohr. He felt that it was a matter of getting a "political" consensus; you know, this is a historic thing that is happening. But whenever Schrodinger tried to talk, Bohr would raise his voice and bring up all these counter-examples. Basically he shouted him down.
It sounds like vanity.
Of course. It was a period when physics was full of huge egos. It was still going on when I got into the field. But it doesn't make sense, and it isn't the way science works in the long run. It may forestall people from doing sensible work for a long time, which is what happened. They ended up derailing conceptual physics for the next 70 years.
Let's take a break--tell us a little about how you came to physics.
I was fortunate enough to get introduced to electricity at an early age, and I fell in love with it. By the age of six I was comfortable with all kinds of electrical phenomena.
So practice took precedence over theory?
Yes, but I wanted the theory to understand it. And that took time. But I never lost that intuitive grasp from having actually worked with it.
Tell us about your early life.
I was born in 1934 and grew up in California. We lived in a place called Big Creek, halfway between Yosemite and Kings Canyon, up in the Sierra country. A lot of snow falls on those mountains during the winter, and in the spring it runs off. Around the turn of the century they built a series of dams and power plants up there, the Big Creek Project. As late as World War II, it supplied about 90 percent of the power for Los Angeles. It was a marvelous way to grow up because I learned about electricity just by being around it. It was everywhere. My father worked in the power plant, and he taught me as best he could.
You lived near the plant?
We had these places called camps, which were a group of homes around the power plant. Originally they were tents for the construction workers. When I was 12, a guy who was a ham radio operator moved in. My uncle had gotten me started on radio, but then he went off to the war--he worked in Britain on the radar project. Anyway, this guy had a background in electronics and he was willing to teach me what he knew. That was just as the war was ending, so there was all this war-surplus electronics on the market, dirt cheap. With the little bits of money that a kid could earn, I could buy piles of electronics, and try to figure out what they were and why they were that way and how I could modify them. That was how I got my start--you could afford to do experiments, because the stuff was so cheap. You could build up equipment and try things, just to see what happened.
Where did you go to school?
Between two of the camps, way back in the woods, we had a little school. Twenty kids for all eight grades. There was one teacher through 4th grade and then it became a two-teacher school. My grandmother lived in Fresno in the Central Valley. They had a better high school, so I lived with her and went to high school there. Then I interviewed to go to Caltech and I remained there for my whole career.
What about the power plant?
Oh, there were things in the power plant that were just awesome. In the generator there's this big wheel going around with these coils of wire, and this cascading water coming down two thousand feet through these great pipes and rushing through turbines. On the other side, there are these one-inch diameter cables, going down to Los Angeles. As a kid, I would watch them bring a new unit on line. The generator has huge inertia, but almost no friction, so you have to be really careful. You let a little water through and the rotation accelerates. Its speed comes up and up, governed by this instrument called a synchroscope that compares the relative phase [the timing of the troughs and crests of the wave of electricity] on the grid with that of the generator. Nobody ever gets those phases exactly right, but if you miss by much, the whole power plant goes boom--the difference in phase is enough to shear off the huge bolts, six inches in diameter, that bind the generator to the floor of the power plant. So electricity may be invisible, but it is powerful stuff; it's not invisible really. It's just invisible in the way we normally look at things.
So early on you knew that electrons were real.
The electrons were real, the voltages were real, the phase of the sine-wave was real, the current was real. These were real things. They were just as real as the water going down through the pipes. You listen to the technology, and you know that these things are totally real, and totally intuitive.
But they're also waves, right? Then what are they waving in?
It's interesting, isn't it? That has hung people up ever since the time of Clerk Maxwell, and it's the missing piece of intuition that we need to develop in young people. The electron isn't the disturbance of something else. It is its own thing. The electron is the thing that's wiggling, and the wave is the electron. It is its own medium. You don't need something for it to be in, because if you did it would be buffeted about and all messed up. So the only pure way to have a wave is for it to be its own medium. The electron isn't something that has a fixed physical shape. Waves propagate outwards, and they can be large or small. That's what waves do.
So how big is an electron?
It expands to fit the container it's in. That may be a positive charge that's attracting it--a hydrogen atom--or the walls of a conductor. A piece of wire is a container for electrons. They simply fill out the piece of wire. That's what all waves do. If you try to gather them into a smaller space, the energy level goes up. That's what these Copenhagen guys call the Heisenberg uncertainty principle. But there's nothing uncertain about it. It's just a property of waves. Confine them, and you have more wavelengths in a given space, and that means a higher frequency and higher energy. But a quantum wave also tends to go to the state of lowest energy, so it will expand as long as you let it. You can make an electron that's ten feet across, there's no problem with that. It's its own medium, right? And it gets to be less and less dense as you let it expand. People regularly do experiments with neutrons that are a foot across.
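The wave relation Mead invokes here can be made concrete with the textbook case of a wave confined to a box of width \( L \) (a standard result, not a formula from Collective Electrodynamics): the allowed energies are

\[ E_n = \frac{n^2 \pi^2 \hbar^2}{2 m L^2}, \qquad n = 1, 2, 3, \ldots \]

Halve the box and the ground-state energy quadruples; let \( L \) grow and the energy falls toward zero--which is why the wave spreads out to fill whatever container it is given.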
A ten-foot electron! Amazing.
It could be a mile. The electrons in my superconducting magnet are that long.
A mile-long electron! That alters our picture of the world--most people's minds think about atoms as tiny solar systems.
Right, that's what I was brought up on--this little grain of something. Now it's true that if you take a proton and you put it together with an electron, you get something that we call a hydrogen atom. But what that is, in fact, is a self-consistent solution of the two waves interacting with each other. They want to be close together because one's positive and the other is negative, and when they get closer that makes the energy lower. But if they get too close they wiggle too much and that makes the energy higher. So there's a place where they are just right, and that's what determines the size of the hydrogen atom. And that optimum is a self-consistent solution of the Schrodinger equation.
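"Just right" can be estimated with a standard variational sketch (a textbook shortcut, not the full self-consistent solution Mead describes). For an electron wave of extent \( r \), the energy is roughly

\[ E(r) \approx \frac{\hbar^2}{2 m_e r^2} - \frac{e^2}{4 \pi \epsilon_0 r}, \]

the first term the wiggling that grows as the wave is squeezed, the second the attraction that deepens as the charges approach. Setting \( dE/dr = 0 \) gives

\[ r = \frac{4 \pi \epsilon_0 \hbar^2}{m_e e^2} = a_0 \approx 0.53 \times 10^{-10}\ \text{m}, \]

the Bohr radius--the size of the hydrogen atom.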
So much for the idea of the quantum world as microscopic...
Bohr and his followers had this notion that you got to the quantum world only when things were very small. Well, that's because the only thing they knew that exhibited quantum characteristics was an atom. They said, "Well, an atom is so small, we'll never see one." Now, it turns out, people have put atoms in cavities and you can see a single atom perfectly well. That experiment has been done many times now. In fact, if you do it properly, you can make atoms totally coherent. Do that with a lot of them, and you get a Bose-Einstein condensate--a bunch of atoms in phase that act like one big matter wave. It was first demonstrated in 1995 by Eric Cornell and Carl Wieman in Colorado.
The early experiments that dealt with things like black-body radiation and light passing through double slits--couldn't they detect those effects?
The experiments on which the conceptual foundations of quantum mechanics were based were extremely crude by modern standards. The detectors available--Geiger counters, cloud chambers, and photographic film--had a high degree of randomness built in, and, by their very nature, could register only statistical results. The atomic sources were similarly constrained--large ensembles of atoms, with no mechanism for achieving phase coherence. Understandably, the experiments that could be imagined were all of a statistical sort.
The most famous of those experiments involved a "single" photon that somehow succeeded in going through two holes at once.
That uses a point-particle model for the "photon"--a little bullet carrying energy. If you define the problem this way, of course, you get nonsense. Garbage in, garbage out.
So how should we think of a photon?
John Cramer at the University of Washington was one of the first to describe it as a transaction between two atoms. At the end of his book, Schrodinger's Kittens and the Search for Reality, John Gribbin gives a nice overview of Cramer's interpretation and says that "with any luck at all it will supersede the Copenhagen interpretation as the standard way of thinking about quantum physics for the next generation of scientists."
So that transaction is itself a wave?
The field that describes that transaction is a wave, that's right.
So how about "Schrodinger's cat"--the thought experiment he proposed to illustrate the impossible conundrum of quantum theory. The cat is in a closed box, with a quantum-based trigger that either does or does not release poison. Gribbin summarizes the standard Copenhagen view of the situation: "Neither of the two possibilities has any reality unless it is observed." So is the cat dead or alive? The standard quantum-theory answer--we're quoting Gribbin again--would be: "The cat has neither been killed nor not been killed until we look inside the box to see what happened." In other words, reality is observer-dependent.
That is probably the biggest misconception that has come out of the Copenhagen view. The idea that the observation of some event makes it somehow more "real" became entrenched in the philosophy of quantum mechanics, and, like the other misconceptions, is said to be confirmed by experiment. Even the slightest reflection will show how silly it is. An observer is an assembly of atoms. What is different about the observer's atoms from those of any other object? What if the data are taken by computer? Do the events not happen until the scientist gets home from vacation and looks at the printout? It is ludicrous!
Gribbin goes on to describe an experiment with entangled photons, which shows quantum entities affecting one another at long distances with no passage of time. He says this "proves that there is no underlying reality to the world."
That is the experiment proposed by John Bell, the late Irish physicist, and done in its most definitive form by John Clauser--I'm currently in discussion with him about his fascinating findings. But the results say nothing whatsoever about what is and is not real.
In your book, you ambitiously redraw the boundaries of physics. In the "dark age" of the last 70 years, you say, a fundamental distinction was drawn between classical physics--mechanics, electricity and magnetism--and modern physics, consisting of quantum theory and relativity. Bohr connected the two with his "correspondence principle." What was that?
That was one of the big mistakes they made. They wanted the quantum domain to approximate the classical Newtonian world. And it simply doesn't. But Bohr believed that if you picked a limit where there are enough wavelengths, everything would average out to the same result you get from Newtonian physics.
So by "correspondence," he meant a correspondence between the quantum world and the larger Newtonian world?
Yes. And that was the wrong assumption. When you get to coherent quantum systems, they don't have a Newtonian limit at all. Coherent quantum systems "scale" in a way that is entirely different.
You proposed dividing physics into "coherent" and "incoherent" systems. What's the difference?
Okay. The quantum world is a world of waves, not particles. So we have to think of electron waves and proton waves and so on. Matter is "incoherent" when all its waves have a different wavelength, implying a different momentum. On the other hand, if you take a pure quantum system--the electrons in a superconducting magnet, or the atoms in a laser--they are all in phase with one another, and they demonstrate the wave nature of matter on a large scale. Then you can see quite visibly what matter is down at its heart.
Perhaps we can compare it to water in a bathtub. If you "reinforce" the bath water at the right moment, a big wave will suddenly slosh out onto the floor. That is the macro equivalent of what you are describing. But when the little wavelets lap against one another, then not much happens--incoherence, in other words. Is that right?
That's right. In the coherent system, the waves are all in phase. But now, instead of water, let's think of something solid, say a billiard ball. A billiard ball is an incoherent mixture of lots of little matter "waves" that are interfering with one another all the time.
But to our everyday understanding, on the "macro" level, a billiard ball is also "coherent" in the usual sense of that word. It obeys Newton's laws, for example. Throw it with a certain velocity and we can predict where it will land.
Right, but that is a different sense of the word. As I describe them, coherent and incoherent systems are dominated by different sets of physical laws. With the incoherent systems that we see all around us, time is one-directional. And things that come apart don't spontaneously come together again. And the inertia--of the billiard ball, for example--increases linearly with the number of atoms. With coherent systems, on the other hand, time is two-directional, and inertia increases with the square of the number of elements. In a superconducting magnet, the electron inertia increases with the square of the number of electrons. That's foreign to Newtonian thinking, which is why Feynman had trouble with it. A coherent system is not more real, but it is much more pure and fundamental.
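The square law can be seen in a back-of-the-envelope way (an editorial sketch of the standard magnetic-energy argument, not a passage from Mead's book). If \( N \) electrons circulate in phase around a loop of inductance \( L \), the current is proportional to \( N \) times their common velocity \( v \), so the stored energy is

\[ U = \tfrac{1}{2} L I^2, \qquad I \propto N v \quad \Longrightarrow \quad U \propto N^2 v^2. \]

Reading \( U \) as \( \tfrac{1}{2} M_{\text{eff}} v^2 \) gives an effective inertia \( M_{\text{eff}} \propto N^2 \), whereas the ordinary mass of \( N \) independent electrons grows only as \( N \).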
Can we finesse this business about time going backwards and forwards? Understanding quantum physics is hard enough as it is! When Bohr proposed the correspondence principle, he wanted to keep a single set of laws: "As above, so below." And yes, in the microcosm, when things are jumbled up and "incoherent," it does approximate the physics of the macro-world. But under appropriate conditions--what you term coherence--the micro-world seems to operate in a quite different way?
Right--Bohr put his foot on the wrong stone, the Newtonian side rather than the quantum side. The underlying reason is that Newtonian physics was phrased in terms of things like position and momentum and force which are all characteristics of particles. Bohr was wedded to particles.
Are coherence and incoherence absolutes--can something be "a little bit pregnant?"
Yes, it can be. Light from an ordinary fluorescent bulb has a certain amount of coherence, but light from incandescent bulbs has almost none. With coherence, all the waves have a common phase. When they're out of phase you get all these fringes and interference patterns.
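The textbook superposition formula makes the distinction concrete (a standard optics relation, not Mead's own): two waves of intensities \( I_1 \) and \( I_2 \) with phase difference \( \Delta\varphi \) combine to give

\[ I = I_1 + I_2 + 2 \sqrt{I_1 I_2} \, \cos \Delta\varphi. \]

When \( \Delta\varphi \) is stable, the cosine term traces out bright and dark fringes; when it fluctuates randomly, as in incandescent light, the term averages to zero and the intensities simply add, with no pattern at all.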
"Coherence" seems comparable to electricity--it has existed forever, and we could see it in the sky as lightning but only in the nineteenth century were we able to harness it. And only recently have we been able to harness coherent phenomena.
Right. And once we have harnessed them in the laboratory, and begin to understand them, we can start to see them in the universe around us. There are increasing indications that many of the objects in the universe have coherent things going on in them. There are known to be masers in the atmospheres of some stars. It's now thought that a lot of the beaming of pulsars has to do with laser-like action. That's just surmised from the actions of these very mysterious objects--mysterious within the normal realm of incoherent physics. The universe is probably full of coherent physics.
That brings us back to Einstein--experimental results continue to vindicate his viewpoint, no?
The Bose-Einstein condensate, for example, or the quantum Hall effect, or the superconducting quantum interference device--I list ten of them in my book, beginning in the mid-1930s and going up through 1995. Not many of your readers will have heard of them. But most people know what lasers and superconductors are, and they demonstrate nature acting in ways that Bohr and Heisenberg did not anticipate--a coherent state. Unfortunately, it was not until the 1960s that those results became widely known. So Einstein didn't have that information. He predicted coherent phenomena, but he didn't have a single example that he could actually get his hands on.
So orthodoxy won the day.
And after Bohr defeated Einstein, nobody else would take on the argument. Because if they put Einstein under, think what they would do to you.
And yet it all turned on some very open questions...
Einstein's basic point was that unpredictability does not mean intrinsic uncertainty. His other complaint was that Bohr was removing understanding from the field of physics. Bohr argued quite passionately that intuitive understanding was just not possible any more, and that you were old-fashioned if you insisted on it.
And so mathematical description was substituted for understanding?
Absolutely. It's conceptual nonsense. You can calculate stuff with the theory, but the words people put around it don't make any sense. That had the effect of driving the more conceptually-oriented students out of physics. We have ended up with more and more mathematicians in the physics departments. Don't get me wrong, there is nothing wrong with mathematics--it's the language we use to express the precise relations of physical law. But there is an increasing tendency to mistake the language for the physics itself. Once we lose the conceptual foundations, the whole thing becomes a shell game. There are very few conceptual workers left in the field. Feynman was one of the last ones, and he wasn't willing to take on the Copenhagen clan. Nobody was, until we come to A. O. Barut, John Dowling, John Cramer, and a few others.
A lot of the trouble seems to come down to the idea of matter being composed of particles, rather than waves.
Point particles got us into terrible trouble. If you take today's standard theory of particle physics, and the standard theory of gravitation, it is well known that the result is "off" by a factor of maybe ten to the power of 50. That's a 1 followed by 50 zeroes. The amount of matter the theory predicts for the universe is way, way more than what is observed. And that discrepancy comes, at its heart, from assuming that matter is made up of point particles.
What's the problem with them?
Because point particles are assumed to occupy no space, they have to be accompanied by infinite charge density, infinite mass density, infinite energy density. Then these infinities have to be removed by something called "renormalization." It's all completely crazy. But our physics community has been hammering away at it for decades. Einstein called it Ptolemaic epicycles all over again.
Hold on... epicycles?
Ptolemaic astronomers assumed that the earth was at the center. But then it became more and more complex to calculate the orbits of visible planets. When you assume the earth is the center, you have to add epicycles to the existing orbits to adjust them. In the same way, when you assume photons are point particles, and all you can calculate is probability, you have to add epicycles of conceptual nonsense to "explain" even the simplest experiment.
So when results don't fit theory...
The theory has to be adjusted, with band-aids stuck on top of one another. This happens all the time with science, but especially with the statistical quantum theory. It takes enormous work to take that theory and work it into a form that is useful for anything except those questions that it was initially devised for. And the band-aid epicycles are then announced as a triumph for the theory. It's amazing how long they have gotten away with it.
Is there a message in all this?
What this is telling us is that we have simply not been thinking about it right. We have to start working through the whole subject again. And that is going to take real work. I've gotten a little start on various pieces of it. Barut and Dowling got some wonderful results with the hydrogen atom. But there's a whole lot more work to do.
Running through your work is the idea that the deeper thing is probably simpler.
It always worked out that when I understood something, it turned out to be simple. Take the connection between the quantum stuff and the electrodynamics in my book. It took me thirty years to figure out, and in the end, it was almost trivial. It's so simple that any freshman could read it and understand it. But it was hard for me to get there because all of this historical junk was in the way.
Much has been made of the philosophical implications of quantum theory.
Once Bohr and Heisenberg won the scientific debates, they went around pontificating about philosophy.
What was the thrust?
They said that if the quantum world is inherently uncertain, if the only information about basic physics is statistical, then we need to rethink our view of all of reality. In a way it was a throwback to the old arguments between science and religion. Newtonians used the ability to predict the planets' positions as a refutation of standard religion, which said, well, "God puts them where he wants and you just have to have faith about that." Religion didn't need to take a stand against Newton, but it chose to, starting with Galileo. And this terrible polarization set in.
So quantum theorists took us back to the unknowable, where things have to be taken on faith or on authority?
Yes, but as we look out at the universe today, there's nothing that makes it anything but more awesome. In fact, as we look back at those pictures, we think, "Now how could anyone who had any deep sense of faith believe in a God that would make stars by punching little holes in a cardboard sky?"
What was anti-religious about the Newtonian view? He was personally religious.
Nothing, but his followers framed the issue as, "If you can predict it, that shows that religion is wrong." The quantum theorists reopened the question as "No, you can't predict it, because it's basically statistical."
You could say that for some people, the predictability of nature undermined faith in God (although it needn't have done so). Quantum uncertainty undermined faith in science.
I think Einstein was being a scientist in the truest sense in his response to the Copenhagen interpretation. He said that none of us would be scientists if deep down we didn't believe there is a set of regularities in the operation of physical law. That is a matter of faith. It is not something anybody has proven, but none of us would be scientists if we didn't have that faith.
What you're saying is that in a rush to declare science complete, Bohr & co. essentially defined away a key assumption of science?
Faith in physics was undermined. Generations of students were basically driven out of physics because it was no longer comprehensible.
While theory was ailing, though, people were devising all kinds of interesting experiments and practical devices.
It was indeed a time of enlightenment for the experimental side--we had to go off and make our own picture of the world. We got ideas about what experiments would be interesting and went ahead with them. Tony Siegman's book Lasers is the definitive treatment of the device that underlies the whole field of fiber optics. He shows that the statistical quantum assumption just gets in the way. In a 1,200-page tome, he hardly even mentions photons.
What is the reaction in the profession to what you are saying?
People are trying to figure out what to make of it. People like the idea that there is a simpler way of thinking about this, but it's a lot to get your head around. The world is full of specialists nowadays, and there aren't that many people any more who try to understand large fractions of what physics is about. So it is going to take time for people to realize this is a much simpler way to teach physics, and that they can grasp a lot more of it than by today's method. And some people have said, "This is great--it never made any sense to me, which is why I quit being a physicist."
You've crossed over into biology yourself--building silicon retinas and cochleas. And this is leading to some real revolutions--super-high-resolution cameras and hearing aids with greatly improved intelligibility. Can you tell us a bit about that?
Sonic Innovations is a company whose hearing aid, for the first time, uses our full knowledge of the human auditory system.
And Foveon, your camera outfit?
Foveon is about making the finest photographic images that have ever been made. We have about 60 employees, some of the most creative people I have ever worked with. We've been making our own low-volume, high-end cameras for two years. Now, the technology is just beginning to go into name brand cameras. You will be amazed!
Does it use coherence?
Every semiconductor derives its properties from the coherent nature of the electrons in it. The Foveon sensor uses these properties in a more fundamental and powerful way than other photosensors.
The computer industry has thrived by doing well what humans do badly, namely calculation. But computers seem to do badly what humans do well--speech, movement, perception.
The effort to build neurally inspired hardware has been much heavier going than I thought.
You write, "Biological solutions are many orders of magnitude more effective than those we've been able to implement using digital methods." You write about the fruit fly as an embarrassment, because its sensory abilities so vastly outstrip the most powerful computer. What's going on?
The fly has an autonomous system that avoids being swatted. It has the ability to see and navigate and make decisions on millisecond time scales. We've never been able to make artificial vision systems that come within orders of magnitude of that, with all the computation we can throw at them.
Why not?
That's what I was trying to find out. It makes us look so stupid. And you don't get popular by saying that. But it's true. And the more we try, the more we realize it's a much harder problem than we thought. What is it about the way that the fly, or the cat, or the fish process their information that makes it so much more effective at computing these things? They use what seems like really slow, slimy computational material, and yet they perform miracles with tiny amounts of power, tiny amounts of space and in real time and very fast.
What's the problem?
We don't know how even to formulate that problem, and we've been working on it since the dawn of computing. Every time we get another order of magnitude in computing capability, somebody says, "Now we've got enough!" But we haven't begun to get it.
It could be that when you find out what's really going on, you'd be even more in awe.
As I have found out more about what's going on, I have become more in awe. I'm amazed, for example, by the chemical complexity of neurological processes. They're not just digital or analog--they're chemical and physical, with dimensions that we do not understand at all.
Now if your faith is correct, behind that awesome complexity lies some simple set of rules. No?
I think there are principles. And I think there are principles of computation that get us this exponential advantage, which don't have to do with whether you do it with chemicals or electronics.
Are you saying, in effect, architectural principles?
You bet. I thought many times that I was on the verge of getting a hold of one of those. I haven't been able to make a crisp statement of one yet, but I feel on the verge. Every time I talk to the biologists, I get all charged up again.
Does biology have a problem analogous to the physics problem--lots of people barking up trees, and not many looking at the forest?
Every scientific discipline does. Our establishment rewards that kind of behavior. It's very, very hard to ask the deeper questions, because you won't get tenure that way.
For years, artificial intelligence research has pursued an approach that comes down to "If we can just write enough code, we can figure out how to make the thing do logic and how to solve problems..." It hasn't worked very well.
I think it just totally failed. Those AI systems can't see. They can't hear. They can't act. And they can't learn. Looking at the principles used by living systems has been much more successful. There have been recent successes in recognizing faces, fingerprints, things like that. The best results I have seen in reverse-engineering the brain have been the auditory processors done by my friend and collaborator Lloyd Watts. He has made remarkable progress by working with auditory neurobiologists and realizing the architecture of a much more capable hearing system in computational form. That's one to watch.
And vision?
Silicon sensors have been built that can recognize motion. But to distinguish between a computer and a car--that is a really, really hard problem. And yet we do it effortlessly, and so do flies. So we don't really know how to ask the question yet.
Sounds like the gluon researchers might be closer.
Oh, I would say so. It's more likely that we will figure out first if there's missing matter in the universe. If so, what it is. And if not, what's wrong with the general theory of relativity. We'll figure that out before we figure out the brain. It's just a really hard problem.
So we shouldn't expect machines to take over any time soon.
Don't lose sleep over it. Anybody who says, "Oh my God. These things are going to take over!"--it is just so far from anything real. People don't even know where to put the decimal point.
Do you have any thoughts about gravitation?
Yes, I've been working on it quite actively. It's funny--the most common force, everyone experiences it, and we just have no clue. It's fascinating when you think about it. The two long-range forces that we have in nature are the electromagnetic force and the gravitational force. The first we understand better than anything in physics, and yet gravity--we basically have no clue what it is. It doesn't fit with any of the other theories. It just gets pasted on. It's really an acute embarrassment.
So there are still lots of mysteries in nature.
We are all just struggling our way in this wonderful realm of nature that we know really very little about. Feynman has this wonderful quote about how the "theory of gravity" once was that the planets were being carried along by a whole flock of invisible angels. Then we ended up with a theory that it is this force between two masses that pulls at right angles to the motion. So he said what we have done is we have gone back to the invisible angels except now they are pushing at a 90-degree angle to the motion.
Not angels but angles...
Once angels were the explanation, but now, for us, it is a "force," or "field." But these are all constructs of the human mind to help us to work with and visualize the regularities of nature. When we grasp onto some regularity, we give it a name, and the temptation is always to think that we really understand it. But the truth is that we're still not even close. Isn't it wonderful that nature is like that? It would be so dreadful if nature were so dull that we, with our pathetic little prejudices, had it all figured out already.