
Ever since Austrian scientist Erwin Schrödinger put his unfortunate cat in a box, his fellow physicists have been using something called quantum theory to explain and understand the nature of waves and particles.

But a new paper by physics professor Andreas Albrecht and graduate student Dan Phillips at the University of California, Davis, makes the case that quantum fluctuations are actually responsible for the probability of all actions, with far-reaching implications for theories of the universe. Quantum theory is a branch of theoretical physics that strives to understand and predict the properties and behavior of atoms and particles. Without it, we would not be able to build transistors and computers, for example. One aspect of the theory is that the precise properties of a particle are not determined until you observe them and, in physics parlance, "collapse the wave function."

Schrödinger's famous thought experiment extends this idea to our scale. A cat is trapped in a box with a vial of poison that is released when a radioactive atom randomly decays. You cannot tell if the cat is alive or dead without opening the box. Schrödinger argued that until you open the box and look inside, the cat is neither alive nor dead but in an indeterminate state.

For many people, that is a tough concept to accept. But Albrecht says that, as a theoretical physicist, he concluded some years ago that this is how probability works at all scales, although until recently, he did not see it as something with a crucial impact on research. That changed with a 2009 paper by Don Page at the University of Alberta, Canada.

“I realized that how we think about quantum fluctuations and probability affects how we think about our theories of the universe,” said Albrecht, a theoretical cosmologist.

One of the consequences of quantum fluctuations is that every collapsing wave function spits out different realities: one where the cat lives and one where it dies, for example. Reality as we experience it picks its way through this near-infinity of possible alternatives. Multiple universes could be embedded in a vast “multiverse” like so many pockets on a pool table.

There are basically two ways theorists have tried to approach the problem of adapting quantum physics to the “real world,” Albrecht said: You can accept it and the reality of many worlds or multiple universes, or you can assume that there is something wrong or missing from the theory.

Albrecht falls firmly in the first camp.

“Our theories of cosmology say that quantum physics works across the universe,” he said. For example, quantum fluctuations in the early universe explain why galaxies form as they did — a prediction that can be confirmed with direct observations.

The problem with multiple universes, Albrecht said, is that if there are a huge number of different pocket universes, it becomes very hard to get simple answers from quantum physics to questions such as the mass of a neutrino, an electrically neutral subatomic particle.

“Don Page showed that the quantum rules of probability simply cannot answer key questions in a large multiverse where we are not sure in which pocket universe we actually reside,” Albrecht said.

One answer to this problem has been to add a new ingredient to the theory: a set of numbers that tells us the probability that we are in each pocket universe. This information can be combined with the quantum theory, and you can get your math (and your calculation of the mass of a neutrino) back on track.
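The bookkeeping this "new ingredient" implies can be sketched in a few lines. This is a hypothetical illustration only: the pocket labels, prior weights, and mass values below are invented for the example, not taken from the paper.

```python
# Hypothetical sketch: combining an assumed prior over pocket universes
# with a per-pocket prediction to get an overall expected value.
# All numbers below are invented for illustration only.

# Assumed probability that we live in each pocket universe (the "new
# ingredient" added to the theory); these must sum to 1.
pocket_priors = {"A": 0.5, "B": 0.3, "C": 0.2}

# Hypothetical neutrino mass (in eV) predicted within each pocket.
neutrino_mass_ev = {"A": 0.05, "B": 0.10, "C": 0.02}

# The overall expectation is the prior-weighted average over pockets.
expected_mass = sum(pocket_priors[p] * neutrino_mass_ev[p]
                    for p in pocket_priors)

print(f"Expected neutrino mass: {expected_mass:.3f} eV")
```

The catch Albrecht and Phillips point to is the first dictionary: those prior weights are exactly the numbers that, they argue, have no basis in quantum theory itself.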

Not so fast, say Albrecht and Phillips. While the probabilities assigned to each pocket universe may seem like just more of the usual thing, they are in fact a radical departure from everyday uses of probabilities because, unlike any other application of probability, these have already been shown to have no basis in the quantum theory.

“If all probability is really quantum theory, then it can’t be done,” Albrecht said. “Pocket universes are much, much more of a departure from current theory than people had assumed.”

The paper, currently posted on the arXiv.org preprint server and submitted for publication, has already stimulated considerable discussion, Albrecht said.

“It forces us to think about the different kinds of probability, which often get confused, and perhaps can help draw a line between them,” he said.

Good post, Rick.

“Every so-called instinct — habit of mind trained in by mathematics of continuity — finds QM paradoxical — this is true even of classical statistical mechanics whose fundamentals are deterministic but just too complex to model except probabilistically.”

Actually, it's not because of complexity that QM relies on probability; it has to do with other factors like Heisenberg's uncertainty principle and quanta. Einstein said "God does not play dice," but it turns out he does. With the EPR paper, Einstein predicted entanglement, which he saw as a way of showing that QM was too much of a reach. It was basically the Occam's razor of physics: it seemed simpler to assume "hidden variables" than to accept that entangled particles suddenly "collapse a wave function" and agree on a property across a vast distance (Einstein's "spooky action at a distance"); instead, the particles would simply have been born with the same properties at conception. Alain Aspect's experiment (building on Bell's work) proved QM right and hidden variables wrong.

** How can we know that the bear is in the toy box?

Unfortunately, even well-trained physicists come to quantum mechanics and relativity with two handicaps: (1) as human beings they share a common world in which mom tells us the teddy bear is *really* there even when we can't see him in the toy box; and (2) they first learn that most refined version of bear-there-right-now theory, the q-p mechanics known as "Classical" physics.

Every so-called instinct — habit of mind trained in by mathematics of continuity — finds QM paradoxical — this is true even of classical statistical mechanics whose fundamentals are deterministic but just too complex to model except probabilistically. (Rolling dice big time with 10^23 particles in a mole of a perfect gas.)
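The point about classical statistical mechanics can be made concrete with a minimal sketch: the underlying "mechanics" of a die roll is deterministic, but with the variables untracked we model it probabilistically, and averaging many trials recovers the probabilistic prediction (the mean of a fair six-sided die is 3.5). This toy simulation is my own illustration, not from the comment.

```python
# A minimal sketch: we cannot track every variable of a tumbling die
# (or of 10^23 gas molecules), so we model the system probabilistically.
# Averaging many simulated rolls recovers the probabilistic prediction.
import random

random.seed(0)  # fixed seed so the run is reproducible

rolls = [random.randint(1, 6) for _ in range(100_000)]
mean = sum(rolls) / len(rolls)

print(f"Mean of {len(rolls)} rolls: {mean:.2f}")  # close to 3.5
```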

What is "reality" as ordinarily perceived — even augmented with instruments until about 100 years ago — but a blurring, a sfumatura of discontinuous events? The pedagogy of physics is still mired in retracing the history of science — it's part of a needless textbook tradition. It's no accident that the great 19th century 'anomalies' — the photoelectric effect, light spectra, black-body radiation — wedged open a way to admitting into physics the long forbidden jumps of Nature — natura non facit saltum.

“Ordinary” reality does not extend to very great distances or to very small distances — what is “normal” for a proton being whipped into 0.999c at SLAC and thereby enjoying a longer life is not paradoxical; it’s just life in the fast lane of special relativity.

What is “normal” at atomic and sub-atomic dimensions is captured by QM, refined yet again by relativistic QM and QFT, quantum field theory. Determinism turned out not to be an irreducible feature of nature, no need for a Kantian category to keep God’s nose clean in the matter of alleged human free will — natura facit saltum in reality without regard to human wishes about continuity, causality, predictability, uniformity. Dogmas after all.

Ordinary language did not develop to function at near light velocities nor in a quantum world — applying ordinary language metaphors to them fails. Maybe, one day, the language models needed to express truths about the quantum world — mathematical models and experimental exemplars — will arise from within these more 'arcane' sciences in ways which can be expressed in a more developed ordinary English. But we're far from that point.

So, as long as quantum entanglement, for example, strikes even an Einstein as “spooky action at a distance”, all lesser minds will still wonder how to cognize reality coherently on the basis of what QM says and demonstrates.


I also have a fundamental problem with the idea of "probability in the system," and have yet to see any solid reason why I should believe otherwise. The concept of probability came into existence as a method of trying to predict the outcome of complex systems for which all the variables cannot be reasonably tracked. Once you have the capability to track all the factors, and of course a total understanding of their interactions, you have certainty of outcome.

On the subject of collapsing waveforms though, some have proposed that *consciousness* is the effect required to cause the collapse. Whether or not there’s anything in that idea, I find it fascinating. The tree truly may not make a sound if there’s nobody there to hear it.

Schrödinger's thought experiment is often misunderstood. Does the cat remain in a state of superposition (both alive and dead) until 'observation'? First of all, it isn't 'human' or 'intelligent' observation that counts, it is observation by the universe. A person isn't 'collapsing the wave function' by looking or measuring. Believe it or not, the universe exists with or without us as it is. Secondly, regular macro-scale objects can't inherit the quantum state of a quantum-scale object. The outcome of a quantum effect can be applied to a macro-scale object, but the cat's fate was long determined and resolved before anybody opened the box.

That said, quantum mechanics does involve a degree of chance and probability, but that doesn't mean everything we don't understand well enough to predict (and hence call 'chance') is due to quantum effects. For example, with enough knowledge and processing power I could predict the outcome of a 6-sided die toss in advance. The fact that I can't reliably do that, because my senses and brain aren't fast or powerful enough, doesn't mean quantum effects determined the outcome.
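The commenter's distinction between classical 'chance' (mere ignorance) and genuine indeterminacy can be illustrated with a seeded pseudorandom die. This analogy is my own, not from the comment: the generator is fully deterministic, so anyone who knows its internal state (the seed, playing the role of 'hidden variables') can predict every 'random' roll before it happens.

```python
# Sketch: classical "chance" as ignorance, not intrinsic indeterminacy.
# A seeded pseudorandom die is fully deterministic: knowing the internal
# state (the seed) lets an observer predict every "random" roll.
import random

def roll_sequence(seed, n):
    """Roll a virtual 6-sided die n times from a known starting state."""
    rng = random.Random(seed)
    return [rng.randint(1, 6) for _ in range(n)]

observed = roll_sequence(seed=42, n=5)   # what the die-thrower sees
predicted = roll_sequence(seed=42, n=5)  # an observer who knows the seed

print(observed == predicted)  # identical: the outcome was never up to chance
```

Quantum probabilities, by contrast, are claimed to be irreducible: on the standard view there is no seed to know.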