This Nobel Prize winner’s little-known technique may lead to a quantum theory of gravity
The Parisi-Wu stochastic quantization method connects classical and quantum physics through a fifth dimension.
This year’s Nobel Prize in Physics goes to two climate scientists and a stochastic physicist. Most of the news media will be writing about climate science winning the Nobel, but underlying much of that climate work is the basic science of statistical physics, and it is for that work that the prize has rightly gone to Giorgio Parisi.
Giorgio Parisi is known for quite a few contributions to physics, both classical and quantum, but in this article I want to focus on his quantization technique, which he introduced with Yong-Shi Wu in 1981 and which has ever since been known as Parisi-Wu stochastic quantization.
Quantization is simply a mechanism for turning a classical theory into a quantum one. While quantum physics has a lot more to it than quantization, including special fields and groups, none of it works without quantization.
It is quantization that, for example, turns a simple one-dimensional spring system into a quantum harmonic oscillator, with the astonishing result that energy becomes quantized, that is, appears only in discrete levels.
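For the record, the allowed energies of the quantum harmonic oscillator form the textbook ladder (a standard result, in standard notation, with ħ Planck’s reduced constant and ω the oscillator frequency):

```latex
E_n = \hbar\omega\left(n + \tfrac{1}{2}\right), \qquad n = 0, 1, 2, \ldots
```

Nothing between two rungs of this ladder is allowed, no matter how the spring is shaken.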
The first quantization methods were based on operator theory. You had a wavefunction which represented the state of a system. The wavefunction is like a point that lives in a potentially infinite-dimensional space called a Hilbert space. The energy of the wavefunction is represented by an operator, which is like an infinite matrix acting on points in that infinite-dimensional Hilbert space. Matrices in linear algebra act on vectors (which can represent points in a space) and transform them linearly into other vectors. You can, for example, have a rotation matrix that rotates a vector. You can also stretch and reflect vectors. An operator that acts on a point in a Hilbert space likewise transforms it. In the case of Schrödinger’s equation, the energy operator, called the Hamiltonian, transforms the wavefunction at one time into the wavefunction at a later or earlier time.
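To make the matrix picture concrete, here is a small sketch of my own (a two-level toy system with a Pauli-x Hamiltonian and ħ = 1; all of these choices are mine, not from the article) showing a Hamiltonian operator evolving a wavefunction in time:

```python
import numpy as np
from scipy.linalg import expm

# In a two-dimensional Hilbert space, an operator is literally a 2x2 matrix
# acting on the state vector. The Hamiltonian generates time evolution:
# psi(t) = exp(-i H t / hbar) psi(0), with hbar = 1 here.

H = np.array([[0.0, 1.0],
              [1.0, 0.0]])                    # a simple toy Hamiltonian (Pauli-x)
psi0 = np.array([1.0, 0.0], dtype=complex)    # start in the first basis state

for t in (0.0, np.pi / 4, np.pi / 2):
    psi_t = expm(-1j * H * t) @ psi0          # evolve the wavefunction to time t
    print(f"t = {t:.3f}  probabilities: {np.abs(psi_t) ** 2}")
```

The state vector turns inside the Hilbert space much as an ordinary vector turns under a rotation matrix.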
Quantization in Schrödinger’s scheme involves replacing the state of an object with a wavefunction and extracting knowledge about that object by applying operators to it. For example, the creation and annihilation (level-up and level-down) operators can increase or decrease the energy level of a wavefunction state.
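And here is a sketch, again my own construction (a 5-level truncation of the oscillator basis with ħω = 1), of those ladder operators written out as matrices:

```python
import numpy as np

# Ladder operators in a truncated harmonic-oscillator basis, using the
# standard matrix elements a|n> = sqrt(n)|n-1>.

n_levels = 5
a = np.diag(np.sqrt(np.arange(1, n_levels)), k=1)   # annihilation (level-down)
a_dag = a.T                                          # creation (level-up)

ground = np.zeros(n_levels)
ground[0] = 1.0
print("a_dag |0> =", a_dag @ ground)                 # raises |0> to |1>

H = a_dag @ a + 0.5 * np.eye(n_levels)               # H = a†a + 1/2
print("energy levels:", np.diag(H))                  # 0.5, 1.5, 2.5, ...
```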
Constructing Hamiltonians for complex objects isn’t always easy. Some complex systems, atoms for example, have Hamiltonians represented by random matrices, which live in a statistical ensemble rather than taking one particular fixed form.
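A random-matrix Hamiltonian is easy to sketch numerically. Here is an illustrative draw from the Gaussian orthogonal ensemble (size and seed are my choices); any single draw is meaningless on its own, only the statistics of its energy levels matter:

```python
import numpy as np

# Draw one sample Hamiltonian from a Gaussian orthogonal ensemble (GOE),
# the classic statistical stand-in for a complex system's Hamiltonian.

rng = np.random.default_rng(2)
N = 200
A = rng.standard_normal((N, N))
H = (A + A.T) / 2                        # symmetrize to get a GOE-style sample
levels = np.linalg.eigvalsh(H)           # its "energy levels"
print("first few level spacings:", np.diff(levels)[:5])
```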
Another issue is that the Schrödinger approach (called canonical quantization) ignores a fundamental feature of physics: the principle of least action. The principle of least action has been a core component of physics since Newton’s day and was formalized in the 18th century. All physical systems in classical physics follow a path of least (more precisely, stationary) action.
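In the standard notation (which I’m supplying here, since the article stays in prose), the principle says the classical path makes the action, the time integral of the Lagrangian L, stationary:

```latex
S[q] = \int_{t_1}^{t_2} L\left(q, \dot{q}, t\right)\, dt, \qquad \delta S = 0
```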
There is a clear correspondence between action and Hamiltonian in classical physics (the two are related through the Lagrangian by a Legendre transform), but that correspondence is lost in canonical quantization. How to transform the Hamiltonian into an action was a source of consternation to a young Richard Feynman.
In the 1940s Feynman developed his “path integral” quantization method as his Ph.D. thesis. The path integral provided the connection between the quantum Hamiltonian as an operator and the principle of least action. Feynman showed that particles, and all quantum systems, behave much like stochastic systems (systems containing random noise) in classical physics. They do not obey the principle of least action exactly but instead have a distribution of paths around the least-action one. The least-action path is the most probable path in that distribution, so the paths particles follow are selected randomly around it and interfere with one another destructively and constructively, like waves, in the “path integral”.
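In the standard textbook form (my notation, with S the classical action), the path integral sums a phase over every conceivable path between two points:

```latex
\langle x_f, t_f \mid x_i, t_i \rangle = \int \mathcal{D}[x(t)]\; e^{\,i S[x]/\hbar}
```

Paths near the least-action path have nearly equal phases and add constructively; paths far from it cancel out.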
Most physicists stuck to canonical quantization because it more closely resembles Newtonian mechanics, while Feynman’s approach looked like statistical mechanics, a bogeyman to some. Most physics students don’t see Feynman’s approach until their third quantum course, when they study quantum field theory.
In the 1950s Mikio Namiki began to associate Feynman’s path integral with random path dynamics, not just statistics. While quantum physics was being developed, mathematicians and physicists, among them Langevin, Fokker, Planck, Kac, and, yes, Feynman again, were making headway into the study of random paths in classical physics. These random paths were the underpinning of the theory of Brownian motion, which Albert Einstein had explained in one of his 1905 papers, the same year as his papers on special relativity and the photoelectric effect.
Paul Langevin developed the mathematics of random behavior, in which a classical system is described as having both a smoothly changing, non-random part based on its principle of least action and a random noise component that forces it to vibrate around that path. Fokker, Planck, Feynman, and Kac associated that randomness with a probability distribution that changes with time. So, for example, if I throw a ball, its smooth motion is disturbed by random eddies of air, cross winds, and convection currents. That randomness can be captured as the noise term in a Langevin equation. The Langevin equation can then be associated with either a Fokker-Planck or a Feynman-Kac equation for its probability distribution, depending on whether I care more about the distribution at the end of its trajectory (Fokker-Planck) or where it likely came from (Feynman-Kac).
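As a concrete sketch (my own toy example rather than the thrown ball: an overdamped particle in a harmonic trap, with stiffness and noise strength chosen arbitrarily), here is a Langevin equation simulated with the Euler-Maruyama method and checked against the stationary distribution its Fokker-Planck equation predicts:

```python
import numpy as np

# Langevin dynamics: dx = -k*x dt (smooth drift) + sqrt(2D) dW (random noise).
# The Fokker-Planck equation for this process has a stationary Gaussian
# distribution with variance D/k, which the simulation should reproduce.

rng = np.random.default_rng(1)
k, D = 1.0, 0.5               # trap stiffness and noise strength (assumed units)
dt, n_steps = 1e-3, 500_000

x = 0.0
xs = np.empty(n_steps)
for i in range(n_steps):
    x += -k * x * dt + np.sqrt(2 * D * dt) * rng.standard_normal()
    xs[i] = x

print("empirical variance      :", xs.var())
print("Fokker-Planck prediction:", D / k)
```

The drift term supplies the smooth motion; the noise term supplies the vibration around it.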
These are sometimes called the Kolmogorov forward and backward equations, respectively, and, when applied to particle positions, the Fokker-Planck equation is called the Smoluchowski equation, which goes to show how many great minds were having the same ideas at around the same time.
The key observation Namiki made was that if you develop a Langevin equation for a quantum path rather than just a particle, evolving in an extra dimension (not time or space as we know it) that Namiki called a fictitious time, you can describe a quantum system. The path integral is then associated with the Fokker-Planck equation of that Langevin equation when the probability distribution is stationary in the fictitious dimension, meaning that the distribution no longer changes as fictitious time passes.
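Written out in the now-standard (Euclidean, ħ = 1) notation, which I’m filling in here, the scheme reads:

```latex
\frac{\partial \phi(x, \tau)}{\partial \tau}
  = -\frac{\delta S_E[\phi]}{\delta \phi(x, \tau)} + \eta(x, \tau),
\qquad
\langle \eta(x, \tau)\, \eta(x', \tau') \rangle
  = 2\, \delta(x - x')\, \delta(\tau - \tau')
```

Here τ is the fictitious time, S_E the Euclidean action, and η a Gaussian noise. As τ → ∞ the Fokker-Planck distribution settles to P[φ] ∝ e^(−S_E[φ]), which is exactly the weight in the Euclidean path integral.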
Many physicists and mathematicians in subsequent decades, Namiki among them, came up with different ways of doing this “stochastic quantization”, but there wasn’t much interest until computers became powerful enough to run simulations based on these equations.
The Parisi-Wu stochastic quantization mechanism is perhaps one of the simplest and is the most widely used today. (Namiki’s is quite cumbersome by comparison but can quantize a few unusual systems that Parisi-Wu can’t.) You simply add the extra “fictitious time” to all the fields in the theory and write down the Langevin equation. From there you can either compute all your probabilities directly or pass to the Fokker-Planck equation.
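Here is a minimal numerical sketch of the recipe (entirely my own construction, not code from Parisi and Wu: the quantum harmonic oscillator with ħ = m = ω = 1 on a small periodic Euclidean lattice, parameters picked for illustration):

```python
import numpy as np

# Parisi-Wu stochastic quantization of the quantum harmonic oscillator.
# The quantum path x(t) lives on a periodic Euclidean time lattice and
# evolves in *fictitious* time tau under a Langevin equation whose drift
# is minus the gradient of the Euclidean action. Averages over tau, after
# equilibration, reproduce quantum expectation values.

rng = np.random.default_rng(0)
N, a = 64, 0.1                  # lattice sites, Euclidean time spacing
dtau, n_steps = 5e-4, 200_000   # fictitious-time step and step count

x = np.zeros(N)                 # one value of the path per time slice

def grad_S(x):
    # S = sum_i [ (x_{i+1} - x_i)^2 / (2a) + a * x_i^2 / 2 ]
    lap = (np.roll(x, -1) - 2 * x + np.roll(x, 1)) / a
    return -lap + a * x

samples = []
for step in range(n_steps):
    eta = rng.standard_normal(N)
    x += -grad_S(x) * dtau + np.sqrt(2 * dtau) * eta   # Langevin step in tau
    if step > n_steps // 2 and step % 50 == 0:         # discard equilibration
        samples.append(np.mean(x ** 2))

print("stochastic <x^2>:", np.mean(samples))
print("continuum  <x^2>:", 0.5)   # hbar/(2 m omega); small lattice error expected
```

The measured ⟨x²⟩ comes out near the exact quantum value of 1/2, up to small discretization and statistical errors. That is the whole trick: a noisy classical relaxation in the fifth, fictitious dimension computes a quantum expectation value.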
Parisi and Wu proposed the method as a way to compute quantum predictions without doing something called gauge fixing, but the main interest was that it provides a close connection between quantum and classical theory.
In my own research, I have attempted to construct theories where the fictitious time becomes a real dimension. The implications of such a reality are tremendous because, if that dimension becomes real, it means that wavefunctions technically exist spread out across that dimension, perpendicular to space and time. This would also imply that many parallel realities could be realized and interact by simply traveling through that dimension. Yet, not all possible realities would exist, only those that occur via the Langevin equation which is constrained by the system’s classical behavior.
If this is true, then the connection that Parisi and others recognized is far more than a mere mathematical curiosity; it is a fundamental description of a five-dimensional universe. This could have radical implications for quantum gravity.
Parisi, Giorgio, and Yong-Shi Wu. “Perturbation theory without gauge fixing.” Scientia Sinica 24.4 (1981): 483–496.
Namiki, Mikio. Stochastic quantization. Vol. 9. Springer Science & Business Media, 2008.