Is information, or constraints on inference, all there is?

“What is information? Is it physical? We argue that in a Bayesian theory the notion of information must be defined in terms of its effects on the beliefs of rational agents. Information is whatever constrains rational beliefs and therefore it is the force that induces us to change our minds.” — Ariel Caticha (eprint: 0710.1068)

“Perhaps physics is nothing but inference after all.” — Ariel Caticha (eprint: 0808.1260)

“Physics is the ability to win a bet.” — Attributed to J. R. Buck by C. A. Fuchs (eprint: quant-ph/0105039, p. 125)

Some theories present us with intriguing conceptual puzzles. This is the case with probability theory and statistics. The notion of ‘probability’ was originally introduced in the study of games of chance, where players who are uncertain about the outcomes of a game need to decide on a strategy. Over time, and especially with the development of mathematical statistics, the interpretations of probability have diversified and become the subject of controversy. In many applications, e.g., statistical mechanics or population genetics, it is very natural (though some would say incorrect) to think of probabilities as representing something objective, like a relative frequency or a propensity. Since statistical mechanics and population genetics are about objective features of the macroscopic world and of evolving populations, respectively, it would be surprising if a concept that plays an important role within these theories represented something subjective. In other applications, e.g., futures markets and sports betting, it seems most natural to think of probabilities as representing subjective expectations and the (in)completeness of available knowledge. I won’t say anything here about the interrelations, strengths and weaknesses of the different interpretations. I’ll instead focus on interpretations that are more or less in line with the original idea of probability as related to someone’s uncertainty about events.

Bayesianism is the modern form of the view that probabilities represent an agent’s degrees of belief. Bayesians differ among themselves in whose degrees of belief they have in mind and in the extent to which degrees of belief are taken to be subjective or objective. At one end of the range of Bayesian views we find the idea that probabilities are the degrees of belief an idealized, fully rational inference machine should have. At the other end is the idea that probabilities are entirely subjective; rational agents must represent their degrees of belief as probabilities (otherwise they would not count as rational, on this view), but the values of the probabilities can only be assigned subjectively. An important addition to Bayesianism was introduced in the 1950s by Jaynes, who argued that the maximum entropy principle used in statistical mechanics to derive the probabilities of microscopic events in a system at thermodynamic equilibrium is, in fact, a general principle of statistical inference. Far from being restricted to microscopic events and equilibrium, this principle is applicable in virtually every case in which we need to assign probabilities (and a maximizer of the entropy actually exists). Define a sample space, list the known or assumed constraints on the probability distribution, and maximize the Shannon entropy of the probability distribution subject to those constraints (a small worked example follows below). The maximizer of the Shannon entropy is the probability distribution you should use and adjust your degrees of belief to, according to Jaynes and the many people who find his arguments convincing.
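To make this recipe concrete, here is a minimal numerical sketch of Jaynes’ classic ‘Brandeis dice’ exercise: maximize the Shannon entropy of a distribution over the six faces of a die, subject to a constraint on the mean face value. The target mean of 4.5 and the scipy-based setup are my own illustrative choices, not taken from any of the texts cited here.

import numpy as np
from scipy.optimize import minimize

faces = np.arange(1, 7)        # sample space: the faces of a die
target_mean = 4.5              # assumed constraint: the expected face value

def neg_entropy(p):
    # Negative Shannon entropy, sum_i p_i log p_i (clipped to avoid log 0).
    p = np.clip(p, 1e-12, 1.0)
    return float(np.sum(p * np.log(p)))

constraints = (
    {"type": "eq", "fun": lambda p: np.sum(p) - 1.0},                 # normalization
    {"type": "eq", "fun": lambda p: np.dot(p, faces) - target_mean},  # mean constraint
)
p0 = np.full(6, 1.0 / 6.0)     # start from the uniform distribution

result = minimize(neg_entropy, p0, bounds=[(0.0, 1.0)] * 6, constraints=constraints)
print(np.round(result.x, 4))   # probabilities grow (roughly exponentially) with face value

The Lagrange-multiplier treatment gives the same answer in closed form: the maximizer has the exponential (‘Gibbs-like’) form p_i proportional to exp(-λ x_i), with λ fixed by the mean constraint.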

This is all very reasonable; probabilities do satisfy conditions that make them suitable as (idealized) representations of degrees of belief, and the maximum entropy criterion has appealing properties as well. Where it, to my mind, fails to be convincing is on the issue of uniqueness and generality: there are other ways to formalize reasoning under uncertainty, and there are infinitely many alternative entropy measures besides the Shannon entropy. (There is, for instance, an ongoing discussion in the physics literature about a statistical mechanics of metastable systems based on Tsallis entropy rather than Shannon entropy; the formula is given below.) It’s even possible to reverse the direction of the maximum entropy principle: first choose a probability distribution, then determine an alternative entropy measure for which it is the maximizer! Though I’m not convinced by the Bayesian maximum entropy philosophy, I find it almost endlessly intriguing. The reason for my fascination is the sheer generality Jaynes and his successors attribute to Bayesianism and maximum entropy. Jaynes did not accept any counter-example to the view that probabilities are degrees of belief, not physical properties. He argued against relative frequency interpretations of probability in statistics, and he showed explicitly that much of equilibrium statistical mechanics can be cast as an application of his maximum entropy principle. Indeed, he would probably say that all of equilibrium statistical mechanics can be obtained this way, though I wonder whether the strange order parameters (e.g. staggered magnetic fields in the Ising model) that sometimes arise in connection with phase transitions can be understood in this framework. In regard to quantum mechanics, which many think of as an indeterministic theory about physical probabilities, he argued, less convincingly, that we just haven’t yet found the more fundamental theory that will succeed quantum mechanics and be consistent with the Bayesian maximum entropy principle.
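For concreteness, the Tsallis family is one standard example of such an alternative; for a real parameter q it is usually written as

S_q(p) = \frac{1}{q-1} \left( 1 - \sum_i p_i^{q} \right), \qquad S_q(p) \to -\sum_i p_i \ln p_i \quad \text{as } q \to 1,

so the Shannon entropy is recovered only in the limit q → 1, and nothing in the formula itself singles out that value without further argument.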

That was just the beginning of the claims to generality. Most later Bayesians have accepted that probability theory alone is inadequate for reproducing quantum mechanics. It seems impossible to fully reproduce quantization, the non-existence of a joint probability distribution over non-commuting quantities, EPR correlations, etc. within probability theory. Even the holiest principle of all, Bayes’ rule (or, more properly, probabilistic conditionalization as the update rule), needs to be modified to fit quantum mechanics. (Update: Corrected mistakes in the discussion of Bayes’ rule.) When non-commuting variables are considered, the non-existence of a joint probability distribution prevents Bayesian conditionalization. Only when attention is restricted to a single complete set of commuting variables, which defines a basis in which to expand quantum states, does Bayes’ rule hold. The most promising Bayesian way forward is to seek a generalization of the mathematical framework of probability theory that can reproduce quantum mechanics and remain consistent with a (lightly modified) Bayesianism. A radical proposal along these lines has been put forth by Fuchs (and co-workers). Fuchs takes quantum states to be entirely subjective “states of belief”, and the probabilities that one can compute from quantum states are thus degrees of belief. Like many other Bayesians, Fuchs understands degrees of belief in terms of betting behavior that a psychologist could in principle study by presenting a number of gambling situations and noting which odds and sides of bets are accepted. Compared to ordinary Bayesianism, Fuchs’ quantum Bayesianism differs in that states of belief are vectors in a Hilbert space, not probability distributions, but the interpretation of probabilities remains the same. Also, Bayes’ rule is of course replaced by the quantum rule for updating states after measurements (actually, Fuchs considers a more general kind of measurement and needs a correspondingly more general replacement for Bayes’ rule). It would be odd to have an objective time-evolution for a subjective state of belief, so the subjectivity of time-evolution and of updating after measurements is also embraced. After this, not much is left of quantum mechanics that represents objective features of the world. Quantum mechanics has, at the hands of Fuchs, become a framework for agents to do the accounting work related to their subjective inferences rather than a descriptive theory of how the world behaves. Timpson summarizes the philosophical view:

“In the quantum Bayesian picture, quantum mechanics is the theory which, it is urged, should not be thought of in standard realist terms; either, for example, by being a realist about the quantum state (e.g., Everett, GRW) or by seeking to add further realist components to the formalism (e.g., hidden variable theories, modal interpretations, consistent histories). Rather, we are invited to recognise quantum mechanics as being the best we can do, given that the world will not admit of a straightforward realist description. That best, it is suggested, is not a theory whose central theoretical elements—quantum states, measurements and general time evolutions—are supposed to correspond to properties or features of things and processes in the world; rather, it is a structure which is to be understood in broadly pragmatic terms: it represents our best means for dealing with (that is, for forming our expectations and making predictions regarding) a world which turns out to be recalcitrant at a fundamental level; resistant to our traditional—and natural—descriptive desires.

But we need not give up our realism on account of this! Rather, the project must be transformed. Granted, our traditional realist descriptive project has been stymied: we are to take seriously the suggestion that quantum mechanics (with any of the paraphernalia of familiar realist interpretations of the formalism eschewed) is the best theory that one can arrive at. Better—and closer to the descriptive ideal—cannot be achieved, runs the thought. But if this is so then our realist desires must be served indirectly. One need not give up the task of getting a handle on how the world is at the fundamental level just because no direct description is possible; one can seek an indirect route in instead. Quantum mechanics may not be a descriptive theory, we may grant, but it is a significant feature that we have been driven to a theory with just this characteristic (and unusual) form in our attempts to deal with and systematize the world. The structure of that theory is not arbitrary: it has been forced on us. Thus by studying in close detail the structure and internal functioning of this (largely) non-descriptive theory we have been driven to, and by comparing and contrasting with other theoretical structures, we may ultimately be able to gain indirect insight into the fundamental features of the world that were eluding us on any direct approach; learn what the physical features are that are responsible for us requiring a theory of just this form, rather than any other. And what more could be available if the hypothesis that the world precludes a direct description at the fundamental level is true? All we can do is essay this ingenious indirect approach.”

Timpson goes on to argue that this view is not as unreasonable as it may seem at first sight, addressing many objections. However, he ultimately rejects it because it is hard to arrive at a satisfactory understanding of the more explanatory results of quantum mechanics on this view. It is hard to swallow that the difference between an insulator and a conductor (explained quantum mechanically in terms of band gaps between occupied and unoccupied one-particle states or, at a deeper level, in terms of localization of the wave function) is to be understood in terms of entirely subjective beliefs. One can say the same about the quantum mechanical treatment of surprising phenomena like the Aharonov-Bohm effect. Bub and Clifton et al. formulate their own, less radical, versions of quantum Bayesianism.

Not even classical physics is beyond the ambitions of present-day Bayesians. In a number of articles, Caticha has explored the possibility of deriving classical mechanics and the general theory of relativity from the Bayesian maximum entropy principle. Caticha and Cafaro, for example, assume a natural metric on the probability distributions over configuration space (particle coordinates; see the sketch below) and show that the most likely trajectory satisfies Newton’s equation of motion. Particle mass is related to the width of a Gaussian-like probability distribution in this model. It will be interesting to watch future developments in this area.
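By way of orientation, the standard ‘natural metric’ on a family of probability distributions p(x|θ) in information geometry is the Fisher-Rao information metric,

g_{ij}(\theta) = \int dx \, p(x \mid \theta) \, \frac{\partial \ln p(x \mid \theta)}{\partial \theta^{i}} \, \frac{\partial \ln p(x \mid \theta)}{\partial \theta^{j}} ,

though whether Caticha and Cafaro adopt exactly this normalization is a detail I leave to the cited papers. For a Gaussian of fixed width σ, for instance, the metric on its mean is just 1/σ² times the Euclidean one, which gives a flavor of how the width of a Gaussian-like distribution can end up playing the role of a mass-like parameter.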

Finally, to relate this post to a previous one: I think these attempts to delegate as much of physics as possible to statistical inference, in particular inference of the Bayesian variety, show just how much metaphysics is left underdetermined by science.

References

J. Bub. Quantum probabilities as degrees of belief. Stud. Hist. Phil. Sci. B 38:232 (2007)

A. Caticha. From Inference to Physics. eprint: 0808.1260 (2008)

A. Caticha and C. Cafaro. From Information Geometry to Newtonian Dynamics. eprint: 0710.1071 (2007)

A. Caticha. The Information Geometry of Space and Time. eprint: gr-qc/0508108 (2005)

R. Clifton, J. Bub and H. Halvorson. Characterizing quantum theory in terms of information-theoretic constraints. Found. Phys. 33:1561 (2003), eprint: quant-ph/0211089

C. A. Fuchs and R. Schack. Quantum-Bayesian Coherence. eprint: 0906.2187 (2009)

C. A. Fuchs. Quantum Mechanics as Quantum Information (and only a little more). eprint: quant-ph/0205039 (2002)

C. G. Timpson. Quantum Bayesianism: A study. Stud. Hist. Phil. Sci. B 39:579 (2008), eprint: 0804.2047
