Wikipedia:Reference desk/Archives/Mathematics/2021 July 2

From Wikipedia, the free encyclopedia
Mathematics desk
Welcome to the Wikipedia Mathematics Reference Desk Archives
The page you are currently viewing is a transcluded archive page. While you can leave answers for any questions shown below, please ask new questions on one of the current reference desk pages.


July 2

E. T. Jaynes and probability interpretation

I keep hearing about E. T. Jaynes having proposed some novel interpretation of probability and written wondrous books and other publications about it. His biography doesn't say much, and the linked pages mention a radical form of Bayesianism without saying what is radical. Is there a TLDR about this? Is it something I should study, if I'm interested in probability and statistics as topics in math? At the moment I have a little bit of understanding of classical probability (I mean à la Kolmogorov) and almost none about statistics. To the extent that I comprehend what Bayesianism is, it doesn't seem like a mathematical topic, but maybe I'm wrong. Thanks. 2601:648:8200:970:0:0:0:23D0 (talk) 08:35, 2 July 2021 (UTC)

Jaynes is mentioned near the end of the article on Bayesian probability. That article gives a summary of Bayesianism if you want details. I think it is a mathematical topic, though perhaps the philosophy of mathematics would be a better classification. I'd say you should definitely know what Bayesianism is, as well as other Probability interpretations, if you're going to study probability and statistics. You can do computations without knowing the underlying philosophy, but without the philosophy the results are just numbers with no interpretation. --RDBury (talk) 09:25, 3 July 2021 (UTC)
This review of Jaynes's book Probability Theory: The Logic of Science gives a capsule description of his "Principle of Maximum Entropy". In a nutshell, where Bayesians of the subjective persuasion freely allow the practitioner to select a prior depending on one's whims or superstitions, the maximum entropy principle assigns a prior in an objective way. I don't know how this compares to other objective approaches to Bayesianism (such as those based on Cox's theorem). Ming Li and Vitányi have established a link between the maximum entropy principle and Kolmogorov complexity.[1]  --Lambiam 09:50, 3 July 2021 (UTC)
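To make the maximum-entropy idea concrete: in Jaynes's well-known dice example, among all distributions on the faces {1,...,6} with a prescribed mean, the entropy maximizer has the exponential form p_i ∝ exp(λi), with λ fixed by the mean constraint. The sketch below (the target mean 4.5 and the bisection bounds are just illustrative choices, not from the thread) finds that distribution numerically:

```python
import numpy as np

# Maximum-entropy assignment for a die whose mean is constrained to 4.5.
# The maximizer has the exponential-family form p_i ∝ exp(lam * i);
# the mean is increasing in lam, so we can find lam by bisection.

faces = np.arange(1, 7)
target_mean = 4.5  # illustrative constraint (a fair die would give 3.5)

def mean_for(lam):
    w = np.exp(lam * faces)
    p = w / w.sum()
    return p @ faces

lo, hi = -5.0, 5.0  # assumed bracketing interval for lam
for _ in range(100):
    mid = (lo + hi) / 2
    if mean_for(mid) < target_mean:
        lo = mid
    else:
        hi = mid

lam = (lo + hi) / 2
p = np.exp(lam * faces)
p /= p.sum()
print(p)          # probabilities skewed toward the high faces
print(p @ faces)  # recovers the constrained mean, about 4.5
```

With no constraint beyond normalization the same principle returns the uniform distribution (λ = 0), which is the sense in which the prior is assigned "objectively" rather than by whim.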
Cox's theorem is the foundation of Jaynes's book - as is noted in Cox's theorem#Interpretation and further discussion. It's not clear that Jaynes is claiming originality for this novel interpretation of probability, but he does offer a lucid exposition of it. catslash (talk) 15:04, 4 July 2021 (UTC)
However, while Cox's postulates provide no guidance regarding the selection of the prior, this is the essence of Jaynes's contribution.  --Lambiam 17:27, 4 July 2021 (UTC)

Thanks all, I think you are saying Jaynes's book is standard Bayesian statistics except he says to use a maximum entropy prior. I had had the impression that some of his readers considered the book to be earth-shattering. I think there is a Bayesian principle/theorem that even if you start with a not-so-good prior, after enough updates your n-times-posterior distribution will converge to the "true" distribution, if that means anything. Anyway I'll look at the book and a few others that I've found. 2602:24A:DE47:BA60:8FCB:EA4E:7FBD:4814 (talk) 04:15, 8 July 2021 (UTC)
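The convergence the last post gestures at is Bayesian consistency (made precise by results such as Doob's consistency theorem and the Bernstein-von Mises theorem). A small sketch, under the assumed setup of a coin with an arbitrarily chosen true bias of 0.7: two very different Beta priors, updated on the same flips by conjugacy, end up with nearly the same posterior mean near the truth.

```python
import random

# Bayesian consistency sketch: a flat prior and a badly misspecified
# prior, updated on the same simulated coin flips, both concentrate
# near the true bias. The true bias 0.7 is an arbitrary assumption.

random.seed(0)
true_p = 0.7
flips = [1 if random.random() < true_p else 0 for _ in range(5000)]

def posterior_mean(a, b, data):
    # Beta-Bernoulli conjugacy: Beta(a, b) prior plus observed heads
    # and tails gives a Beta(a + heads, b + tails) posterior.
    heads = sum(data)
    tails = len(data) - heads
    return (a + heads) / (a + b + heads + tails)

print(posterior_mean(1, 1, flips))   # flat Beta(1, 1) prior
print(posterior_mean(50, 2, flips))  # misspecified prior favoring heads
```

Both posterior means land close to 0.7; the prior's influence (the a and b terms) is washed out as the data counts grow, which is the usual caveat to worrying too much about the prior in large samples.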