# Prior probability

“Prior probability”, “prior odds”, or just “prior” refers to a state of belief held before seeing a piece of new evidence. Suppose there are two suspects in a murder, Colonel Mustard and Miss Scarlet. After determining that the victim was poisoned, you think Mustard and Scarlet are respectively 25% and 75% likely to have committed the murder. Before determining that the victim was poisoned, perhaps you thought Mustard and Scarlet were equally likely to have committed the murder (50% and 50%). In this case, your “prior probability” of Miss Scarlet committing the murder was 50%, and your “posterior probability” after seeing the evidence was 75%.
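The update in this example can be sketched with Bayes' rule. The likelihoods of the poisoning evidence given each suspect are assumed values, chosen here so that the equal priors yield the 25%/75% posteriors above:

```python
# Bayes' rule update for the murder example: two suspects, equal priors.
# The likelihoods P(poisoned | suspect) are illustrative assumptions,
# picked to reproduce the 25% / 75% posterior in the text.

priors = {"Mustard": 0.5, "Scarlet": 0.5}           # P(H): prior probabilities
likelihoods = {"Mustard": 0.25, "Scarlet": 0.75}    # P(e | H): assumed

# P(e) = sum over hypotheses H of P(e | H) * P(H)
evidence_prob = sum(priors[h] * likelihoods[h] for h in priors)

# P(H | e) = P(e | H) * P(H) / P(e)
posteriors = {h: priors[h] * likelihoods[h] / evidence_prob for h in priors}

print(posteriors)  # {'Mustard': 0.25, 'Scarlet': 0.75}
```

Any pair of likelihoods with the same 1:3 ratio would give the same posteriors; only the relative strength of the evidence matters here.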

The prior probability of a hypothesis $$H$$ is often written with the unconditioned notation $$\mathbb P(H)$$, while the posterior after seeing the evidence $$e$$ is often denoted by the conditional probability $$\mathbb P(H\mid e).$$ (E. T. Jaynes was known to insist on the explicit notation $$\mathbb P(H\mid I_0)$$ for the prior probability of $$H$$, with $$I_0$$ denoting the prior information, and to refuse to write any entirely unconditional probability $$\mathbb P(X)$$, since, said Jaynes, we always have some prior information.)

This, however, is a heuristic rather than a law, and it can break down inside more complicated problems. If we've already seen $$e_0$$ and are now updating on $$e_1$$, then in this new problem the new prior will be $$\mathbb P(H\mid e_0)$$ and the new posterior will be $$\mathbb P(H\mid e_1 \wedge e_0).$$
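The "posterior becomes the new prior" pattern above can be sketched as repeated applications of the same update rule. The likelihood values for $$e_0$$ and $$e_1$$ are assumed for illustration; the final comparison relies on the evidence being conditionally independent given the hypothesis:

```python
# Sequential Bayesian updating: the posterior after e0 is the prior for e1.
# All likelihood values below are illustrative assumptions.

def update(prior, likelihoods):
    """Return P(H | e) for each hypothesis H, given P(H) and P(e | H)."""
    evidence_prob = sum(prior[h] * likelihoods[h] for h in prior)
    return {h: prior[h] * likelihoods[h] / evidence_prob for h in prior}

prior = {"Mustard": 0.5, "Scarlet": 0.5}
post_e0 = update(prior, {"Mustard": 0.25, "Scarlet": 0.75})  # P(H | e0)
post_e1 = update(post_e0, {"Mustard": 0.6, "Scarlet": 0.2})  # P(H | e1 & e0)

# With likelihoods conditionally independent given H, updating on e0 then e1
# matches a single update on the conjunction (e0 and e1):
joint = update(prior, {"Mustard": 0.25 * 0.6, "Scarlet": 0.75 * 0.2})
print(post_e1, joint)
```

The order of the two updates also doesn't matter under the same independence assumption, which is why "the" prior is relative to a problem statement rather than absolute.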

For questions about how priors are “ultimately” determined, see Solomonoff induction.

Parents:

• Bayesian reasoning

A probability-theory-based view of the world; a coherent way of changing probabilistic beliefs based on evidence.