Prior

Our (potentially rich or complex) state of knowledge and propensity to learn, before seeing the evidence, expressed as a probability function. This is a deeper and more general concept than ‘prior probability’. A prior probability is like guessing the chance that it will be cloudy outside, in advance of looking out a window. The more general notion of a Bayesian prior would include probability distributions that answer the question, “Suppose I saw the Sun rising on 999 successive days; would I afterwards think the probability of the Sun rising on the next day was more like 1000/1001, 1/2, or 1 − 10^-6?” In a sense, a baby can be said to have a ‘prior’ before it opens its eyes, and then to develop a model of the world by updating on the evidence it sees after that point. The baby’s ‘prior’ expresses not just its current ignorance, but the different kinds of worlds the baby would end up believing in, depending on what sensory evidence it saw over the rest of its life. Key subconcepts include ignorance priors and inductive priors, and key examples are Laplace’s Rule of Succession and Solomonoff induction.
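
To make the sunrise question concrete, here is a minimal Python sketch (not part of the original entry) of the beta-binomial posterior predictive that underlies Laplace’s Rule of Succession; the function name `predictive` and the Beta(10^6, 1) prior used for comparison are illustrative assumptions, not anything specified above.

```python
from fractions import Fraction

def predictive(successes, trials, a=1, b=1):
    """Posterior predictive probability of success on the next trial,
    under a Beta(a, b) prior on the unknown frequency.
    a = b = 1 is the uniform prior, which yields Laplace's Rule of
    Succession: (successes + 1) / (trials + 2)."""
    return Fraction(successes + a, trials + a + b)

days = 999  # the Sun has risen on 999 successive days

# Uniform (Laplace) prior: answers 1000/1001 to the sunrise question.
print(predictive(days, days))                      # 1000/1001

# A hypothetical prior already heavily weighted toward "the Sun always
# rises", Beta(10**6, 1), answers roughly 1 - 10^-6.
print(float(predictive(days, days, a=10**6, b=1)))  # ~0.999999
```

The remaining answer in the question, 1/2, corresponds to a prior that pins the frequency to exactly one half (the limit of an ever more confident Beta(N, N) prior), so it never learns anything from the 999 observations.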

Parents:

  • Bayesian reasoning

    A probability-theory-based view of the world; a coherent way of changing probabilistic beliefs based on evidence.