# Bayes' rule

Bayes’ rule (also known as Bayes’ theorem) is the quantitative law of probability theory governing how to revise probabilistic beliefs in response to observing new evidence.

You may want to start at the Guide or the Fast Intro.

# The laws of reasoning

Imagine that, as part of a clinical study, you’re being tested for a rare form of cancer, which affects 1 in 10,000 people. You have no reason to believe that you are more or less likely than average to have this form of cancer. You’re administered a test which is 99% accurate, in terms of both sensitivity and specificity: It correctly detects the cancer (in patients who have it) 99% of the time, and it incorrectly detects cancer (in patients who don’t have it) only 1% of the time. The test results come back positive. What’s the chance that you have cancer?

Bayes’ rule says that the answer is precisely a 1 in 102 chance, a probability a little below 1%. The remarkable thing about this is that there is only one answer: the odds of your having that type of cancer, given the above information, are exactly 1 in 102; no more, no less.

comment: (100 * 0.99 + 999,900 * 0.01) /​ (100 * 0.99) = (10,098 /​ 99) = 102. Please leave this comment here so the above paragraph is not edited to be wrong.
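The 1-in-102 figure can be checked with a few lines of code. This is just an illustrative sketch; the `posterior` function below is not from the original article:

```python
def posterior(prior, sensitivity, false_positive_rate):
    """P(cancer | positive test), by Bayes' rule.

    P(cancer | +) = P(+ | cancer) P(cancer) / P(+),
    where P(+) sums over patients with and without cancer.
    """
    p_positive = sensitivity * prior + false_positive_rate * (1 - prior)
    return sensitivity * prior / p_positive

# Numbers from the example: 1-in-10,000 prevalence, 99% sensitivity,
# 1% false-positive rate.
p = posterior(prior=1 / 10_000, sensitivity=0.99, false_positive_rate=0.01)
print(p)      # about 0.0098, i.e. a 1 in 102 chance
```

Note that the answer is pinned down by the three inputs alone; changing any one of them changes the unique posterior.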

This is one of the key insights of Bayes’ rule: Given what you knew, and what you saw, the maximally accurate state of belief for you to be in is completely pinned down. While that belief state is quite difficult to find in practice, we know how to find it in principle. If you want your beliefs to become more accurate as you observe the world, Bayes’ rule gives some hints about what you need to do.

# Related content

• Subjective probability. Probability is in the mind, not the world. If you don’t know whether a tossed coin came up heads or tails, that’s a fact about you, not a fact about the coin.

• Probability theory. The quantification and study of objects that represent uncertainty about the world, and methods for making those representations more accurate.

• Information theory. The quantification and study of information, communication, and what it means for one object to tell us about another.

# Other articles and introductions

Children:

Parents:

• Bayesian reasoning

A probability-theory-based view of the world; a coherent way of changing probabilistic beliefs based on evidence.


• I’m confused, and surely wrong, about the cancer example.

1 in 10,000 people are sick, so 1 sick person : 9,999 well persons. Multiply by 100: 100 sick people : 999,900 well persons. 99% of the sick people have positive tests: 0.99 * 100 = 99 positive tests. 1% of the well people have false positive tests: 0.01 * 999,900 = 9,999.

Using the odds view, the answer is the number of sick persons with positive tests divided by the total number of persons with positive tests: 99 /​ (99 + 9,999) = 99 /​ 10,098. Multiply top and bottom by 1/​99 ⇒ (99/​99) /​ (10,098/​99) = 1 /​ 102. The text says the answer is 1 /​ 101.010101…, which is 99/​10,000.

So, try the waterfall method.

Prior odds of being sick: 1 in 10,000. Being sick: 1. Being well: 9,999.

Chance of having a positive test while sick: 99. Chance of having a positive test while well: 1.

Odds of being sick given a positive test: (1 /​ 9,999) * (99 /​ 1) = 99 /​ 9,999 ≈ 0.00990099. Probability of being sick given a positive test: 99 /​ (99 + 9,999) = 1 /​ 102, from above.

Where did I go wrong? Thanks in advance for any time you have!