Odds: Introduction

Let's say we have a bag containing twice as many blue marbles as red marbles. Then, if you reach in without looking and pick out a marble at random, the odds are 2 : 1 in favor of drawing a blue marble as opposed to a red one.

Odds express relative quantities: 2 : 1 odds are the same as 4 : 2 odds, which are the same as 600 : 300 odds. For example, if the bag contains 1 red marble and 2 blue marbles, or 2 red marbles and 4 blue marbles, then your chance of pulling out a red marble is the same in both cases.

In other words, given odds of \((x : y)\) we can scale them by any positive number \(\alpha\) to get equivalent odds of \((\alpha x : \alpha y).\)
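As a minimal sketch of this scaling rule (the `scale_odds` helper below is hypothetical, introduced only for illustration):

```python
def scale_odds(odds, alpha):
    """Multiply every term of an odds tuple by the same positive factor.

    The result expresses the same relative quantities as the input.
    """
    assert alpha > 0, "the scaling factor must be positive"
    return tuple(term * alpha for term in odds)

# (2 : 1), (4 : 2), and (600 : 300) are all the same odds.
print(scale_odds((2, 1), 2))    # (4, 2)
print(scale_odds((2, 1), 300))  # (600, 300)
```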

Converting odds to probabilities

If there were also green marbles, the relative odds for red versus blue would still be (1 : 2), but the probability of drawing a red marble would be lower.

If red, blue, and green are the only kinds of marbles in the bag, then we can turn odds of \((r : b : g)\) into probabilities \((p_r : p_b : p_g)\) that give the probability of drawing each kind of marble. Because red, blue, and green are the only possibilities, \(p_r + p_b + p_g\) must equal 1, so \((p_r : p_b : p_g)\) must be odds equivalent to \((r : b : g)\) but “normalized” such that the terms sum to one. For example, \((1 : 2 : 1)\) would normalize to \(\left(\frac{1}{4} : \frac{2}{4} : \frac{1}{4}\right),\) which are the probabilities of drawing a red / blue / green marble (respectively) from a bag containing those marbles in a 1 : 2 : 1 ratio.

Note that if red and blue are not the only possibilities, then it doesn’t make sense to convert the odds \((r : b)\) of red vs blue into a probability. For example, if there are 100 green marbles, one red marble, and two blue marbles, then the odds of red vs blue are 1 : 2, but the probability of drawing a red marble is much lower than 1/3! Odds can only be converted into probabilities if their terms are mutually exclusive and exhaustive.

Imagine a forest with some sick trees and some healthy trees, where the odds of a tree being sick (as opposed to healthy) are (2 : 3), and every tree is either sick or healthy (there are no in-between states). Then the probability of randomly picking a sick tree from among all trees is 2 / 5, because 2 out of every (2 + 3) trees are sick.

In general, the operation we’re doing here is taking relative odds like \((a : b : c \ldots)\) and dividing each term by the sum \((a + b + c \ldots)\) to produce

$$\left(\frac{a}{a + b + c \ldots} : \frac{b}{a + b + c \ldots} : \frac{c}{a + b + c \ldots}\ldots\right)$$
Dividing each term by the sum of all terms gives us an equivalent set of odds (because each term is divided by the same amount) whose terms sum to 1.

This process of dividing a set of odds by the sum of its terms, to get a set of probabilities that sum to 1, is called normalization.
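Here is a rough sketch of normalization in code (the `normalize` function is hypothetical, written only to mirror the formula above):

```python
from fractions import Fraction

def normalize(odds):
    """Divide each term of an odds tuple by the sum of all terms,
    yielding probabilities that sum to 1 (this only makes sense when
    the terms are mutually exclusive and exhaustive)."""
    total = sum(odds)
    return tuple(Fraction(term, total) for term in odds)

# Odds of (1 : 2 : 1) for red : blue : green normalize to (1/4, 1/2, 1/4).
print(normalize((1, 2, 1)))  # (Fraction(1, 4), Fraction(1, 2), Fraction(1, 4))

# The sick-vs-healthy tree odds of (2 : 3) give a 2/5 probability of "sick".
print(normalize((2, 3)))     # (Fraction(2, 5), Fraction(3, 5))
```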

Converting probabilities to odds

Let’s say we have two events R and B, which might be things like “I draw a red marble” and “I draw a blue marble.” Say \(\mathbb P(R) = \frac{1}{4}\) and \(\mathbb P(B) = \frac{1}{2}.\) What are the odds of R vs B? They are \(\mathbb P(R) : \mathbb P(B) = \left(\frac{1}{4} : \frac{1}{2}\right),\) of course.

Equivalently, we can take the odds \(\left(\frac{\mathbb P(R)}{\mathbb P(B)} : 1\right),\) because \(\frac{\mathbb P(R)}{\mathbb P(B)}\) is how many times as likely R is as B. In this example, \(\frac{\mathbb P(R)}{\mathbb P(B)} = \frac{1}{2},\) because R is half as likely as B. Sometimes, the quantity \(\frac{\mathbb P(R)}{\mathbb P(B)}\) is called the “odds ratio of R vs B,” in which case it is understood that the odds for R vs B are \(\left(\frac{\mathbb P(R)}{\mathbb P(B)} : 1\right).\)
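To make the example concrete, here is a small sketch (variable names are purely illustrative) computing the odds and the odds ratio from the two probabilities above:

```python
from fractions import Fraction

# Probabilities from the example above.
p_r = Fraction(1, 4)  # P(R): drawing a red marble
p_b = Fraction(1, 2)  # P(B): drawing a blue marble

# The odds of R vs B are just the pair of probabilities...
odds_r_vs_b = (p_r, p_b)  # (1/4 : 1/2), equivalent to (1 : 2)

# ...and the odds ratio P(R) / P(B) gives the equivalent odds (ratio : 1).
odds_ratio = p_r / p_b
print(odds_ratio)  # 1/2, i.e. odds of (1/2 : 1)
```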

Odds to ratios

When there are only two terms \(x\) and \(y\) in a set of odds, the odds can be written as a ratio \(\frac{x}{y}.\) The odds ratio \(\frac{x}{y}\) refers to the odds \((x : y),\) or, equivalently, \(\left(\frac{x}{y} : 1\right).\)
