# Bayesian Inference IV – Odds

“The odds are good for my favourite team,” somebody might say. But what exactly do they mean by odds?

## Odds

Today we will talk about odds, not only to understand the language of sports bettors but also because they are quite important for Bayesian updating. Odds can simplify our calculations and therefore reduce the computation if we program an algorithm.

Odds tell us the ratio of the probability that an event occurs to the probability that it doesn’t. More formally:

The odds of event E versus event E′ are the ratio of their probabilities $\frac{ P(E) }{ P(E\prime) }$. If the second event is not specified, it is assumed to be the complement $E^{ c }$. So the odds of E are:

$O(E)=\frac{ P(E) }{ P(E^{ c }) }$

Example: O(rain)=2 means that the probability for rain is twice the probability for no rain ($\frac{ 2 }{ 3 }$ versus $\frac{ 1 }{ 3 }$). We could also say that the odds for rain are 2 to 1.

## Conversion formulas

We can convert probabilities to odds and odds to probabilities as follows:

If P(E)=p then: $O(E)=\frac{ p }{ 1-p }$

If O(E)=q then: $P(E)=\frac{ q }{ 1+q }$
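These two conversions are one-liners in code; here is a minimal sketch in Python (the function names are my own):

```python
def prob_to_odds(p: float) -> float:
    """Convert a probability p (with p < 1) into odds p / (1 - p)."""
    return p / (1 - p)

def odds_to_prob(q: float) -> float:
    """Convert odds q back into a probability q / (1 + q)."""
    return q / (1 + q)

# O(rain) = 2 corresponds to P(rain) = 2/3, and the two maps invert each other.
print(prob_to_odds(2 / 3))  # 2.0 (up to floating point)
print(odds_to_prob(2.0))    # 0.666...
```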

Example: Recall our coin example from the last posts. We had the following Bayesian Updating table:

| Hypothesis H | Prior P(H) | Likelihood P(D\|H) | Unnormalised posterior P(D\|H)P(H) | Normalised posterior P(H\|D) |
| --- | --- | --- | --- | --- |
| A | 0.5 | 0.5 | 0.25 | 0.4 |
| B | 0.25 | 0.6 | 0.15 | 0.24 |
| C | 0.25 | 0.9 | 0.225 | 0.36 |
| Total | 1 | | 0.625 | 1 |

What are the prior odds and posterior odds of A?

Answer: We can calculate the prior and posterior odds with the formulas given above:

$O(A)=\frac{ P(A) }{ P(A^{ c }) }$ where $P(A^{ c })=P(B)+P(C)$. We then have: $O(A)=\frac{ P(A) }{ P(B)+P(C) }=\frac{ 0.5 }{ 0.5 }=1$

$O(A|D)=\frac{ P(A|D) }{ P(A^{ c }|D) }=\frac{ P(D|A)P(A) }{ P(D|A^{ c })P(A^{ c }) }$ where $P(D|A^{ c })P(A^{ c }) = P(D|B)P(B)+P(D|C)P(C)$. We then have: $O(A|D)=\frac{ P(D|A)P(A) }{ P(D|B)P(B)+P(D|C)P(C) }=\frac{ 0.25 }{ 0.375 }=\frac{ 2 }{ 3 }$
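The same two numbers can be checked with a few lines of Python; the dictionaries below are simply my encoding of the table, with D = tails:

```python
prior = {"A": 0.5, "B": 0.25, "C": 0.25}
like_tails = {"A": 0.5, "B": 0.6, "C": 0.9}  # P(D|H) for D = tails

# Prior odds: O(A) = P(A) / (P(B) + P(C))
prior_odds = prior["A"] / (prior["B"] + prior["C"])

# Posterior odds: ratio of unnormalised posteriors P(D|H)P(H);
# the normalising constant cancels, so we never need to compute it.
unnorm = {h: like_tails[h] * prior[h] for h in prior}
post_odds = unnorm["A"] / (unnorm["B"] + unnorm["C"])

print(prior_odds)  # 1.0
print(post_odds)   # 0.666...
```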

We can write the process of updating odds as follows:

$O(H|D)=\frac{ P(H|D) }{ P(H^{ c }|D) }=\frac{ P(D|H)P(H) }{ P(D|H^{ c })P(H^{ c }) }$

That is because the normalising factor P(D) is the same in the numerator and the denominator, so it cancels and the ratio is unchanged.

## Bayes’ Factors (BF)

We can see that the posterior odds are $\frac{ 2 }{ 3 }$ times the prior odds. This factor is called the Bayes’ Factor. The Bayes’ Factor is given by the following formula:

$Bayes\;Factor = \frac{ P(D|H) }{ P(D|H^{ c }) }$

Where in the process of updating odds does the Bayes’ Factor occur?

$O(H|D)=\frac{ P(H|D) }{ P(H^{ c }|D) }=\frac{ P(D|H)P(H) }{ P(D|H^{ c })P(H^{ c }) }=\frac{ P(D|H) }{ P(D|H^{ c }) }\cdot\frac{ P(H) }{ P(H^{ c }) }=\frac{ P(D|H) }{ P(D|H^{ c }) }\cdot O(H)$

$\Rightarrow$ posterior odds = Bayes’ Factor $\times$ prior odds

Evidence for or against the hypothesis?

- BF > 1 gives evidence for the hypothesis
- BF < 1 gives evidence against the hypothesis
- BF = 1 gives no evidence
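For the coin table above, the Bayes’ Factor for A (with D = tails) can be computed directly; this is only a sketch of the arithmetic, with variable names of my own choosing:

```python
prior = {"A": 0.5, "B": 0.25, "C": 0.25}
like = {"A": 0.5, "B": 0.6, "C": 0.9}  # P(D|H) for D = tails

# P(D|A^c) = [P(D|B)P(B) + P(D|C)P(C)] / P(A^c)
p_Ac = prior["B"] + prior["C"]
like_Ac = (like["B"] * prior["B"] + like["C"] * prior["C"]) / p_Ac

bayes_factor = like["A"] / like_Ac      # 0.5 / 0.75 = 2/3
prior_odds = prior["A"] / p_Ac          # 1
post_odds = bayes_factor * prior_odds   # 2/3, matching O(A|D) above

print(bayes_factor, post_odds)
```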

## Updating again and again

**Conditionally independent**

Two events $D_{ 1 }$ and $D_{ 2 }$ are conditionally independent given a hypothesis H if: $P(D_{ 1 }, D_{ 2 }|H)=P(D_{ 1 }|H)P(D_{ 2 }|H)$

Example: Recall the coin example. With the data being tails, we had the following Bayesian updating table:

| Hypothesis H | Prior P(H) | Likelihood P(D\|H) | Unnormalised posterior P(D\|H)P(H) | Normalised posterior P(H\|D) |
| --- | --- | --- | --- | --- |
| A | 0.5 | 0.5 | 0.25 | 0.4 |
| B | 0.25 | 0.6 | 0.15 | 0.24 |
| C | 0.25 | 0.9 | 0.225 | 0.36 |
| Total | 1 | | 0.625 | 1 |

What are the odds for a type A coin after we had one heads and one tails?

We first calculate O(A): $O(A)=\frac{ P(A) }{ P(A^{ c }) }$ where $P(A^{ c })=P(B)+P(C)$. We then have: $O(A)=\frac{ P(A) }{ P(B)+P(C) }=\frac{ 0.5 }{ 0.5 }=1$

We then calculate the Bayes’ Factor for tails: $BF_{ 1 }=\frac{ P(tails|A) }{ P(tails|A^{ c }) }$ where $P(tails|A^{ c })=\frac{ P(tails|B)P(B)+P(tails|C)P(C) }{ P(A^{ c }) }=\frac{ 0.375 }{ 0.5 }=0.75$. We then have: $BF_{ 1 }=\frac{ 0.5 }{ 0.75 }=\frac{ 2 }{ 3 }$

For the second toss we have to be careful: $A^{ c }$ is a composite hypothesis (B or C), and after seeing tails the weights of B and C within $A^{ c }$ have changed. The second Bayes’ Factor must therefore use the posteriors from the first update: $BF_{ 2 }=\frac{ P(heads|A) }{ P(heads|A^{ c },tails) }$ where $P(heads|A^{ c },tails)=\frac{ P(heads|B)P(B|tails)+P(heads|C)P(C|tails) }{ P(A^{ c }|tails) }=\frac{ 0.4\cdot 0.24+0.1\cdot 0.36 }{ 0.6 }=0.22$. We then have: $BF_{ 2 }=\frac{ 0.5 }{ 0.22 }=\frac{ 25 }{ 11 }$

We can then finally calculate O(A|tails, heads): $O(A|tails,heads)=BF_{ 2 }BF_{ 1 }O(A)=\frac{ 25 }{ 11 }\cdot\frac{ 2 }{ 3 }\cdot 1=\frac{ 50 }{ 33 }\approx 1.515$. (With only two simple hypotheses, conditional independence would let us multiply the unchanged Bayes’ Factors directly; with a composite $A^{ c }$ we have to update them along the way.)

Odds allow us to compute posterior probabilities with less computation. That’s why they are so important for us.
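One way to keep repeated updating honest is simply to rerun the whole table once per observation; here is a minimal sketch in Python (the `update` helper is my own, and the heads likelihoods are one minus the tails likelihoods from the coin example):

```python
def update(prior: dict, like: dict) -> dict:
    """One Bayesian update: posterior is proportional to likelihood * prior."""
    unnorm = {h: like[h] * prior[h] for h in prior}
    total = sum(unnorm.values())
    return {h: u / total for h, u in unnorm.items()}

prior = {"A": 0.5, "B": 0.25, "C": 0.25}
like_tails = {"A": 0.5, "B": 0.6, "C": 0.9}
like_heads = {"A": 0.5, "B": 0.4, "C": 0.1}  # heads likelihood = 1 - tails likelihood

post = update(update(prior, like_tails), like_heads)  # tails, then heads
odds_A = post["A"] / (1 - post["A"])
print(odds_A)  # ~1.515 (= 50/33)
```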