Prerequisites
- Conditional Independence
- Maximum A Posteriori (MAP) Decision Rule
- Joint Probability
Step-by-Step Derivation
Let the sequence of $n$ conditionally independent reports be denoted $R = (r_1, r_2, \dots, r_n)$.
Let $k$ be the number of times the friend reports heads (H), so $n-k$ is the number of times the friend reports tails (T).
We want to use the MAP decision rule, comparing the posterior probabilities $p(s=H \mid R)$ and $p(s=T \mid R)$.
This is equivalent to comparing the joint probabilities:
$$p(R \mid s=H)\,p(s=H) \quad \text{vs.} \quad p(R \mid s=T)\,p(s=T)$$
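The equivalence follows from Bayes' rule: both posteriors share the same positive denominator $p(R)$, which cancels from the comparison:

```latex
p(s=H \mid R) > p(s=T \mid R)
\iff \frac{p(R \mid s=H)\,p(s=H)}{p(R)} > \frac{p(R \mid s=T)\,p(s=T)}{p(R)}
\iff p(R \mid s=H)\,p(s=H) > p(R \mid s=T)\,p(s=T)
```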
Because the reports are statistically independent given the true outcome s, the joint conditional probability is the product of individual conditional probabilities:
$$p(R \mid s=H) = \prod_{i=1}^{n} p(r_i \mid s=H) = p(r=H \mid s=H)^k \cdot p(r=T \mid s=H)^{n-k} = (1-\theta_1)^k\,\theta_1^{n-k}$$
where $\theta_1 = p(r=T \mid s=H)$ is the probability that the friend mis-reports a true heads.
Similarly, for $s=T$:
$$p(R \mid s=T) = \prod_{i=1}^{n} p(r_i \mid s=T) = p(r=H \mid s=T)^k \cdot p(r=T \mid s=T)^{n-k} = \theta_2^k\,(1-\theta_2)^{n-k}$$
where $\theta_2 = p(r=H \mid s=T)$ is the probability that the friend mis-reports a true tails.
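As a quick numerical sanity check, the sketch below verifies that the per-report product equals the closed form for each hypothesis. The error rates $\theta_1 = p(r=T \mid s=H)$, $\theta_2 = p(r=H \mid s=T)$ and the report sequence are made-up illustrative values:

```python
from math import prod

# Made-up values for illustration:
theta1 = 0.2                          # p(r=T | s=H): mis-report when the coin is heads
theta2 = 0.3                          # p(r=H | s=T): mis-report when the coin is tails
reports = ["H", "T", "H", "H", "T"]   # R = (r1, ..., rn)
n = len(reports)
k = reports.count("H")                # number of heads-reports

# Product of per-report conditional probabilities under each hypothesis:
p_R_given_H = prod((1 - theta1) if r == "H" else theta1 for r in reports)
p_R_given_T = prod(theta2 if r == "H" else (1 - theta2) for r in reports)

# These match the closed forms from the derivation:
assert abs(p_R_given_H - (1 - theta1) ** k * theta1 ** (n - k)) < 1e-12
assert abs(p_R_given_T - theta2 ** k * (1 - theta2) ** (n - k)) < 1e-12
```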
Now we substitute these into the MAP comparison along with the priors $p(s=H) = \alpha$ and $p(s=T) = 1-\alpha$.
The resulting minimum-probability-of-error decision rule is to guess heads (H) if:
$$(1-\theta_1)^k\,\theta_1^{n-k}\,\alpha > \theta_2^k\,(1-\theta_2)^{n-k}\,(1-\alpha)$$
Otherwise, guess tails (T).
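The final rule can be sketched as a small function (the name `map_guess` and the example parameter values are my own; the comparison is done in log space, which is equivalent because $\log$ is monotone and avoids floating-point underflow for large $n$):

```python
import math

def map_guess(k: int, n: int, theta1: float, theta2: float, alpha: float) -> str:
    """Guess the true coin outcome from k heads-reports out of n.

    theta1 = p(r=T | s=H), theta2 = p(r=H | s=T), alpha = p(s=H).
    All parameters are assumed to lie strictly between 0 and 1 so every
    logarithm below is finite.
    """
    # Log of (1-theta1)^k * theta1^(n-k) * alpha
    log_h = k * math.log(1 - theta1) + (n - k) * math.log(theta1) + math.log(alpha)
    # Log of theta2^k * (1-theta2)^(n-k) * (1-alpha)
    log_t = k * math.log(theta2) + (n - k) * math.log(1 - theta2) + math.log(1 - alpha)
    return "H" if log_h > log_t else "T"

# Example: 8 of 10 reports say heads, symmetric 20% error rate, uniform prior.
print(map_guess(8, 10, 0.2, 0.2, 0.5))  # -> H
```

With a symmetric channel ($\theta_1 = \theta_2 < 1/2$) and a uniform prior ($\alpha = 1/2$), the rule reduces to a simple majority vote over the reports.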