## Prerequisites
- Bayes' Theorem
- Maximum A Posteriori (MAP) Decision Rule
- Minimum Probability of Error
## Step-by-Step Derivation
To minimize the probability of error, we should use the Maximum A Posteriori (MAP) decision rule: choose the state s that maximizes the posterior probability p(s∣r=H).
We need to compare p(s=H∣r=H) and p(s=T∣r=H).
According to Bayes' theorem:
p(s=H∣r=H) = p(r=H∣s=H) p(s=H) / p(r=H)

p(s=T∣r=H) = p(r=H∣s=T) p(s=T) / p(r=H)
Since the denominator p(r=H) is the same for both, we only need to compare the numerators:
p(r=H∣s=H) p(s=H)  vs.  p(r=H∣s=T) p(s=T)
From the problem description, we know the prior probabilities:
- p(s=H)=α
- p(s=T)=1−α
We also know the conditional probabilities of the reports (where θ1 is the probability a true heads is reported as tails, and θ2 the probability a true tails is reported as heads):
- p(r=H∣s=H)=1−p(r=T∣s=H)=1−θ1
- p(r=H∣s=T)=θ2
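The comparison above can be sketched numerically. This is a minimal illustration, not part of the original problem; the function names (`posterior_numerators`, `posteriors`) are my own.

```python
def posterior_numerators(alpha, theta1, theta2):
    """Unnormalized posteriors (Bayes numerators) after observing r = H."""
    num_H = (1 - theta1) * alpha      # p(r=H | s=H) * p(s=H)
    num_T = theta2 * (1 - alpha)      # p(r=H | s=T) * p(s=T)
    return num_H, num_T

def posteriors(alpha, theta1, theta2):
    """Full posteriors: divide each numerator by the evidence p(r=H)."""
    num_H, num_T = posterior_numerators(alpha, theta1, theta2)
    evidence = num_H + num_T          # p(r=H), by the law of total probability
    return num_H / evidence, num_T / evidence
```

Since both posteriors share the same denominator p(r=H), their ordering is decided entirely by the two numerators.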
Substituting these into our comparison, we should guess heads (H) if:
(1−θ1)α>θ2(1−α)
Otherwise, we should guess tails (T). (If the two sides are exactly equal, either guess yields the same probability of error.)
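The decision rule, together with a Monte Carlo sanity check, can be sketched as follows. This is an illustrative sketch under the problem's assumptions (a report flips H→T with probability θ1 and T→H with probability θ2); the helper names are my own.

```python
import random

def map_guess(alpha, theta1, theta2):
    """MAP decision after observing r = H: compare the two Bayes numerators."""
    return "H" if (1 - theta1) * alpha > theta2 * (1 - alpha) else "T"

def empirical_error(guess, alpha, theta1, theta2, trials=100_000, seed=0):
    """Fraction of trials with r = H in which a fixed guess is wrong."""
    rng = random.Random(seed)
    errors = reports_H = 0
    for _ in range(trials):
        s = "H" if rng.random() < alpha else "T"
        # The report flips with probability theta1 (if s=H) or theta2 (if s=T)
        flipped = rng.random() < (theta1 if s == "H" else theta2)
        r = ("T" if s == "H" else "H") if flipped else s
        if r == "H":
            reports_H += 1
            errors += (guess != s)
    return errors / reports_H
```

For example, with α = 0.5, θ1 = 0.1, θ2 = 0.2, the rule compares 0.45 against 0.10 and guesses H; the simulated error rate for guessing H should then be lower than for guessing T.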