
Answer

Prerequisites

  • Conditional Independence
  • Maximum A Posteriori (MAP) Decision Rule
  • Joint Probability

Step-by-Step Derivation

Let the sequence of $n$ independent reports be denoted $R = (r_1, r_2, \dots, r_n)$. Let $k$ be the number of times the friend reports heads ($H$), and $n-k$ the number of times the friend reports tails ($T$).

We want to use the MAP decision rule, comparing the posterior probabilities $p(s = H \mid R)$ and $p(s = T \mid R)$. Since the evidence $p(R)$ is common to both posteriors, this is equivalent to comparing the joint probabilities:
$$p(R \mid s = H)\, p(s = H) \quad \text{vs.} \quad p(R \mid s = T)\, p(s = T)$$
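As a quick numerical sanity check (the joint-probability values below are illustrative assumptions, not taken from the problem), normalizing each joint by the common evidence $p(R)$ never changes which hypothesis is larger:

```python
# Illustrative numbers (assumptions, not from the problem statement).
joint_H = 0.03   # p(R | s=H) * p(s=H)
joint_T = 0.01   # p(R | s=T) * p(s=T)

evidence = joint_H + joint_T   # p(R), the normalizer
post_H = joint_H / evidence    # p(s=H | R)
post_T = joint_T / evidence    # p(s=T | R)

# The evidence cancels: the larger joint is always the larger posterior.
assert (post_H > post_T) == (joint_H > joint_T)
```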

Because the reports are statistically independent given the true outcome $s$, the joint conditional probability is the product of the individual conditional probabilities. With $\theta_1 = p(r = T \mid s = H)$, so that $p(r = H \mid s = H) = 1 - \theta_1$:
$$p(R \mid s = H) = \prod_{i=1}^n p(r_i \mid s = H) = p(r = H \mid s = H)^k \, p(r = T \mid s = H)^{n-k} = (1 - \theta_1)^k \, \theta_1^{n-k}$$

Similarly, for $s = T$, with $\theta_2 = p(r = H \mid s = T)$:
$$p(R \mid s = T) = \prod_{i=1}^n p(r_i \mid s = T) = p(r = H \mid s = T)^k \, p(r = T \mid s = T)^{n-k} = \theta_2^k \, (1 - \theta_2)^{n-k}$$
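The two likelihoods above depend on the reports only through the count $k$, so they can be computed directly. A minimal sketch (function names are illustrative, not from the problem):

```python
def likelihood_given_H(k, n, theta1):
    """p(R | s=H) = (1 - theta1)^k * theta1^(n-k), where theta1 = p(r=T | s=H)."""
    return (1 - theta1) ** k * theta1 ** (n - k)

def likelihood_given_T(k, n, theta2):
    """p(R | s=T) = theta2^k * (1 - theta2)^(n-k), where theta2 = p(r=H | s=T)."""
    return theta2 ** k * (1 - theta2) ** (n - k)
```

For a reasonably reliable reporter (both error rates below 1/2), many reported heads makes the $s = H$ likelihood larger, as expected.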

Now we substitute these into the MAP comparison along with the priors $p(s = H) = \alpha$ and $p(s = T) = 1 - \alpha$.

The new minimum-probability-of-error decision rule is to guess heads ($H$) if:
$$(1 - \theta_1)^k \, \theta_1^{n-k} \, \alpha > \theta_2^k \, (1 - \theta_2)^{n-k} \, (1 - \alpha)$$

Otherwise, guess tails ($T$).
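The full rule can be sketched as one function (names are illustrative). Taking logarithms leaves the comparison unchanged and avoids floating-point underflow when $n$ is large; this assumes $\theta_1, \theta_2, \alpha \in (0, 1)$:

```python
import math

def map_guess(k, n, theta1, theta2, alpha):
    """Return 'H' if the MAP rule favors heads, else 'T'.

    theta1 = p(r=T | s=H), theta2 = p(r=H | s=T), alpha = p(s=H).
    Comparison is done in log space so large n does not underflow;
    requires theta1, theta2, alpha strictly between 0 and 1.
    """
    log_H = (k * math.log(1 - theta1) + (n - k) * math.log(theta1)
             + math.log(alpha))
    log_T = (k * math.log(theta2) + (n - k) * math.log(1 - theta2)
             + math.log(1 - alpha))
    return 'H' if log_H > log_T else 'T'
```

With a symmetric, fairly reliable reporter ($\theta_1 = \theta_2 = 0.1$) and a uniform prior ($\alpha = 0.5$), the rule reduces to a majority vote over the reports.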