
Answer

Prerequisites

  • Bayes' Theorem
  • Uniform Prior Distribution
  • Beta Distribution Integral
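
For reference, the beta-distribution integral listed above (the "given identity" used in step 4 below) is the standard result

$$\int_0^1 \pi^m (1 - \pi)^{n'}\, d\pi = \frac{m!\, n'!}{(m + n' + 1)!}$$

for non-negative integers $m$ and $n'$.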

Step-by-Step Derivation

  1. Apply Bayes' Theorem: The posterior distribution $p(\pi|\mathcal{D})$ is proportional to the likelihood $p(\mathcal{D}|\pi)$ times the prior $p(\pi)$:
$$p(\pi|\mathcal{D}) = \frac{p(\mathcal{D}|\pi)\,p(\pi)}{p(\mathcal{D})} = \frac{p(\mathcal{D}|\pi)\,p(\pi)}{\int_0^1 p(\mathcal{D}|\pi)\,p(\pi)\, d\pi}$$

  2. Define the Prior: A uniform prior over $\pi \in [0, 1]$ means $p(\pi) = 1$.

  3. Calculate the Likelihood Component: From part (a), the likelihood is $p(\mathcal{D}|\pi) = \pi^s (1 - \pi)^{n-s}$. The numerator is thus:
$$p(\mathcal{D}|\pi)\,p(\pi) = \pi^s (1 - \pi)^{n-s} \cdot 1 = \pi^s (1 - \pi)^{n-s}$$

  4. Calculate the Marginal Likelihood (Denominator): We integrate the numerator over all possible values of $\pi$:
$$p(\mathcal{D}) = \int_0^1 \pi^s (1 - \pi)^{n-s}\, d\pi$$
Using the given identity with $m = s$ and $n' = n - s$:
$$p(\mathcal{D}) = \frac{s!\,(n-s)!}{(s + (n-s) + 1)!} = \frac{s!\,(n-s)!}{(n+1)!}$$

  5. Compute the Posterior: Divide the numerator by the denominator:
$$p(\pi|\mathcal{D}) = \frac{\pi^s (1 - \pi)^{n-s}}{\frac{s!\,(n-s)!}{(n+1)!}} = \frac{(n+1)!}{s!\,(n-s)!}\, \pi^s (1 - \pi)^{n-s}$$

  6. Plotting for $n=1$: For $n=1$, the possible values of $s$ (the sum of $x_1$) are $s=0$ or $s=1$.

    • If $s=0$: $p(\pi|x_1=0) = \frac{2!}{0!\,1!} \pi^0 (1-\pi)^1 = 2(1-\pi)$. This is a straight line from $(0, 2)$ to $(1, 0)$.
    • If $s=1$: $p(\pi|x_1=1) = \frac{2!}{1!\,0!} \pi^1 (1-\pi)^0 = 2\pi$. This is a straight line from $(0, 0)$ to $(1, 2)$.
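
As a numerical sanity check on the derivation, the sketch below (plain Python, no external libraries; the function names are my own) evaluates the posterior from step 5 and confirms that it integrates to 1 for several $(n, s)$ pairs, and that the $n=1$ cases reduce to the two straight lines in step 6:

```python
from math import factorial

def posterior(pi, s, n):
    """Posterior from step 5: (n+1)!/(s!(n-s)!) * pi^s * (1-pi)^(n-s)."""
    coef = factorial(n + 1) / (factorial(s) * factorial(n - s))
    return coef * pi**s * (1 - pi)**(n - s)

def integrate01(f, steps=100_000):
    """Midpoint-rule approximation of the integral of f over [0, 1]."""
    h = 1.0 / steps
    return sum(f((i + 0.5) * h) for i in range(steps)) * h

# The posterior is a proper density: it should integrate to 1 for any (n, s).
for n, s in [(1, 0), (1, 1), (10, 3)]:
    total = integrate01(lambda p: posterior(p, s, n))
    print(f"n={n}, s={s}: integral = {total:.6f}")

# The n = 1 cases match the straight lines from step 6.
print(posterior(0.0, 0, 1), posterior(1.0, 0, 1))  # 2(1-pi) at the endpoints
print(posterior(0.0, 1, 1), posterior(1.0, 1, 1))  # 2*pi at the endpoints
```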