- Apply Bayes' Theorem: The posterior distribution p(π∣D) is proportional to the likelihood p(D∣π) times the prior p(π):

  p(π∣D) = p(D∣π)p(π) / p(D) = p(D∣π)p(π) / ∫₀¹ p(D∣π)p(π) dπ
- Define the Prior: A uniform prior over π∈[0,1] means p(π)=1.
- Calculate the Likelihood Component: From part (a), the likelihood is p(D∣π) = π^s (1−π)^(n−s). The numerator is thus:

  p(D∣π)p(π) = π^s (1−π)^(n−s) · 1 = π^s (1−π)^(n−s)
- Calculate the Marginal Likelihood (Denominator): We integrate the numerator over all possible values of π:

  p(D) = ∫₀¹ π^s (1−π)^(n−s) dπ

  Using the given identity with m = s and n′ = n − s:

  p(D) = s!(n−s)! / (s+(n−s)+1)! = s!(n−s)! / (n+1)!
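As a quick sanity check (not part of the original solution), the closed form above can be compared against a simple numerical quadrature; the function names here are illustrative:

```python
from math import factorial

def marginal_likelihood(s, n):
    # Closed form from the identity: s!(n-s)! / (n+1)!
    return factorial(s) * factorial(n - s) / factorial(n + 1)

def numeric_integral(s, n, steps=100_000):
    # Midpoint-rule approximation of the integral of pi^s (1-pi)^(n-s) over [0, 1]
    h = 1.0 / steps
    return h * sum(((i + 0.5) * h) ** s * (1.0 - (i + 0.5) * h) ** (n - s)
                   for i in range(steps))

# The two should agree for any 0 <= s <= n.
for s, n in [(0, 1), (1, 1), (3, 10), (7, 20)]:
    assert abs(marginal_likelihood(s, n) - numeric_integral(s, n)) < 1e-6
```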
- Compute the Posterior: Divide the numerator by the denominator:

  p(π∣D) = π^s (1−π)^(n−s) / [s!(n−s)! / (n+1)!] = (n+1)! / (s!(n−s)!) · π^s (1−π)^(n−s)
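A minimal numerical check of this result: a valid posterior density must integrate to 1 over [0, 1]. The helper names below are illustrative, using only the standard library:

```python
from math import factorial

def posterior(pi, s, n):
    # (n+1)! / (s!(n-s)!) * pi^s (1-pi)^(n-s), as derived above
    coef = factorial(n + 1) / (factorial(s) * factorial(n - s))
    return coef * pi ** s * (1.0 - pi) ** (n - s)

def integrate_unit(f, steps=100_000):
    # Midpoint rule on [0, 1]
    h = 1.0 / steps
    return h * sum(f((i + 0.5) * h) for i in range(steps))

# The posterior should normalize to 1 for any choice of s <= n.
assert abs(integrate_unit(lambda p: posterior(p, 3, 10)) - 1.0) < 1e-6
```

This is the density of a Beta(s+1, n−s+1) distribution, which is why the normalizing constant comes out in factorials.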
- Plotting for n=1: For n=1, the possible values of s (the sum of the single observation, so s = x1) are s=0 or s=1.
- If s=0: p(π∣x1=0) = 2!/(0!·1!) · π^0 (1−π)^1 = 2(1−π). This is a straight line from (0,2) to (1,0).
- If s=1: p(π∣x1=1) = 2!/(1!·0!) · π^1 (1−π)^0 = 2π. This is a straight line from (0,0) to (1,2).
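The two n=1 cases above can be confirmed by plugging n=1 into the general posterior formula; this short sketch (illustrative names, standard library only) checks that it reduces to the two straight lines:

```python
from math import factorial

def posterior(pi, s, n):
    # General posterior (n+1)! / (s!(n-s)!) * pi^s (1-pi)^(n-s)
    return (factorial(n + 1) / (factorial(s) * factorial(n - s))
            * pi ** s * (1.0 - pi) ** (n - s))

# For n = 1, the posterior reduces to 2(1 - pi) when s = 0 and 2*pi when s = 1.
for p in [0.0, 0.25, 0.5, 0.75, 1.0]:
    assert abs(posterior(p, 0, 1) - 2 * (1 - p)) < 1e-12
    assert abs(posterior(p, 1, 1) - 2 * p) < 1e-12
```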