Prerequisite Knowledge
- Predictive Distribution: The probability of a new sample x given the observed data D, obtained by marginalizing over the parameter π:
  p(x∣D) = ∫ p(x∣π) p(π∣D) dπ
  For a Bernoulli likelihood this amounts to taking the expected value of the parameter under the posterior distribution (see the numerical check after this list).
- Bernoulli Expectation: Since x ∈ {0, 1},
  - p(x=1∣D) = E[π∣D]
  - p(x∣D) can be written as π̂^x (1−π̂)^(1−x), where π̂ = p(x=1∣D).
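As a quick sanity check, the sketch below numerically evaluates the marginalization for an illustrative posterior and confirms that the Bernoulli predictive probability p(x=1∣D) equals the posterior mean. The Beta(3, 5) posterior is an assumed example, not taken from the text.

```python
from scipy import integrate, stats

# Hypothetical posterior p(pi | D); a Beta(3, 5) is chosen purely for illustration.
posterior = stats.beta(3, 5)

# p(x=1 | D) = integral_0^1 p(x=1 | pi) p(pi | D) dpi, with p(x=1 | pi) = pi
pred_x1, _ = integrate.quad(lambda p: p * posterior.pdf(p), 0.0, 1.0)

print(pred_x1)           # ~0.375
print(posterior.mean())  # 3 / (3 + 5) = 0.375 -- the predictive probability is the posterior mean
```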
Step-by-Step Proof
- Identify p(x=1∣D):
  The predictive probability of the next outcome being 1 (x = 1) is:
  p(x=1∣D) = ∫₀¹ p(x=1∣π) p(π∣D) dπ
  Since p(x=1∣π) = π:
  p(x=1∣D) = ∫₀¹ π · p(π∣D) dπ = E[π∣D]
  This is simply the mean of the posterior distribution.
- Calculate the Mean of the Posterior:
  Substitute Eq. (3.33) into the integral:
  E[π∣D] = ∫₀¹ π · [ (n+1)! / (s!(n−s)!) ] π^s (1−π)^(n−s) dπ
         = (n+1)! / (s!(n−s)!) · ∫₀¹ π^(s+1) (1−π)^(n−s) dπ
- Apply the Integral Identity:
  Use Eq. (3.32) again, with m = s+1 and exponent n−s on (1−π):
  ∫₀¹ π^(s+1) (1−π)^(n−s) dπ = (s+1)!(n−s)! / ((s+1)+(n−s)+1)! = (s+1)!(n−s)! / (n+2)!
- Combine Terms:
  E[π∣D] = (n+1)! / (s!(n−s)!) · (s+1)!(n−s)! / (n+2)!
  Cancel (n−s)!:
         = (n+1)! / s! · (s+1)! / (n+2)!
  Expand the factorials:
  - (s+1)! / s! = s+1
  - (n+1)! / (n+2)! = 1/(n+2)
  E[π∣D] = (s+1) / (n+2)
- Formulate the Predictive PDF:
  Since x is Bernoulli, if p(x=1∣D) = (s+1)/(n+2), then:
  p(x∣D) = ( (s+1)/(n+2) )^x ( 1 − (s+1)/(n+2) )^(1−x)
  This matches Eq. (3.34). (A numerical check of these steps follows the list.)
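The derivation can be verified numerically. The sketch below plugs in an assumed example (s = 7 successes in n = 10 trials), evaluates the posterior of Eq. (3.33) and the integral identity of Eq. (3.32) by quadrature, and confirms that the posterior mean equals (s+1)/(n+2); the particular values of s and n are illustrative only.

```python
from math import factorial
from scipy import integrate

s, n = 7, 10  # assumed example: 7 successes out of 10 trials

# Posterior under a uniform prior, Eq. (3.33): p(pi | D) = (n+1)!/(s!(n-s)!) * pi^s * (1-pi)^(n-s)
def posterior(pi):
    norm = factorial(n + 1) / (factorial(s) * factorial(n - s))
    return norm * pi**s * (1 - pi)**(n - s)

# Integral identity, Eq. (3.32): integral_0^1 pi^(s+1) (1-pi)^(n-s) dpi = (s+1)!(n-s)!/(n+2)!
lhs, _ = integrate.quad(lambda p: p**(s + 1) * (1 - p)**(n - s), 0.0, 1.0)
rhs = factorial(s + 1) * factorial(n - s) / factorial(n + 2)
print(lhs, rhs)  # both ~5.05e-4

# Posterior mean = predictive probability p(x=1 | D) = (s+1)/(n+2)
mean, _ = integrate.quad(lambda p: p * posterior(p), 0.0, 1.0)
print(mean, (s + 1) / (n + 2))  # both = 8/12 ~ 0.6667
```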
Effective Bayesian Estimate
The effective Bayesian estimate for π, which is the parameter used for prediction, is:
π̂_Bayes = (s+1)/(n+2)
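A minimal sketch of using this estimate for prediction; the helper names bayes_estimate and predictive are hypothetical, introduced only for this example:

```python
def bayes_estimate(s: int, n: int) -> float:
    """Posterior-mean estimate of pi under a uniform prior: (s + 1) / (n + 2)."""
    return (s + 1) / (n + 2)

def predictive(x: int, s: int, n: int) -> float:
    """Predictive p(x | D) for a Bernoulli outcome x in {0, 1}, as in Eq. (3.34)."""
    pi_hat = bayes_estimate(s, n)
    return pi_hat**x * (1 - pi_hat)**(1 - x)

print(bayes_estimate(7, 10))  # 8/12 ~ 0.667
print(predictive(1, 7, 10))   # same value: p(x=1 | D)
print(predictive(0, 7, 10))   # 4/12 ~ 0.333
```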
Intuitive Explanation ("Virtual" Samples)
The Maximum Likelihood Estimate (MLE) is π̂_MLE = s/n (successes / total trials).
The Bayesian estimate can be rewritten as:
π̂_Bayes = (s+1)/(n+2)
Intuition:
We can imagine we added 2 virtual samples to our dataset before we started:
- 1 virtual success (+1 in numerator)
- 1 virtual failure (+1 to the failure count; together with the virtual success, the total sample count increases by 2)
So n → n+2 (effective total count) and s → s+1 (effective success count).
These "virtual counts" come from the uniform prior: it acts as if we had already seen one head and one tail, smoothing the estimate. This prevents the estimate from being exactly 0 or 1 even when n is small (Laplace smoothing).