Answer
Prerequisites
- Expectation and Variance Properties
- Unbiased Estimator
- Properties of independent and identically distributed (i.i.d.) random variables
Step-by-Step Derivation
- An estimator $\hat{\lambda}$ is unbiased if its expected value equals the true underlying parameter, i.e., $\mathbb{E}[\hat{\lambda}] = \lambda$.
- Recall the ML estimator from part (a):
  $$\hat{\lambda}_{ML} = \frac{1}{N}\sum_{i=1}^{N} x_i$$
- Calculate the expectation of $\hat{\lambda}_{ML}$ using the linearity of expectation:
  $$\mathbb{E}\big[\hat{\lambda}_{ML}\big] = \mathbb{E}\!\left[\frac{1}{N}\sum_{i=1}^{N} x_i\right] = \frac{1}{N}\sum_{i=1}^{N}\mathbb{E}[x_i]$$
- Since each $x_i$ is drawn from a Poisson distribution with parameter $\lambda$, the expected value of a single sample is $\mathbb{E}[x_i] = \lambda$. Substituting:
  $$\mathbb{E}\big[\hat{\lambda}_{ML}\big] = \frac{1}{N}\sum_{i=1}^{N}\lambda = \frac{1}{N}\cdot N\lambda = \lambda$$
  Because $\mathbb{E}\big[\hat{\lambda}_{ML}\big] = \lambda$, the ML estimator is unbiased.
- Next, calculate the variance of the estimator, $\operatorname{Var}\big[\hat{\lambda}_{ML}\big]$:
  $$\operatorname{Var}\big[\hat{\lambda}_{ML}\big] = \operatorname{Var}\!\left[\frac{1}{N}\sum_{i=1}^{N} x_i\right]$$
- By the properties of variance, multiplying a random variable by a constant $c$ scales the variance by $c^2$ (here $c = \tfrac{1}{N}$). Also, because the samples are independent, the variance of their sum is the sum of their individual variances:
  $$\operatorname{Var}\big[\hat{\lambda}_{ML}\big] = \frac{1}{N^2}\sum_{i=1}^{N}\operatorname{Var}[x_i]$$
- For a Poisson distribution, we are given that $\operatorname{Var}[x_i] = \lambda$. Substituting this into the equation yields:
  $$\operatorname{Var}\big[\hat{\lambda}_{ML}\big] = \frac{1}{N^2}\sum_{i=1}^{N}\lambda = \frac{1}{N^2}\cdot N\lambda = \frac{\lambda}{N}$$
  Thus, the estimator variance is indeed $\frac{\lambda}{N}$.
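As a quick sanity check, both results, $\mathbb{E}\big[\hat{\lambda}_{ML}\big] = \lambda$ and $\operatorname{Var}\big[\hat{\lambda}_{ML}\big] = \lambda/N$, can be verified empirically by simulation. The sketch below uses arbitrary illustration values for $\lambda$, $N$, and the trial count (they are not part of the problem statement):

```python
import numpy as np

# Empirical check of E[lambda_hat] = lambda and Var[lambda_hat] = lambda / N.
# lam, N, and trials are arbitrary illustration values.
rng = np.random.default_rng(0)
lam, N, trials = 3.0, 50, 200_000

# Each row is one dataset of N i.i.d. Poisson(lam) samples.
samples = rng.poisson(lam, size=(trials, N))

# ML estimate per dataset: lambda_hat = (1/N) * sum(x_i), i.e. the sample mean.
ml_estimates = samples.mean(axis=1)

print(ml_estimates.mean())  # close to lam = 3.0 (unbiasedness)
print(ml_estimates.var())   # close to lam / N = 0.06
```

Averaging over many simulated datasets, the mean of the ML estimates converges to $\lambda$ and their spread converges to $\lambda/N$, matching the derivation above.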