Problem 2.1 (b)

Pre-required Knowledge

  • Expectation Properties: Linearity of expectation: $\mathbb{E}[aX + bY] = a\mathbb{E}[X] + b\mathbb{E}[Y]$.
  • Variance Properties: For independent variables, $\text{var}\left(\sum X_i\right) = \sum \text{var}(X_i)$. Also, $\text{var}(aX) = a^2 \text{var}(X)$.
  • Unbiased Estimator: An estimator $\hat{\theta}$ is unbiased if its expected value equals the true parameter value, i.e., $\mathbb{E}[\hat{\theta}] = \theta$.
  • I.I.D.: Independent and Identically Distributed. Since the $k_i$ are i.i.d. samples from $\text{Poisson}(\lambda)$, we have $\mathbb{E}[k_i] = \lambda$ and $\text{var}(k_i) = \lambda$.
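The Poisson mean–variance identity in the last bullet can be checked numerically; a minimal sketch with NumPy, using an illustrative rate $\lambda = 3$ (not a value from the problem):

```python
import numpy as np

# Draw many Poisson(lam) samples and compare the empirical mean and
# variance: for a Poisson distribution both equal lam.
rng = np.random.default_rng(42)
lam = 3.0
k = rng.poisson(lam, size=1_000_000)
print(k.mean(), k.var())  # both close to lam = 3.0
```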

Step-by-Step Answer

  1. Recall the ML Estimator: From part (a), the estimator is: $$\hat{\lambda} = \frac{1}{N} \sum_{i=1}^{N} k_i$$

  2. Show Unbiasedness: We calculate the expectation of $\hat{\lambda}$:

    $$\begin{aligned} \mathbb{E}[\hat{\lambda}] &= \mathbb{E}\left[ \frac{1}{N} \sum_{i=1}^{N} k_i \right] \\ &= \frac{1}{N} \sum_{i=1}^{N} \mathbb{E}[k_i] \quad \text{(linearity of expectation)} \end{aligned}$$

    Since each $k_i$ is drawn from a Poisson distribution with parameter $\lambda$, we know $\mathbb{E}[k_i] = \lambda$.

    $$\begin{aligned} \mathbb{E}[\hat{\lambda}] &= \frac{1}{N} \sum_{i=1}^{N} \lambda \\ &= \frac{1}{N} (N\lambda) \\ &= \lambda \end{aligned}$$

    Since $\mathbb{E}[\hat{\lambda}] = \lambda$, the estimator is unbiased.

  3. Calculate the Variance: We calculate the variance of $\hat{\lambda}$:

    $$\begin{aligned} \text{var}(\hat{\lambda}) &= \text{var}\left( \frac{1}{N} \sum_{i=1}^{N} k_i \right) \\ &= \frac{1}{N^2} \text{var}\left( \sum_{i=1}^{N} k_i \right) \quad \text{(since } \text{var}(aX) = a^2 \text{var}(X)\text{)} \end{aligned}$$

    Since the $k_i$ samples are independent, the variance of the sum is the sum of the variances:

    $$\text{var}(\hat{\lambda}) = \frac{1}{N^2} \sum_{i=1}^{N} \text{var}(k_i)$$

    For a Poisson distribution, the variance equals the mean, so $\text{var}(k_i) = \lambda$.

    $$\begin{aligned} \text{var}(\hat{\lambda}) &= \frac{1}{N^2} \sum_{i=1}^{N} \lambda \\ &= \frac{1}{N^2} (N\lambda) \\ &= \frac{\lambda}{N} \end{aligned}$$

    Thus, the estimator variance is $\frac{\lambda}{N}$.
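Both results above can be sanity-checked by simulation; a minimal Monte Carlo sketch with NumPy, assuming illustrative values $\lambda = 4$ and $N = 50$ (neither is specified in the problem):

```python
import numpy as np

rng = np.random.default_rng(0)
lam, N, trials = 4.0, 50, 200_000  # assumed illustrative parameters

# Each row is one experiment of N i.i.d. Poisson(lam) draws;
# the ML estimate lam_hat is the sample mean of each row.
samples = rng.poisson(lam, size=(trials, N))
lam_hat = samples.mean(axis=1)

# Unbiasedness: the average of lam_hat across trials approaches lam.
print(lam_hat.mean())  # close to lam = 4.0
# Variance: the empirical variance of lam_hat approaches lam / N = 0.08.
print(lam_hat.var())   # close to 0.08
```

The `0.08` target is just $\lambda / N = 4 / 50$ for the assumed values; with enough trials both empirical quantities match the derived expressions to a few decimal places.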