Answer

Prerequisites

  • Maximum Likelihood Estimation (MLE)
  • Log-likelihood
  • Calculus (Differentiation)

Step-by-Step Derivation

  1. The likelihood function $L(\lambda)$ for $N$ independent and identically distributed (i.i.d.) samples $\{k_1, k_2, \dots, k_N\}$ is the product of their individual probability mass functions (PMFs): $$L(\lambda) = \prod_{i=1}^N p(x = k_i \mid \lambda) = \prod_{i=1}^N \frac{1}{k_i!} e^{-\lambda} \lambda^{k_i}$$

  2. Take the natural logarithm to obtain the log-likelihood function $l(\lambda)$. This turns the product into a sum, which is easier to differentiate: $$l(\lambda) = \ln L(\lambda) = \sum_{i=1}^N \ln\left( \frac{1}{k_i!} e^{-\lambda} \lambda^{k_i} \right) = \sum_{i=1}^N \left( -\ln(k_i!) - \lambda + k_i \ln\lambda \right) = -\sum_{i=1}^N \ln(k_i!) - N\lambda + (\ln\lambda) \sum_{i=1}^N k_i$$

  3. To find the maximum-likelihood estimate, take the derivative of $l(\lambda)$ with respect to $\lambda$ and set it to $0$ (the first term does not depend on $\lambda$, so it vanishes): $$\frac{\mathrm{d}l(\lambda)}{\mathrm{d}\lambda} = -N + \frac{1}{\lambda} \sum_{i=1}^N k_i = 0$$

  4. Solve for $\lambda$ to obtain the estimator $\hat{\lambda}$: $$N = \frac{1}{\lambda} \sum_{i=1}^N k_i \quad\Longrightarrow\quad \hat{\lambda}_{\text{ML}} = \frac{1}{N} \sum_{i=1}^N k_i$$ The MLE of $\lambda$ is simply the sample mean of the observed counts. This critical point is indeed a maximum, since the second derivative $\frac{\mathrm{d}^2 l}{\mathrm{d}\lambda^2} = -\frac{1}{\lambda^2} \sum_{i=1}^N k_i$ is negative whenever at least one $k_i > 0$.
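As a numerical sanity check, the derivation above can be verified by simulation: the sample mean of i.i.d. Poisson draws should maximize the log-likelihood from step 2. Below is a minimal sketch in Python; the `sample_poisson` helper (Knuth's multiplication method) and the true rate of 3.5 are illustrative assumptions, not part of the original answer.

```python
import math
import random

def poisson_log_likelihood(lam, samples):
    """l(lambda) = -sum(ln k_i!) - N*lambda + ln(lambda) * sum(k_i).

    math.lgamma(k + 1) computes ln(k!) exactly as needed here.
    """
    n = len(samples)
    return (-sum(math.lgamma(k + 1) for k in samples)
            - n * lam
            + math.log(lam) * sum(samples))

def sample_poisson(lam):
    """Draw one Poisson(lam) variate (Knuth's multiplication method)."""
    threshold = math.exp(-lam)
    k, p = 0, 1.0
    while True:
        p *= random.random()      # multiply uniforms until product <= e^{-lam}
        if p <= threshold:
            return k
        k += 1

random.seed(0)
true_lambda = 3.5                 # illustrative choice
samples = [sample_poisson(true_lambda) for _ in range(10_000)]

# The MLE derived above: the sample mean.
lambda_hat = sum(samples) / len(samples)

# The log-likelihood at lambda_hat should exceed its value at nearby lambdas.
for lam in (lambda_hat - 0.1, lambda_hat + 0.1):
    assert poisson_log_likelihood(lambda_hat, samples) > poisson_log_likelihood(lam, samples)

print(f"true lambda = {true_lambda}, MLE = {lambda_hat:.3f}")
```

With 10,000 samples the estimate lands close to the true rate, and perturbing $\lambda$ in either direction strictly decreases the log-likelihood, consistent with the sign of the second derivative in step 4.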