Maximum Likelihood Estimation (MLE):
Likelihood Function:
The likelihood function is the joint probability density function of the observed data, viewed as a function of the
parameter \( \theta \). Since the \(X_i\) are independent, the likelihood is the product of their individual density functions:
\[ L(\theta) = \prod_{i=1}^{n} f_X(x_i; \theta) = \theta^n \exp\!\left(-\theta \sum_{i=1}^{n} (x_i - 1)\right) \]
Log-Likelihood:
Taking the logarithm of the likelihood function simplifies calculations:
\[ \ln L(\theta) = n \ln(\theta) - \theta \sum_{i=1}^{n} (x_i - 1) \]
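As a quick sanity check of this simplification, the closed-form log-likelihood can be compared against the logarithm of the likelihood product directly. The data values and \( \theta \) below are made up purely for illustration:

```python
import math

# Density of each observation: f(x; theta) = theta * exp(-theta * (x - 1)), x >= 1.
# Sample data and theta are arbitrary illustrative values.
xs = [1.2, 1.7, 2.3, 1.1, 3.0]
theta = 0.8
n = len(xs)

# Likelihood as the product of individual densities, then take the log
likelihood = math.prod(theta * math.exp(-theta * (x - 1)) for x in xs)

# Closed form: n*ln(theta) - theta * sum(x_i - 1)
log_lik = n * math.log(theta) - theta * sum(x - 1 for x in xs)

print(math.isclose(math.log(likelihood), log_lik))  # True
```

Working on the log scale also avoids the numerical underflow that the raw product suffers for large \( n \).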
Derivative:
To find the MLE, we differentiate the log-likelihood with respect to \( \theta \) and set it equal to 0:
\[ \frac{d}{d\theta} \ln L(\theta) = \frac{n}{\theta} - \sum_{i=1}^{n} (x_i - 1) = 0 \]
Since \( \frac{d^2}{d\theta^2} \ln L(\theta) = -\frac{n}{\theta^2} < 0 \), this critical point is indeed a maximum.
Solve for \( \theta \):
Solving for \( \theta \), and writing \( \bar{x} = \frac{1}{n} \sum_{i=1}^{n} x_i \) for the sample mean, gives:
\[ \hat{\theta} = \frac{n}{\sum_{i=1}^{n} (x_i - 1)} = \frac{n}{n\bar{x} - n} = \frac{1}{\bar{x} - 1} \]
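The estimator \( \hat{\theta} = 1/(\bar{x} - 1) \) can be checked by simulation: draw a large sample from the shifted exponential (an observation is \( 1 + E \) with \( E \sim \text{Exponential}(\theta) \)) and confirm the estimate lands near the true parameter. The value of \( \theta \) below is an arbitrary choice for the demonstration:

```python
import random
import statistics

random.seed(0)
theta_true = 2.5  # arbitrary true parameter for the simulation
n = 100_000

# A draw from the shifted exponential is 1 plus a standard Exponential(theta) draw
samples = [1.0 + random.expovariate(theta_true) for _ in range(n)]

# MLE derived above: theta_hat = 1 / (sample mean - 1)
xbar = statistics.fmean(samples)
theta_hat = 1.0 / (xbar - 1.0)
print(theta_hat)  # close to theta_true = 2.5
```

With \( n = 100{,}000 \) draws the estimate typically agrees with the true value to about two decimal places, consistent with the MLE's consistency.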