Let \(X_1, X_2, \ldots, X_n\) be independent and identically distributed random variables with probability density function given by
\[ f_X(x; \theta) = \begin{cases}
\theta e^{-\theta(x-1)}, & \text{if } x \geq 1 \\
0, & \text{otherwise}
\end{cases} \]

Also, let \(X = \frac{1}{n} \sum_{i=1}^{n} X_i\). Then the maximum likelihood estimator of \(\theta\) is

(A) \(\frac{1}{X}\)

(B) \(\frac{1}{X^{\frac{1}{\theta} - 1}}\)

(C) \(\frac{1}{X - 1}\)

(D) \(X\)
1 Answer

Maximum Likelihood Estimation (MLE):

Likelihood Function:

The likelihood function is the joint probability density function of the observed data, viewed as a function of the parameter \( \theta \). Since the \(X_i\) are independent, the likelihood is the product of their individual density functions (valid when every \(x_i \geq 1\)):
\[ L(\theta) = \prod_{i=1}^{n} f_X(x_i; \theta) = \theta^n \exp\left(-\theta \sum_{i=1}^{n} (x_i - 1)\right) \]
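As a quick numerical sanity check that the product really collapses to this closed form, here is a short Python sketch (the parameter value and sample values are arbitrary illustration data):

```python
import numpy as np

theta = 1.7                          # arbitrary parameter value
x = np.array([1.2, 1.5, 2.3, 1.1])   # arbitrary sample with all x_i >= 1

# Product of the individual densities theta * exp(-theta * (x_i - 1))
product_form = np.prod(theta * np.exp(-theta * (x - 1)))

# Closed form: theta^n * exp(-theta * sum(x_i - 1))
closed_form = theta ** len(x) * np.exp(-theta * np.sum(x - 1))

print(np.isclose(product_form, closed_form))  # True
```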

Log-Likelihood:
Taking the logarithm of the likelihood function simplifies calculations:
\[ \ln L(\theta) = n \ln(\theta) - \theta \sum_{i=1}^{n} (x_i - 1) \]

Derivative:
To find the MLE, we differentiate the log-likelihood with respect to \( \theta \) and set it equal to 0:
\[ \frac{d}{d\theta} \ln L(\theta) = \frac{n}{\theta} - \sum_{i=1}^{n} (x_i - 1) = 0 \]
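For anyone who wants to double-check the calculus, here is a small SymPy sketch; the symbol S is a stand-in for \( \sum_{i=1}^{n} (x_i - 1) \), an assumption of this sketch:

```python
import sympy as sp

n, theta, S = sp.symbols('n theta S', positive=True)

# Log-likelihood from above, with S standing in for sum(x_i - 1)
log_L = n * sp.log(theta) - theta * S

score = sp.diff(log_L, theta)
print(score)                              # n/theta - S, matching the step above
print(sp.solve(sp.Eq(score, 0), theta))   # [n/S]
```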

Solve for \( \theta \):
Solving for \( \theta \), and writing \(X = \frac{1}{n}\sum_{i=1}^{n} x_i\) for the sample mean, gives:
\[ \hat{\theta} = \frac{n}{\sum_{i=1}^{n} (x_i - 1)} = \frac{n}{nX - n} = \frac{1}{X - 1} \]
Since \( \frac{d^2}{d\theta^2} \ln L(\theta) = -\frac{n}{\theta^2} < 0 \), the log-likelihood is concave, so this stationary point is indeed the maximum. Hence the maximum likelihood estimator of \( \theta \) is \( \frac{1}{X - 1} \), and the correct option is (C).
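As a final sanity check, a minimal Monte Carlo sketch: it samples \(X_i = 1 + E_i\) with \(E_i\) exponential of rate \( \theta \) (exactly the shifted-exponential density in the question) and confirms that \(\frac{1}{X - 1}\) recovers the true parameter. The seed, sample size, and true \( \theta \) below are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(0)   # arbitrary seed
theta_true = 2.5                 # arbitrary "true" parameter
n = 100_000                      # arbitrary sample size

# X_i = 1 + E_i with E_i ~ Exponential(rate = theta), i.e. scale = 1/theta,
# which is exactly the density f_X(x; theta) in the question.
x = 1.0 + rng.exponential(scale=1.0 / theta_true, size=n)

theta_hat = 1.0 / (x.mean() - 1.0)   # MLE derived above: 1 / (X - 1)
print(theta_hat)                     # close to theta_true for large n
```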
 
