Math Problem Statement

Let $X_1, \ldots, X_n$ be a random sample of size $n$ from the Geometric distribution, given by $P(X = x) = q^{x-1}p$, where $0 \leq p \leq 1$ and $q = 1 - p$. Obtain the maximum likelihood estimator of $p$.

Solution

To obtain the Maximum Likelihood Estimator (MLE) of $p$ for a random sample $X_1, X_2, \ldots, X_n$ from the geometric distribution, we proceed as follows:


1. Likelihood Function

The probability mass function (PMF) of the geometric distribution is:

$$P(X = x) = q^{x-1} p, \quad \text{where } q = 1 - p, \quad x = 1, 2, 3, \ldots$$

For a random sample X1,X2,,XnX_1, X_2, \ldots, X_n, the likelihood function is:

$$L(p) = \prod_{i=1}^n P(X_i) = \prod_{i=1}^n q^{X_i - 1} p$$

Substitute $q = 1 - p$ into the likelihood function:

$$L(p) = \prod_{i=1}^n (1 - p)^{X_i - 1} p$$

Simplify:

$$L(p) = p^n (1 - p)^{\sum_{i=1}^n (X_i - 1)}$$
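
As an illustration (not part of the derivation), the short sketch below checks numerically, for a small hypothetical sample, that the raw product and the simplified form agree:

```python
# Numeric sanity check: for a hypothetical sample, the raw product
#   prod_i q^(X_i - 1) * p
# should equal the simplified form p^n * (1 - p)^S with S = sum(X_i - 1).
import math

x = [3, 1, 4, 2, 2]   # hypothetical observed sample, each X_i >= 1
p = 0.3               # arbitrary test value of p in (0, 1)
q = 1 - p
n = len(x)
S = sum(xi - 1 for xi in x)

product_form = math.prod(q ** (xi - 1) * p for xi in x)
simplified_form = p ** n * q ** S

assert math.isclose(product_form, simplified_form)
print(product_form, simplified_form)  # both print the same value
```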


2. Log-Likelihood Function

Take the natural logarithm of the likelihood function:

$$\ell(p) = \ln L(p) = \ln(p^n) + \ln\left((1 - p)^{\sum_{i=1}^n (X_i - 1)}\right)$$

$$\ell(p) = n \ln(p) + \left(\sum_{i=1}^n (X_i - 1)\right) \ln(1 - p)$$

Let $S = \sum_{i=1}^n (X_i - 1) = \sum_{i=1}^n X_i - n$. Substitute this:

$$\ell(p) = n \ln(p) + S \ln(1 - p)$$
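
Before solving analytically, the log-likelihood can be maximized numerically as a cross-check. Below is a minimal sketch, assuming the same hypothetical sample as above and SciPy's `minimize_scalar`; its output previews the closed form derived in the next step:

```python
# Numerically maximize l(p) = n*ln(p) + S*ln(1 - p) over p in (0, 1).
import numpy as np
from scipy.optimize import minimize_scalar

x = np.array([3, 1, 4, 2, 2])  # hypothetical sample
n = len(x)
S = x.sum() - n                # S = sum(X_i) - n

def neg_log_likelihood(p):
    # Negated because minimize_scalar minimizes.
    return -(n * np.log(p) + S * np.log(1 - p))

res = minimize_scalar(neg_log_likelihood, bounds=(1e-9, 1 - 1e-9), method="bounded")
print(res.x)  # ~0.4167, i.e. n / sum(X_i) = 5/12
```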


3. Differentiation

Differentiate $\ell(p)$ with respect to $p$:

$$\frac{d\ell(p)}{dp} = \frac{n}{p} - \frac{S}{1 - p}$$

Set $\frac{d\ell(p)}{dp} = 0$ to find the maximum likelihood estimator of $p$:

$$\frac{n}{p} - \frac{S}{1 - p} = 0$$

Rearrange:

$$\frac{n}{p} = \frac{S}{1 - p}$$

Cross-multiply:

$$n(1 - p) = Sp$$

Simplify:

$$n - np = Sp$$

$$n = np + Sp$$

$$n = p(n + S)$$

Solve for $p$:

$$\hat{p} = \frac{n}{n + S}$$

Substitute $S = \sum_{i=1}^n X_i - n$:

$$\hat{p} = \frac{n}{n + \left(\sum_{i=1}^n X_i - n\right)}$$

Simplify:

$$\hat{p} = \frac{n}{\sum_{i=1}^n X_i}$$

Since $\frac{d^2\ell(p)}{dp^2} = -\frac{n}{p^2} - \frac{S}{(1 - p)^2} < 0$, the log-likelihood is concave, so this critical point is indeed a maximum.
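
As a symbolic cross-check on the algebra above, a short SymPy sketch (with $n$ and $S$ treated as positive symbols) recovers the same critical point:

```python
# Symbolically solve dl/dp = 0 for l(p) = n*ln(p) + S*ln(1 - p).
import sympy as sp

p = sp.symbols("p", positive=True)
n, S = sp.symbols("n S", positive=True)

ell = n * sp.log(p) + S * sp.log(1 - p)           # log-likelihood
critical_points = sp.solve(sp.Eq(sp.diff(ell, p), 0), p)
print(critical_points)                            # [n/(S + n)]
```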


Final Result

The Maximum Likelihood Estimator (MLE) of $p$ is:

$$\hat{p} = \frac{n}{\sum_{i=1}^n X_i}$$
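
As a final sanity check (a sketch, not part of the solution), a quick Monte Carlo experiment with NumPy draws geometric samples at a known $p$ and confirms that $\hat{p} = n / \sum X_i$ lands near the true value. Note that NumPy's `geometric` sampler uses the same support $\{1, 2, 3, \ldots\}$ as the PMF above:

```python
import numpy as np

rng = np.random.default_rng(seed=0)
p_true = 0.25
n = 10_000

x = rng.geometric(p_true, size=n)  # X_i = trials until first success
p_hat = n / x.sum()
print(p_hat)                       # close to 0.25 for large n
```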


Additional Insight:

  • The MLE $\hat{p}$ is the reciprocal of the sample mean: since $\bar{X} = \frac{1}{n}\sum_{i=1}^n X_i$, we have $\hat{p} = 1/\bar{X}$. This is intuitive, because the mean of this geometric distribution is $1/p$. A small worked check follows below.
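
As a quick worked check with a small hypothetical sample $\{3, 1, 4, 2\}$: here $n = 4$ and $\sum_{i=1}^4 X_i = 10$, so $\hat{p} = 4/10 = 0.4$, which is exactly $1/\bar{X} = 1/2.5$.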



5 Related Questions:

  1. How is the sample mean of $X_1, \ldots, X_n$ related to the MLE of $p$?
  2. What is the intuition behind the geometric distribution and its parameters?
  3. Can you derive the variance of $\hat{p}$?
  4. How does the MLE behave as $n \to \infty$?
  5. How would you test the goodness of fit for the estimated geometric model?

Tip:

Always check that the derived MLE lies in the parameter space (i.e., $0 < p \leq 1$) before using it in applications! Here, since each $X_i \geq 1$, we have $\sum_{i=1}^n X_i \geq n$, so $0 < \hat{p} \leq 1$ holds automatically.


Math Problem Analysis

Mathematical Concepts

Probability Theory
Maximum Likelihood Estimation (MLE)
Geometric Distribution

Formulas

$P(X = x) = q^{x-1}p$, where $q = 1 - p$
$L(p) = p^n (1 - p)^{\sum_{i=1}^n (X_i - 1)}$
Log-likelihood: $\ell(p) = n \ln(p) + \sum_{i=1}^n (X_i - 1) \ln(1 - p)$
MLE of $p$: $\hat{p} = n / \sum_{i=1}^n X_i$

Theorems

Maximum Likelihood Estimation Theorem

Suitable Grade Level

Undergraduate