Math Problem Statement

Given a random sample of size n from a population with known mean μ and finite variance σ^2, show that (1/n) ∑_(i=1)^n (X_i - μ)^2 is an unbiased estimator of σ^2.

Solution

We are tasked with showing that the following expression is an unbiased estimator of the population variance \( \sigma^2 \):

\[ \hat{\sigma}^2 = \frac{1}{n} \sum_{i=1}^{n} (X_i - \mu)^2 \]

Here, \( X_1, X_2, \dots, X_n \) is a random sample drawn from a population with known mean \( \mu \) and variance \( \sigma^2 \). To prove that \( \hat{\sigma}^2 \) is an unbiased estimator, we need to show that its expected value equals \( \sigma^2 \). In other words, we need to prove that:

\[ E\left[\frac{1}{n} \sum_{i=1}^{n} (X_i - \mu)^2\right] = \sigma^2 \]

Step 1: Break down the sum

We begin by analyzing the sum inside the expectation:

\[ \sum_{i=1}^{n} (X_i - \mu)^2 \]

Step 2: Expand the squared term

First, expand the square:

\[ (X_i - \mu)^2 = X_i^2 - 2X_i \mu + \mu^2 \]

Now, take the sum over \( i \):

\[ \sum_{i=1}^{n} (X_i - \mu)^2 = \sum_{i=1}^{n} X_i^2 - 2\mu \sum_{i=1}^{n} X_i + n\mu^2 \]
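
As an optional sanity check (not part of the proof), the expansion and the summed form can be verified symbolically. The sketch below assumes sympy is available; the symbol names are placeholders for a single observation and the known mean:

```python
import sympy as sp

# Placeholder symbols for one observation and the known population mean
X_i, mu = sp.symbols("X_i mu")

# Expand (X_i - mu)^2 exactly as in Step 2
expanded = sp.expand((X_i - mu) ** 2)

# Equals X_i^2 - 2*X_i*mu + mu^2 (sympy may print the terms in a different order)
print(expanded)
print(sp.simplify(expanded - (X_i**2 - 2 * X_i * mu + mu**2)) == 0)  # True
```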

Step 3: Take the expectation

Next, we compute the expectation of the entire sum. We can separate each term because the expectation operator is linear:

\[ E\left[\sum_{i=1}^{n} (X_i - \mu)^2\right] = E\left[\sum_{i=1}^{n} X_i^2\right] - 2\mu E\left[\sum_{i=1}^{n} X_i\right] + n\mu^2 \]

Now, evaluate the expectations for each term:

  • \( E[X_i^2] \) is the second moment of \( X_i \). Since \( X_i \) is drawn from a population with mean \( \mu \) and variance \( \sigma^2 \), rearranging the identity \( \sigma^2 = E[X_i^2] - (E[X_i])^2 \) gives (both expectations in this list are checked numerically in the sketch that follows):

    \( E[X_i^2] = \sigma^2 + \mu^2 \)

  • \( E[X_i] \) is the mean of the random variable \( X_i \), which is \( \mu \), so by linearity of expectation:

    \( E\left[\sum_{i=1}^{n} X_i\right] = n\mu \)
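
As a quick illustration of these two identities, here is a minimal Monte Carlo sketch. The normal distribution, the seed, and the parameter values are arbitrary choices for the example; the proof itself only requires a finite variance:

```python
import numpy as np

rng = np.random.default_rng(0)
mu, sigma = 2.0, 3.0                       # illustrative values, chosen arbitrarily
x = rng.normal(mu, sigma, size=1_000_000)  # any distribution with finite variance works

# Sample averages approximate the expectations used above
print(x.mean())       # approximately 2.0  (checks E[X_i] = mu)
print((x**2).mean())  # approximately 13.0 (checks E[X_i^2] = sigma^2 + mu^2 = 9 + 4)
```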

Thus, the expectation becomes:

\[ E\left[\sum_{i=1}^{n} (X_i - \mu)^2\right] = n(\sigma^2 + \mu^2) - 2\mu(n\mu) + n\mu^2 \]

Simplifying this expression:

\[ E\left[\sum_{i=1}^{n} (X_i - \mu)^2\right] = n\sigma^2 + n\mu^2 - 2n\mu^2 + n\mu^2 \]

\[ E\left[\sum_{i=1}^{n} (X_i - \mu)^2\right] = n\sigma^2 \]
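
If you want to double-check the algebra in this simplification, it can be done symbolically, for example with sympy (a sketch, not part of the derivation):

```python
import sympy as sp

n, mu, sigma = sp.symbols("n mu sigma", positive=True)

# The right-hand side from Step 3, written term by term
expectation = n * (sigma**2 + mu**2) - 2 * mu * (n * mu) + n * mu**2

# Collapses to n*sigma**2, matching the result above
print(sp.expand(expectation))  # n*sigma**2
```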

Step 4: Final Calculation

Now, divide by \( n \) (again using linearity of expectation) to obtain the expectation of \( \hat{\sigma}^2 \):

\[ E\left[\hat{\sigma}^2\right] = E\left[\frac{1}{n} \sum_{i=1}^{n} (X_i - \mu)^2\right] = \frac{1}{n} \times n\sigma^2 = \sigma^2 \]

Conclusion

We have shown that:

\[ E\left[\frac{1}{n} \sum_{i=1}^{n} (X_i - \mu)^2\right] = \sigma^2 \]

Thus, \( \frac{1}{n} \sum_{i=1}^{n} (X_i - \mu)^2 \) is indeed an unbiased estimator of \( \sigma^2 \).
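
The conclusion can also be illustrated numerically. The following minimal simulation (distribution, seed, and parameter values are arbitrary choices for the example) averages the estimator over many independent samples of size n; that average approximates its expectation:

```python
import numpy as np

rng = np.random.default_rng(42)
mu, sigma, n = 5.0, 2.0, 10   # illustrative values
n_reps = 200_000              # number of independent samples of size n

# Each row is one sample of size n; mu is treated as known
samples = rng.normal(mu, sigma, size=(n_reps, n))
estimates = ((samples - mu) ** 2).mean(axis=1)  # (1/n) * sum_i (X_i - mu)^2

# Averaging the estimator over many samples approximates E[sigma_hat^2]
print(estimates.mean())  # approximately sigma**2 = 4.0
```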


Would you like more details on any of the steps or concepts? Here are a few related questions you might find interesting:

  1. How does this result change if we estimate the population mean \( \mu \) from the sample data rather than assuming it's known?
  2. What would the expression for an unbiased estimator of variance be if we used \( n-1 \) in the denominator instead of \( n \)?
  3. How do unbiased estimators of population parameters differ from biased estimators in terms of properties like consistency and efficiency?
  4. Can you derive the variance of the estimator \( \hat{\sigma}^2 \)?
  5. How does the sample size \( n \) influence the reliability of this estimator?

Tip: The choice between \( n \) and \( n-1 \) in the denominator depends on whether the population mean \( \mu \) is known. When \( \mu \) is known, as in this problem, dividing by \( n \) already gives an unbiased estimator; when \( \mu \) is replaced by the sample mean \( \bar{X} \), dividing by \( n-1 \) corrects the resulting bias.
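
To see the contrast described in the tip, the sketch below compares dividing by n and by n-1 when the sample mean replaces the known μ. It uses numpy's ddof argument to control the denominator; the distribution, seed, and parameter values are arbitrary choices for the example:

```python
import numpy as np

rng = np.random.default_rng(1)
mu, sigma, n = 0.0, 1.0, 5
samples = rng.normal(mu, sigma, size=(200_000, n))

# Variance computed around each sample's own mean (mu treated as unknown)
divide_by_n   = np.var(samples, axis=1, ddof=0).mean()  # biased: approx (n-1)/n * sigma^2 = 0.8
divide_by_n_1 = np.var(samples, axis=1, ddof=1).mean()  # unbiased: approx sigma^2 = 1.0
print(divide_by_n, divide_by_n_1)
```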


Math Problem Analysis

Mathematical Concepts

Statistics
Unbiased Estimators
Variance Estimation

Formulas

1/n ∑_(i=1)^n (X_i - μ)^2
E[X_i^2] = σ^2 + μ^2

Theorems

Linearity of expectation
Definition of an unbiased estimator

Suitable Grade Level

College/University