Math Problem Statement

Suppose X̄ and S² are calculated from a random sample X₁, ..., Xₙ drawn from a population with finite variance σ². We know that E(S²) = σ². Prove that E(S) ≤ σ, and, if σ² > 0, then E(S) < σ.

Solution

The problem involves showing that $E(S) \leq \sigma$, and that if $\sigma^2 > 0$, then $E(S) < \sigma$.

We are given the following:

  • A random sample $X_1, X_2, \dots, X_n$ from a population with finite variance $\sigma^2$.
  • The sample mean $\bar{X}$ and the sample variance $S^2$, which is defined as:

S^2 = \frac{1}{n-1} \sum_{i=1}^n (X_i - \bar{X})^2.

We also know that:

E(S^2) = \sigma^2.
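
As a quick numerical sanity check of these two facts, here is a minimal sketch assuming NumPy (the $n-1$ divisor corresponds to `ddof=1`; the seed, $\sigma$, $n$, and replicate count are arbitrary illustrative choices):

```python
import numpy as np

# Sketch: the n-1 divisor (ddof=1) makes the average of S^2 across many
# samples land near sigma^2, illustrating E(S^2) = sigma^2.
rng = np.random.default_rng(0)
sigma, n, reps = 2.0, 10, 100_000

samples = rng.normal(0.0, sigma, size=(reps, n))
s2 = samples.var(axis=1, ddof=1)  # S^2 for each of the `reps` samples

print(s2.mean())  # hovers near sigma^2 = 4.0
```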

Part 1: Prove $E(S) \leq \sigma$

To prove this, we will use Jensen's inequality. Jensen's inequality states that for a convex function $g$, the following holds:

E(g(X)) \geq g(E(X)).

In this case, $S = \sqrt{S^2}$, and the square root function is concave, meaning $g(x) = \sqrt{x}$ satisfies the reverse inequality:

E(\sqrt{S^2}) \leq \sqrt{E(S^2)}.
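
A two-point toy example (illustrative only, not part of the proof) shows the direction of this inequality: if $X$ takes the values $0$ and $2$ with probability $1/2$ each, then

E(\sqrt{X}) = \tfrac{1}{2}\sqrt{0} + \tfrac{1}{2}\sqrt{2} \approx 0.707 < 1 = \sqrt{E(X)}.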

Combining the reverse inequality with $E(S^2) = \sigma^2$, we get:

E(S) = E(\sqrt{S^2}) \leq \sqrt{E(S^2)} = \sqrt{\sigma^2} = \sigma.

Thus, we have shown that $E(S) \leq \sigma$.
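
This bound is also easy to observe by simulation. A minimal sketch assuming NumPy, where the normal population, $\sigma = 3$, $n = 5$, and the replicate count are arbitrary illustrative choices:

```python
import numpy as np

# Sketch: Monte Carlo estimate of E(S). With ddof=1, std() returns
# S = sqrt(S^2) for each replicate.
rng = np.random.default_rng(1)
sigma, n, reps = 3.0, 5, 200_000

samples = rng.normal(0.0, sigma, size=(reps, n))
s = samples.std(axis=1, ddof=1)

print(s.mean())  # noticeably below sigma = 3.0 (about 2.82 here)
```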

Part 2: Prove that if $\sigma^2 > 0$, then $E(S) < \sigma$

To prove the strict inequality, examine when equality can hold in Part 1. Since $S$ is a random variable and $E(S^2) = \sigma^2$,

\operatorname{Var}(S) = E(S^2) - [E(S)]^2 = \sigma^2 - [E(S)]^2 \geq 0,

so $E(S) = \sigma$ would force $\operatorname{Var}(S) = 0$, meaning $S$ (and hence $S^2$) would have to be degenerate, i.e., equal to a constant with probability 1. This is exactly the equality condition in Jensen's inequality: because the square root function is strictly concave on $(0, \infty)$, the expected value of the square root is strictly less than the square root of the expected value unless $S^2$ has a degenerate (zero-variance) distribution, which would only happen if $\sigma^2 = 0$. When $\sigma^2 > 0$, the sample values vary, so $S^2$ has nonzero variance, and therefore

E(S) < \sigma.

Thus, for $\sigma^2 > 0$, we have $E(S) < \sigma$.
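
How large the gap is depends on the population and on $n$. For a normal population there is a well-known closed form, $E(S) = c_4(n)\,\sigma$ with $c_4(n) = \sqrt{2/(n-1)}\;\Gamma(n/2)/\Gamma((n-1)/2)$, which is strictly below 1 and tends to 1 as $n$ grows; the proof above is distribution-free and does not rely on this. A sketch of the computation:

```python
import math

# c4(n) = sqrt(2/(n-1)) * Gamma(n/2) / Gamma((n-1)/2); for normal samples
# E(S) = c4(n) * sigma. Strictly below 1, approaching 1 as n grows.
def c4(n: int) -> float:
    return math.sqrt(2.0 / (n - 1)) * math.gamma(n / 2) / math.gamma((n - 1) / 2)

for n in (2, 5, 30, 100):
    print(n, round(c4(n), 4))  # 0.7979, 0.94, 0.9914, 0.9975
```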


Would you like more details on Jensen's inequality or any further explanation on how the sample variance behaves?

5 Related Questions:

  1. What is Jensen's inequality, and why does it apply in this context?
  2. How is the sample variance $S^2$ an unbiased estimator of the population variance?
  3. Why is the square root function concave, and how does this affect the expectation?
  4. What happens to the relationship between $E(S)$ and $\sigma$ as the sample size $n$ increases?
  5. How does the strict inequality $E(S) < \sigma$ change if $\sigma^2 = 0$?

Tip: Remember that Jensen's inequality is a powerful tool whenever you are dealing with expectations of convex or concave functions.


Math Problem Analysis

Mathematical Concepts

Statistics
Expectation
Sample variance
Convexity and concavity
Jensen's inequality

Formulas

S² = (1/(n-1)) * Σ(Xᵢ - X̄)²
E(S²) = σ²
Jensen's inequality: E(g(X)) ≥ g(E(X)) for convex functions

Theorems

Jensen's Inequality
Unbiasedness of the sample variance (E(S²) = σ²)

Suitable Grade Level

Undergraduate