Math Problem Statement

Ex 3.1. Suppose Xi ∼ Bern(p), i = 1, 2, and they are independent of each other. Derive the distributions (PMFs or CDFs) of the following two random variables (functions of random variables): g(X1, X2) and f(X1, X2), where g(x, y) = x + y and f(x, y) = xy.

Solution

Let's solve this step by step for each function of the Bernoulli random variables $$X_1$$ and $$X_2$$.

Given Information:

  • $$X_1 \sim \text{Bern}(p)$$, $$X_2 \sim \text{Bern}(p)$$.
  • $$X_1$$ and $$X_2$$ are independent.
  • We need to derive the PMFs or CDFs for two functions of $$X_1$$ and $$X_2$$:
    • $$g(X_1, X_2) = X_1 + X_2$$
    • $$f(X_1, X_2) = X_1 \cdot X_2$$

Solution:

1. Distribution of $$g(X_1, X_2) = X_1 + X_2$$

Since $$X_1$$ and $$X_2$$ are Bernoulli random variables, each can take values 0 or 1 with probabilities:

  • $$P(X_i = 1) = p$$
  • $$P(X_i = 0) = 1 - p$$
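
Because $$X_1$$ and $$X_2$$ are independent, every joint probability used below factorizes into a product of marginals; written compactly, for $$x_1, x_2 \in \{0, 1\}$$,

$$P(X_1 = x_1, X_2 = x_2) = P(X_1 = x_1) \cdot P(X_2 = x_2) = p^{x_1 + x_2} (1 - p)^{2 - x_1 - x_2}.$$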

The possible values of $$g(X_1, X_2) = X_1 + X_2$$ are:

  1. $$g(X_1, X_2) = 0$$, if $$X_1 = 0$$ and $$X_2 = 0$$.
  2. $$g(X_1, X_2) = 1$$, if either $$X_1 = 1$$ and $$X_2 = 0$$, or $$X_1 = 0$$ and $$X_2 = 1$$.
  3. $$g(X_1, X_2) = 2$$, if $$X_1 = 1$$ and $$X_2 = 1$$.

Let's calculate the probabilities for each case.

  • $$P(g(X_1, X_2) = 0)$$: $$P(X_1 = 0 \text{ and } X_2 = 0) = P(X_1 = 0) \cdot P(X_2 = 0) = (1 - p)(1 - p) = (1 - p)^2$$

  • $$P(g(X_1, X_2) = 1)$$: There are two ways this can happen: $$(X_1 = 1, X_2 = 0)$$ or $$(X_1 = 0, X_2 = 1)$$. $$P(g(X_1, X_2) = 1) = P(X_1 = 1, X_2 = 0) + P(X_1 = 0, X_2 = 1) = p(1 - p) + (1 - p)p = 2p(1 - p)$$

  • $$P(g(X_1, X_2) = 2)$$: $$P(X_1 = 1 \text{ and } X_2 = 1) = P(X_1 = 1) \cdot P(X_2 = 1) = p \cdot p = p^2$$
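
As a quick sanity check, these three probabilities sum to 1:

$$(1 - p)^2 + 2p(1 - p) + p^2 = \big((1 - p) + p\big)^2 = 1,$$

which also reflects the fact that $$g(X_1, X_2) = X_1 + X_2$$ follows the $$\text{Binomial}(2, p)$$ distribution.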

Thus, the PMF of $$g(X_1, X_2)$$ is:

$$P(g(X_1, X_2) = k) = \begin{cases} (1 - p)^2, & k = 0, \\ 2p(1 - p), & k = 1, \\ p^2, & k = 2. \end{cases}$$

2. Distribution of $$f(X_1, X_2) = X_1 \cdot X_2$$

The possible values of $$f(X_1, X_2) = X_1 \cdot X_2$$ are:

  1. $$f(X_1, X_2) = 0$$, if $$X_1 = 0$$ or $$X_2 = 0$$.
  2. $$f(X_1, X_2) = 1$$, if $$X_1 = 1$$ and $$X_2 = 1$$.

Let's calculate the probabilities for each case.

  • $$P(f(X_1, X_2) = 0)$$: This happens if at least one of $$X_1$$ or $$X_2$$ is 0. The complementary event is that both are 1, so: $$P(f(X_1, X_2) = 0) = 1 - P(X_1 = 1 \text{ and } X_2 = 1) = 1 - p^2$$

  • $$P(f(X_1, X_2) = 1)$$: This happens if both $$X_1 = 1$$ and $$X_2 = 1$$. $$P(f(X_1, X_2) = 1) = P(X_1 = 1) \cdot P(X_2 = 1) = p^2$$

Thus, the PMF of $$f(X_1, X_2)$$ is:

$$P(f(X_1, X_2) = k) = \begin{cases} 1 - p^2, & k = 0, \\ p^2, & k = 1. \end{cases}$$

Summary

1. For $$g(X_1, X_2) = X_1 + X_2$$, the PMF is:

$$P(g(X_1, X_2) = k) = \begin{cases} (1 - p)^2, & k = 0, \\ 2p(1 - p), & k = 1, \\ p^2, & k = 2. \end{cases}$$

2. For $$f(X_1, X_2) = X_1 \cdot X_2$$, the PMF is:

$$P(f(X_1, X_2) = k) = \begin{cases} 1 - p^2, & k = 0, \\ p^2, & k = 1. \end{cases}$$
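
Note that $$f(X_1, X_2) = X_1 \cdot X_2$$ only takes the values 0 and 1, so it is itself a Bernoulli random variable with parameter $$p^2$$.

As an optional numerical check, here is a minimal Python sketch that enumerates the four joint outcomes of $$(X_1, X_2)$$ and confirms both closed-form PMFs; the particular value of $$p$$ is an arbitrary illustrative choice:

```python
from itertools import product
from math import isclose

p = 0.3  # arbitrary value in (0, 1), used only for this check
prob = {0: 1 - p, 1: p}  # marginal PMF of each X_i ~ Bern(p)

pmf_g = {0: 0.0, 1: 0.0, 2: 0.0}  # g(x1, x2) = x1 + x2
pmf_f = {0: 0.0, 1: 0.0}          # f(x1, x2) = x1 * x2

for x1, x2 in product((0, 1), repeat=2):
    joint = prob[x1] * prob[x2]  # independence: P(X1 = x1) * P(X2 = x2)
    pmf_g[x1 + x2] += joint
    pmf_f[x1 * x2] += joint

# Compare against the closed-form PMFs derived above.
assert isclose(pmf_g[0], (1 - p) ** 2)
assert isclose(pmf_g[1], 2 * p * (1 - p))
assert isclose(pmf_g[2], p ** 2)
assert isclose(pmf_f[0], 1 - p ** 2)
assert isclose(pmf_f[1], p ** 2)
print("PMF of g:", pmf_g)
print("PMF of f:", pmf_f)
```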

Math Problem Analysis

Mathematical Concepts

Probability Theory
Bernoulli Distribution
PMF (Probability Mass Function)

Formulas

P(X_i = 1) = p, P(X_i = 0) = 1 - p
P(g(X_1, X_2) = 0) = (1 - p)^2
P(g(X_1, X_2) = 1) = 2p(1 - p)
P(g(X_1, X_2) = 2) = p^2
P(f(X_1, X_2) = 0) = 1 - p^2
P(f(X_1, X_2) = 1) = p^2

Theorems

Independence of random variables
Bernoulli distribution properties

Suitable Grade Level

University level