# Combinations of Random Variables

Sometimes, it is necessary to add or subtract random variables. When this occurs, it is useful to know the mean and variance of the result.

Recommendation: Read the sample problems at the end of the lesson. This lesson introduces some important equations, and the sample problems show how to apply those equations.

## Sums and Differences of Random Variables: Effect on the Mean

Suppose you have two variables: X with mean μx and Y with mean μy. Then the mean of the sum of these variables, μx+y, and the mean of the difference between these variables, μx-y, are given by the following equations.

μx+y = μx + μy       and       μx-y = μx - μy

The above equations hold for any pair of variables, whether or not they are independent. Restated in expected-value notation: if X and Y are random variables, then

E(X + Y) = E(X) + E(Y)       and       E(X - Y) = E(X) - E(Y)

where E(X) is the expected value (mean) of X, E(Y) is the expected value of Y, E(X + Y) is the expected value of X plus Y, and E(X - Y) is the expected value of X minus Y.
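A quick numerical sketch may make this concrete. The snippet below (using NumPy, with two hypothetical six-sided-die variables chosen for illustration) checks that the mean of a sum matches the sum of the means:

```python
import numpy as np

# Hypothetical example: two fair six-sided dice, sampled independently
rng = np.random.default_rng(0)
x = rng.integers(1, 7, size=100_000)
y = rng.integers(1, 7, size=100_000)

# E(X + Y) equals E(X) + E(Y); E(X - Y) equals E(X) - E(Y)
print((x + y).mean(), x.mean() + y.mean())  # the two values agree (~7.0)
print((x - y).mean(), x.mean() - y.mean())  # the two values agree (~0.0)
```

Note that this identity requires no independence assumption; expectation is linear for any pair of random variables.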

## Independence of Random Variables

If two random variables, X and Y, are independent, they satisfy the following conditions.

• P(x|y) = P(x), for all values of X and Y.
• P(x ∩ y) = P(x) * P(y), for all values of X and Y.

The above conditions are equivalent: if either one is met, the other is also met, and X and Y are independent. If either condition is not met, X and Y are dependent.

Note: If X and Y are independent, then the correlation between X and Y is equal to zero. (The converse does not hold: zero correlation does not, by itself, imply independence.)
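The product condition is easy to check mechanically when the joint distribution is given as a table. A minimal sketch (assuming, as an illustrative choice, that the joint probabilities are stored in a NumPy array with one row per value of Y and one column per value of X):

```python
import numpy as np

# Hypothetical joint distribution table: rows are values of Y, columns are values of X
joint = np.array([[0.1, 0.2, 0.2],
                  [0.1, 0.2, 0.2]])

p_x = joint.sum(axis=0)  # marginal distribution of X (column sums)
p_y = joint.sum(axis=1)  # marginal distribution of Y (row sums)

# X and Y are independent iff every cell equals the product of its marginals
independent = bool(np.allclose(joint, np.outer(p_y, p_x)))
print(independent)  # True for this table
```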

## Sums and Differences of Independent Random Variables: Effect on Variance

Suppose X and Y are independent random variables. Then the variance of (X + Y) and the variance of (X - Y) are given by the following equation

Var(X + Y) = Var(X - Y) = Var(X) + Var(Y)

where Var(X + Y) is the variance of the sum of X and Y, Var(X - Y) is the variance of the difference between X and Y, Var(X) is the variance of X, and Var(Y) is the variance of Y.

Note: The standard deviation (SD) is always equal to the square root of the variance (Var). Thus,

SD(X + Y) = sqrt[ Var(X + Y) ]       and       SD(X - Y) = sqrt[ Var(X - Y) ]
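As a sanity check, a simulation sketch (my own illustration, with distributions and sample size chosen arbitrarily) shows both the sum and the difference landing near the same variance:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 200_000

x = rng.normal(scale=4.0, size=n)  # Var(X) = 16
y = rng.normal(scale=3.0, size=n)  # Var(Y) = 9

# For independent X and Y, Var(X + Y) = Var(X - Y) = 16 + 9 = 25
print(np.var(x + y))           # ≈ 25
print(np.var(x - y))           # ≈ 25
print(np.sqrt(np.var(x + y)))  # SD(X + Y) ≈ 5
```

The intuition: subtracting an independent variable adds just as much uncertainty as adding one, which is why both variances come out the same.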

## Test Your Understanding of This Lesson

Problem 1

|       | X = 0 | X = 1 | X = 2 |
|-------|-------|-------|-------|
| Y = 3 | 0.1   | 0.2   | 0.2   |
| Y = 4 | 0.1   | 0.2   | 0.2   |

The table above shows the joint probability distribution of two random variables, X and Y. (In a joint probability distribution table, the number in each cell is the probability that the corresponding values of X and Y occur together.)

What is the mean of the sum of X and Y?

(A) 1.2
(B) 3.5
(C) 4.5
(D) 4.7
(E) None of the above.

Solution

The correct answer is D. The solution requires three computations: (1) find the mean (expected value) of X, (2) find the mean (expected value) of Y, and (3) find the sum of the means. Those computations are shown below, beginning with the mean of X.

E(X) = Σ [ xi * P(xi) ]
E(X) = 0 * (0.1 + 0.1) + 1 * (0.2 + 0.2) + 2 * (0.2 + 0.2) = 0 + 0.4 + 0.8 = 1.2

Next, we find the mean of Y.

E(Y) = Σ [ yi * P(yi) ]
E(Y) = 3 * (0.1 + 0.2 + 0.2) + 4 * (0.1 + 0.2 + 0.2) = (3 * 0.5) + (4 * 0.5) = 1.5 + 2 = 3.5

And finally, the mean of the sum of X and Y is equal to the sum of the means. Therefore,

E(X + Y) = E(X) + E(Y) = 1.2 + 3.5 = 4.7

Note: A similar approach is used to find differences between means. The difference between X and Y is E(X - Y) = E(X) - E(Y) = 1.2 - 3.5 = -2.3; and the difference between Y and X is E(Y - X) = E(Y) - E(X) = 3.5 - 1.2 = 2.3.
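All of these computations can be verified directly from the joint table. A short sketch (the array layout, rows for Y = 3 and 4 and columns for X = 0, 1, and 2, is an illustrative choice):

```python
import numpy as np

joint = np.array([[0.1, 0.2, 0.2],   # Y = 3
                  [0.1, 0.2, 0.2]])  # Y = 4
x_vals = np.array([0, 1, 2])
y_vals = np.array([3, 4])

# Expected value of each variable from its marginal distribution
e_x = (x_vals * joint.sum(axis=0)).sum()  # E(X) = 1.2
e_y = (y_vals * joint.sum(axis=1)).sum()  # E(Y) = 3.5
print(e_x + e_y)  # ≈ 4.7
print(e_x - e_y)  # ≈ -2.3
```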

Problem 2

The first table below shows the joint probability distribution of two random variables, X and Y; the second table shows the joint probability distribution of two random variables, A and B.

|       | X = 0 | X = 1 | X = 2 |
|-------|-------|-------|-------|
| Y = 3 | 0.1   | 0.2   | 0.2   |
| Y = 4 | 0.1   | 0.2   | 0.2   |

|       | A = 0 | A = 1 | A = 2 |
|-------|-------|-------|-------|
| B = 3 | 0.1   | 0.2   | 0.2   |
| B = 4 | 0.2   | 0.2   | 0.1   |

Which of the following statements are true?

I. X and Y are independent random variables.
II. A and B are independent random variables.

(A) I only
(B) II only
(C) I and II
(D) Neither statement is true.
(E) It is not possible to answer this question, based on the information given.

Solution

The correct answer is A. The solution requires several computations to test the independence of random variables. Those computations are shown below.

X and Y are independent if P(x|y) = P(x), for all values of X and Y. From the probability distribution table, we know the following:

P(x=0) = 0.2;      P(x=0 | y=3) = 0.2;      P(x=0 | y = 4) = 0.2
P(x=1) = 0.4;      P(x=1 | y=3) = 0.4;      P(x=1 | y = 4) = 0.4
P(x=2) = 0.4;      P(x=2 | y=3) = 0.4;      P(x=2 | y = 4) = 0.4

Thus, P(x|y) = P(x), for all values of X and Y, which means that X and Y are independent. We repeat the same analysis to test the independence of A and B.

P(a=0) = 0.3;      P(a=0 | b=3) = 0.2;      P(a=0 | b = 4) = 0.4
P(a=1) = 0.4;      P(a=1 | b=3) = 0.4;      P(a=1 | b = 4) = 0.4
P(a=2) = 0.3;      P(a=2 | b=3) = 0.4;      P(a=2 | b = 4) = 0.2

Thus, P(a|b) is not equal to P(a), for all values of A and B. For example, P(a=0) = 0.3; but P(a=0 | b=3) = 0.2. This means that A and B are not independent.
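Wrapped in a helper, the equivalent product test settles both tables at once (the function name and array layout here are illustrative, not part of the lesson):

```python
import numpy as np

def is_independent(joint):
    """True iff every cell equals the product of its row and column marginals."""
    p_rows = joint.sum(axis=1)
    p_cols = joint.sum(axis=0)
    return bool(np.allclose(joint, np.outer(p_rows, p_cols)))

xy = np.array([[0.1, 0.2, 0.2],   # Y = 3
               [0.1, 0.2, 0.2]])  # Y = 4
ab = np.array([[0.1, 0.2, 0.2],   # B = 3
               [0.2, 0.2, 0.1]])  # B = 4

print(is_independent(xy))  # True:  X and Y are independent
print(is_independent(ab))  # False: A and B are dependent
```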

Problem 3

Suppose X and Y are independent random variables. The variance of X is equal to 16; and the variance of Y is equal to 9. Let Z = X - Y.

What is the standard deviation of Z?

(A) 2.65
(B) 5.00
(C) 7.00
(D) 25.0
(E) It is not possible to answer this question, based on the information given.

Solution

The correct answer is B. The solution requires us to recognize that Z is a combination (here, the difference) of two independent random variables. As such, the variance of Z is equal to the variance of X plus the variance of Y.

Var(Z) = Var(X) + Var(Y) = 16 + 9 = 25

The standard deviation of Z is equal to the square root of the variance. Therefore, the standard deviation is equal to the square root of 25, which is 5.
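In code, the whole problem reduces to two lines of standard-library arithmetic:

```python
import math

var_x, var_y = 16, 9   # given variances of X and Y
var_z = var_x + var_y  # variances add for independent variables, even for Z = X - Y
sd_z = math.sqrt(var_z)
print(sd_z)  # 5.0
```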