Stat Trek

Teach yourself statistics

Independent Random Variables

When a study involves pairs of random variables, it is often useful to know whether or not the random variables are independent. This lesson explains how to assess the independence of random variables.

Independence of Random Variables

If two random variables, X and Y, are independent, they satisfy the following conditions.

  • P(x|y) = P(x), for all values of X and Y.
  • P(x ∩ y) = P(x) * P(y), for all values of X and Y.

The above conditions are equivalent. If either one is met, the other condition is also met; and X and Y are independent. If either condition is not met, X and Y are dependent.

Note: If X and Y are independent, then the correlation between X and Y is equal to zero. (The converse does not hold, however; zero correlation does not guarantee independence.)
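
To make the product condition concrete, here is a minimal sketch in Python (not part of the original lesson; the function name is_independent and the tolerance value are assumptions). It stores a joint distribution as a dictionary keyed by (x, y) pairs, computes the marginal distributions, and checks whether P(x ∩ y) = P(x) * P(y) holds for every pair.

def is_independent(joint, tol=1e-9):
    """joint maps (x, y) pairs to P(X = x and Y = y)."""
    # Marginal distributions, found by summing over the other variable.
    p_x, p_y = {}, {}
    for (x, y), p in joint.items():
        p_x[x] = p_x.get(x, 0.0) + p
        p_y[y] = p_y.get(y, 0.0) + p
    # Independent if P(x and y) = P(x) * P(y) for every (x, y) pair.
    return all(abs(p - p_x[x] * p_y[y]) < tol
               for (x, y), p in joint.items())

# Hypothetical example: two fair coin flips, which are independent.
coins = {(c1, c2): 0.25 for c1 in (0, 1) for c2 in (0, 1)}
print(is_independent(coins))   # True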

Joint Probability Distributions

The table below shows the joint probability distribution between two discrete random variables - X and Y.

          X = 0     X = 1     X = 2
Y = 3      0.1       0.2       0.2
Y = 4      0.1       0.2       0.2

In a joint probability distribution table, numbers in the cells of the table represent the probability that particular values of X and Y occur together. From this table, you can see that the probability that X=0 and Y=3 is 0.1; the probability that X=1 and Y=3 is 0.2; and so on.
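
As an illustration (this code is not part of the original lesson), the joint distribution above can be stored as a small NumPy array, with one row per value of Y and one column per value of X. Joint probabilities are read directly from the cells, and marginal probabilities are found by summing across a row or a column.

import numpy as np

x_values = [0, 1, 2]
y_values = [3, 4]
joint = np.array([[0.1, 0.2, 0.2],    # row for Y = 3
                  [0.1, 0.2, 0.2]])   # row for Y = 4

# Probability that X = 0 and Y = 3 occur together.
print(joint[y_values.index(3), x_values.index(0)])   # 0.1

# Marginal distributions: sum over rows for P(X), over columns for P(Y).
p_x = joint.sum(axis=0)   # [0.2, 0.4, 0.4]
p_y = joint.sum(axis=1)   # [0.5, 0.5]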

You can use tables like this to figure out whether two discrete random variables are independent or dependent. Problem 1 below shows how.

Test Your Understanding

Problem 1

The table below shows the joint probability distribution between two random variables - X and Y.

          X = 0     X = 1     X = 2
Y = 3      0.1       0.2       0.2
Y = 4      0.1       0.2       0.2

And the next table shows the joint probability distribution between two random variables - A and B.

          A = 0     A = 1     A = 2
B = 3      0.1       0.2       0.2
B = 4      0.2       0.2       0.1

Which of the following statements are true?

I. X and Y are independent random variables.
II. A and B are independent random variables.

(A) I only
(B) II only
(C) I and II
(D) Neither statement is true.
(E) It is not possible to answer this question, based on the information given.

Solution

The correct answer is A. The solution requires several computations to test the independence of random variables. Those computations are shown below.

X and Y are independent if P(x|y) = P(x), for all values of X and Y. Each conditional probability is found by dividing a cell entry by its row total; for example, P(x=0 | y=3) = P(x=0 and y=3) / P(y=3) = 0.1 / 0.5 = 0.2. From the probability distribution table, we know the following:

P(x=0) = 0.2;    P(x=0 | y=3) = 0.2;    P(x=0 | y=4) = 0.2
P(x=1) = 0.4;    P(x=1 | y=3) = 0.4;    P(x=1 | y=4) = 0.4
P(x=2) = 0.4;    P(x=2 | y=3) = 0.4;    P(x=2 | y=4) = 0.4

Thus, P(x|y) = P(x), for all values of X and Y, which means that X and Y are independent. We repeat the same analysis to test the independence of A and B.

P(a=0) = 0.3;    P(a=0 | b=3) = 0.2;    P(a=0 | b=4) = 0.4
P(a=1) = 0.4;    P(a=1 | b=3) = 0.4;    P(a=1 | b=4) = 0.4
P(a=2) = 0.3;    P(a=2 | b=3) = 0.4;    P(a=2 | b=4) = 0.2

Thus, P(a|b) does not equal P(a) for every combination of A and B. For example, P(a=0) = 0.3, but P(a=0 | b=3) = 0.2. This means that A and B are not independent.
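
As a quick check, the sketch below (an illustration, not part of the original solution) reproduces these computations with NumPy. Dividing each row of a joint table by its row total gives the conditional probabilities P(x | y) (or P(a | b)), which are then compared with the marginal probabilities.

import numpy as np

xy = np.array([[0.1, 0.2, 0.2],    # row for Y = 3
               [0.1, 0.2, 0.2]])   # row for Y = 4
ab = np.array([[0.1, 0.2, 0.2],    # row for B = 3
               [0.2, 0.2, 0.1]])   # row for B = 4

def independent(joint):
    marginal = joint.sum(axis=0)                    # e.g., P(x)
    row_totals = joint.sum(axis=1, keepdims=True)   # e.g., P(y)
    conditional = joint / row_totals                # e.g., P(x | y), row by row
    # Independent only if every row of conditionals matches the marginal.
    return np.allclose(conditional, marginal)

print(independent(xy))   # True  -> X and Y are independent
print(independent(ab))   # False -> A and B are dependent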