Statistics and Probability Dictionary
A cumulative probability refers to the probability that the value of a
random variable falls within a specified range. Frequently, cumulative
probabilities refer to the probability that a random variable is less than or
equal to a specified value.
Consider a coin flip experiment. If we flip a fair coin two times, we might ask:
What is the probability that the two flips result in one or fewer heads? The
answer is a cumulative probability: the probability that the flips produce
zero heads plus the probability that they produce exactly one head. Thus, the
cumulative probability is:
P(X ≤ 1) = P(X = 0) + P(X = 1) = 0.25 + 0.50 = 0.75
The table below shows both the probabilities and the cumulative probabilities
associated with this experiment.

Number of heads    Probability    Cumulative probability
0                  0.25           0.25
1                  0.50           0.75
2                  0.25           1.00
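The probabilities above can be computed directly from the binomial formula. Here is a minimal Python sketch (the function name binomial_pmf is ours, not part of any particular library) that builds the probability and cumulative-probability columns for two fair coin flips:

```python
from math import comb

def binomial_pmf(k, n, p=0.5):
    """Probability of exactly k heads in n flips of a coin with P(heads) = p."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

n = 2  # two coin flips
pmf = [binomial_pmf(k, n) for k in range(n + 1)]
# Running sums of the probabilities give the cumulative probabilities
cumulative = [sum(pmf[:k + 1]) for k in range(n + 1)]

print(pmf)         # [0.25, 0.5, 0.25]
print(cumulative)  # [0.25, 0.75, 1.0]
```

Note that the last cumulative probability is always 1, since some outcome must occur.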