# Statistics Dictionary


### Sums of Squares

A sum of squares is the sum of squared deviations from a mean score. For example, a one-way analysis of variance makes use of three sums of squares. In the formulas below, $k$ is the number of groups, $n_j$ is the number of scores in group $j$, $X_{ij}$ is the $i$th score in group $j$, $\bar{X}_j$ is the mean of group $j$, and $\bar{X}$ is the grand mean of all scores.

* Between-groups sum of squares. The between-groups sum of squares (SSB) measures variation of group means around the grand mean. It can be computed from the following formula:

  $$SSB = \sum_{j=1}^{k} \sum_{i=1}^{n_j} \left( \bar{X}_j - \bar{X} \right)^2 = \sum_{j=1}^{k} n_j \left( \bar{X}_j - \bar{X} \right)^2$$
* Within-groups sum of squares. The within-groups sum of squares (SSW) measures variation of all scores around their respective group means. It can be computed from the following formula:

  $$SSW = \sum_{j=1}^{k} \sum_{i=1}^{n_j} \left( X_{ij} - \bar{X}_j \right)^2$$
* Total sum of squares. The total sum of squares (SST) measures variation of all scores around the grand mean. It can be computed from the following formula:

  $$SST = \sum_{j=1}^{k} \sum_{i=1}^{n_j} \left( X_{ij} - \bar{X} \right)^2$$

It turns out that the total sum of squares is equal to the between-groups sum of squares plus the within-groups sum of squares, as shown below:

$$SST = SSB + SSW$$
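The three formulas, and the partition of the total sum of squares, can be checked numerically. Here is a minimal sketch in plain Python; the group data are made up for illustration:

```python
# Hypothetical data: three groups of scores (values are illustrative only).
groups = [
    [4.0, 5.0, 6.0],       # group 1
    [7.0, 8.0, 9.0],       # group 2
    [1.0, 2.0, 3.0, 4.0],  # group 3
]

# Grand mean: mean of all scores pooled together.
all_scores = [x for g in groups for x in g]
grand_mean = sum(all_scores) / len(all_scores)

# Mean of each group.
group_means = [sum(g) / len(g) for g in groups]

# Between-groups sum of squares: n_j * (group mean - grand mean)^2.
ssb = sum(len(g) * (m - grand_mean) ** 2 for g, m in zip(groups, group_means))

# Within-groups sum of squares: each score's deviation from its own group mean.
ssw = sum((x - m) ** 2 for g, m in zip(groups, group_means) for x in g)

# Total sum of squares: each score's deviation from the grand mean.
sst = sum((x - grand_mean) ** 2 for x in all_scores)

print(ssb, ssw, sst)

# The partition SST = SSB + SSW holds up to floating-point rounding.
assert abs(sst - (ssb + ssw)) < 1e-9
```

For these data the grand mean is 4.9, and the identity holds regardless of group sizes, which is why the between-groups formula weights each squared deviation by $n_j$.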