Sums of Squares and Cross Products Matrix
This lesson introduces the sums of squares and cross products matrix (also known as the SSCP matrix). We show how to use matrix methods to compute the SSCP matrix, using both raw scores and deviation scores.
Sum of Squares: Vectors
In statistics, many formulas require the calculation of sums of squares; i.e., squaring all of the elements in a set and then taking the sum of those squares.
Using matrix algebra, the sum of squares for all the elements of a vector is calculated according to the following formula:
Σ x_{i}^{2} = x'x
where

x is an n x 1 column vector of scores: x_{1}, x_{2}, . . . , x_{n}
Σ x_{i}^{2} is the sum of the squared values from vector x
To illustrate, let's find the sum of squares for the elements of vector x, where x' = [ 1 2 3 ].
Σ x_{i}^{2} = x'x = ( 1 * 1 ) + ( 2 * 2 ) + ( 3 * 3 )
Σ x_{i}^{2} = 1 + 4 + 9 = 14
Thus, the sum of the squared elements from vector x is 14.
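Since x'x is just the dot product of a vector with itself, the calculation is easy to check in code. Here is a minimal sketch in plain Python using the vector above:

```python
# Sum of squares of a vector computed as x'x (a dot product with itself).
x = [1, 2, 3]

# x'x: multiply each element by itself, then sum.
sum_of_squares = sum(xi * xi for xi in x)

print(sum_of_squares)  # 14
```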
Sums of Squares and Cross Products: Matrices
With matrices, we can compute not only sums of squares but also sums of cross products. For an r x c matrix X, an individual cross product is X_{ij}X_{ik}: the product of the elements in row i of columns j and k. The sum of cross products between all the elements of columns j and k is represented by Σ X_{ij}X_{ik}, summed over all r rows. A matrix of sums of squares and sums of cross products is represented by X' X, as shown below.
X' X =

| Σ X_{1}^{2}    Σ X_{1}X_{2}   . . .   Σ X_{1}X_{c} |
| Σ X_{2}X_{1}   Σ X_{2}^{2}    . . .   Σ X_{2}X_{c} |
| . . .          . . .          . . .   . . .        |
| Σ X_{c}X_{1}   Σ X_{c}X_{2}   . . .   Σ X_{c}^{2}  |

where

X is an r x c matrix of raw scores: X_{11}, X_{12}, . . . , X_{rc}
X' X is a c x c matrix of sums of squares and sums of cross products
Σ X_{i}^{2} is the sum of the squares of all elements in column i of matrix X
Σ X_{i} X_{j} is the sum of cross products produced by multiplying each element in column i of matrix X by the corresponding element in column j and summing the results
Thus, the diagonal elements of X' X are sums of squares, and the off-diagonal elements are sums of cross products. Note that the cross product matrix X' X is a symmetric matrix.
See problem 1 for an example showing how to create a cross product matrix.
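As an illustration of how X' X is built up entry by entry, the sketch below computes the SSCP matrix for a small 3 x 2 matrix. The data here are made up for this example, not taken from the problems below:

```python
# SSCP matrix X'X for a hypothetical 3 x 2 raw-score matrix X.
X = [[1, 2],
     [3, 4],
     [5, 6]]

r = len(X)      # number of rows
c = len(X[0])   # number of columns

# (X'X)[j][k] = sum over rows i of X[i][j] * X[i][k].
# Diagonal entries (j == k) are sums of squares; off-diagonal
# entries are sums of cross products.
XtX = [[sum(X[i][j] * X[i][k] for i in range(r)) for k in range(c)]
       for j in range(c)]

print(XtX)  # [[35, 44], [44, 56]]
```

Notice that XtX[0][1] equals XtX[1][0], confirming that the SSCP matrix is symmetric.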
Sums of Squares and Deviation Scores
In the previous lesson, we showed how to transform a matrix of raw scores into a matrix of deviation scores. There are advantages to working with deviation scores.
- Computations can be easier.
- Equations are often more comprehensible.
As a result, researchers often transform their raw data into deviation scores before they calculate sums of squares and cross products. This is such a common practice that the term "sums of squares" has two meanings. It can refer to raw score sums of squares or to deviation score sums of squares.
See problem 2 for an example showing how to create a cross product matrix, using deviation scores.
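To see why the two meanings differ, this sketch (with made-up scores) computes both the raw-score and deviation-score sum of squares for a single column of data:

```python
# Raw-score vs deviation-score sum of squares for one column of scores.
# The data are hypothetical, chosen only for illustration.
x = [1, 2, 3, 4, 5]
n = len(x)
mean = sum(x) / n  # 3.0

raw_ss = sum(xi * xi for xi in x)           # raw score: sum of x_i squared
dev_ss = sum((xi - mean) ** 2 for xi in x)  # deviation score: squares of (x_i - mean)

print(raw_ss)  # 55
print(dev_ss)  # 10.0
```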
Test Your Understanding
Problem 1
Using matrix A below, find the cross products matrix defined by matrix A' A.
A = 

Solution
A' A = 


A' A = 

Problem 2
Matrix A, shown below, is a matrix of raw scores. Find the deviation score sums of squares matrix produced from matrix A; that is, find matrix a' a.
A = 

Solution
First, we transform raw score matrix A into deviation score matrix a, as shown below. Previously, we described how to transform raw scores to deviation scores. We repeat the transformation formula below, and then we make the transformation.
a = A - 11'A ( 1/r )
where

1 is a 5 x 1 column vector of ones
a is a 5 x 3 matrix of deviation scores: a_{11}, a_{12}, . . . , a_{53}
A is a 5 x 3 matrix of raw scores: A_{11}, A_{12}, . . . , A_{53}
r is the number of rows in matrix A
a = A - 11'A ( 1/5 )
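The term 11'A ( 1/r ) is just a matrix in which every raw score is replaced by its column mean, so the transformation amounts to subtracting each column's mean from its scores. Here is a sketch with hypothetical 5 x 3 data (not the matrix from this problem):

```python
# Deviation-score transformation a = A - 11'A(1/r), using a hypothetical
# 5 x 3 raw-score matrix A.
A = [[4, 2, 1],
     [2, 4, 3],
     [6, 3, 2],
     [8, 5, 4],
     [5, 1, 5]]

r = len(A)      # number of rows
c = len(A[0])   # number of columns

# 11'A(1/r) replaces every entry with its column mean.
col_means = [sum(row[j] for row in A) / r for j in range(c)]

# Subtract the column mean from each raw score.
a = [[A[i][j] - col_means[j] for j in range(c)] for i in range(r)]

# Each column of deviation scores sums to zero.
print([sum(a[i][j] for i in range(r)) for j in range(c)])  # [0.0, 0.0, 0.0]
```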
Then, to find the deviation score sums of squares matrix, we simply compute a'a, as shown below.
a' a = 


a' a = 

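The full Problem 2 recipe, transform to deviation scores and then multiply, can be sketched end to end. The data below are hypothetical, standing in for the lesson's matrix:

```python
# End-to-end: raw scores -> deviation scores -> deviation SSCP matrix a'a.
# Hypothetical 5 x 3 raw-score matrix (illustrative data only).
A = [[4, 2, 1],
     [2, 4, 3],
     [6, 3, 2],
     [8, 5, 4],
     [5, 1, 5]]
r, c = len(A), len(A[0])

# Step 1: deviation scores, a = A - 11'A(1/r) (subtract column means).
means = [sum(row[j] for row in A) / r for j in range(c)]
a = [[A[i][j] - means[j] for j in range(c)] for i in range(r)]

# Step 2: SSCP matrix a'a. Diagonal entries are deviation sums of
# squares; off-diagonal entries are deviation sums of cross products.
ata = [[sum(a[i][j] * a[i][k] for i in range(r)) for k in range(c)]
       for j in range(c)]

print(ata)  # [[20.0, 4.0, 4.0], [4.0, 10.0, 0.0], [4.0, 0.0, 10.0]]
```

The result is again symmetric, as every SSCP matrix is.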