Statistics Dictionary

In a cause and effect relationship, the independent variable is the cause, and the dependent variable is the effect. Least squares linear regression is a method for predicting the value of a dependent variable Y, based on the value of an independent variable X.

Linear regression finds the straight line, called the least squares regression line or LSRL, that best represents observations in a bivariate data set. Suppose Y is a dependent variable, and X is an independent variable. Then, the equation for the regression line would be:

ŷ = b0 + b1x

where b0 is a constant, b1 is the regression coefficient, x is the value of the independent variable, and ŷ is the predicted value of the dependent variable.

Normally, you will use a computational tool - a software package (e.g., Excel) or a graphing calculator - to find b0 and b1. You enter the X and Y values into your program or calculator, and the tool solves for each parameter.

In the unlikely event that you find yourself on a desert island without a computer or a graphing calculator, you can solve for b0 and b1 "by hand". Here are the equations.

b1 = Σ [ (xi - x̄)(yi - ȳ) ] / Σ [ (xi - x̄)² ]

b0 = ȳ - b1 * x̄

where b0 is the constant in the regression equation, b1 is the regression coefficient, xi is the X value of observation i, yi is the Y value of observation i, and x̄ and ȳ are the means of X and Y, respectively.
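The two formulas translate directly into code. Here is a minimal "by hand" sketch in plain Python (the data values are made up for illustration):

```python
# Sample bivariate data (illustrative values only).
x = [1.0, 2.0, 3.0, 4.0]
y = [2.0, 4.0, 6.0, 8.0]

n = len(x)
x_bar = sum(x) / n  # mean of X
y_bar = sum(y) / n  # mean of Y

# b1 = Σ [ (xi - x̄)(yi - ȳ) ] / Σ [ (xi - x̄)² ]
b1 = sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(x, y)) / \
     sum((xi - x_bar) ** 2 for xi in x)

# b0 = ȳ - b1 * x̄
b0 = y_bar - b1 * x_bar
```

For this perfectly linear data set, the computation yields a slope of 2 and an intercept of 0, i.e., the line ŷ = 2x.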

See also:   AP Statistics Tutorial: Least Squares Linear Regression