15.17 REGRESSION
The REGRESSION procedure fits linear models to data via least-squares
estimation. The procedure is appropriate for data that satisfy the
assumptions typical of linear regression:
- The data set contains n observations of a dependent variable, say
  Y_1, ..., Y_n, and n observations of one or more explanatory
  variables. Let X_{11}, X_{12}, ..., X_{1n} denote the n observations
  of the first explanatory variable; X_{21}, ..., X_{2n} the n
  observations of the second explanatory variable; and so on, with
  X_{k1}, ..., X_{kn} denoting the n observations of the kth
  explanatory variable.
- The dependent variable Y is related to the explanatory variables by
  the equation

     Y_i = b_0 + b_1 X_{1i} + ... + b_k X_{ki} + Z_i

  where b_0, b_1, ..., b_k are unknown coefficients, and Z_1, ..., Z_n
  are independent, normally distributed noise terms with mean zero and
  common variance. The noise terms, also called error terms, are
  unobserved. This relationship is called the linear model.
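
The least-squares estimates are the values of b_0, ..., b_k that
minimize the sum of squared residuals; in the notation above,

   SS(b_0, ..., b_k) = sum_{i=1}^{n} (Y_i - b_0 - b_1 X_{1i} - ... - b_k X_{ki})^2.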
The REGRESSION procedure estimates the coefficients b_0, ..., b_k and
produces output relevant to inference about the linear model.
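
As a brief sketch of usage, a minimal invocation fitting a model with
two explanatory variables might look like the following. The variable
names y, x1, and x2 are hypothetical, chosen only for illustration:

     * Regress y on x1 and x2 and show the default statistics.
     REGRESSION
             /VARIABLES=y x1 x2
             /DEPENDENT=y
             /STATISTICS=DEFAULTS.

Here the variables named in the VARIABLES subcommand but not in the
DEPENDENT subcommand (x1 and x2) serve as the explanatory variables,
and STATISTICS selects which results to display.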