A.2 Matrix Representation of Linear Least Squares

It is usually more convenient to base programs for nonlinear regression on matrix algebra. This is the approach taken in higher-level mathematical programming software such as Matlab and Mathcad (sources for the Matlab and Mathcad software are listed at the end of this chapter). The principles are exactly the same as in the algebraic approach discussed above, but matrix methods facilitate organization and manipulation of the data.

In matrix notation, the straight-line model can be expressed as [3, 5]

Y = Xb + e    (2.17)

where Y is a vector containing the n values of y_i(meas), X is an n × 2 sample matrix, e is a vector containing the observed residuals, and b is the vector containing values of the slope and intercept. For an example with n = 3, eq. (2.17) can be represented as in Box 2.1.

Y = | y_1(meas) |    X = | x_1  1 |    b = | b_1 |    e = | e_1 |
    | y_2(meas) |        | x_2  1 |        | b_2 |        | e_2 |
    | y_3(meas) |        | x_3  1 |                       | e_3 |

Box 2.1 Elements of vectors and matrices in a straight-line model.
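The construction of Y and X in Box 2.1 can be sketched in Python with NumPy (a minimal illustration, not from the source; the data values are hypothetical):

```python
import numpy as np

# Hypothetical 3-point dataset (x_i, y_i(meas)); values are illustrative only.
x = np.array([1.0, 2.0, 3.0])
y = np.array([2.1, 3.9, 6.2])

# Design matrix X for the straight-line model y = b1*x + b2:
# first column holds x_i, second column holds 1 (for the intercept b2).
X = np.column_stack([x, np.ones_like(x)])

# Y is the column vector of measured y values.
Y = y.reshape(-1, 1)

print(X.shape)  # (3, 2): an n x 2 matrix with n = 3
```

With n = 3 data points, X has shape (3, 2), matching the n × 2 sample matrix of eq. (2.17).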

Using the same assumptions as in the algebraic approach to obtaining the parameters, we have variances in y_i(meas) that are all equal, and no errors in x_i. In the matrix notation, the problem involves finding the parameter vector b, whose elements are the slope b_1 and the intercept b_2. Application of the least squares condition leads to estimation of b from [4, 5]

b = [X'X]^-1 X'Y    (2.18)

where X' is the transpose of X, and [X'X]^-1 is the inverse of the matrix product X'X. Efficient methods of computing b are discussed by Bates and Watts [5]. Also, Bevington's book [3] provides a good discussion of the matrix approach starting from an algebraic viewpoint.
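The estimate b = [X'X]^-1 X'Y can be sketched in Python with NumPy (illustrative data, not from the source). Forming the explicit inverse mirrors the formula as written; in practice a dedicated least-squares solver is numerically preferable, in the spirit of the efficient methods mentioned above:

```python
import numpy as np

# Hypothetical dataset; values are illustrative only.
x = np.array([1.0, 2.0, 3.0, 4.0])
y = np.array([2.1, 3.9, 6.2, 8.0])

# n x 2 design matrix for the straight-line model.
X = np.column_stack([x, np.ones_like(x)])

# Direct transcription of b = [X'X]^-1 X'Y (explicit inverse, for clarity).
b_normal = np.linalg.inv(X.T @ X) @ X.T @ y

# Numerically stabler alternative: an orthogonalization-based solver.
b_lstsq, *_ = np.linalg.lstsq(X, y, rcond=None)

print(b_normal)  # [slope b_1, intercept b_2]
```

Both routes give the same slope and intercept here; the explicit inverse can lose accuracy when X'X is nearly singular, which is why library solvers avoid it.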
