Linear Models

As an introduction to nonlinear regression, we start with a review of linear regression analysis. The main characteristic of linear models is that the measured quantity is linearly dependent upon the parameters in the model. Beer's law, mentioned in Chapter 1, is a linear model. If we measure the absorbance A_j at various concentrations C_j of an absorbing chemical species, this model has the form

A_j = b_1 C_j.    (2.1)

The only parameter in this model is b_1, the product of the cell thickness and the molar absorptivity (cf. eq. (1.1)). We shall see later that it is quite important in computer modeling of data to account for background signals. In this case, let us assume that we have a constant background signal b_2 derived from the solution in which the absorber is dissolved. The model for the Beer's law experiment becomes

A_j = b_1 C_j + b_2,    (2.2)
which is an equation for a straight line. In the interest of generality, we will convert A in eq. (2.2) to y, called the measured or dependent variable, and convert C to x, which is the experimentally controlled or independent variable. Including the constant background, eq. (2.1) can be written in the well-known form of a straight line:

y_j = b_1 x_j + b_2.    (2.3)
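As a brief illustration, a model of the form of eq. (2.3) can be fit by ordinary linear least squares. The following minimal sketch in Python with NumPy uses simulated absorbance-versus-concentration data; the concentration range, the "true" values of b_1 and b_2, and the noise level are assumptions chosen only for the example.

    import numpy as np

    # Hypothetical Beer's-law data: concentrations C and absorbances A,
    # simulated from A = b1*C + b2 with assumed b1 = 1200 and b2 = 0.05,
    # plus a small amount of random noise.
    rng = np.random.default_rng(0)
    C = np.linspace(1e-5, 1e-4, 10)                         # independent variable x
    A = 1200.0 * C + 0.05 + rng.normal(0.0, 0.002, C.size)  # dependent variable y

    # Ordinary linear least-squares fit of y = b1*x + b2 (eq. 2.3).
    # np.polyfit with degree 1 returns the coefficients [slope, intercept].
    b1, b2 = np.polyfit(C, A, 1)
    print(f"b1 (slope)      = {b1:.1f}")
    print(f"b2 (background) = {b2:.4f}")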

Equation (2.3) is a linear model because the dependent variable y_j is a linear function of the model parameters b_1 and b_2, not because the equation describes a straight line. The model need not be a straight line to be linear. Some other linear models are listed in Table 2.1.

Table 2.1  Examples of Linear Models

Linear model                                   Equation number
y_j = b_1 x_j^3 + b_2 x_j^2 + b_3 x_j + b_4    (2.4)
y_j = b_1 log x_j + b_2                        (2.5)
y_j = b_1 exp(x_j^2) + b_2 x_j + b_3           (2.6)
y_j = b_1 / x_j^(1/2) + b_2                    (2.7)

Note that none of the linear models in eqs. (2.4) to (2.7) describes a straight line. The first is a polynomial in x, the second depends logarithmically on x, the third depends exponentially on x^2, and the last one depends on 1/x^(1/2). However, in all of these equations, the dependent variable y_j depends linearly on the k parameters b_1, ..., b_k. These parameters appear raised to the first power in all of eqs. (2.3) to (2.7).
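Because each model in Table 2.1 is linear in its parameters, it can be fit with the same linear least-squares machinery as a straight line; only the columns of the design matrix change. The sketch below illustrates this for eq. (2.5), y_j = b_1 log x_j + b_2, using simulated data; the parameter values and noise level are assumptions made for the illustration.

    import numpy as np

    # Simulated data for y = b1*log(x) + b2 (eq. 2.5), with assumed
    # "true" values b1 = 2.5 and b2 = 0.8, plus random noise.
    rng = np.random.default_rng(1)
    x = np.linspace(1.0, 50.0, 25)
    y = 2.5 * np.log(x) + 0.8 + rng.normal(0.0, 0.05, x.size)

    # Design matrix: one column per parameter. For y = b1*log(x) + b2
    # the columns are log(x) and a constant column of ones.
    X = np.column_stack([np.log(x), np.ones_like(x)])

    # Solve the linear least-squares problem X @ b ~= y for b = [b1, b2].
    b, *_ = np.linalg.lstsq(X, y, rcond=None)
    print("estimated b1, b2:", b)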

Thus, we can have models that are not straight lines but are still considered linear for regression analyses. The preceding models have only one independent variable, x; they are called single-equation models. Nowhere in these linear models do we see a term such as b_1^2, exp(b_1 x), or log(b_1 x). Such a term would make the model nonlinear, because y_j would then depend on one of the parameters in a nonlinear fashion. We can also have linear models containing more than one independent variable.
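A model with two independent variables, say y_j = b_1 x1_j + b_2 x2_j + b_3, is still linear in the parameters and is handled the same way: each independent variable simply contributes another column to the design matrix. A minimal sketch, again with invented data and assumed parameter values:

    import numpy as np

    # Invented data for a linear model with two independent variables:
    # y = b1*x1 + b2*x2 + b3, with assumed b1 = 1.5, b2 = -0.7, b3 = 3.0.
    rng = np.random.default_rng(2)
    x1 = rng.uniform(0.0, 10.0, 30)
    x2 = rng.uniform(0.0, 5.0, 30)
    y = 1.5 * x1 - 0.7 * x2 + 3.0 + rng.normal(0.0, 0.1, 30)

    # Each independent variable adds a column to the design matrix;
    # the constant column of ones carries the intercept b3.
    X = np.column_stack([x1, x2, np.ones_like(x1)])
    b, *_ = np.linalg.lstsq(X, y, rcond=None)
    print("estimated b1, b2, b3:", b)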
