When the data contain substantial error, polynomial interpolation is inappropriate and can give unsatisfactory results when used to predict intermediate values; experimental data are often of this kind. A more appropriate strategy in such cases is to derive an approximating function that captures the general trend or "proper" behaviour of the data without necessarily coinciding with every individual point (Johnson, 2009). A straight line can be used to characterize the trend of the data without passing through any particular point. One way to determine such a line is simply to sketch the "best" line through the points by eye; but unless the points define a perfect straight line (in which case interpolation would be appropriate), this visual approach is subjective (Johnson, 2009). The way to remove this subjectivity is to adopt a criterion that quantifies the adequacy of the fit. One approach is to derive a curve that minimizes the discrepancy between the data and the curve, and the method that accomplishes this goal is called least-squares regression (Klugh, 2006).
Linear Regression
The simplest example of a least-squares approximation is fitting a straight line to a set of paired observations (x1, y1), (x2, y2), ..., (xn, yn). The mathematical expression for a straight line is:
y = a + bx + e
where a and b are coefficients representing the intercept (the intersection with the y-axis) and the slope, respectively, and e is the error, or residual, between the model and the observations (Klugh, 2006). One criterion for a "best" fit is the strategy of choosing the line through the points that minimizes the sum of the residual errors (Klugh, 2006).
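To make this concrete, the following is a minimal sketch in Python; the data values, the helper name fit_line, and the use of NumPy are illustrative assumptions rather than part of the source. It fits y = a + bx using the standard closed-form least-squares formulas for the slope and intercept, which most treatments derive from the normal equations.

```python
import numpy as np

def fit_line(x, y):
    """Fit y = a + b*x by least squares, using the standard
    closed-form normal-equation solution for slope and intercept."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    n = len(x)
    # Slope: b = (n*sum(xy) - sum(x)*sum(y)) / (n*sum(x^2) - sum(x)^2)
    b = (n * np.sum(x * y) - np.sum(x) * np.sum(y)) / (n * np.sum(x**2) - np.sum(x)**2)
    # Intercept: a = mean(y) - b * mean(x)
    a = np.mean(y) - b * np.mean(x)
    return a, b

# Hypothetical noisy observations scattered around the line y = 1 + 2x
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([1.1, 2.9, 5.2, 6.8, 9.1])
a, b = fit_line(x, y)
residuals = y - (a + b * x)          # e_i = y_i - (a + b*x_i)
print(f"intercept a = {a:.3f}, slope b = {b:.3f}")
print(f"sum of squared residuals = {np.sum(residuals**2):.4f}")
```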
Another approach involves minimizing the sum of the absolute values of the differences. A third strategy for fitting an optimal line is the minimax criterion. In this method, the line is chosen so as to minimize the maximum distance that any individual point falls from the line.
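The sketch below illustrates how these candidate criteria differ; it reuses the hypothetical data from above, and the function name criteria is an assumption for illustration. It evaluates the signed residual sum, the sum of absolute residuals, the minimax (largest absolute) residual, and the least-squares sum for two candidate lines. Note that a plainly poor horizontal line can still drive the signed sum of residuals to zero, because positive and negative errors cancel, which is why that first criterion is inadequate on its own.

```python
import numpy as np

def criteria(x, y, a, b):
    """Evaluate candidate 'best fit' criteria for the line y = a + b*x:
    signed residual sum, sum of absolute residuals, the minimax
    (largest absolute) residual, and the least-squares sum."""
    e = y - (a + b * x)
    return {
        "sum of residuals": round(float(np.sum(e)), 4),
        "sum of |residuals|": round(float(np.sum(np.abs(e))), 4),
        "max |residual| (minimax)": round(float(np.max(np.abs(e))), 4),
        "sum of squared residuals": round(float(np.sum(e**2)), 4),
    }

x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([1.1, 2.9, 5.2, 6.8, 9.1])

# The flat line y = 5.02 (the mean of y) is clearly a poor fit, yet its
# signed residual sum is zero: positive and negative errors cancel.
for a, b in [(1.0, 2.0), (5.02, 0.0)]:
    print(f"line y = {a} + {b}x -> {criteria(x, y, a, b)}")
```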