# least squares

(redirected from Sum of squared error)

## least squares

pl.n. Statistics
A method of determining the curve that best describes the relationship between expected and observed sets of data by minimizing the sum of the squares of the deviations between observed and expected values.

## least squares

n
(Mathematics) a method for determining the best value of an unknown quantity relating one or more sets of observations or measurements, esp. to find a curve that best fits a set of data. It states that the sum of the squares of the deviations of the experimentally determined values from their optimum values should be a minimum.

## least′ squares′

n.
a statistical method of estimating values from a set of observations by minimizing the sum of the squares of the differences between the observations and the values to be found.
Also called least′-squares′ meth′od.
[1860–65]
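The definitions above describe least squares as minimizing the sum of squared deviations between observed and fitted values. A minimal sketch of that idea for a straight-line fit follows; the function names and sample data are illustrative, not from the source.

```python
# A minimal sketch: fit a line y = a + b*x by minimizing the sum of
# squared deviations between observed and fitted values (closed-form
# simple linear regression).

def least_squares_line(xs, ys):
    """Return (a, b) minimizing sum((y - (a + b*x))**2)."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    # Slope is covariance(x, y) divided by variance(x).
    b = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
         / sum((x - mean_x) ** 2 for x in xs))
    a = mean_y - b * mean_x
    return a, b

def sum_squared_error(xs, ys, a, b):
    """The quantity being minimized: the sum of squared deviations."""
    return sum((y - (a + b * x)) ** 2 for x, y in zip(xs, ys))

# Perfectly linear data: the fit recovers the line with zero squared error.
xs = [0.0, 1.0, 2.0, 3.0]
ys = [1.0, 3.0, 5.0, 7.0]
a, b = least_squares_line(xs, ys)
print(a, b)                               # 1.0 2.0
print(sum_squared_error(xs, ys, a, b))    # 0.0
```

Any other line through these points would give a strictly larger sum of squared deviations, which is what makes this fit "best" in the least-squares sense.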
## Thesaurus

Noun 1. **least squares** - a method of fitting a curve to data points so as to minimize the sum of the squares of the distances of the points from the curve. Also: method of least squares.
- *statistics* - a branch of applied mathematics concerned with the collection and interpretation of quantitative data and the use of probability theory to estimate population parameters
- *statistical method, statistical procedure* - a method of analyzing or representing statistical data; a procedure for calculating a statistic
## References in periodicals archive
Sum of squared error is a common measure to evaluate k-means.
Often used when training MLP networks is the sum of squared errors, which takes the following form:
The sum of squared errors on the training data set has been minimized by finding a vector of connection weights that is called network learning.
We have used Solver to determine coefficients that minimize the sum of squared error term between the original data and the modeled approximation.
In the space L(W, P), the dependent variable is Y: P → R and the independent variable is X: W → Rⁿ; the sum of squared errors is defined as the objective function, that is, the photovoltaic power plant's output power is the response variable and solar irradiance the prediction variable.
Therefore, the already known sum of squared errors (equation (5)) is augmented by a term λ_reg·θ² to give the regularized sum of squared errors, which will be minimized.
For example, squaring errors of 1 and 4 would produce a sum of squared errors of 17 (1² + 4²), while squaring errors of 2 and 3 (which have the same simple sum: 5) would produce a sum of squared errors of only 13 (2² + 3²): being off by 4 half of the time is worse than always being off by 2 or 3.
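The arithmetic in the excerpt above can be checked directly (the variable names are illustrative):

```python
# Two error sets with the same simple sum (5) but different sums of
# squared errors: squaring penalizes one large error more than two
# moderate ones.
def sse(errors):
    """Sum of squared errors."""
    return sum(e ** 2 for e in errors)

print(sse([1, 4]))  # 17
print(sse([2, 3]))  # 13
```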
Estimates of β = (X′X)⁻¹X′y minimize the sum of squared errors. For a thorough discussion of OLS regression, see Jeffrey M.
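The OLS estimate β = (X′X)⁻¹X′y quoted above can be sketched for the two-parameter case (intercept and slope), where the design matrix is X = [1, x] and the 2×2 inverse can be written out by hand. The function name and sample data are illustrative assumptions.

```python
# A sketch of the normal equations (X'X) beta = X'y for a design matrix
# X whose columns are a constant 1 and the regressor x.

def ols_normal_equations(xs, ys):
    """Return (intercept, slope) solving beta = (X'X)^{-1} X'y."""
    n = len(xs)
    sx = sum(xs)
    sxx = sum(x * x for x in xs)
    sy = sum(ys)
    sxy = sum(x * y for x, y in zip(xs, ys))
    # X'X = [[n, sx], [sx, sxx]],  X'y = [sy, sxy]
    det = n * sxx - sx * sx                      # determinant of X'X
    intercept = (sxx * sy - sx * sxy) / det
    slope = (n * sxy - sx * sy) / det
    return intercept, slope

print(ols_normal_equations([0.0, 1.0, 2.0, 3.0], [1.0, 3.0, 5.0, 7.0]))
# (1.0, 2.0)
```

For more than one regressor the same formula applies with a wider X, and in practice the system is solved with a numerical linear-algebra routine rather than an explicit inverse.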
