Least squares adjustment

Least squares adjustment is a model for the solution of an overdetermined system of equations based on the principle of least squares of observation residuals. It is used extensively in the disciplines of surveying, geodesy, and photogrammetry, which together constitute the field of geomatics.

There are three forms of least squares adjustment: parametric, conditional, and combined. In parametric adjustment, one can find an observation equation h(X) = Y relating the observations Y explicitly in terms of the parameters X (leading to the A-model below). In conditional adjustment, there exists a condition equation g(Y) = 0 involving only the observations Y, with no parameters X at all (leading to the B-model below). Finally, in a combined adjustment, both parameters X and observations Y appear implicitly in a mixed-model equation f(X, Y) = 0. Clearly, parametric and conditional adjustments correspond to the more general combined case with f(X, Y) = h(X) - Y and f(X, Y) = g(Y), respectively. The special cases nonetheless admit simpler solutions, as detailed below. In the literature, the observations Y are often denoted L.

The equalities above hold only for the estimated parameters X̂ and adjusted observations Ŷ, so that f(X̂, Ŷ) = 0. In contrast, measured observations Ỹ and approximate parameters X̃ produce a nonzero misclosure w = f(X̃, Ỹ). One can proceed to a Taylor series expansion of the equations, which yields the Jacobians or design matrices: the first one, A = ∂f/∂X, and the second one, B = ∂f/∂Y.
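As a minimal sketch of the parametric (A-model) case, the following assumes a linear observation equation Y = AX with uncorrelated, equal-weight observations; the network (three height-difference observations between two unknown points) and its numbers are hypothetical, chosen only to illustrate the normal-equations solution:

```python
import numpy as np

# Hypothetical leveling-style network: 3 observations, 2 unknown heights,
# so the system Y = A @ X is overdetermined (1 degree of redundancy).
A = np.array([[ 1.0, 0.0],    # obs 1 measures height of point 1
              [ 0.0, 1.0],    # obs 2 measures height of point 2
              [-1.0, 1.0]])   # obs 3 measures the difference (point 2 - point 1)
Y = np.array([1.02, 2.51, 1.50])  # slightly inconsistent measured values

# Parametric least squares estimate from the normal equations:
#   X_hat = (A^T A)^{-1} A^T Y
X_hat = np.linalg.solve(A.T @ A, A.T @ Y)

# Adjusted observations and residuals; the sum of squared residuals
# is minimized, and the residual vector is orthogonal to the columns of A.
Y_hat = A @ X_hat
v = Y_hat - Y
print("estimated parameters:", X_hat)
print("residuals:", v)
```

The same estimate can be obtained with `np.linalg.lstsq(A, Y)`, which is numerically preferable for ill-conditioned design matrices because it avoids forming AᵀA explicitly.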
