Least squares is a form of approximation used to fit a model to experimental data, so that predictions can be made from the fitted model.
Example
Given a set of points

$$(x_1, y_1), (x_2, y_2), \ldots, (x_N, y_N),$$

and the deviation of each point from the line $y = ax + b$,

$$e_i = y_i - (a x_i + b),$$

minimize the sum of squared deviations

$$S(a, b) = \sum_{i=1}^{N} \left( y_i - (a x_i + b) \right)^2$$

by changing the values of $a$ and $b$.
Finding the minimum involves taking a look at the partial derivatives:

$$\frac{\partial S}{\partial a} = -2 \sum_{i=1}^{N} x_i \left( y_i - a x_i - b \right)$$

$$\frac{\partial S}{\partial b} = -2 \sum_{i=1}^{N} \left( y_i - a x_i - b \right)$$
Setting each derivative to zero, removing the unnecessary constant factor of $-2$, and simplifying:

$$a \sum_{i=1}^{N} x_i^2 + b \sum_{i=1}^{N} x_i = \sum_{i=1}^{N} x_i y_i$$

$$a \sum_{i=1}^{N} x_i + b N = \sum_{i=1}^{N} y_i$$
This can be rewritten as a 2×2 linear system:

$$\begin{bmatrix} \sum x_i^2 & \sum x_i \\ \sum x_i & N \end{bmatrix} \begin{bmatrix} a \\ b \end{bmatrix} = \begin{bmatrix} \sum x_i y_i \\ \sum y_i \end{bmatrix}$$
Note that some systems may be more complex and may involve more than two parameters.
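The 2×2 system above can be solved directly in code. A minimal sketch, using hypothetical sample data that lies roughly on the line y = 2x + 1:

```python
import numpy as np

# Hypothetical sample data: points roughly on the line y = 2x + 1.
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([1.1, 2.9, 5.2, 6.8, 9.1])

N = len(x)

# Build the 2x2 normal-equation system derived above:
# [ sum(x_i^2)  sum(x_i) ] [a]   [ sum(x_i * y_i) ]
# [ sum(x_i)    N        ] [b] = [ sum(y_i)       ]
A = np.array([[np.sum(x**2), np.sum(x)],
              [np.sum(x),    N]])
rhs = np.array([np.sum(x * y), np.sum(y)])

# Solve for the slope a and intercept b.
a, b = np.linalg.solve(A, rhs)
print(f"slope a = {a:.4f}, intercept b = {b:.4f}")
```

In practice one would typically call a library routine such as `np.polyfit(x, y, 1)` instead of forming the normal equations by hand, but building the system explicitly mirrors the derivation above.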
Recursive least-squares algorithm
The recursive least-squares formula is designed for real-time estimation: the estimate is updated as each new entry arrives, rather than recomputing a batch result each time.[1]

$$P = \left[ \sum_{t=1}^{N} \phi(t) \phi^{T}(t) \right]^{-1} = (\Phi \Phi^{T})^{-1}$$

$$P^{-1} = \sum_{t=1}^{N} \phi(t) \phi^{T}(t) = \Phi \Phi^{T}$$

$$b = \sum_{t=1}^{N} y(t) \phi(t)$$
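The recursive update itself is not spelled out above, so the following is a sketch of a standard RLS step (without a forgetting factor), assuming a scalar observation $y(t)$ and regressor vector $\phi(t)$; the helper name `rls_update` and the sample stream are illustrative, not from the source:

```python
import numpy as np

def rls_update(theta, P, phi, y):
    """One standard recursive least-squares step (no forgetting factor).

    theta: current parameter estimate, shape (n,)
    P:     current inverse-information matrix, shape (n, n)
    phi:   new regressor vector, shape (n,)
    y:     new scalar observation
    """
    Pphi = P @ phi
    denom = 1.0 + phi @ Pphi            # scalar normalizer
    K = Pphi / denom                    # gain vector
    theta = theta + K * (y - phi @ theta)   # correct estimate by prediction error
    P = P - np.outer(Pphi, Pphi) / denom    # rank-1 downdate of P
    return theta, P

# Hypothetical data stream from y = 2x + 1, with regressor phi = [x, 1].
theta = np.zeros(2)
P = np.eye(2) * 1e6   # large initial P acts as an uninformative prior
for x_t, y_t in [(0.0, 1.0), (1.0, 3.0), (2.0, 5.0), (3.0, 7.0)]:
    theta, P = rls_update(theta, P, np.array([x_t, 1.0]), y_t)

print(theta)  # converges toward [2, 1]
```

Each step costs $O(n^2)$ rather than re-solving the full batch system, which is the point of the recursive formulation.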
References