Let $W \subseteq \mathbb{R}^m$ be a subspace with a basis $\{a_1, \dots, a_n\}$, and collect the basis vectors as the columns of $A = [\,a_1 \ \cdots \ a_n\,]$. The projection of a vector $b \in \mathbb{R}^m$ onto $W$ is $\mathrm{proj}_W(b) = A\hat{x}$, where $\hat{x} \in \mathbb{R}^n$ is chosen so that $b - A\hat{x} \perp W$. Hence, $a_i^T(b - A\hat{x}) = 0$ for each $i$, and hence $A^T(b - A\hat{x}) = 0$. Hence, we get:

$$A^T A \hat{x} = A^T b.$$
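Below is a minimal numerical sketch of this projection; the matrix $A$ and vector $b$ are hypothetical, chosen only for illustration.

```python
import numpy as np

# Hypothetical basis of W, stored as the columns of A.
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])
b = np.array([1.0, 2.0, 0.0])

# Normal equations: A^T A x_hat = A^T b.
x_hat = np.linalg.solve(A.T @ A, A.T @ b)
proj = A @ x_hat  # proj_W(b) = A x_hat

print(proj)              # [0. 1. 1.]
print(A.T @ (b - proj))  # ~[0. 0.]: the residual is orthogonal to W
```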
2. Least Squares
Suppose $Ax = b$, where $A \in \mathbb{R}^{m \times n}$, $x \in \mathbb{R}^n$, $b \in \mathbb{R}^m$, has no solution, which is the case when $b \notin \mathrm{Im}(A)$. Hence, $Ax - b \neq 0$ for every $x$. We can find the least squares solution by minimizing the residual $\|Ax - b\|$. This can be rewritten as $\hat{x} = \operatorname*{arg\,min}_{x \in \mathbb{R}^n} \|Ax - b\|^2$.
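For instance, with a hypothetical inconsistent system, NumPy's `np.linalg.lstsq` computes exactly this minimizer:

```python
import numpy as np

# Hypothetical inconsistent system: 3 equations, 2 unknowns, b not in Im(A).
A = np.array([[1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0]])
b = np.array([2.0, 3.0, 5.0])

# np.linalg.lstsq returns the x minimizing ||Ax - b||^2.
x_hat, res, rank, sv = np.linalg.lstsq(A, b, rcond=None)
print(x_hat)                           # the least squares solution
print(np.linalg.norm(A @ x_hat - b))   # minimal residual norm, > 0 here
```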
2.1 Image and Null Properties
Although $b \notin \mathrm{Im}(A)$, it can be decomposed into a sum $b = b_{\mathrm{Im}} + b_{\perp}$ where $b_{\mathrm{Im}} \in \mathrm{Im}(A)$ and $b_{\perp} \in \mathrm{Im}(A)^{\perp}$.
(!) Image and Null Reminder
$\mathrm{Im}(A) = \{Ax : x \in \mathbb{R}^n\} \subseteq \mathbb{R}^m$, the span of the columns of $A$.
$\mathrm{Null}(A) = \{x \in \mathbb{R}^n : Ax = 0\} \subseteq \mathbb{R}^n$.
The rank of $A$ is the dimension of the image of $A$, and the nullity is the dimension of the null space of $A$.
The rank-nullity theorem states that $\mathrm{rank}(A) + \mathrm{nullity}(A) = n$, the number of columns of $A$.
We also know that $\mathrm{Im}(A)^{\perp} = \mathrm{Null}(A^T)$, as the following proposition shows. Hence, $A^T b_{\perp} = 0$.
(!) Proposition
Suppose $v \in \mathrm{Im}(A)^{\perp}$.
Let $w = Ax \in \mathrm{Im}(A)$ be arbitrary; then $w^T v = 0$.
Then $x^T A^T v = 0$ for every $x$, which forces $A^T v = 0$, i.e. $v \in \mathrm{Null}(A^T)$.
Hence, $\mathrm{Im}(A)^{\perp} \subseteq \mathrm{Null}(A^T)$; reading the same equalities backwards gives the reverse inclusion, so $\mathrm{Im}(A)^{\perp} = \mathrm{Null}(A^T)$.
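A small numerical check of this proposition, using a hypothetical $3 \times 2$ matrix; the rows of `null_AT` below span $\mathrm{Null}(A^T)$ and come out orthogonal to every column of $A$:

```python
import numpy as np

# Hypothetical matrix; Im(A) is a plane in R^3.
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])

# Null(A^T) from the SVD of A^T: the rows of Vt beyond the rank.
U, s, Vt = np.linalg.svd(A.T)
rank = int(np.sum(s > 1e-10))
null_AT = Vt[rank:]              # basis of Null(A^T); one row here

print(null_AT @ A)               # ~[[0. 0.]]: Null(A^T) is orthogonal to Im(A)
print(rank + null_AT.shape[0])   # 3 = rank + nullity of A^T (rank-nullity)
```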
The following properties hold:
If $b = b_{\mathrm{Im}} + b_{\perp}$ where $b_{\mathrm{Im}} \in \mathrm{Im}(A)$ and $b_{\perp} \in \mathrm{Im}(A)^{\perp}$, then $b_{\mathrm{Im}}$ and $b_{\perp}$ are unique. In particular, $b_{\mathrm{Im}} = \mathrm{proj}_{\mathrm{Im}(A)}(b)$ and $A^T b_{\perp} = 0$.
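One way to see the uniqueness concretely: computing $b_{\mathrm{Im}}$ from two different bases of the same subspace gives the same vector. The bases and $b$ below are hypothetical.

```python
import numpy as np

# Two different bases for the same plane W in R^3:
A1 = np.array([[1.0, 0.0],
               [0.0, 1.0],
               [1.0, 1.0]])
A2 = A1 @ np.array([[2.0, 1.0],
                    [0.0, 1.0]])  # invertible change of basis: same column space

b = np.array([1.0, 2.0, 0.0])

def image_component(A, b):
    # b_Im = A x_hat, where A^T A x_hat = A^T b.
    return A @ np.linalg.solve(A.T @ A, A.T @ b)

print(image_component(A1, b))  # [0. 1. 1.]
print(image_component(A2, b))  # [0. 1. 1.]: the decomposition is unique
```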
2.2 Least Squares Solution
Now, we can rewrite $\|Ax - b\|^2 = \|Ax - b_{\mathrm{Im}} - b_{\perp}\|^2$. We can now further rewrite this as $\|Ax - b_{\mathrm{Im}}\|^2 - 2(Ax - b_{\mathrm{Im}})^T b_{\perp} + \|b_{\perp}\|^2$, which is equal to $\|Ax - b_{\mathrm{Im}}\|^2 + \|b_{\perp}\|^2$. We can remove the cross term as $Ax - b_{\mathrm{Im}} \in \mathrm{Im}(A)$ is orthogonal to $b_{\perp}$. Hence, $\min_x \|Ax - b\|^2 = \|b_{\perp}\|^2$, attained exactly when $Ax = b_{\mathrm{Im}}$.
Now, we must find $\hat{x}$ with $A\hat{x} = b_{\mathrm{Im}}$ in order to minimize the residual. We can rewrite this condition: $b - A\hat{x} = b_{\perp} \in \mathrm{Null}(A^T)$, so $A^T(b - A\hat{x}) = 0$. Hence, to get the least squares solution, we just need to solve: $A^T A \hat{x} = A^T b$.
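A minimal sketch of solving these normal equations numerically, with a hypothetical $A$ and $b$, checking that the result matches NumPy's built-in least squares routine:

```python
import numpy as np

# Hypothetical overdetermined system (3 equations, 2 unknowns).
A = np.array([[1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0]])
b = np.array([2.0, 3.0, 5.0])

# Solve the normal equations A^T A x_hat = A^T b directly ...
x_normal = np.linalg.solve(A.T @ A, A.T @ b)
# ... and compare against np.linalg.lstsq, which minimizes ||Ax - b||.
x_lstsq, *_ = np.linalg.lstsq(A, b, rcond=None)

print(x_normal, np.allclose(x_normal, x_lstsq))  # [0.333... 1.5] True
```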
(?) Example
Let $A \in \mathbb{R}^{m \times n}$ and $b \in \mathbb{R}^m$ with $b \notin \mathrm{Im}(A)$, so $Ax = b$ has no solution. Instead, we can solve $A^T A \hat{x} = A^T b$ to find the $\hat{x}$ that minimizes the residual, also known as the least squares solution.
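For concreteness, here is one worked instance; the original example's numbers are not recoverable, so these are illustrative (the same system as in the sketch above):

$$A = \begin{bmatrix} 1 & 1 \\ 1 & 2 \\ 1 & 3 \end{bmatrix}, \quad b = \begin{bmatrix} 2 \\ 3 \\ 5 \end{bmatrix}, \qquad A^T A = \begin{bmatrix} 3 & 6 \\ 6 & 14 \end{bmatrix}, \quad A^T b = \begin{bmatrix} 10 \\ 23 \end{bmatrix},$$

so solving $A^T A \hat{x} = A^T b$ gives $\hat{x} = (1/3,\ 3/2)^T$, and the residual $b - A\hat{x} = (1/6,\ -1/3,\ 1/6)^T$ is orthogonal to both columns of $A$, as the theory predicts.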
3. Linear Regression
Linear regression is a method to find the best-fit line for a given set of data points. We can model this as a least squares problem.
Let the data points be $(x_i, y_i)$ for $i = 1, \dots, m$, where here $m > 2$. We want to find the line $y = c_0 + c_1 x$ that best fits the points. Let $A = \begin{bmatrix} 1 & x_1 \\ \vdots & \vdots \\ 1 & x_m \end{bmatrix}$ and $b = \begin{bmatrix} y_1 \\ \vdots \\ y_m \end{bmatrix}$; the least squares solution of $Ac = b$ for $c = (c_0, c_1)^T$ gives the best-fit line.
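A minimal sketch of this straight-line fit, with made-up data points:

```python
import numpy as np

# Hypothetical data points (x_i, y_i), roughly on the line y = 1 + 2x.
xs = np.array([0.0, 1.0, 2.0, 3.0])
ys = np.array([1.1, 2.9, 5.2, 6.8])

# Design matrix with rows (1, x_i), as defined above.
A = np.column_stack([np.ones_like(xs), xs])
c0, c1 = np.linalg.lstsq(A, ys, rcond=None)[0]
print(c0, c1)  # intercept and slope of the best-fit line
```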
Suppose $y$ as a function of $x$ is given as $y = \sum_{j=1}^{n} c_j \phi_j(x)$, where $c_j$ are fixed parameters and $\phi_j$ are basis functions. We then get a matrix $A$ with entries $A_{ij} = \phi_j(x_i)$, and $b = (y_1, \dots, y_m)^T$, and the model becomes $Ac \approx b$. Hence, we can minimize $\|Ac - b\|^2$ for $c$ by least squares. This means that we can fit models that are non-linear in $x$, as long as they are linear in the parameters $c_j$, using linear regression.
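A sketch of this recipe under assumed choices: a polynomial basis $\{1, x, x^2\}$ and synthetic data generated from $y = 1 + x^2$.

```python
import numpy as np

# Hypothetical data points (x_i, y_i), generated from y = 1 + x^2.
xs = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
ys = np.array([1.0, 2.0, 5.0, 10.0, 17.0])

# Basis functions phi_1(x) = 1, phi_2(x) = x, phi_3(x) = x^2.
basis = [lambda x: np.ones_like(x), lambda x: x, lambda x: x**2]

# Design matrix: A[i, j] = phi_j(x_i).
A = np.column_stack([phi(xs) for phi in basis])

# Least squares fit of the coefficients c.
c, *_ = np.linalg.lstsq(A, ys, rcond=None)
print(c)  # ~[1, 0, 1]: recovers y = 1 + x^2
```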
(?) Example
Suppose $y = c_1 \phi_1(x) + c_2 \phi_2(x)$. Then, given data points $(x_i, y_i)$ for $i = 1, \dots, m$, each point contributes one equation $c_1 \phi_1(x_i) + c_2 \phi_2(x_i) = y_i$. Our basis functions are $\phi_1$ and $\phi_2$. Hence, we can write $A = \begin{bmatrix} \phi_1(x_1) & \phi_2(x_1) \\ \vdots & \vdots \\ \phi_1(x_m) & \phi_2(x_m) \end{bmatrix}$ and $b = \begin{bmatrix} y_1 \\ \vdots \\ y_m \end{bmatrix}$. We must find $c = (c_1, c_2)^T$ such that $\|Ac - b\|^2$ is minimized.
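A sketch of this two-basis-function case, with the hypothetical choice $\phi_1(x) = x$, $\phi_2(x) = \sin x$ and synthetic data:

```python
import numpy as np

# Synthetic data generated from y = 2x + 0.5 sin(x), so c = (2, 0.5).
xs = np.linspace(0.0, 3.0, 20)
ys = 2.0 * xs + 0.5 * np.sin(xs)

# Design matrix: row i is (phi_1(x_i), phi_2(x_i)) = (x_i, sin(x_i)).
A = np.column_stack([xs, np.sin(xs)])
c, *_ = np.linalg.lstsq(A, ys, rcond=None)
print(c)  # recovers ~[2.0, 0.5]
```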