$A$ is positive semi-definite iff $x^T A x \ge 0$ for all $x$. This means all its eigenvalues are non-negative.
$A$ is positive definite iff $x^T A x > 0$ for all $x \neq 0$. This means all its eigenvalues are strictly positive.
For any arbitrary matrix $A$, the matrices $A^T A$ and $A A^T$ are always symmetric and positive semi-definite.
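As a quick numerical sanity check (a sketch using NumPy; the matrix here is an arbitrary random choice), we can verify that $A^T A$ is symmetric with non-negative eigenvalues:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 3))  # arbitrary rectangular matrix

G = A.T @ A  # the 3x3 Gram matrix A^T A

# Symmetric:
assert np.allclose(G, G.T)

# Positive semi-definite: all eigenvalues non-negative
# (small tolerance for floating-point error)
assert np.all(np.linalg.eigvalsh(G) >= -1e-10)
```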
2. Singular Value Decomposition
The Singular Value Decomposition (SVD) of $A \in \mathbb{R}^{m \times n}$ is the decomposition of $A$ into the 3 matrices $A = U \Sigma V^T$, where $U \in \mathbb{R}^{m \times m}$ is orthogonal, $V \in \mathbb{R}^{n \times n}$ is orthogonal, and $\Sigma \in \mathbb{R}^{m \times n}$ is diagonal with $\Sigma_{ii} = \sigma_i$ and $\sigma_1 \ge \sigma_2 \ge \dots \ge \sigma_{\min(m,n)} \ge 0$. These are the singular values of $A$.
If $r$ is the rank of $A$ then $A$ has $r$ positive singular values and $\min(m,n) - r$ singular values equal to $0$.
Then, if $u_1, \dots, u_m$ are the columns of $U$ and $v_1, \dots, v_n$ are the columns of $V$, we have $A = \sum_{i=1}^{r} \sigma_i u_i v_i^T$.
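The rank-$r$ outer-product form can be checked numerically; here `np.linalg.svd` supplies $U$, the singular values, and $V^T$ for a small example matrix of my own choosing:

```python
import numpy as np

A = np.array([[3.0, 1.0],
              [1.0, 3.0],
              [0.0, 2.0]])

U, s, Vt = np.linalg.svd(A)  # full SVD: U is 3x3, Vt is V^T (2x2)
r = int(np.sum(s > 1e-12))   # numerical rank: number of positive sigma_i

# Rebuild A as the sum of r rank-one terms sigma_i * u_i * v_i^T
A_rebuilt = sum(s[i] * np.outer(U[:, i], Vt[i, :]) for i in range(r))
assert np.allclose(A, A_rebuilt)
```

Note that NumPy returns $V^T$ (as `Vt`), not $V$, and the singular values already sorted in decreasing order.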
Let $A = U \Sigma V^T$, where $m \ge n$:
$U \in \mathbb{R}^{m \times m}$, is orthogonal, $U^T U = U U^T = I_m$.
$V \in \mathbb{R}^{n \times n}$, is orthogonal, $V^T V = V V^T = I_n$.
$\Sigma \in \mathbb{R}^{m \times n}$, where $\Sigma = \begin{bmatrix} D \\ 0 \end{bmatrix}$, $D = \operatorname{diag}(\sigma_1, \dots, \sigma_n)$, $\sigma_1 \ge \dots \ge \sigma_n \ge 0$, and $0$ is the $(m-n) \times n$ zero matrix.
2.1 Construction of SVD
Suppose the SVD $A = U \Sigma V^T$ as above exists.
$A^T A = (U \Sigma V^T)^T (U \Sigma V^T) = V \Sigma^T U^T U \Sigma V^T = V \Sigma^T \Sigma V^T$.
Hence $A^T A = V \Lambda V^T$ where $\Lambda = \Sigma^T \Sigma = \operatorname{diag}(\sigma_1^2, \dots, \sigma_n^2)$.
$\Lambda$ is a diagonal matrix, thus giving the spectral decomposition of the positive semi-definite matrix $A^T A$.
Hence $V$ is the matrix of orthonormal eigenvectors of $A^T A$, and $\sigma_1^2, \dots, \sigma_n^2$ are the eigenvalues of $A^T A$.
As $AV = U \Sigma$, $A v_i = \sigma_i u_i$. Thus, $u_i = \frac{1}{\sigma_i} A v_i$ for $1 \le i \le r$.
Hence, the remaining columns $u_{r+1}, \dots, u_m$ can be found using Gram-Schmidt between $u_1, \dots, u_r$ and the standard basis vectors of $\mathbb{R}^m$.
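The construction above can be sketched in NumPy (the matrix is illustrative): diagonalize $A^T A$ to obtain $V$ and the $\sigma_i^2$, then form $u_i = \frac{1}{\sigma_i} A v_i$:

```python
import numpy as np

A = np.array([[2.0, 0.0],
              [1.0, 1.0],
              [0.0, 2.0]])

# Spectral decomposition of A^T A. eigh returns eigenvalues in
# ascending order, so flip to get sigma_1 >= sigma_2 >= ...
lam, V = np.linalg.eigh(A.T @ A)
lam, V = lam[::-1], V[:, ::-1]
sigma = np.sqrt(np.clip(lam, 0.0, None))  # clip tiny negatives to 0

# u_i = (1/sigma_i) A v_i for the positive singular values
r = int(np.sum(sigma > 1e-12))
U_r = A @ V[:, :r] / sigma[:r]

# The rank-r part already reconstructs A (the zero singular values
# contribute nothing), and the u_i are orthonormal
assert np.allclose(A, U_r @ np.diag(sigma[:r]) @ V[:, :r].T)
assert np.allclose(U_r.T @ U_r, np.eye(r))
```

The remaining columns $u_{r+1}, \dots, u_m$ could then be filled in with a Gram-Schmidt (QR) completion, but they never affect the product $U \Sigma V^T$.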
(!) Example
Let $A$ be a $3 \times 2$ matrix.
We will work with $A^T A$ because it is smaller ($2 \times 2$, versus $3 \times 3$ for $A A^T$).
Compute $A^T A$. Now, we must find the spectral decomposition of this matrix.
Solve the characteristic equation $\det(A^T A - \lambda I) = 0$. So we obtain the eigenvalues $\lambda_1 \ge \lambda_2 \ge 0$.
We can find the singular values as $\sigma_i = \sqrt{\lambda_i}$: $\sigma_1 = \sqrt{\lambda_1}$ and $\sigma_2 = \sqrt{\lambda_2}$.
The first eigenvector of $A^T A$: solve $(A^T A - \lambda_1 I) v_1 = 0$ and normalize. So we obtain the unit eigenvector $v_1$.
The second eigenvector of $A^T A$: solve $(A^T A - \lambda_2 I) v_2 = 0$ and normalize. So we obtain $v_2$, orthogonal to $v_1$.
Since $A V = U \Sigma$, $A v_i = \sigma_i u_i$. So $u_i = \frac{1}{\sigma_i} A v_i$. This only works up until $i = r = 2$.
Now, we must find $u_3$ where $u_3 \perp u_1$ and $u_3 \perp u_2$.
We can easily pick an orthogonal $u_3$ by extending $\{u_1, u_2\}$ to an orthonormal basis of $\mathbb{R}^3$ with Gram-Schmidt.
$u_3$ can also be found with a cross product, $u_3 = u_1 \times u_2$.
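The example's steps can be reproduced numerically; since the original matrix entries are not shown above, the $3 \times 2$ matrix below is a hypothetical stand-in:

```python
import numpy as np

# Hypothetical 3x2 matrix standing in for the example's A
A = np.array([[1.0, 1.0],
              [0.0, 1.0],
              [1.0, 0.0]])

lam, V = np.linalg.eigh(A.T @ A)   # eigenvalues in ascending order
lam, V = lam[::-1], V[:, ::-1]     # reorder to descending
sigma = np.sqrt(lam)               # singular values

u1 = A @ V[:, 0] / sigma[0]        # u_i = (1/sigma_i) A v_i
u2 = A @ V[:, 1] / sigma[1]
u3 = np.cross(u1, u2)              # unit vector orthogonal to u1 and u2

U = np.column_stack([u1, u2, u3])
assert np.allclose(U.T @ U, np.eye(3))  # U is orthogonal
```

For this stand-in matrix, $A^T A = \begin{bmatrix} 2 & 1 \\ 1 & 2 \end{bmatrix}$, so $\lambda_1 = 3$, $\lambda_2 = 1$ and $\sigma_1 = \sqrt{3}$, $\sigma_2 = 1$.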
2.2 Properties of SVD
The singular values of $A$ are $\sigma_i = \sqrt{\lambda_i}$, where $\lambda_1 \ge \dots \ge \lambda_n \ge 0$ are the eigenvalues of $A^T A$.
$A = \sum_{i=1}^{r} \sigma_i u_i v_i^T$. Since the first term here is most important (it carries the largest singular value), we can approximate $A \approx \sigma_1 u_1 v_1^T$.
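A small check of this rank-one approximation (the matrix is illustrative); a known result (the Eckart-Young theorem) says the spectral-norm error of the truncated SVD equals the first dropped singular value:

```python
import numpy as np

A = np.array([[4.0, 0.0, 1.0],
              [0.0, 3.0, 0.0],
              [1.0, 0.0, 0.5]])

U, s, Vt = np.linalg.svd(A)

# Keep only the dominant term sigma_1 * u_1 * v_1^T
A1 = s[0] * np.outer(U[:, 0], Vt[0, :])

# Spectral norm of the error equals sigma_2, the largest
# singular value that was dropped
err = np.linalg.norm(A - A1, ord=2)
assert np.isclose(err, s[1])
```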
3. Principal Component Analysis
If $A = U \Sigma V^T$ is the SVD of $A$, then the columns of $V$, $v_1, \dots, v_n$, are the principal axes of $A$. The first principal axis is $v_1$, the second is $v_2$, and so on.
The principal components of $A$ are the columns of $U \Sigma = A V$. The first principal component is $A v_1 = \sigma_1 u_1$, the second is $A v_2 = \sigma_2 u_2$, and so on.
If $\sigma_1 \gg \sigma_2$, then the data contained in $A$ can be compressed by projecting in the direction of the first principal axis: $A v_1 = \sigma_1 u_1$.
This form of data compression is called principal component analysis (PCA) and is an example of a dimensionality reduction algorithm.
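A minimal PCA sketch along these lines, assuming the rows of the data matrix are observations (the synthetic data and variable names are illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic data: 200 points in R^3 lying close to a line,
# so one principal axis captures almost all the variation
t = rng.standard_normal(200)
X = np.outer(t, [2.0, 1.0, 0.5]) + 0.01 * rng.standard_normal((200, 3))
X = X - X.mean(axis=0)   # center the data (standard for PCA)

U, s, Vt = np.linalg.svd(X, full_matrices=False)

# Project each row onto the first principal axis v_1:
# one number per point (this equals sigma_1 * u_1)
scores = X @ Vt[0, :]

# Reconstruct the data from the 1-D compression
X_hat = np.outer(scores, Vt[0, :])
assert np.linalg.norm(X - X_hat) / np.linalg.norm(X) < 0.05
```

Since the points nearly lie on a line, 200 three-dimensional points compress to 200 scalars with only a small relative reconstruction error.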