Singular value decomposition (SVD) is a technique for factorizing matrices. Given a matrix A of dimensions m × n, SVD decomposes A into three matrices, A = UΣVᵀ:
- U is an m × m orthogonal matrix whose columns are called the left singular vectors of A.
- Σ is an m × n diagonal matrix with non-negative real numbers on the diagonal.
- These diagonal entries are known as the singular values of A, and they are usually arranged in descending order.
- The singular values give insight into the rank and condition number of A.
- Vᵀ is the transpose of an n × n orthogonal matrix V whose columns are the right singular vectors of A.
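The factorization above can be sketched numerically. This is a minimal example assuming NumPy (the text names no library); `np.linalg.svd` returns U, the singular values as a vector, and Vᵀ directly.

```python
import numpy as np

# A small example matrix (values chosen arbitrarily for illustration)
A = np.array([[3.0, 1.0, 1.0],
              [-1.0, 3.0, 1.0]])

# full_matrices=True returns U as m x m and Vt as n x n
U, s, Vt = np.linalg.svd(A, full_matrices=True)

# Rebuild the m x n diagonal matrix Sigma from the vector of singular values
Sigma = np.zeros(A.shape)
np.fill_diagonal(Sigma, s)

# The factorization reproduces A up to floating-point error
print(np.allclose(A, U @ Sigma @ Vt))  # True
```

Note that NumPy returns the singular values as a 1-D array already sorted in descending order, so Σ has to be rebuilt as a rectangular matrix before multiplying the factors back together.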
SVD provides a lower-rank approximation of the original matrix. In the matrix Σ, we have singular values σ₁, σ₂, …, σᵣ on the diagonal, in descending order such that σ₁ ≥ σ₂ ≥ … ≥ σᵣ ≥ 0. To approximate A with a rank-k matrix, we keep only the top k singular values, truncating U to keep its first k columns and Vᵀ to keep its first k rows.
The intuition behind this is that each singular value represents the magnitude of the contribution of its pair of singular vectors to the matrix A. By keeping only the top k singular values, we retain the most significant components of A, which capture the bulk of the information or variability in the data, while possibly reducing noise.
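The truncation step can be sketched as follows. This is an illustrative example assuming NumPy; the matrix is constructed to be nearly rank 2, so keeping k = 2 singular values recovers it almost exactly.

```python
import numpy as np

rng = np.random.default_rng(0)
# A 6 x 5 matrix that is exactly rank 2, plus a little noise
A = rng.standard_normal((6, 2)) @ rng.standard_normal((2, 5))
A += 0.01 * rng.standard_normal(A.shape)

U, s, Vt = np.linalg.svd(A, full_matrices=False)

k = 2  # number of singular values to keep
# First k columns of U, top-left k x k block of Sigma, first k rows of Vt
A_k = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]

# Relative error of the rank-2 approximation is tiny
rel_error = np.linalg.norm(A - A_k) / np.linalg.norm(A)
print(rel_error)
```

Because the discarded singular values σ₃, σ₄, σ₅ correspond only to the added noise, the relative error is on the order of the noise level rather than the data itself.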
Applications
- Data Compression: SVD can compress data by identifying patterns and removing redundancies, which is useful in image compression and digital signal processing. By keeping only the largest singular values (and corresponding singular vectors), SVD provides a lower-rank approximation of the original data, reducing its size while retaining essential features.
- Dimensionality Reduction: Principal Component Analysis (PCA), a common technique for dimensionality reduction, is essentially based on SVD. It transforms the data to a new coordinate system, reducing its dimensionality by selecting the most significant directions (principal components).
- Noise Reduction: SVD can separate the signal from noise in data. By truncating smaller singular values (which often correspond to noise), SVD can produce a cleaner, denoised version of the data.
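The PCA connection mentioned above can be sketched briefly. This is an illustrative example assuming NumPy: after centering the data, the rows of Vᵀ are the principal directions and the squared singular values are proportional to the variance explained by each component.

```python
import numpy as np

rng = np.random.default_rng(1)
# 100 samples in 3 dimensions, varying mostly along one direction
X = rng.standard_normal((100, 1)) @ np.array([[2.0, 1.0, 0.5]])
X += 0.1 * rng.standard_normal(X.shape)

# Center the data, then take the SVD; rows of Vt are principal directions
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)

# Project onto the top principal component (3 dimensions -> 1)
scores = Xc @ Vt[0]

# Fraction of total variance explained by the first component
explained = s[0]**2 / np.sum(s**2)
```

Since the data was built to vary along a single direction, the first component explains nearly all of the variance, which is exactly why discarding the remaining components loses little information.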
Singular Vectors
Left Singular Vectors
The columns of U are called the left singular vectors of A. They are orthonormal vectors that form a basis for the column space of A (also known as the range of A). The left singular vectors are eigenvectors of the matrix AAᵀ.
Right Singular Vectors
The columns of V (or rows of Vᵀ) are the right singular vectors of A. These are orthonormal vectors that form a basis for the row space of A. The right singular vectors are eigenvectors of the matrix AᵀA.
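The eigenvector relationships above can be checked numerically. This is a minimal sketch assuming NumPy: each left singular vector u satisfies AAᵀu = σ²u, and each right singular vector v satisfies AᵀAv = σ²v.

```python
import numpy as np

A = np.array([[2.0, 0.0, 1.0],
              [0.0, 1.0, 3.0]])

U, s, Vt = np.linalg.svd(A, full_matrices=False)

# Columns of U are eigenvectors of A A^T, with eigenvalues sigma**2
for i, sigma in enumerate(s):
    u = U[:, i]
    assert np.allclose(A @ A.T @ u, sigma**2 * u)

# Columns of V (rows of Vt) are eigenvectors of A^T A, with eigenvalues sigma**2
for i, sigma in enumerate(s):
    v = Vt[i]
    assert np.allclose(A.T @ A @ v, sigma**2 * v)
```

This is also why the singular values are the square roots of the eigenvalues of AᵀA (or AAᵀ), which is one common way SVD is derived.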