Factors a matrix $A \in \R^{m \times n}$ in terms of the eigenvectors $U$ of $AA^T$ (known as left singular vectors), the eigenvectors $V$ of $A^TA$ (known as right singular vectors), and the square roots $\Sigma$ of the shared eigenvalues of both products (known as singular values). Since both products are symmetric positive semidefinite, $U$ and $V$ can be chosen orthogonal and $\Sigma$ is real and nonnegative.
$$ A = U \Sigma V^T $$
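As a quick sanity check, the factorization can be verified numerically. A minimal sketch using NumPy, with an arbitrary matrix chosen only for illustration:

```python
import numpy as np

A = np.array([[3.0, 1.0],
              [1.0, 3.0],
              [0.0, 2.0]])  # m = 3, n = 2

# np.linalg.svd returns V^T directly and the singular values as a 1-D
# array; full_matrices=True gives the full m x m U and n x n V.
U, sigma, Vt = np.linalg.svd(A, full_matrices=True)

# Rebuild the m x n Sigma with the singular values on its diagonal.
Sigma = np.zeros(A.shape)
np.fill_diagonal(Sigma, sigma)

assert np.allclose(A, U @ Sigma @ Vt)     # A = U Sigma V^T
assert np.allclose(U.T @ U, np.eye(3))    # U is orthogonal
assert np.allclose(Vt @ Vt.T, np.eye(2))  # V is orthogonal
```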
Singular values $\sigma$ are, by convention, ordered so that $\sigma_i \geq \sigma_j$ for $i < j$, i.e. in descending order. Furthermore, the nonzero singular values always correspond to the nonzero eigenvalues of the smaller of the two products, since there are only $r \leq \min\{m, n\}$ of them; it therefore suffices to work with whichever product is smaller.
Additionally, $C(V_r) = C(A^T) \sube C(V)$ and $C(U_r) = C(A) \sube C(U)$, which follow from expanding $A^T \mathbf{x}$ and $A\mathbf{x}$ in the singular vector bases respectively.
$$ C(V) = C(V_r) \oplus C(V_{n - r}) = C(A^T) \oplus \mathcal{N}(A) = \text{Domain}(A) = \R^n \\
C(U) = C(U_r) \oplus C(U_{m - r}) = C(A) \oplus \mathcal{N}(A^T) = \text{Codomain}(A) = \R^m $$
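These subspace identities can also be checked numerically. A sketch with an arbitrary rank-1 matrix, showing that the trailing columns of $V$ span $\mathcal{N}(A)$ and the trailing columns of $U$ span $\mathcal{N}(A^T)$:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [2.0, 4.0],
              [3.0, 6.0]])  # rank r = 1, m = 3, n = 2

U, sigma, Vt = np.linalg.svd(A, full_matrices=True)
r = int(np.sum(sigma > 1e-10))  # numerical rank

# Columns of V past index r lie in the nullspace of A ...
assert np.allclose(A @ Vt.T[:, r:], 0)
# ... and columns of U past index r lie in the nullspace of A^T.
assert np.allclose(A.T @ U[:, r:], 0)
```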
Singular value decomposition is carried out by first finding the singular values and then their associated right and left singular vectors.
Singular Values. These can be found by considering either $AA^T$ or $A^TA$, finding the eigenvalues of the selected product, then taking the square root of each to produce the singular values.
Right Singular Vectors. If $A^TA$ is the chosen product, the right singular vectors are simply its normalized eigenvectors.
Left Singular Vectors. The remaining singular vectors can be determined through the following relation, valid whenever $\sigma_k \neq 0$,
$$ \mathbf{u}_k = \frac{1}{\sigma _k} A \mathbf{v}_k \hspace{30pt} \mathbf{v}_k = \frac{1}{\sigma _k} A^T\mathbf{u}_k $$
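A sketch of this hand procedure, assuming the $A^TA$ route and an arbitrary full-rank example matrix:

```python
import numpy as np

A = np.array([[3.0, 0.0],
              [4.0, 5.0]])

# eigh handles the symmetric product and returns real eigenpairs,
# but in ascending order, so re-sort to match the convention.
eigvals, V = np.linalg.eigh(A.T @ A)
order = np.argsort(eigvals)[::-1]
eigvals, V = eigvals[order], V[:, order]

sigma = np.sqrt(eigvals)  # singular values
U = (A @ V) / sigma       # u_k = A v_k / sigma_k, columnwise

Sigma = np.diag(sigma)
assert np.allclose(A, U @ Sigma @ V.T)
```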
If there are deficiencies in $U$ or $V$, i.e. $A$ is rank deficient so that $r < m$ or $r < n$, then the relation above only yields $r$ vectors. You must add in unit vectors to span the rest of the codomain or domain (that is, bases for $\mathcal{N}(A^T)$ or $\mathcal{N}(A)$), performing Gram-Schmidt to ensure that orthonormality holds. Note that $AA^T$ and $A^TA$, being symmetric, are never defective; the shortfall comes from rank, not from missing eigenvectors.
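A sketch of this completion step for a rank-deficient example, using scipy.linalg.null_space to supply the orthonormal nullspace basis that Gram-Schmidt would otherwise produce:

```python
import numpy as np
from scipy.linalg import null_space

A = np.array([[1.0, 1.0],
              [1.0, 1.0],
              [0.0, 0.0]])  # rank r = 1, so u_1 alone does not span R^3

eigvals, V = np.linalg.eigh(A.T @ A)
order = np.argsort(eigvals)[::-1]
V = V[:, order]
# Clip tiny negative round-off before the square root.
sigma = np.sqrt(np.clip(eigvals[order], 0, None))
r = int(np.sum(sigma > 1e-10))

U_r = (A @ V[:, :r]) / sigma[:r]       # the r columns the relation provides
U = np.hstack([U_r, null_space(A.T)])  # pad with an orthonormal N(A^T) basis

assert np.allclose(U.T @ U, np.eye(3))  # U is orthogonal
```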
States that any linear transformation is composed of an isometric transformation ($V^T$), followed by a stretching transformation along orthogonal axes ($\Sigma$), followed by another isometric transformation ($U$).
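A short sketch of this three-step reading, checking that the first isometry preserves norms and that the composition reproduces $A\mathbf{x}$ (random square matrix chosen only for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))
x = rng.standard_normal(3)

U, sigma, Vt = np.linalg.svd(A)

y = Vt @ x   # isometry: the norm of x is preserved
assert np.isclose(np.linalg.norm(y), np.linalg.norm(x))
y = sigma * y  # stretch each coordinate by sigma_k
y = U @ y      # second isometry
assert np.allclose(y, A @ x)
```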