QR Decomposition#
If \(A\) is an \(m \times n\) matrix with \(\mathrm{rank}(A) = n\), then the decomposition \(A = QR\) provides orthonormal bases of both \(R(A)\) and \(R(A)^{\perp}\). In particular, if we write \(Q = [Q_1 \ Q_2]\), where \(Q_1\) consists of the first \(n\) columns of \(Q\), then the columns of \(Q_1\) provide an orthonormal basis of \(R(A)\) and the columns of \(Q_2\) provide an orthonormal basis of \(R(A)^{\perp}\).
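As a quick numerical illustration (using a small hypothetical matrix, not one from the text), the full QR decomposition splits into these two bases; `numpy.linalg.qr` with `mode='complete'` returns the full \(m \times m\) matrix \(Q\):

```python
import numpy as np

# Hypothetical 4x2 matrix with rank 2 (illustration only).
A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [0.0, 1.0],
              [1.0, 1.0]])

# mode='complete' returns the full m x m orthogonal Q.
Q, R = np.linalg.qr(A, mode='complete')
n = A.shape[1]
Q1, Q2 = Q[:, :n], Q[:, n:]   # bases of R(A) and R(A)-perp

# Columns of Q1 span R(A): projecting A onto them recovers A.
print(np.allclose(Q1 @ Q1.T @ A, A))   # True
# Columns of Q2 are orthogonal to every column of A.
print(np.allclose(Q2.T @ A, 0))        # True
```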
Orthogonal Matrices#
A square matrix \(A\) is orthogonal if \(A^TA = AA^T = I\).
The condition \(A^TA = I\) implies that the columns of \(A\) are orthonormal, and \(AA^T = I\) implies the rows of \(A\) are orthonormal.
If \(A\) is an orthogonal matrix, then \(\| A \boldsymbol{x} \| = \| \boldsymbol{x} \|\) for all \(\boldsymbol{x} \in \mathbb{R}^n\).
Proof. Compute

\[ \| A \boldsymbol{x} \|^2 = (A \boldsymbol{x})^T (A \boldsymbol{x}) = \boldsymbol{x}^T A^T A \boldsymbol{x} = \boldsymbol{x}^T \boldsymbol{x} = \| \boldsymbol{x} \|^2 \]
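The identity can be checked numerically with a rotation matrix (a standard example of an orthogonal matrix; the angle and test vector below are arbitrary):

```python
import numpy as np

# A plane rotation by angle t is orthogonal: Q^T Q = I.
t = 0.7
Q = np.array([[np.cos(t), -np.sin(t)],
              [np.sin(t),  np.cos(t)]])

rng = np.random.default_rng(0)
x = rng.standard_normal(2)

print(np.allclose(Q.T @ Q, np.eye(2)))                        # True
print(np.allclose(np.linalg.norm(Q @ x), np.linalg.norm(x)))  # True
```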
Rotations and reflections are examples of orthogonal matrices.
An orthogonal matrix and an orthogonal projector are not the same thing but they are related. If \(P\) is an orthogonal projector then \(Q = I - 2P\) is an orthogonal (and symmetric) matrix. In fact, if \(P\) projects onto a subspace \(U\) then \(Q\) is the reflection through \(U^{\perp}\).
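To see the relationship concretely, here is a sketch with a projector onto a one-dimensional subspace \(U = \mathrm{span}\{\boldsymbol{u}\}\) (the vector \(\boldsymbol{u}\) is a hypothetical example):

```python
import numpy as np

# Orthogonal projector onto U = span{u}, where u is a unit vector.
u = np.array([1.0, 2.0, 2.0]) / 3.0   # unit vector: ||u|| = 1
P = np.outer(u, u)                    # P = u u^T projects onto span{u}

Q = np.eye(3) - 2 * P                 # reflection through U-perp

print(np.allclose(Q @ Q.T, np.eye(3)))  # orthogonal: True
print(np.allclose(Q, Q.T))              # symmetric:  True
print(np.allclose(Q @ u, -u))           # u is flipped through U-perp: True
```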
QR by Gram-Schmidt#
Let \(A\) be an \(m \times n\) matrix with \(\mathrm{rank}(A) = n\) and let \(\boldsymbol{a}_1,\dots,\boldsymbol{a}_n\) be the columns of \(A\). Apply the Gram-Schmidt algorithm to the columns and construct an orthonormal basis \(\{ \boldsymbol{w}_1,\dots,\boldsymbol{w}_n \}\) of \(R(A)\). Project the columns onto the basis

\[ \boldsymbol{a}_k = \langle \boldsymbol{a}_k , \boldsymbol{w}_1 \rangle \boldsymbol{w}_1 + \cdots + \langle \boldsymbol{a}_k , \boldsymbol{w}_k \rangle \boldsymbol{w}_k \]

where \(\boldsymbol{a}_k \in \mathrm{span} \{ \boldsymbol{w}_1 , \dots , \boldsymbol{w}_k \}\) by construction. Write as matrix multiplication

\[ A = Q_1 R_1 \]

where

\[ Q_1 = \begin{bmatrix} \boldsymbol{w}_1 & \cdots & \boldsymbol{w}_n \end{bmatrix}
\hspace{5mm}
R_1 = \begin{bmatrix} \langle \boldsymbol{a}_1 , \boldsymbol{w}_1 \rangle & \langle \boldsymbol{a}_2 , \boldsymbol{w}_1 \rangle & \cdots & \langle \boldsymbol{a}_n , \boldsymbol{w}_1 \rangle \\ 0 & \langle \boldsymbol{a}_2 , \boldsymbol{w}_2 \rangle & \cdots & \langle \boldsymbol{a}_n , \boldsymbol{w}_2 \rangle \\ \vdots & & \ddots & \vdots \\ 0 & 0 & \cdots & \langle \boldsymbol{a}_n , \boldsymbol{w}_n \rangle \end{bmatrix} \]

Extend the basis to an orthonormal basis \(\{ \boldsymbol{w}_1 , \dots , \boldsymbol{w}_n , \boldsymbol{w}_{n+1} , \dots , \boldsymbol{w}_m \}\) of \(\mathbb{R}^m\) where \(\{ \boldsymbol{w}_{n+1} , \dots , \boldsymbol{w}_m \}\) is any orthonormal basis of the orthogonal complement \(R(A)^{\perp}\) and let

\[ Q_2 = \begin{bmatrix} \boldsymbol{w}_{n+1} & \cdots & \boldsymbol{w}_m \end{bmatrix} \]

Finally, the QR decomposition of \(A\) is

\[ A = QR
\hspace{5mm}
Q = \begin{bmatrix} Q_1 & Q_2 \end{bmatrix}
\hspace{5mm}
R = \begin{bmatrix} R_1 \\ 0 \end{bmatrix} \]

where \(Q\) is an \(m \times m\) orthogonal matrix and \(R\) is an \(m \times n\) upper triangular matrix. The decomposition \(A = Q_1 R_1\) is called the thin QR decomposition. See Wikipedia:QR decomposition.
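The construction above can be sketched as a short function. This is a minimal implementation of the thin QR decomposition by classical Gram-Schmidt, assuming \(\mathrm{rank}(A) = n\); the test matrix is a hypothetical example, not one from the text:

```python
import numpy as np

def qr_gram_schmidt(A):
    """Thin QR by classical Gram-Schmidt (assumes rank(A) = n)."""
    m, n = A.shape
    Q1 = np.zeros((m, n))
    R1 = np.zeros((n, n))
    for k in range(n):
        w = A[:, k].copy()
        for i in range(k):
            R1[i, k] = Q1[:, i] @ A[:, k]   # entry <a_k, w_i>
            w -= R1[i, k] * Q1[:, i]        # subtract the projection
        R1[k, k] = np.linalg.norm(w)
        Q1[:, k] = w / R1[k, k]             # normalize
    return Q1, R1

# Hypothetical full-rank test matrix.
A = np.array([[1.0, 1.0], [1.0, 0.0], [0.0, 1.0]])
Q1, R1 = qr_gram_schmidt(A)
print(np.allclose(Q1 @ R1, A))             # True
print(np.allclose(Q1.T @ Q1, np.eye(2)))   # True
print(np.allclose(R1, np.triu(R1)))        # upper triangular: True
```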
Compute the QR decomposition for the matrix
Apply Gram-Schmidt to find an orthonormal basis of the column space
Therefore
and
The Gram-Schmidt algorithm shows that the QR decomposition exists, but it is not the most efficient way to compute it, and it suffers from loss of orthogonality in floating-point arithmetic. Software such as the MATLAB function qr (see documentation), the SciPy function scipy.linalg.qr (see documentation), and LAPACK (see documentation) use elementary reflectors to construct the matrices \(Q\) and \(R\).
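In SciPy, the `mode` parameter of `scipy.linalg.qr` selects between the full and thin decompositions; a quick sketch with an arbitrary example matrix:

```python
import numpy as np
from scipy.linalg import qr

A = np.array([[1.0, 2.0], [0.0, 1.0], [1.0, 0.0]])

# Full QR: Q is 3x3 orthogonal, R is 3x2 with a zero bottom row.
Q, R = qr(A)
# Thin QR: Q1 is 3x2 with orthonormal columns, R1 is 2x2.
Q1, R1 = qr(A, mode='economic')

print(Q.shape, R.shape)         # (3, 3) (3, 2)
print(Q1.shape, R1.shape)       # (3, 2) (2, 2)
print(np.allclose(Q1 @ R1, A))  # True
```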
Orthogonal Projection by QR#
Let \(A\) be an \(m \times n\) matrix such that \(\mathrm{rank}(A) = n\), and let \(A = QR\) be the QR decomposition with \(Q = [Q_1 \ Q_2]\) as above. The projection of \(\boldsymbol{x} \in \mathbb{R}^m\) onto \(R(A)\) is given by

\[ Q_1 Q_1^T \boldsymbol{x} \]

and the projection onto \(R(A)^{\perp}\) is given by

\[ Q_2 Q_2^T \boldsymbol{x} = \boldsymbol{x} - Q_1 Q_1^T \boldsymbol{x} \]
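These two formulas can be sketched numerically; the matrix and vector below are hypothetical examples, and the thin QR comes from `scipy.linalg.qr` with `mode='economic'`:

```python
import numpy as np
from scipy.linalg import qr

# Hypothetical example: project x onto R(A) and R(A)-perp.
A = np.array([[1.0, 0.0], [1.0, 1.0], [0.0, 1.0]])
x = np.array([1.0, 2.0, 3.0])

Q1, R1 = qr(A, mode='economic')   # thin QR
x_proj = Q1 @ (Q1.T @ x)          # projection onto R(A)
x_perp = x - x_proj               # projection onto R(A)-perp

# The complementary piece is orthogonal to the column space,
# and the two projections sum back to x.
print(np.allclose(A.T @ x_perp, 0))     # True
print(np.allclose(x_proj + x_perp, x))  # True
```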
Exercises#
Compute the thin QR decomposition of the matrix
Solution
Apply Gram-Schmidt to the columns of \(A\)
Therefore
and