Matrix of the Orthogonal Projection

The minimization problem stated above arises in many applications, so it is very useful to have the matrix of the orthogonal projection with respect to a given basis.

To this end, let $ W$ be a $ k$ -dimensional subspace of $ {\mathbb{R}}^n$ with $ W^{\perp}$ as its orthogonal complement. Let $ P_W : {\mathbb{R}}^n \longrightarrow {\mathbb{R}}^n$ be the orthogonal projection of $ {\mathbb{R}}^n$ onto $ W$ . Suppose we are given an orthonormal ordered basis $ {\cal B}= ({\mathbf v}_1, {\mathbf v}_2, \ldots, {\mathbf v}_k)$ of $ W.$ Under the assumption that $ {\cal B}$ is known, we explicitly compute the matrix of $ P_W$ with respect to an extended ordered basis of $ {\mathbb{R}}^n.$

Let us extend the given ordered orthonormal basis $ {\cal B}$ of $ W$ to get an orthonormal ordered basis
$ {\cal B}_1 = ({\mathbf v}_1, {\mathbf v}_2, \ldots, {\mathbf v}_k, {\mathbf v}_{k+1}, \ldots, {\mathbf v}_n)$ of $ {\mathbb{R}}^n.$ Then by Theorem 5.1.12, for any $ {\mathbf v}\in {\mathbb{R}}^n, \; {\mathbf v}= \sum\limits_{i=1}^n \langle {\mathbf v}, {\mathbf v}_i\rangle {\mathbf v}_i.$ Thus, by definition, $ P_W({\mathbf v}) = \sum\limits_{i=1}^k \langle {\mathbf v}, {\mathbf v}_i\rangle {\mathbf v}_i.$ Let $ A = [ {\mathbf v}_1, {\mathbf v}_2, \ldots, {\mathbf v}_k]$ be the $ n \times k$ matrix whose columns are the vectors of $ {\cal B}.$ Consider the standard orthonormal ordered basis $ {\cal B}_2 = ({\mathbf e}_1, {\mathbf e}_2, \ldots, {\mathbf e}_n)$ of $ {\mathbb{R}}^n.$ Therefore, if $ {\mathbf v}_i = \sum\limits_{j=1}^n a_{ji} {\mathbf e}_j,$ for $ 1 \leq i \leq n,$ then

$\displaystyle A = \begin{bmatrix} a_{11} & a_{12} & \cdots & a_{1k} \\ a_{21} & a_{22} & \cdots & a_{2k} \\ \vdots & \vdots & \ddots & \vdots \\ a_{n1} & a_{n2} & \cdots & a_{nk} \end{bmatrix}, \;\; [{\mathbf v}]_{{\cal B}_2} = \begin{bmatrix} \sum\limits_{i=1}^n a_{1i} \langle {\mathbf v}, {\mathbf v}_i\rangle \\ \vdots \\ \sum\limits_{i=1}^n a_{ni} \langle {\mathbf v}, {\mathbf v}_i\rangle \end{bmatrix}
$

and

$\displaystyle [P_W({\mathbf v})]_{{\cal B}_2} = \begin{bmatrix} \sum\limits_{i=1}^k a_{1i} \langle {\mathbf v}, {\mathbf v}_i\rangle \\ \vdots \\ \sum\limits_{i=1}^k a_{ni} \langle {\mathbf v}, {\mathbf v}_i\rangle \end{bmatrix}.$

Then, as observed in Remark 5.2.3.4, $ A^t A = I_k.$ More generally, since $ {\cal B}_1$ is an orthonormal set, for $ 1 \leq i, j \leq n,$

$\displaystyle \sum_{s=1}^n a_{si} a_{sj} = \left\{\begin{array}{cc} 1 & {\mbox{ if }} i = j \\ 0 & {\mbox{ if }} \; i \neq j. \end{array}\right.$ (5.3.1)
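Equation (5.3.1) just says that the columns of $ A$ (and, more generally, the vectors of $ {\cal B}_1$ ) are orthonormal, which is easy to check numerically. The following is a minimal sketch, assuming NumPy, with an orthonormal pair in $ {\mathbb{R}}^3$ made up for illustration:

```python
import numpy as np

# Columns of A form an orthonormal set in R^3 (n = 3, k = 2).
v1 = np.array([1.0, 1.0, 0.0]) / np.sqrt(2)
v2 = np.array([0.0, 0.0, 1.0])
A = np.column_stack([v1, v2])            # n x k matrix

# Entry (i, j) of A^t A equals sum_s a_{si} a_{sj}, so (5.3.1) says A^t A = I_k.
print(np.allclose(A.T @ A, np.eye(2)))   # True
```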

Thus, using the associativity of matrix multiplication and (5.3.1), we get
$\displaystyle \begin{aligned} (A A^t)({\mathbf v}) &= A \begin{bmatrix} a_{11} & a_{21} & \cdots & a_{n1} \\ a_{12} & a_{22} & \cdots & a_{n2} \\ \vdots & \vdots & \ddots & \vdots \\ a_{1k} & a_{2k} & \cdots & a_{nk} \end{bmatrix} \begin{bmatrix} \sum\limits_{i=1}^n a_{1i} \langle {\mathbf v}, {\mathbf v}_i\rangle \\ \vdots \\ \sum\limits_{i=1}^n a_{ni} \langle {\mathbf v}, {\mathbf v}_i\rangle \end{bmatrix} = A \begin{bmatrix} \sum\limits_{i=1}^n \left( \sum\limits_{s=1}^n a_{s1} a_{si} \right) \langle {\mathbf v}, {\mathbf v}_i \rangle \\ \vdots \\ \sum\limits_{i=1}^n \left( \sum\limits_{s=1}^n a_{sk} a_{si} \right) \langle {\mathbf v}, {\mathbf v}_i \rangle \end{bmatrix} \\ &= A \begin{bmatrix} \langle {\mathbf v}, {\mathbf v}_1\rangle \\ \vdots \\ \langle {\mathbf v}, {\mathbf v}_k\rangle \end{bmatrix} = \begin{bmatrix} \sum\limits_{i=1}^k a_{1i} \langle {\mathbf v}, {\mathbf v}_i\rangle \\ \vdots \\ \sum\limits_{i=1}^k a_{ni} \langle {\mathbf v}, {\mathbf v}_i\rangle \end{bmatrix} = [P_W({\mathbf v})]_{{\cal B}_2}. \end{aligned}$

Thus $ P_W[{\cal B}_2, {\cal B}_2] = A A^t.$ Hence, we have proved the following theorem.

THEOREM 5.3.15   Let $ W$ be a $ k$ -dimensional subspace of $ {\mathbb{R}}^n$ and let $ P_W : {\mathbb{R}}^n \longrightarrow {\mathbb{R}}^n$ be the orthogonal projection of $ {\mathbb{R}}^n$ onto $ W$ along $ W^{\perp}.$ Suppose $ {\cal B}= ({\mathbf v}_1, {\mathbf v}_2, \ldots, {\mathbf v}_k)$ is an orthonormal ordered basis of $ W.$ Define the $ n \times k$ matrix $ A = [ {\mathbf v}_1, {\mathbf v}_2, \ldots, {\mathbf v}_k].$ Then the matrix of the linear transformation $ P_W$ in the standard orthonormal ordered basis $ ({\mathbf e}_1, {\mathbf e}_2, \ldots, {\mathbf e}_n)$ is $ A A^t.$
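The theorem translates directly into computation. The sketch below, assuming NumPy (the spanning set $ S$ is an arbitrary choice for illustration), produces an orthonormal basis of $ W$ by a QR factorization, forms $ A A^t,$ and verifies that the resulting matrix behaves like the orthogonal projection onto $ W$ :

```python
import numpy as np

# W = column space of S, a 2-dimensional subspace of R^4.
S = np.array([[1.0, 2.0],
              [1.0, 0.0],
              [0.0, 1.0],
              [0.0, 1.0]])

# QR factorization gives an n x k matrix A with orthonormal columns spanning W.
A, _ = np.linalg.qr(S)
P = A @ A.T                                  # matrix of P_W in the standard basis

print(np.allclose(P, P.T))                   # P is symmetric
print(np.allclose(P @ P, P))                 # P is idempotent: P^2 = P
print(np.allclose(P @ S, S))                 # P fixes every vector in W

v = np.array([1.0, 2.0, 3.0, 4.0])
print(np.allclose(A.T @ (v - P @ v), 0.0))   # v - P_W(v) is orthogonal to W
```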

EXAMPLE 5.3.16   Let $ W = \{(x,y,z,w) \in {\mathbb{R}}^4 : x = y, z = w \}$ be a subspace of $ {\mathbb{R}}^4.$ Then an orthonormal ordered basis of $ W$ is

$\displaystyle \bigl( \frac{1}{\sqrt{2}} (1,1,0,0), \frac{1}{\sqrt{2}}(0,0,1,1) \bigr),$

and that of $ W^{\perp}$ is

$\displaystyle \bigl( \frac{1}{\sqrt{2}} (1,-1,0,0), \frac{1}{\sqrt{2}}(0,0,1,-1) \bigr).$

Therefore, if $ P_W : {\mathbb{R}}^4 \longrightarrow {\mathbb{R}}^4$ is the orthogonal projection of $ {\mathbb{R}}^4$ onto $ W$ along $ W^{\perp},$ then the corresponding matrix $ A$ is given by

$\displaystyle A = \begin{bmatrix}\frac{1}{\sqrt{2}} & 0 \\ \frac{1}{\sqrt{2}} & 0 \\
0 & \frac{1}{\sqrt{2}} \\ 0 & \frac{1}{\sqrt{2}} \end{bmatrix}.$

Hence, by Theorem 5.3.15, the matrix of the orthogonal projection $ P_W$ in the standard ordered basis $ {\cal B}_2 = ({\mathbf e}_1, {\mathbf e}_2, {\mathbf e}_3, {\mathbf e}_4)$ of $ {\mathbb{R}}^4$ is

$\displaystyle P_W[{\cal B}_2,{\cal B}_2] = A A^t = \begin{bmatrix} \frac{1}{2} & \frac{1}{2} & 0 & 0 \\ \frac{1}{2} & \frac{1}{2} & 0 & 0 \\ 0 & 0 & \frac{1}{2} & \frac{1}{2} \\ 0 & 0 & \frac{1}{2} & \frac{1}{2} \end{bmatrix}.$

It is easy to see that
  1. the matrix $ P_W[{\cal B}_2,{\cal B}_2]$ is symmetric,
  2. $ P_W[{\cal B}_2,{\cal B}_2]^2 = P_W[{\cal B}_2,{\cal B}_2],$ and
  3. $ \bigl( I_4 - P_W[{\cal B}_2,{\cal B}_2] \bigr) P_W[{\cal B}_2, {\cal B}_2] = {\mathbf 0}= P_W[{\cal B}_2,{\cal B}_2] \bigl( I_4 - P_W[{\cal B}_2,{\cal B}_2] \bigr).$
Also, consider the orthonormal ordered basis $ {\cal B}= \bigl( \frac{1}{\sqrt{2}} (1,1,0,0), \frac{1}{\sqrt{2}}(0,0,1,1), \frac{1}{\sqrt{2}} (1,-1,0,0), \frac{1}{\sqrt{2}}(0,0,1,-1) \bigr)$ of $ {\mathbb{R}}^4.$ Then for any $ (x,y,z,w) \in {\mathbb{R}}^4,$ we have

$\displaystyle [(x,y,z,w)]_{{\cal B}} = \left(\frac{x+y}{\sqrt{2}}, \frac{z+w}{\sqrt{2}}, \frac{x-y}{\sqrt{2}},\frac{z-w}{\sqrt{2}}\right)^t.$

Thus, since $ P_W$ retains the components along $ W$ and annihilates those along $ W^{\perp},$ we get $ P_W \bigl( (x,y,z,w) \bigr) = \displaystyle\frac{x+y}{2}(1,1,0,0) + \frac{z+w}{2}(0,0,1,1),$ and this is the vector in $ W$ closest to $ (x,y,z,w),$ for any $ (x,y,z,w) \in {\mathbb{R}}^4.$
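As a quick cross-check of this example, here is a sketch, assuming NumPy, that forms $ A A^t$ for the basis above and compares it with the closed-form expression for $ P_W \bigl( (x,y,z,w) \bigr)$ at one made-up point:

```python
import numpy as np

# Orthonormal basis of W = {(x, y, z, w) : x = y, z = w}.
A = np.column_stack([np.array([1.0, 1.0, 0.0, 0.0]) / np.sqrt(2),
                     np.array([0.0, 0.0, 1.0, 1.0]) / np.sqrt(2)])
P = A @ A.T                            # 4 x 4 matrix of P_W in the standard basis

x, y, z, w = 3.0, -1.0, 2.0, 6.0
v = np.array([x, y, z, w])
expected = (x + y) / 2 * np.array([1.0, 1.0, 0.0, 0.0]) \
         + (z + w) / 2 * np.array([0.0, 0.0, 1.0, 1.0])
print(np.allclose(P @ v, expected))    # True
```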

EXERCISE 5.3.17  
  1. Show that for any non-zero vector $ {\mathbf v}\in {\mathbb{R}}^n,$ the rank of the $ n \times n$ matrix $ {\mathbf v}{\mathbf v}^t$ is $ 1.$
  2. Let $ W$ be a subspace of a finite dimensional inner product space $ V$ and let $ P : V \longrightarrow V$ be the orthogonal projection of $ V$ onto $ W$ along $ W^{\perp}.$ Let $ {\cal B}$ be an orthonormal ordered basis of $ V.$ Then prove that the corresponding matrix satisfies $ P[{\cal B},{\cal B}]^t = P[{\cal B},{\cal B}].$
  3. Let $ A$ be an $ n \times n$ matrix with $ A^2 = A$ and $ A^t = A.$ Consider the associated linear transformation $ T_A : {\mathbb{R}}^n \longrightarrow {\mathbb{R}}^n$ defined by $ T_A({\mathbf v}) = A {\mathbf v}$ for all $ {\mathbf v}\in {\mathbb{R}}^n.$ Then prove that there exists a subspace $ W$ of $ {\mathbb{R}}^n$ such that $ T_A$ is the orthogonal projection of $ {\mathbb{R}}^n$ onto $ W$ along $ W^{\perp}.$
  4. Let $ W_1$ and $ W_2$ be two distinct subspaces of a finite dimensional vector space $ V.$ Let $ P_{W_1}$ and $ P_{W_2}$ be the corresponding orthogonal projection operators of $ V$ along $ W_1^{\perp}$ and $ W_2^{\perp},$ respectively. Then by constructing an example in $ {\mathbb{R}}^2,$ show that the map $ P_{W_1} \circ P_{W_2}$ is a projection but not an orthogonal projection.
  5. Let $ W$ be an $ (n-1)$ -dimensional vector subspace of $ {\mathbb{R}}^n$ and let $ W^{\perp}$ be its orthogonal complement. Let $ {\cal B}= ({\mathbf v}_1, {\mathbf v}_2, \ldots, {\mathbf v}_{n-1}, {\mathbf v}_n)$ be an orthogonal ordered basis of $ {\mathbb{R}}^n$ with $ ( {\mathbf v}_1, {\mathbf v}_2, \ldots, {\mathbf v}_{n-1})$ an ordered basis of $ W.$ Define a map

    $\displaystyle T: {\mathbb{R}}^n \longrightarrow {\mathbb{R}}^n \;\; {\mbox{ by }}
T({\mathbf v}) = {\mathbf w}_0 - {\mathbf w}$

    whenever $ {\mathbf v}= {\mathbf w}+ {\mathbf w}_0$ for some $ {\mathbf w}\in W$ and $ {\mathbf w}_0 \in W^{\perp}.$ Then
    1. prove that $ T$ is a linear transformation,
    2. find the matrix, $ T[{\cal B},{\cal B}],$ and
    3. prove that $ T[{\cal B}, {\cal B}]$ is an orthogonal matrix.
    $ T$ is called the reflection along $ W^{\perp}.$
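For part 5, the following numerical sketch may help fix ideas. It assumes NumPy and takes $ n = 3$ with $ W^{\perp}$ spanned by a unit vector $ {\mathbf v}_3,$ a made-up instance; since the definition $ T({\mathbf v}) = {\mathbf w}_0 - {\mathbf w}$ means $ T = P_{W^{\perp}} - P_W = 2 P_{W^{\perp}} - I_3,$ and here $ P_{W^{\perp}} = {\mathbf v}_3 {\mathbf v}_3^t,$ the reflection can be tested directly:

```python
import numpy as np

# W^perp = span{v3} in R^3, with v3 a unit vector; W is the plane orthogonal to v3.
v3 = np.array([1.0, 1.0, 1.0]) / np.sqrt(3)
T = 2 * np.outer(v3, v3) - np.eye(3)      # T = 2 P_{W^perp} - I_3

print(np.allclose(T @ v3, v3))            # T fixes W^perp
w = np.array([1.0, -1.0, 0.0])            # w lies in W, since <w, v3> = 0
print(np.allclose(T @ w, -w))             # T reverses every vector of W
print(np.allclose(T.T @ T, np.eye(3)))    # the matrix of T is orthogonal
```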
