In this section, we look at some special classes of square matrices which are diagonalisable. We will also be dealing with matrices having complex entries, and hence for a square matrix $A$ with complex entries recall the following definitions:
- $A$ is called Hermitian if $A^* = A$,
- $A$ is called skew-Hermitian if $A^* = -A$,
- $A$ is called unitary if $A A^* = A^* A = I$,
- $A$ is called normal if $A A^* = A^* A$.
Note that a real symmetric matrix is always Hermitian, a real skew-symmetric matrix is always skew-Hermitian and a real orthogonal matrix is always unitary. Each of these matrices is normal. If $A$ is a unitary matrix then $A^{-1} = A^*$.
EXAMPLE 6.3.2
- Let $B$ be as given. Then $B$ is skew-Hermitian.
- Let $A$ and $B$ be as given. Then $A$ is a unitary matrix and $B$ is a normal matrix. Note that $A$, being unitary, is also a normal matrix.
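The defining conditions above are easy to test numerically. The following is a minimal numpy sketch; the matrix used here is a hypothetical illustration and is not the matrix of Example 6.3.2.

```python
import numpy as np

# A hypothetical 2x2 complex matrix, used only for illustration.
A = np.array([[2, 1 + 1j],
              [1 - 1j, 3]])

def is_hermitian(M):
    # Hermitian: M* = M (conjugate transpose equals the matrix).
    return np.allclose(M.conj().T, M)

def is_skew_hermitian(M):
    # Skew-Hermitian: M* = -M.
    return np.allclose(M.conj().T, -M)

def is_unitary(M):
    # Unitary: M* M = I.
    return np.allclose(M.conj().T @ M, np.eye(M.shape[0]))

def is_normal(M):
    # Normal: M M* = M* M.
    return np.allclose(M @ M.conj().T, M.conj().T @ M)

print(is_hermitian(A), is_skew_hermitian(A), is_unitary(A), is_normal(A))
# A Hermitian matrix is automatically normal, so this prints: True False False True
```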
EXERCISE 6.3.4
- Let $A$ be a square matrix such that $U^* A U$ is a diagonal matrix for some unitary matrix $U$. Prove that $A$ is a normal matrix.
- Let $A$ be any square matrix. Then $A = \frac{1}{2}(A + A^*) + \frac{1}{2}(A - A^*)$, where $\frac{1}{2}(A + A^*)$ is the Hermitian part of $A$ and $\frac{1}{2}(A - A^*)$ is the skew-Hermitian part of $A$ (a numerical check of these decompositions is sketched after this exercise).
- Every matrix $A$ can be uniquely expressed as $A = S + iT$, where both $S$ and $T$ are Hermitian matrices.
- Show that $A - A^*$ is always skew-Hermitian.
- Does there exist a unitary matrix $U$ such that $U^{-1} A U = B$ for the given matrices $A$ and $B$?
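The decompositions in the second and third parts can be verified numerically. Below is a minimal sketch, assuming numpy and a randomly generated complex matrix (the matrix is hypothetical, not one from the exercises).

```python
import numpy as np

rng = np.random.default_rng(0)
# A hypothetical 3x3 complex matrix, only for illustration.
A = rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3))

H = (A + A.conj().T) / 2        # Hermitian part of A
K = (A - A.conj().T) / 2        # skew-Hermitian part of A
S = H                           # S in the decomposition A = S + iT
T = K / 1j                      # T = (A - A*)/(2i), which is Hermitian

print(np.allclose(A, H + K))                     # A = Hermitian part + skew-Hermitian part
print(np.allclose(H.conj().T, H))                # H is Hermitian
print(np.allclose(K.conj().T, -K))               # K is skew-Hermitian
print(np.allclose(A, S + 1j * T))                # A = S + iT
print(np.allclose(S.conj().T, S), np.allclose(T.conj().T, T))   # S and T are Hermitian
```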
PROPOSITION 6.3.5
Let $A$ be a Hermitian matrix. Then all the eigenvalues of $A$ are real.

Proof.
Let $(\lambda, \mathbf{x})$ be an eigenpair. Then $A \mathbf{x} = \lambda \mathbf{x}$ and $A = A^*$ imply
$$\overline{\lambda}\, \mathbf{x}^* \mathbf{x} = (\lambda \mathbf{x})^* \mathbf{x} = (A \mathbf{x})^* \mathbf{x} = \mathbf{x}^* A^* \mathbf{x} = \mathbf{x}^* A \mathbf{x} = \mathbf{x}^* (\lambda \mathbf{x}) = \lambda\, \mathbf{x}^* \mathbf{x}.$$
Hence $(\lambda - \overline{\lambda})\, \mathbf{x}^* \mathbf{x} = 0$. But $\mathbf{x}$ is an eigenvector and hence $\mathbf{x} \neq \mathbf{0}$, and so the real number $\Vert \mathbf{x} \Vert^2 = \mathbf{x}^* \mathbf{x}$ is non-zero as well. Thus $\lambda = \overline{\lambda}$. That is, $\lambda$ is a real number. $\blacksquare$
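A quick numerical sanity check of the proposition, using a randomly generated Hermitian matrix (the construction below is an assumption of this sketch, not part of the text):

```python
import numpy as np

rng = np.random.default_rng(1)
B = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
A = B + B.conj().T                     # A is Hermitian by construction

eigvals = np.linalg.eigvals(A)
# Proposition 6.3.5 predicts real eigenvalues; the imaginary parts vanish up to round-off.
print(np.max(np.abs(eigvals.imag)) < 1e-10)      # True
```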
THEOREM 6.3.6
Let $A$ be a Hermitian matrix. Then there exists a unitary matrix $U$ such that $U^* A U$ is a diagonal matrix; that is, $A$ is unitarily diagonalisable.

Proof.
We will prove the result by induction on the size of the matrix. The result is clearly true if $n = 1$. Let the result be true for $n = k-1$; we will prove the result in the case $n = k$.

So, let $A$ be a $k \times k$ matrix and let $(\lambda_1, \mathbf{x})$ be an eigenpair of $A$ with $\Vert \mathbf{x} \Vert = 1$. We now extend the linearly independent set $\{ \mathbf{x} \}$ to form an orthonormal basis $\{\mathbf{x}, \mathbf{u}_2, \mathbf{u}_3, \ldots, \mathbf{u}_k \}$ (using Gram-Schmidt Orthogonalisation) of $\mathbb{C}^k$. As $\{\mathbf{x}, \mathbf{u}_2, \ldots, \mathbf{u}_k \}$ is an orthonormal set, $\mathbf{u}_i^* \mathbf{x} = 0$ for all $2 \leq i \leq k$. Therefore, observe that for all $i$, $2 \leq i \leq k$,
$$(A \mathbf{u}_i)^* \mathbf{x} = \mathbf{u}_i^* A^* \mathbf{x} = \mathbf{u}_i^* (A \mathbf{x}) = \mathbf{u}_i^* (\lambda_1 \mathbf{x}) = \lambda_1 (\mathbf{u}_i^* \mathbf{x}) = 0.$$
Hence, we also have $\mathbf{x}^* (A \mathbf{u}_i) = 0$ for $2 \leq i \leq k$. Now, define $U_1 = [ \mathbf{x}, \; \mathbf{u}_2, \; \cdots, \mathbf{u}_k ]$ (with $\mathbf{x}, \mathbf{u}_2, \ldots, \mathbf{u}_k$ as columns of $U_1$). Then the matrix $U_1$ is a unitary matrix and
$$U_1^* A U_1 = \begin{bmatrix} \lambda_1 & \mathbf{0} \\ \mathbf{0} & B \end{bmatrix},$$
where $B$ is a $(k-1) \times (k-1)$ matrix. As $A^* = A$, we get $(U_1^* A U_1)^* = U_1^* A U_1$. This condition, together with the fact that $\lambda_1$ is a real number (use Proposition 6.3.5), implies that $B^* = B$. That is, $B$ is also a Hermitian matrix. Therefore, by the induction hypothesis there exists a $(k-1) \times (k-1)$ unitary matrix $U_2$ such that
$$U_2^* B U_2 = \mathrm{diag}(\lambda_2, \ldots, \lambda_k).$$
Recall that the entries $\lambda_i$, for $2 \leq i \leq k$, are the eigenvalues of the matrix $B$. We also know that two similar matrices have the same set of eigenvalues. Hence, the eigenvalues of $A$ are $\lambda_1, \lambda_2, \ldots, \lambda_k$. Define $U = U_1 \begin{bmatrix} 1 & \mathbf{0} \\ \mathbf{0} & U_2 \end{bmatrix}$. Then $U$ is a unitary matrix and
$$U^* A U = \begin{bmatrix} 1 & \mathbf{0} \\ \mathbf{0} & U_2^* \end{bmatrix} (U_1^* A U_1) \begin{bmatrix} 1 & \mathbf{0} \\ \mathbf{0} & U_2 \end{bmatrix} = \begin{bmatrix} 1 & \mathbf{0} \\ \mathbf{0} & U_2^* \end{bmatrix} \begin{bmatrix} \lambda_1 & \mathbf{0} \\ \mathbf{0} & B \end{bmatrix} \begin{bmatrix} 1 & \mathbf{0} \\ \mathbf{0} & U_2 \end{bmatrix} = \begin{bmatrix} \lambda_1 & \mathbf{0} \\ \mathbf{0} & U_2^* B U_2 \end{bmatrix}.$$
Thus, $U^* A U$ is a diagonal matrix with diagonal entries $\lambda_1, \lambda_2, \ldots, \lambda_k$, the eigenvalues of $A$. Hence, the result follows. $\blacksquare$
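Numerically, the conclusion of this theorem corresponds to what `numpy.linalg.eigh` computes for a Hermitian matrix: a unitary matrix of eigenvectors that diagonalises $A$. A minimal sketch, again with a randomly generated Hermitian matrix:

```python
import numpy as np

rng = np.random.default_rng(2)
B = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
A = (B + B.conj().T) / 2                  # a Hermitian matrix

eigvals, U = np.linalg.eigh(A)            # columns of U are orthonormal eigenvectors of A
D = U.conj().T @ A @ U                    # U* A U

print(np.allclose(U.conj().T @ U, np.eye(4)))        # U is unitary
print(np.allclose(D, np.diag(eigvals)))              # U* A U is diagonal with the eigenvalues on the diagonal
```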
COROLLARY 6.3.7
Let $A$ be an $n \times n$ real symmetric matrix. Then
- the eigenvalues of $A$ are all real,
- the corresponding eigenvectors can be chosen to have real entries, and
- the eigenvectors also form an orthonormal basis of $\mathbb{R}^n$.
Proof.
As $A$ is symmetric, $A$ is also a Hermitian matrix. Hence, by Proposition 6.3.5, the eigenvalues of $A$ are all real.

Let $(\lambda, \mathbf{x})$ be an eigenpair of $A$ with $\mathbf{x}^t \in \mathbb{C}^n$. Then there exist $\mathbf{y}^t, \mathbf{z}^t \in \mathbb{R}^n$ such that $\mathbf{x} = \mathbf{y} + i \mathbf{z}$. So,
$$\lambda \mathbf{y} + i \lambda \mathbf{z} = \lambda \mathbf{x} = A \mathbf{x} = A(\mathbf{y} + i \mathbf{z}) = A \mathbf{y} + i\, A \mathbf{z}.$$
Comparing the real and imaginary parts, we get $A \mathbf{y} = \lambda \mathbf{y}$ and $A \mathbf{z} = \lambda \mathbf{z}$. Thus, we can choose the eigenvectors to have real entries.

To prove the orthonormality of the eigenvectors, we proceed along the lines of the proof of Theorem 6.3.6. Hence, the readers are advised to complete the proof. $\blacksquare$
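For a real symmetric matrix, `numpy.linalg.eigh` stays entirely within the reals, which matches the three assertions of the corollary. A minimal sketch with a randomly generated symmetric matrix:

```python
import numpy as np

rng = np.random.default_rng(3)
M = rng.standard_normal((4, 4))
A = (M + M.T) / 2                          # a real symmetric matrix

eigvals, Q = np.linalg.eigh(A)             # real eigenvalues and real eigenvectors
print(eigvals.dtype, Q.dtype)              # float64 float64: everything stays real
print(np.allclose(Q.T @ Q, np.eye(4)))     # the eigenvectors form an orthonormal basis of R^n
print(np.allclose(A @ Q, Q @ np.diag(eigvals)))   # A Q = Q diag(eigenvalues)
```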
Remark 6.3.9
In the previous exercise, we saw that the matrices $A$ and $B$ are similar but not unitarily equivalent, whereas unitary equivalence implies similarity, since $U^* = U^{-1}$ for every unitary matrix $U$. But in numerical calculations, unitary transformations are preferred over similarity transformations. The main reasons are:
- Exercise 6.3.8.2 implies that an orthonormal change of basis leaves unchanged the sum of squares of the absolute values of the entries, which need not be true under a non-orthonormal change of basis (a numerical sketch follows this remark).
- As $U^* = U^{-1}$ for a unitary matrix $U$, unitary equivalence is computationally simpler.
- Also, forming the conjugate transpose (as opposed to computing an inverse) involves no loss of accuracy due to round-off errors.
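To illustrate the first point, the sketch below compares the sum of squares of the absolute values of the entries (the squared Frobenius norm) before and after a unitary change of basis and a non-orthonormal one. The matrices are randomly generated and purely illustrative.

```python
import numpy as np

rng = np.random.default_rng(4)
A = rng.standard_normal((3, 3))

# A random orthogonal (hence unitary) matrix from a QR factorisation.
Q, _ = np.linalg.qr(rng.standard_normal((3, 3)))
# A random invertible, but non-orthogonal, matrix.
S = rng.standard_normal((3, 3)) + 3 * np.eye(3)

def sum_sq(M):
    # Sum of squares of the absolute values of the entries.
    return np.sum(np.abs(M) ** 2)

print(np.isclose(sum_sq(Q.T @ A @ Q), sum_sq(A)))               # True: preserved under unitary equivalence
print(np.isclose(sum_sq(np.linalg.inv(S) @ A @ S), sum_sq(A)))  # in general False for a non-orthonormal change of basis
```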
We next prove Schur's Lemma and use it to show that normal matrices are unitarily diagonalisable.
LEMMA 6.3.10 (Schur's Lemma)
Every $n \times n$ complex matrix is unitarily similar to an upper triangular matrix.
Proof.
We will prove the result by induction on the size of the matrix. The result is clearly true if $n = 1$. Let the result be true for $n = k-1$; we will prove the result in the case $n = k$.

So, let $A$ be a $k \times k$ matrix and let $(\lambda_1, \mathbf{x})$ be an eigenpair of $A$ with $\Vert \mathbf{x} \Vert = 1$. Then the linearly independent set $\{ \mathbf{x} \}$ can be extended, using the Gram-Schmidt Orthogonalisation process, to get an orthonormal basis $\{\mathbf{x}, \mathbf{u}_2, \mathbf{u}_3, \ldots, \mathbf{u}_k \}$ of $\mathbb{C}^k$. Then $U_1 = [ \mathbf{x} \; \mathbf{u}_2 \; \cdots \; \mathbf{u}_k ]$ (with $\mathbf{x}, \mathbf{u}_2, \ldots, \mathbf{u}_k$ as the columns of the matrix $U_1$) is a unitary matrix and
$$U_1^* A U_1 = \begin{bmatrix} \lambda_1 & * \\ \mathbf{0} & B \end{bmatrix},$$
where $*$ denotes a row vector of length $k-1$ and $B$ is a $(k-1) \times (k-1)$ matrix. By the induction hypothesis there exists a $(k-1) \times (k-1)$ unitary matrix $U_2$ such that $U_2^* B U_2$ is an upper triangular matrix with diagonal entries $\lambda_2, \ldots, \lambda_k$, the eigenvalues of the matrix $B$. Observe that since the eigenvalues of $B$ are $\lambda_2, \ldots, \lambda_k$, the eigenvalues of $A$ are $\lambda_1, \lambda_2, \ldots, \lambda_k$. Define $U = U_1 \begin{bmatrix} 1 & \mathbf{0} \\ \mathbf{0} & U_2 \end{bmatrix}$. Then check that $U$ is a unitary matrix and $U^* A U$ is an upper triangular matrix with diagonal entries $\lambda_1, \lambda_2, \ldots, \lambda_k$, the eigenvalues of the matrix $A$. Hence, the result follows. $\blacksquare$
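Schur's Lemma is the theoretical counterpart of the numerical Schur decomposition; assuming SciPy is available, `scipy.linalg.schur` with `output='complex'` returns a unitary $U$ and an upper triangular $T$ with $A = U T U^*$. A minimal sketch with a randomly generated matrix:

```python
import numpy as np
from scipy.linalg import schur

rng = np.random.default_rng(5)
A = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))

T, U = schur(A, output='complex')          # A = U T U*, with U unitary and T upper triangular
print(np.allclose(U.conj().T @ U, np.eye(4)))       # U is unitary
print(np.allclose(U @ T @ U.conj().T, A))           # A = U T U*
print(np.allclose(np.tril(T, -1), 0))               # T is upper triangular
# The diagonal entries of T are the eigenvalues of A.
```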
We end this chapter with an application of the theory of diagonalisation
to the study of conic sections in analytic geometry and the study of
maxima and minima in analysis.