Equivalent Conditions for Invertibility

DEFINITION 2.5.7   A square matrix $ A$ of order $ n$ is said to be of full rank if $ {\mbox{rank}}(A) =n.$
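
For instance (with matrices chosen only for illustration), the matrix $ \begin{bmatrix}1 & 2 \\ 3 & 4 \end{bmatrix}$ is of full rank: subtracting $ 3$ times the first row from the second gives $ \begin{bmatrix}1 & 2 \\ 0 & -2 \end{bmatrix},$ which has two non-zero rows, so its rank is $ 2.$ On the other hand, $ \begin{bmatrix}1 & 2 \\ 2 & 4 \end{bmatrix}$ reduces to $ \begin{bmatrix}1 & 2 \\ 0 & 0 \end{bmatrix}$ and has rank $ 1,$ so it is not of full rank.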

THEOREM 2.5.8   For a square matrix $ A$ of order $ n,$ the following statements are equivalent.
  1. $ A$ is invertible.
  2. $ A$ is of full rank.
  3. $ A$ is row-equivalent to the identity matrix.
  4. $ A$ is a product of elementary matrices.

Proof. 1 $ \Longrightarrow $ 2

Suppose, if possible, that $ {\mbox{rank}}(A) = r < n.$ Then there exists an invertible matrix $ P$ (a product of elementary matrices) such that $ P A = \begin{bmatrix}B_1 & B_2 \\
{\mathbf 0}& {\mathbf 0}\end{bmatrix},$ where $ B_1$ is an $ r \times r$ matrix. Since $ A$ is invertible, let $ A^{-1} =
\begin{bmatrix}C_1 \\ C_2 \end{bmatrix},$ where $ C_1$ is an $ r
\times n$ matrix. Then

$\displaystyle P = P I_n = P (A A^{-1}) = (P A ) A^{-1} = \begin{bmatrix}B_1 & B_2 \\ {\mathbf 0}& {\mathbf 0}\end{bmatrix} \begin{bmatrix}C_1 \\ C_2 \end{bmatrix} = \begin{bmatrix}B_1 C_1 + B_2 C_2 \\ {\mathbf 0}\end{bmatrix}.$ (2.5.1)

Thus, the matrix $ P$ has its last $ n-r$ rows as zero rows. Hence, $ P$ cannot be invertible (for any matrix $ Q,$ the product $ P Q$ again has these zero rows and so cannot equal $ I_n$). This contradicts the fact that $ P$ is a product of invertible matrices. Thus, $ A$ is of full rank.

2 $ \Longrightarrow $ 3

Suppose $ A$ is of full rank. Then the row reduced echelon form of $ A$ has $ n$ non-zero rows, and hence $ n$ leading $ 1$'s. Since $ A$ has exactly $ n$ columns and the leading $ 1$'s occur in distinct columns, every column contains a leading $ 1$ and the leading $ 1$ of the $ i^{\mbox{th}}$ row lies in the $ i^{\mbox{th}}$ column; in particular, the last row is $ (0, 0, \ldots, 0, 1).$ As every column containing a leading $ 1$ has all its other entries equal to zero, the row reduced echelon form of $ A$ is the identity matrix.

3 $ \Longrightarrow $ 4

Since $ A$ is row-equivalent to the identity matrix, there exist elementary matrices $ E_1, E_2, \ldots, E_k$ such that $ E_k \cdots E_2 E_1 A = I_n.$ As elementary matrices are invertible and their inverses are again elementary matrices, $ A = E_1^{-1} E_2^{-1} \cdots E_k^{-1} I_n.$ That is, $ A$ is a product of elementary matrices.
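
For illustration (with a matrix chosen only as an example), let $ A = \begin{bmatrix}2 & 1 \\ 0 & 1 \end{bmatrix}.$ Multiplying the first row by $ \frac{1}{2}$ and then subtracting $ \frac{1}{2}$ times the second row from the first reduces $ A$ to $ I_2.$ The corresponding elementary matrices are $ E_1 = \begin{bmatrix}\frac{1}{2} & 0 \\ 0 & 1 \end{bmatrix}$ and $ E_2 = \begin{bmatrix}1 & -\frac{1}{2} \\ 0 & 1 \end{bmatrix},$ so that $ E_2 E_1 A = I_2$ and

$\displaystyle A = E_1^{-1} E_2^{-1} = \begin{bmatrix}2 & 0 \\ 0 & 1 \end{bmatrix} \begin{bmatrix}1 & \frac{1}{2} \\ 0 & 1 \end{bmatrix} = \begin{bmatrix}2 & 1 \\ 0 & 1 \end{bmatrix},$

a product of elementary matrices.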

4 $ \Longrightarrow $ 1

Suppose $ A = E_1 E_2 \cdots E_k,$ where the $ E_i$'s are elementary matrices. Since elementary matrices are invertible and a product of invertible matrices is again invertible, $ A$ is invertible. $ \blacksquare$

The ideas of Theorem 2.5.8 will be used in the next subsection to find the inverse of an invertible matrix. The idea used in the proof of the implication $ 1 \Longrightarrow 2$ also gives the following important theorem. We repeat the argument for the sake of clarity.

THEOREM 2.5.9   Let $ A$ be a square matrix of order $ n.$
  1. Suppose there exists a matrix $ B$ such that $ A B = I_n.$ Then $ A^{-1}$ exists.
  2. Suppose there exists a matrix $ C$ such that $ C A = I_n.$ Then $ A^{-1}$ exists.

Proof. Suppose that $ A B = I_n.$ We will prove that the matrix $ A$ is of full rank. That is, $ {\mbox{rank}}(A) =n.$

Suppose, if possible, that $ {\mbox{rank}}(A) = r < n.$ Then there exists an invertible matrix $ P$ (a product of elementary matrices) such that $ P A = \begin{bmatrix}C_1 & C_2 \\ {\mathbf 0}& {\mathbf 0}\end{bmatrix},$ where $ C_1$ is an $ r \times r$ matrix. Write $ B = \begin{bmatrix}B_1 \\ B_2 \end{bmatrix},$ where $ B_1$ is an $ r \times n$ matrix. Then

$\displaystyle P = P I_n = P (A B) = (P A ) B = \begin{bmatrix}C_1 & C_2 \\ {\mathbf 0}& {\mathbf 0}\end{bmatrix} \begin{bmatrix}B_1 \\ B_2 \end{bmatrix} = \begin{bmatrix}C_1 B_1 + C_2 B_2 \\ {\mathbf 0}\end{bmatrix}.$ (2.5.2)

Thus, the matrix $ P$ has its last $ n-r$ rows as zero rows. So, $ P$ cannot be invertible, contradicting the fact that $ P$ is a product of invertible matrices. Thus, $ {\mbox{rank}}(A) =n.$ That is, $ A$ is of full rank. Hence, using Theorem 2.5.8, $ A$ is an invertible matrix, and multiplying $ A B = I_n$ on the left by $ A^{-1}$ gives $ B = A^{-1}.$ In particular, $ B A = I_n$ as well.

For the second part, we have $ C A = I_n.$ Hence, using the first part, the matrix $ C$ is invertible, and multiplying $ C A = I_n$ on the left by $ C^{-1}$ gives $ A = C^{-1}.$ Therefore

$\displaystyle A C = I_n = C A.$

Thus, $ A$ is invertible as well, with $ A^{-1} = C.$ $ \blacksquare$
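
For illustration (again with matrices chosen only as an example), take $ A = \begin{bmatrix}1 & 2 \\ 3 & 5 \end{bmatrix}$ and $ B = \begin{bmatrix}-5 & 2 \\ 3 & -1 \end{bmatrix}.$ A direct computation gives $ A B = I_2,$ and the theorem then guarantees, without any further computation, that $ B A = I_2$ and $ A^{-1} = B;$ multiplying the matrices in the other order indeed confirms $ B A = I_2.$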

Remark 2.5.10   This theorem implies the following: to show that a square matrix $ A$ of order $ n$ is invertible, it is enough to show the existence of
  1. either a matrix $ B$ such that $ A B = I_n,$
  2. or a matrix $ C$ such that $ C A = I_n.$

THEOREM 2.5.11   The following statements are equivalent for a square matrix $ A$ of order $ n.$
  1. $ A$ is invertible.
  2. $ A {\mathbf x}= {\mathbf 0}$ has only the trivial solution $ {\mathbf x}= {\mathbf 0}.$
  3. $ A {\mathbf x}= {\mathbf b}$ has a solution $ {\mathbf x}$ for every $ {\mathbf b}.$

Proof. 1 $ \Longrightarrow $ 2

Since $ A$ is invertible, by Theorem 2.5.8 $ A$ is of full rank. That is, for the linear system $ A {\mathbf x}= {\mathbf 0},$ the number of unknowns is equal to the rank of the matrix $ A.$ Hence, by Theorem 2.5.1 the system $ A {\mathbf x}= {\mathbf 0}$ has a unique solution $ {\mathbf x}= {\mathbf 0}.$

2 $ \Longrightarrow $ 1

Suppose, if possible, that $ A$ is not invertible. Then by Theorem 2.5.8, the matrix $ A$ is not of full rank. Thus, by Corollary 2.5.3, the linear system $ A {\mathbf x}= {\mathbf 0}$ has infinitely many solutions. This contradicts the assumption that $ A {\mathbf x}= {\mathbf 0}$ has only the trivial solution $ {\mathbf x}= {\mathbf 0}.$ Hence, $ A$ is invertible.
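
For instance (continuing with an illustrative matrix from above), $ A = \begin{bmatrix}1 & 2 \\ 2 & 4 \end{bmatrix}$ is not of full rank, and $ A {\mathbf x}= {\mathbf 0}$ has the infinitely many solutions $ {\mathbf x}= c \, (-2, 1)^t$ for every real number $ c;$ accordingly, $ A$ is not invertible.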

1 $ \Longrightarrow $ 3

Since $ A$ is invertible, for every $ {\mathbf b},$ the system $ A {\mathbf x}= {\mathbf b}$ has a unique solution $ {\mathbf x}= A^{-1} {\mathbf b}.$

3 $ \Longrightarrow $ 1

For $ 1 \leq i \leq n,$ define $ {\mathbf e}_i = (0, \ldots, 0, \underbrace{1}_{i^{\mbox{th position}}},
0, \ldots, 0)^t,$ and consider the linear system $ A {\mathbf x}= {\mathbf e}_i.$ By assumption, this system has a solution $ {\mathbf x}_i$ for each $ i, \;
1 \leq i \leq n.$ Define a matrix $ B = [{\mathbf x}_1, {\mathbf x}_2, \ldots, {\mathbf x}_n ].$ That is, the $ i^{\mbox{th}}$ column of $ B$ is the solution of the system $ A {\mathbf x}= {\mathbf e}_i.$ Then

$\displaystyle A B = A [{\mathbf x}_1, {\mathbf x}_2, \ldots, {\mathbf x}_n] = [A {\mathbf x}_1, A {\mathbf x}_2, \ldots, A {\mathbf x}_n] = [{\mathbf e}_1, {\mathbf e}_2, \ldots, {\mathbf e}_n] = I_n.$

Therefore, by Theorem 2.5.9, the matrix $ A$ is invertible. $ \blacksquare$
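
To illustrate this construction (with the same example matrix used earlier), let $ A = \begin{bmatrix}2 & 1 \\ 0 & 1 \end{bmatrix}.$ Solving $ A {\mathbf x}= {\mathbf e}_1$ gives $ {\mathbf x}_1 = (\frac{1}{2}, 0)^t,$ and solving $ A {\mathbf x}= {\mathbf e}_2$ gives $ {\mathbf x}_2 = (-\frac{1}{2}, 1)^t.$ Hence $ B = [{\mathbf x}_1, {\mathbf x}_2] = \begin{bmatrix}\frac{1}{2} & -\frac{1}{2} \\ 0 & 1 \end{bmatrix}$ satisfies $ A B = I_2,$ and by Theorem 2.5.9 this $ B$ is $ A^{-1}.$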

EXERCISE 2.5.12  
  1. Show that a triangular matrix $ A$ is invertible if and only if each diagonal entry of $ A$ is non-zero.
  2. Let $ A$ be a $ 1
\times 2$ matrix and $ B$ be a $ 2 \times 1$ matrix having positive entries. Which of $ B A$ or $ A B$ is invertible? Give reasons.
  3. Let $ A$ be an $ n \times m$ matrix and $ B$ be an $ m \times n$ matrix. Prove that the matrix $ I - BA$ is invertible if and only if the matrix $ I - AB$ is invertible.
