Adjoint of a Matrix

DEFINITION 2.6.9 (Minor, Cofactor of a Matrix)   The number $ \det \left(A(i\vert j)\right)$ is called the $ (i,j)^{\mbox{th}}$ minor of $ A$ . We write $ A_{ij} = \det \left(A(i\vert j)\right).$ The $ (i,j)^{\mbox{th}}$ cofactor of $ A,$ denoted $ C_{ij},$ is the number $ (-1)^{i+j} A_{ij}.$
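The definition translates directly into code. A minimal pure-Python sketch (the helper names `minor_matrix`, `det`, and `cofactor` are my own, and the indices are 0-based, so the text's $C_{11}$ becomes `cofactor(A, 0, 0)`):

```python
def minor_matrix(A, i, j):
    """A(i|j): the matrix A with row i and column j deleted (0-based)."""
    return [row[:j] + row[j+1:] for k, row in enumerate(A) if k != i]

def det(A):
    """Determinant via Laplace expansion along the first row."""
    if len(A) == 1:
        return A[0][0]
    return sum((-1) ** j * A[0][j] * det(minor_matrix(A, 0, j))
               for j in range(len(A)))

def cofactor(A, i, j):
    """The (i,j)-th cofactor C_ij = (-1)^(i+j) det(A(i|j)), 0-based."""
    return (-1) ** (i + j) * det(minor_matrix(A, i, j))

# Matrix from Example 2.6.11; these are C_11, C_12, C_13 in 1-based notation
A = [[1, 2, 3],
     [2, 3, 1],
     [1, 2, 2]]
print(cofactor(A, 0, 0), cofactor(A, 0, 1), cofactor(A, 0, 2))  # 4 -3 1
```

Recursive Laplace expansion is exponential in $n$, so this sketch is only meant for the small matrices used in the examples.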

DEFINITION 2.6.10 (Adjoint of a Matrix)   Let $ A$ be an $ n \times n$ matrix. The matrix $ B= [b_{ij}]$ with $ b_{ij} = C_{ji},$ for $ 1 \leq i, j \leq n$ is called the Adjoint of $ A,$ denoted $ Adj (A).$

EXAMPLE 2.6.11   Let $ A = \begin{bmatrix}1 & 2 & 3 \\ 2 & 3 & 1 \\ 1 & 2 &
2 \end{bmatrix}.$ Then $ Adj(A) = \begin{bmatrix}4 &
2 & -7 \\ -3 & -1& 5 \\ 1 & 0 & -1 \end{bmatrix};$
as $ C_{11} = (-1)^{1+1}A_{11} = 4, C_{12} = (-1)^{1+2} A_{12} = -3,
C_{13} = (-1)^{1+3} A_{13} = 1, $ and so on.
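The whole adjoint can be computed the same way: `adj` below builds the transpose of the cofactor matrix, reproducing the matrix above (a sketch with my own helper names, 0-based indices):

```python
def minor_matrix(A, i, j):
    """A(i|j): the matrix A with row i and column j deleted (0-based)."""
    return [row[:j] + row[j+1:] for k, row in enumerate(A) if k != i]

def det(A):
    """Determinant via Laplace expansion along the first row."""
    if len(A) == 1:
        return A[0][0]
    return sum((-1) ** j * A[0][j] * det(minor_matrix(A, 0, j))
               for j in range(len(A)))

def adj(A):
    """Adjoint: (Adj A)_ij = C_ji, the transpose of the cofactor matrix."""
    n = len(A)
    return [[(-1) ** (i + j) * det(minor_matrix(A, j, i)) for j in range(n)]
            for i in range(n)]

A = [[1, 2, 3],
     [2, 3, 1],
     [1, 2, 2]]
print(adj(A))  # [[4, 2, -7], [-3, -1, 5], [1, 0, -1]]
```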

THEOREM 2.6.12   Let $ A$ be an $ n \times n$ matrix. Then
  1. for $ 1 \leq i \leq n,$ $ \; \sum\limits_{j=1}^n a_{ij} \; C_{ij}
= \sum\limits_{j=1}^n a_{ij} (-1)^{i+j} \; A_{i j} = \det(A),$
  2. for $ i \neq \ell, \; \sum\limits_{j=1}^n a_{ij} \; C_{\ell j}
= \sum\limits_{j=1}^n a_{ij} (-1)^{\ell+j} \; A_{\ell j} = 0,$ and
  3. $ \; A (Adj (A) ) = \det(A) I_n.$ Thus,

    $\displaystyle \det (A) \neq 0 \Rightarrow A^{-1} = \frac{1}{\det(A)} Adj (A).$ (2.6.2)

Proof. We prove Part 2; Part 1 is simply the expansion of $ \det(A)$ along the $ i^{\mbox{th}}$ row (Remark 2.6.7). Fix $ i \neq \ell$ and let $ B= [b_{kj}]$ be the square matrix obtained from $ A$ by replacing the $ \ell^{\mbox{th}}$ row of $ A$ with its $ i^{\mbox{th}}$ row; that is, $ b_{\ell j} = a_{ij}$ for $ 1 \leq j \leq n$ and $ b_{kj} = a_{kj}$ for $ k \neq \ell.$ By the construction of $ B,$ two rows ( $ i^{\mbox{th}}$ and $ \ell^{\mbox{th}}$ ) are equal. By Part 5 of Lemma 2.6.6, $ \det (B) = 0.$ By construction again, $ \det\bigl(A(\ell\vert j)\bigr) = \det\bigl(B(\ell\vert j)\bigr)$ for $ 1 \leq j \leq n.$ Thus, by Remark 2.6.7, we have
$\displaystyle 0 = \det (B) = \sum_{j=1}^n (-1)^{\ell + j} b_{\ell j} \det\bigl(B(\ell\vert j)\bigr) = \sum_{j=1}^n (-1)^{\ell + j} a_{ij} \det\bigl(B(\ell\vert j)\bigr) = \sum_{j=1}^n (-1)^{\ell + j} a_{ij} \det\bigl(A(\ell\vert j)\bigr) = \sum_{j=1}^n a_{ij} C_{\ell j}.$

Now,
$\displaystyle \bigl(A \, {\mbox{Adj}}(A)\bigr)_{ij} = \sum_{k=1}^n a_{ik} \bigl( {\mbox{Adj}}(A)\bigr)_{kj} = \sum_{k=1}^n a_{ik} C_{jk} = \left\{\begin{array}{cl} 0 & {\mbox{ if }} i \neq j, \\ \det(A) & {\mbox{ if }} i = j. \end{array}\right.$

Thus, $ \; A (Adj (A) ) = \det(A) I_n.$ If $ \det(A) \neq 0,$ then $ \; A
\displaystyle \frac{1}{\det(A)} Adj(A) = I_n.$ Therefore, $ A$ has a right inverse. Hence, by Theorem 2.5.9, $ A$ has an inverse and

$\displaystyle A^{-1} = \frac{1}{\det(A)} Adj (A).$

$\blacksquare$
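All three parts of the theorem can be spot-checked numerically. A minimal sketch using the matrix of Example 2.6.11 (the helpers are mine and use 0-based indices):

```python
def minor_matrix(A, i, j):
    """A(i|j): delete row i and column j (0-based)."""
    return [row[:j] + row[j+1:] for k, row in enumerate(A) if k != i]

def det(A):
    """Determinant via Laplace expansion along the first row."""
    if len(A) == 1:
        return A[0][0]
    return sum((-1) ** j * A[0][j] * det(minor_matrix(A, 0, j))
               for j in range(len(A)))

def cofactor(A, i, j):
    return (-1) ** (i + j) * det(minor_matrix(A, i, j))

A = [[1, 2, 3], [2, 3, 1], [1, 2, 2]]
n, d = len(A), det(A)

# Parts 1 and 2: expanding row i against the cofactors of row l gives
# det(A) when i == l and 0 otherwise.
for i in range(n):
    for l in range(n):
        s = sum(A[i][j] * cofactor(A, l, j) for j in range(n))
        assert s == (d if i == l else 0)

# Part 3: A (Adj A) = det(A) I_n, where (Adj A)_kj = C_jk.
adjA = [[cofactor(A, j, k) for j in range(n)] for k in range(n)]
prod = [[sum(A[i][k] * adjA[k][j] for k in range(n)) for j in range(n)]
        for i in range(n)]
assert prod == [[d if i == j else 0 for j in range(n)] for i in range(n)]
print("checks passed; det(A) =", d)  # det(A) = 1
```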

EXAMPLE 2.6.13   Let $ A= \begin{bmatrix}1 & -1 & 0 \\ 0 & 1 & 1\\ 1 & 2 &
1 \end{bmatrix}.$ Then

$\displaystyle Adj (A) = \begin{bmatrix}
-1 & 1 & -1 \\ 1 & 1 & -1\\ -1 &-3 & 1 \end{bmatrix}$

and $ \det (A) = -2.$ By Theorem 2.6.12.3, $ A^{-1} =
\begin{bmatrix}1/2 & -1/2 & 1/2 \\ -1/2 & -1/2 & 1/2\\ 1/2 &
3/2 & -1/2 \end{bmatrix}.$
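The inversion in this example can be reproduced exactly with rational arithmetic; a minimal sketch (the helpers `det` and `adj` are my own):

```python
from fractions import Fraction

def minor_matrix(A, i, j):
    """A(i|j): delete row i and column j (0-based)."""
    return [row[:j] + row[j+1:] for k, row in enumerate(A) if k != i]

def det(A):
    """Determinant via Laplace expansion along the first row."""
    if len(A) == 1:
        return A[0][0]
    return sum((-1) ** j * A[0][j] * det(minor_matrix(A, 0, j))
               for j in range(len(A)))

def adj(A):
    """Adjoint: the transpose of the cofactor matrix."""
    n = len(A)
    return [[(-1) ** (i + j) * det(minor_matrix(A, j, i)) for j in range(n)]
            for i in range(n)]

A = [[1, -1, 0], [0, 1, 1], [1, 2, 1]]
d = det(A)                                        # -2
inv = [[Fraction(x, d) for x in row] for row in adj(A)]   # A^{-1}
print(inv[0])  # [Fraction(1, 2), Fraction(-1, 2), Fraction(1, 2)]
```

`Fraction` keeps the entries exact, so the result matches the text's $A^{-1}$ entry for entry.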

The next corollary is an easy consequence of Theorem 2.6.12 (recall Theorem 2.5.9).

COROLLARY 2.6.14   If $ A$ is a non-singular matrix, then
$ \bigl(Adj(A) \bigr) A = \det(A) I_n \;\; $ and $ \;\; \sum\limits_{i=1}^n a_{ij} \; C_{ik} = \left\{\begin{array}{cl} \det (A) & {\mbox{ if }} j = k, \\ 0 & {\mbox{ if }} j \neq k. \end{array}\right.$
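The column-cofactor identity can be checked the same way; a brief sketch, again with my own helpers and 0-based indices (note the sum now runs over the row index $i$):

```python
def minor_matrix(A, i, j):
    """A(i|j): delete row i and column j (0-based)."""
    return [row[:j] + row[j+1:] for k, row in enumerate(A) if k != i]

def det(A):
    """Determinant via Laplace expansion along the first row."""
    if len(A) == 1:
        return A[0][0]
    return sum((-1) ** j * A[0][j] * det(minor_matrix(A, 0, j))
               for j in range(len(A)))

def cofactor(A, i, j):
    return (-1) ** (i + j) * det(minor_matrix(A, i, j))

A = [[1, 2, 3], [2, 3, 1], [1, 2, 2]]
n, d = len(A), det(A)

# sum_i a_{ij} C_{ik} equals det(A) when j == k and 0 otherwise
for j in range(n):
    for k in range(n):
        s = sum(A[i][j] * cofactor(A, i, k) for i in range(n))
        assert s == (d if j == k else 0)
print("column identity holds; det(A) =", d)  # det(A) = 1
```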

THEOREM 2.6.15   Let $ A$ and $ B$ be square matrices of order $ n.$ Then $ \;\det (A B) = \det (A) \det (B).$

Proof. Step 1. Let $ \det (A) \neq 0.$
This means $ A$ is invertible. Therefore, $ A$ is a product of elementary matrices (see Theorem 2.5.8). So, let $ E_1, E_2, \ldots, E_k$ be elementary matrices such that $ A = E_1 E_2 \cdots E_k.$ Then, by using Parts 1, 2 and 4 of Lemma 2.6.6 repeatedly, we get
$\displaystyle \det(AB) = \det (E_1 E_2 \cdots E_k B) = \det (E_1) \det( E_2 \cdots E_k B) = \det(E_1) \det( E_2) \det(E_3 \cdots E_k B) = \cdots = \det(E_1 E_2 \cdots E_k) \det( B) = \det(A) \det(B).$

Thus, we get the required result in case $ A$ is non-singular.

Step 2. Suppose $ \det (A) = 0.$
Then $ A$ is not invertible. Hence, there exists an invertible matrix $ P$ such that $ P A = C,$ where $ C = \begin{bmatrix}C_1 \\ {\mathbf 0}
\end{bmatrix}$ has its last row equal to the zero row. So, $ A = P^{-1} C, $ and therefore

$\displaystyle \det(AB) = \det \bigl((P^{-1} C) B\bigr) = \det \bigl(P^{-1} (C B)\bigr) = \det \left( P^{-1} \begin{bmatrix}C_1 B \\ {\mathbf 0}\end{bmatrix}\right) = \det( P^{-1} ) \cdot \det \left( \begin{bmatrix}C_1 B \\ {\mathbf 0}\end{bmatrix}\right), \;\; {\mbox{ as }} P^{-1} {\mbox{ is non-singular.}}$

Since $ \begin{bmatrix}C_1 B \\ {\mathbf 0}\end{bmatrix}$ has a zero row, its determinant is $ 0.$ Hence,

$\displaystyle \det(AB) = \det(P^{-1}) \cdot 0 = 0 = 0 \cdot \det (B) = \det(A) \det(B).$

Thus, the proof of the theorem is complete. $\blacksquare$
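Multiplicativity is easy to spot-check numerically. A sketch using the matrices of Examples 2.6.11 and 2.6.13 (the helpers `det` and `matmul` are my own):

```python
def minor_matrix(A, i, j):
    """A(i|j): delete row i and column j (0-based)."""
    return [row[:j] + row[j+1:] for k, row in enumerate(A) if k != i]

def det(A):
    """Determinant via Laplace expansion along the first row."""
    if len(A) == 1:
        return A[0][0]
    return sum((-1) ** j * A[0][j] * det(minor_matrix(A, 0, j))
               for j in range(len(A)))

def matmul(A, B):
    """Product of two n x n matrices."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

A = [[1, 2, 3], [2, 3, 1], [1, 2, 2]]     # det(A) = 1
B = [[1, -1, 0], [0, 1, 1], [1, 2, 1]]    # det(B) = -2
assert det(matmul(A, B)) == det(A) * det(B)
print(det(matmul(A, B)))  # -2
```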

COROLLARY 2.6.16   Let $ A$ be a square matrix. Then $ A$ is non-singular if and only if $ A$ has an inverse.

Proof. Suppose $ A$ is non-singular. Then $ \det(A) \neq 0$ and therefore, $ A^{-1} = \displaystyle\frac{1}{\det(A)} Adj(A).$ Thus, $ A$ has an inverse.

Suppose $ A$ has an inverse. Then there exists a matrix $ B$ such that $ A B = I = BA.$ Taking the determinant of both sides, we get

$\displaystyle \det(A) \det (B) =
\det(AB) = \det(I) = 1.$

This implies that $ \det (A) \neq 0.$ Thus, $ A$ is non-singular. $\blacksquare$

THEOREM 2.6.17   Let $ A$ be a square matrix. Then $ \det (A) = \det(A^t).$

Proof. If $ A$ is non-singular, Corollary 2.6.14 gives $ \det (A) = \det(A^t).$

If $ A$ is singular, then $ \det (A) = 0.$ Hence, by Corollary 2.6.16, $ A$ doesn't have an inverse. Therefore, $ A^t$ also doesn't have an inverse (for if $ A^t$ had an inverse, then $ A^{-1} = \bigl((A^t)^{-1}\bigr)^t$). Thus, again by Corollary 2.6.16, $ \det(A^t) = 0.$ Therefore, we again have $ \det(A) = 0 = \det(A^t).$

Hence, we have $ \det (A) = \det(A^t).$ $\blacksquare$
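This, too, is easy to verify numerically; a minimal sketch using the matrix of Example 2.6.13 (helpers are my own):

```python
def minor_matrix(A, i, j):
    """A(i|j): delete row i and column j (0-based)."""
    return [row[:j] + row[j+1:] for k, row in enumerate(A) if k != i]

def det(A):
    """Determinant via Laplace expansion along the first row."""
    if len(A) == 1:
        return A[0][0]
    return sum((-1) ** j * A[0][j] * det(minor_matrix(A, 0, j))
               for j in range(len(A)))

A = [[1, -1, 0], [0, 1, 1], [1, 2, 1]]
At = [list(col) for col in zip(*A)]   # transpose of A
assert det(A) == det(At) == -2
print(det(A), det(At))  # -2 -2
```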

A K Lal 2007-09-12