In the last few sections, the following has been discussed in detail: given a finite dimensional vector space $V$ of dimension $n$, we fixed an ordered basis $\mathcal{B}$. For any $\mathbf{v} \in V$, we calculated the column vector $[\mathbf{v}]_{\mathcal{B}}$ to obtain the coordinates of $\mathbf{v}$ with respect to the ordered basis $\mathcal{B}$. Also, for any linear transformation $T : V \to V$, we got an $n \times n$ matrix $T[\mathcal{B}, \mathcal{B}]$, the matrix of $T$ with respect to the ordered basis $\mathcal{B}$. That is, once an ordered basis of $V$ is fixed, every linear transformation is represented by a matrix with entries from the scalars.
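To make this recap concrete, the following small numerical sketch (an illustration added here, not part of the original notes) computes a coordinate vector and the matrix $T[\mathcal{B}, \mathcal{B}]$ for an arbitrarily chosen ordered basis of $\mathbb{R}^2$ and an arbitrarily chosen linear map; the specific numbers carry no special meaning.

```python
import numpy as np

# Ordered basis B = (b1, b2) of R^2; the columns of P are b1 and b2 (illustrative choice).
P = np.array([[1.0, 1.0],
              [1.0, -1.0]])

# A vector v and its coordinate column [v]_B, obtained by solving P c = v.
v = np.array([3.0, 1.0])
v_B = np.linalg.solve(P, v)          # coordinates of v with respect to B

# A linear transformation T(x) = A x, given here by its standard matrix A.
A = np.array([[2.0, 1.0],
              [0.0, 3.0]])

# The j-th column of T[B, B] is the coordinate vector [T(b_j)]_B, so T[B, B] = P^{-1} A P.
T_BB = np.linalg.solve(P, A @ P)

# Sanity check of the defining property  [T(v)]_B = T[B, B] [v]_B.
assert np.allclose(np.linalg.solve(P, A @ v), T_BB @ v_B)
print("[v]_B =", v_B)
print("T[B, B] =\n", T_BB)
```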
In this section, we understand the matrix representation of $T$ in terms of two different ordered bases $\mathcal{B}_1$ and $\mathcal{B}_2$ of $V$. That is, we relate the two matrices $T[\mathcal{B}_1, \mathcal{B}_1]$ and $T[\mathcal{B}_2, \mathcal{B}_2]$.

We start with the following important theorem. This theorem also enables us to understand why the matrix product is defined somewhat differently.

Theorem 4.4.1: Let $V$, $W$ and $Z$ be finite dimensional vector spaces with ordered bases $\mathcal{B}_1$, $\mathcal{B}_2$ and $\mathcal{B}_3$, respectively. Also, let $T : V \to W$ and $S : W \to Z$ be linear transformations. Then the composition $S \circ T : V \to Z$ is a linear transformation and
$$(S \circ T)[\mathcal{B}_1, \mathcal{B}_3] = S[\mathcal{B}_2, \mathcal{B}_3]\; T[\mathcal{B}_1, \mathcal{B}_2].$$
Proof: Now, for every $\mathbf{x} \in V$, the relation between the coordinates of a vector and the matrix of a linear transformation gives
$$[(S \circ T)(\mathbf{x})]_{\mathcal{B}_3} = [S\bigl(T(\mathbf{x})\bigr)]_{\mathcal{B}_3} = S[\mathcal{B}_2, \mathcal{B}_3]\,[T(\mathbf{x})]_{\mathcal{B}_2} = S[\mathcal{B}_2, \mathcal{B}_3]\; T[\mathcal{B}_1, \mathcal{B}_2]\,[\mathbf{x}]_{\mathcal{B}_1}.$$
Also, $[(S \circ T)(\mathbf{x})]_{\mathcal{B}_3} = (S \circ T)[\mathcal{B}_1, \mathcal{B}_3]\,[\mathbf{x}]_{\mathcal{B}_1}$. Hence,
$$(S \circ T)[\mathcal{B}_1, \mathcal{B}_3]\,[\mathbf{x}]_{\mathcal{B}_1} = S[\mathcal{B}_2, \mathcal{B}_3]\; T[\mathcal{B}_1, \mathcal{B}_2]\,[\mathbf{x}]_{\mathcal{B}_1} \quad \text{for every } \mathbf{x} \in V,$$
and therefore $(S \circ T)[\mathcal{B}_1, \mathcal{B}_3] = S[\mathcal{B}_2, \mathcal{B}_3]\; T[\mathcal{B}_1, \mathcal{B}_2]$. This completes the proof. $\blacksquare$
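The theorem can be checked numerically. The sketch below is an added illustration (not from the notes): it picks random ordered bases of $\mathbb{R}^3$, represents two maps by their standard matrices, and confirms that the matrix of the composition equals the product of the individual matrices.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 3

# Random ordered bases B1, B2, B3 of R^3 (columns of almost surely invertible matrices).
P1, P2, P3 = (rng.normal(size=(n, n)) for _ in range(3))

# Two linear maps given by their standard matrices: T: R^3 -> R^3 and S: R^3 -> R^3.
T_std = rng.normal(size=(n, n))
S_std = rng.normal(size=(n, n))

def matrix_of(map_std, P_dom, P_cod):
    """Matrix of x -> map_std @ x w.r.t. the domain basis (columns of P_dom)
    and the codomain basis (columns of P_cod)."""
    return np.linalg.solve(P_cod, map_std @ P_dom)

T_B1B2 = matrix_of(T_std, P1, P2)              # T[B1, B2]
S_B2B3 = matrix_of(S_std, P2, P3)              # S[B2, B3]
ST_B1B3 = matrix_of(S_std @ T_std, P1, P3)     # (S o T)[B1, B3]

# Composition corresponds to the matrix product.
assert np.allclose(ST_B1B3, S_B2B3 @ T_B1B2)
print("(S o T)[B1, B3] equals S[B2, B3] T[B1, B2]")
```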
Suppose $T : V \to W$ and $S : W \to Z$ are linear transformations between finite dimensional vector spaces, and write $\rho(\cdot)$ for the rank and $\nu(\cdot)$ for the nullity. Then, using the rank-nullity theorem, observe that
$$\rho(S \circ T) = \dim V - \nu(S \circ T), \qquad \nu(T) = \dim V - \rho(T), \qquad \nu(S) = \dim W - \rho(S).$$
So, to complete the proof of the second inequality, we need to show that $\nu(S \circ T) \le \nu(T) + \nu(S)$. We now prove this first inequality.
Let $\nu(T) = k$ and let $\{\mathbf{v}_1, \ldots, \mathbf{v}_k\}$ be a basis of $\mathcal{N}(T)$, the null space of $T$. Clearly, $\mathcal{N}(T) \subseteq \mathcal{N}(S \circ T)$, as $S(T(\mathbf{v})) = S(\mathbf{0}) = \mathbf{0}$ for every $\mathbf{v} \in \mathcal{N}(T)$. We extend it to get a basis $\{\mathbf{v}_1, \ldots, \mathbf{v}_k, \mathbf{u}_1, \ldots, \mathbf{u}_{\ell}\}$ of $\mathcal{N}(S \circ T)$.

Claim: The set $\{T(\mathbf{u}_1), \ldots, T(\mathbf{u}_{\ell})\}$ is a linearly independent subset of $\mathcal{N}(S)$.

As $S(T(\mathbf{u}_i)) = (S \circ T)(\mathbf{u}_i) = \mathbf{0}$ for $1 \le i \le \ell$, the set $\{T(\mathbf{u}_1), \ldots, T(\mathbf{u}_{\ell})\}$ is a subset of $\mathcal{N}(S)$. Let, if possible, the given set be linearly dependent. Then there exist scalars $c_1, \ldots, c_{\ell}$, not all zero, such that
$$c_1 T(\mathbf{u}_1) + c_2 T(\mathbf{u}_2) + \cdots + c_{\ell} T(\mathbf{u}_{\ell}) = \mathbf{0}.$$
So, the vector $\mathbf{u} = c_1 \mathbf{u}_1 + c_2 \mathbf{u}_2 + \cdots + c_{\ell} \mathbf{u}_{\ell}$ satisfies $T(\mathbf{u}) = \mathbf{0}$, and hence $\mathbf{u} \in \mathcal{N}(T)$ and is a linear combination of the basis vectors $\mathbf{v}_1, \ldots, \mathbf{v}_k$ of $\mathcal{N}(T)$; that is, there exist scalars $\alpha_1, \ldots, \alpha_k$ such that
$$c_1 \mathbf{u}_1 + c_2 \mathbf{u}_2 + \cdots + c_{\ell} \mathbf{u}_{\ell} = \alpha_1 \mathbf{v}_1 + \alpha_2 \mathbf{v}_2 + \cdots + \alpha_k \mathbf{v}_k.$$
Or equivalently,
$$c_1 \mathbf{u}_1 + \cdots + c_{\ell} \mathbf{u}_{\ell} - \alpha_1 \mathbf{v}_1 - \cdots - \alpha_k \mathbf{v}_k = \mathbf{0}.$$
That is, the vectors $\mathbf{v}_1, \ldots, \mathbf{v}_k, \mathbf{u}_1, \ldots, \mathbf{u}_{\ell}$ are linearly dependent, which contradicts the fact that they form a basis of $\mathcal{N}(S \circ T)$. Thus, the set $\{T(\mathbf{u}_1), \ldots, T(\mathbf{u}_{\ell})\}$ is a linearly independent subset of $\mathcal{N}(S)$ and so $\ell \le \nu(S)$. Hence,
$$\nu(S \circ T) = k + \ell \le \nu(T) + \nu(S). \qquad \blacksquare$$
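The nullity bound just proved, and the rank bounds that follow from it by the rank-nullity theorem, can also be checked numerically. The sketch below is an added illustration under the assumption that the maps are given by square matrices acting on $\mathbb{R}^6$; the helper `random_matrix_of_rank` is a hypothetical convenience, not something from the notes.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 6

def random_matrix_of_rank(r, n, rng):
    """A random n x n matrix of rank r (product of n x r and r x n factors)."""
    return rng.normal(size=(n, r)) @ rng.normal(size=(r, n))

A = random_matrix_of_rank(4, n, rng)   # plays the role of S
B = random_matrix_of_rank(3, n, rng)   # plays the role of T

rank = np.linalg.matrix_rank
nullity = lambda M: M.shape[1] - rank(M)

# nu(S o T) <= nu(S) + nu(T), and hence rho(S o T) >= rho(S) + rho(T) - n.
assert nullity(A @ B) <= nullity(A) + nullity(B)
assert rank(A @ B) >= rank(A) + rank(B) - n
# The composition can never have larger rank than either factor.
assert rank(A @ B) <= min(rank(A), rank(B))
print("nullity(AB) =", nullity(A @ B), " rank(AB) =", rank(A @ B))
```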
Recall from Theorem 4.1.8 that if $T : V \to W$ is an invertible linear transformation, then $T^{-1} : W \to V$ is a linear transformation defined by $T^{-1}(\mathbf{u}) = \mathbf{v}$ whenever $T(\mathbf{v}) = \mathbf{u}$. We now state an important result about the inverse of a linear transformation. The reader is required to supply the proof (use Theorem 4.4.1). Prove that if $T : V \to V$ is an invertible linear transformation and $\mathcal{B}$ is an ordered basis of $V$, then
$$T^{-1}[\mathcal{B}, \mathcal{B}] = \bigl(T[\mathcal{B}, \mathcal{B}]\bigr)^{-1}.$$
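A quick numerical sanity check of the identity to be proved (an added sketch; the basis and the invertible map below are arbitrary random choices, not taken from the notes):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 4

P = rng.normal(size=(n, n))   # columns form an ordered basis B of R^4 (almost surely)
A = rng.normal(size=(n, n))   # standard matrix of an (almost surely) invertible map T

T_BB    = np.linalg.solve(P, A @ P)                   # T[B, B]
Tinv_BB = np.linalg.solve(P, np.linalg.inv(A) @ P)    # T^{-1}[B, B]

# The matrix of the inverse is the inverse of the matrix.
assert np.allclose(Tinv_BB, np.linalg.inv(T_BB))
print("T^{-1}[B, B] equals (T[B, B])^{-1}")
```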
Let $V$ be a vector space with $\dim V = n$. Let $\mathcal{B}_1 = (\mathbf{u}_1, \mathbf{u}_2, \ldots, \mathbf{u}_n)$ and $\mathcal{B}_2 = (\mathbf{v}_1, \mathbf{v}_2, \ldots, \mathbf{v}_n)$ be two ordered bases of $V$. Recall from Definition 4.1.5 that $I : V \to V$ is the identity linear transformation defined by $I(\mathbf{x}) = \mathbf{x}$ for every $\mathbf{x} \in V$. Suppose $\mathbf{x} \in V$ with $[\mathbf{x}]_{\mathcal{B}_1} = (\alpha_1, \alpha_2, \ldots, \alpha_n)^t$ and $[\mathbf{x}]_{\mathcal{B}_2} = (\beta_1, \beta_2, \ldots, \beta_n)^t$.

We now express each vector in $\mathcal{B}_2$ as a linear combination of the vectors from $\mathcal{B}_1$. Since $I(\mathbf{v}_i) = \mathbf{v}_i$ for $1 \le i \le n$, and $\mathcal{B}_1$ is a basis of $V$, we can find scalars $a_{ij}$, $1 \le i, j \le n$, such that
$$\mathbf{v}_j = I(\mathbf{v}_j) = \sum_{i=1}^{n} a_{ij}\, \mathbf{u}_i \quad \text{for } 1 \le j \le n.$$
Hence, $[I(\mathbf{v}_j)]_{\mathcal{B}_1} = (a_{1j}, a_{2j}, \ldots, a_{nj})^t$ and
$$I[\mathcal{B}_2, \mathcal{B}_1] = \bigl[\, [I(\mathbf{v}_1)]_{\mathcal{B}_1}, \; [I(\mathbf{v}_2)]_{\mathcal{B}_1}, \; \ldots, \; [I(\mathbf{v}_n)]_{\mathcal{B}_1} \,\bigr] = \begin{bmatrix} a_{11} & a_{12} & \cdots & a_{1n} \\ a_{21} & a_{22} & \cdots & a_{2n} \\ \vdots & \vdots & \ddots & \vdots \\ a_{n1} & a_{n2} & \cdots & a_{nn} \end{bmatrix}.$$
Equivalently, since $\mathbf{x} = \sum_{j=1}^{n} \beta_j \mathbf{v}_j = \sum_{j=1}^{n} \beta_j \sum_{i=1}^{n} a_{ij}\mathbf{u}_i = \sum_{i=1}^{n}\Bigl(\sum_{j=1}^{n} a_{ij}\beta_j\Bigr)\mathbf{u}_i$, we get $\alpha_i = \sum_{j=1}^{n} a_{ij}\beta_j$ for each $i$, that is,
$$[\mathbf{x}]_{\mathcal{B}_1} = I[\mathcal{B}_2, \mathcal{B}_1]\, [\mathbf{x}]_{\mathcal{B}_2}.$$
Note: Observe that the identity linear transformation $I : V \to V$ defined by $I(\mathbf{x}) = \mathbf{x}$ for every $\mathbf{x} \in V$ is invertible, with $I^{-1} = I$. Moreover, by Theorem 4.4.1, $I[\mathcal{B}_2, \mathcal{B}_1]\; I[\mathcal{B}_1, \mathcal{B}_2] = (I \circ I)[\mathcal{B}_1, \mathcal{B}_1] = I_n$. Therefore, we also have
$$I[\mathcal{B}_1, \mathcal{B}_2] = \bigl(I[\mathcal{B}_2, \mathcal{B}_1]\bigr)^{-1}.$$
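The computation of $I[\mathcal{B}_2, \mathcal{B}_1]$ and the coordinate-change rule above can be illustrated numerically. The two bases in the sketch below are arbitrary choices made for this illustration; they do not come from the notes.

```python
import numpy as np

# Columns of U are the vectors of B1 = (u1, u2, u3);
# columns of W are the vectors of B2 = (v1, v2, v3).  Both are bases of R^3.
U = np.array([[1.0, 1.0, 1.0],
              [0.0, 1.0, 1.0],
              [0.0, 0.0, 1.0]])
W = np.array([[1.0, 0.0, 1.0],
              [1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0]])

# I[B2, B1]: its j-th column solves  U a_j = v_j,  i.e. it expresses v_j in terms of B1.
I_B2B1 = np.linalg.solve(U, W)
# I[B1, B2] is its inverse, as observed in the Note above.
I_B1B2 = np.linalg.inv(I_B2B1)

# Check the coordinate-change rule  [x]_B1 = I[B2, B1] [x]_B2  on a sample vector.
x = np.array([2.0, -1.0, 3.0])
x_B1 = np.linalg.solve(U, x)
x_B2 = np.linalg.solve(W, x)
assert np.allclose(x_B1, I_B2B1 @ x_B2)
assert np.allclose(x_B2, I_B1B2 @ x_B1)
print("I[B2, B1] =\n", I_B2B1)
```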
Let $V$ be a finite dimensional vector space and let $\mathcal{B}_1$ and $\mathcal{B}_2$ be two ordered bases of $V$. Let $T : V \to V$ be a linear transformation. We are now in a position to relate the two matrices $T[\mathcal{B}_1, \mathcal{B}_1]$ and $T[\mathcal{B}_2, \mathcal{B}_2]$. Also, let $I[\mathcal{B}_1, \mathcal{B}_2]$ and $I[\mathcal{B}_2, \mathcal{B}_1]$ be the matrices of the identity linear transformation with respect to the bases $\mathcal{B}_1$ and $\mathcal{B}_2$. Then (this is the content of Theorem 4.4.6)
$$T[\mathcal{B}_2, \mathcal{B}_2] = I[\mathcal{B}_1, \mathcal{B}_2]\; T[\mathcal{B}_1, \mathcal{B}_1]\; I[\mathcal{B}_2, \mathcal{B}_1].$$
Equivalently, $T[\mathcal{B}_2, \mathcal{B}_2] = \bigl(I[\mathcal{B}_2, \mathcal{B}_1]\bigr)^{-1}\; T[\mathcal{B}_1, \mathcal{B}_1]\; I[\mathcal{B}_2, \mathcal{B}_1]$.

To see this, note that for every $\mathbf{x} \in V$,
$$T[\mathcal{B}_2, \mathcal{B}_2]\,[\mathbf{x}]_{\mathcal{B}_2} = [T(\mathbf{x})]_{\mathcal{B}_2} = I[\mathcal{B}_1, \mathcal{B}_2]\,[T(\mathbf{x})]_{\mathcal{B}_1} = I[\mathcal{B}_1, \mathcal{B}_2]\; T[\mathcal{B}_1, \mathcal{B}_1]\,[\mathbf{x}]_{\mathcal{B}_1} = I[\mathcal{B}_1, \mathcal{B}_2]\; T[\mathcal{B}_1, \mathcal{B}_1]\; I[\mathcal{B}_2, \mathcal{B}_1]\,[\mathbf{x}]_{\mathcal{B}_2}.$$
Since the result is true for all $\mathbf{x} \in V$, we get $T[\mathcal{B}_2, \mathcal{B}_2] = I[\mathcal{B}_1, \mathcal{B}_2]\; T[\mathcal{B}_1, \mathcal{B}_1]\; I[\mathcal{B}_2, \mathcal{B}_1]$.
Another Proof: Let $I[\mathcal{B}_2, \mathcal{B}_1] = [a_{ij}]$, $T[\mathcal{B}_1, \mathcal{B}_1] = [b_{ij}]$ and $T[\mathcal{B}_2, \mathcal{B}_2] = [c_{ij}]$. Then, as above, $\mathbf{v}_j = \sum_{i=1}^{n} a_{ij}\, \mathbf{u}_i$ for $1 \le j \le n$. So, for each $j$, $1 \le j \le n$,
$$T(\mathbf{v}_j) = \sum_{i=1}^{n} a_{ij}\, T(\mathbf{u}_i) = \sum_{i=1}^{n} a_{ij} \sum_{k=1}^{n} b_{ki}\, \mathbf{u}_k = \sum_{k=1}^{n} \Bigl( \sum_{i=1}^{n} b_{ki}\, a_{ij} \Bigr) \mathbf{u}_k.$$
Hence, $[T(\mathbf{v}_j)]_{\mathcal{B}_1}$ is the $j$-th column of the matrix $T[\mathcal{B}_1, \mathcal{B}_1]\; I[\mathcal{B}_2, \mathcal{B}_1]$. Also, for each $j$, $1 \le j \le n$,
$$T(\mathbf{v}_j) = \sum_{i=1}^{n} c_{ij}\, \mathbf{v}_i = \sum_{i=1}^{n} c_{ij} \sum_{k=1}^{n} a_{ki}\, \mathbf{u}_k = \sum_{k=1}^{n} \Bigl( \sum_{i=1}^{n} a_{ki}\, c_{ij} \Bigr) \mathbf{u}_k,$$
so $[T(\mathbf{v}_j)]_{\mathcal{B}_1}$ is also the $j$-th column of the matrix $I[\mathcal{B}_2, \mathcal{B}_1]\; T[\mathcal{B}_2, \mathcal{B}_2]$. This gives us
$$T[\mathcal{B}_1, \mathcal{B}_1]\; I[\mathcal{B}_2, \mathcal{B}_1] = I[\mathcal{B}_2, \mathcal{B}_1]\; T[\mathcal{B}_2, \mathcal{B}_2], \quad\text{that is,}\quad T[\mathcal{B}_2, \mathcal{B}_2] = \bigl(I[\mathcal{B}_2, \mathcal{B}_1]\bigr)^{-1}\; T[\mathcal{B}_1, \mathcal{B}_1]\; I[\mathcal{B}_2, \mathcal{B}_1]. \qquad \blacksquare$$
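Both proofs can be cross-checked numerically. The sketch below (an added illustration, not from the notes) draws random bases $\mathcal{B}_1$, $\mathcal{B}_2$ of $\mathbb{R}^3$ and a random map $T$, and verifies $T[\mathcal{B}_2, \mathcal{B}_2] = I[\mathcal{B}_1, \mathcal{B}_2]\, T[\mathcal{B}_1, \mathcal{B}_1]\, I[\mathcal{B}_2, \mathcal{B}_1]$.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 3

U = rng.normal(size=(n, n))   # columns: ordered basis B1 of R^3 (almost surely)
W = rng.normal(size=(n, n))   # columns: ordered basis B2 of R^3 (almost surely)
A = rng.normal(size=(n, n))   # standard matrix of a linear map T on R^3

T_B1B1 = np.linalg.solve(U, A @ U)    # T[B1, B1]
T_B2B2 = np.linalg.solve(W, A @ W)    # T[B2, B2]
I_B2B1 = np.linalg.solve(U, W)        # I[B2, B1]
I_B1B2 = np.linalg.solve(W, U)        # I[B1, B2] = (I[B2, B1])^{-1}

# Theorem:  T[B2, B2] = I[B1, B2] T[B1, B1] I[B2, B1].
assert np.allclose(T_B2B2, I_B1B2 @ T_B1B1 @ I_B2B1)
print("change-of-basis relation verified")
```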
Let $V$ be a vector space with $\dim V = n$, and let $T : V \to V$ be a linear transformation. Then for each ordered basis $\mathcal{B}$ of $V$, we get an $n \times n$ matrix $T[\mathcal{B}, \mathcal{B}]$. Also, we know that for any vector space we have infinitely many choices for an ordered basis. So, as we change the ordered basis, the matrix of the linear transformation changes. Theorem 4.4.6 tells us that all these matrices are related.
Now, let $A$ and $B$ be two $n \times n$ matrices such that $B = P^{-1} A P$ for some invertible matrix $P$. Recall the linear transformation $T_A : \mathbb{R}^n \to \mathbb{R}^n$ defined by $T_A(\mathbf{x}) = A\mathbf{x}$ for all $\mathbf{x} \in \mathbb{R}^n$. Then we have seen that if the standard basis of $\mathbb{R}^n$ is the ordered basis $\mathcal{B}_1$, then $A = T_A[\mathcal{B}_1, \mathcal{B}_1]$. Since $P$ is an invertible matrix, its columns are linearly independent and hence we can take its columns as an ordered basis $\mathcal{B}_2$. Then note that $B = T_A[\mathcal{B}_2, \mathcal{B}_2]$. The above observations lead to the following remark and the definition.
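The observation above can be verified directly; the sketch below (an added illustration with randomly chosen $A$ and $P$) checks that $P^{-1} A P$ is exactly the matrix of $T_A$ with respect to the ordered basis formed by the columns of $P$.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 3

A = rng.normal(size=(n, n))        # a given n x n matrix
P = rng.normal(size=(n, n))        # an (almost surely) invertible matrix
B = np.linalg.inv(P) @ A @ P       # B = P^{-1} A P, similar to A

# T_A(x) = A x.  With the standard basis B1, T_A[B1, B1] = A.
# Taking the columns of P as an ordered basis B2, the matrix of T_A becomes B.
T_A_B2B2 = np.linalg.solve(P, A @ P)
assert np.allclose(T_A_B2B2, B)
print("B = P^{-1} A P is the matrix of T_A in the basis given by the columns of P")
```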
Remark: For a fixed ordered basis $\mathcal{B}_1$ of $V$, the set $\bigl\{ T[\mathcal{B}, \mathcal{B}] : \mathcal{B} \text{ is an ordered basis of } V \bigr\}$ is the set of all matrices that are similar to the given matrix $T[\mathcal{B}_1, \mathcal{B}_1]$.
Definition (Similar Matrices): Two square matrices $B$ and $C$ of the same order are said to be similar if there exists an invertible matrix $P$ such that $B = P^{-1} C P$ or, equivalently, $P B = C P$.