Let $V$ be a finite dimensional inner product space. Suppose $\{u_1, u_2, \ldots, u_n\}$ is a linearly independent subset of $V$. Then the Gram-Schmidt orthogonalisation process uses the vectors $u_1, u_2, \ldots, u_n$ to construct new vectors $v_1, v_2, \ldots, v_n$ such that $\langle v_i, v_j \rangle = 0$ for $i \neq j$, $\|v_i\| = 1$ for $1 \le i \le n$, and $L(v_1, \ldots, v_i) = L(u_1, \ldots, u_i)$ for $1 \le i \le n$. This process proceeds with the following idea.
Suppose we are given two vectors $u$ and $v$ in a plane. If we want to get vectors $z$ and $y$ such that $z$ is a unit vector in the direction of $u$ and $y$ is a unit vector perpendicular to $z$, then they can be obtained in the following way:
Take the first vector $z = \frac{u}{\|u\|}$. Let $\theta$ be the angle between the vectors $u$ and $v$. Then $\cos\theta = \frac{\langle u, v \rangle}{\|u\|\,\|v\|}$. Define $\alpha = \|v\|\cos\theta = \frac{\langle u, v \rangle}{\|u\|} = \langle z, v \rangle$, and set $w = v - \alpha z$. Then $w$ is a vector perpendicular to the unit vector $z$, as we have removed the component of $z$ from $v$. So, the vectors that we are interested in are $z$ and $y = \frac{w}{\|w\|}$.
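As a quick numerical check of this two-vector construction, here is a minimal sketch (the function name `orthogonalise_pair` and the sample vectors are our own choices, not from the text):

```python
import numpy as np

def orthogonalise_pair(u, v):
    """Given linearly independent u and v, return (z, y): z is the unit
    vector along u, and y is the unit vector perpendicular to z obtained
    by removing the component of z from v."""
    z = u / np.linalg.norm(u)
    alpha = np.dot(z, v)            # alpha = <z, v> = ||v|| cos(theta)
    w = v - alpha * z               # remove the component of z from v
    return z, w / np.linalg.norm(w)

z, y = orthogonalise_pair(np.array([3.0, 1.0]), np.array([2.0, 2.0]))
print(np.dot(z, y))                 # ~0, so z and y are perpendicular
```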
This idea is used to give the Gram-Schmidt Orthogonalisation process, which we now describe.

Theorem 5.2.1 (Gram-Schmidt Orthogonalisation Process). Let $V$ be an inner product space. Suppose $\{u_1, u_2, \ldots, u_n\}$ is a set of linearly independent vectors of $V$. Then there exists a set $\{v_1, v_2, \ldots, v_n\}$ of vectors of $V$ satisfying the following:
1. $\|v_i\| = 1$ for $1 \le i \le n$;
2. $\langle v_i, v_j \rangle = 0$ for $1 \le i \ne j \le n$; and
3. $L(v_1, \ldots, v_i) = L(u_1, \ldots, u_i)$ for $1 \le i \le n$.
Proof. The proof is by induction on $n$. For $n = 1$, we define $v_1 = \frac{u_1}{\|u_1\|}$. Since $u_1 \neq 0$, we have $\|v_1\| = 1$ and $L(v_1) = L(u_1)$. Hence, the result holds for $n = 1$.
Let the result hold for all $k \le n-1$. That is, suppose we are given any set of $k$, $1 \le k \le n-1$, linearly independent vectors $\{u_1, u_2, \ldots, u_k\}$ of $V$. Then by the inductive assumption, there exists a set $\{v_1, v_2, \ldots, v_k\}$ of vectors satisfying the following:
1. $\|v_i\| = 1$ for $1 \le i \le k$;
2. $\langle v_i, v_j \rangle = 0$ for $1 \le i \ne j \le k$; and
3. $L(v_1, \ldots, v_i) = L(u_1, \ldots, u_i)$ for $1 \le i \le k$.
Now, let us assume that we are given a set of $n$ linearly independent vectors $\{u_1, u_2, \ldots, u_n\}$ of $V$. Then by the inductive assumption, we already have vectors $v_1, v_2, \ldots, v_{n-1}$ satisfying the three conditions above for $1 \le i \le n-1$. Define
$$w_n = u_n - \langle u_n, v_1 \rangle v_1 - \langle u_n, v_2 \rangle v_2 - \cdots - \langle u_n, v_{n-1} \rangle v_{n-1}. \tag{5.2.2}$$
We first show that $w_n \notin L(v_1, \ldots, v_{n-1})$; in particular, $w_n \neq 0$.
On the contrary, assume that $w_n \in L(v_1, \ldots, v_{n-1})$. Then there exist scalars $\alpha_1, \alpha_2, \ldots, \alpha_{n-1}$ such that
$$w_n = \alpha_1 v_1 + \alpha_2 v_2 + \cdots + \alpha_{n-1} v_{n-1}.$$
So, by (5.2.2),
$$u_n = (\alpha_1 + \langle u_n, v_1 \rangle) v_1 + (\alpha_2 + \langle u_n, v_2 \rangle) v_2 + \cdots + (\alpha_{n-1} + \langle u_n, v_{n-1} \rangle) v_{n-1} \in L(v_1, \ldots, v_{n-1}).$$
Thus, by the third induction assumption, $u_n \in L(u_1, \ldots, u_{n-1})$. This contradicts the given assumption that the set of vectors $\{u_1, u_2, \ldots, u_n\}$ is linearly independent.
So, $w_n \neq 0$. Define $v_n = \frac{w_n}{\|w_n\|}$. Then $\|v_n\| = 1$. Also, it can easily be verified that $\langle v_n, v_i \rangle = 0$ for $1 \le i \le n-1$. Hence, by the principle of mathematical induction, the proof of the theorem is complete. $\blacksquare$
We illustrate the Gram-Schmidt process by the following example.
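Since the proof is constructive, it translates directly into an algorithm. The following is a minimal sketch of such a computation (the function name `gram_schmidt` and the sample vectors are our own, not necessarily the text's original example):

```python
import numpy as np

def gram_schmidt(vectors):
    """Gram-Schmidt process of Theorem 5.2.1: given linearly independent
    u_1, ..., u_n, return orthonormal v_1, ..., v_n with
    L(v_1, ..., v_i) = L(u_1, ..., u_i) for each i."""
    vs = []
    for u in vectors:
        # w_n = u_n - sum_i <u_n, v_i> v_i, as in (5.2.2)
        w = u - sum(np.dot(u, v) * v for v in vs)
        vs.append(w / np.linalg.norm(w))    # v_n = w_n / ||w_n||
    return vs

v1, v2, v3 = gram_schmidt([np.array([1.0, 0.0, 1.0]),
                           np.array([1.0, 1.0, 0.0]),
                           np.array([0.0, 1.0, 1.0])])
print(np.dot(v1, v2), np.dot(v1, v3), np.dot(v2, v3))   # all ~0
```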
Next, suppose the set of given vectors is linearly dependent, and let $i$ be the smallest index for which $u_i \in L(u_1, u_2, \ldots, u_{i-1})$. We claim that in this case, $w_i = 0$. Since we have chosen the smallest $i$ satisfying $u_i \in L(u_1, \ldots, u_{i-1})$, the set $\{u_1, u_2, \ldots, u_{i-1}\}$ is linearly independent (use Corollary 3.2.5). So, by Theorem 5.2.1, there exists an orthonormal set $\{v_1, v_2, \ldots, v_{i-1}\}$ such that $L(v_1, \ldots, v_{i-1}) = L(u_1, \ldots, u_{i-1})$. As $u_i \in L(v_1, \ldots, v_{i-1})$, by Remark 5.1.15,
$$u_i = \langle u_i, v_1 \rangle v_1 + \langle u_i, v_2 \rangle v_2 + \cdots + \langle u_i, v_{i-1} \rangle v_{i-1}.$$
So, by the definition of $w_i$, $w_i = 0$.
Therefore, in this case, we can continue with the Gram-Schmidt process by replacing $u_i$ by $u_{i+1}$.
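A sketch of this modified process (again our own illustration; the tolerance `tol` stands in for the exact test $w_i = 0$ in floating-point arithmetic):

```python
import numpy as np

def gram_schmidt_dependent(vectors, tol=1e-12):
    """Gram-Schmidt for a possibly linearly dependent list of vectors:
    whenever w_i = 0, i.e. u_i already lies in the span of the earlier
    vectors, skip u_i and continue with u_{i+1}."""
    vs = []
    for u in vectors:
        w = u - sum(np.dot(u, v) * v for v in vs)
        if np.linalg.norm(w) > tol:          # w_i != 0: a new direction
            vs.append(w / np.linalg.norm(w))
    return vs

# Any three vectors in the plane are linearly dependent, so one is skipped:
vs = gram_schmidt_dependent([np.array([1.0, 0.0]),
                             np.array([2.0, 1.0]),
                             np.array([3.0, 1.0])])
print(len(vs))                               # 2
```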
Let $\{v_1, v_2, \ldots, v_n\}$ be an orthonormal set in $\mathbb{R}^n$. Then, writing each $v_i$ as a column vector in the standard ordered basis, we have that
$$A = [v_1, v_2, \ldots, v_n]$$
is an $n \times n$ matrix. Also, observe that the conditions $\|v_i\| = 1$ and $\langle v_i, v_j \rangle = 0$ for $1 \le i \ne j \le n$ imply that $A^t A = A A^t = I_n$.
The readers may have noticed that the inverse of $A$ is its transpose. Such matrices are called orthogonal matrices, and they have a special role to play.
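For instance, a rotation of the plane is orthogonal, and its inverse is indeed its transpose; a quick numerical check (our own illustration):

```python
import numpy as np

theta = 0.7
A = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])   # a rotation of the plane
print(np.allclose(A.T @ A, np.eye(2)))            # True: A^t A = I
print(np.allclose(np.linalg.inv(A), A.T))         # True: inverse = transpose
```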
It is worthwhile to solve the following exercises.
Prove that the columns of $B$ form an orthonormal set. Hence deduce that $B$ is an orthogonal matrix.
We now state and prove the QR decomposition theorem: let $A$ be a square matrix of order $n$. Then there exist matrices $Q$ and $R$ such that $Q$ is orthogonal and $R$ is upper triangular with $A = QR$. In case $A$ is non-singular, the diagonal entries of $R$ can be chosen to be positive. Also, in this case, the decomposition is unique.
Proof. Let the columns of $A$ be $x_1, x_2, \ldots, x_n$. The Gram-Schmidt orthogonalisation process applied to the vectors $x_1, x_2, \ldots, x_n$ gives orthonormal vectors $u_1, u_2, \ldots, u_n$ satisfying
$$L(u_1, \ldots, u_i) = L(x_1, \ldots, x_i) \quad \text{for } 1 \le i \le n. \tag{5.2.5}$$
By using (5.2.5), we get
$$x_i = \langle x_i, u_1 \rangle u_1 + \langle x_i, u_2 \rangle u_2 + \cdots + \langle x_i, u_i \rangle u_i \quad \text{for } 1 \le i \le n.$$
Hence, $A = QR$, where $Q = [u_1, u_2, \ldots, u_n]$ is an orthogonal matrix and $R$ is the upper triangular matrix whose $(j, i)$-th entry is $\langle x_i, u_j \rangle$ for $j \le i$.
The proof does not guarantee that $\langle x_i, u_i \rangle$ is positive for $1 \le i \le n$. But this can be achieved by replacing the vector $u_i$ by $-u_i$ whenever $\langle x_i, u_i \rangle$ is negative.
Uniqueness: suppose $A = Q_1 R_1 = Q_2 R_2$. Then $Q_2^{-1} Q_1 = R_2 R_1^{-1}$. Observe the following properties of upper triangular matrices: the inverse of an invertible upper triangular matrix is upper triangular, and the product of two upper triangular matrices is upper triangular. Thus $R_2 R_1^{-1}$ is upper triangular with positive diagonal entries, and it equals the orthogonal matrix $Q_2^{-1} Q_1$. But the only orthogonal matrix that is upper triangular with positive diagonal entries is the identity. Hence $R_1 = R_2$, and therefore $Q_1 = Q_2$. $\blacksquare$
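The construction in the proof can be carried out directly. Below is a minimal sketch (the function name `qr_gram_schmidt` and the test matrix are our own):

```python
import numpy as np

def qr_gram_schmidt(A):
    """QR decomposition of a non-singular square matrix A via the
    Gram-Schmidt process applied to its columns x_1, ..., x_n:
    Q = [u_1, ..., u_n] and r_{ji} = <x_i, u_j> for j <= i."""
    n = A.shape[1]
    Q = np.zeros_like(A, dtype=float)
    R = np.zeros((n, n))
    for i in range(n):
        x = A[:, i]
        w = x - Q[:, :i] @ (Q[:, :i].T @ x)   # remove components along u_1..u_{i-1}
        u = w / np.linalg.norm(w)
        # With this formula <x_i, u_i> = ||w_i|| > 0, so the diagonal of R is
        # automatically positive; the proof's sign fix (u_i -> -u_i) is not needed.
        Q[:, i] = u
        R[:i + 1, i] = Q[:, :i + 1].T @ x     # entries <x_i, u_j> for j <= i
    return Q, R

A = np.array([[2.0, 1.0], [1.0, 3.0]])
Q, R = qr_gram_schmidt(A)
print(np.allclose(A, Q @ R))                  # True: A = QR
```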
Suppose now that $A$ is a matrix of dimension $m \times n$ with $\operatorname{rank}(A) = r$. Then by Remark 5.2.3.2, the application of the Gram-Schmidt orthogonalisation process to the columns $x_1, x_2, \ldots, x_n$ of $A$ yields a set of $r$ orthonormal vectors $u_1, u_2, \ldots, u_r$ of $\mathbb{R}^m$. In this case, for each $i$, $1 \le i \le r$, we have $L(u_1, \ldots, u_i) = L(x_1, \ldots, x_j)$ for some $j$ with $i \le j \le n$. Hence, proceeding on the lines of the above theorem, we have the following result: if $A$ is an $m \times n$ matrix of rank $r$, then $A = QR$, where $Q$ is an $m \times r$ matrix with orthonormal columns (so that $Q^t Q = I_r$) and $R$ is an $r \times n$ matrix of rank $r$.
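Numerically, a decomposition of this reduced shape is what `np.linalg.qr` computes for a full-column-rank rectangular matrix; a quick illustration with an assumed $3 \times 2$ example:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [0.0, 1.0],
              [1.0, 0.0]])                 # 3 x 2, rank 2
Q, R = np.linalg.qr(A, mode='reduced')     # Q: 3 x 2, R: 2 x 2 upper triangular
print(np.allclose(Q.T @ Q, np.eye(2)))     # True: columns of Q are orthonormal
print(np.allclose(A, Q @ R))               # True: A = QR
```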
We now compute the QR decomposition of a given matrix $A$. If we denote by $u_1, u_2, \ldots$ the orthonormal vectors obtained by applying the Gram-Schmidt process to the columns of $A$, then $Q = [u_1, u_2, \ldots]$ and $R$ is the corresponding upper triangular matrix of inner products $\langle x_i, u_j \rangle$. The readers are advised to check that $A = QR$ is indeed correct.
As a second illustration, let $A$ be a matrix whose columns $x_1, x_2, \ldots$ are linearly dependent. Define $u_1 = \frac{x_1}{\|x_1\|}$ and proceed with the Gram-Schmidt process. Whenever some $w_i$ equals $0$, the corresponding column is dropped and we again take the next column in its place, as in Remark 5.2.3.2. Hence, we obtain $A = QR$ with $Q$ having orthonormal columns and $R$ of the reduced shape described above. The readers are advised to check the computations in each case.