
3. Induced Norms:

In many problems we shall be concerned at the same time with norms of vectors and norms of matrices, and it would be unwise to use completely unrelated norms for the two. It turns out to be convenient to have a matrix norm induced by the vector norm. Thus we have:
If $ n(x)$ is a vector norm satisfying the vector norm axioms, then for any matrix A

$\displaystyle m_n(A)=\sup_{x\neq 0}\frac{n(Ax)}{n(x)},$

where the supremum is over all non-zero vectors x, satisfies the matrix norm axioms and is called the norm induced by $ n(x)$.
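As a quick numerical illustration of the definition, the sketch below samples the ratio $ \vert\vert Ax\vert\vert _1/\vert\vert x\vert\vert _1$ over random non-zero vectors; the sampled ratios approach, but never exceed, the supremum. This is a minimal sketch in plain Python; the matrix A and the sample count are illustrative choices, not from the text.

```python
import random

def norm1(v):
    # vector 1-norm: sum of absolute values
    return sum(abs(t) for t in v)

def matvec(A, x):
    # matrix-vector product Ax
    return [sum(a * t for a, t in zip(row, x)) for row in A]

# example matrix (an assumption for illustration)
A = [[1.0, -2.0],
     [3.0,  4.0]]

random.seed(0)
best = 0.0
for _ in range(10000):
    x = [random.uniform(-1.0, 1.0) for _ in range(2)]
    if norm1(x) == 0.0:
        continue  # the supremum is over non-zero vectors only
    best = max(best, norm1(matvec(A, x)) / norm1(x))

# for this A the induced 1-norm is 6 (the maximum absolute column sum),
# so every sampled ratio stays at or below 6
print(best)
```

The supremum is attained only in the limit (here, as $ x$ approaches a multiple of the second unit vector), which is why sampling approaches 6 from below.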
It is clear that, no matter what $ n(x)$ is, we have

$\displaystyle m_n(I)=1,$

since $ n(Ix)=n(x)$ for every non-zero vector $ x$.

It is not too difficult to determine the matrix norms induced by our three basic vector norms. These are given below:
Vector Norm     Induced Matrix Norm
$ \vert\vert x\vert\vert _1=\sum\limits^n_{i=1}\vert x_i\vert$     $ \vert\vert A\vert\vert _1=\max\limits_{1\leq j\leq n}\sum\limits_{i=1}^n\vert a_{ij}\vert$
$ \vert\vert x\vert\vert _2=\left[\sum\limits_{i=1}^n\vert x_i\vert^2\right]^{1/2}$     $ \vert\vert A\vert\vert _2=[$dominant eigenvalue of $ A'A]^{1/2}$
$ \vert\vert x\vert\vert _{\infty}=\max\limits_{1\leq i \leq n}\vert x_i\vert$     $ \vert\vert A\vert\vert _{\infty}=\max\limits_{1\leq i\leq n}\sum\limits_{j=1}^n\vert a_{ij}\vert$
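The first and third of these norms are easy to evaluate directly: $ \vert\vert A\vert\vert _1$ is the maximum absolute column sum and $ \vert\vert A\vert\vert _{\infty}$ the maximum absolute row sum. A minimal sketch in plain Python, where the matrix A is an example of our own choosing:

```python
def induced_norm_1(A):
    # ||A||_1 = max over columns j of sum_i |a_ij|
    n_cols = len(A[0])
    return max(sum(abs(row[j]) for row in A) for j in range(n_cols))

def induced_norm_inf(A):
    # ||A||_inf = max over rows i of sum_j |a_ij|
    return max(sum(abs(a) for a in row) for row in A)

# example matrix (an assumption for illustration)
A = [[1.0, -2.0],
     [3.0,  4.0]]

print(induced_norm_1(A))    # column sums are 4 and 6  -> 6.0
print(induced_norm_inf(A))  # row sums are 3 and 7     -> 7.0
```

The 2-norm has no such simple formula, since it requires the dominant eigenvalue of $ A'A$.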

Here we give the proof of the first of the above results. Using the vector norm

$\displaystyle \vert\vert x\vert\vert _1=\sum\limits_{j=1}^n\vert x_j\vert,$

we have

$\displaystyle \vert\vert Ax\vert\vert _1=\sum\limits_{i=1}^n\Big\vert\sum\limits_{j=1}^n a_{ij}x_j\Big\vert \leq \sum\limits_{i=1}^n\sum\limits_{j=1}^n\vert a_{ij}\vert\,\vert x_j\vert$

Changing the order of summation, we have

$\displaystyle \vert\vert Ax\vert\vert _1 \leq \sum\limits_{j=1}^n\vert x_j\vert\sum\limits_{i=1}^n\vert a_{ij}\vert$

Let

$\displaystyle C=\max\limits_{1\leq j\leq n}\sum\limits_{i=1}^n\vert a_{ij}\vert.$    ..........(1)

Then

$\displaystyle \vert\vert Ax\vert\vert _1 \leq C\sum\limits_{j=1}^n\vert x_j\vert = C\,\vert\vert x\vert\vert _1,$

and thus $ \vert\vert A\vert\vert _1 \leq C$.
To show that this bound is actually attained, we exhibit an x for which

$\displaystyle \frac{\vert\vert Ax\vert\vert _1}{\vert\vert x\vert\vert _1}=C$

Let k be the column index for which the maximum in (1) is attained, and let $ x=e^{k}$, the $ k^{th}$ unit vector. Then $ \vert\vert x\vert\vert _1=1$ and

$\displaystyle \vert\vert Ax\vert\vert _1=\sum\limits_{i=1}^n\vert a_{ik}\vert = C.$

This proves that for the vector norm $ \vert\vert x\vert\vert _1$, the induced matrix norm is

$\displaystyle \vert\vert A\vert\vert _1=\max\limits_{1\leq j\leq n}\sum\limits_{i=1}^n\vert a_{ij}\vert.$
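The equality step of the proof can also be checked numerically: choosing the unit vector $ e^{k}$ for the column with the largest absolute sum makes $ \vert\vert Ae^{k}\vert\vert _1$ equal to C. A minimal sketch in plain Python, where the matrix A is an example of our own choosing:

```python
# example matrix (an assumption for illustration)
A = [[1.0, -2.0],
     [3.0,  4.0]]

n = len(A[0])
col_sums = [sum(abs(row[j]) for row in A) for j in range(n)]
C = max(col_sums)      # the bound from (1): maximum absolute column sum
k = col_sums.index(C)  # column index attaining the maximum

e_k = [1.0 if j == k else 0.0 for j in range(n)]  # k-th unit vector
Ax = [sum(a * t for a, t in zip(row, e_k)) for row in A]

# ||A e^k||_1 reproduces the k-th column sum, so the bound C is attained
print(sum(abs(t) for t in Ax), C)
```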
