
3. Induced Norms:

In many problems we shall be concerned at the same time with norms of vectors and norms of matrices, and it would be unwise to use completely unrelated norms for the two. It turns out to be convenient to have a matrix norm 'induced' by the vector norm. Thus we have:
If $ n(x)$ is a vector norm satisfying the vector norm axioms, then for any matrix $A$ the quantity

$\displaystyle m_n(A)=m(A)=\sup\limits_{x\neq 0}\frac{n(Ax)}{n(x)},$

where the supremum is taken over all non-zero vectors $x$, satisfies the matrix norm axioms and is called the norm induced by $n(x)$.
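The supremum can also be explored numerically. Below is a minimal sketch, assuming NumPy; the matrix and the random-sampling scheme are illustrative choices, not part of the text. Sampling random non-zero vectors gives a lower bound on $m(A)$ that approaches the supremum as the number of samples grows.

    import numpy as np

    # Estimate m(A) = sup_{x != 0} n(Ax)/n(x) by sampling random
    # non-zero vectors; the sampled maximum is a lower bound that
    # approaches the supremum as the number of samples grows.
    def induced_norm_estimate(A, vector_norm, samples=50000, seed=0):
        rng = np.random.default_rng(seed)
        best = 0.0
        for _ in range(samples):
            x = rng.standard_normal(A.shape[1])
            best = max(best, vector_norm(A @ x) / vector_norm(x))
        return best

    A = np.array([[1.0, -2.0],
                  [3.0,  4.0]])
    print(induced_norm_estimate(A, lambda v: np.abs(v).sum()))  # close to 6.0
    print(np.linalg.norm(A, 1))  # exact induced 1-norm: max column sum = 6.0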
Since $Ix=x$ for every non-zero $x$, it is clear that, no matter what $ n(x)$ is, we have

$\displaystyle m_n(I)=1$

It is not too difficult to determine the matrix norms induced by our three basic vector norms. These are given below:
$\vert\vert x\vert\vert_1=\sum\limits_{i=1}^n\vert x_i\vert$ induces $\vert\vert A\vert\vert_1=\max\limits_{1\leq j \leq n} \sum\limits_{i=1}^n\vert a_{ij}\vert$ (maximum absolute column sum)

$\vert\vert x\vert\vert_2=\left[\sum\limits_{i=1}^n\vert x_i\vert^2\right]^{1/2}$ induces $\vert\vert A\vert\vert_2=[\mbox{dominant eigenvalue of } A'A]^{1/2}$

$\vert\vert x\vert\vert_{\infty}=\max\limits_{1\leq i \leq n}\vert x_i\vert$ induces $\vert\vert A\vert\vert_{\infty}=\max\limits_{1\leq i \leq n}\sum\limits_{j=1}^n\vert a_{ij}\vert$ (maximum absolute row sum)
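Each formula can be checked directly. The sketch below, again assuming NumPy and an arbitrary test matrix, computes the three induced norms from the formulas above and compares them with NumPy's built-in induced matrix norms.

    import numpy as np

    A = np.array([[1.0, -2.0],
                  [3.0,  4.0]])

    # Formulas from the table above.
    one_norm = np.abs(A).sum(axis=0).max()                 # maximum absolute column sum
    inf_norm = np.abs(A).sum(axis=1).max()                 # maximum absolute row sum
    two_norm = np.sqrt(np.linalg.eigvalsh(A.T @ A).max())  # [dominant eigenvalue of A'A]^(1/2)

    # NumPy's matrix norms are these same induced norms.
    assert np.isclose(one_norm, np.linalg.norm(A, 1))
    assert np.isclose(two_norm, np.linalg.norm(A, 2))
    assert np.isclose(inf_norm, np.linalg.norm(A, np.inf))
    print(one_norm, two_norm, inf_norm)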

Here we give the proof of the first of the above results.
Using the vector norm

$\displaystyle \vert\vert x\vert\vert_1=\sum\limits_{j=1}^n\vert x_j\vert,$

we have

$\displaystyle \vert\vert Ax\vert\vert_1=\sum\limits_{i=1}^n\left\vert\sum\limits_{j=1}^n a_{ij}x_j\right\vert\leq \sum\limits_{i=1}^n\sum\limits_{j=1}^n\vert a_{ij}\vert\vert x_j\vert$

Changing the order of summation, we have

$\displaystyle \vert\vert Ax\vert\vert _1 \leq \sum\limits_{j=1}^n\vert x_j\vert\sum\limits_{i=1}^n\vert a_{ij}\vert$

Let

$\displaystyle C= \max\limits_{1\leq j \leq n}\sum\limits_{i=1}^n\vert a_{ij}\vert. \qquad (1)$

Then

$\displaystyle \vert\vert Ax\vert\vert_1\leq C\sum\limits_{j=1}^n\vert x_j\vert = C\vert\vert x\vert\vert_1,$

and thus $ \vert\vert A\vert\vert_1 \leq C$.
To show that this is an equality, we exhibit an $x$ for which

$\displaystyle \frac{\vert\vert Ax\vert\vert _1}{\vert\vert x\vert\vert _1}=C$

Let $k$ be the column index at which the maximum in (1) is attained, and let $x=e^{k}$, the $k^{\rm th}$ unit vector. Then $\vert\vert x\vert\vert_1=1$ and

$\displaystyle \vert\vert Ax\vert\vert _1=\sum\limits_{i=1}^n\left\vert \sum\limits_{j=1}^n a_{ij} x_j \right\vert =\sum\limits_{i=1}^n \vert a_{ik}\vert=C$

This proves that for the vector norm $ \vert\vert x\vert\vert_1$ the induced matrix norm is

$\displaystyle \vert\vert A\vert\vert _1=\max\limits_{1\leq j \leq n}\sum\limits_{i=1}^n\vert a_{ij}\vert$
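The extremal vector used in the proof is easy to verify numerically. In this sketch (assuming NumPy; the matrix is an arbitrary example), the unit vector $e^{k}$ for the column of largest absolute sum attains the ratio $C$ exactly.

    import numpy as np

    A = np.array([[1.0, -2.0],
                  [3.0,  4.0]])
    col_sums = np.abs(A).sum(axis=0)   # the column sums appearing in (1)
    k = col_sums.argmax()              # column index attaining the maximum C
    x = np.zeros(A.shape[1])
    x[k] = 1.0                         # x = e^k, so ||x||_1 = 1
    print(np.abs(A @ x).sum())         # ||A e^k||_1 = sum_i |a_ik|
    print(col_sums[k])                 # equals C: the two printed values agree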

