Ordered Bases

Let $ {\cal B}=\{{\mathbf u}_1, {\mathbf u}_2, \ldots, {\mathbf u}_n \}$ be a basis of a vector space $ V
({\mathbb{F}}).$ As $ {\cal B}$ is a set, there is no ordering among its elements. In this section, we want to impose an order on the vectors in any basis of $ V.$

DEFINITION 3.4.1 (Ordered Basis)   An ordered basis for a vector space $ V ({\mathbb{F}})$ of dimension $ n$ is a basis $ \{{\mathbf u}_1, {\mathbf u}_2, \ldots, {\mathbf u}_n\} $ together with a one-to-one correspondence between the sets $ \{{\mathbf u}_1, {\mathbf u}_2, \ldots, {\mathbf u}_n\} $ and $ \{1,2,3, \ldots, n\}.$

If the ordered basis has $ {\mathbf u}_1$ as the first vector, $ {\mathbf u}_2$ as the second vector and so on, then we denote this ordered basis by

$\displaystyle ({\mathbf u}_1, {\mathbf u}_2, \ldots, {\mathbf u}_n).$

EXAMPLE 3.4.2   Consider $ {\cal P}_2({\mathbb{R}}),$ the vector space of all polynomials of degree less than or equal to $ 2$ with coefficients from $ {\mathbb{R}}.$ The set $ \{1-x, 1+x, x^2\}$ is a basis of $ {\cal P}_2({\mathbb{R}}).$

For any element $ a_0 + a_1 x + a_2 x^2 \in {\cal P}_2({\mathbb{R}}),$ we have

$\displaystyle a_0 + a_1 x + a_2 x^2 = \frac{a_0 - a_1}{2} (1-x) +
\frac{a_0 + a_1}{2} (1+x) + a_2 x^2.$

If $ (1-x, 1+x, x^2)$ is an ordered basis, then $ \displaystyle
\frac{a_0 - a_1}{2}$ is the first component, $ \displaystyle
\frac{a_0 + a_1}{2}$ is the second component, and $ a_2$ is the third component of the vector $ a_0 + a_1 x + a_2 x^2. $

If we take $ (1+x, 1-x, x^2)$ as an ordered basis, then $ \displaystyle
\frac{a_0 + a_1}{2}$ is the first component, $ \displaystyle
\frac{a_0 - a_1}{2}$ is the second component, and $ a_2$ is the third component of the vector $ a_0 + a_1 x + a_2 x^2. $

That is, $ ({\mathbf u}_1, {\mathbf u}_2, \ldots, {\mathbf u}_n),$ $ ({\mathbf u}_2, {\mathbf u}_3, \ldots, {\mathbf u}_n, {\mathbf u}_1),$ and $ ({\mathbf u}_n, {\mathbf u}_1, {\mathbf u}_2, \ldots, {\mathbf u}_{n-1})$ are different ordered bases even though, as sets, they contain the same vectors.
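
The coefficients in Example 3.4.2 arise from solving a linear system: identify $ a_0 + a_1 x + a_2 x^2$ with the column $ (a_0, a_1, a_2)^t,$ take the coefficient columns of $ 1-x, 1+x, x^2$ as a matrix $ B,$ and solve $ B \beta = (a_0, a_1, a_2)^t.$ A minimal numerical sketch, assuming NumPy is available:

    import numpy as np

    # Coefficient columns of 1-x, 1+x and x^2 with respect to (1, x, x^2).
    B = np.array([[ 1.0, 1.0, 0.0],
                  [-1.0, 1.0, 0.0],
                  [ 0.0, 0.0, 1.0]])

    a0, a1, a2 = 3.0, 5.0, 7.0                        # sample polynomial 3 + 5x + 7x^2
    beta = np.linalg.solve(B, np.array([a0, a1, a2]))
    print(beta)                                       # [-1.  4.  7.] = ((a0-a1)/2, (a0+a1)/2, a2)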

DEFINITION 3.4.3 (Coordinates of a Vector)   Let $ {\cal B}= ({\mathbf v}_1, {\mathbf v}_2, \ldots, {\mathbf v}_n)$ be an ordered basis of a vector space $ V ({\mathbb{F}})$ and let $ {\mathbf v}\in V.$ If

$\displaystyle {\mathbf v}= \beta_1 {\mathbf v}_1 + \beta_2 {\mathbf v}_2
+ \cdots + \beta_n {\mathbf v}_n$

then the tuple $ (\beta_1, \beta_2, \ldots, \beta_n)$ is called the coordinates of the vector $ {\mathbf v}$ with respect to the ordered basis $ {\cal B}.$

Mathematically, we denote it by $ [{\mathbf v}]_{{\cal B}} = (\beta_1, \ldots, \beta_n)^t,$ a column vector.
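
When $ V = {\mathbb{R}}^n,$ computing $ [{\mathbf v}]_{{\cal B}}$ amounts to solving the linear system whose coefficient matrix has the basis vectors as its columns. A minimal sketch, assuming NumPy; the helper name coords is ours, purely for illustration:

    import numpy as np

    def coords(basis, v):
        """Return [v]_B for an ordered basis B of R^n, given as a list of vectors.

        Stacking the basis vectors as columns gives an invertible matrix,
        and [v]_B is the unique solution beta of B @ beta = v.
        """
        B = np.column_stack(basis).astype(float)
        return np.linalg.solve(B, np.asarray(v, dtype=float))

    print(coords([(1, 0, 0), (1, 1, 0), (1, 1, 1)], (1, -1, 1)))   # [ 2. -2.  1.]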

Suppose $ {\cal B}_1 = ({\mathbf u}_1, {\mathbf u}_2, \ldots, {\mathbf u}_n)$ and $ {\cal B}_2= ({\mathbf u}_n, {\mathbf u}_1, {\mathbf u}_2,
\ldots, {\mathbf u}_{n-1})$ are two ordered bases of $ V.$ Then for any $ {\mathbf x}\in V$ there exist unique scalars $ \alpha_1, \alpha_2,\dots, \alpha_n$ such that

$\displaystyle {\mathbf x}= \alpha_1 {\mathbf u}_1 + \alpha_2 {\mathbf u}_2 + \cdots + \alpha_n {\mathbf u}_n = \alpha_n {\mathbf u}_n + \alpha_1 {\mathbf u}_1 + \cdots + \alpha_{n-1} {\mathbf u}_{n-1}.$

Therefore,

$\displaystyle [{\mathbf x}]_{{\cal B}_1} = ({\alpha}_1, {\alpha}_2, \ldots, {\alpha}_n)^t \quad {\mbox{ and }} \quad [{\mathbf x}]_{{\cal B}_2} = (\alpha_n, \alpha_1, \alpha_2, \ldots, \alpha_{n-1})^t.$

Note that $ {\mathbf x}$ is uniquely written as $ \sum\limits_{i=1}^n \alpha_i {\mathbf u}_i $ and hence the coordinates with respect to an ordered basis are unique.

Suppose that the ordered basis $ {\cal B}_1$ is changed to the ordered basis $ {\cal B}_3 = ({\mathbf u}_2, {\mathbf u}_1,
{\mathbf u}_3, \ldots, {\mathbf u}_n).$ Then $ [{\mathbf x}]_{{\cal B}_3} = (\alpha_2, \alpha_1,
\alpha_3, \ldots, \alpha_n )^t.$ So, the coordinates of a vector depend on the ordered basis chosen.
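
This dependence on the ordering can also be seen numerically: permuting the basis vectors permutes the entries of the coordinate vector in the same way. A small sketch, assuming NumPy:

    import numpy as np

    x = np.array([4.0, 5.0, 6.0])
    u1, u2, u3 = np.eye(3)                  # the standard basis vectors of R^3

    B1 = np.column_stack([u1, u2, u3])      # ordered basis (u1, u2, u3)
    B3 = np.column_stack([u2, u1, u3])      # ordered basis (u2, u1, u3)

    print(np.linalg.solve(B1, x))           # [4. 5. 6.]
    print(np.linalg.solve(B3, x))           # [5. 4. 6.]  -- first two coordinates swapped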

EXAMPLE 3.4.4   Let $ V = {\mathbb{R}}^3.$ Consider the ordered bases
$ {\cal B}_1=\bigl( (1,0,0), (0,1,0), (0,0,1) \bigr),$ $ {\cal B}_2= \bigl(
(1,0,0), (1,1,0), (1,1,1) \bigr)$ and $ {\cal B}_3 = \bigl( (1,1,1),
(1,1,0), (1,0,0)\bigr)$ of $ V.$ Then, with respect to the above bases we have

\begin{align*}
(1, -1, 1) &= 1 \cdot (1,0,0) + (-1)\cdot (0, 1, 0) + 1 \cdot (0,0,1)\\
&= 2 \cdot (1,0,0) + (-2) \cdot (1,1,0) + 1 \cdot (1,1,1)\\
&= 1 \cdot (1,1,1) + (-2) \cdot (1,1,0) + 2 \cdot (1,0,0).
\end{align*}

Therefore, if we write $ {\mathbf u}= (1,-1,1),$ then

$\displaystyle [{\mathbf u}]_{{\cal B}_1} = (1,-1,1)^t, \; [{\mathbf u}]_{{\cal B}_2} = (2,-2,1)^t,
\; [{\mathbf u}]_{{\cal B}_3} = (1,-2,2)^t.$
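
Each of these coordinate vectors is the solution of a linear system, as the following sketch (assuming NumPy) verifies:

    import numpy as np

    u = np.array([1.0, -1.0, 1.0])
    bases = {"B1": [(1, 0, 0), (0, 1, 0), (0, 0, 1)],
             "B2": [(1, 0, 0), (1, 1, 0), (1, 1, 1)],
             "B3": [(1, 1, 1), (1, 1, 0), (1, 0, 0)]}

    for name, basis in bases.items():
        B = np.column_stack(basis).astype(float)   # basis vectors as columns
        print(name, np.linalg.solve(B, u))         # (1,-1,1)^t, (2,-2,1)^t, (1,-2,2)^t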

In general, let $ V$ be an $ n$ -dimensional vector space with ordered bases $ {\cal B}_1 = ({\mathbf u}_1, {\mathbf u}_2, \ldots, {\mathbf u}_n)$ and $ {\cal B}_2 =
({\mathbf v}_1, {\mathbf v}_2, \ldots, {\mathbf v}_n).$ Since $ {\cal B}_1$ is a basis of $ V,$ there exist unique scalars $ a_{ij}, \; 1 \leq i, j \leq n,$ such that

$\displaystyle {\mathbf v}_i = \sum\limits_{l=1}^n a_{li} {\mathbf u}_l \hspace{0.5in}
{\mbox{for }} 1 \leq i \leq n.$

That is, for each $ i, \; 1 \leq i \leq n, \; [{\mathbf v}_i]_{{\cal B}_1} =
(a_{1i}, a_{2i}, \ldots, a_{ni})^t.$

Let $ {\mathbf v}\in V$ with $ [{\mathbf v}]_{{\cal B}_2} = (\alpha_1, \alpha_2, \ldots, \alpha_n)^t.$ Since $ {\cal B}_2$ is the ordered basis $ ({\mathbf v}_1, {\mathbf v}_2, \ldots, {\mathbf v}_n),$ we have

$\displaystyle {\mathbf v}= \sum_{i=1}^n \alpha_i {\mathbf v}_i = \sum_{i=1}^n \alpha_i \left( \sum_{j=1}^n a_{ji} {\mathbf u}_j \right) = \sum_{j=1}^n \left( \sum_{i=1}^n a_{ji} \alpha_i \right) {\mathbf u}_j.$

Since $ {\cal B}_1$ is a basis, this representation of $ {\mathbf v}$ in terms of the $ {\mathbf u}_i$ 's is unique. So,
\begin{align*}
[{\mathbf v}]_{{\cal B}_1} &= \left( \sum_{i=1}^n a_{1i} \alpha_i, \sum_{i=1}^n a_{2i} \alpha_i, \ldots, \sum_{i=1}^n a_{ni} \alpha_i \right)^t \\
&= \begin{bmatrix} a_{11} & \cdots & a_{1n} \\ a_{21} & \cdots & a_{2n} \\ \vdots & & \vdots \\ a_{n1} & \cdots & a_{nn} \end{bmatrix} \begin{bmatrix} \alpha_1 \\ \alpha_2 \\ \vdots \\ \alpha_n \end{bmatrix} \\
&= A \; [{\mathbf v}]_{{\cal B}_2}.
\end{align*}

Note that the $ i^{th}$ column of the matrix $ A$ equals $ [{\mathbf v}_i]_{{\cal B}_1},$ i.e., the $ i^{th}$ column of $ A$ is the coordinate vector of the $ i^{th}$ vector $ {\mathbf v}_i$ of $ {\cal B}_2$ with respect to the ordered basis $ {\cal B}_1.$ Hence, we have proved the following theorem.

THEOREM 3.4.5   Let $ V$ be an $ n$ -dimensional vector space with ordered bases $ {\cal B}_1 = ({\mathbf u}_1, {\mathbf u}_2, \ldots, {\mathbf u}_n)$ and $ {\cal B}_2 =
({\mathbf v}_1, {\mathbf v}_2, \ldots, {\mathbf v}_n).$ Let

$\displaystyle A = \left[ [{\mathbf v}_1]_{{\cal B}_1}, [{\mathbf v}_2]_{{\cal B}_1},
\ldots, [{\mathbf v}_n]_{{\cal B}_1} \right].$

Then for any $ {\mathbf v}\in V,$

$\displaystyle [{\mathbf v}]_{{\cal B}_1} = A [{\mathbf v}]_{{\cal B}_2}.$
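
When $ V = {\mathbb{R}}^n,$ Theorem 3.4.5 can be checked directly: stacking each ordered basis as the columns of an invertible matrix, $ A$ is the solution of $ B_1 A = B_2.$ A minimal sketch with two concrete ordered bases of $ {\mathbb{R}}^3,$ assuming NumPy:

    import numpy as np

    # Two ordered bases of R^3, stored as matrix columns.
    B1 = np.column_stack([(1, 0, 0), (1, 1, 0), (1, 1, 1)]).astype(float)
    B2 = np.column_stack([(1, 1, 1), (1, 1, 0), (1, 0, 0)]).astype(float)

    # The i-th column of A is [v_i]_{B1}; all columns at once via B1 @ A = B2.
    A = np.linalg.solve(B1, B2)

    v = np.array([1.0, -1.0, 1.0])
    assert np.allclose(np.linalg.solve(B1, v),        # [v]_{B1}
                       A @ np.linalg.solve(B2, v))    # A [v]_{B2}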

EXAMPLE 3.4.6   Consider the two ordered bases $ {\cal B}_1= \bigl((1,0,0), (1,1,0), (1,1,1)\bigr)$ and $ {\cal B}_2 = \bigl((1,1,1), (1,-1,1), (1,1,0)\bigr)$ of $ {\mathbb{R}}^3.$
  1. Then

    $\displaystyle (x,y,z) = (x-y) \cdot (1,0,0) + (y-z) \cdot (1, 1, 0) + z \cdot (1,1,1),$

    so that $ [(x,y,z)]_{{\cal B}_1} = (x-y, y-z, z)^t,$ and

    $\displaystyle (x,y,z) = \left(\frac{y-x}{2} + z\right) \cdot (1,1,1) + \frac{x-y}{2} \cdot (1, -1, 1) + (x-z) \cdot (1,1,0),$

    so that $ [(x,y,z)]_{{\cal B}_2} = \left(\frac{y-x}{2} + z, \frac{x-y}{2}, x-z\right)^t.$

  2. Let $ A = [a_{ij}] = \begin{bmatrix}0 & 2 & 0\\ 0 &
-2 & 1 \\ 1 & 1 & 0 \end{bmatrix}.$ The columns of the matrix $ A$ are obtained by the following rule:

    $\displaystyle [(1,1,1)]_{{\cal B}_1} = 0 \cdot (1,0,0) + 0 \cdot (1,1,0) + 1
\cdot (1,1,1) = (0,0,1)^t,$

    $\displaystyle [(1,-1,1)]_{{\cal B}_1} = 2 \cdot (1,0,0) + (-2) \cdot
(1,1,0) + 1 \cdot (1,1,1) = (2, -2, 1)^t$

    and

    $\displaystyle [(1,1,0)]_{{\cal B}_1} = 0 \cdot (1,0,0) + 1 \cdot (1,1,0) + 0 \cdot (1,1,1)
= (0,1,0)^t.$

    That is, the elements of $ {\cal B}_2 = \bigl((1,1,1), (1,-1,1), (1,1,0)\bigr)$ are expressed in terms of the ordered basis $ {\cal B}_1.$
  3. Note that for any $ (x,y,z)\in {\mathbb{R}}^3,$

    $\displaystyle [(x,y,z)]_{{\cal B}_1} = \begin{bmatrix} x-y \\ y-z \\ z \end{bmatrix} = A \begin{bmatrix} \frac{y-x}{2} + z \\ \frac{x-y}{2} \\ x-z \end{bmatrix} = A \; [(x,y,z)]_{{\cal B}_2}.$

  4. The matrix $ A$ is invertible and hence $ [(x,y,z)]_{{\cal B}_2} = A^{-1}
\;\; [(x,y,z)]_{{\cal B}_1}.$
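
The computations of this example can be verified in a few lines; a minimal sketch, assuming NumPy:

    import numpy as np

    B1 = np.column_stack([(1, 0, 0), (1, 1, 0), (1, 1, 1)]).astype(float)
    B2 = np.column_stack([(1, 1, 1), (1, -1, 1), (1, 1, 0)]).astype(float)

    A = np.linalg.solve(B1, B2)                      # columns are [v_i]_{B1}, as in item 2
    print(A)                                         # [[0. 2. 0.] [0. -2. 1.] [1. 1. 0.]]

    v = np.array([3.0, 1.0, 4.0])                    # any (x, y, z)
    c1 = np.linalg.solve(B1, v)                      # [v]_{B1}
    c2 = np.linalg.solve(B2, v)                      # [v]_{B2}
    assert np.allclose(c1, A @ c2)                   # item 3
    assert np.allclose(c2, np.linalg.solve(A, c1))   # item 4: [v]_{B2} = A^{-1} [v]_{B1}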

In the next chapter, we will revisit Theorem 3.4.5 using the idea of linear transformations.

EXERCISE 3.4.7  
  1. Determine the coordinates of the vectors $ (1,2,1)$ and $ (4, -2, 2)$ with respect to the basis $ {\cal B}= \bigl( (2,1,0), (2,1,1),
(2,2,1) \bigr)$ of $ {\mathbb{R}}^3.$
  2. Consider the vector space $ {\cal P}_3({\mathbb{R}}).$
    1. Show that $ {\cal B}_1= (1 - x, 1 + x^2, 1 - x^3, 3 + x^2 - x^3)$ and $ {\cal B}_2= (1, 1- x, 1 + x^2, 1 - x^3)$ are bases of $ {\cal P}_3({\mathbb{R}}).$
    2. Find the coordinates of the vector $ {\mathbf u}=1 + x + x^2 + x^3$ with respect to the ordered bases $ {\cal B}_1$ and $ {\cal B}_2.$
    3. Find the matrix $ A$ such that $ [{\mathbf u}]_{{\cal B}_2} = A [{\mathbf u}]_{{\cal B}_1}.$
    4. Let $ {\mathbf v}= a_0 + a_1 x + a_2 x^2 + a_3 x^3.$ Then verify the following:
      \begin{align*}
      [{\mathbf v}]_{{\cal B}_1} &= \begin{bmatrix} -a_1 \\ -a_0 - a_1 + 2 a_2 - a_3 \\ -a_0 - a_1 + a_2 - 2 a_3 \\ a_0 + a_1 - a_2 + a_3 \end{bmatrix}
      = \begin{bmatrix} 0 & 1 & 0 & 0 \\ -1 & 0 & 1 & 0 \\ -1 & 0 & 0 & 1 \\ 1 & 0 & 0 & 0 \end{bmatrix}
      \begin{bmatrix} a_0 + a_1 - a_2 + a_3 \\ -a_1 \\ a_2 \\ -a_3 \end{bmatrix} \\
      &= \begin{bmatrix} 0 & 1 & 0 & 0 \\ -1 & 0 & 1 & 0 \\ -1 & 0 & 0 & 1 \\ 1 & 0 & 0 & 0 \end{bmatrix} [{\mathbf v}]_{{\cal B}_2}.
      \end{align*}

A K Lal 2007-09-12