Linear Independence

DEFINITION 3.2.1 (Linear Independence and Dependence)   Let $ S= \{{\mathbf u}_1, {\mathbf u}_2, \ldots, {\mathbf u}_m \}$ be a non-empty subset of $ V.$ If there exist scalars $ \alpha_1, \alpha_2, \ldots, \alpha_m,$ not all zero, such that

$\displaystyle \alpha_1 {\mathbf u}_1 + \alpha_2 {\mathbf u}_2 + \cdots + \alpha_m {\mathbf u}_m = {\mathbf 0},$

then the set $ S$ is called a linearly dependent set. Otherwise, the set $ S$ is called linearly independent.

EXAMPLE 3.2.2  
  1. Let $ S = \{(1,2,1), (2,1,4), (3,3,5) \}.$ Check that $ 1 (1,2,1) + 1 (2,1,4) + (-1) (3,3,5) = (0,0,0).$ Since $ \alpha_1 = 1, \alpha_2 = 1$ and $ \alpha_3 = -1$ is a non-zero solution of (3.2.1), the set $ S$ is a linearly dependent subset of $ {\mathbb{R}}^3.$
  2. Let $ S= \{(1,1,1), (1,1,0), (1,0,1) \}.$ Suppose there exist $ \alpha, \beta, \gamma \in {\mathbb{R}}$ such that $ \alpha (1,1,1) + \beta (1,1,0) + \gamma (1,0,1) = (0,0,0).$ Check that in this case we necessarily have $ \alpha = \beta = \gamma = 0,$ which shows that the set $ S$ is a linearly independent subset of $ {\mathbb{R}}^3.$
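The dependence relation in part 1 can be verified directly. A minimal sketch, assuming numpy is available for the arithmetic:

```python
# Verifying the combination in Example 3.2.2(1) numerically
# (numpy is an assumption; any exact arithmetic would do).
import numpy as np

u1, u2, u3 = np.array((1, 2, 1)), np.array((2, 1, 4)), np.array((3, 3, 5))

# alpha_1 = 1, alpha_2 = 1, alpha_3 = -1 produces the zero vector:
relation = 1 * u1 + 1 * u2 + (-1) * u3
print(relation)  # [0 0 0]
```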

In other words, if $ S= \{{\mathbf u}_1, {\mathbf u}_2, \ldots, {\mathbf u}_m \}$ is a non-empty subset of a vector space $ V,$ then to check whether the set $ S$ is linearly dependent or independent, one needs to consider the equation

$\displaystyle \alpha_1 {\mathbf u}_1 + \alpha_2 {\mathbf u}_2 + \cdots + \alpha_m {\mathbf u}_m = {\mathbf 0}.$ (3.2.1)

If $ \alpha_1 = \alpha_2 = \cdots = \alpha_m = 0$ is THE ONLY SOLUTION of (3.2.1), the set $ S$ is a linearly independent subset of $ V;$ otherwise, the set $ S$ is a linearly dependent subset of $ V.$
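Equation (3.2.1) is a homogeneous linear system in the unknowns $ \alpha_1, \ldots, \alpha_m,$ so the check can be automated: the trivial solution is the only one exactly when the matrix having the $ {\mathbf u}_i$ as columns has full column rank. A sketch assuming numpy; the helper name `is_independent` is ours, not from the text:

```python
# A hypothetical helper: decide linear (in)dependence by testing
# whether A @ alpha = 0 (equation (3.2.1)) has only the trivial
# solution, i.e. whether A has full column rank.  Assumes numpy.
import numpy as np

def is_independent(vectors, tol=1e-10):
    """True iff alpha = 0 is the only solution of (3.2.1)."""
    A = np.column_stack([np.asarray(v, dtype=float) for v in vectors])
    return np.linalg.matrix_rank(A, tol=tol) == len(vectors)

print(is_independent([(1, 2, 1), (2, 1, 4), (3, 3, 5)]))  # False
print(is_independent([(1, 1, 1), (1, 1, 0), (1, 0, 1)]))  # True
```

Both sets of Example 3.2.2 are used here as a sanity check.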

PROPOSITION 3.2.3   Let $ V$ be a vector space.
  1. The zero vector cannot belong to a linearly independent set.
  2. If $ S$ is a linearly independent subset of $ V,$ then every subset of $ S$ is also linearly independent.
  3. If $ S$ is a linearly dependent subset of $ V$ then every set containing $ S$ is also linearly dependent.

Proof. We prove the first part; the proofs of the other parts are left to the reader.
Let $ S = \{{\mathbf u}_1 = {\mathbf 0}, {\mathbf u}_2, \ldots, {\mathbf u}_n\}$ be a set containing the zero vector. Then for any $ \gamma \neq 0,$ $ \gamma {\mathbf u}_1 + 0\, {\mathbf u}_2 + \cdots + 0\, {\mathbf u}_n = {\mathbf 0}.$ Hence, the system $ \alpha_1 {\mathbf u}_1 + \alpha_2 {\mathbf u}_2 + \cdots + \alpha_n {\mathbf u}_n = {\mathbf 0}$ has a non-zero solution $ \alpha_1 = \gamma$ and $ \alpha_2 = \cdots = \alpha_n = 0.$ Therefore, the set $ S$ is linearly dependent. $\blacksquare$

THEOREM 3.2.4   Let $ \{{\mathbf v}_1, {\mathbf v}_2, \ldots, {\mathbf v}_p \}$ be a linearly independent subset of a vector space $ V.$ If there exists a vector $ {\mathbf v}_{p+1} \in V$ such that the set $ \{{\mathbf v}_1, {\mathbf v}_2, \ldots, {\mathbf v}_p, {\mathbf v}_{p+1} \}$ is linearly dependent, then $ {\mathbf v}_{p+1}$ is a linear combination of $ {\mathbf v}_1, {\mathbf v}_2, \ldots, {\mathbf v}_p.$

Proof. Since the set $ \{{\mathbf v}_1, {\mathbf v}_2, \ldots, {\mathbf v}_p, {\mathbf v}_{p+1} \}$ is linearly dependent, there exist scalars $ \alpha_1,
\alpha_2, \ldots, \alpha_{p+1},$ NOT ALL ZERO such that

$\displaystyle \alpha_1 {\mathbf v}_1 + \alpha_2 {\mathbf v}_2 + \cdots + \alpha_p {\mathbf v}_p +\alpha_{p+1} {\mathbf v}_{p+1} = {\mathbf 0}.$ (3.2.2)

CLAIM: $ \alpha_{p+1} \neq 0.$
Suppose, if possible, that $ \alpha_{p+1} = 0.$ Then equation (3.2.2) gives $ \alpha_1 {\mathbf v}_1 + \alpha_2 {\mathbf v}_2 + \cdots + \alpha_p {\mathbf v}_p = {\mathbf 0}$ with not all $ \alpha_i, \; 1 \leq i \leq p,$ zero. Hence, by the definition of linear dependence, the set $ \{{\mathbf v}_1, {\mathbf v}_2, \ldots, {\mathbf v}_p \}$ is linearly dependent, contradicting our hypothesis. Thus, $ \alpha_{p+1} \neq 0$ and we get

$\displaystyle {\mathbf v}_{p+1} = - \frac{1}{\alpha_{p+1}} (\alpha_1 {\mathbf v}_1 + \cdots + \alpha_p {\mathbf v}_p).$

Note that $ \alpha_i \in {\mathbb{F}}$ for every $ i, \;
1\leq i \leq p+1$ and hence $ - \frac{\alpha_i}{\alpha_{p+1} } \in
{\mathbb{F}}$ for $ 1 \leq i \leq p.$ Hence the result follows. $\blacksquare$
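The proof is constructive: once $ \alpha_{p+1} \neq 0$ is known, the coefficients expressing $ {\mathbf v}_{p+1}$ in terms of $ {\mathbf v}_1, \ldots, {\mathbf v}_p$ can be recovered by solving a linear system. A sketch assuming numpy, reusing the vectors of Example 3.2.2(1):

```python
# Solve V @ c = v3 for the coefficients of v3 in terms of v1, v2.
# Assumes numpy; least squares is used because V is 3x2 (tall),
# and the solution is exact here since v3 lies in the column span.
import numpy as np

V = np.column_stack([(1, 2, 1), (2, 1, 4)])   # v1, v2 as columns
v3 = np.array((3, 3, 5), dtype=float)         # v3 = 1*v1 + 1*v2

c, *_ = np.linalg.lstsq(V, v3, rcond=None)
print(np.round(c, 10))  # [1. 1.]
```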

We now state two important corollaries of the above theorem. We omit their proofs, as they are easy consequences of the theorem.

COROLLARY 3.2.5   Let $ \{{\mathbf u}_1, {\mathbf u}_2, \ldots, {\mathbf u}_n\} $ be a linearly dependent subset of a vector space $ V.$ Then there exists a smallest $ k, \; 2 \leq k \leq n$ such that

$\displaystyle L({\mathbf u}_1, {\mathbf u}_2, \ldots, {\mathbf u}_k) = L({\mathbf u}_1, {\mathbf u}_2, \ldots, {\mathbf u}_{k-1}).$
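Corollary 3.2.5 suggests a procedure: scan the list and find the first index $ k$ at which adding $ {\mathbf u}_k$ leaves the span unchanged, i.e. does not increase the rank. A sketch assuming numpy and a non-zero first vector; the function name is ours:

```python
# A hypothetical helper for Corollary 3.2.5: the smallest k with
# L(u_1,...,u_k) = L(u_1,...,u_{k-1}) is the first index at which
# the rank of the stacked rows fails to grow.  Assumes numpy.
import numpy as np

def smallest_redundant_index(vectors, tol=1e-10):
    """1-based k with L(u_1,...,u_k) = L(u_1,...,u_{k-1})."""
    A = np.array(vectors, dtype=float)
    for k in range(2, len(vectors) + 1):
        if (np.linalg.matrix_rank(A[:k], tol=tol)
                == np.linalg.matrix_rank(A[:k - 1], tol=tol)):
            return k
    return None  # the set was linearly independent

print(smallest_redundant_index([(1, 2, 1), (2, 1, 4), (3, 3, 5)]))  # 3
```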

The next corollary follows immediately from Theorem 3.2.4 and Corollary 3.2.5.

COROLLARY 3.2.6   Let $ \{{\mathbf v}_1, {\mathbf v}_2, \ldots, {\mathbf v}_p \}$ be a linearly independent subset of a vector space $ V.$ Suppose there exists a vector $ {\mathbf v}\in V,$ such that $ {\mathbf v}\not \in L({\mathbf v}_1, {\mathbf v}_2, \ldots, {\mathbf v}_p ).$ Then the set $ \{{\mathbf v}_1, {\mathbf v}_2, \ldots, {\mathbf v}_p, {\mathbf v}\}$ is also a linearly independent subset of $ V.$

EXERCISE 3.2.7  
  1. Consider the vector space $ {\mathbb{R}}^2.$ Let $ {\mathbf u}_1 = (1,0).$ Find all choices for the vector $ {\mathbf u}_2$ such that the set $ \{{\mathbf u}_1, {\mathbf u}_2\}$ is a linearly independent subset of $ {\mathbb{R}}^2.$ Do there exist choices for vectors $ {\mathbf u}_2$ and $ {\mathbf u}_3$ such that the set $ \{{\mathbf u}_1, {\mathbf u}_2, {\mathbf u}_3\}$ is a linearly independent subset of $ {\mathbb{R}}^2$?
  2. If none of the elements appearing along the principal diagonal of a lower triangular matrix is zero, show that the row vectors are linearly independent in $ {\mathbb{R}}^n.$ The same is true for column vectors.
  3. Let $ S = \{(1,1,1,1), (1,-1,1,2), (1,1,-1,1)\} \subset {\mathbb{R}}^4.$ Determine whether or not the vector $ (1,1,2,1) \in L(S).$
  4. Show that $ S = \{(1,2,3), (-2,1,1), (8,6,10) \}$ is linearly dependent in $ {\mathbb{R}}^3.$
  5. Show that $ S= \{(1,0,0), (1,1,0), (1,1,1) \}$ is a linearly independent set in $ {\mathbb{R}}^3.$ In general, if $ \{f_1, f_2, f_3\}$ is a linearly independent set, then $ \{f_1, f_1 + f_2, f_1 + f_2 + f_3 \}$ is also a linearly independent set.
  6. In $ {\mathbb{R}}^3,$ give an example of $ 3$ vectors $ {\mathbf u}, {\mathbf v}$ and $ {\mathbf w}$ such that $ \{{\mathbf u}, {\mathbf v},
{\mathbf w}\}$ is linearly dependent but any set of $ 2$ vectors from $ {\mathbf u}, {\mathbf v}, {\mathbf w}$ is linearly independent.
  7. What is the maximum number of linearly independent vectors in $ {\mathbb{R}}^3?$
  8. Show that any set of $ k$ vectors in $ {\mathbb{R}}^3$ is linearly dependent if $ k \geq 4.$
  9. Is the set of vectors $ \{(1,0), (i, 0)\}$ a linearly independent subset of $ {\mathbb{C}}^2 \;({\mathbb{R}})?$
  10. Suppose $ V$ is a collection of vectors such that $ V({\mathbb{C}})$ as well as $ V({\mathbb{R}})$ are vector spaces. Prove that the set $ \{{\mathbf u}_1, {\mathbf u}_2, \ldots, {\mathbf u}_k, i{\mathbf u}_1, i{\mathbf u}_2, \ldots, i{\mathbf u}_k\}$ is a linearly independent subset of $ V({\mathbb{R}})$ if and only if $ \{{\mathbf u}_1, {\mathbf u}_2, \ldots, {\mathbf u}_k\}$ is a linearly independent subset of $ V({\mathbb{C}}).$
  11. Under what conditions on $ \alpha$ are the vectors $ (1 + \alpha, 1 -
\alpha) $ and $ (\alpha - 1, 1+ \alpha)$ in $ {\mathbb{C}}^2({\mathbb{R}})$ linearly independent?
  12. Let $ {\mathbf u}, {\mathbf v}\in V$ and $ M$ be a subspace of $ V.$ Further, let $ K$ be the subspace spanned by $ M$ and $ {\mathbf u}$ and $ H$ be the subspace spanned by $ M$ and $ {\mathbf v}.$ Show that if $ {\mathbf v}\in
K$ and $ {\mathbf v}\not\in M$ then $ {\mathbf u}\in H.$

A K Lal 2007-09-12