Linear Algebra/Definition and Examples of Linear Independence/Solutions


Solutions

This exercise is recommended for all readers.
Problem 1

Decide whether each subset of  \mathbb{R}^3 is linearly dependent or linearly independent.

  1.  \{\begin{pmatrix} 1 \\ -3 \\ 5 \end{pmatrix},
\begin{pmatrix} 2 \\ 2 \\ 4 \end{pmatrix},
\begin{pmatrix} 4 \\ -4 \\ 14 \end{pmatrix} \}
  2.  \{\begin{pmatrix} 1 \\ 7 \\ 7 \end{pmatrix},
\begin{pmatrix} 2 \\ 7 \\ 7 \end{pmatrix},
\begin{pmatrix} 3 \\ 7 \\ 7 \end{pmatrix} \}
  3.  \{\begin{pmatrix} 0 \\ 0 \\ -1 \end{pmatrix},
\begin{pmatrix} 1 \\ 0 \\ 4 \end{pmatrix} \}
  4.  \{\begin{pmatrix} 9 \\ 9 \\ 0 \end{pmatrix},
\begin{pmatrix} 2 \\ 0 \\ 1 \end{pmatrix},
\begin{pmatrix} 3 \\ 5 \\ -4 \end{pmatrix},
\begin{pmatrix} 12 \\ 12 \\ -1 \end{pmatrix} \}
Answer

For each of these, when the subset is independent that must be proved, and when the subset is dependent an example of a dependence must be given.

  1. It is dependent. Considering
    
c_1\begin{pmatrix} 1 \\ -3 \\ 5 \end{pmatrix}
+c_2\begin{pmatrix} 2 \\ 2 \\ 4 \end{pmatrix}
+c_3\begin{pmatrix} 4 \\ -4 \\ 14 \end{pmatrix}
=\begin{pmatrix} 0 \\ 0 \\ 0 \end{pmatrix}
    gives rise to this linear system.
    
\begin{array}{*{3}{rc}r}
c_1 &+ &2c_2 &+ &4c_3 &= &0 \\
-3c_1&+ &2c_2 &- &4c_3 &= &0 \\
5c_1 &+ &4c_2 &+ &14c_3 &= &0
\end{array}
    Gauss' method
    
\left(\begin{array}{*{3}{c}|c}
1 &2 &4 &0 \\
-3 &2 &-4 &0 \\
5 &4 &14 &0
\end{array}\right)
\xrightarrow[-5\rho_1+\rho_3]{3\rho_1+\rho_2}
\;\xrightarrow[]{(3/4)\rho_2+\rho_3}
\left(\begin{array}{*{3}{c}|c}
1 &2 &4 &0 \\
0 &8 &8 &0 \\
0 &0 &0 &0
\end{array}\right)
    yields a free variable, so there are infinitely many solutions. For an example of a particular dependence we can set c_3 to be, say, 1. Then we get  c_2=-1 and  c_1=-2 .
  2. It is dependent. The linear system that arises here
    
\left(\begin{array}{*{3}{c}|c}
1 &2 &3 &0 \\
7 &7 &7 &0 \\
7 &7 &7 &0
\end{array}\right)
\;\xrightarrow[-7\rho_1+\rho_3]{-7\rho_1+\rho_2}
\;\xrightarrow[]{-\rho_2+\rho_3}\;
\left(\begin{array}{*{3}{c}|c}
1 &2 &3  &0 \\
0 &-7 &-14 &0 \\
0 &0 &0  &0
\end{array}\right)
    has infinitely many solutions. We can get a particular solution by taking c_3 to be, say, 1, and back-substituting: the second row gives  c_2=-2 and then the first row gives  c_1=1 .
  3. It is linearly independent. The system
    
\left(\begin{array}{*{2}{c}|c}
0 &1 &0 \\
0 &0 &0 \\
-1 &4 &0
\end{array}\right)
\;\xrightarrow[]{\rho_1\leftrightarrow\rho_2}
\;\xrightarrow[]{\rho_3\leftrightarrow\rho_1}\;
\left(\begin{array}{*{2}{c}|c}
-1 &4 &0 \\
0 &1 &0 \\
0 &0 &0
\end{array}\right)
    has only the solution c_1=0 and c_2=0. (We could also have gotten the answer by inspection— the second vector is obviously not a multiple of the first, and vice versa.)
  4. It is linearly dependent. The linear system
    
\left(\begin{array}{*{4}{c}|c}
9 &2 &3 &12 &0 \\
9 &0 &5 &12 &0 \\
0 &1 &-4 &-1 &0
\end{array}\right)
    has more unknowns than equations, and so Gauss' method must end with at least one variable free (there can't be a contradictory equation because the system is homogeneous, and so has at least the solution of all zeroes). To exhibit a combination, we can do the reduction
    
\xrightarrow[]{-\rho_1+\rho_2}
\;\xrightarrow[]{(1/2)\rho_2+\rho_3}\;
\left(\begin{array}{*{4}{c}|c}
9 &2 &3 &12 &0 \\
0 &-2 &2 &0  &0 \\
0 &0 &-3 &-1 &0
\end{array}\right)
    and take, say, c_4=1. Then we have that c_3=-1/3, c_2=-1/3, and c_1=-31/27.
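
These reductions can be cross-checked mechanically. Here is a minimal sketch, assuming the SymPy library is available (the text itself, of course, works by hand): it recovers the dependence of item 1 and confirms the independence of item 3.

from sympy import Matrix

# Item 1: the three vectors as columns; any nonzero nullspace vector
# gives a dependence among the columns.
A = Matrix([[1, 2, 4],
            [-3, 2, -4],
            [5, 4, 14]])
print(A.nullspace())  # one basis vector: (-2, -1, 1), matching c_1=-2, c_2=-1, c_3=1

# Item 3: full column rank means only the trivial relationship exists.
B = Matrix([[0, 1],
            [0, 0],
            [-1, 4]])
print(B.rank() == B.shape[1])  # True, so the two-element set is independent
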
This exercise is recommended for all readers.
Problem 2

Which of these subsets of  \mathcal{P}_3 are linearly dependent and which are independent?

  1.  \{3-x+9x^2,5-6x+3x^2,1+1x-5x^2\}
  2.  \{-x^2,1+4x^2\}
  3.  \{2+x+7x^2,3-x+2x^2,4-3x^2\}
  4.  \{8+3x+3x^2,x+2x^2,2+2x+2x^2,8-2x+5x^2\}
Answer

In the cases of independence, that must be proved. Otherwise, a specific dependence must be produced. (Of course, dependences other than the ones exhibited here are possible.)

  1. This set is independent. Setting up the relation  c_1(3-x+9x^2)+c_2(5-6x+3x^2)+c_3(1+1x-5x^2)=0+0x+0x^2 gives a linear system
    
\left(\begin{array}{*{3}{c}|c}
3 &5 &1 &0 \\
-1 &-6 &1 &0 \\
9 &3 &-5 &0
\end{array}\right)
\;\xrightarrow[-3\rho_1+\rho_3]{(1/3)\rho_1+\rho_2}
\;\xrightarrow[]{3\rho_2}
\;\xrightarrow[]{-(12/13)\rho_2+\rho_3}\;
\left(\begin{array}{*{3}{c}|c}
3 &5  &1    &0 \\
0 &-13 &4    &0 \\
0 &0  &-152/13 &0
\end{array}\right)
    with only one solution:  c_1=0 ,  c_2=0 , and  c_3=0 .
  2. This set is independent. We can see this by inspection, straight from the definition of linear independence. Obviously neither is a multiple of the other.
  3. This set is linearly independent. The linear system reduces in this way
    
\left(\begin{array}{*{3}{c}|c}
2 &3 &4 &0 \\
1 &-1 &0 &0 \\
7 &2 &-3 &0
\end{array}\right)
\;\xrightarrow[-(7/2)\rho_1+\rho_3]{-(1/2)\rho_1+\rho_2}
\;\xrightarrow[]{-(17/5)\rho_2+\rho_3}\;
\left(\begin{array}{*{3}{c}|c}
2 &3  &4   &0 \\
0 &-5/2 &-2   &0 \\
0 &0  &-51/5 &0
\end{array}\right)
    to show that there is only the solution c_1=0, c_2=0, and c_3=0.
  4. This set is linearly dependent. The linear system
    
\left(\begin{array}{*{4}{c}|c}
8 &0 &2 &8 &0 \\
3 &1 &2 &-2 &0 \\
3 &2 &2 &5 &0
\end{array}\right)
    must, after reduction, end with at least one variable free (there are more variables than equations, and there is no possibility of a contradictory equation because the system is homogeneous). We can take the free variables as parameters to describe the solution set. We can then set the parameter to a nonzero value to get a nontrivial linear relation.
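
The same mechanical check applies to polynomials once each is written as its coefficient vector with respect to 1, x, x^2. A sketch, again assuming SymPy:

from sympy import Matrix

# Item 1: columns hold the coefficients of 3-x+9x^2, 5-6x+3x^2, 1+1x-5x^2.
A = Matrix([[3, 5, 1],
            [-1, -6, 1],
            [9, 3, -5]])
print(A.det())  # 152, nonzero, so the set is independent

# Item 4: four columns but only three rows, so a dependence must exist.
B = Matrix([[8, 0, 2, 8],
            [3, 1, 2, -2],
            [3, 2, 2, 5]])
print(B.nullspace()[0])  # coefficients of one specific dependence
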
This exercise is recommended for all readers.
Problem 3

Prove that each set  \{f,g\} is linearly independent in the vector space of all functions from  \mathbb{R}^+ to  \mathbb{R} .

  1.  f(x)=x and  g(x)=1/x
  2.  f(x)=\cos(x) and  g(x)=\sin(x)
  3.  f(x)=e^x and  g(x)=\ln(x)
Answer

Let Z be the zero function Z(x)=0, which is the additive identity in the vector space under discussion.

  1. This set is linearly independent. Consider  c_1\cdot f(x)+c_2\cdot g(x)=Z(x) . Plugging in  x=1 and  x=2 gives a linear system
    
\begin{array}{*{2}{rc}r}
c_1\cdot 1 &+ &c_2\cdot 1   &= &0 \\
c_1\cdot 2 &+ &c_2\cdot (1/2) &= &0
\end{array}
    with the unique solution  c_1=0 ,  c_2=0 .
  2. This set is linearly independent. Consider  c_1\cdot f(x)+c_2\cdot g(x)=Z(x) and plug in  x=\pi/2 and  x=\pi (note that  x=0 is outside the domain  \mathbb{R}^+ ) to get
    
\begin{array}{*{2}{rc}r}
c_1\cdot 0 &+ &c_2\cdot 1   &= &0 \\
c_1\cdot (-1) &+ &c_2\cdot 0   &= &0
\end{array}
    which obviously gives that  c_1=0 ,  c_2=0 .
  3. This set is also linearly independent. Considering  c_1\cdot f(x)+c_2\cdot g(x)=Z(x) and plugging in  x=1 and  x=e
    
\begin{array}{*{2}{rc}r}
c_1\cdot e  &+ &c_2\cdot 0   &= &0 \\
c_1\cdot e^e &+ &c_2\cdot 1   &= &0
\end{array}
    gives that  c_1=0 and  c_2=0 .
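
The plug-in-points technique has a compact computational form: if the matrix of function values at the chosen points is nonsingular then only the trivial relationship is possible. A sketch assuming SymPy (a singular sample matrix would prove nothing; one would simply try other points):

from sympy import Matrix, Rational, E, exp

# Item 1: f(x)=x and g(x)=1/x evaluated at x=1 and x=2.
M = Matrix([[1, 1],
            [2, Rational(1, 2)]])
print(M.det())  # -3/2, nonzero, forcing c_1 = c_2 = 0

# Item 3: f(x)=e^x and g(x)=ln(x) evaluated at x=1 and x=e.
N = Matrix([[E, 0],
            [exp(E), 1]])
print(N.det())  # E, nonzero again
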
This exercise is recommended for all readers.
Problem 4

Which of these subsets of the space of real-valued functions of one real variable is linearly dependent and which is linearly independent? (Note that we have abbreviated some constant functions; e.g., in the first item, the "2" stands for the constant function f(x)=2.)

  1.  \{2,4\sin^2(x),\cos^2(x)\}
  2.  \{1,\sin(x),\sin(2x)\}
  3.  \{x,\cos(x)\}
  4.  \{(1+x)^2,x^2+2x,3\}
  5.  \{\cos(2x),\sin^2(x),\cos^2(x)\}
  6.  \{0,x,x^2\}
Answer

In each case, if the set is independent then that must be proved, and if it is dependent then a specific dependence must be exhibited.

  1. This set is dependent. The familiar relation \sin^2(x)+\cos^2(x)=1 shows that 2=c_1\cdot(4\sin^2(x))+c_2\cdot(\cos^2(x)) is satisfied by c_1=1/2 and c_2=2.
  2. This set is independent. Consider the relationship c_1\cdot 1+c_2\cdot\sin(x)+c_3\cdot\sin(2x)=0 (that "0" is the zero function). Taking x=0, x=\pi/2 and x=\pi/4 gives this system.
    
\begin{array}{*{3}{rc}r}
c_1 &   &                &   &      &=  &0  \\
c_1  &+  &c_2             &   &      &=  &0  \\
c_1  &+  &(\sqrt{2}/2)c_2 &+  &c_3   &=  &0
\end{array}
    whose only solution is c_1=0, c_2=0, and c_3=0.
  3. By inspection, this set is independent. Any dependence \cos(x)=c\cdot x is not possible since the cosine function is not a multiple of the identity function (we are applying Corollary 1.17).
  4. By inspection, we spot that there is a dependence. Because (1+x)^2=x^2+2x+1, we get that c_1\cdot(1+x)^2+c_2\cdot(x^2+2x)=3 is satisfied by c_1=3 and c_2=-3.
  5. This set is dependent. The easiest way to see that is to recall the trigonometric relationship \cos^2(x)-\sin^2(x)=\cos(2x). (Remark. A person who doesn't recall this, and tries some x's, simply never gets a system leading to a unique solution, and never gets to conclude that the set is independent. Of course, this person might wonder if they simply never tried the right set of x's, but a few tries will lead most people to look instead for a dependence.)
  6. This set is dependent, because it contains the zero object in the vector space, the zero function.
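
The dependences claimed in items 1 and 5 can be spot-checked numerically at arbitrary sample points; a sketch in plain Python:

import math
import random

for _ in range(5):
    x = random.uniform(-3.0, 3.0)
    # item 1: 2 = (1/2)*(4 sin^2 x) + 2*(cos^2 x)
    assert abs(0.5 * 4 * math.sin(x)**2 + 2 * math.cos(x)**2 - 2) < 1e-12
    # item 5: cos(2x) = 1*cos^2(x) + (-1)*sin^2(x)
    assert abs(math.cos(x)**2 - math.sin(x)**2 - math.cos(2 * x)) < 1e-12
print("both dependences hold at every sampled point")
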
Problem 5

Does the equation  \sin^2(x)/\cos^2(x)=\tan^2(x) show that this set of functions  \{\sin^2(x),\cos^2(x),\tan^2(x)\} is a linearly dependent subset of the set of all real-valued functions with domain the interval  (-\pi/2..\pi/2) of real numbers between  -\pi/2 and  \pi/2 ?

Answer

No, that equation is not a linear relationship. In fact this set is independent, as the system arising from taking  x to be  0 ,  \pi/6 and  \pi/4 shows.

Problem 6

Why does Lemma 1.4 say "distinct"?

Answer

To emphasize that the equation  1\cdot\vec{s}+(-1)\cdot\vec{s}=\vec{0} does not make the set dependent.

This exercise is recommended for all readers.
Problem 7

Show that the nonzero rows of an echelon form matrix form a linearly independent set.

Answer

We have already shown this: the Linear Combination Lemma and its corollary state that in an echelon form matrix, no nonzero row is a linear combination of the others.

This exercise is recommended for all readers.
Problem 8
  1. Show that if the set  \{\vec{u},\vec{v},\vec{w}\} is linearly independent then so is the set  \{\vec{u},\vec{u}+\vec{v},\vec{u}+\vec{v}+\vec{w}\} .
  2. What is the relationship between the linear independence or dependence of the set  \{\vec{u},\vec{v},\vec{w}\} and the independence or dependence of  \{\vec{u}-\vec{v},\vec{v}-\vec{w},\vec{w}-\vec{u}\} ?
Answer
  1. Assume that the set  \{\vec{u},\vec{v},\vec{w}\} is linearly independent, so that any relationship d_0\vec{u}+d_1\vec{v}+d_2\vec{w}=\vec{0} leads to the conclusion that d_0=0, d_1=0, and d_2=0. Consider the relationship  c_1(\vec{u})+c_2(\vec{u}+\vec{v})+c_3(\vec{u}+\vec{v}+\vec{w}) =\vec{0} . Rewrite it to get  (c_1+c_2+c_3)\vec{u}+(c_2+c_3)\vec{v}+(c_3)\vec{w}=\vec{0} . Taking d_0 to be c_1+c_2+c_3, taking d_1 to be c_2+c_3, and taking d_2 to be c_3 we have this system.
    
\begin{array}{*{3}{rc}r}
c_1  &+  &c_2  &+  &c_3  &=  &0  \\
&   &c_2  &+  &c_3  &=  &0  \\
&   &     &   &c_3  &=  &0
\end{array}
    Conclusion: the c's are all zero, and so the set is linearly independent.
  2. The second set is dependent
    
1\cdot(\vec{u}-\vec{v})
+1\cdot(\vec{v}-\vec{w})
+1\cdot(\vec{w}-\vec{u})
=\vec{0}
    whether or not the first set is independent.
Problem 9

Example 1.10 shows that the empty set is linearly independent.

  1. When is a one-element set linearly independent?
  2. How about a set with two elements?
Answer
  1. A singleton set \{\vec{v}\} is linearly independent if and only if \vec{v}\neq\vec{0}. For the "if" direction, with \vec{v}\neq\vec{0}, we can apply Lemma 1.4 by considering the relationship  c\cdot\vec{v}=\vec{0} and noting that the only solution is the trivial one: c=0. For the "only if" direction, just recall that Example 1.11 shows that \{\vec{0}\} is linearly dependent, and so if the set \{\vec{v}\} is linearly independent then \vec{v}\neq\vec{0}. (Remark. Another answer is to say that this is the special case of Lemma 1.16 where  S=\varnothing .)
  2. A set with two elements is linearly independent if and only if neither member is a multiple of the other (note that if one is the zero vector then it is a multiple of the other, so this case is covered). This is an equivalent statement: a set is linearly dependent if and only if one element is a multiple of the other. The proof is easy. A set \{\vec{v}_1,\vec{v}_2\} is linearly dependent if and only if there is a relationship c_1\vec{v}_1+c_2\vec{v}_2=\vec{0} with either c_1\neq 0 or c_2\neq 0 (or both). That holds if and only if \vec{v}_1=(-c_2/c_1)\vec{v}_2 or \vec{v}_2=(-c_1/c_2)\vec{v}_1 (or both).
Problem 10

In any vector space  V , the empty set is linearly independent. What about all of  V ?

Answer

This set is linearly dependent because it contains the zero vector.

Problem 11

Show that if  \{\vec{x},\vec{y},\vec{z}\} is linearly independent then so are all of its proper subsets:  \{\vec{x},\vec{y}\} ,  \{\vec{x},\vec{z}\} ,  \{\vec{y},\vec{z}\} ,  \{\vec{x}\} , \{\vec{y}\} ,  \{\vec{z}\} , and  \{\} . Is that "only if" also?

Answer

The "if" half is given by Lemma 1.14. The converse (the "only if" statement) does not hold. An example is to consider the vector space  \mathbb{R}^2 and these vectors.


\vec{x}=\begin{pmatrix} 1 \\ 0 \end{pmatrix},\quad
\vec{y}=\begin{pmatrix} 0 \\ 1 \end{pmatrix},\quad
\vec{z}=\begin{pmatrix} 1 \\ 1 \end{pmatrix}

Each proper subset is linearly independent, but the whole set is linearly dependent because  \vec{z}=\vec{x}+\vec{y} .
Problem 12
  1. Show that this
    
S=\{\begin{pmatrix} 1 \\ 1 \\ 0 \end{pmatrix},\begin{pmatrix} -1 \\ 2 \\ 0 \end{pmatrix}\}
    is a linearly independent subset of  \mathbb{R}^3 .
  2. Show that
    
\begin{pmatrix} 3 \\ 2 \\ 0 \end{pmatrix}
    is in the span of S by finding  c_1 and  c_2 giving a linear relationship.
    
c_1\begin{pmatrix} 1 \\ 1 \\ 0 \end{pmatrix}
+c_2\begin{pmatrix} -1 \\ 2 \\ 0 \end{pmatrix}
=\begin{pmatrix} 3 \\ 2 \\ 0 \end{pmatrix}
    Show that the pair  c_1,c_2 is unique.
  3. Assume that  S is a subset of a vector space and that  \vec{v} is in  [S] , so that  \vec{v} is a linear combination of vectors from  S . Prove that if  S is linearly independent then a linear combination of vectors from  S adding to  \vec{v} is unique (that is, unique up to reordering and adding or taking away terms of the form  0\cdot\vec{s} ). Thus  S as a spanning set is minimal in this strong sense: each vector in  [S] is "hit" a minimum number of times— only once.
  4. Prove that it can happen when  S is not linearly independent that distinct linear combinations sum to the same vector.
Answer
  1. The linear system arising from
    
c_1\begin{pmatrix} 1 \\ 1 \\ 0 \end{pmatrix}
+c_2\begin{pmatrix} -1 \\ 2 \\ 0 \end{pmatrix}
=\begin{pmatrix} 0 \\ 0 \\ 0 \end{pmatrix}
    has the unique solution  c_1=0 and  c_2=0 .
  2. The linear system arising from
    
c_1\begin{pmatrix} 1 \\ 1 \\ 0 \end{pmatrix}
+c_2\begin{pmatrix} -1 \\ 2 \\ 0 \end{pmatrix}
=\begin{pmatrix} 3 \\ 2 \\ 0 \end{pmatrix}
    has the unique solution  c_1=8/3 and  c_2=-1/3 .
  3. Suppose that  S is linearly independent. Suppose that we have both \vec{v}=c_1\vec{s}_1+\dots+c_n\vec{s}_n and \vec{v}=d_1\vec{t}_1+\dots+d_m\vec{t}_m (where the vectors are members of S). Now,
    
c_1\vec{s}_1+\dots+c_n\vec{s}_n
=\vec{v}
=d_1\vec{t}_1+\dots+d_m\vec{t}_m
    can be rewritten in this way.
    
c_1\vec{s}_1+\dots+c_n\vec{s}_n
-d_1\vec{t}_1-\dots-d_m\vec{t}_m
=\vec{0}
    Possibly some of the \vec{s}\,'s equal some of the \vec{t}\,'s; we can combine the associated coefficients (i.e., if \vec{s}_i=\vec{t}_j then \cdots+c_i\vec{s}_i+\dots-d_j\vec{t}_j-\cdots can be rewritten as \cdots+(c_i-d_j)\vec{s}_i+\cdots). That equation is a linear relationship among distinct (after the combining is done) members of the set S. We've assumed that S is linearly independent, so all of the coefficients are zero. If i is such that \vec{s}_i does not equal any \vec{t}_j then c_i is zero. If j is such that \vec{t}_j does not equal any \vec{s}_i then d_j is zero. In the final case, we have that c_i-d_j=0 and so c_i=d_j. Therefore, the original two sums are the same, except perhaps for some 0\cdot\vec{s}_i or 0\cdot\vec{t}_j terms that we can neglect.
  4. This set is not linearly independent:
    
S=\{\begin{pmatrix} 1 \\ 0 \end{pmatrix},\begin{pmatrix} 2 \\ 0 \end{pmatrix}\}\subset\mathbb{R}^2
    and these two linear combinations give the same result
    
\begin{pmatrix} 0 \\ 0 \end{pmatrix}=2\cdot\begin{pmatrix} 1 \\ 0 \end{pmatrix}-1\cdot\begin{pmatrix} 2 \\ 0 \end{pmatrix} =4\cdot\begin{pmatrix} 1 \\ 0 \end{pmatrix}-2\cdot\begin{pmatrix} 2 \\ 0 \end{pmatrix}
    Thus, a linearly dependent set can have two distinct linear combinations that sum to the same vector. In fact, this stronger statement holds: if a set is linearly dependent then it must have the property that there are two distinct linear combinations that sum to the same vector. Briefly, if  c_1\vec{s}_1+\dots+c_n\vec{s}_n=\vec{0} is a nontrivial relationship then multiplying both sides by two gives a second relationship, distinct from the first, with the same sum.
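
Items 1 and 2 can be settled by exact computation as well; a sketch assuming SymPy, where the empty nullspace certifies both the independence of S and the uniqueness of the coefficients:

from sympy import Matrix, linsolve, symbols

c1, c2 = symbols('c1 c2')
S = Matrix([[1, -1],
            [1, 2],
            [0, 0]])   # the two vectors of S as columns
print(S.nullspace())   # []: only the trivial relationship, so S is independent
print(linsolve((S, Matrix([3, 2, 0])), [c1, c2]))  # exactly one solution: c_1=8/3, c_2=-1/3
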
Problem 13

Prove that a polynomial gives rise to the zero function if and only if it is the zero polynomial. (Comment. This question is not a Linear Algebra matter, but we often use the result. A polynomial gives rise to a function in the obvious way: x\mapsto c_nx^n+\dots+c_1x+c_0.)

Answer

In this "if and only if" statement, the "if" half is clear— if the polynomial is the zero polynomial then the function that arises from the action of the polynomial must be the zero function x\mapsto 0. For "only if" we write p(x)=c_nx^n+\dots+c_0. Plugging in zero p(0)=0 gives that c_0=0. Taking the derivative and plugging in zero p^\prime(0)=0 gives that c_1=0. Similarly we get that each c_i is zero, and p is the zero polynomial.

Problem 14

Return to Section 1.2 and redefine point, line, plane, and other linear surfaces to avoid degenerate cases.

Answer

The work in this section suggests that an  n -dimensional non-degenerate linear surface should be defined as the span of a linearly independent set of  n vectors.

Problem 15
  1. Show that any set of four vectors in  \mathbb{R}^2 is linearly dependent.
  2. Is this true for any set of five? Any set of three?
  3. What is the most number of elements that a linearly independent subset of \mathbb{R}^2 can have?
Answer
  1. For any a_{1,1}, ..., a_{2,4},
    
c_1\begin{pmatrix} a_{1,1} \\ a_{2,1} \end{pmatrix}
+c_2\begin{pmatrix} a_{1,2} \\ a_{2,2} \end{pmatrix}
+c_3\begin{pmatrix} a_{1,3} \\ a_{2,3} \end{pmatrix}
+c_4\begin{pmatrix} a_{1,4} \\ a_{2,4} \end{pmatrix}
=\begin{pmatrix} 0 \\ 0 \end{pmatrix}
    yields a linear system
    
\begin{array}{*{4}{rc}r}
a_{1,1}c_1 &+ &a_{1,2}c_2 &+ &a_{1,3}c_3 &+ &a_{1,4}c_4 &= &0 \\
a_{2,1}c_1 &+ &a_{2,2}c_2 &+ &a_{2,3}c_3 &+ &a_{2,4}c_4 &= &0
\end{array}
    that has infinitely many solutions (Gauss' method leaves at least two variables free). Hence there are nontrivial linear relationships among the given members of \mathbb{R}^2.
  2. Any set of five vectors is a superset of a set of four vectors, and so is linearly dependent. With three vectors from \mathbb{R}^2, the argument from the prior item still applies, with the slight change that Gauss' method now leaves at least one variable free (which still gives infinitely many solutions).
  3. The prior item shows that no three-element subset of \mathbb{R}^2 is independent. We know that there are two-element subsets of \mathbb{R}^2 that are independent— one is
    
\{\begin{pmatrix} 1 \\ 0 \end{pmatrix},\begin{pmatrix} 0 \\ 1 \end{pmatrix}\}
    and so the answer is two.
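
The more-vectors-than-components argument is easy to watch in practice; a sketch assuming SymPy, with entries chosen at random for the illustration:

import random
from sympy import Matrix

# Four vectors from R^2 as the columns of a 2x4 matrix.
A = Matrix(2, 4, lambda r, c: random.randint(-9, 9))
relation = A.nullspace()[0]  # nonempty: at least two variables are free
print(relation.T)            # nontrivial coefficients c_1, ..., c_4
print((A * relation).T)      # (0, 0): a genuine dependence
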
This exercise is recommended for all readers.
Problem 16

Is there a set of four vectors in  \mathbb{R}^3 , any three of which form a linearly independent set?

Answer

Yes; here is one.


\{\begin{pmatrix} 1 \\ 0 \\ 0 \end{pmatrix},
\begin{pmatrix} 0 \\ 1 \\ 0 \end{pmatrix},
\begin{pmatrix} 0 \\ 0 \\ 1 \end{pmatrix},
\begin{pmatrix} 1 \\ 1 \\ 1 \end{pmatrix} \}
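
Checking that example is a matter of four small determinants; a sketch assuming SymPy:

from itertools import combinations
from sympy import Matrix

vectors = [Matrix([1, 0, 0]), Matrix([0, 1, 0]),
           Matrix([0, 0, 1]), Matrix([1, 1, 1])]
for triple in combinations(vectors, 3):
    # a nonzero determinant means the three columns are independent
    assert Matrix.hstack(*triple).det() != 0
print("every three of the four vectors form an independent set")
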
Problem 17

Must every linearly dependent set have a subset that is dependent and a subset that is independent?

Answer

Yes. The two improper subsets, the entire set and the empty subset, serve as examples.

Problem 18

In  \mathbb{R}^4 , what is the biggest linearly independent set you can find? The smallest? The biggest linearly dependent set? The smallest? ("Biggest" and "smallest" mean that there are no supersets or subsets with the same property.)

Answer

In  \mathbb{R}^4 the biggest linearly independent set has four vectors. There are many examples of such sets; this is one.


\{\begin{pmatrix} 1 \\ 0 \\ 0 \\ 0 \end{pmatrix},
\begin{pmatrix} 0 \\ 1 \\ 0 \\ 0 \end{pmatrix},
\begin{pmatrix} 0 \\ 0 \\ 1 \\ 0 \end{pmatrix},
\begin{pmatrix} 0 \\ 0 \\ 0 \\ 1 \end{pmatrix} \}

To see that no set with five or more vectors can be independent, set up


c_1\begin{pmatrix} a_{1,1} \\ a_{2,1} \\ a_{3,1} \\ a_{4,1} \end{pmatrix}
+c_2\begin{pmatrix} a_{1,2} \\ a_{2,2} \\ a_{3,2} \\ a_{4,2} \end{pmatrix}
+c_3\begin{pmatrix} a_{1,3} \\ a_{2,3} \\ a_{3,3} \\ a_{4,3} \end{pmatrix}
+c_4\begin{pmatrix} a_{1,4} \\ a_{2,4} \\ a_{3,4} \\ a_{4,4} \end{pmatrix}
+c_5\begin{pmatrix} a_{1,5} \\ a_{2,5} \\ a_{3,5} \\ a_{4,5} \end{pmatrix}
=\begin{pmatrix} 0 \\ 0 \\ 0 \\ 0 \end{pmatrix}

and note that the resulting linear system


\begin{array}{*{5}{rc}r}
a_{1,1}c_1 &+ &a_{1,2}c_2 &+ &a_{1,3}c_3
&+ &a_{1,4}c_4 &+ &a_{1,5}c_5 &= &0  \\
a_{2,1}c_1 &+ &a_{2,2}c_2 &+ &a_{2,3}c_3
&+ &a_{2,4}c_4 &+ &a_{2,5}c_5 &= &0  \\
a_{3,1}c_1 &+ &a_{3,2}c_2 &+ &a_{3,3}c_3
&+ &a_{3,4}c_4 &+ &a_{3,5}c_5 &= &0  \\
a_{4,1}c_1 &+ &a_{4,2}c_2 &+ &a_{4,3}c_3
&+ &a_{4,4}c_4 &+ &a_{4,5}c_5 &= &0
\end{array}

has four equations and five unknowns, so Gauss' method must end with at least one  c variable free. Hence there are infinitely many solutions, and so the linear relationship above among the four-tall vectors has solutions other than the trivial one.

The smallest linearly independent set is the empty set.

The biggest linearly dependent set is  \mathbb{R}^4 . The smallest is  \{\vec{0}\} .

This exercise is recommended for all readers.
Problem 19

Linear independence and linear dependence are properties of sets. We can thus naturally ask how those properties act with respect to the familiar elementary set relations and operations. In the body of this subsection we have covered the subset and superset relations. We can also consider the operations of intersection, complementation, and union.

  1. How does linear independence relate to intersection: can an intersection of linearly independent sets be independent? Must it be?
  2. How does linear independence relate to complementation?
  3. Show that the union of two linearly independent sets need not be linearly independent.
  4. Characterize when the union of two linearly independent sets is linearly independent, in terms of the intersection of the span of each.
Answer
  1. The intersection of two linearly independent sets S\cap T must be linearly independent as it is a subset of the linearly independent set S (and of the linearly independent set T, of course).
  2. The complement of a linearly independent set is linearly dependent as it contains the zero vector.
  3. We must produce an example. One, in  \mathbb{R}^2 , is
    
S=\{\begin{pmatrix} 1 \\ 0 \end{pmatrix}\}
\quad\text{and}\quad
T=\{\begin{pmatrix} 2 \\ 0 \end{pmatrix}\}
    since the linear dependence of  S\cup T is easily seen.
  4. The union of two linearly independent sets S\cup T is linearly independent if and only if their spans have a trivial intersection [S]\cap [T]=\{\vec{0}\}. To prove that, assume that  S and  T are linearly independent subsets of some vector space. For the "if" direction, assume that the intersection of the spans is trivial  [S]\cap [T]=\{\vec{0}\} . Consider the set S\cup T. Any linear relationship c_1\vec{s}_1+\dots+c_n\vec{s}_n +d_1\vec{t}_1+\dots+d_m\vec{t}_m=\vec{0} gives c_1\vec{s}_1+\dots+c_n\vec{s}_n= -d_1\vec{t}_1-\dots-d_m\vec{t}_m. The left side of that equation sums to a vector in [S], and the right side is a vector in [T]. Therefore, since the intersection of the spans is trivial, both sides equal the zero vector. Because S is linearly independent, all of the c's are zero. Because T is linearly independent, all of the d's are zero. Thus, the original linear relationship among members of S\cup T only holds if all of the coefficients are zero. That shows that S\cup T is linearly independent. For the "only if" half we can make the same argument in reverse. If the union S\cup T is linearly independent, that is, if the only solution to c_1\vec{s}_1+\cdots+c_n\vec{s}_n +d_1\vec{t}_1+\cdots+d_m\vec{t}_m =\vec{0} is the trivial solution c_1=0, ..., d_m=0, then any vector \vec{v} in the intersection of the spans \vec{v}=c_1\vec{s}_1+\cdots+c_n\vec{s}_n =-d_1\vec{t}_1-\cdots-d_m\vec{t}_m must be the zero vector because each scalar is zero.
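
The criterion in item 4 has a convenient rank formulation: for independent sets written as matrix columns, the dimension formula dim([S]+[T]) = dim[S] + dim[T] - dim([S]\cap[T]) says that the spans meet trivially exactly when stacking the columns together loses no rank. A sketch assuming SymPy, with vectors chosen for the illustration:

from sympy import Matrix

S = Matrix([[1, 0], [0, 1], [0, 0]])  # an independent pair in R^3
T_trivial = Matrix([[0], [0], [1]])   # its span meets [S] only in the zero vector
T_overlap = Matrix([[1], [1], [0]])   # this one lies inside [S]

for T in (T_trivial, T_overlap):
    union = Matrix.hstack(S, T)
    meets_trivially = union.rank() == S.rank() + T.rank()
    union_independent = union.rank() == union.shape[1]
    print(meets_trivially, union_independent)  # True True, then False False
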
This exercise is recommended for all readers.
Problem 20

For Theorem 1.12,

  1. fill in the induction for the proof;
  2. give an alternate proof that starts with the empty set and builds a sequence of linearly independent subsets of the given finite set until one appears with the same span as the given set.
Answer
  1. We do induction on the number of vectors in the finite set  S . The base case is that S has no elements. In this case S is linearly independent and there is nothing to check— a subset of S that has the same span as S is S itself. For the inductive step assume that the theorem is true for all sets of size n=0, n=1, ..., n=k in order to prove that it holds when  S has n=k+1 elements. If the k+1-element set  S=\{\vec{s}_0,\dots,\vec{s}_{k}\} is linearly independent then the theorem is trivial, so assume that it is dependent. By Corollary 1.17 there is an  \vec{s}_i that is a linear combination of other vectors in  S . Define  S_1=S-\{\vec{s}_i\} and note that  S_1 has the same span as  S by Lemma 1.1. The set  S_1 has  k elements and so the inductive hypothesis applies to give that it has a linearly independent subset with the same span. That subset of  S_1 is the desired subset of  S .
  2. Here is a sketch of the argument. The induction argument details have been left out. If the finite set  S is empty then there is nothing to prove. If  S=\{\vec{0}\} then the empty subset will do. Otherwise, take some nonzero vector  \vec{s}_1\in S and define  S_1=\{\vec{s}_1\} . If  [S_1]=[S] then this proof is finished by noting that  S_1 is linearly independent. If not, then there is a nonzero vector  \vec{s}_2\in S-[S_1] (if every  \vec{s}\in S is in  [S_1] then  [S_1]=[S] ). Define  S_2=S_1\cup\{\vec{s}_2\} . If  [S_2]=[S] then this proof is finished by using Lemma 1.16 to show that  S_2 is linearly independent. Repeat this process until a set with a big enough span appears. That must eventually happen because  S is finite, and  [S] will be reached at worst when every vector from  S has been used.
Problem 21

With a little calculation we can get formulas to determine whether or not a set of vectors is linearly independent.

  1. Show that this subset of  \mathbb{R}^2
    
\{\begin{pmatrix} a \\ c \end{pmatrix},\begin{pmatrix} b \\ d \end{pmatrix}\}
    is linearly independent if and only if  ad-bc\neq 0 .
  2. Show that this subset of  \mathbb{R}^3
    
\{\begin{pmatrix} a \\ d \\ g \end{pmatrix},
\begin{pmatrix} b \\ e \\ h \end{pmatrix},
\begin{pmatrix} c \\ f \\ i \end{pmatrix} \}
    is linearly independent iff  aei+bfg+cdh-hfa-idb-gec \neq 0 .
  3. When is this subset of  \mathbb{R}^3
    
\{\begin{pmatrix} a \\ d \\ g \end{pmatrix},
\begin{pmatrix} b \\ e \\ h \end{pmatrix} \}
    linearly independent?
  4. This is an opinion question: for a set of four vectors from  \mathbb{R}^4 , must there be a formula involving the sixteen entries that determines independence of the set? (You needn't produce such a formula, just decide if one exists.)
Answer
  1. Assuming first that  a\neq 0 ,
    
x\begin{pmatrix} a \\ c \end{pmatrix}
+y\begin{pmatrix} b \\ d \end{pmatrix}
=\begin{pmatrix} 0 \\ 0 \end{pmatrix}
    gives
    
\begin{array}{*{2}{rc}r}
ax &+ &by &= &0 \\
cx &+ &dy &= &0
\end{array}
\;\xrightarrow[]{-(c/a)\rho_1+\rho_2}\;
\begin{array}{*{2}{rc}r}
ax  &+  &by           &=  &0  \\
&   &(-(c/a)b+d)y &=  &0
\end{array}
    which has only the trivial solution if and only if  0\neq-(c/a)b+d=(ad-bc)/a (we've assumed in this case that  a\neq 0 , and so back substitution yields a unique solution). The  a=0 case is also not hard— break it into the  c\neq 0 and  c=0 subcases and note that in these cases  ad-bc=0\cdot d-bc . Comment. An earlier exercise showed that a two-vector set is linearly dependent if and only if either vector is a scalar multiple of the other. That can also be used to make the calculation.
  2. The equation
    
c_1\begin{pmatrix} a \\ d \\ g \end{pmatrix}
+c_2\begin{pmatrix} b \\ e \\ h \end{pmatrix}
+c_3\begin{pmatrix} c \\ f \\ i \end{pmatrix}
=\begin{pmatrix} 0 \\ 0 \\ 0 \end{pmatrix}
    gives rise to a homogeneous linear system. We proceed by writing it in matrix form and applying Gauss' method. We first reduce the matrix to upper-triangular. Assume that  a\neq 0 .
    \begin{array}{rcl}
\xrightarrow[]{(1/a)\rho_1}
\left(\begin{array}{*{3}{c}|c}
1   &b/a   &c/a  &0 \\
d   &e     &f    &0 \\
g   &h     &i    &0
\end{array}\right)
&\xrightarrow[-g\rho_1+\rho_3]{-d\rho_1+\rho_2}
&\left(\begin{array}{*{3}{c}|c}
1   &b/a           &c/a        &0   \\
0   &(ae-bd)/a     &(af-cd)/a  &0   \\
0   &(ah-bg)/a     &(ai-cg)/a  &0
\end{array}\right) \\
&\xrightarrow[]{(a/(ae-bd))\rho_2}
&\left(\begin{array}{*{3}{c}|c}
1   &b/a           &c/a             &0  \\
0   &1             &(af-cd)/(ae-bd) &0  \\
0   &(ah-bg)/a     &(ai-cg)/a       &0
\end{array}\right)
\end{array}
    (where we've assumed for the moment that  ae-bd\neq 0 in order to do the row reduction step). Then, under the assumptions, we get this.
    \begin{array}{rcl}
&\xrightarrow[]{-((ah-bg)/a)\rho_2+\rho_3}
&\left(\begin{array}{*{3}{c}|c}
1   &\frac{b}{a}   &\frac{c}{a}                           &0 \\
0   &1             &\frac{af-cd}{ae-bd}                   &0 \\
0   &0             &\frac{aei+bgf+cdh-hfa-idb-gec}{ae-bd} &0
\end{array}\right)
\end{array}
    shows that the original system is nonsingular if and only if the  3,3 entry is nonzero. This fraction is defined because of the  ae-bd\neq 0 assumption, and it will equal zero if and only if its numerator equals zero. We next worry about the assumptions. First, if  a\neq 0 but  ae-bd=0 then we swap
    \begin{array}{rcl}
\left(\begin{array}{*{3}{c}|c}
1   &b/a           &c/a        &0   \\
0   &0             &(af-cd)/a  &0   \\
0   &(ah-bg)/a     &(ai-cg)/a  &0
\end{array}\right)
&\xrightarrow[]{\rho_2\leftrightarrow\rho_3}
&\left(\begin{array}{*{3}{c}|c}
1   &b/a           &c/a        &0   \\
0   &(ah-bg)/a     &(ai-cg)/a  &0   \\
0   &0             &(af-cd)/a  &0
\end{array}\right)
\end{array}
    and conclude that the system is singular if and only if either  ah-bg=0 or  af-cd=0 . That's the same as asking that their product be zero:
    \begin{array}{rl}
ahaf-ahcd-bgaf+bgcd
&=0                   \\
ahaf-ahcd-bgaf+aegc
&=0                   \\
a(haf-hcd-bgf+egc)
&=0
\end{array}
    (in going from the first line to the second we've applied the case assumption that ae-bd=0 by substituting ae for bd). Since we are assuming that  a\neq 0 , the product is zero exactly when  haf-hcd-bgf+egc=0 . With ae-bd=0 we can rewrite this to fit the form we need: in this  a\neq 0 and  ae-bd=0 case, the given system is singular exactly when  haf-hcd-bgf+egc-i(ae-bd)=0 , that is, nonsingular exactly when  aei+bgf+cdh-hfa-idb-gec\neq 0 , as required. The remaining cases have the same character. Do the  a=0 but  d\neq 0 case and the  a=0 and  d=0 but  g\neq 0 case by first swapping rows and then going on as above. The  a=0 ,  d=0 , and  g=0 case is easy— a set with a zero vector is linearly dependent, and the formula comes out to equal zero.
  3. It is linearly dependent if and only if either vector is a multiple of the other. That is, it is not independent iff
    
\begin{pmatrix} a \\ d \\ g \end{pmatrix}=r\cdot\begin{pmatrix} b \\ e \\ h \end{pmatrix}
\quad\text{or}\quad
\begin{pmatrix} b \\ e \\ h \end{pmatrix}=s\cdot\begin{pmatrix} a \\ d \\ g \end{pmatrix}
    (or both) for some scalars r and s. Eliminating r and s in order to restate this condition only in terms of the given letters a, b, d, e, g, h, we have that it is not independent— it is dependent— iff  ae-bd=0 ,  ah-gb=0 , and  dh-ge=0 .
  4. Dependence or independence is a function of the sixteen entries, so there is indeed a formula (although at first glance a person might think the formula involves cases: "if the first component of the first vector is zero then ...", this guess turns out not to be correct).
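
The expressions in items 1 and 2 are just 2x2 and 3x3 determinants, which a computer algebra system produces directly; a sketch assuming SymPy (the term order of the output may differ):

from sympy import Matrix, symbols

a, b, c, d, e, f, g, h, i = symbols('a b c d e f g h i')
print(Matrix([[a, b],
              [c, d]]).det())   # a*d - b*c
print(Matrix([[a, b, c],
              [d, e, f],
              [g, h, i]]).det())  # a*e*i + b*f*g + c*d*h - a*f*h - b*d*i - c*e*g
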
This exercise is recommended for all readers.
Problem 22
  1. Prove that a set of two perpendicular nonzero vectors from  \mathbb{R}^n is linearly independent when  n>1 .
  2. What if  n=1 ?  n=0 ?
  3. Generalize to more than two vectors.
Answer

Recall that two vectors from  \mathbb{R}^n are perpendicular if and only if their dot product is zero.

  1. Assume that  \vec{v} and  \vec{w} are perpendicular nonzero vectors in \mathbb{R}^n, with  n>1 . With the linear relationship  c\vec{v}+d\vec{w}=\vec{0} , take the dot product of  \vec{v} with both sides to conclude that  c\cdot|\vec{v}|^2+d\cdot 0=0 . Because  \vec{v}\neq\vec{0} we have that  c=0 . A similar dot product with  \vec{w} shows that  d=0 .
  2. Two vectors in  \mathbb{R}^1 are perpendicular if and only if at least one of them is zero, so there is no pair of perpendicular nonzero vectors there and the statement is vacuously true. We define  \mathbb{R}^0 to be a trivial space, and so both \vec{v} and \vec{w} would have to be the zero vector; again the statement is vacuous.
  3. The right generalization is to look at a set  \{\vec{v}_1,\dots,\vec{v}_n\}\subseteq\mathbb{R}^k of vectors that are mutually orthogonal (also called pairwise perpendicular): if  i\neq j then  \vec{v}_i is perpendicular to  \vec{v}_j . Mimicking the proof of the first item above shows that such a set of nonzero vectors is linearly independent.
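
A numeric instance of the dot product argument from item 1; a sketch assuming SymPy, with a perpendicular pair chosen for the illustration:

from sympy import Matrix

v = Matrix([1, 2, 2])
w = Matrix([2, 1, -2])
assert v.dot(w) == 0  # the pair is perpendicular

# Dotting c*v + d*w = 0 with v leaves c*|v|^2 = 0, and |v|^2 is nonzero.
print(v.dot(v), w.dot(w))          # 9 9: both squared lengths are nonzero
print(Matrix.hstack(v, w).rank())  # 2, so the pair is independent, as the argument predicts
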
Problem 23

Consider the set of functions from the open interval (-1..1) to \mathbb{R}.

  1. Show that this set is a vector space under the usual operations.
  2. Recall the formula for the sum of an infinite geometric series:  1+x+x^2+\cdots=1/(1-x) for all  x\in(-1..1) . Why does this not express a dependence inside of the set \{g(x)=1/(1-x),f_0(x)=1,f_1(x)=x,f_2(x)=x^2,\ldots\} (in the vector space that we are considering)? (Hint. Review the definition of linear combination.)
  3. Show that the set in the prior item is linearly independent.

This shows that some vector spaces exist with linearly independent subsets that are infinite.

Answer
  1. This check is routine.
  2. The summation is infinite (has infinitely many summands). The definition of linear combination involves only finite sums.
  3. No nontrivial finite sum of members of  \{g,f_0,f_1,\ldots\} adds to the zero object: assume that
    
c_0\cdot (1/(1-x))+c_1\cdot 1+\dots+c_n\cdot x^n=0
    (any finite sum uses a highest power, here  n ). Multiply both sides by  1-x ; the result is a polynomial that describes the zero function, and hence is the zero polynomial, because a polynomial describes the zero function only when it is the zero polynomial. Comparing coefficients from the highest power down then gives that each coefficient is zero: first  c_n=0 , then  c_{n-1}=0 , and so on, and finally  c_0=0 .
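
The multiply-by-(1-x) step can be carried out symbolically for a small case, say highest power n=2; a sketch assuming SymPy (the coefficient names here are for the illustration):

from sympy import symbols, expand, Poly

x, c0, c1, c2, c3 = symbols('x c0 c1 c2 c3')
relation = c0 / (1 - x) + c1 + c2 * x + c3 * x**2
p = expand((1 - x) * relation)  # the left side times 1-x: now a polynomial
print(Poly(p, x).all_coeffs())  # [-c3, c3 - c2, c2 - c1, c0 + c1]
# Setting each coefficient to zero, from the highest power down, forces
# c3 = 0, then c2 = 0, then c1 = 0, and finally c0 = 0.
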
Problem 24

Show that, where  S is a subspace of  V , if a subset T of  S is linearly independent in  S then T is also linearly independent in  V . Is that "only if"?

Answer

It is both "if" and "only if".

Let  T be a subset of the subspace  S of the vector space  V . The assertion that any linear relationship c_1\vec{t}_1+\dots+c_n\vec{t}_n=\vec{0} among members of  T must be the trivial relationship c_1=0, ..., c_n=0 is a statement that holds in  S if and only if it holds in  V , because the subspace  S inherits its addition and scalar multiplication operations from  V .