Definition.
A set \(S=\{\overrightarrow{v_1},\overrightarrow{v_2},\ldots,\overrightarrow{v_k}\}\) of vectors of \(\mathbb R^n\)
is linearly independent if the only linear combination of vectors in \(S\) that produces
\(\overrightarrow{0}\) is the trivial one (all coefficients zero), i.e.,
\[c_1\overrightarrow{v_1}+c_2\overrightarrow{v_2}+\cdots+c_k\overrightarrow{v_k}=\overrightarrow{0}
\implies c_1=c_2=\cdots=c_k=0.\]
\(S=\{\overrightarrow{v_1},\overrightarrow{v_2},\ldots,\overrightarrow{v_k}\}\) is linearly dependent
if \(S\) is not linearly independent, i.e., there are scalars \(c_1,c_2,\ldots,c_k\), not all zero, such that
\[c_1\overrightarrow{v_1}+c_2\overrightarrow{v_2}+\cdots+c_k\overrightarrow{v_k}=\overrightarrow{0}.\]
Remark.
\(\{ \overrightarrow{0} \}\) is linearly dependent as \(2\overrightarrow{0}=\overrightarrow{0}\).
\(\{ \overrightarrow{v} \}\) is linearly independent if and only if \(\overrightarrow{v}\neq \overrightarrow{0}\).
Let \(S=\{\overrightarrow{v_1},\overrightarrow{v_2},\ldots,\overrightarrow{v_k}\}\) and \(A=[\overrightarrow{v_1}\:\overrightarrow{v_2}\:\cdots\:\overrightarrow{v_k}]\).
Then \(S\) is linearly independent if and only if \(\overrightarrow{0}\) is the only solution of \(A\overrightarrow{x}=\overrightarrow{0}\)
if and only if \(\operatorname{NS}\left(A\right)=\{ \overrightarrow{0} \}\).
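The criterion \(\operatorname{NS}(A)=\{\overrightarrow{0}\}\) is equivalent to \(\operatorname{rank}(A)=k\), which is easy to test numerically. A minimal sketch using NumPy (the helper name `is_independent` is my own):

```python
import numpy as np

# S is linearly independent iff Ax = 0 has only the trivial solution,
# i.e. iff rank(A) equals the number of columns k.
def is_independent(vectors):
    A = np.column_stack(vectors)
    return np.linalg.matrix_rank(A) == len(vectors)

print(is_independent([np.array([1, 0]), np.array([0, 1])]))  # True
print(is_independent([np.array([1, 2]), np.array([2, 4])]))  # False: v2 = 2 v1
```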
Example.
Determine whether the following vectors are linearly independent.
\[\overrightarrow{v_1}= \left[\begin{array}{r} 1\\ 2 \end{array} \right],\;
\overrightarrow{v_2}= \left[\begin{array}{r} 2\\3 \end{array} \right]\]
Solution. We investigate whether \(c_1\overrightarrow{v_1}+c_2\overrightarrow{v_2}=\overrightarrow{0} \implies c_1=c_2=0\).
\[[A\;\overrightarrow{0}]=\left[\begin{array}{rr|r}\boxed{1}&2&0\\ 2&3&0\end{array} \right]
\xrightarrow{-2R_1+R_2}
\left[\begin{array}{rr|r}\boxed{1}&2&0\\ 0&\boxed{-1}&0\end{array} \right](REF)\]
Each column of \(A\) is a pivot column, so there are no free variables. Hence \(A\overrightarrow{x}=\overrightarrow{0}\)
has the unique solution \(\overrightarrow{x}=\overrightarrow{0}\). Thus \(\overrightarrow{v_1}\) and \(\overrightarrow{v_2}\) are linearly independent.
Note that neither \(\overrightarrow{v_1}\) nor \(\overrightarrow{v_2}\) is a multiple of the other.
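Since \(A\) here is square, the conclusion can also be checked numerically: the columns of a square matrix are linearly independent exactly when its determinant is nonzero. A quick sketch with NumPy:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [2.0, 3.0]])
# For a square matrix: columns independent iff det(A) != 0.
print(np.linalg.det(A))  # approximately -1.0, nonzero, so independent
```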
Example.
Determine whether the columns of \(A\) are linearly independent for \(A=\left[\begin{array}{rrrr}
1&2&3&4\\
1&3&5&8\\
1&2&4&7\end{array} \right]\).
Solution.
\[A=\left[\begin{array}{rrrr}
\boxed{1}&2&3&4\\
1&3&5&8\\
1&2&4&7\end{array} \right]
\xrightarrow[-R_1+R_3]{-R_1+R_2}
\left[\begin{array}{rrrr}
\boxed{1}&2&3&4\\
0&\boxed{1}&2&4\\
0&0&\boxed{1}&3 \end{array} \right] (REF)\]
\(A\) has a non-pivot column (the fourth), so the system \(A\overrightarrow{x}=\overrightarrow{0}\) has a free
variable and hence infinitely many solutions. Thus the columns of \(A\) are linearly dependent. Verify that one
solution is \((x_1,x_2,x_3,x_4)=(1,2,-3,1)\). So we get the following linear dependence relation among the
columns of \(A\):
\[ 1\left[\begin{array}{r} 1\\ 1\\1 \end{array} \right]
+2\left[\begin{array}{r} 2\\ 3\\2 \end{array} \right]
-3\left[\begin{array}{r} 3\\ 5\\4 \end{array} \right]
+1\left[\begin{array}{r} 4\\ 8\\7 \end{array} \right]=\left[\begin{array}{r} 0\\ 0\\0\end{array} \right].\]
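The dependence relation above is just the statement \(A\overrightarrow{x}=\overrightarrow{0}\) for \(\overrightarrow{x}=(1,2,-3,1)\), so it can be verified in one line:

```python
import numpy as np

A = np.array([[1, 2, 3, 4],
              [1, 3, 5, 8],
              [1, 2, 4, 7]])
c = np.array([1, 2, -3, 1])  # the dependence coefficients found above
print(A @ c)  # [0 0 0]
```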
Remark.
The columns of an \(m\times n\) matrix \(A\) are always linearly dependent when \(m < n\): \(A\) has at most
\(m\) pivot positions, so at least one of its \(n\) columns is a non-pivot column, giving a free variable in
the system \(A\overrightarrow{x}=\overrightarrow{0}\).
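This remark can be illustrated numerically: for any \(3\times 5\) matrix the rank is at most \(3\), so the rank is always smaller than the number of columns. A sketch with a random integer matrix (the seed is arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.integers(-5, 5, size=(3, 5))  # m = 3 < n = 5: at most 3 pivots
# rank(A) <= 3 < 5, so the 5 columns must be linearly dependent.
print(np.linalg.matrix_rank(A) < A.shape[1])  # True
```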
Theorem.
A set \(S=\{\overrightarrow{v_1},\overrightarrow{v_2},\ldots,\overrightarrow{v_k}\}\) of \(k\geq 2\) vectors
in \(\mathbb R^n\) is linearly dependent if and only if there exists a vector in \(S\) that is a linear
combination of the other vectors in \(S\).
Proof.
Let \(S=\{\overrightarrow{v_1},\overrightarrow{v_2},\ldots,\overrightarrow{v_k}\}\) be a set of \(k\geq 2\)
vectors in \(\mathbb R^n\). First suppose \(S\) is linearly dependent. Then there are scalars
\(c_1,c_2,\ldots,c_k\), not all zero, such that
\[c_1\overrightarrow{v_1}+c_2\overrightarrow{v_2}+\cdots+c_k\overrightarrow{v_k}=\overrightarrow{0}.\]
Choose \(i\in \{1,2,\ldots,k\}\) such that \(c_i\neq 0\). Then
\begin{align*}
c_1\overrightarrow{v_1}+c_2\overrightarrow{v_2}+\cdots+c_k\overrightarrow{v_k}=\overrightarrow{0}
&\implies -c_i\overrightarrow{v_i}=c_1\overrightarrow{v_1}+\cdots +c_{i-1}\overrightarrow{v_{i-1}}+c_{i+1}\overrightarrow{v_{i+1}}+\cdots+c_k\overrightarrow{v_k}\\
& \implies \overrightarrow{v_i}= -\frac{c_1}{c_i}\overrightarrow{v_1}-\cdots -\frac{c_{i-1}}{c_i}\overrightarrow{v_{i-1}}-\frac{c_{i+1}}{c_i}\overrightarrow{v_{i+1}}-\cdots -\frac{c_k}{c_i}\overrightarrow{v_k}.
\end{align*}
Thus \(\overrightarrow{v_i}\) is a linear combination of the other vectors in \(S\).
Conversely, suppose there is \(i\in \{1,2,\ldots,k\}\) such that
\[\overrightarrow{v_i}= d_1\overrightarrow{v_1}+\cdots +d_{i-1}\overrightarrow{v_{i-1}}+d_{i+1}\overrightarrow{v_{i+1}}+\cdots +d_k\overrightarrow{v_k},\]
for some scalars \(d_1,\ldots,d_{i-1},d_{i+1},\ldots,d_k\). Then we have a nontrivial linear combination
producing \(\overrightarrow{0}\):
\[ d_1\overrightarrow{v_1}+\cdots +d_{i-1}\overrightarrow{v_{i-1}}-\overrightarrow{v_i}+d_{i+1}\overrightarrow{v_{i+1}}+\cdots +d_k\overrightarrow{v_k}=\overrightarrow{0}.\]
Thus \(S=\{\overrightarrow{v_1},\overrightarrow{v_2},\ldots,\overrightarrow{v_k}\}\) is linearly dependent in
\(\mathbb R^n\).
Example.
For \(A=[\overrightarrow{a_1}\:\overrightarrow{a_2}\:\overrightarrow{a_3}\:\overrightarrow{a_4}]
=\left[\begin{array}{rrrr}
1&2&3&4\\
1&3&5&8\\
1&2&4&7\end{array} \right],\) we have shown that the columns are linearly dependent and
\(\overrightarrow{a_1}+2\overrightarrow{a_2}-3\overrightarrow{a_3}+\overrightarrow{a_4}=\overrightarrow{0}\).
We can write the first column in terms of the other columns:
\(\overrightarrow{a_1}=-2\overrightarrow{a_2}+3\overrightarrow{a_3}-\overrightarrow{a_4}\). In fact here we can
write any one of the columns in terms of the others, since every coefficient in the dependence relation is
nonzero; this need not be possible for an arbitrary linearly dependent set of vectors.
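The rearranged expression for \(\overrightarrow{a_1}\) can be confirmed directly:

```python
import numpy as np

a1 = np.array([1, 1, 1]); a2 = np.array([2, 3, 2])
a3 = np.array([3, 5, 4]); a4 = np.array([4, 8, 7])
# a1 = -2 a2 + 3 a3 - a4, rearranged from the dependence relation
print(np.array_equal(a1, -2*a2 + 3*a3 - a4))  # True
```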