
## Linear Independence

Definition. A set $$S=\{\overrightarrow{v_1},\overrightarrow{v_2},\ldots,\overrightarrow{v_k}\}$$ of vectors of a vector space $$V$$ is linearly independent if the only linear combination of vectors in $$S$$ that produces $$\overrightarrow{0}$$ is the trivial linear combination, i.e., $c_1\overrightarrow{v_1}+c_2\overrightarrow{v_2}+\cdots+c_k\overrightarrow{v_k}=\overrightarrow{0} \implies c_1=c_2=\cdots=c_k=0.$ The set $$S$$ is linearly dependent if $$S$$ is not linearly independent, i.e., there are scalars $$c_1,c_2,\ldots,c_k$$, not all zero, such that $c_1\overrightarrow{v_1}+c_2\overrightarrow{v_2}+\cdots+c_k\overrightarrow{v_k}=\overrightarrow{0}.$
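For vectors in $$\mathbb R^n$$, the definition can be tested numerically: the set is linearly independent exactly when the matrix with these vectors as columns has rank equal to the number of vectors. A minimal sketch, assuming NumPy is available; the three sample vectors are hypothetical, chosen so the third is the sum of the first two:

```python
import numpy as np

# Three vectors in R^3; the third equals the sum of the first two,
# so the set is linearly dependent.
vectors = [np.array([1.0, 0.0, 2.0]),
           np.array([0.0, 1.0, 1.0]),
           np.array([1.0, 1.0, 3.0])]

# Stack the vectors as columns; full column rank <=> linear independence.
A = np.column_stack(vectors)
independent = np.linalg.matrix_rank(A) == len(vectors)
print(independent)  # False: some nontrivial combination produces the zero vector
```

Here the rank is 2 rather than 3, reflecting the dependence relation $$\overrightarrow{v_3}=\overrightarrow{v_1}+\overrightarrow{v_2}$$.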

Example.

1. $$\{\overrightarrow{v}\}$$ is linearly independent in $$V$$ if and only if $$\overrightarrow{v}\neq \overrightarrow{0_V}$$.

2. $$\{\overrightarrow{e_1},\overrightarrow{e_2},\ldots,\overrightarrow{e_n}\}$$ is a linearly independent set of vectors in $$\mathbb R^n$$.

3. $$\{\overrightarrow{1},\overrightarrow{t},\overrightarrow{t^2},\ldots,\overrightarrow{t^n}\}$$ is a linearly independent set of vectors in $$P_n$$.

4. $$\{\overrightarrow{e_1},\overrightarrow{e_2},\ldots,\overrightarrow{e_n},\ldots\}$$ is a linearly independent set of vectors in $$\mathbb{R}^{\infty}$$ where $$\overrightarrow{e_i}$$ is the infinite sequence with $$1$$ in the $$i$$th place and $$0$$ elsewhere.

5. $$B=\{\overrightarrow{E_{i,j}}: 1 \leq i \leq m,1 \leq j \leq n\}$$ is a linearly independent set of vectors in $$M_{m, n}(\mathbb R)$$ where $$\overrightarrow{E_{i,j}}$$ is the $$m\times n$$ matrix with $$(i,j)$$-entry 1 and $$0$$ elsewhere.

6. Consider the following three polynomials in $$P_2$$: $$\overrightarrow{p_1}(t)=t+2t^2$$, $$\overrightarrow{p_2}(t)=2+2t^2$$, and $$\overrightarrow{p_3}(t)=1-t-t^2$$. Show that $$\{\overrightarrow{p_1},\;\overrightarrow{p_2},\;\overrightarrow{p_3}\}$$ is a linearly dependent set in $$P_2$$.

Solution. Suppose $$c_1\overrightarrow{p_1}+c_2\overrightarrow{p_2}+c_3\overrightarrow{p_3}=\overrightarrow{0}$$ for some scalars $$c_1,c_2,c_3$$. Then for all $$t$$, \begin{align*} (c_1\overrightarrow{p_1}+c_2\overrightarrow{p_2}+c_3\overrightarrow{p_3})(t)&=0\\ c_1\overrightarrow{p_1}(t)+c_2\overrightarrow{p_2}(t)+c_3\overrightarrow{p_3}(t)&=0\\ c_1(t+2t^2)+c_2(2+2t^2)+c_3(1-t-t^2)&=0\\ (2c_2+c_3)+(c_1-c_3)t+(2c_1+2c_2-c_3)t^2&=0. \end{align*} Thus $$2c_2+c_3=0,\;c_1-c_3=0,\;2c_1+2c_2-c_3=0$$. This homogeneous system has nontrivial solutions; one is $$(c_1,c_2,c_3)=(2,-1,2)$$. So $$2\overrightarrow{p_1}-\overrightarrow{p_2}+2\overrightarrow{p_3}=\overrightarrow{0}$$ and $$\{\overrightarrow{p_1},\;\overrightarrow{p_2},\;\overrightarrow{p_3}\}$$ is a linearly dependent set in $$P_2$$.
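The dependence relation found above can be checked by identifying each polynomial in $$P_2$$ with its coefficient vector in $$\mathbb R^3$$ with respect to $$\{1,t,t^2\}$$. A small numeric verification, assuming NumPy:

```python
import numpy as np

# Coefficient vectors (constant, t, t^2) of the three polynomials.
p1 = np.array([0.0, 1.0, 2.0])    # t + 2t^2
p2 = np.array([2.0, 0.0, 2.0])    # 2 + 2t^2
p3 = np.array([1.0, -1.0, -1.0])  # 1 - t - t^2

# The relation 2*p1 - p2 + 2*p3 from the solution should give the zero vector.
combo = 2 * p1 - p2 + 2 * p3
print(combo)  # [0. 0. 0.]
```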

Theorem. A set $$S=\{\overrightarrow{v_1},\overrightarrow{v_2},\ldots,\overrightarrow{v_k}\}$$ of $$k\geq 2$$ vectors in a vector space $$V$$ is linearly dependent if and only if there exists a vector in $$S$$ that is a linear combination of the other vectors in $$S$$.

Proof. Similar to the proof of the last theorem in Linear Independence in $$\mathbb R^n$$.
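The theorem can be illustrated with Example 6: since the set is dependent, one of the polynomials should lie in the span of the other two. A sketch, again assuming NumPy and the coefficient-vector identification of $$P_2$$ with $$\mathbb R^3$$ (a least-squares solve with zero residual confirms the exact combination):

```python
import numpy as np

# Coefficient vectors (constant, t, t^2) of p1, p2, p3 from Example 6.
p1 = np.array([0.0, 1.0, 2.0])
p2 = np.array([2.0, 0.0, 2.0])
p3 = np.array([1.0, -1.0, -1.0])

# Solve p2 = a*p1 + b*p3; an exact fit shows p2 is a linear
# combination of the other vectors in the set.
A = np.column_stack([p1, p3])
coeffs, *_ = np.linalg.lstsq(A, p2, rcond=None)
print(coeffs)  # approximately [2. 2.], i.e. p2 = 2*p1 + 2*p3
```

This agrees with rearranging the relation $$2\overrightarrow{p_1}-\overrightarrow{p_2}+2\overrightarrow{p_3}=\overrightarrow{0}$$ to $$\overrightarrow{p_2}=2\overrightarrow{p_1}+2\overrightarrow{p_3}$$.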
