# 4.6 Linear independence

## 4.6.1 Linear combinations

We met the idea of a linear combination of column vectors in chapter 3. Here it is for elements of an arbitrary vector space.

###### Definition 4.6.1.

Let $V$ be a vector space and $\textbf{v}_{1},\ldots,\textbf{v}_{n}\in V$. A linear combination of $\textbf{v}_{1},\ldots,\textbf{v}_{n}$ is an element of $V$ of the form

 $\lambda_{1}\textbf{v}_{1}+\lambda_{2}\textbf{v}_{2}+\cdots+\lambda_{n}\textbf{v}_{n}$

where the $\lambda_{i}$ are scalars.

## 4.6.2 Linear independence

###### Definition 4.6.2.

Let $V$ be a vector space.

• A sequence $\textbf{v}_{1},\ldots,\textbf{v}_{n}$ of elements of $V$ is linearly independent if and only if the only scalars $\lambda_{1},\ldots,\lambda_{n}$ such that $\sum_{i=1}^{n}\lambda_{i}\textbf{v}_{i}=\mathbf{0}_{V}$ are $\lambda_{1}=\cdots=\lambda_{n}=0.$

• A sequence which is not linearly independent is called linearly dependent.

It is important that linear independence is a property of sequences (not sets) of vectors. Sequences have a particular order, and they can contain the same element multiple times.

Checking whether elements of a vector space are linearly independent is straightforward in principle: try to find a linear combination equal to the zero vector in which not all the scalars are zero. If you can, the sequence is linearly dependent; if you can show no such combination exists, it is linearly independent. For vectors in $\mathbb{F}^{n}$, or for matrices, this amounts to solving a system of linear equations.
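For column vectors this check can be automated: put the vectors as the columns of a matrix and compare its rank with the number of vectors, since full column rank means the homogeneous system has only the zero solution. Here is a minimal sketch using NumPy; the function name `is_linearly_independent` is our own, not a library routine.

```python
import numpy as np

def is_linearly_independent(vectors):
    """Return True if the given column vectors are linearly independent.

    The vectors are independent exactly when the matrix A having them as
    columns has rank equal to the number of vectors, i.e. when A @ x = 0
    has only the zero solution.
    """
    A = np.column_stack(vectors)
    return np.linalg.matrix_rank(A) == len(vectors)

# The vectors of Example 4.6.1: w = u + v, so the three are dependent.
u, v, w = np.array([1, 0]), np.array([0, 1]), np.array([1, 1])
print(is_linearly_independent([u, v]))     # True
print(is_linearly_independent([u, v, w]))  # False
```

Note that `matrix_rank` works with floating-point tolerances, so for vectors with exact rational entries a symbolic computation is more reliable; for small integer examples like these it gives the right answer.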

## 4.6.3 Examples of linear (in)dependence

###### Example 4.6.1.

$\textbf{u}=\begin{pmatrix}1\\ 0\end{pmatrix}$, $\textbf{v}=\begin{pmatrix}0\\ 1\end{pmatrix},\textbf{w}=\begin{pmatrix}1\\ 1\end{pmatrix}$ are not linearly independent in $\mathbb{R}^{2}$, because $1\times\textbf{u}+1\times\textbf{v}+(-1)\times\textbf{w}=\mathbf{0}$.

###### Example 4.6.2.

$\textbf{u}=\begin{pmatrix}1\\ 1\end{pmatrix},\mathbf{v}=\begin{pmatrix}1\\ -1\end{pmatrix}$ are linearly independent in $\mathbb{R}^{2}$. For if $\alpha\textbf{u}+\beta\textbf{v}=\begin{pmatrix}0\\ 0\end{pmatrix}$ then $\begin{pmatrix}\alpha+\beta\\ \alpha-\beta\end{pmatrix}=\begin{pmatrix}0\\ 0\end{pmatrix}$. This is a system of linear equations:

 $\begin{aligned}\alpha+\beta&=0\\ \alpha-\beta&=0\end{aligned}$

For such a simple system it’s easy to find the only solution: adding the two equations gives $2\alpha=0$, so $\alpha=0$, and substituting back gives $\beta=0$. This tells you that the only solution to $\alpha\textbf{u}+\beta\textbf{v}=\mathbf{0}$ is $\alpha=\beta=0$, which is exactly the definition of linear independence for $\textbf{u},\textbf{v}$.

###### Example 4.6.3.

$\begin{pmatrix}1\\ 0\end{pmatrix}$ and $\begin{pmatrix}0\\ 1\end{pmatrix}$ are linearly independent in $\mathbb{R}^{2}$. You can prove this in a similar (but easier) way to the previous example.

More generally, if $\textbf{e}_{i}$ is the height $n$ column vector with a 1 in position $i$ and 0s everywhere else, then the sequence $\textbf{e}_{1},\ldots,\textbf{e}_{n}$ is linearly independent.
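The claim for the vectors $\textbf{e}_{1},\ldots,\textbf{e}_{n}$ follows directly from the definition, since a linear combination of them just records its own coefficients:

```latex
\lambda_{1}\textbf{e}_{1}+\cdots+\lambda_{n}\textbf{e}_{n}
= \begin{pmatrix}\lambda_{1}\\ \vdots\\ \lambda_{n}\end{pmatrix},
\qquad\text{so}\qquad
\sum_{i=1}^{n}\lambda_{i}\textbf{e}_{i}=\mathbf{0}
\iff \lambda_{1}=\cdots=\lambda_{n}=0.
```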

###### Example 4.6.4.

In $\mathcal{F}$, the vector space of all functions $\mathbb{R}\to\mathbb{R}$, I claim that the functions $f(x)=\cos(x)$ and $g(x)=\sin(x)$ are linearly independent. Suppose that $\alpha f+\beta g=0_{\mathcal{F}}$, that is, suppose $\alpha\cos(x)+\beta\sin(x)=0$ for all $x$.

Take $x=0$. Since $\alpha\cos(0)+\beta\sin(0)=0$ we get $\alpha=0$. Now take $x=\pi/2$ to get $\beta\sin(\pi/2)=0$, that is $\beta=0$. We have shown $\alpha=\beta=0$ and so these functions are linearly independent.
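The structure of this argument can be made concrete numerically: evaluating $\alpha\cos(x)+\beta\sin(x)=0$ at the two chosen points $x=0$ and $x=\pi/2$ produces a $2\times 2$ linear system in $\alpha,\beta$ whose coefficient matrix is invertible. A small sketch (the choice of sample points mirrors the proof above):

```python
import numpy as np

# Evaluating alpha*cos(x) + beta*sin(x) = 0 at x = 0 and x = pi/2 gives the
# system M @ (alpha, beta) = 0, where M holds the values of cos and sin at
# those points. M is (up to floating-point error) the identity matrix, so
# its determinant is nonzero and the only solution is alpha = beta = 0.
points = np.array([0.0, np.pi / 2])
M = np.column_stack([np.cos(points), np.sin(points)])
det = np.linalg.det(M)
print(abs(det) > 1e-9)  # True: the system forces alpha = beta = 0
```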

Often, deciding whether a sequence of vectors is linearly independent comes down to checking whether a homogeneous system of linear equations has only the zero solution, so you can apply the methods we learned in chapter 3.
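When a sequence is dependent, solving the homogeneous system also tells you *which* combination gives zero, not just that one exists. One way to compute this exactly is via the null space of the coefficient matrix; a sketch using SymPy (assuming it is available), applied to the vectors of Example 4.6.1:

```python
from sympy import Matrix, zeros

# Columns of A are the vectors u, v, w from Example 4.6.1.
A = Matrix([[1, 0, 1],
            [0, 1, 1]])

# nullspace() returns a basis for the solutions of A @ x = 0; any nonzero
# basis vector gives scalars witnessing a linear dependence.
basis = A.nullspace()
coeffs = basis[0]
print(list(coeffs))               # [-1, -1, 1], i.e. -u - v + w = 0
print(A * coeffs == zeros(2, 1))  # True
```

Up to a sign, this recovers the dependence relation $1\times\textbf{u}+1\times\textbf{v}+(-1)\times\textbf{w}=\mathbf{0}$ found by inspection in Example 4.6.1.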