# 4.12 Finding dimensions

The extension lemma has all sorts of consequences that are very useful for making arguments about the dimension of a vector space. In this section we’ll write down the most common ones.

## 4.12.1 Lower bound for the dimension of a vector space

As soon as you see $k$ linearly independent elements in a vector space, you know its dimension is at least $k$.

###### Corollary 4.12.1.

Let $V$ be a vector space and let $\mathbf{v}_{1},\ldots,\mathbf{v}_{k}$ be linearly independent elements of $V$. Then $\dim V\geqslant k$.

###### Proof.

You can extend these elements to a basis of $V$ having size at least $k$, and the size of that basis is the dimension of $V$. ∎
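To see the corollary numerically, here is a small NumPy sketch (an illustration of mine, not part of the development): the rank of a matrix whose rows are given vectors counts how many linearly independent vectors appear among them, so rank $k$ certifies $\dim V\geqslant k$.

```python
import numpy as np

# Two linearly independent vectors in R^3: neither is a scalar
# multiple of the other.
v1 = np.array([1.0, 0.0, 2.0])
v2 = np.array([0.0, 1.0, 1.0])

# The rank of the matrix with these as rows is the number of
# linearly independent vectors among them.
k = np.linalg.matrix_rank(np.vstack([v1, v2]))
print(k)  # 2, so dim R^3 >= 2 (and indeed dim R^3 = 3)
```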

## 4.12.2 Any $\dim V+1$ elements must be linearly dependent

###### Theorem 4.12.2.

Any sequence of at least $n+1$ elements in a vector space of dimension $n$ is linearly dependent.

###### Proof.

The vector space has a basis of size $n$, which is in particular a spanning sequence of size $n$. By Theorem 4.9.1 (Steinitz Exchange), any linearly independent sequence has size at most $n$. ∎

For example, if you have 4 vectors in $\mathbb{R}^{3}$ you know they must be linearly dependent, no matter what they are.
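Here is a NumPy sketch of that example (the four vectors are arbitrary choices of mine): the rank of a $4\times 3$ matrix is at most 3, so its four rows are dependent, and a nonzero vector in the null space of the transpose gives an explicit linear dependence.

```python
import numpy as np

# Four vectors in R^3, written as the rows of a 4x3 matrix.
A = np.array([[1.0, 2.0, 0.0],
              [0.0, 1.0, 3.0],
              [2.0, 1.0, 1.0],
              [1.0, 1.0, 1.0]])

# The rank is at most 3 (the number of columns), and 3 < 4,
# so the four rows must be linearly dependent.
r = np.linalg.matrix_rank(A)
print(r)  # 3

# An explicit dependence is any nonzero c with c @ A == 0.
# The last right singular vector of A.T spans the null space of A.T here.
_, _, vh = np.linalg.svd(A.T)
c = vh[-1]
print(np.allclose(c @ A, 0))  # True
```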

## 4.12.3 Dimensions of subspaces

###### Proposition 4.12.3.

If $U\leqslant V$ then

1. $\dim U\leqslant\dim V$, and

2. if $\dim U=\dim V$ then $U=V$.

###### Proof.

1. A basis of $U$ is a linearly independent sequence in $V$, and a basis of $V$ is (in particular) a spanning sequence for $V$, so by Theorem 4.9.1 the size of a basis of $U$ is at most the size of a basis of $V$.

2. Let $\dim V=n$ and let $\mathbf{u}_{1},\ldots,\mathbf{u}_{n}$ be a basis of $U$, so $U=\operatorname{span}(\mathbf{u}_{1},\ldots,\mathbf{u}_{n})$. Suppose for a contradiction that $U\neq V$, and let $\mathbf{v}$ be an element of $V$ not in $U$. Then $\mathbf{u}_{1},\ldots,\mathbf{u}_{n},\mathbf{v}$ is linearly independent (by the extension lemma, Lemma 4.11.1), which contradicts Theorem 4.12.2. ∎
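A quick numerical illustration of the contrapositive of part 2 (the vectors are my own choices): if $\dim U<\dim V$ then $U\neq V$, so some vector of $V$ lies outside $U$, which shows up as a nonzero least-squares residual.

```python
import numpy as np

# U = span(u1, u2) <= R^3, with u1, u2 linearly independent, so dim U = 2.
u1 = np.array([1.0, 1.0, 0.0])
u2 = np.array([1.0, -1.0, 0.0])
B = np.column_stack([u1, u2])     # 3x2 matrix whose columns span U
print(np.linalg.matrix_rank(B))   # 2, strictly less than dim R^3 = 3

# Since dim U < dim R^3, U is a proper subspace: some vector lies outside it.
v = np.array([0.0, 0.0, 1.0])
coeffs, residual, *_ = np.linalg.lstsq(B, v, rcond=None)
print(residual)  # nonzero: v is not a combination of u1 and u2
```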

As soon as you have $n$ linearly independent elements in a vector space of dimension $n$, they must be a basis.

###### Corollary 4.12.4.

Let $V$ be a vector space of dimension $n$. Any sequence of $n$ linearly independent elements of $V$ is a basis of $V$.

###### Proof.

Let $U$ be the span of this sequence. This length $n$ sequence spans $U$ by definition, and it is linearly independent, so it is a basis of $U$ and $\dim U=n$. The previous proposition tells us $U=V$, so in fact the sequence is a basis of $V$. ∎
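As a sketch of the corollary in $\mathbb{R}^{3}$ (using example vectors of my own, with NumPy): three linearly independent vectors must form a basis, so every vector has coordinates with respect to them, computable with `numpy.linalg.solve`.

```python
import numpy as np

# Three linearly independent vectors in R^3.
b1 = np.array([1.0, 0.0, 1.0])
b2 = np.array([0.0, 1.0, 1.0])
b3 = np.array([1.0, 1.0, 0.0])
B = np.column_stack([b1, b2, b3])
print(np.linalg.matrix_rank(B))  # 3: independent, hence a basis of R^3

# Because they form a basis, every vector has (unique) coordinates:
v = np.array([2.0, 3.0, 4.0])
c = np.linalg.solve(B, v)
print(np.allclose(c[0]*b1 + c[1]*b2 + c[2]*b3, v))  # True
```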

## 4.12.4 Dimension of a sum of subspaces

Consider two finite sets $X$ and $Y$. What’s the size of $X\cup Y$ in terms of the size of $X$ and the size of $Y$? It isn’t $|X|+|Y|$ in general, because elements belonging to both $X$ and $Y$ get counted twice when you add the sizes like this. The correct answer is $|X|+|Y|-|X\cap Y|$. We would like a similar result for sums of subspaces.

###### Theorem 4.12.5.

Let $V$ be a vector space and $X,Y\leqslant V$. Then

 $\dim(X+Y)=\dim X+\dim Y-\dim(X\cap Y).$
###### Proof.

Take a basis $\mathcal{I}=\mathbf{i}_{1},\ldots,\mathbf{i}_{k}$ of $X\cap Y$. Extend $\mathcal{I}$ to a basis $\mathcal{X}=\mathbf{i}_{1},\ldots,\mathbf{i}_{k},\mathbf{x}_{1},\ldots,\mathbf{x}_{n}$ of $X$, using Proposition 4.11.2, and extend $\mathcal{I}$ to a basis $\mathcal{Y}=\mathbf{i}_{1},\ldots,\mathbf{i}_{k},\mathbf{y}_{1},\ldots,\mathbf{y}_{m}$ of $Y$. It’s now enough to prove that $\mathcal{J}=\mathbf{i}_{1},\ldots,\mathbf{i}_{k},\mathbf{x}_{1},\ldots,\mathbf{x}_{n},\mathbf{y}_{1},\ldots,\mathbf{y}_{m}$ is a basis of $X+Y$, because then the size of $\mathcal{J}$, which is $k+n+m$, equals the size of a basis of $X$ (which is $k+n$) plus the size of a basis of $Y$ (which is $k+m$) minus the size of a basis of $X\cap Y$ (which is $k$).

To check something is a basis of $X+Y$, as always, we must check that it is a spanning sequence for $X+Y$ and that it is linearly independent.

Spanning: let $\mathbf{x}+\mathbf{y}\in X+Y$, where $\mathbf{x}\in X,\mathbf{y}\in Y$. Then there are scalars such that

 $\mathbf{x}=\sum_{j=1}^{k}a_{j}\mathbf{i}_{j}+\sum_{j=1}^{n}c_{j}\mathbf{x}_{j},\qquad\mathbf{y}=\sum_{j=1}^{k}b_{j}\mathbf{i}_{j}+\sum_{j=1}^{m}d_{j}\mathbf{y}_{j}$

and so

 $\mathbf{x}+\mathbf{y}=\sum_{j=1}^{k}(a_{j}+b_{j})\mathbf{i}_{j}+\sum_{j=1}^{n}c_{j}\mathbf{x}_{j}+\sum_{j=1}^{m}d_{j}\mathbf{y}_{j},$

so $\mathcal{J}$ spans $X+Y$.

Linear independence: suppose

 $\sum_{j=1}^{k}a_{j}\mathbf{i}_{j}+\sum_{j=1}^{n}c_{j}\mathbf{x}_{j}+\sum_{j=1}^{m}d_{j}\mathbf{y}_{j}=\mathbf{0}.$

Rearrange it:

 $\sum_{j=1}^{k}a_{j}\mathbf{i}_{j}+\sum_{j=1}^{n}c_{j}\mathbf{x}_{j}=-\sum_{j=1}^{m}d_{j}\mathbf{y}_{j}.$

The left hand side lies in $X$ and the right hand side lies in $Y$; since they are equal, both lie in $X\cap Y$. In particular the right hand side is in $X\cap Y$, so since $\mathcal{I}$ is a basis of $X\cap Y$ there are scalars $e_{j}$ such that

 $\sum_{j=1}^{k}e_{j}\mathbf{i}_{j}=-\sum_{j=1}^{m}d_{j}\mathbf{y}_{j}.$

Rearranging gives $\sum_{j=1}^{k}e_{j}\mathbf{i}_{j}+\sum_{j=1}^{m}d_{j}\mathbf{y}_{j}=\mathbf{0}$, a linear dependence on $\mathcal{Y}$, which is linearly independent; so all the $d_{j}$ are 0. The original equation then becomes $\sum_{j=1}^{k}a_{j}\mathbf{i}_{j}+\sum_{j=1}^{n}c_{j}\mathbf{x}_{j}=\mathbf{0}$, a linear dependence on $\mathcal{X}$, so all the $a_{j}$ and $c_{j}$ are 0 as well. Hence $\mathcal{J}$ is linearly independent. ∎
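To close, a numerical check of Theorem 4.12.5 on an example where the intersection is known directly (a choice of subspaces of mine): the $xy$-plane and the $yz$-plane in $\mathbb{R}^{3}$ meet in the line spanned by $\mathbf{e}_{2}$.

```python
import numpy as np

# X = span(e1, e2) (the xy-plane) and Y = span(e2, e3) (the yz-plane) in R^3.
X = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0]])
Y = np.array([[0.0, 1.0, 0.0],
              [0.0, 0.0, 1.0]])

dim_X = np.linalg.matrix_rank(X)                     # 2
dim_Y = np.linalg.matrix_rank(Y)                     # 2
dim_sum = np.linalg.matrix_rank(np.vstack([X, Y]))   # dim(X + Y) = 3

# Here X intersect Y = span(e2), which visibly has dimension 1.
dim_int = 1
print(dim_sum == dim_X + dim_Y - dim_int)  # True
```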