# 4.14 Kernel and image

## 4.14.1 Definition of kernel and image

To every linear transformation we associate two important subspaces.

###### Definition 4.14.1.

Let $T:V\to W$ be linear.

1. The kernel of $T$, written $\ker T$, is $\{\mathbf{v}\in V:T(\mathbf{v})=\mathbf{0}_{W}\}$.

2. The image of $T$, written $\operatorname{im}T$, is $\{T(\mathbf{v}):\mathbf{v}\in V\}$.

In other words, the image is what we normally mean by the image of a function.

An important family of examples are the linear maps $T_{A}:\mathbb{F}^{n}\to\mathbb{F}^{m}$ defined by left-multiplication by an $m\times n$ matrix $A$ with entries from the field $\mathbb{F}$. In that case the image $\operatorname{im}T_{A}$ is equal to the column space $C(A)$ by Proposition 3.2.1, and the kernel $\ker T_{A}$ is the nullspace $N(A)$.
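This correspondence can be checked computationally. The sketch below, which assumes SymPy is available (a tooling choice, not part of the text), computes a basis for the column space and the nullspace of a small matrix $A$ and verifies that every nullspace basis vector is sent to the zero vector.

```python
from sympy import Matrix

# a 2x3 matrix A, giving a map T_A : R^3 -> R^2
A = Matrix([[1, 2, 3],
            [4, 5, 6]])

# im T_A = C(A): SymPy returns the pivot columns as a basis
col_basis = A.columnspace()

# ker T_A = N(A): a basis for the solution set of Ax = 0
null_basis = A.nullspace()

print(len(col_basis))   # dimension of the image
print(len(null_basis))  # dimension of the kernel

# every kernel basis vector really is mapped to 0
for v in null_basis:
    assert A * v == Matrix([0, 0])
```

Here `columnspace` and `nullspace` return lists of column vectors, so their lengths give the dimensions of the image and kernel respectively.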

## 4.14.2 A property of all linear maps

###### Lemma 4.14.1.

Let $T:V\to W$ be a linear map. Then $T(\mathbf{0}_{V})=\mathbf{0}_{W}$.

###### Proof.
$$T(\mathbf{0}_{V})=T(\mathbf{0}_{V}+\mathbf{0}_{V})=T(\mathbf{0}_{V})+T(\mathbf{0}_{V})$$

by the second part of the definition of linearity. Now add $-T(\mathbf{0}_{V})$ to both sides:

$$\begin{aligned}T(\mathbf{0}_{V})-T(\mathbf{0}_{V})&=T(\mathbf{0}_{V})+T(\mathbf{0}_{V})-T(\mathbf{0}_{V})\\ \mathbf{0}_{W}&=T(\mathbf{0}_{V})\qquad\qed\end{aligned}$$

## 4.14.3 Kernels and images are subspaces

###### Lemma 4.14.2.

Let $T:V\to W$ be linear. Then $\ker T\leqslant V$ and $\operatorname{im}T\leqslant W$.

###### Proof.

To show something is a subspace we must check the three conditions: it contains the zero vector, it is closed under addition, and it is closed under scalar multiplication.

First, the kernel.

1. To show that the kernel contains $\mathbf{0}_{V}$, we must show that $T(\mathbf{0}_{V})=\mathbf{0}_{W}$. That’s exactly Lemma 4.14.1.

2. If $\mathbf{v},\mathbf{w}\in\ker T$ then $T(\mathbf{v}+\mathbf{w})=T(\mathbf{v})+T(\mathbf{w})=\mathbf{0}_{W}+\mathbf{0}_{W}=\mathbf{0}_{W}$, so $\mathbf{v}+\mathbf{w}\in\ker T$.

3. If $\mathbf{v}\in\ker T$ and $\lambda\in\mathbb{F}$ then $T(\lambda\mathbf{v})=\lambda T(\mathbf{v})$ by the second part of the definition of linearity, and this is $\lambda\mathbf{0}_{W}$ which equals $\mathbf{0}_{W}$. Since $T(\lambda\mathbf{v})=\mathbf{0}_{W}$, we have $\lambda\mathbf{v}\in\ker T$.

Next, the image.

1. We know from Lemma 4.14.1 that $T(\mathbf{0}_{V})=\mathbf{0}_{W}$, so $\mathbf{0}_{W}\in\operatorname{im}T$.

2. Any two elements of $\operatorname{im}T$ have the form $T(\mathbf{u}),T(\mathbf{v})$ for some $\mathbf{u},\mathbf{v}\in V$. Then $T(\mathbf{u})+T(\mathbf{v})=T(\mathbf{u}+\mathbf{v})$ by linearity, which is an element of $\operatorname{im}T$, so $\operatorname{im}T$ is closed under addition.

3. If $T(\mathbf{u})\in\operatorname{im}T$ and $\lambda\in\mathbb{F}$ then $\lambda T(\mathbf{u})=T(\lambda\mathbf{u})$ by the second part of the definition of linearity, and this is an element of $\operatorname{im}T$ as it is $T$ applied to something, so $\operatorname{im}T$ is closed under scalar multiplication. ∎
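The closure checks above can be illustrated numerically. The sketch below, assuming SymPy, takes two kernel basis vectors of a concrete matrix and confirms that their sum and a scalar multiple stay in the kernel, and that a sum of two image vectors is again a value of the map.

```python
from sympy import Matrix

A = Matrix([[1, 1, 1]])          # T_A : R^3 -> R^1
v, w = A.nullspace()             # two kernel basis vectors

# kernel closure: sums and scalar multiples stay in the kernel
assert A * (v + w) == Matrix([0])
assert A * (5 * v) == Matrix([0])

# image closure: T(u1) + T(u2) = T(u1 + u2), so a sum of
# image vectors is itself T applied to something
u1 = Matrix([1, 2, 3])
u2 = Matrix([4, 5, 6])
assert A * u1 + A * u2 == A * (u1 + u2)
```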

###### Example 4.14.1.

Let $A=\begin{pmatrix}0&1\\ 0&0\end{pmatrix}$ so that we have a linear map $T_{A}:\mathbb{R}^{2}\to\mathbb{R}^{2}$ given by $T_{A}(\mathbf{x})=A\mathbf{x}$. We will find $\operatorname{im}T_{A}$ and $\ker T_{A}$.

$$\begin{aligned}\operatorname{im}T_{A}&=\left\{T_{A}\begin{pmatrix}x\\ y\end{pmatrix}:x,y\in\mathbb{R}\right\}\\ &=\left\{\begin{pmatrix}0&1\\ 0&0\end{pmatrix}\begin{pmatrix}x\\ y\end{pmatrix}:x,y\in\mathbb{R}\right\}\\ &=\left\{\begin{pmatrix}y\\ 0\end{pmatrix}:y\in\mathbb{R}\right\}\end{aligned}$$

Another way to write this is that $\operatorname{im}T_{A}=\operatorname{span}\begin{pmatrix}1\\ 0\end{pmatrix}$, and so $\dim\operatorname{im}T_{A}=1$.

Now we’ll do the kernel.

$$\begin{aligned}\ker T_{A}&=\left\{\begin{pmatrix}x\\ y\end{pmatrix}\in\mathbb{R}^{2}:T_{A}\begin{pmatrix}x\\ y\end{pmatrix}=\begin{pmatrix}0\\ 0\end{pmatrix}\right\}\\ &=\left\{\begin{pmatrix}x\\ y\end{pmatrix}\in\mathbb{R}^{2}:\begin{pmatrix}0&1\\ 0&0\end{pmatrix}\begin{pmatrix}x\\ y\end{pmatrix}=\begin{pmatrix}0\\ 0\end{pmatrix}\right\}\\ &=\left\{\begin{pmatrix}x\\ y\end{pmatrix}\in\mathbb{R}^{2}:\begin{pmatrix}y\\ 0\end{pmatrix}=\begin{pmatrix}0\\ 0\end{pmatrix}\right\}\\ &=\left\{\begin{pmatrix}x\\ 0\end{pmatrix}:x\in\mathbb{R}\right\}\end{aligned}$$

Again we could write this as $\ker T_{A}=\operatorname{span}\begin{pmatrix}1\\ 0\end{pmatrix}$. The kernel and image are equal in this case.
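This computation can be double-checked with SymPy (a tooling assumption, not part of the example): both the column space and the nullspace of $A$ come out as the span of $(1,0)^{T}$.

```python
from sympy import Matrix

A = Matrix([[0, 1],
            [0, 0]])

e1 = Matrix([1, 0])
assert A.columnspace() == [e1]   # im T_A = span of (1,0)^T
assert A.nullspace() == [e1]     # ker T_A = span of (1,0)^T
```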

###### Example 4.14.2.

Let $D:\mathbb{R}_{\leqslant n}[x]\to\mathbb{R}_{\leqslant n}[x]$ be $D(f)=\frac{df}{dx}$. We will describe $\ker D$ and $\operatorname{im}D$.

A polynomial has derivative zero if and only if it is constant, so $\ker D$ is the set of all constant polynomials. This is spanned by any (nonzero) constant polynomial, so it has dimension one.

Next consider $\operatorname{im}D$. Let $S\leqslant\mathbb{R}_{\leqslant n}[x]$ be the subspace spanned by $1,x,\ldots,x^{n-1}$, that is, the subspace consisting of all polynomials of degree at most $n-1$. Certainly $\operatorname{im}D\leqslant S$, since differentiating a polynomial of degree at most $n$ gives a polynomial of degree at most $n-1$. But if $s(x)\in S$ then $s(x)$ has an antiderivative $t(x)$ in $\mathbb{R}_{\leqslant n}[x]$ with $D(t)=s$, so every $s\in S$ lies in $\operatorname{im}D$, and hence $\operatorname{im}D=S$.
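One way to see this concretely, sketched here in SymPy for $n=3$, is to write $D$ as a matrix in the monomial basis $1,x,x^{2},x^{3}$: its nullspace is one-dimensional (the constants), and its rank is $n$, matching $\dim S$.

```python
from sympy import Matrix

n = 3
# matrix of D in the basis 1, x, x^2, x^3: D(x^j) = j * x^(j-1),
# so column j has entry j in row j-1 and zeros elsewhere
D = Matrix(n + 1, n + 1,
           lambda i, j: j if i == j - 1 else 0)

assert len(D.nullspace()) == 1   # ker D: the constant polynomials
assert D.rank() == n             # dim im D = n = dim S, so im D = S
```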

A useful property of the kernel of a linear map is that it tells you whether or not the map is injective.

###### Proposition 4.14.3.

A linear map $T:U\to V$ is injective if and only if $\ker T=\{\mathbf{0}_{U}\}$.

###### Proof.

Suppose $T$ is injective. By Lemma 4.14.1, $T(\mathbf{0}_{U})=\mathbf{0}_{V}$, so $\mathbf{0}_{U}\in\ker T$. Now if $\mathbf{u}\in\ker T$ then $T(\mathbf{u})=\mathbf{0}_{V}$ so $T(\mathbf{u})=T(\mathbf{0}_{U})$, so by injectivity $\mathbf{u}=\mathbf{0}_{U}$. It follows $\ker T=\{\mathbf{0}_{U}\}$.

Conversely suppose $\ker T=\{\mathbf{0}_{U}\}$ and that $T(\mathbf{x})=T(\mathbf{y})$. By linearity $T(\mathbf{x}-\mathbf{y})=\mathbf{0}_{V}$, so $\mathbf{x}-\mathbf{y}\in\ker T$, so $\mathbf{x}-\mathbf{y}=\mathbf{0}_{U}$, so $\mathbf{x}=\mathbf{y}$ and $T$ is injective. ∎
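A quick computational illustration of the proposition, again assuming SymPy: a matrix map is injective exactly when its nullspace basis is empty.

```python
from sympy import Matrix

# independent columns, trivial kernel: T_B is injective
B = Matrix([[1, 0],
            [0, 1],
            [0, 0]])
assert B.nullspace() == []       # ker T_B = {0}

# the matrix from Example 4.14.1 has a nontrivial kernel,
# so T_A is not injective
A = Matrix([[0, 1],
            [0, 0]])
assert A.nullspace() != []
# indeed, two different inputs give the same output:
assert A * Matrix([1, 0]) == A * Matrix([2, 0])
```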