# 3.6 Systems of linear equations

## 3.6.1 Definition of a linear system

###### Definition 3.6.1.

A system of $m$ linear equations in $n$ unknowns $x_{1},\ldots,x_{n}$, with coefficients $a_{ij}$ for $1\leqslant i\leqslant m$, $1\leqslant j\leqslant n$ and constant terms $b_{1},\ldots,b_{m}$, is a list of simultaneous equations

 $\begin{aligned}a_{11}x_{1}+a_{12}x_{2}+\cdots+a_{1n}x_{n}&=b_{1}\\ a_{21}x_{1}+a_{22}x_{2}+\cdots+a_{2n}x_{n}&=b_{2}\\ \phantom{a_{11}x_{1}+}\vdots\phantom{+a_{2n}x_{n}}&\phantom{=b}\vdots\\ a_{m1}x_{1}+a_{m2}x_{2}+\cdots+a_{mn}x_{n}&=b_{m}\end{aligned}$

As the notation suggests, we can turn a system of linear equations into a matrix equation and study it using matrix methods.

## 3.6.2 Matrix form of a linear system

Every system of linear equations can be written in matrix form: the above system is equivalent to saying that $A\mathbf{x}=\mathbf{b}$, where $A=(a_{ij})$, $\mathbf{x}=\begin{pmatrix}x_{1}\\ \vdots\\ x_{n}\end{pmatrix}$, and $\mathbf{b}=\begin{pmatrix}b_{1}\\ \vdots\\ b_{m}\end{pmatrix}$.

###### Example 3.6.1.

The system of linear equations

 $\begin{aligned}2x+3y+4z&=5\\ x+5z&=0\end{aligned}\tag{3.10}$

has matrix form

 $\begin{pmatrix}2&3&4\\ 1&0&5\end{pmatrix}\begin{pmatrix}x\\ y\\ z\end{pmatrix}=\begin{pmatrix}5\\ 0\end{pmatrix}.$

This connection means that we can use systems of linear equations to learn about matrices, and use matrices to learn about systems of linear equations. For example, if $A$ is invertible then we can solve the matrix equation

 $A\mathbf{x}=\mathbf{b}$

by multiplying both sides by $A^{-1}$ to get that there is a unique solution $\mathbf{x}=A^{-1}\mathbf{b}$.
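As a quick illustration (not part of the notes), here is a sketch in Python of this "multiply by $A^{-1}$" method for a made-up invertible $2\times 2$ system, using the standard explicit formula for the inverse of a $2\times 2$ matrix:

```python
# Illustration: solving Ax = b for an invertible 2x2 matrix A via
# x = A^{-1} b, using the formula A^{-1} = (1/det A) * [[d, -b], [-c, a]].
from fractions import Fraction

def solve_2x2(A, b):
    """Solve Ax = b for a 2x2 matrix A with nonzero determinant."""
    (a11, a12), (a21, a22) = A
    det = a11 * a22 - a12 * a21
    if det == 0:
        raise ValueError("A is not invertible")
    # Entries of A^{-1}, kept exact with Fraction
    inv = [[Fraction(a22, det), Fraction(-a12, det)],
           [Fraction(-a21, det), Fraction(a11, det)]]
    # x = A^{-1} b
    return [inv[0][0] * b[0] + inv[0][1] * b[1],
            inv[1][0] * b[0] + inv[1][1] * b[1]]

A = [[2, 1], [5, 3]]   # det = 1, so A is invertible (made-up example)
b = [4, 11]
x = solve_2x2(A, b)    # the unique solution A^{-1} b
# Check: A x reproduces b
assert [A[0][0]*x[0] + A[0][1]*x[1], A[1][0]*x[0] + A[1][1]*x[1]] == b
```

Because $A^{-1}$ exists, the solution is unique: any $\mathbf{x}$ with $A\mathbf{x}=\mathbf{b}$ must equal $A^{-1}\mathbf{b}$.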

We are going to make two more observations about solving linear systems based on what we know about matrix multiplication. The first is that by Proposition 3.2.1, the vectors which can be written as $A\mathbf{u}$ for some $\mathbf{u}$ are exactly the ones which are linear combinations of the columns of $A$, that is, vectors of the form

 $u_{1}\mathbf{c}_{1}+\cdots+u_{n}\mathbf{c}_{n}$

where $\mathbf{c}_{j}$ is the $j$th column of $A$. So the matrix equation $A\mathbf{x}=\mathbf{b}$ has a solution if and only if $\mathbf{b}$ can be written as a linear combination of the columns of $A$. This set of linear combinations is therefore important enough to have a name.

###### Definition 3.6.2.

The column space of a matrix $A$, written $C(A)$, is the set of all linear combinations of the columns of $A$.

We can now restate the above by saying that $A\mathbf{x}=\mathbf{b}$ has a solution if and only if $\mathbf{b}$ belongs to the column space of $A$.
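For instance, the vector $\mathbf{b}=\begin{pmatrix}5\\ 0\end{pmatrix}$ from Example 3.6.1 lies in the column space of $A=\begin{pmatrix}2&3&4\\ 1&0&5\end{pmatrix}$, since

 $\begin{pmatrix}5\\ 0\end{pmatrix}=0\begin{pmatrix}2\\ 1\end{pmatrix}+\tfrac{5}{3}\begin{pmatrix}3\\ 0\end{pmatrix}+0\begin{pmatrix}4\\ 5\end{pmatrix},$

so $\mathbf{x}=\begin{pmatrix}0\\ 5/3\\ 0\end{pmatrix}$ is a solution of the system (3.10).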

###### Definition 3.6.3.

A homogeneous matrix equation is one of the form $A\mathbf{x}=\mathbf{0}$.

What does it mean for a vector $\mathbf{k}$ to satisfy $A\mathbf{k}=\mathbf{0}$? Using Proposition 3.2.1 again, it says that

 $k_{1}\mathbf{c}_{1}+\cdots+k_{n}\mathbf{c}_{n}=\mathbf{0}\tag{3.11}$

where the $k_{j}$ are the entries of $\mathbf{k}$ and the $\mathbf{c}_{j}$ are the columns of $A$. An equation of the form (3.11) is called a linear dependence relation, or just a linear dependence, on $\mathbf{c}_{1},\ldots,\mathbf{c}_{n}$.

###### Definition 3.6.4.

Let $\mathbf{v}_{1},\ldots,\mathbf{v}_{n}$ be matrices of the same size. A linear dependence relation on $\mathbf{v}_{1},\ldots,\mathbf{v}_{n}$ is an equation

 $a_{1}\mathbf{v}_{1}+\cdots+a_{n}\mathbf{v}_{n}=\mathbf{0}$

where the $a_{i}$ are numbers. A linear dependence relation is called nontrivial or nonzero if not all of the $a_{i}$ are zero.

We’ve seen that solutions of the matrix equation $A\mathbf{x}=\mathbf{0}$ correspond to linear dependence relations on the columns of $A$.
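To make this concrete, here is a short Python check (an illustration, not part of the notes) that the vector $\mathbf{k}=(-5,2,1)$, found by back-substitution, gives a nontrivial linear dependence relation on the columns of the matrix from Example 3.6.1:

```python
# Verify a nontrivial linear dependence relation on the columns of the
# matrix from Example 3.6.1, i.e. a nonzero solution of Ax = 0.
A = [[2, 3, 4],
     [1, 0, 5]]
k = [-5, 2, 1]  # candidate nullspace vector (found by back-substitution)

# Compute k1*c1 + k2*c2 + k3*c3, one entry per row of A
combo = [sum(A[i][j] * k[j] for j in range(3)) for i in range(2)]
assert combo == [0, 0]           # so -5c1 + 2c2 + c3 = 0
assert any(kj != 0 for kj in k)  # and the relation is nontrivial
```

So $-5\mathbf{c}_{1}+2\mathbf{c}_{2}+\mathbf{c}_{3}=\mathbf{0}$ is a nontrivial linear dependence on the columns of this $A$.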

The solutions of the matrix equation $A\mathbf{x}=\mathbf{0}_{m}$ are so important that they get their own name.

###### Definition 3.6.5.

The nullspace of an $m\times n$ matrix $A$, written $N(A)$, is $\{\mathbf{v}\in\mathbb{R}^{n}:A\mathbf{v}=\mathbf{0}_{m}\}$.

The homogeneous equation $A\mathbf{x}=\mathbf{0}_{m}$ has three important properties: the zero vector is a solution; if $\mathbf{u}$ and $\mathbf{v}$ are solutions then so is $\mathbf{u}+\mathbf{v}$, since $A(\mathbf{u}+\mathbf{v})=A\mathbf{u}+A\mathbf{v}=\mathbf{0}_{m}+\mathbf{0}_{m}=\mathbf{0}_{m}$; and if $\mathbf{u}$ is a solution and $\lambda$ is a number then $\lambda\mathbf{u}$ is also a solution, since $A(\lambda\mathbf{u})=\lambda A\mathbf{u}=\mathbf{0}_{m}$. This is what it means to say that $N(A)$ is a subspace of $\mathbb{R}^{n}$, something we will cover in the final chapter of MATH0005.
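A quick numerical illustration of these closure properties (using the matrix from Example 3.6.1 and a vector in its nullspace; this checks concrete instances, it is not a proof):

```python
# Check the closure properties of N(A) on concrete vectors, using the
# matrix A from Example 3.6.1.
A = [[2, 3, 4],
     [1, 0, 5]]

def apply(A, v):
    """Matrix-vector product A v."""
    return [sum(A[i][j] * v[j] for j in range(len(v))) for i in range(len(A))]

u = [-5, 2, 1]    # in N(A)
v = [-10, 4, 2]   # also in N(A)
assert apply(A, u) == [0, 0]
assert apply(A, v) == [0, 0]
assert apply(A, [u[j] + v[j] for j in range(3)]) == [0, 0]  # u + v in N(A)
assert apply(A, [3 * uj for uj in u]) == [0, 0]             # 3u in N(A)
```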

## 3.6.3 Augmented matrix

The augmented matrix of a system of linear equations whose matrix form is $A\mathbf{x}=\mathbf{b}$ is the matrix which you get by adding $\mathbf{b}$ as an extra column on the right of $A$. We write this as $(A\mid\mathbf{b})$ or just $(A\;\mathbf{b})$.

For example, the augmented matrix for the system of linear equations (3.10) above would be

 $\begin{pmatrix}2&3&4&5\\ 1&0&5&0\end{pmatrix}.$
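Forming the augmented matrix is mechanical: append each entry of $\mathbf{b}$ to the corresponding row of $A$. A one-line Python sketch (an illustration, using the system above):

```python
# Build the augmented matrix (A | b) by appending b as an extra column,
# for the system (3.10) above.
A = [[2, 3, 4],
     [1, 0, 5]]
b = [5, 0]
augmented = [row + [bi] for row, bi in zip(A, b)]
assert augmented == [[2, 3, 4, 5], [1, 0, 5, 0]]
```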

###### Definition 3.6.6.
- A solution to a matrix equation $A\mathbf{x}=\mathbf{b}$ is a vector $\mathbf{y}$ (of numbers this time, not unknowns) such that $A\mathbf{y}=\mathbf{b}$.

- A matrix equation $A\mathbf{x}=\mathbf{b}$ is called consistent if it has a solution and inconsistent otherwise.

A system of linear equations may have a unique solution, many different solutions, or no solutions at all. In future lectures we will see how to find out how many solutions, if any, a system has.
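Three made-up systems in two unknowns illustrate the three possibilities (an informal sketch; systematic methods for deciding which case holds come in later lectures):

```python
# Three tiny systems in two unknowns, one for each possibility.
# (Informal illustration; systematic solution methods come later.)

def residual(x, y):
    """Left-hand sides of the system x + y = 2, x - y = 0."""
    return (x + y, x - y)

# Unique solution: x + y = 2, x - y = 0. Adding the equations gives
# 2x = 2, so x = y = 1 is the only solution.
assert residual(1, 1) == (2, 0)

# Infinitely many solutions: x + y = 2, 2x + 2y = 4. The second
# equation is twice the first, so every pair (x, 2 - x) works.
for x in [-3, 0, 1, 5]:
    assert (x + (2 - x), 2 * x + 2 * (2 - x)) == (2, 4)

# No solutions: x + y = 2 and x + y = 3 cannot both hold, since the
# left-hand sides are equal but the right-hand sides differ.
```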