# 3.12 Invertibility and RREF

We are going to prove a theorem connecting invertibility, RREF, and solutions of matrix equations. To do it we need a lemma on the RREF of square matrices.

###### Lemma 3.12.1.

Let $R$ be an $n\times n$ matrix in RREF which is not $I_{n}$. Then $R$ has a column with no leading entry.

###### Proof.

We’ll prove the contrapositive: if every column of $R$ has a leading entry, then $R=I_{n}$. Suppose every column has a leading entry, so there are $n$ leading entries. There’s at most one leading entry per row and there are $n$ rows, so every row must have a leading entry.

The leading entries go from left to right as we move down the rows of the matrix, so the leading entries in rows $1, 2, \ldots, n$ must be in columns $1, 2, \ldots, n$ respectively, otherwise there would be no room to fit them all in.

Because $R$ is in RREF, columns with leading entries have zeroes in all other positions. So the first column is $\begin{pmatrix}1\\ 0\\ 0\\ \vdots\\ 0\end{pmatrix}$, the second column is $\begin{pmatrix}0\\ 1\\ 0\\ \vdots\\ 0\end{pmatrix}$, and so on. These are the columns of the identity matrix, so $R=I_{n}$. ∎
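The lemma can be checked computationally. The sketch below (plain Python; the function name and the example matrix are my own choices for illustration) records the column of each leading entry and, for a $3\times 3$ RREF matrix other than $I_{3}$, exhibits a column with none:

```python
def leading_entry_cols(R):
    """Column index of the leading (leftmost nonzero) entry in each nonzero row."""
    cols = []
    for row in R:
        for j, entry in enumerate(row):
            if entry != 0:
                cols.append(j)
                break
    return cols

# A 3x3 matrix in RREF which is not I_3 (chosen for illustration):
R = [[1, 2, 0],
     [0, 0, 1],
     [0, 0, 0]]

lead = leading_entry_cols(R)
missing = [j for j in range(3) if j not in lead]
print(lead)     # [0, 2]
print(missing)  # [1] -- the second column has no leading entry, as the lemma predicts
```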

Now we can prove the theorem.

###### Theorem 3.12.2.

For an $n\times n$ matrix $A$, the following statements are equivalent.

• (1) $A$ is invertible.

• (2) The only solution to $A\mathbf{x}=\mathbf{0}$ is $\mathbf{x}=\mathbf{0}$.

• (3) There is a sequence of row operations taking $A$ to $I_{n}$.

###### Proof.

We will show that (1) implies (2), (2) implies (3), and (3) implies (1). Once this is done, the truth of any one of the three statements implies the truth of the other two.

• (1) $\Rightarrow$ (2). If $A\mathbf{x}=\mathbf{0}$ then multiplying on the left by $A^{-1}$ gives $\mathbf{x}=A^{-1}\mathbf{0}=\mathbf{0}$.

• (2) $\Rightarrow$ (3). Let $R$ be a matrix in RREF obtained by doing a sequence of row operations to $A$; we know such a matrix exists by Theorem 3.10.1. The solutions to $A\mathbf{x}=\mathbf{0}$ are the same as the solutions to $R\mathbf{x}=\mathbf{0}$ by Theorem 3.9.1, so $R\mathbf{x}=\mathbf{0}$ has only the solution $\mathbf{x}=\mathbf{0}$. Now $R$ must be $I_{n}$: otherwise, by Lemma 3.12.1, $R$ would have a column with no leading entry, so the solutions to $R\mathbf{x}=\mathbf{0}$ would have a free variable, and in particular there would be a nonzero solution.

• (3) $\Rightarrow$ (1). Since there is a sequence of row operations taking $A$ to $I_{n}$, there is an invertible matrix $E$ such that $EA=I_{n}$ by Theorem 3.8.3. Then $A=E^{-1}I_{n}=E^{-1}$, so $A$ is invertible with inverse $E$. ∎
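The proof suggests an algorithm (Gauss–Jordan inversion): row-reduce $A$ while applying the same row operations to $I_{n}$. If $A$ reaches $I_{n}$, the accumulated matrix is the $E$ with $EA=I_{n}$, i.e. $A^{-1}$; if some column acquires no leading entry, then by the theorem $A$ is not invertible. A minimal sketch in Python using exact rational arithmetic (the function name and test matrices are my own choices, not from the text):

```python
from fractions import Fraction

def invert_by_row_ops(A):
    """Row-reduce [A | I_n] to [I_n | A^-1]; return None if A is not invertible."""
    n = len(A)
    # Augment A with the identity matrix; use exact rational arithmetic throughout.
    M = [[Fraction(A[i][j]) for j in range(n)]
         + [Fraction(int(i == j)) for j in range(n)] for i in range(n)]
    for col in range(n):
        # Find a row at or below the current one with a nonzero entry in this column.
        pivot = next((r for r in range(col, n) if M[r][col] != 0), None)
        if pivot is None:
            return None  # a column with no leading entry: A is not invertible
        M[col], M[pivot] = M[pivot], M[col]      # swap rows
        p = M[col][col]
        M[col] = [x / p for x in M[col]]         # scale so the leading entry is 1
        for r in range(n):
            if r != col and M[r][col] != 0:      # clear the rest of the column
                f = M[r][col]
                M[r] = [x - f * y for x, y in zip(M[r], M[col])]
    return [row[n:] for row in M]                # right half is the inverse

inv = invert_by_row_ops([[2, 1], [1, 1]])
print(inv == [[1, -1], [-1, 2]])                 # True
print(invert_by_row_ops([[1, 2], [2, 4]]))       # None: second column gets no pivot
```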

## 3.12.1 Left and right inverses

We saw in the previous chapter that an arbitrary function could have a left inverse but not a right inverse, or a right inverse but not a left inverse. The same is not true of square matrices.

###### Theorem 3.12.3.

Let $A$ be an $n\times n$ matrix. The following are equivalent:

• (i) $A$ is invertible.

• (ii) There is a matrix $L$ such that $LA=I_{n}$.

• (iii) There is a matrix $R$ such that $AR=I_{n}$.

###### Proof.

We will show that (i) $\iff$ (ii) and then that (i) $\iff$ (iii).

Certainly (i) implies (ii): take $L=A^{-1}$. Conversely, suppose that $LA=I_{n}$, and let $\mathbf{v}$ be any solution of $A\mathbf{v}=\mathbf{0}$. Multiplying on the left by $L$ gives $\mathbf{v}=LA\mathbf{v}=L\mathbf{0}=\mathbf{0}$, so $A$ is invertible by Theorem 3.12.2.

Again, (i) obviously implies (iii): take $R=A^{-1}$. If $AR=I_{n}$ then transposing both sides gives $R^{T}A^{T}=I_{n}^{T}=I_{n}$, so $R^{T}$ is a left inverse of $A^{T}$. By the argument above, $A^{T}$ is invertible, and so $A$ is invertible by Theorem 3.5.5. ∎
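A quick numerical check of Theorem 3.12.3 (the $2\times 2$ matrices $A$ and $L$ below are chosen for illustration): a left inverse of a square matrix turns out to be a right inverse as well.

```python
def matmul(X, Y):
    """Multiply matrices given as lists of rows."""
    n, m, p = len(X), len(Y), len(Y[0])
    return [[sum(X[i][k] * Y[k][j] for k in range(m)) for j in range(p)]
            for i in range(n)]

A = [[1, 2], [3, 5]]     # an invertible 2x2 matrix (chosen for illustration)
L = [[-5, 2], [3, -1]]   # chosen so that L A = I_2
I2 = [[1, 0], [0, 1]]

print(matmul(L, A) == I2)  # True: L is a left inverse of A
print(matmul(A, L) == I2)  # True: it is automatically a right inverse too
```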