An $n\times n$ matrix $A$ is called invertible if and only if there exists an $n\times n$ matrix $B$ such that $AB=BA={I}_{n}$.

If there is such a matrix $B$, we can prove that there is only one such matrix $B$:

If $AB=BA={I}_{n}$ and $AC=CA={I}_{n}$, then $B = B{I}_{n} = B(AC) = (BA)C = {I}_{n}C = C$, so $B=C$.

This means that when a matrix is invertible we can talk about *the*
inverse of $A$. We write ${A}^{-1}$ for the inverse of $A$ when it exists.

If an $n\times n$ matrix $A$ has a row of zeroes, or a column of zeroes, then it is not invertible.

Suppose $A$ has a column of zeroes and that $B$ is any other $n\times n$ matrix. By Theorem 3.2.3, the columns of $BA$ are $B$ times the columns of $A$. In particular, one of these columns is $B$ times the zero vector, which is the zero vector. Since one of the columns of $BA$ is all zeroes, $BA$ is not the identity.

If $A$ has a row of zeroes, we can make a similar argument using Theorem 3.2.6. ∎
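As a numerical illustration (not part of the proof), the following NumPy sketch uses a hypothetical $3\times 3$ matrix with a zero column. The zero column survives any left multiplication, so no product $BA$ can be the identity, and NumPy duly refuses to invert $A$.

```python
import numpy as np

# Hypothetical example: a 3x3 matrix whose middle column is all zeroes.
A = np.array([[1.0, 0.0, 2.0],
              [3.0, 0.0, 4.0],
              [5.0, 0.0, 6.0]])

# Any product B @ A keeps the zero middle column, so B @ A is never I_3.
B = np.arange(9.0).reshape(3, 3)
assert not np.any((B @ A)[:, 1])  # middle column of B @ A is all zeroes

# Consequently A is singular: attempting to invert it raises LinAlgError.
try:
    np.linalg.inv(A)
except np.linalg.LinAlgError:
    print("A is not invertible")
```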

If you multiply any number of invertible matrices together, the result is invertible. Recall the shoes-and-socks result about the inverse of a composition of two functions: exactly the same thing is true.

If ${A}_{1},\dots,{A}_{k}$ are invertible $n\times n$ matrices then ${A}_{1}\cdots {A}_{k}$ is invertible with inverse ${A}_{k}^{-1}\cdots {A}_{1}^{-1}$.

The proof is the same as for functions: you can simply check that ${A}_{k}^{-1}\cdots {A}_{1}^{-1}$ is a two-sided inverse to ${A}_{1}\cdots {A}_{k}$ using the associativity property of matrix multiplication.
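A quick numerical sanity check (illustrative only, with $k=3$ random matrices, which are invertible with probability 1): the inverse of the product is the product of the inverses in *reverse* order, and the original order generally fails.

```python
import numpy as np

rng = np.random.default_rng(0)
A1, A2, A3 = (rng.standard_normal((4, 4)) for _ in range(3))

# Inverse of a product = product of inverses in REVERSE order.
lhs = np.linalg.inv(A1 @ A2 @ A3)
rhs = np.linalg.inv(A3) @ np.linalg.inv(A2) @ np.linalg.inv(A1)
assert np.allclose(lhs, rhs)

# The inverses multiplied in the ORIGINAL order are generally not an inverse.
wrong = np.linalg.inv(A1) @ np.linalg.inv(A2) @ np.linalg.inv(A3)
assert not np.allclose(lhs, wrong)
```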

This theorem has a useful corollary about when matrix products are invertible.

Let $A$ and $E$ be $n\times n$ matrices with $E$ invertible. Then $EA$ is invertible if and only if $A$ is invertible, and $AE$ is invertible if and only if $A$ is invertible.

If $A$ is invertible then the theorem tells us that so are $EA$ and $AE$.

Suppose $EA$ is invertible. Certainly ${E}^{-1}$ is invertible (its inverse is $E$), so by the theorem ${E}^{-1}EA$ is invertible, that is, $A$ is invertible. The argument for $AE$ is similar. ∎
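The key identity in this argument, ${E}^{-1}(EA)=A$, can be checked numerically (an illustration only, using random matrices, which are invertible with probability 1):

```python
import numpy as np

rng = np.random.default_rng(1)
E = rng.standard_normal((3, 3))  # invertible with probability 1
A = rng.standard_normal((3, 3))

# E^{-1}(EA) = A, so A is a product of the invertible matrices
# E^{-1} and EA whenever EA is invertible.
assert np.allclose(np.linalg.inv(E) @ (E @ A), A)
```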

Let $A$ be an $n\times n$ matrix. Then $A$ is invertible if and only if ${A}^{T}$ is invertible.

Suppose $A$ is invertible. We claim that ${A}^{T}$ is invertible with inverse ${({A}^{-1})}^{T}$. This is true because

$$\begin{aligned}
{A}^{T}{({A}^{-1})}^{T} &= {({A}^{-1}A)}^{T} && \text{Proposition 3.4.1 part 4}\\
&= {I}_{n}^{T}\\
&= {I}_{n}
\end{aligned}$$

and similarly ${({A}^{-1})}^{T}{A}^{T}={I}_{n}$.

Conversely, suppose ${A}^{T}$ is invertible. The above argument shows that ${({A}^{T})}^{T}$ is invertible, that is, $A$ is invertible. ∎
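As a numerical check of the claim (illustrative only, with a random matrix, invertible with probability 1): the inverse of the transpose equals the transpose of the inverse.

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((4, 4))  # invertible with probability 1

# (A^T)^{-1} = (A^{-1})^T
assert np.allclose(np.linalg.inv(A.T), np.linalg.inv(A).T)
```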