3.4 Multiplication properties

Proposition 3.4.1.

Let $A$ and $A^{\prime}$ be $m\times n$ matrices, let $B$ and $B^{\prime}$ be $n\times p$ matrices, let $C$ be a $p\times q$ matrix, and let $\lambda$ be a number. Then

1. $A(BC)=(AB)C$ (associativity),

2. $(A+A^{\prime})B=AB+A^{\prime}B$ and $A(B+B^{\prime})=AB+AB^{\prime}$ (distributivity),

3. $(\lambda A)B=\lambda(AB)=A(\lambda B)$, and

4. $(AB)^{T}=B^{T}A^{T}$.

Proof.

Let $A=(a_{ij})$, $A^{\prime}=(a^{\prime}_{ij})$, $B=(b_{ij})$, $B^{\prime}=(b^{\prime}_{ij})$, $C=(c_{ij})$. During this proof we also write $X_{ij}$ to mean the $i,j$ entry of a matrix $X$.

1. $AB$ has $i,j$ entry $\sum_{k=1}^{n}a_{ik}b_{kj}$, so the $i,j$ entry of $(AB)C$ is

 $\sum_{l=1}^{p}(AB)_{il}c_{lj}=\sum_{l=1}^{p}\sum_{k=1}^{n}a_{ik}b_{kl}c_{lj}.$ (3.8)

On the other hand, the $i,j$ entry of $BC$ is $\sum_{l=1}^{p}b_{il}c_{lj}$ so the $i,j$ entry of $A(BC)$ is

 $\displaystyle\sum_{k=1}^{n}a_{ik}(BC)_{kj}$ $\displaystyle=\sum_{k=1}^{n}a_{ik}\sum_{l=1}^{p}b_{kl}c_{lj}$ $\displaystyle=\sum_{k=1}^{n}\sum_{l=1}^{p}a_{ik}b_{kl}c_{lj}.$ (3.9)

(3.8) and (3.9) are equal because it doesn’t matter whether we do the $k$ or the $l$ summation first: swapping the two finite sums just produces the same terms in a different order.
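Associativity can be spot-checked numerically. The following sketch uses NumPy; the sizes $m=2$, $n=3$, $p=4$, $q=5$ and the random integer entries are arbitrary choices for illustration, not from the text.

```python
import numpy as np

# Matrices with compatible sizes, as in Proposition 3.4.1:
# A is m x n, B is n x p, C is p x q  (here m=2, n=3, p=4, q=5).
rng = np.random.default_rng(0)
A = rng.integers(-5, 5, size=(2, 3))
B = rng.integers(-5, 5, size=(3, 4))
C = rng.integers(-5, 5, size=(4, 5))

# A(BC) and (AB)C agree entrywise, as the double-sum argument shows.
assert np.array_equal(A @ (B @ C), (A @ B) @ C)
```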

2. The $i,j$ entry of $(A+A^{\prime})B$ is $\sum_{k=1}^{n}(a_{ik}+a^{\prime}_{ik})b_{kj}$, which equals $\sum_{k=1}^{n}a_{ik}b_{kj}+\sum_{k=1}^{n}a^{\prime}_{ik}b_{kj}$. But this is the sum of the $i,j$ entry of $AB$ and the $i,j$ entry of $A^{\prime}B$, proving the first equality. The second is similar.
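Both distributivity laws can likewise be spot-checked with NumPy; the names `A2` and `B2` standing in for $A^{\prime}$ and $B^{\prime}$, and the sizes used, are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(1)
A  = rng.integers(-5, 5, size=(2, 3))
A2 = rng.integers(-5, 5, size=(2, 3))  # plays the role of A'
B  = rng.integers(-5, 5, size=(3, 4))
B2 = rng.integers(-5, 5, size=(3, 4))  # plays the role of B'

# (A + A')B = AB + A'B  and  A(B + B') = AB + AB'
assert np.array_equal((A + A2) @ B, A @ B + A2 @ B)
assert np.array_equal(A @ (B + B2), A @ B + A @ B2)
```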

3. The $i,j$ entry of $\lambda A$ is $\lambda a_{ij}$, so the $i,j$ entry of $(\lambda A)B$ is

 $\sum_{k=1}^{n}(\lambda a_{ik})b_{kj}=\lambda\sum_{k=1}^{n}a_{ik}b_{kj}=\lambda(AB)_{ij}$

so $(\lambda A)B$ and $\lambda(AB)$ have the same $i,j$ entry for any $i,j$, and are therefore equal. The second equality can be proved similarly.
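A quick numerical check of both scalar equalities, again using NumPy with arbitrarily chosen sizes and an arbitrary scalar:

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.integers(-5, 5, size=(2, 3))
B = rng.integers(-5, 5, size=(3, 4))
lam = 7  # an arbitrary scalar

# (lambda A)B = lambda(AB) = A(lambda B)
assert np.array_equal((lam * A) @ B, lam * (A @ B))
assert np.array_equal((lam * A) @ B, A @ (lam * B))
```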

4. One special case of this result is very easy: for any row vector $\mathbf{a}=\begin{pmatrix}a_{1}&\cdots&a_{n}\end{pmatrix}$ and column vector $\mathbf{b}=\begin{pmatrix}b_{1}\\ \vdots\\ b_{n}\end{pmatrix}$ we have

 $\mathbf{a}\mathbf{b}=\sum_{k=1}^{n}a_{k}b_{k}=\sum_{k=1}^{n}b_{k}a_{k}=\mathbf{b}^{T}\mathbf{a}^{T}.$

For the general case, we start off by checking that $(AB)^{T}$ and $B^{T}A^{T}$ have the same size. $AB$ is $m\times p$, so $(AB)^{T}$ is $p\times m$; meanwhile $B^{T}$ and $A^{T}$ are $p\times n$ and $n\times m$ respectively, so $B^{T}A^{T}$ is also $p\times m$. Now we only need to show that for any $i$ and $j$ they have the same $i,j$ entry. Let $\mathbf{r}_{i}$ be the $i$th row of $A$ and $\mathbf{c}_{j}$ the $j$th column of $B$. The $i,j$ entry of $AB$ is $\mathbf{r}_{i}\mathbf{c}_{j}$, so the $i,j$ entry of $(AB)^{T}$ is $\mathbf{r}_{j}\mathbf{c}_{i}$. On the other hand, the $i$th row of $B^{T}$ is $\mathbf{c}_{i}^{T}$ and the $j$th column of $A^{T}$ is $\mathbf{r}_{j}^{T}$, so the $i,j$ entry of $B^{T}A^{T}$ is $\mathbf{c}_{i}^{T}\mathbf{r}_{j}^{T}$. These two are equal by the special case proved at the start of this part of the proof. ∎
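The transpose law, including the reversal of the factors, can be spot-checked numerically. A short sketch with NumPy, using arbitrarily chosen sizes:

```python
import numpy as np

rng = np.random.default_rng(3)
A = rng.integers(-5, 5, size=(2, 3))  # m x n
B = rng.integers(-5, 5, size=(3, 4))  # n x p

# (AB)^T = B^T A^T: note the reversal of the factors.
assert np.array_equal((A @ B).T, B.T @ A.T)

# The unreversed product A^T B^T usually isn't even defined:
# here A.T is 3 x 2 and B.T is 4 x 3, so the sizes don't match.
```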

These results tell you that many of the usual rules of algebra carry over when you work with matrices, just as they did for permutations. Also as with permutations, the one rule you can’t rely on is commutativity.

3.4.1 Matrix multiplication isn’t commutative

Definition 3.4.1.

Two matrices $A$ and $B$ are said to commute if $AB$ and $BA$ are both defined and $AB=BA$.

For some pairs of matrices, the product $AB$ is defined but $BA$ is not. For example, if $A$ is $2\times 3$ and $B$ is $3\times 4$ then $AB$ is defined but $BA$ isn’t. Even when both $AB$ and $BA$ are defined and have the same size they won’t in general be equal.

Example 3.4.1.

Let $A=\begin{pmatrix}1&2\\ 3&4\end{pmatrix}$ and $B=\begin{pmatrix}5&6\\ 7&8\end{pmatrix}$. Then

 $\displaystyle AB$ $\displaystyle=\begin{pmatrix}19&22\\ 43&50\end{pmatrix}$ $\displaystyle BA$ $\displaystyle=\begin{pmatrix}23&34\\ 31&46\end{pmatrix}.$
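The two products in Example 3.4.1 can be reproduced with NumPy:

```python
import numpy as np

A = np.array([[1, 2], [3, 4]])
B = np.array([[5, 6], [7, 8]])

# AB = [[19, 22], [43, 50]] but BA = [[23, 34], [31, 46]].
assert np.array_equal(A @ B, [[19, 22], [43, 50]])
assert np.array_equal(B @ A, [[23, 34], [31, 46]])
assert not np.array_equal(A @ B, B @ A)
```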

3.4.2 The identity matrix

Definition 3.4.2.

The $n\times n$ identity matrix $I_{n}$ is the matrix with $i,j$ entry 1 if $i=j$ and 0 otherwise.

For example,

 $I_{2}=\begin{pmatrix}1&0\\ 0&1\end{pmatrix},\qquad I_{3}=\begin{pmatrix}1&0&0\\ 0&1&0\\ 0&0&1\end{pmatrix}.$

The most important property of identity matrices is that they behave like the number $1$ does when you multiply by them.

Theorem 3.4.2.

If $A$ is an $m\times n$ matrix then $I_{m}A=AI_{n}=A$.

Proof.

Let $A=(a_{ij})$ and $I_{m}=(\delta_{ij})$, so $\delta_{ij}$ is 1 if $i=j$ and 0 otherwise. The formula for matrix multiplication tells us that for any $i$ and $j$, the $i,j$ entry of $I_{m}A$ is $\sum_{k=1}^{m}\delta_{ik}a_{kj}$. The only term in this sum that can be nonzero is the one with $k=i$, so the sum equals $1\times a_{ij}=a_{ij}$. Thus the $i,j$ entry of $I_{m}A$ equals $a_{ij}$, the $i,j$ entry of $A$.

The other equality can be proved similarly. ∎
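Theorem 3.4.2 can also be spot-checked numerically. A sketch with NumPy, for an arbitrarily chosen $2\times 3$ matrix:

```python
import numpy as np

rng = np.random.default_rng(4)
A = rng.integers(-5, 5, size=(2, 3))  # m x n with m=2, n=3

I2 = np.eye(2, dtype=int)  # the identity matrix I_m
I3 = np.eye(3, dtype=int)  # the identity matrix I_n

# I_m A = A I_n = A: identities act like the number 1.
assert np.array_equal(I2 @ A, A)
assert np.array_equal(A @ I3, A)
```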