3.4 Multiplication properties

Proposition 3.4.1.

Let $A$ and $A^{\prime}$ be $m\times n$ matrices, let $B$ and $B^{\prime}$ be $n\times p$ matrices, let $C$ be a $p\times q$ matrix, and let $\lambda$ be a number. Then

1.

$A(BC)=(AB)C$ (associativity),

2.

$(A+A^{\prime})B=AB+A^{\prime}B$, and $A(B+B^{\prime})=AB+AB^{\prime}$ (distributivity),

3.

$(\lambda A)B=\lambda(AB)=A(\lambda B)$, and

4.

$(AB)^{T}=B^{T}A^{T}$.

Proof.

Let $A=(a_{ij})$, $A^{\prime}=(a^{\prime}_{ij})$, $B=(b_{ij})$, $B^{\prime}=(b^{\prime}_{ij})$, $C=(c_{ij})$. During this proof we also write $X_{ij}$ to mean the $i,j$ entry of a matrix $X$.

1.

$AB$ has $i,j$ entry $\sum_{k=1}^{n}a_{ik}b_{kj}$, so the $i,j$ entry of $(AB)C$ is

 $\sum_{l=1}^{p}(AB)_{il}c_{lj}=\sum_{l=1}^{p}\sum_{k=1}^{n}a_{ik}b_{kl}c_{lj}.$ (3.7)

On the other hand, the $i,j$ entry of $BC$ is $\sum_{l=1}^{p}b_{il}c_{lj}$, so the $i,j$ entry of $A(BC)$ is

 $\sum_{k=1}^{n}a_{ik}(BC)_{kj}=\sum_{k=1}^{n}a_{ik}\sum_{l=1}^{p}b_{kl}c_{lj}=\sum_{k=1}^{n}\sum_{l=1}^{p}a_{ik}b_{kl}c_{lj}.$ (3.8)

(3.8) and (3.7) are equal because it doesn't matter whether we do the $k$ or the $l$ summation first: we get the same terms, just added in a different order.

2.

The $i,j$ entry of $(A+A^{\prime})B$ is $\sum_{k=1}^{n}(a_{ik}+a^{\prime}_{ik})b_{kj}$, which equals $\sum_{k=1}^{n}a_{ik}b_{kj}+\sum_{k=1}^{n}a^{\prime}_{ik}b_{kj}$. But this is the sum of the $i,j$ entry of $AB$ and the $i,j$ entry of $A^{\prime}B$, proving the first equality. The second is similar.

3.

The $i,j$ entry of $\lambda A$ is $\lambda a_{ij}$, so the $i,j$ entry of $(\lambda A)B$ is

 $\sum_{k=1}^{n}(\lambda a_{ik})b_{kj}=\lambda\sum_{k=1}^{n}a_{ik}b_{kj}=\lambda(AB)_{ij},$

so $(\lambda A)B$ and $\lambda(AB)$ have the same $i,j$ entry for any $i,j$, and are therefore equal. The second equality can be proved similarly.

4.

This will be an exercise on one of your problem sets. ∎
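None of this is part of the proof, but the four identities are easy to sanity-check numerically. Here is a NumPy sketch with arbitrarily chosen integer matrices (the sizes $2\times 3$, $3\times 4$, $4\times 2$ and the scalar $3$ are illustrative choices, not from the text); integer entries keep the equality checks exact.

```python
import numpy as np

rng = np.random.default_rng(0)
A  = rng.integers(-5, 6, size=(2, 3))  # m x n
A2 = rng.integers(-5, 6, size=(2, 3))  # plays the role of A'
B  = rng.integers(-5, 6, size=(3, 4))  # n x p
B2 = rng.integers(-5, 6, size=(3, 4))  # plays the role of B'
C  = rng.integers(-5, 6, size=(4, 2))  # p x q
lam = 3                                 # the scalar lambda

# 1. associativity: A(BC) = (AB)C
assert np.array_equal(A @ (B @ C), (A @ B) @ C)
# 2. distributivity on both sides
assert np.array_equal((A + A2) @ B, A @ B + A2 @ B)
assert np.array_equal(A @ (B + B2), A @ B + A @ B2)
# 3. scalars move freely through a product
assert np.array_equal((lam * A) @ B, lam * (A @ B))
assert np.array_equal((lam * A) @ B, A @ (lam * B))
# 4. transpose reverses the order of a product
assert np.array_equal((A @ B).T, B.T @ A.T)
```

A check like this proves nothing, of course, but running it on a few random matrices is a useful habit when you are unsure whether you have remembered an identity correctly (for instance, whether $(AB)^{T}$ is $A^{T}B^{T}$ or $B^{T}A^{T}$).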

These results tell you that you can use many of the normal rules of algebra when you work with matrices, just as for permutations. Also as with permutations, the one rule you can't use is commutativity.

3.4.1 Matrix multiplication isn’t commutative

Definition 3.4.1.

Two matrices $A$ and $B$ are said to commute if $AB$ and $BA$ are both defined and $AB=BA$.

For some pairs of matrices, the product $AB$ is defined but $BA$ is not. For example, if $A$ is $2\times 3$ and $B$ is $3\times 4$ then $AB$ is defined but $BA$ isn’t. Even when both $AB$ and $BA$ are defined and have the same size they won’t in general be equal.

Example 3.4.1.

Let $A=\begin{pmatrix}1&2\\ 3&4\end{pmatrix}$ and $B=\begin{pmatrix}5&6\\ 7&8\end{pmatrix}$. Then

 $\displaystyle AB$ $\displaystyle=\begin{pmatrix}19&22\\ 43&50\end{pmatrix}$ $\displaystyle BA$ $\displaystyle=\begin{pmatrix}23&34\\ 31&46\end{pmatrix}.$
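The same computation in NumPy (the `@` operator is matrix multiplication) confirms that these two products really differ:

```python
import numpy as np

A = np.array([[1, 2], [3, 4]])
B = np.array([[5, 6], [7, 8]])

print(A @ B)  # [[19 22], [43 50]]
print(B @ A)  # [[23 34], [31 46]]
assert not np.array_equal(A @ B, B @ A)
```

In fact $2\times 2$ matrices that do commute are the exception: note that here not a single entry of $AB$ agrees with the corresponding entry of $BA$.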

3.4.2 The identity matrix

Definition 3.4.2.

The $n\times n$ identity matrix $I_{n}$ is the matrix with $i,j$ entry 1 if $i=j$ and 0 otherwise.

For example,

 $I_{2}=\begin{pmatrix}1&0\\ 0&1\end{pmatrix},\quad I_{3}=\begin{pmatrix}1&0&0\\ 0&1&0\\ 0&0&1\end{pmatrix}.$

The most important property of identity matrices is that they behave like the number $1$ does when you multiply by them.

Theorem 3.4.2.

If $A$ is an $m\times n$ matrix then $I_{m}A=AI_{n}=A$.

Proof.

Let $A=(a_{ij})$ and write $\delta_{ij}$ for the $i,j$ entry of an identity matrix, so $\delta_{ij}$ is 1 if $i=j$ and 0 otherwise. The formula for matrix multiplication tells us that for any $i$ and $j$, the $i,j$ entry of $I_{m}A$ is $\sum_{k=1}^{m}\delta_{ik}a_{kj}$. The only term in this sum that can be nonzero is the one with $k=i$, so the sum equals $1\times a_{ij}=a_{ij}$. Thus the $i,j$ entry of $I_{m}A$ equals $a_{ij}$, the $i,j$ entry of $A$.

The other equality can be proved similarly. ∎
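A quick numerical illustration (not a proof): `np.eye(n)` builds $I_{n}$, and multiplying a $2\times 3$ matrix by $I_{2}$ on the left or $I_{3}$ on the right leaves it unchanged. The particular matrix used here is an arbitrary choice.

```python
import numpy as np

A = np.arange(6).reshape(2, 3)  # an arbitrary 2x3 matrix

# I_m A = A and A I_n = A, with m = 2 and n = 3
assert np.array_equal(np.eye(2, dtype=int) @ A, A)
assert np.array_equal(A @ np.eye(3, dtype=int), A)
```

Note that when $A$ is not square, two different identity matrices are involved: $I_{m}$ on the left and $I_{n}$ on the right, since otherwise the products would not be defined.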