3 Matrices

3.4 Multiplication properties

Proposition 3.4.1.

Let $A$ and $A'$ be $m\times n$ matrices, let $B$ and $B'$ be $n\times p$ matrices, let $C$ be a $p\times q$ matrix, and let $\lambda$ be a number. Then

  1. $A(BC) = (AB)C$ (associativity),

  2. $(A + A')B = AB + A'B$, and $A(B + B') = AB + AB'$ (distributivity),

  3. $(\lambda A)B = \lambda(AB) = A(\lambda B)$, and

  4. $(AB)^T = B^T A^T$.

Proof.

Let $A = (a_{ij})$, $A' = (a'_{ij})$, $B = (b_{ij})$, $B' = (b'_{ij})$, $C = (c_{ij})$. During this proof we also write $X_{ij}$ to mean the $i,j$ entry of a matrix $X$.

  1. $AB$ has $i,j$ entry $\sum_{k=1}^n a_{ik} b_{kj}$, so the $i,j$ entry of $(AB)C$ is

     $$\sum_{l=1}^p (AB)_{il} c_{lj} = \sum_{l=1}^p \sum_{k=1}^n a_{ik} b_{kl} c_{lj}. \tag{3.8}$$

     On the other hand, the $i,j$ entry of $BC$ is $\sum_{l=1}^p b_{il} c_{lj}$, so the $i,j$ entry of $A(BC)$ is

     $$\sum_{k=1}^n a_{ik} (BC)_{kj} = \sum_{k=1}^n a_{ik} \sum_{l=1}^p b_{kl} c_{lj} = \sum_{k=1}^n \sum_{l=1}^p a_{ik} b_{kl} c_{lj}. \tag{3.9}$$

    (3.9) and (3.8) are the same because it doesn't matter whether we do the $k$ summation or the $l$ summation first: we get the same terms in a different order.

  2. The $i,j$ entry of $(A + A')B$ is $\sum_{k=1}^n (a_{ik} + a'_{ik}) b_{kj}$, which equals $\sum_{k=1}^n a_{ik} b_{kj} + \sum_{k=1}^n a'_{ik} b_{kj}$. This is the sum of the $i,j$ entry of $AB$ and the $i,j$ entry of $A'B$, proving the first equality. The second is similar.

  3. The $i,j$ entry of $\lambda A$ is $\lambda a_{ij}$, so the $i,j$ entry of $(\lambda A)B$ is

     $$\sum_{k=1}^n (\lambda a_{ik}) b_{kj} = \lambda \sum_{k=1}^n a_{ik} b_{kj} = \lambda (AB)_{ij},$$

     so $(\lambda A)B$ and $\lambda(AB)$ have the same $i,j$ entry for every $i$ and $j$, and are therefore equal. The second equality can be proved similarly.

  4. One special case of this result is very easy: for any row vector $\mathbf{a} = (a_1 \;\cdots\; a_n)$ and column vector $\mathbf{b} = (b_1 \;\cdots\; b_n)^T$ we have

     $$\mathbf{a}\mathbf{b} = \sum_{i=1}^n a_i b_i = \sum_{i=1}^n b_i a_i = \mathbf{b}^T \mathbf{a}^T.$$

    For the general case, we start off by checking that $(AB)^T$ and $B^T A^T$ have the same size. $AB$ is $m\times p$, so $(AB)^T$ is $p\times m$, while $B^T$ and $A^T$ are $p\times n$ and $n\times m$ respectively, so $B^T A^T$ is also $p\times m$. Now we only need to show that for any $i$ and $j$ they have the same $i,j$ entry. Let $\mathbf{r}_i$ be the $i$th row of $A$ and $\mathbf{c}_j$ the $j$th column of $B$. The $i,j$ entry of $AB$ is $\mathbf{r}_i \mathbf{c}_j$, so the $i,j$ entry of $(AB)^T$ is $\mathbf{r}_j \mathbf{c}_i$. On the other hand, the $i$th row of $B^T$ is $\mathbf{c}_i^T$ and the $j$th column of $A^T$ is $\mathbf{r}_j^T$, so the $i,j$ entry of $B^T A^T$ is $\mathbf{c}_i^T \mathbf{r}_j^T$. These two are equal by the special case mentioned at the start of this proof. ∎

These results tell you that you can use some of the normal rules of algebra when you work with matrices, just as you could with permutations. Again, as with permutations, what you can't do is assume the commutative property.
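Each identity in Proposition 3.4.1 is an equality of matrices, so it can be spot-checked numerically. Here is a minimal sketch using NumPy (an illustration on random integer matrices, not a proof; `A2` and `B2` play the roles of $A'$ and $B'$):

```python
import numpy as np

# Sizes follow the proposition: A, A2 are m x n, B, B2 are n x p, C is p x q.
rng = np.random.default_rng(0)
m, n, p, q = 2, 3, 4, 5
A, A2 = rng.integers(-5, 5, (m, n)), rng.integers(-5, 5, (m, n))
B, B2 = rng.integers(-5, 5, (n, p)), rng.integers(-5, 5, (n, p))
C = rng.integers(-5, 5, (p, q))
lam = 7

assert np.array_equal(A @ (B @ C), (A @ B) @ C)        # 1: associativity
assert np.array_equal((A + A2) @ B, A @ B + A2 @ B)    # 2: left distributivity
assert np.array_equal(A @ (B + B2), A @ B + A @ B2)    # 2: right distributivity
assert np.array_equal((lam * A) @ B, lam * (A @ B))    # 3: scalars pull out
assert np.array_equal((lam * A) @ B, A @ (lam * B))
assert np.array_equal((A @ B).T, B.T @ A.T)            # 4: transpose reverses the order
```

Integer matrices are used so the equalities hold exactly, with no floating-point rounding.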

3.4.1 Matrix multiplication isn’t commutative

Definition 3.4.1.

Two matrices A and B are said to commute if AB and BA are both defined and AB=BA.

For some pairs of matrices, the product AB is defined but BA is not. For example, if A is 2×3 and B is 3×4 then AB is defined but BA isn’t. Even when both AB and BA are defined and have the same size they won’t in general be equal.

Example 3.4.1.

Let $A = \begin{pmatrix} 1 & 2 \\ 3 & 4 \end{pmatrix}$ and $B = \begin{pmatrix} 5 & 6 \\ 7 & 8 \end{pmatrix}$. Then

$$AB = \begin{pmatrix} 19 & 22 \\ 43 & 50 \end{pmatrix}, \qquad BA = \begin{pmatrix} 23 & 34 \\ 31 & 46 \end{pmatrix}.$$
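Example 3.4.1 can be replayed in code. A quick NumPy check (an illustration, not part of the notes): both products exist and are $2\times 2$, but they are not equal.

```python
import numpy as np

A = np.array([[1, 2], [3, 4]])
B = np.array([[5, 6], [7, 8]])

assert np.array_equal(A @ B, [[19, 22], [43, 50]])
assert np.array_equal(B @ A, [[23, 34], [31, 46]])
assert not np.array_equal(A @ B, B @ A)  # A and B do not commute
```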

3.4.2 The identity matrix

Definition 3.4.2.

The $n\times n$ identity matrix $I_n$ is the matrix whose $i,j$ entry is $1$ if $i=j$ and $0$ otherwise.

For example,

$$I_2 = \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix}, \qquad I_3 = \begin{pmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{pmatrix}.$$

The most important property of identity matrices is that they behave like the number 1 does when you multiply by them.

Theorem 3.4.2.

If $A$ is an $m\times n$ matrix then $I_m A = A I_n = A$.

Proof.

Let $A = (a_{ij})$ and $I_m = (\delta_{ij})$, so $\delta_{ij}$ is $1$ if $i=j$ and $0$ otherwise. The formula for matrix multiplication tells us that for any $i$ and $j$, the $i,j$ entry of $I_m A$ is $\sum_{k=1}^m \delta_{ik} a_{kj}$. The only term in this sum that can be nonzero is the one with $k=i$, so the sum equals $1 \times a_{ij} = a_{ij}$. Thus the $i,j$ entry of $I_m A$ equals $a_{ij}$, the $i,j$ entry of $A$.

The other equality can be proved similarly. ∎
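Theorem 3.4.2 can also be checked numerically. A short NumPy sketch (an illustration on one $2\times 3$ matrix, not a proof):

```python
import numpy as np

# I_m A = A I_n = A for an m x n matrix A; np.eye builds the identity matrix.
m, n = 2, 3
A = np.arange(1, m * n + 1).reshape(m, n)  # the 2 x 3 matrix [[1 2 3], [4 5 6]]

assert np.array_equal(np.eye(m, dtype=int) @ A, A)  # I_2 A = A
assert np.array_equal(A @ np.eye(n, dtype=int), A)  # A I_3 = A
```

Note that the identity on the left is $I_m$ and the one on the right is $I_n$; when $m \neq n$ they are different matrices, so neither assertion is redundant.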