Jun 18

## Roadmap for studying matrix multiplication

In a way, multiplication of numbers is similar to summation. More precisely, replacing addition/subtraction with multiplication/division and zero with unity, we have the following properties:

A) Commutativity: $ab=ba$ for any two numbers $a,b.$

B) Associativity: $a(bc)=(ab)c$ for any three numbers $a,b,c.$

C) Existence of unity: there is a special number, denoted $1,$ such that $a1=a$ for any number $a.$

D) Existence of an inverse number: for any number $a\neq 0,$ there is another number, denoted $\frac{1}{a}$, such that $a\frac{1}{a}=1.$

The new element is that in D) there is an existence condition $a\neq 0.$

E) Distributivity (link between summation and multiplication). $a(b+c)=ab+ac$ for any three numbers $a,b,c.$

Matrix multiplication is more challenging than summation. We check one by one which of these properties hold for matrices and which do not. For most of this, considering $2\times 2$ matrices is sufficient.

### Commutativity of matrices

Global Idea 3. Among all matrices, symmetric square matrices have properties closest to those of real numbers. A full disclosure of this statement would require a foray into imaginary numbers, which I want to avoid for now.

Surprise #1. In general, commutativity is not true for matrices.

Proof. To prove this type of statement, we have to produce an example of matrices such that $AB\neq BA.$ A trivial possibility is to choose rectangular $A,B$ so that both products $AB$ and $BA$ exist but have different sizes; then they cannot be equal. A more interesting example would be square matrices that don't commute. Remembering Global Idea 3, we should look for bad matrices among asymmetric ones. Therefore let us put $A=\left(\begin{array}{cc}0&1\\-1&0\end{array}\right).$ Also let $B=\left(\begin{array}{cc}1&0\\0&-1\end{array}\right)$ (for $B$ I first took a general $2\times 2$ matrix, found the two products, and then chose the elements of $B$ to make the example simpler). The equations

$AB=\left(\begin{array}{cc}0&1\\-1&0\end{array}\right)\left(\begin{array}{cc}1&0\\0&-1\end{array}\right) =\left(\begin{array}{cc}0&-1\\-1&0\end{array}\right),$ $BA=\left(\begin{array}{cc}1&0\\0 &-1\end{array}\right)\left(\begin{array}{cc}0&1\\-1&0\end{array}\right)=\left(\begin{array}{cc}0&1\\1&0\end{array}\right) .$

prove the statement.
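This computation is easy to spot-check numerically. Here is a minimal sketch in plain Python; the helper `matmul2` is my own illustration, not anything from the text:

```python
# 2x2 matrix product, matrices given as lists of rows; helper for this sketch only
def matmul2(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

A = [[0, 1], [-1, 0]]
B = [[1, 0], [0, -1]]

AB = matmul2(A, B)  # [[0, -1], [-1, 0]]
BA = matmul2(B, A)  # [[0, 1], [1, 0]]
assert AB != BA  # commutativity fails for this pair
```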

### Associativity of matrices

The property looks like this: $A(BC)=(AB)C$ for any three matrices $A,B,C.$ These can be any compatible, not necessarily square, matrices.

Statement. Associativity is true.

No surprises here, and no proofs either. Matrix algebra is full of long, boring proofs, most of which are better skipped as long as you understand the meaning.
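Even without a proof, the statement is easy to try out on compatible rectangular matrices. A minimal sketch in plain Python (the helper `matmul` and the sample matrices are mine, chosen only for illustration):

```python
# General product of compatible rectangular matrices (lists of rows);
# a spot-check of associativity, not a library routine
def matmul(X, Y):
    assert len(X[0]) == len(Y)  # inner dimensions must match
    return [[sum(X[i][k] * Y[k][j] for k in range(len(Y)))
             for j in range(len(Y[0]))] for i in range(len(X))]

A = [[1, 2, 3], [4, 5, 6]]    # 2x3
B = [[1, 0], [2, 1], [0, 3]]  # 3x2
C = [[2, 1], [1, 2]]          # 2x2

# A(BC) and (AB)C are both defined and coincide
assert matmul(A, matmul(B, C)) == matmul(matmul(A, B), C)
```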

### Distributivity

The property looks like this: $A(B+C)=AB+AC$ for any three matrices $A,B,C.$ These can be any compatible, not necessarily square, matrices. Since commutativity may fail, we have to state separately that $(B+C)A=BA+CA$.

Statement. Distributivity is true.

Again, no surprises and no proofs either.
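Both one-sided distributive laws can be checked the same way. A minimal sketch in plain Python; the helpers `madd` and `matmul` and the sample matrices are mine, for illustration only:

```python
# Entrywise sum and product of matrices given as lists of rows;
# helpers for this sketch only
def madd(X, Y):
    return [[x + y for x, y in zip(rx, ry)] for rx, ry in zip(X, Y)]

def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(len(Y)))
             for j in range(len(Y[0]))] for i in range(len(X))]

A = [[1, 2], [3, 4]]
B = [[0, 1], [1, 0]]
C = [[2, 0], [0, 2]]

# left and right distributive laws hold separately
assert matmul(A, madd(B, C)) == madd(matmul(A, B), matmul(A, C))
assert matmul(madd(B, C), A) == madd(matmul(B, A), matmul(C, A))
```

Note that the two assertions really are different statements: since $AB\neq BA$ in general, neither distributive law follows from the other.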