21 Jun 18

## Matrix algebra: questions for repetition

"Matrix algebra is a bunch of dreadful rules" is all that many students remember after studying it. It's a relief to know that a simple property like $AA^{-1}=I$ is more important than remembering how to calculate an inverse. For most of this course, the theoretical formula for the inverse can be avoided; if you need to find the inverse of a numerical matrix, you can use Excel.

### First things first

Three big No's: 1) there is no commutativity, in general, 2) determinants don't apply to non-square matrices, and 3) don't try to invert a non-square matrix. There are ways around these problems, but all of them are deficient, so it is better to stick to the good cases.

Three big ideas: 1) the analogy with real numbers is the best guide to studying matrices, 2) the definition of the matrix product is motivated by the desire to write a system of equations compactly, 3) symmetric matrices have properties closest to those of real numbers.

Three surprises: 1) in general, matrices don't commute (can you give an example?), 2) a nonzero matrix is not necessarily invertible (can you give an example?), 3) when you invert a product, you have to reverse the order of the factors (the same goes for transposition). These last two properties are called the reverse order laws.
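
If you want to check your candidate answers numerically, here is a minimal Python sketch; the matrices $A$ and $B$ below are my own illustrative choices, not the only ones that work.

```python
# Check two of the surprises on concrete 2x2 matrices (nested lists).

def matmul(X, Y):
    # product of two 2x2 matrices
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def transpose(X):
    return [[X[j][i] for j in range(2)] for i in range(2)]

A = [[0, 1], [-1, 0]]
B = [[1, 0], [0, -1]]

AB = matmul(A, B)
BA = matmul(B, A)
print(AB == BA)  # False: A and B do not commute

# Reverse order law for transposition: (AB)^T = B^T A^T
print(transpose(AB) == matmul(transpose(B), transpose(A)))  # True
```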

Comforting news: 1) the properties of summation of numbers have complete analogs for matrices, 2) in the case of multiplication, it's good to know that the existence of a unity, associativity, and distributivity all generalize to matrices.

### Particulars and extensions

Answer the following questions, with proofs where possible. None of the answers requires long boring calculations.

Multiplication. 1) If $A^{2}$ exists, what can you say about $A$? 2) If the last row of $A$ is zero and the product $AB$ exists, what can you say about this product? 3) Where did we use associativity of multiplication?
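
For question 2), a quick numerical sketch (with hypothetical matrices) reveals the pattern: each row of $AB$ is built from the corresponding row of $A$, so a zero last row of $A$ forces a zero last row of $AB$.

```python
def matmul(X, Y):
    # product of an n x k and a k x m matrix (nested lists)
    n, k, m = len(X), len(Y), len(Y[0])
    return [[sum(X[i][t] * Y[t][j] for t in range(k)) for j in range(m)]
            for i in range(n)]

A = [[1, 2], [0, 0]]   # last row of A is zero
B = [[3, 4], [5, 6]]
AB = matmul(A, B)
print(AB)              # the last row of AB is [0, 0]
```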

In what way are the rules for the inverse of a product and the transpose of a product similar? Can you point out any differences between them?

Commutativity: 1) If two matrices commute, do you think their inverses commute? 2) Does a matrix commute with its inverse?
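
For question 1), here is a one-line sketch (assuming both inverses exist): invert both sides of $AB=BA$ and apply the reverse order law:

```latex
(AB)^{-1}=(BA)^{-1}
\quad\Longrightarrow\quad
B^{-1}A^{-1}=A^{-1}B^{-1},
```

so the inverses commute as well.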

Properties of inverses: 1) inverse of an inverse, 2) inverse of a product, 3) inverse of a transpose.

Properties of determinants: 1) why we need them, 2) determinant of a product, 3) determinant of an inverse, 4) determinant of a transpose. 5) Prove the multiplication rule for the determinant of the product of three matrices.
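
For question 5), a sketch that needs only associativity and the two-factor rule $\det(AB)=\det A\cdot\det B$:

```latex
\det(ABC)=\det\big((AB)C\big)=\det(AB)\cdot\det C
=\det A\cdot\det B\cdot\det C.
```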

Properties of the identity matrix: 1) use the definition of the inverse to find the inverse of the identity matrix, 2) do you think the identity matrix commutes with any other matrix? 3) Can you name any matrices, other than the identity, satisfying the equation $A^{2}=A?$ If a matrix satisfies this equation, what can you say about its determinant? 4) What is the determinant of the identity matrix?
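
A hint for question 3): taking determinants of both sides of $A^2=A$ gives $(\det A)^2=\det A$, so $\det A$ is $0$ or $1$. One non-identity solution is a projection; a quick check on this hypothetical example:

```python
def matmul(X, Y):
    # product of two 2x2 matrices (nested lists)
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

A = [[1, 0], [0, 0]]       # a projection onto the first axis
print(matmul(A, A) == A)   # True: A^2 = A, yet A is not the identity
det_A = A[0][0] * A[1][1] - A[0][1] * A[1][0]
print(det_A)               # 0, consistent with det(A) in {0, 1}
```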

If a nonzero number is close to zero, then its inverse must be large in absolute value. True or false? Can you indicate any analogs of this statement for matrices?
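
One matrix analog can be checked on a $2\times 2$ example: the explicit inverse formula divides by the determinant, so when $\det A$ is close to zero, some entries of $A^{-1}$ blow up. A sketch with a hypothetical near-singular matrix:

```python
def inv2(X):
    # explicit inverse of a 2x2 matrix (nested lists)
    det = X[0][0] * X[1][1] - X[0][1] * X[1][0]
    return [[ X[1][1] / det, -X[0][1] / det],
            [-X[1][0] / det,  X[0][0] / det]]

eps = 1e-6
A = [[eps, 0], [0, 1]]   # det(A) = eps, close to zero
A_inv = inv2(A)
print(A_inv[0][0])       # about 1e6: a huge entry
```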

Suppose matrices $A,B$ are given and $\det A\neq 0.$ How would you solve the linear matrix equation $AX=B$ for $X?$
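
Multiplying both sides on the left by $A^{-1}$ (the order matters!) gives $X=A^{-1}B$. A sketch with hypothetical $2\times 2$ matrices:

```python
def matmul(X, Y):
    # product of two 2x2 matrices (nested lists)
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def inv2(X):
    # explicit inverse of a 2x2 matrix
    det = X[0][0] * X[1][1] - X[0][1] * X[1][0]
    return [[ X[1][1] / det, -X[0][1] / det],
            [-X[1][0] / det,  X[0][0] / det]]

A = [[2, 1], [1, 1]]       # det(A) = 1, so A is invertible
B = [[1, 2], [3, 4]]
X = matmul(inv2(A), B)     # X = A^{-1} B
print(matmul(A, X) == B)   # True: X solves AX = B
```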

Symmetric matrices: 1) For any matrix $A,$ both matrices $AA^T$ and $A^TA$ are symmetric. True or false? 2) If a matrix is symmetric and its inverse exists, will the inverse be symmetric?
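
For question 1), the short proof is $(AA^T)^T=(A^T)^T A^T=AA^T$; here is a numerical sanity check on a hypothetical rectangular matrix:

```python
def matmul(X, Y):
    # product of an n x k and a k x m matrix (nested lists)
    n, k, m = len(X), len(Y), len(Y[0])
    return [[sum(X[i][t] * Y[t][j] for t in range(k)) for j in range(m)]
            for i in range(n)]

def transpose(X):
    return [list(row) for row in zip(*X)]

A = [[1, 2, 3], [4, 5, 6]]     # rectangular 2x3 matrix
S1 = matmul(A, transpose(A))   # 2x2
S2 = matmul(transpose(A), A)   # 3x3
print(S1 == transpose(S1))     # True: A A^T is symmetric
print(S2 == transpose(S2))     # True: A^T A is symmetric
```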

14 Jun 18

## Roadmap for studying matrix multiplication

In a way, multiplication of numbers is similar to summation. More precisely, replacing addition/subtraction with multiplication/division and zero with unity, we have the following properties:

A) Commutativity: $ab=ba$ for any two numbers $a,b.$

B) Associativity: $a(bc)=(ab)c$ for any three numbers $a,b,c.$

C) Existence of unity: there is a special number, denoted $1,$ such that $a1=a$ for any number $a.$

D) Existence of an inverse number: for any number $a\neq 0,$ there is another number, denoted $\frac{1}{a}$, such that $a\frac{1}{a}=1.$

The new element is that in D) there is an existence condition $a\neq 0.$

E) Distributivity (link between summation and multiplication). $a(b+c)=ab+ac$ for any three numbers $a,b,c.$

Matrix multiplication is more challenging than summation. We check one by one which of these properties hold for matrices and which do not. For most of this, considering $2\times 2$ matrices is sufficient.

### Commutativity of matrices

Global idea 3. Among all matrices, symmetric square matrices have properties closest to those of real numbers. A full justification of this statement would require a foray into imaginary numbers, which I want to avoid for now.

Surprise #1. In general, commutativity is not true for matrix multiplication.

Proof. To prove this type of statement, we have to produce an example of matrices such that $AB\neq BA.$ A trivial possibility is to choose rectangular $A,B$ so that both products $AB$ and $BA$ exist but have different sizes; then they cannot be equal. A more interesting example is a pair of square matrices that don't commute. Remembering Global Idea 3, bad matrices should be looked for among non-symmetric matrices. Therefore let us put $A=\left(\begin{array}{cc}0&1\\-1&0\end{array}\right).$ Also let $B=\left(\begin{array}{cc}1&0\\0&-1\end{array}\right)$ (for $B$ I first took a general $2\times 2$ matrix, found the two products, and then chose the elements of $B$ to make the example simpler). The equations

$AB=\left(\begin{array}{cc}0&1\\-1&0\end{array}\right)\left(\begin{array}{cc}1&0\\0&-1\end{array}\right)=\left(\begin{array}{cc}0&-1\\-1&0\end{array}\right),\quad BA=\left(\begin{array}{cc}1&0\\0&-1\end{array}\right)\left(\begin{array}{cc}0&1\\-1&0\end{array}\right)=\left(\begin{array}{cc}0&1\\1&0\end{array}\right)$

prove the statement.
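
The two products are easy to verify by hand; for the skeptical, here is a quick numerical check of the same computation:

```python
def matmul(X, Y):
    # product of two 2x2 matrices (nested lists)
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

A = [[0, 1], [-1, 0]]
B = [[1, 0], [0, -1]]
print(matmul(A, B))   # [[0, -1], [-1, 0]]
print(matmul(B, A))   # [[0, 1], [1, 0]]
```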

### Associativity of matrices

The property looks like this: $A(BC)=(AB)C$ for any three matrices $A,B,C.$ These can be any compatible, not necessarily square, matrices.

Statement. Associativity is true.

No surprises here, and no proofs either. Matrix algebra is full of long, boring proofs, most of which are better skipped as long as you understand the meaning.
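
A full proof would be a tedious index manipulation, but a numerical spot check on hypothetical rectangular matrices takes a few lines:

```python
def matmul(X, Y):
    # product of an n x k and a k x m matrix (nested lists)
    n, k, m = len(X), len(Y), len(Y[0])
    return [[sum(X[i][t] * Y[t][j] for t in range(k)) for j in range(m)]
            for i in range(n)]

A = [[1, 2, 3], [4, 5, 6]]     # 2x3
B = [[1, 0], [2, 1], [0, 3]]   # 3x2
C = [[2, 1], [1, 2]]           # 2x2
left = matmul(A, matmul(B, C))
right = matmul(matmul(A, B), C)
print(left == right)           # True: A(BC) = (AB)C
```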

### Distributivity

The property looks like this: $A(B+C)=AB+AC$ for any three matrices $A,B,C.$ These can be any compatible, not necessarily square, matrices. Since there is no commutativity, we have to add that $(B+C)A=BA+CA$.

Statement. Distributivity is true.

Again, no surprises and no proofs either.
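
Both versions of distributivity can also be spot-checked numerically; the matrices below are, again, just one hypothetical choice:

```python
def matmul(X, Y):
    # product of two 2x2 matrices (nested lists)
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def matadd(X, Y):
    # entrywise sum of two 2x2 matrices
    return [[X[i][j] + Y[i][j] for j in range(2)] for i in range(2)]

A = [[1, 2], [3, 4]]
B = [[0, 1], [1, 0]]
C = [[2, 0], [0, 2]]

left_dist = matmul(A, matadd(B, C)) == matadd(matmul(A, B), matmul(A, C))
right_dist = matmul(matadd(B, C), A) == matadd(matmul(B, A), matmul(C, A))
print(left_dist)    # True: A(B + C) = AB + AC
print(right_dist)   # True: (B + C)A = BA + CA
```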