18 Feb 19

Determinant of a transpose

Exercise 1. \det (A^T)=\det A.

Proof. The proof is similar to the derivation of the Leibniz formula. Using the notation from that derivation, we decompose the rows of A into linear combinations of unit row-vectors: A_i=\sum_{j=1}^na_{ij}e_j. Hence the columns of A^T are A_i^T=\sum_{j=1}^na_{ij}e_j^T. Therefore, by multilinearity in columns,

(1) \det(A^T)=\sum_{j_1,...,j_n=1}^n a_{1j_1}...a_{nj_n}\det\left(e_{j_1}^T,...,e_{j_n}^T\right)

=\sum_{j_1,...,j_n:\det P_{j_1,...,j_n}^T\neq 0} a_{1j_1}...a_{nj_n}\det P_{j_1,...,j_n}^T.

Now we want to relate \det P_{j_1,...,j_n}^T to \det P_{j_1,...,j_n}.

(2) \det P_{j_1,...,j_n}^T=\det(P_{j_1,...,j_n}^{-1}) (the transpose of an orthogonal matrix equals its inverse)

=\frac{1}{\det P_{j_1,...,j_n}} (the determinant of the inverse is the inverse of the determinant)

=\det P_{j_1,...,j_n} (because P_{j_1,...,j_n} is orthogonal, its determinant is either 1 or -1, and such a number equals its own reciprocal).

Substituting (2) into (1) turns the right-hand side of (1) into the Leibniz formula for \det A, so (1) and (2) prove the statement.
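
As a quick numerical sanity check of Exercise 1 (an illustration, not part of the proof), one can compare the two determinants on a random matrix; the Python/numpy snippet below is my addition.

import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))  # a random 4x4 matrix

# det(A^T) and det(A) should agree up to floating-point error
print(np.linalg.det(A.T), np.linalg.det(A))
assert np.isclose(np.linalg.det(A.T), np.linalg.det(A))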

Apart from being interesting in its own right, Exercise 1 allows one to translate properties in terms of rows to properties in terms of columns, as in the next corollary.

Corollary 1. \det A=0 if two columns of A are linearly dependent.

Indeed, the columns of A are the rows of its transpose, so Exercise 1 and Property III yield the result.
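
To illustrate Corollary 1 numerically (my example, in Python/numpy): make one column a multiple of another and the determinant vanishes.

import numpy as np

A = np.array([[1., 2., 2.],
              [3., 4., 6.],
              [5., 6., 10.]])  # third column = 2 * first column

print(np.linalg.det(A))  # 0 up to floating-point error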

21 Jun 18

Matrix algebra: questions for repetition

"Matrix algebra is a bunch of dreadful rules" is all that many students remember after studying it. It's a relief to know that a simple property like AA^{-1}=I is more important than remembering how to calculate an inverse. The theoretical formula for the inverse can be avoided for most of this course; if you need to find the inverse of a numerical matrix, you can use Excel.

First things first

Three big No's: 1) there is no commutativity, in general; 2) determinants don't apply to non-square matrices; 3) don't try to invert a non-square matrix. There are ways around these problems, but all of them are deficient, so it's better to stick to the good cases.

Three big ideas: 1) the analogy with real numbers is the best guide to studying matrices; 2) the definition of the matrix product is motivated by the desire to compactify a system of equations, as in the example below; 3) symmetric matrices have properties closest to those of real numbers.
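
To illustrate idea 2), here is the standard example (the numbers are mine): the system x_1+2x_2=5, 3x_1+4x_2=6 becomes the single matrix equation

\left(\begin{array}{cc}1&2\\3&4\end{array}\right)\left(\begin{array}{c}x_1\\x_2\end{array}\right)=\left(\begin{array}{c}5\\6\end{array}\right), that is, Ax=b.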

Three surprises: 1) in general, matrices don't commute (can you give an example?); 2) a nonzero matrix is not necessarily invertible (can you give an example?); 3) when you invert a product, you have to reverse the order of the factors, and the same goes for transposition. These two order-reversal properties are called reverse order laws.
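
One possible set of answers to the two "can you give an example?" prompts (my choice of matrices, checked in Python/numpy):

import numpy as np

# Surprise 1: AB and BA differ
A = np.array([[0., 1.], [0., 0.]])
B = np.array([[0., 0.], [1., 0.]])
print(A @ B)  # [[1, 0], [0, 0]]
print(B @ A)  # [[0, 0], [0, 1]]

# Surprise 2: a nonzero matrix need not be invertible
C = np.array([[1., 0.], [0., 0.]])  # nonzero, yet det C = 0
print(np.linalg.det(C))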

Comforting news: 1) the properties of addition of numbers have complete analogs for matrices; 2) in the case of multiplication, it's good to know that the existence of a unity, associativity and distributivity generalize to matrices.

Particulars and extensions

Answer the following questions, with proofs where possible. None of the answers requires long boring calculations.

Multiplication. 1) If A^{2} exists, what can you say about A? 2) If the last row of A is zero and the product AB exists, what can you say about this product? 3) Where did we use associativity of multiplication?
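
For question 2), a small numerical illustration (my example, Python/numpy): each entry of the last row of AB is a combination of the entries of the last row of A, so a zero last row of A gives a zero last row of AB.

import numpy as np

A = np.array([[1., 2.],
              [0., 0.]])  # the last row of A is zero
B = np.array([[3., 4.],
              [5., 6.]])
print(A @ B)  # the last row of the product is zero as well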

In what way are the rule for the inverse of a product and the rule for the transpose of a product similar? Can you point out any differences between them?

Commutativity: 1) If two matrices commute, do you think their inverses commute? 2) Does a matrix commute with its inverse?
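
A possible argument for question 1) (my sketch, assuming A and B are invertible): take inverses of both sides of AB=BA and use the reverse order law to get

B^{-1}A^{-1}=(AB)^{-1}=(BA)^{-1}=A^{-1}B^{-1},

so the inverses commute as well. For question 2), look at the definition AA^{-1}=A^{-1}A=I.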

Properties of inverses: 1) inverse of an inverse, 2) inverse of a product, 3) inverse of a transpose.
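
A numerical check of all three properties at once (my sketch, Python/numpy; the random matrices are assumed invertible):

import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((3, 3))
B = rng.standard_normal((3, 3))
inv = np.linalg.inv

assert np.allclose(inv(inv(A)), A)               # inverse of an inverse
assert np.allclose(inv(A @ B), inv(B) @ inv(A))  # reverse order law
assert np.allclose(inv(A.T), inv(A).T)           # inverse of a transpose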

Properties of determinants: 1) why we need them; 2) determinant of a product; 3) determinant of an inverse; 4) determinant of a transpose; 5) prove the multiplication rule for the determinant of a product of three matrices.
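
For item 5), one way to argue (my sketch) is to combine associativity with the two-matrix multiplication rule applied twice:

\det(ABC)=\det((AB)C)=\det(AB)\det C=\det A\det B\det C.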

Properties of the identity matrix: 1) use the definition of the inverse to find the inverse of the identity matrix; 2) do you think the identity matrix commutes with every other matrix? 3) Can you name any matrices, other than the identity, satisfying the equation A^{2}=A? If a matrix satisfies this equation, what can you say about its determinant? 4) What is the determinant of the identity matrix?
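
For question 3), here is one non-identity example (my choice, Python/numpy); note that A^{2}=A implies (\det A)^{2}=\det A, so \det A is either 0 or 1.

import numpy as np

A = np.array([[1., 0.],
              [0., 0.]])  # projection onto the first coordinate
print(np.allclose(A @ A, A))  # True: A^2 = A
print(np.linalg.det(A))       # 0, consistent with det A in {0, 1}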

If a nonzero number is close to zero, then its inverse must be large in absolute value. True or false? Can you indicate any analogs of this statement for matrices?
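
A possible matrix analog (my illustration, Python/numpy): when \det A is close to zero, the entries of A^{-1} tend to be large, just as 1/x is large when x is small.

import numpy as np

eps = 1e-8
A = np.array([[1., 1.],
              [1., 1. + eps]])  # det A = eps, close to zero
print(np.linalg.det(A))
print(np.linalg.inv(A))  # entries of order 1/eps = 1e8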

Suppose matrices A,B are given and \det A\neq 0. How would you solve the linear matrix equation AX=B for X?
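
A sketch of the computation (my example, Python/numpy): since \det A\neq 0, the inverse exists, and multiplying both sides by A^{-1} on the left gives X=A^{-1}B. Numerically it is preferable to call a solver rather than form the inverse explicitly.

import numpy as np

A = np.array([[2., 1.],
              [1., 3.]])
B = np.array([[1., 0.],
              [0., 1.]])

X = np.linalg.solve(A, B)   # solves AX = B without forming A^{-1}
assert np.allclose(A @ X, B)
print(X)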

Symmetric matrices: 1) For any matrix A, both AA^T and A^TA are symmetric. True or false? 2) If a matrix is symmetric and its inverse exists, will the inverse be symmetric?
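
A numerical check for both questions (my sketch, Python/numpy; the matrix in part 2) is built to be invertible):

import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((3, 4))  # a non-square A works for part 1)

# 1) AA^T and A^T A are symmetric
assert np.allclose(A @ A.T, (A @ A.T).T)
assert np.allclose(A.T @ A, (A.T @ A).T)

# 2) the inverse of an invertible symmetric matrix is symmetric
S = A @ A.T + np.eye(3)  # symmetric, positive definite, hence invertible
S_inv = np.linalg.inv(S)
assert np.allclose(S_inv, S_inv.T)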