Feb 19

Determinant of a transpose

Exercise 1. \det (A^T)=\det A.

Proof. The proof is similar to the derivation of the Leibniz formula. Using the notation from that derivation, we decompose the rows of A into linear combinations of the unit row-vectors: A_i=\sum_{j=1}^na_{ij}e_j. Hence the columns of A^T are A_i^T=\sum_{j=1}^na_{ij}e_j^T. Therefore, by multilinearity in columns,

(1) \det (A^{T})=\sum_{j_{1},...,j_{n}=1}^{n}a_{1j_{1}}...a_{nj_{n}}\det  \left( e_{j_{1}}^{T},...,e_{j_{n}}^{T}\right)

=\sum_{j_{1},...,j_{n}:\det  P_{j_{1},...,j_{n}}^{T}\neq 0}a_{1j_{1}}...a_{nj_{n}}\det  P_{j_{1},...,j_{n}}^{T}.

Now we want to relate \det P_{j_1,...,j_n}^T to \det P_{j_1,...,j_n}.

(2) \det P_{j_1,...,j_n}^T=\det (P_{j_1,...,j_n}^{-1}) (P_{j_1,...,j_n} is a permutation matrix, hence orthogonal, and the transpose of an orthogonal matrix equals its inverse)

=\frac{1}{\det (P_{j_1,...,j_n})}          (the determinant of the inverse is the inverse of the determinant)

=\det (P_{j_1,...,j_n})      (because the determinant of the permutation matrix P_{j_1,...,j_n} is either 1 or -1, and such a number equals its own reciprocal).
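The two facts used in (2) can be checked numerically. The sketch below (using NumPy, not part of the original argument) runs over all 3x3 permutation matrices and verifies that each is orthogonal, has determinant \pm 1, and satisfies \det P^T=\det P:

```python
import numpy as np
from itertools import permutations

# Illustration (assumed example, n = 3): every permutation matrix P
# is orthogonal (P @ P.T == I), det(P) is +1 or -1, and det(P.T) == det(P).
n = 3
I = np.eye(n)
for perm in permutations(range(n)):
    P = I[list(perm)]                      # rows of the identity, permuted
    assert np.allclose(P @ P.T, I)         # orthogonality: P^T = P^{-1}
    d = np.linalg.det(P)
    assert np.isclose(abs(d), 1.0)         # determinant is +1 or -1
    assert np.isclose(np.linalg.det(P.T), d)
```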

Substituting (2) into (1) turns the right-hand side of (1) into the Leibniz formula for \det A, which proves the statement.
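As a quick numerical sanity check of Exercise 1 (a NumPy sketch on a random matrix, not part of the proof):

```python
import numpy as np

# Assumed example: det(A^T) should equal det(A) for any square matrix.
rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))

det_A = np.linalg.det(A)
det_AT = np.linalg.det(A.T)
assert np.isclose(det_A, det_AT)
```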

Apart from being interesting in its own right, Exercise 1 allows one to translate properties in terms of rows to properties in terms of columns, as in the next corollary.

Corollary 1. \det A=0 if two columns of A are linearly dependent.

Indeed, the columns of A are the rows of its transpose, so Exercise 1 and Property III yield the result.
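Corollary 1 can likewise be illustrated numerically. In this hypothetical example the second column is twice the first, so the columns are linearly dependent and the determinant vanishes:

```python
import numpy as np

# Assumed example: column 2 = 2 * column 1, so det(A) should be 0.
A = np.array([[1.0, 2.0, 5.0],
              [3.0, 6.0, 7.0],
              [4.0, 8.0, 9.0]])
det_A = np.linalg.det(A)
assert np.isclose(det_A, 0.0)
```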
