Feb 19

Properties IV-VI

Exercise 1. Let S be a linearly independent system of n-1 vectors in R^n. Then it can be completed with a vector B to form a basis in R^n.

Proof. One way to obtain B is this. Let P be the orthogonal projector onto the span of S and let Q=I-P. Take as B any nonzero vector from the image of Q (such a vector exists because the image of P has dimension n-1<n). B is orthogonal to every element of the image of P and, in particular, to the elements of S. Therefore S completed with B gives a linearly independent system \tilde{S}. \tilde{S} is a basis because x=Px+Qx for any x\in R^n, where Px is a linear combination of elements of S and Qx is a multiple of B.
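This construction can be checked numerically. The sketch below uses a concrete system S in R^3 and the least-squares formula P = S^T (S S^T)^{-1} S for the orthogonal projector; both choices are illustrative, not part of the text.

```python
import numpy as np

# An illustrative linearly independent system S of n-1 = 2 vectors in R^3, as rows.
S = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0]])
n = 3

# Orthogonal projector P onto the row span of S: P = S^T (S S^T)^{-1} S.
P = S.T @ np.linalg.inv(S @ S.T) @ S
Q = np.eye(n) - P  # projector onto the orthogonal complement

# Any nonzero column of Q works as B; take the one with the largest norm.
B = Q[:, np.argmax(np.linalg.norm(Q, axis=0))]

# S completed with B is a basis: the resulting 3x3 matrix is nonsingular,
# and B is orthogonal to the rows of S.
tilde_S = np.vstack([S, B])
print(abs(np.linalg.det(tilde_S)) > 1e-10)  # True
print(np.linalg.norm(S @ B) < 1e-10)        # True
```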

Property IV. Additivity. Suppose the ith row of A is a sum of two vectors:

A=\left(\begin{array}{c}A_1 \\ ... \\ A_i^\prime+A_i^{\prime\prime } \\ ... \\ A_n\end{array}\right).


Define

A^\prime=\left(\begin{array}{c}A_1 \\... \\A_i^\prime \\... \\A_n\end{array}\right), A^{\prime\prime}=\left(\begin{array}{c}A_1 \\... \\A_i^{\prime\prime} \\... \\A_n\end{array}\right)

(except for the ith row, all others are the same for all three matrices). Then \det A=\det A^\prime+\det A^{\prime\prime}.

Proof. Denote by S the system of n-1 vectors A_1,...,A_{i-1},A_{i+1},...,A_n.

Case 1. If S is linearly dependent, then the system of all rows of A is also linearly dependent. By Property III the determinants of all three matrices A,\ A^\prime,\ A^{\prime\prime} are zero and the statement is true.

Case 2. Let S be linearly independent. Then by Exercise 1 it can be completed with a vector B to form a basis \tilde{S} in R^n. A_i^\prime,\ A_i^{\prime\prime} can be represented as linear combinations of elements of \tilde{S}. We are interested only in the coefficients of B in those representations. So let A_i^\prime=C+kB, A_i^{\prime\prime}=D+lB, where C and D are linear combinations of elements of S. Hence, A_i^\prime+A_i^{\prime\prime}=C+D+(k+l)B.

We can use Property II to eliminate C, D and C+D from the ith rows of A^\prime, A^{\prime\prime} and A, respectively, without changing the determinants of those matrices. Let A^0 denote the matrix obtained by replacing the ith row of A with B. Then by Property II and Axiom 1

\det A=(k+l)\det A^0, \det A^\prime=k\det A^0, \det A^{\prime\prime}=l\det A^0,

which proves the statement.
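Additivity is easy to verify numerically. A minimal sketch, using random 3x3 matrices and i=2 (the middle row); the sizes and seed are illustrative:

```python
import numpy as np

# Numerical check of additivity in the ith row (here i is the middle row).
rng = np.random.default_rng(0)
A1, A3 = rng.standard_normal(3), rng.standard_normal(3)
Ai_p, Ai_pp = rng.standard_normal(3), rng.standard_normal(3)

A   = np.array([A1, Ai_p + Ai_pp, A3])  # ith row is the sum
Ap  = np.array([A1, Ai_p,         A3])  # ith row is A_i'
App = np.array([A1, Ai_pp,        A3])  # ith row is A_i''

print(np.isclose(np.linalg.det(A),
                 np.linalg.det(Ap) + np.linalg.det(App)))  # True
```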

Combining homogeneity and additivity, we get the following important property that some people use as a definition:

Property V. Multilinearity. The determinant of A is a multilinear function of its rows, that is, for each i, it is linear in row A_i, when the other rows are fixed.
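Multilinearity combines homogeneity and additivity into one statement: with the other rows fixed, \det(...,aX+bY,...)=a\det(...,X,...)+b\det(...,Y,...). A numerical sketch (the matrix, row index, and coefficients are illustrative):

```python
import numpy as np

# Check linearity of det in row i while the other rows stay fixed.
rng = np.random.default_rng(1)
rows = rng.standard_normal((3, 3))
x, y = rng.standard_normal(3), rng.standard_normal(3)
a, b = 2.0, -0.5
i = 1

def det_with_row(r):
    M = rows.copy()
    M[i] = r
    return np.linalg.det(M)

lhs = det_with_row(a * x + b * y)
rhs = a * det_with_row(x) + b * det_with_row(y)
print(np.isclose(lhs, rhs))  # True
```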

Property VI. Antisymmetry. If the matrix A^0 is obtained from A by changing places of two rows, then \det A^0=-\det A.

Proof. Let

A=\left( \begin{array}{c} ... \\ A_i \\ ... \\ A_j \\ ... \end{array} \right), A^0=\left( \begin{array}{c} ... \\ A_j \\ ... \\ A_i \\ ... \end{array} \right)

(all other rows of these matrices are the same). Consider the following sequence of transformations:

\left(\begin{array}{c} ... \\ A_i \\ ... \\ A_j \\ ... \end{array}\right) \rightarrow \left(\begin{array}{c} ... \\ A_i+A_j \\ ... \\ A_j \\ ... \end{array}\right)  \rightarrow \left(\begin{array}{c} ... \\ A_i+A_j \\ ... \\ A_j-(A_i+A_j) \\ ... \end{array}\right)

=\left( \begin{array}{c} ... \\ A_i+A_j \\ ... \\ -A_i \\ ... \end{array} \right) \rightarrow \left( \begin{array}{c} ... \\ A_i+A_j+(-A_i) \\ ... \\ -A_i \\ ... \end{array}\right)=\left( \begin{array}{c} ... \\ A_j \\ ... \\ -A_i \\ ... \end{array} \right).

By Property II, each of these transformations preserves the determinant, so the final matrix has the same determinant as A. The final matrix differs from A^0 only in its jth row, which is -A_i instead of A_i; by homogeneity its determinant equals -\det A^0. Hence \det A=-\det A^0, that is, \det A^0=-\det A.
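The three Property II steps and the final sign flip can be replayed numerically. A sketch with a random 3x3 matrix and rows i=1, j=3 (both choices illustrative):

```python
import numpy as np

# Replay the three row operations from the proof and check each claim.
rng = np.random.default_rng(2)
A = rng.standard_normal((3, 3))
i, j = 0, 2
d0 = np.linalg.det(A)

M = A.copy()
M[i] += M[j]   # row i -> A_i + A_j
M[j] -= M[i]   # row j -> A_j - (A_i + A_j) = -A_i
M[i] += M[j]   # row i -> (A_i + A_j) + (-A_i) = A_j
# Each step adds a multiple of another row, so det is unchanged (Property II).
print(np.isclose(np.linalg.det(M), d0))  # True

# A^0 is A with rows i and j swapped; M differs from A^0 only by the sign of
# row j, so homogeneity gives det A^0 = -det A.
A0 = A.copy()
A0[[i, j]] = A0[[j, i]]
print(np.isclose(np.linalg.det(A0), -d0))  # True
```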