Final touches on linear independence
Correctness of the dimension definition
The main definition of linear dependence is of type: vectors $x_1,\dots,x_n$ are called linearly dependent if

(1) $a_1x_1+\dots+a_nx_n=0$ for some nonzero vector of coefficients $(a_1,\dots,a_n)$

(a non-trivial annihilating set of coefficients exists).
Exercise 1. Prove that the main definition is equivalent to the following: vectors are called linearly dependent if one of them can be expressed as a linear combination of the others.
Proof. This exercise is about passing from (1) to

(2) $x_j=c_1x_1+\dots+c_{j-1}x_{j-1}+c_{j+1}x_{j+1}+\dots+c_nx_n$

(one of the vectors is a linear combination of the others).

If there is a dependence of the first type, it can be solved for a vector whose coefficient is different from zero: if $a_j\neq 0$ in (1), then $x_j=-\sum_{i\neq j}(a_i/a_j)x_i$, which is of type (2). A dependence of the second type can be converted to the first type by sending everything to one side; the resulting coefficient of $x_j$ equals $-1$ and is therefore nonzero.
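To make the passage from (1) to (2) concrete, here is a small numerical sketch in Python (the vectors and coefficients are invented for illustration, not taken from the text):

```python
import numpy as np

# Three vectors in R^2 with a dependence of type (1):
# 1*x1 + 2*x2 - 1*x3 = 0, so a = (1, 2, -1) is a non-trivial
# annihilating set of coefficients.
x1 = np.array([1.0, 0.0])
x2 = np.array([0.0, 1.0])
x3 = x1 + 2 * x2
a = np.array([1.0, 2.0, -1.0])

print(a[0] * x1 + a[1] * x2 + a[2] * x3)    # [0. 0.] -- type (1) holds

# Passing to type (2): solve for x3, whose coefficient a[2] = -1 is nonzero.
x3_expressed = -(a[0] / a[2]) * x1 - (a[1] / a[2]) * x2
print(np.allclose(x3, x3_expressed))        # True -- x3 is a combination of x1, x2
```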
Application. In multiple regression, we try to explain the behavior of a random vector $y$ with the help of vectors $x_1,\dots,x_k$ (called regressors) and a random error vector $u$:

(3) $y=\beta_1x_1+\dots+\beta_kx_k+u.$

Suppose that these regressors are linearly dependent. By Exercise 1, one of them can be expressed as a linear combination of the others; plugging that expression into (3), we see that the behavior of $y$ can be explained with the help of a smaller number of regressors. After eliminating all regressors that depend on the others, we obtain a more economical model. See a related discussion.
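A minimal numpy sketch of why dependent regressors are redundant (the data-generating process below is made up for illustration): when one regressor is an exact linear combination of the others, the smaller design matrix spans the same column space, so least squares produces the same fitted values.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 50

# Two genuine regressors and a redundant one: x3 = 2*x1 - x2.
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
x3 = 2 * x1 - x2
y = x1 + 0.5 * x2 + 0.3 * x3 + 0.1 * rng.normal(size=n)

X_full = np.column_stack([x1, x2, x3])   # 3 columns, but rank 2
X_small = np.column_stack([x1, x2])      # the more economical model

print(np.linalg.matrix_rank(X_full), np.linalg.matrix_rank(X_small))  # 2 2

# The fitted values (projections of y onto the column space) coincide.
fit_full = X_full @ np.linalg.lstsq(X_full, y, rcond=None)[0]
fit_small = X_small @ np.linalg.lstsq(X_small, y, rcond=None)[0]
print(np.allclose(fit_full, fit_small))  # True
```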
Exercise 2 (going from a small system to a large one and back). a) If some subsystem of the system $x_1,\dots,x_n$ is linearly dependent, then the whole system is linearly dependent. b) If the system $x_1,\dots,x_n$ is linearly independent, then any of its subsystems is linearly independent.
Proof. a) If we have a non-trivial annihilating set of coefficients for a subsystem, it can be completed with zeros to obtain a non-trivial annihilating set of coefficients for the whole system. b) Proof by contradiction. Assume that some subsystem is linearly dependent. Then by part a) the whole system is too.
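A quick numerical check of the padding argument in part a) (the vectors are made up): coefficients that annihilate a subsystem, completed with zeros, annihilate the whole system and remain non-trivial.

```python
import numpy as np

# Whole system: four vectors in R^3; the subsystem (x1, x2, x3) is dependent.
x1 = np.array([1.0, 0.0, 0.0])
x2 = np.array([0.0, 1.0, 0.0])
x3 = x1 - x2
x4 = np.array([0.0, 0.0, 1.0])

sub_coeffs = np.array([1.0, -1.0, -1.0])   # 1*x1 - 1*x2 - 1*x3 = 0
full_coeffs = np.append(sub_coeffs, 0.0)   # complete with a zero for x4

combo = sum(c * v for c, v in zip(full_coeffs, [x1, x2, x3, x4]))
print(combo)                               # [0. 0. 0.] -- the whole system is dependent
```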
Theorem (correctness of the dimension definition). Any basis in $R^n$ consists of $n$ vectors.
The proof in Halmos (Finite-Dimensional Vector Spaces, Springer, 1987) is the shortest I could find. To understand it, you will need Exercise 2. I don't give the proof here because I don't find it to be of general interest. Note, however, that the theorem itself is important for any statement involving the notion of dimension.
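As a numerical sanity check (not a substitute for the proof; the second basis is an arbitrary example), one can verify that different bases of $R^3$ all consist of 3 linearly independent vectors:

```python
import numpy as np

# Two different bases of R^3, written as the columns of 3x3 matrices.
standard_basis = np.eye(3)
another_basis = np.array([[1.0, 1.0, 0.0],
                          [0.0, 1.0, 1.0],
                          [1.0, 0.0, 1.0]])

# Each matrix has rank 3: three linearly independent columns,
# in agreement with the theorem (any basis of R^3 has exactly 3 vectors).
print(np.linalg.matrix_rank(standard_basis))  # 3
print(np.linalg.matrix_rank(another_basis))   # 3
```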
Using square matrices to study non-square ones
We'll need the upper bound on the rank:

(1) $\text{rank}(A)\leq\min\{n,m\}$ if $A$ is of size $n\times m$.

Exercise 3. For any matrix $A$, $\text{rank}(A^TA)=\text{rank}(A^T)$.
Proof. We know that $R^n=\text{Img}(A)\oplus\text{Null}(A^T)$ (here $A$ is of size $n\times m$, so it maps $R^m$ into $R^n$). Applying $A^T$ to both sides we get $\text{Img}(A^T)=\text{Img}(A^TA)$. [In detail: any $x\in R^n$ can be represented as $x=Ay+z$, where $z\in\text{Null}(A^T)$, for some $y\in R^m$. Hence, applying $A^T$ we get $A^Tx=A^TAy$. When $x$ runs over $R^n$, the left side runs over $\text{Img}(A^T)$, while the right side runs over $\text{Img}(A^TA)$.] If two subspaces coincide, their dimensions are the same, so $\text{rank}(A^TA)=\text{rank}(A^T)$.
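A quick numerical illustration of the bound (1) and of Exercise 3, using an arbitrary random matrix:

```python
import numpy as np

rng = np.random.default_rng(1)
n, m = 4, 7
A = rng.normal(size=(n, m))   # a random 4 x 7 matrix

# Bound (1): rank(A) <= min{n, m}.
print(np.linalg.matrix_rank(A) <= min(n, m))   # True

# Exercise 3: rank(A^T A) = rank(A^T).
print(np.linalg.matrix_rank(A.T @ A) == np.linalg.matrix_rank(A.T))  # True
```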
Exercise 4. Any set of vectors $x_1,\dots,x_k$ in $R^n$ with $k>n$ is linearly dependent.
Proof. Put $A=(x_1,\dots,x_k)$ ($A$ is of size $n\times k$, so $A^T$ is of size $k\times n$). By Exercise 3 and the bound (1) applied to $A^T$,

(2) $\text{rank}(A^TA)=\text{rank}(A^T)\leq\min\{k,n\}=n<k.$

Suppose $x_1,\dots,x_k$ are linearly independent. By the criterion of linear independence, then $\det(A^TA)\neq 0$. Since $A^TA$ is a square matrix of size $k\times k$, this implies that $A^TA$ is invertible and, consequently, $\text{rank}(A^TA)=k$, which contradicts (2).
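A numerical sketch of Exercise 4 (five random vectors in $R^3$, generated only for illustration): the determinant of $A^TA$ is numerically zero, and a non-trivial annihilating set of coefficients can be read off the null space of $A$.

```python
import numpy as np

rng = np.random.default_rng(2)
n, k = 3, 5
A = rng.normal(size=(n, k))          # columns x1,...,x5 are 5 vectors in R^3

# By the criterion of linear independence, dependence shows up as det(A^T A) = 0.
print(np.linalg.det(A.T @ A))        # ~ 0 (up to rounding error)

# A non-trivial annihilating coefficient vector: a right singular vector of A
# corresponding to a zero singular value lies in Null(A).
_, _, Vt = np.linalg.svd(A)
coeffs = Vt[-1]                       # nonzero vector with A @ coeffs = 0
print(np.linalg.norm(A @ coeffs))     # ~ 0, i.e. a1*x1 + ... + a5*x5 = 0
```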