30 Jul 18

## Linear dependence of vectors: definition and principal result

Most students have trouble with this topic because there is a lot of logic involved, so skimming it is not a good idea. We start with the most common definition.

Definition 1. Let $x^{(1)},...,x^{(k)}\in R^n$ be some vectors. They are called linearly dependent if there exist numbers $a_1,...,a_k,$ not all of which are zero, such that

(1) $a_1x^{(1)}+...+a_kx^{(k)}=0.$

The sheer length of this definition scares some people, and all they remember is equation (1). But stating just (1) instead of the whole definition is like skinning an animal: the essential condition that not all of the coefficients are zero is lost.

### Seizing the bull by the horns

The first order of business is to shorten the definition and relate it to what we already know.

It's better to collect the numbers $a_1,...,a_k$ into a vector $a=(a_1,...,a_k).$ Then the requirement that "not all of $a_1,...,a_k$ are zero" is equivalent to the single condition $a\neq 0.$ Further, let us write the vectors $x^{(1)},...,x^{(k)}$ as columns and put them into a matrix $X=\left( x^{(1)},...,x^{(k)}\right) .$ Using multiplication of partitioned matrices, we see that (1) is equivalent to

(2) $Xa=0$

and therefore Definition 1 is equivalent to the following.

Definition 2. Vectors $x^{(1)},...,x^{(k)}\in R^n$ are called linearly dependent if the homogeneous equation (2) has nonzero solutions $a\in R^k,$ that is, the null space of $X$ is not $\{0\}.$
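
Definition 2 can be checked numerically: the null space of $X$ is not $\{0\}$ exactly when the rank of $X$ is less than the number of columns $k.$ Here is a minimal NumPy sketch (the particular vectors are an assumption chosen for illustration):

```python
import numpy as np

# Columns of X are the vectors x^(1), x^(2), x^(3).
# The third column equals the sum of the first two,
# so the columns are linearly dependent.
X = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0],
              [2.0, 3.0, 5.0]])

k = X.shape[1]
# N(X) != {0} exactly when rank(X) < k.
dependent = np.linalg.matrix_rank(X) < k
print(dependent)  # True
```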

Negating Definition 2 gives the definition of linear independence, and Definition 2 is easier to negate than Definition 1.

Definition 3. Vectors $x^{(1)},...,x^{(k)}\in R^{n}$ are called linearly independent if $N(X)=\{0\}$ (for any nonzero $a\in R^k$ we have $Xa\neq 0$ or, alternatively, $Xa=0$ only for $a=0$).

Exercise 1. For any matrix $X,$ one has

(3) $N(X)=N(X^TX).$

Proof. 1) Proving that $N(X)\subset N(X^TX).$ If $a\in N(X),$ then $Xa=0$ and $X^TXa=0,$ so $a\in N(X^TX).$ 2) Proving that $N(X^TX)\subset N(X).$ If $a\in N(X^TX),$ then $X^TXa=0$ and $0=(X^TXa)\cdot a=(Xa)\cdot(Xa)=\|Xa\|^2,$ so $a\in N(X).$
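
Equality (3) implies, in particular, that $X$ and $X^TX$ have the same rank (for a matrix with $k$ columns, rank $=k-\dim N$). A quick numerical spot check of this consequence, on randomly generated matrices (the sizes and seed are arbitrary assumptions):

```python
import numpy as np

# Spot check of (3): since N(X) = N(X^T X), the ranks of X and X^T X
# must coincide (rank = k - dim N for a matrix with k columns).
rng = np.random.default_rng(0)
for _ in range(5):
    X = rng.standard_normal((6, 3))
    assert np.linalg.matrix_rank(X) == np.linalg.matrix_rank(X.T @ X)
print("ranks agree")
```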

Exercise 2. $X^TX$ is a square symmetric matrix, for any $X.$

Proof. If $X$ is of size $n\times k,$ then $X^TX$ is $k\times k.$ It is symmetric: $(X^TX)^T=X^T(X^T)^T=X^TX.$ By the way, some students write $(X^TX)^{-1}=X^{-1}(X^T)^{-1}.$ You cannot do this unless $X$ is square and invertible.

Criterion of linear independence. Vectors $x^{(1)},...,x^{(k)}\in R^n$ are linearly independent if and only if $\det X^TX\neq 0.$

Proof. We are going to use (3). By Exercise 2, $A=X^TX$ is a square matrix and for a square matrix we have the equivalence $N(A)=\{0\}\Longleftrightarrow \det A\neq 0.$ Application of this result proves the statement.

Direct application of Definition 1 can be problematic. To prove that some vectors are linearly dependent, you have to produce $a_1,...,a_k$ satisfying Definition 1, which usually involves some guesswork; see the exercises below. The criterion above requires no guessing and can be implemented on a computer. The linear independence requirement is common in multiple regression analysis, but not all econometricians know this criterion.
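
A sketch of the criterion in NumPy (the two matrices below are assumptions chosen so that one has independent columns and the other does not):

```python
import numpy as np

# Independent columns: det(X^T X) != 0.
X_indep = np.array([[1.0, 0.0],
                    [0.0, 1.0],
                    [1.0, 1.0]])

# Dependent columns (second column = 2 * first): det(X^T X) = 0.
X_dep = np.array([[1.0, 2.0],
                  [2.0, 4.0],
                  [3.0, 6.0]])

print(np.linalg.det(X_indep.T @ X_indep))  # nonzero (equals 3 here)
print(np.linalg.det(X_dep.T @ X_dep))      # zero up to rounding
```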

### Putting some flesh on the bones

These are simple facts you need to know in addition to the above criterion.

Exercise 3. 1) Why do we exclude the case when all $a_1,...,a_k$ are zero?

2) What happens if among $x^{(1)},...,x^{(k)}$ there are zero vectors?

3) Show that in the case of two non-zero vectors, Definition 1 is equivalent to proportionality of one vector to the other.

Solution. 1) If all $a_1,...,a_k$ are zero, (1) is trivially satisfied, no matter what the vectors are.

2) If one of the vectors $x^{(1)},...,x^{(k)}$ is zero, the coefficient of that vector can be set to one and all others to zero, so such vectors will be linearly dependent by Definition 1.

3) Consider two non-zero vectors $x^{(1)},x^{(2)}.$ If they are linearly dependent, then

(4) $a_1x^{(1)}+a_2x^{(2)}=0$

where at least one of $a_1,a_2$ is not zero. Suppose $a_1\neq 0.$ Then $a_2\neq 0$ because otherwise $x^{(1)}$ would be zero. Hence

(5) $x^{(1)}=-\frac{a_2}{a_1}x^{(2)}=cx^{(2)}$

where the proportionality coefficient $c$ is not zero. Conversely, (5) implies (4) (you have to produce the coefficients).
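
The proportionality test in part 3) also has a computational form: two non-zero vectors are linearly dependent exactly when the matrix with those two columns has rank 1. A small sketch (the vectors are illustrative assumptions):

```python
import numpy as np

# x2 = -2 * x1, so the two non-zero vectors are proportional,
# hence linearly dependent: the 2-column matrix has rank 1.
x1 = np.array([1.0, 2.0, 3.0])
x2 = np.array([-2.0, -4.0, -6.0])
X = np.column_stack([x1, x2])
print(np.linalg.matrix_rank(X))  # 1
```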

Exercise 4. Prove that if we add any vector $x\in R^n$ to the system of unit vectors $e^{(1)},...,e^{(n)}$, the resulting system will be linearly dependent.
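
The proof is left to you; as a numerical sanity check of the claim in Exercise 4 (the dimension and the vector $x$ below are arbitrary assumptions), note that $n+1$ columns in $R^n$ can never have full column rank:

```python
import numpy as np

# Unit vectors e^(1),...,e^(n) together with an arbitrary x in R^n
# give n+1 vectors in R^n, which must be linearly dependent.
n = 4
x = np.array([2.0, -1.0, 0.5, 3.0])    # an arbitrary vector
X = np.column_stack([np.eye(n), x])     # n + 1 columns
print(np.linalg.matrix_rank(X) < X.shape[1])  # True
```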