Eigenvalues and eigenvectors
Motivation: what matrix can be called simplest?
Of course, nothing can be simpler than the zero and identity matrices. What is the next level of simplicity? In the one-dimensional case, consider a linear mapping $f:\mathbb{R}\to\mathbb{R}$. Any real number $x$ can be written as $x=x\cdot 1$. By homogeneity $f(x)=f(x\cdot 1)=xf(1)$. Denoting $a=f(1)$, we see that $f$ is just multiplication by a number: $f(x)=ax$.
Let's look at the 2-dimensional generalization. We know that a linear mapping is given by a matrix. Consider $A_1=\begin{pmatrix}a_1&0\\0&1\end{pmatrix}$. It is scaling by $a_1$ along the $x$-axis:

$A_1\begin{pmatrix}x_1\\0\end{pmatrix}=\begin{pmatrix}a_1x_1\\0\end{pmatrix}=a_1\begin{pmatrix}x_1\\0\end{pmatrix}.$    (1)

Similarly, $A_2=\begin{pmatrix}1&0\\0&a_2\end{pmatrix}$ is scaling by $a_2$ along the $y$-axis. Matrices of type $\begin{pmatrix}a_1&0\\0&a_2\end{pmatrix}$ are called diagonal and the short notation is $\operatorname{diag}(a_1,a_2)$. Any simpler than that, and we'll be in kindergarten.
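For instance, $\operatorname{diag}(2,3)$ sends $\begin{pmatrix}1\\0\end{pmatrix}$ to $\begin{pmatrix}2\\0\end{pmatrix}$ and $\begin{pmatrix}0\\1\end{pmatrix}$ to $\begin{pmatrix}0\\3\end{pmatrix}$: it stretches the plane by a factor of 2 horizontally and by a factor of 3 vertically.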
It remains to name the things and link them to what we know
Definition 1. Based on (1), we say that $\lambda$ is an eigenvalue of the matrix $A$ if there exists a nonzero vector $x$ such that

$Ax=\lambda x.$    (2)

Such an $x$ is called an eigenvector corresponding to $\lambda$.
With a zero $x$, (2) would be true for any $\lambda$, and there would be no definition. Since $A$ is a real matrix, $\lambda$ in principle cannot be complex if $x$ is real.
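For example, take $A=\begin{pmatrix}2&1\\1&2\end{pmatrix}$ and $x=\begin{pmatrix}1\\1\end{pmatrix}$. Then

$Ax=\begin{pmatrix}3\\3\end{pmatrix}=3\begin{pmatrix}1\\1\end{pmatrix}=3x,$

so $x$ is an eigenvector of $A$ corresponding to the eigenvalue $\lambda=3$.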
Definition 2. If one $x$ satisfies (2), then the whole straight line passing through $x$ satisfies (2). Moreover, if we have two eigenvectors $x,y$ corresponding to $\lambda$, then for any linear combination $z=ax+by$ we have

$Az=aAx+bAy=a\lambda x+b\lambda y=\lambda z.$

Thus the set of all eigenvectors corresponding to one eigenvalue, completed with zero, is a subspace. We call it the eigenspace corresponding to $\lambda$. Its dimension is called the multiplicity of $\lambda$.
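To see a multiplicity larger than one, take $A=\operatorname{diag}(3,3)$. Then $Ax=3x$ for every $x$, so every nonzero vector is an eigenvector corresponding to $\lambda=3$; the eigenspace is all of $\mathbb{R}^2$ and the multiplicity of $\lambda=3$ equals 2. For $A=\operatorname{diag}(2,3)$, by contrast, the eigenspace of each of the eigenvalues 2 and 3 is a coordinate axis, so each multiplicity equals 1.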
Exercise 1. a) The eigenspace corresponding to $\lambda$ coincides with the null space $N(A-\lambda I)$, and the multiplicity of $\lambda$ equals $\dim N(A-\lambda I)$.

b) The set of eigenvalues coincides with the set of roots of the equation

$\det(A-\lambda I)=0.$    (3)
Proof. a) This is obvious because (2) is equivalently written as

$(A-\lambda I)x=0.$    (4)

b) Existence of nonzero solutions of (4) is equivalent to singularity of $A-\lambda I$, and we know that this condition is equivalent to (3).
Keep in mind that part b) of Exercise 1 implicitly assumes that both sets are of the same nature (both are subsets of either $\mathbb{R}$ or $\mathbb{C}$).
Definition 3. (3) is called the characteristic equation and its roots are called characteristic roots. Obviously, the left-hand side of (3) depends on $\lambda$. For convenience, people define $p(\lambda)=\det(A-\lambda I)$ and use the equation $p(\lambda)=0$ to find the characteristic roots. The function $p(\lambda)$ is called the characteristic polynomial.
After finding the characteristic roots we can plug them into (4) to find the corresponding eigenvectors $x$.
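Let us carry this out for the matrix $A=\begin{pmatrix}2&1\\1&2\end{pmatrix}$ from the example above. The characteristic polynomial is

$\det(A-\lambda I)=\det\begin{pmatrix}2-\lambda&1\\1&2-\lambda\end{pmatrix}=(2-\lambda)^2-1=(\lambda-1)(\lambda-3),$

so the characteristic roots are $\lambda=1$ and $\lambda=3$. Plugging $\lambda=3$ into (4) gives $-x_1+x_2=0$, so the corresponding eigenvectors are $t\begin{pmatrix}1\\1\end{pmatrix}$ with $t\neq 0$. Plugging $\lambda=1$ into (4) gives $x_1+x_2=0$, so the corresponding eigenvectors are $t\begin{pmatrix}1\\-1\end{pmatrix}$ with $t\neq 0$.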
Definition 4. The set of eigenvalues of $A$ is called the spectrum of $A$ and denoted $\sigma(A)$.
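For the matrix $A=\begin{pmatrix}2&1\\1&2\end{pmatrix}$ considered above, $\sigma(A)=\{1,3\}$.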
Remark. When convenient, we use the following notation: $A$ as a mapping from $\mathbb{R}^n$ to $\mathbb{R}^n$ will be denoted $A_{\mathbb{R}}$, and $A$ as a mapping from $\mathbb{C}^n$ to $\mathbb{C}^n$ will be denoted $A_{\mathbb{C}}$.
We have to remember that there are two versions of statement b) from Exercise 1:

$\sigma(A_{\mathbb{R}})$ coincides with the set of real characteristic roots.

$\sigma(A_{\mathbb{C}})$ coincides with the set of complex characteristic roots.
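The difference matters. For the rotation matrix $A=\begin{pmatrix}0&-1\\1&0\end{pmatrix}$ the characteristic polynomial is $\det(A-\lambda I)=\lambda^2+1$, which has no real roots, so $\sigma(A_{\mathbb{R}})$ is empty while $\sigma(A_{\mathbb{C}})=\{i,-i\}$.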
Digression on determinants
As much as possible I try to avoid using explicit formulas for the determinant, but here we have to use one. The simplest version is this:

$\det A=\sum_{\pi}\operatorname{sgn}(\pi)\,a_{1\pi(1)}a_{2\pi(2)}\cdots a_{n\pi(n)},$

where the sum runs over all permutations $\pi$ of the set $\{1,\dots,n\}$ and $\operatorname{sgn}(\pi)=\pm 1$. Perhaps a verbal description is better than this formula:
1) The determinant is a sum of products of elements of the matrix.

2) Each product contains $n$ factors and is obtained as follows: take one element $a_{1j_1}$ from the first row and cross out the column it belongs to; take one element $a_{2j_2}$ from the second row, not from the column you have crossed out, and cross out the column it belongs to; take one element $a_{3j_3}$ from the third row, not from the two columns you have crossed out, and cross out the column it belongs to. Continue like that. After each step, the number of crossed-out columns increases by 1. The first factor can be chosen in $n$ ways, the second in $n-1$ ways, ..., and for the last factor $a_{nj_n}$ there will be only one choice (see the $3\times 3$ example after this list).

3) The sign depends on the choices made and doesn't matter at this point.
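For a $3\times 3$ matrix this description yields $3!=6$ products:

$\det A=a_{11}a_{22}a_{33}+a_{12}a_{23}a_{31}+a_{13}a_{21}a_{32}-a_{11}a_{23}a_{32}-a_{12}a_{21}a_{33}-a_{13}a_{22}a_{31}.$

Each product takes one element from every row, and no two of its factors share a column.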
Let's see what this implies for finding eigenvalues.
Exercise 2. $\det(A-\lambda I)$ is a polynomial of degree $n$ in $\lambda$.
Proof. The verbal description given above shows that $\det(A-\lambda I)$ contains the product of the diagonal elements

$(a_{11}-\lambda)(a_{22}-\lambda)\cdots(a_{nn}-\lambda),$

which, in turn, contains the power $\lambda^n$. Because of the crossing-out procedure, the other products contain powers of $\lambda$ lower than $n$, and there is no way the power $\lambda^n$ could be cancelled out. Besides, $\det(A-\lambda I)$ may contain only non-negative integer powers of $\lambda$ not exceeding $n$. This proves the statement.
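In the $2\times 2$ case this is visible directly:

$\det(A-\lambda I)=(a_{11}-\lambda)(a_{22}-\lambda)-a_{12}a_{21}=\lambda^2-(a_{11}+a_{22})\lambda+(a_{11}a_{22}-a_{12}a_{21}).$

The diagonal product supplies $\lambda^2$, while the only other product, $a_{12}a_{21}$, contains no $\lambda$ at all.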
Exercise 3. In $\mathbb{C}^n$, any matrix $A$ has at least one eigenvector.
Proof. Exercise 1 reduces the problem of finding eigenvalues to the problem of finding characteristic roots. By the fundamental theorem of algebra every non-constant single-variable polynomial has at least one complex root. Thus we have at least one eigenvalue and from (4) at least one eigenvector.
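For example, for the rotation matrix $A=\begin{pmatrix}0&-1\\1&0\end{pmatrix}$ the characteristic roots are $\pm i$. Working in $\mathbb{C}^2$ and plugging $\lambda=i$ into (4), we get $-ix_1-x_2=0$, so $x=\begin{pmatrix}1\\-i\end{pmatrix}$ is an eigenvector: indeed, $Ax=\begin{pmatrix}i\\1\end{pmatrix}=i\begin{pmatrix}1\\-i\end{pmatrix}=ix$.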
The reduction of the problem of finding eigenvalues to the problem of finding characteristic roots creates a difficulty: even when $A$ has real entries, the characteristic roots may be complex. We will see later how this difficulty is handled.