Finding eigenstuff

Recall the definitions of eigenvalues, eigenvectors, and eigenspaces.

Here is a more involved example of finding eigenvectors given the eigenvalues.

Example 1.

Say we want to find the eigenvectors of

$$A = \begin{bmatrix} 4 & -1 & 6 \\ 2 & 1 & 6 \\ 2 & -1 & 8 \end{bmatrix},$$

given that the eigenvalues of $A$ are $2$ and $9$.
For $\lambda = 2$, we have

$$A - 2I = \begin{bmatrix} 2 & -1 & 6 \\ 2 & -1 & 6 \\ 2 & -1 & 6 \end{bmatrix}.$$

Using the fact that the $2$-eigenspace of $A$ is exactly $\operatorname{Null}(A - 2I)$, we can say that any vector in the $2$-eigenspace of $A$ is in the span of the basis vectors of $\operatorname{Null}(A - 2I)$. That is,

$$\operatorname{Null}(A - 2I) = \operatorname{Span}\left\{ \begin{bmatrix} 1 \\ 2 \\ 0 \end{bmatrix}, \begin{bmatrix} 0 \\ 6 \\ 1 \end{bmatrix} \right\}.$$

Note that the $2$-eigenspace of $A$ is a plane in $\mathbb{R}^3$.
We can repeat the same procedure for $\lambda = 9$:

$$A - 9I = \begin{bmatrix} -5 & -1 & 6 \\ 2 & -8 & 6 \\ 2 & -1 & -1 \end{bmatrix}.$$

Here the basis of the null space turns out to be

$$\left\{ \begin{bmatrix} 1 \\ 1 \\ 1 \end{bmatrix} \right\}.$$

So, we have obtained three linearly independent eigenvectors and two eigenvalues. Since a $3 \times 3$ matrix cannot have more than $3$ linearly independent eigenvectors, we can be sure that no more eigenvalues exist.
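As a quick sanity check, we can verify numerically that each basis vector above really satisfies $Av = \lambda v$. This is a minimal sketch using numpy with the matrix from Example 1:

```python
import numpy as np

# The matrix from Example 1.
A = np.array([[4.0, -1.0, 6.0],
              [2.0,  1.0, 6.0],
              [2.0, -1.0, 8.0]])

# Basis vectors of the 2-eigenspace and the 9-eigenspace found above.
checks = [
    (np.array([1.0, 2.0, 0.0]), 2),  # Null(A - 2I)
    (np.array([0.0, 6.0, 1.0]), 2),  # Null(A - 2I)
    (np.array([1.0, 1.0, 1.0]), 9),  # Null(A - 9I)
]

# Each vector v should satisfy A v = lambda v.
for v, lam in checks:
    assert np.allclose(A @ v, lam * v)
print("all eigenvector checks pass")
```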

Info

If $\lambda$ is an eigenvalue of $A$, then $\operatorname{Null}(A - \lambda I)$ is nonzero, i.e., $A - \lambda I$ is non-invertible.

Characteristic Polynomial

We know how to find the eigenvectors of a matrix given its eigenvalues. So, how do we find the eigenvalues?

Definition 2.

The characteristic polynomial of an $n \times n$ matrix $A$ is defined as

$$p(\lambda) = \det(A - \lambda I).$$

Theorem 3.

$\lambda$ is an eigenvalue of $A$ if and only if $p(\lambda) = 0$.

Proof
By the Info box above, $\lambda$ is an eigenvalue of $A$ if and only if $A - \lambda I$ is non-invertible, which happens exactly when $\det(A - \lambda I) = p(\lambda) = 0$.

Example 4.

Say we want to find the eigenvalues of

$$A = \begin{bmatrix} 2 & 1 \\ 1 & 2 \end{bmatrix}.$$

The characteristic polynomial of $A$ is

$$p(\lambda) = \det(A - \lambda I) = (2 - \lambda)^2 - 1 = \lambda^2 - 4\lambda + 3,$$

the roots of which are $\lambda = 1, 3$.
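The same computation can be done numerically. A small sketch assuming numpy: `np.poly` returns the coefficients of the monic characteristic polynomial $\det(\lambda I - A)$, which for a $2 \times 2$ matrix equals $p(\lambda)$.

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# Coefficients of det(lambda*I - A); for this 2x2 matrix that is
# lambda^2 - 4*lambda + 3, the same as p(lambda) above.
coeffs = np.poly(A)
print(coeffs)            # [ 1. -4.  3.]
print(np.roots(coeffs))  # roots 3 and 1
```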

Theorem 5.

Let $A$ be an $n \times n$ matrix. Then, the characteristic polynomial of $A$ is of degree $n$ and is of the form

$$p(\lambda) = (-1)^n \lambda^n + (-1)^{n-1} (\operatorname{tr} A) \lambda^{n-1} + \dots + \det A.$$

Proof
It is obvious from the nature of the determinant that $p$ is of degree $n$ (this in fact gives another proof of $A$ having at most $n$ eigenvalues). For a $2 \times 2$ matrix $A = \begin{bmatrix} a & b \\ c & d \end{bmatrix}$, we have

$$p(\lambda) = \det \begin{bmatrix} a - \lambda & b \\ c & d - \lambda \end{bmatrix} = \lambda^2 - (a + d)\lambda + (ad - bc) = \lambda^2 - (\operatorname{tr} A)\lambda + \det A.$$

Now we shall prove the theorem for general $n$. Observe from the definition of $p$ that $p(0) = \det(A - 0 \cdot I) = \det A$. Thus, the constant term is $\det A$.

If we use cofactor expansion along the first row to compute the determinant, observe that only the first term in the expansion will have a $\lambda^n$ and a $\lambda^{n-1}$ term: the cofactor of $a_{1j}$ for $j \neq 1$ deletes two diagonal positions, so it contains at most $n - 2$ entries involving $\lambda$. Thus, we only need to bother with the first term,

$$p(\lambda) = (a_{11} - \lambda)\det(A_{11} - \lambda I) + (\text{terms of degree} \leq n - 2),$$

where $A_{11}$ is obtained from $A$ by deleting the first row and first column.

If we proceed inductively with $n = 2$ as our base case, we have

$$\det(A_{11} - \lambda I) = (-1)^{n-1}\lambda^{n-1} + (-1)^{n-2}(a_{22} + \dots + a_{nn})\lambda^{n-2} + \dots$$

Thus, we have

$$p(\lambda) = (-1)^{n}\lambda^{n} + (-1)^{n-1}(a_{11} + \dots + a_{nn})\lambda^{n-1} + \dots = (-1)^{n}\lambda^{n} + (-1)^{n-1}(\operatorname{tr} A)\lambda^{n-1} + \dots + \det A.$$
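The coefficient pattern can be checked numerically. A sketch assuming numpy, with an arbitrary random $4 \times 4$ matrix:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))

# np.poly gives det(lambda*I - A) = lambda^n - (tr A) lambda^{n-1} + ... + (-1)^n det A,
# which is (-1)^n times p(lambda) = det(A - lambda*I); for n = 4 the two agree.
coeffs = np.poly(A)
assert np.isclose(coeffs[1], -np.trace(A))       # lambda^{n-1} coefficient
assert np.isclose(coeffs[-1], np.linalg.det(A))  # constant term (n even here)
```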

Important

For an upper/lower triangular matrix $A$, the characteristic polynomial is of the form

$$p(\lambda) = (a_{11} - \lambda)(a_{22} - \lambda) \cdots (a_{nn} - \lambda),$$

since the determinant of a triangular matrix is the product of its diagonal entries. We can see that the diagonal entries are the eigenvalues of $A$.
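A minimal numerical illustration, assuming numpy (the triangular matrix is an arbitrary choice):

```python
import numpy as np

# An upper triangular matrix; entries above the diagonal are arbitrary.
U = np.array([[3.0, 5.0, -2.0],
              [0.0, 1.0,  7.0],
              [0.0, 0.0,  4.0]])

# The eigenvalues should be exactly the diagonal entries 3, 1, 4.
assert np.allclose(np.sort(np.linalg.eigvals(U)), np.sort(np.diag(U)))
```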


Eigenstuff of abstract operators

A linear operator is represented by different matrices in different bases. For linear operators in $\mathbb{R}^2$ and $\mathbb{R}^3$, geometric intuition tells us that eigenvalues should be properties of the map itself, and not of the matrix used to represent it. This is true in general.

Definition 6.

Two matrices $A$ and $B$ are similar if there exists an invertible matrix $S$ such that

$$B = S^{-1} A S.$$

Note that the determinants of similar matrices are equal: $\det B = \det(S^{-1} A S) = \det(S^{-1}) \det(A) \det(S) = \det A$.

Notation

So far, the matrix of a linear transformation $T$ in bases $\mathcal{A}$ and $\mathcal{B}$ has been denoted by $M_{\mathcal{B}\mathcal{A}}(T)$. From now on, it will be denoted by $[T]_{\mathcal{B}\mathcal{A}}$. Note that the change of basis matrix between two bases $\mathcal{A}$ and $\mathcal{B}$ can now be denoted as $[I]_{\mathcal{B}\mathcal{A}}$. If $v \in V$, the coordinate vector of $v$ in basis $\mathcal{B}$ is denoted as $[v]_{\mathcal{B}}$. Thus, we have $[Tv]_{\mathcal{B}} = [T]_{\mathcal{B}\mathcal{A}}[v]_{\mathcal{A}}$.

One should think of two similar matrices as representing the same abstract linear operator in different bases. For example, if $\mathcal{A}$ and $\mathcal{B}$ are bases of $V$, then $[T]_{\mathcal{A}\mathcal{A}}$ and $[T]_{\mathcal{B}\mathcal{B}}$ are similar matrices, related by

$$[T]_{\mathcal{B}\mathcal{B}} = [I]_{\mathcal{B}\mathcal{A}} [T]_{\mathcal{A}\mathcal{A}} [I]_{\mathcal{A}\mathcal{B}}.$$
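A small numerical sketch of this relation, assuming numpy (the operator and the second basis are arbitrary choices). With $S = [I]_{\mathcal{A}\mathcal{B}}$, whose columns are the $\mathcal{B}$-basis vectors in $\mathcal{A}$-coordinates, the relation reads $[T]_{\mathcal{B}\mathcal{B}} = S^{-1}[T]_{\mathcal{A}\mathcal{A}}S$:

```python
import numpy as np

# Matrix of an operator T in a basis A (arbitrary example).
T_AA = np.array([[2.0, 1.0],
                 [0.0, 3.0]])

# Columns of S are the B-basis vectors written in A-coordinates,
# so S = [I]_{AB} and S^{-1} = [I]_{BA}.
S = np.array([[1.0, 1.0],
              [0.0, 1.0]])

T_BB = np.linalg.inv(S) @ T_AA @ S  # [T]_{BB} = [I]_{BA} [T]_{AA} [I]_{AB}
print(T_BB)

# Similar matrices share the determinant (and the characteristic polynomial).
assert np.isclose(np.linalg.det(T_BB), np.linalg.det(T_AA))
```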

From the previous section, we know how to find the eigenvalues of a matrix. How do we find the eigenvalues of an abstract linear operator $T$? We pick an arbitrary basis, and compute the eigenvalues of the matrix of the operator in that basis. We can do this because similar matrices have the same characteristic polynomial:

Let $B = S^{-1} A S$.
$\det(B - \lambda I) = \det(S^{-1} A S - \lambda S^{-1} S) = \det(S^{-1}(A - \lambda I) S) = \det(S^{-1}) \det(A - \lambda I) \det(S)$.
So, $\det(B - \lambda I) = \det(A - \lambda I)$, i.e., $p_B = p_A$.

Therefore, we can define the characteristic polynomial of an operator as the characteristic polynomial of its matrix in some basis. As we have discussed above, the result does not depend on the choice of basis, so the characteristic polynomial of an operator is well defined.
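We can verify this invariance numerically. A short sketch assuming numpy, with a randomly chosen matrix and similarity transform:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))
S = rng.standard_normal((3, 3))  # generically invertible
B = np.linalg.inv(S) @ A @ S

# Similar matrices have identical characteristic polynomial coefficients.
assert np.allclose(np.poly(A), np.poly(B))
```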

Similar matrices do not have the same eigenvectors!

Let $B = S^{-1} A S$. Let $v$ be an eigenvector of $B$ with eigenvalue $\lambda$. Then, $Bv = \lambda v$. So, $S^{-1} A S v = \lambda v$. Hence, $A(Sv) = \lambda(Sv)$, i.e., $Sv$ is an eigenvector of $A$ with eigenvalue $\lambda$.
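Again a quick numerical sketch, assuming numpy ($A$ and $S$ here are arbitrary illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(1)
A = np.diag([2.0, 3.0, 5.0])     # eigenvalues 2, 3, 5
S = rng.standard_normal((3, 3))  # generically invertible
B = np.linalg.inv(S) @ A @ S

lam, V = np.linalg.eig(B)        # eigenpairs of B
v = V[:, 0]

# If B v = lam v with B = S^{-1} A S, then A (S v) = lam (S v).
assert np.allclose(A @ (S @ v), lam[0] * (S @ v))
```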

Observe that $cI$ is the only matrix similar to $cI$, since $S^{-1}(cI)S = cI$ for every invertible $S$.
Note that similarity is an equivalence relation.
Also note that if $A$ is similar to $B$, then $A^k$ is similar to $B^k$, since $(S^{-1} A S)^k = S^{-1} A^k S$.


Tutorial

- If $A$ and $B$ are similar, they have the same rank.
- Symmetric matrices have real eigenvalues.
- The minimal polynomial of $A$ is the monic polynomial $m$ of least degree such that $m(A) = 0$.
- The minimal polynomial must be unique.
- The minimal polynomial of $A$ always divides any polynomial $p$ such that $p(A) = 0$.

The first two facts can be checked numerically; see the sketch below.
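A minimal numerical sketch of the rank and symmetric-eigenvalue facts, assuming numpy (the matrices are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(2)

# Similar matrices have the same rank.
A = np.diag([1.0, 2.0, 0.0])     # rank 2
S = rng.standard_normal((3, 3))  # generically invertible
B = np.linalg.inv(S) @ A @ S
assert np.linalg.matrix_rank(A) == np.linalg.matrix_rank(B) == 2

# Symmetric matrices have real eigenvalues.
M = rng.standard_normal((4, 4))
sym = M + M.T                    # symmetric by construction
assert np.allclose(np.linalg.eigvals(sym).imag, 0.0)
```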