Recall
Theorem 1 (Lemma).
The characteristic polynomial of a linear operator is independent of the choice of basis.
Proof
Theorem 2 (Corollary).
Similar matrices have the same eigenvalues.
Exercise: Let $\operatorname{Ker} A$ and $\operatorname{Im} A$ be the kernel and the image of a linear operator $A: V \to V$. Show that the following are equivalent:
- $\operatorname{Ker} A = \{0\}$;
- $\operatorname{Im} A = V$;
- $A$ is invertible.
Complex vs Real vector spaces
Theorem 3 (Proposition).
Let $V$ be a nontrivial vector space over $\mathbb{C}$. Then any linear operator $A: V \to V$ has at least one eigenvalue, and hence at least one eigenvector.
Proof
The characteristic polynomial of $A$ will be of degree $n = \dim V \geq 1$. By the Fundamental Theorem of Algebra, it has at least one complex root, and every root of the characteristic polynomial is an eigenvalue. Hence proved.
Info
The Fundamental Theorem of Algebra states that every non-constant polynomial with complex coefficients has at least one complex root. This implies that any polynomial of degree $n$ with complex coefficients has exactly $n$ complex roots counting multiplicity. So, any operator on an $n$-dimensional complex vector space has exactly $n$ eigenvalues counting multiplicity.
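A quick NumPy sketch of this count, on an arbitrary stand-in matrix: `np.poly` returns the coefficients of the characteristic polynomial, and `np.roots` returns its $n$ complex roots, which match the eigenvalues with multiplicity.

```python
import numpy as np

# Arbitrary 3x3 real matrix, used only for illustration.
A = np.array([[0., -1., 0.],
              [1.,  0., 0.],
              [0.,  0., 2.]])

p = np.poly(A)       # coefficients of the characteristic polynomial det(zI - A)
roots = np.roots(p)  # its 3 complex roots, counted with multiplicity
ev = np.linalg.eigvals(A)

print(roots)  # approx. [2, 1j, -1j]: exactly 3 eigenvalues, as promised
print(np.allclose(roots[np.argsort(roots.imag)], ev[np.argsort(ev.imag)]))  # True
```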
Example 4.
On the other hand, it is easy to construct a linear map in a real vector space with no real eigenvalues. Consider the rotation map $R_\gamma: \mathbb{R}^2 \to \mathbb{R}^2$,
$$ R_\gamma = \begin{pmatrix} \cos\gamma & -\sin\gamma \\ \sin\gamma & \cos\gamma \end{pmatrix}. $$
Its characteristic polynomial $\lambda^2 - 2\lambda\cos\gamma + 1$ does not have real roots for $\gamma \neq k\pi$. But over $\mathbb{C}$, $R_\gamma$ has eigenvalues $\cos\gamma + i\sin\gamma$ and $\cos\gamma - i\sin\gamma$.
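A minimal NumPy check of this example, with $\gamma = \pi/3$ as an arbitrary choice of angle:

```python
import numpy as np

g = np.pi / 3  # any angle with sin(g) != 0 works here
R = np.array([[np.cos(g), -np.sin(g)],
              [np.sin(g),  np.cos(g)]])

w = np.linalg.eigvals(R)
print(w)  # approx. cos(g) ± i sin(g) = e^{±ig}: complex, never real

expected = np.array([np.exp(1j * g), np.exp(-1j * g)])
print(np.allclose(w[np.argsort(w.imag)], expected[np.argsort(expected.imag)]))  # True
```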
Theorem 5 (Proposition).
Every complex matrix is similar to an upper triangular matrix.
Proof
We want to show that for every $n \times n$ complex matrix $A$ there exists a complex invertible matrix $S$ such that $S^{-1}AS$ is upper triangular. We induct on $n$; the base case $n = 1$ is trivial. By Theorem 3, $A$ has at least one eigenvalue, call it $\lambda_1$, and let $v_1$ be a corresponding eigenvector. Extend $v_1$ to a basis $v_1, v_2, \ldots, v_n$ of $\mathbb{C}^n$, and let $S_0 = [v_1\ v_2\ \cdots\ v_n]$ be the invertible matrix with these vectors as columns. Observe that
$$ S_0^{-1} A S_0 = \begin{pmatrix} \lambda_1 & * \\ 0 & B \end{pmatrix} $$
for some $(n-1) \times (n-1)$ matrix $B$. By the induction hypothesis on $n$, there exists an invertible matrix $S_1$ such that $S_1^{-1} B S_1$ is upper triangular. Define
$$ S_2 = \begin{pmatrix} 1 & 0 \\ 0 & S_1 \end{pmatrix}, \qquad \text{so that} \qquad S_2^{-1} = \begin{pmatrix} 1 & 0 \\ 0 & S_1^{-1} \end{pmatrix}. $$
Using block multiplication of matrices it is evident that $S_2^{-1}$ has the above form and that
$$ S_2^{-1} \left( S_0^{-1} A S_0 \right) S_2 = \begin{pmatrix} \lambda_1 & * \\ 0 & S_1^{-1} B S_1 \end{pmatrix} $$
will be upper triangular.
If we denote the standard basis by $e_1, \ldots, e_n$, we can write $S_0 e_k = v_k$. Observe that
$$ S_0^{-1} A S_0 e_1 = S_0^{-1} A v_1 = \lambda_1 S_0^{-1} v_1 = \lambda_1 e_1, $$
which is what gives the block form above; here we have used the fact that $A v_1 = \lambda_1 v_1$ and $S_0^{-1} v_1 = e_1$. Put $S = S_0 S_2$; then $S^{-1} A S = S_2^{-1} (S_0^{-1} A S_0) S_2$ is upper triangular. ❏
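Numerically, this triangularization is usually obtained through the Schur decomposition, which even makes $S$ unitary; a sketch with SciPy on a random stand-in matrix:

```python
import numpy as np
from scipy.linalg import schur

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))

# Complex Schur form: A = Z T Z*, with T upper triangular and Z unitary
# (so S = Z in the theorem, with the bonus that S^{-1} = S*).
T, Z = schur(A, output='complex')

print(np.allclose(Z @ T @ Z.conj().T, A))   # True
print(np.allclose(np.tril(T, -1), 0))       # True: T is upper triangular
print(np.diag(T))                           # eigenvalues of A on the diagonal
```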
Remark
If $A$ is an $n \times n$ matrix over a field $\mathbb{F}$ such that its characteristic polynomial is a product of linear factors in $\mathbb{F}$, then there exists an invertible matrix $S$ (with entries in $\mathbb{F}$) such that $S^{-1}AS$ is upper triangular.
Diagonalization
Diagonalization of a linear operator $A: V \to V$ entails finding a basis $\mathcal{B}$ of $V$ such that $[A]_{\mathcal{B}}$ is a diagonal matrix. Such a basis does not always exist, i.e., not all operators can be diagonalized.
For operators in $\mathbb{F}^n$, the diagonalizability of $A$ implies $A$ can be expressed as $A = SDS^{-1}$, where $D$ is a diagonal matrix and $S$ is an invertible matrix with entries in $\mathbb{F}$, i.e., $A$ is similar to a diagonal matrix.
Theorem 6 (Proposition).
A matrix $A$ (with entries in $\mathbb{F}$) is diagonalizable $\iff$ there exists a basis of $\mathbb{F}^n$ that consists of eigenvectors of $A$.
Proof of $\Rightarrow$
Let $A = SDS^{-1}$ for some diagonal matrix $D = \operatorname{diag}(\lambda_1, \ldots, \lambda_n)$ and invertible matrix $S$. Let $s_1, \ldots, s_n$ be the column vectors of $S$. Note that $Se_k = s_k$. Notice what happens when we compute $As_k$:
$$ A s_k = SDS^{-1} s_k = SD e_k = S(\lambda_k e_k) = \lambda_k S e_k = \lambda_k s_k. $$
So, the column vectors of $S$ are eigenvectors of $A$ (with the $\lambda_k$'s being the eigenvalues, but this is not important for the proof). Since the column vectors of an invertible matrix form a basis of $\mathbb{F}^n$, we are done.
Proof of $\Leftarrow$
Suppose there exists a basis $s_1, \ldots, s_n$ of $\mathbb{F}^n$ that consists of eigenvectors of $A$, i.e., $As_k = \lambda_k s_k$ for each $k$. Then, with $S = [s_1\ s_2\ \cdots\ s_n]$ and $D = \operatorname{diag}(\lambda_1, \ldots, \lambda_n)$,
$$ AS = [As_1\ \cdots\ As_n] = [\lambda_1 s_1\ \cdots\ \lambda_n s_n] = SD. $$
If we represent the standard basis by $e_1, \ldots, e_n$, we have $Se_k = s_k$. We have shown above that $D$ is diagonal, and we know that $S$ is invertible since its columns form a basis; hence $A = SDS^{-1}$. ❏
Obviously, an abstract operator $A: V \to V$ is diagonalizable iff its matrix in any basis is diagonalizable. Thus, it follows that $A$ is diagonalizable iff there exists a basis of $V$ that consists of eigenvectors of $A$.
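A short NumPy sketch of Theorem 6 (the matrix is an arbitrary stand-in): `np.linalg.eig` returns the eigenvalues and a matrix whose columns are eigenvectors, which play exactly the roles of $D$ and $S$.

```python
import numpy as np

A = np.array([[1., 2.],
              [8., 1.]])  # arbitrary example matrix, eigenvalues 5 and -3

eigvals, S = np.linalg.eig(A)  # columns of S are eigenvectors of A
D = np.diag(eigvals)

# Theorem 6 in both directions: eigenvector basis <-> A = S D S^{-1}.
print(np.allclose(A, S @ D @ np.linalg.inv(S)))  # True
print(np.allclose(A @ S, S @ D))                 # True: A s_k = lambda_k s_k
```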
A simple sufficient condition for an operator to be diagonalizable
Theorem 7.
If an operator $A: V \to V$ has exactly $n = \dim V$ distinct eigenvalues, then $A$ is diagonalizable.
Proof
Let $\lambda_1, \ldots, \lambda_n$ be the distinct eigenvalues. For each eigenvalue $\lambda_k$, let $v_k$ be a corresponding eigenvector. Then $v_1, \ldots, v_n$ are linearly independent, since eigenvectors corresponding to distinct eigenvalues are always linearly independent. Hence, being $n$ linearly independent vectors in the $n$-dimensional space $V$, they form a basis of $V$. ❏
Example 8.
Consider a $2 \times 2$ matrix $A$ with the two distinct eigenvalues $2$ and $3$. The characteristic polynomial will be $(\lambda - 2)(\lambda - 3)$.
2-Eigenspace: $\operatorname{Ker}(A - 2I)$, spanned by an eigenvector $v_2$.
3-Eigenspace: $\operatorname{Ker}(A - 3I)$, spanned by an eigenvector $v_3$.
Sanity Check
So $S = [v_2\ v_3]$ must give a diagonalized matrix $S^{-1}AS = \operatorname{diag}(2, 3)$.
BEHOLD! THE DIAGONALISED MATRIX !!!!!11!!!
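Since the original numbers of this example did not survive, here is the same computation in NumPy on a hypothetical stand-in matrix with the same eigenvalues $2$ and $3$:

```python
import numpy as np

# Hypothetical stand-in for the example matrix: any 2x2 matrix with
# eigenvalues 2 and 3 behaves the same way.
A = np.array([[4., -1.],
              [2.,  1.]])

eigvals, S = np.linalg.eig(A)  # columns of S span the eigenspaces
print(np.sort(eigvals))        # [2. 3.]

D = np.linalg.inv(S) @ A @ S   # sanity check: S^{-1} A S
print(np.round(D, 10))         # behold: diag(2, 3), up to ordering
```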
Multiplicity of an eigenvalue
To arrive at a stronger criterion of diagonalizability, we need to understand multiplicities of eigenvalues.
Definition 9.
Let $\lambda$ be an eigenvalue of an operator $A: V \to V$.
The Geometric Multiplicity of $\lambda$ is the dimension of the $\lambda$-eigenspace, $\dim \operatorname{Ker}(A - \lambda I)$.
The Algebraic Multiplicity of $\lambda$ is the number of times $\lambda$ appears as a root of the factored characteristic polynomial of $A$.
Let $\lambda_1, \ldots, \lambda_r$ be the distinct eigenvalues of $A$ (over $\mathbb{F}$). Denote the geometric multiplicity of $\lambda_k$ by $g_k$ and the algebraic multiplicity of the same by $m_k$.
Theorem 10.
$g_k \leq m_k$ for each $k$.
Proof
Fix $k$, and call $\lambda = \lambda_k$, $g = g_k$. Take $v_1, \ldots, v_g$ to be a basis of the eigenspace $\operatorname{Ker}(A - \lambda I)$. Extend this to a basis of $V$ by appending $v_{g+1}, \ldots, v_n$. By writing the matrix of $A$ in this basis we get
$$ [A] = \begin{pmatrix} \lambda I_g & * \\ 0 & B \end{pmatrix}. $$
The characteristic polynomial of $A$ will be of the form $\det(A - zI) = (\lambda - z)^g \det(B - zI)$. Now, $\det(B - zI)$ may contain further factors of $(\lambda - z)$. Hence the number of times $\lambda$ appears as a root will be $m_k = g + s$ for some $s \geq 0$. Hence proved. ❏
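Both multiplicities are easy to compute numerically. A sketch on a hypothetical Jordan block, where the inequality is strict:

```python
import numpy as np

# Hypothetical example: a Jordan block with eigenvalue 2, whose
# characteristic polynomial is (z - 2)^2.
A = np.array([[2., 1.],
              [0., 2.]])
lam, n = 2.0, A.shape[0]

# Algebraic multiplicity: how often lam occurs among the n eigenvalues.
alg = int(np.sum(np.isclose(np.linalg.eigvals(A), lam)))

# Geometric multiplicity: dim Ker(A - lam I) = n - rank(A - lam I).
geo = n - np.linalg.matrix_rank(A - lam * np.eye(n))

print(geo, alg)  # 1 2 -> geometric <= algebraic, strictly here
```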
Criterion of diagonalizability
The following theorem holds for vector spaces over general fields.
Theorem 11.
Let an operator $A: V \to V$ have exactly $n = \dim V$ eigenvalues, counting multiplicities. (Since any operator in an $n$-dimensional complex vector space has exactly $n$ eigenvalues, this assumption is moot in the complex case.) Then $A$ is diagonalizable if and only if for each eigenvalue $\lambda$, the geometric multiplicity of $\lambda$ coincides with the algebraic multiplicity of $\lambda$.
We know that $g_k \leq m_k$, and the hypothesis requires $\sum_k m_k = n$. Thus, this theorem boils down to: $A$ is diagonalizable if and only if $\sum_k g_k = n$.
Proof of $\Rightarrow$
If $A$ is diagonalizable, $[A]_{\mathcal{B}}$ is diagonal for some basis $\mathcal{B}$. Observe that geometric and algebraic multiplicities coincide for diagonal matrices. It follows that they must coincide for $A$ too.
Proof of $\Leftarrow$
Let $\lambda_1, \ldots, \lambda_r$ be the distinct eigenvalues of $A$, and let $E_k = \operatorname{Ker}(A - \lambda_k I)$ be the $\lambda_k$-eigenspace for each $k$. We know that these subspaces are linearly independent, i.e., a union of linearly independent systems taken from distinct $E_k$'s is again linearly independent. Let $B_k$ be a basis of $E_k$. It follows that $B = B_1 \cup \cdots \cup B_r$ is linearly independent. Also,
$$ |B| = \sum_k g_k = \sum_k m_k = n. $$
A linearly independent system of size $n = \dim V$ is a basis. Thus, $B$ is a basis of $V$ consisting of eigenvectors of $A$. ❏
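In floating point, the criterion is usually checked by counting independent eigenvectors rather than factoring the characteristic polynomial; a heuristic NumPy sketch (the tolerance and the test matrices are assumptions for illustration):

```python
import numpy as np

def diagonalizable(A, tol=1e-8):
    # Heuristic numerical check of the criterion: A is diagonalizable iff
    # the geometric multiplicities sum to n, i.e. iff A has n linearly
    # independent eigenvectors.
    n = A.shape[0]
    _, V = np.linalg.eig(A)
    return np.linalg.matrix_rank(V, tol=tol) == n

print(diagonalizable(np.array([[4., -1.], [2., 1.]])))  # True  (eigenvalues 2, 3)
print(diagonalizable(np.array([[2., 1.], [0., 2.]])))   # False (g = 1 < m = 2)
```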
The theorem can be restated specifically for real matrices:
Theorem 12.
A real $n \times n$ matrix $A$ admits a real factorization (i.e., a representation $A = SDS^{-1}$ where $S$ and $D$ are real matrices, $D$ is diagonal and $S$ is invertible) iff it admits a complex factorization and all eigenvalues of $A$ are real.
Note:
- The requirement of having $n$ eigenvalues (counting multiplicities) is not moot for real matrices.
- The eigenvalues being real forces the eigenvectors to be real, which in turn forces $S$ to be real.
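A NumPy sketch of both bullet points, reusing the rotation matrix from Example 4 alongside an arbitrary symmetric matrix (symmetric real matrices always have real eigenvalues):

```python
import numpy as np

g = 0.7  # arbitrary angle with sin(g) != 0
R = np.array([[np.cos(g), -np.sin(g)],
              [np.sin(g),  np.cos(g)]])
w, S = np.linalg.eig(R)
print(w)                       # complex pair e^{±ig}: no real factorization
print(np.allclose(S.imag, 0))  # False: the eigenvectors are complex as well

B = np.array([[2., 1.],
              [1., 2.]])       # symmetric, so all eigenvalues are real
w, S = np.linalg.eig(B)
print(w, np.allclose(S.imag, 0))  # real eigenvalues -> real S, real S D S^{-1}
```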