Let $V$ be a finite dimensional vector space over the field $k$. Suppose that $S, T : V \to V$ are linear maps. Let $\lambda$ be an eigenvalue of $T$ and write $V_\lambda$ for the $\lambda$-eigenspace of $T$; i.e.
\[ V_\lambda = \{ v \in V \mid Tv = \lambda v \}. \]
If $ST = TS$, show that $V_\lambda$ is invariant under $S$ – i.e. show that $S(V_\lambda) \subseteq V_\lambda$.

Solution: Let $v \in V_\lambda$. We must show that $Sv \in V_\lambda$. To do this, we must establish that $Sv$ is a $\lambda$-eigenvector for $T$, i.e. that $T(Sv) = \lambda \cdot Sv$. We have
\[ T(Sv) = (TS)(v) = (ST)(v) = S(Tv) = S(\lambda v) = \lambda \cdot Sv. \]
This completes the proof.
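Remark: this invariance is easy to check numerically. Here is a minimal sketch (assuming NumPy; the commuting pair $S$, $T$ below is an arbitrary choice, with $S$ block diagonal with respect to the eigenspaces of $T$):

```python
import numpy as np

# T has eigenvalue 2 with a two-dimensional eigenspace spanned by e1, e2.
T = np.diag([2.0, 2.0, 5.0])

# S is block diagonal with respect to the eigenspaces of T, so ST = TS.
S = np.array([[1.0, 2.0, 0.0],
              [3.0, 4.0, 0.0],
              [0.0, 0.0, 7.0]])
assert np.allclose(S @ T, T @ S)

# Take v in the 2-eigenspace of T; then S v must again lie in it.
v = np.array([1.0, 0.0, 0.0])
Sv = S @ v
assert np.allclose(T @ Sv, 2 * Sv)  # S v is still a 2-eigenvector
print("Sv =", Sv)  # (1, 3, 0): inside the eigenspace, but not a multiple of v
```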
Let $n$ be a non-zero natural number, and let $V$ be an $n$-dimensional $k$-vector space with a given basis $v_1, v_2, \dots, v_n$. Consider the linear transformation $T : V \to V$ given by the rule
\[ Tv_i = v_{i+1} \quad (1 \le i \le n-1), \qquad Tv_n = v_1. \]
In other words, $T$ permutes the basis vectors cyclically.

(a) Show that $T$ is invertible and that $T^n = 1$.

To check that $T^n = 1$, we check that $T^n v_i = v_i$ for $1 \le i \le n$. From the definition, it follows by induction on the natural number $j$ that $T^j v_i = v_{i+j}$, where the index $i+j$ is read modulo $n$. Thus $T^n v_i = v_{i+n} = v_i$. Since this holds for every $i$, conclude that $T^n = 1$. Now $T$ is invertible since its inverse is given by $T^{-1} = T^{n-1}$.
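In coordinates, the matrix of $T$ in the basis $v_1, \dots, v_n$ is a cyclic permutation matrix, and both claims can be sanity-checked numerically. A minimal sketch (assuming NumPy, with $n = 4$ an arbitrary choice):

```python
import numpy as np

n = 4  # arbitrary choice for illustration

# Matrix of T in the basis v_1, ..., v_n: column i holds T v_i = v_{(i+1) mod n}.
T = np.zeros((n, n))
for i in range(n):
    T[(i + 1) % n, i] = 1.0

# T^n is the identity, and T^{-1} coincides with T^{n-1}.
assert np.allclose(np.linalg.matrix_power(T, n), np.eye(n))
assert np.allclose(np.linalg.matrix_power(T, n - 1), np.linalg.inv(T))
print(T)
```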
(b) Consider the vector $w = v_1 + v_2 + \dots + v_n$. Show that $w$ is a $1$-eigenvector for $T$.

We compute
\[ Tw = Tv_1 + Tv_2 + \dots + Tv_n = v_2 + v_3 + \dots + v_n + v_1 = w. \]
Thus $Tw = 1 \cdot w$, so indeed $w$ is a $1$-eigenvector.
(c) Let $\zeta$ be a primitive $n$-th root of unity in $k$. (E.g. if you assume $k = \mathbb{C}$, you may as well take $\zeta = e^{2\pi i/n}$.)

Let $w_\zeta = \sum_{i=1}^{n} \zeta^{-i} v_i$. Show that $w_\zeta$ is a $\zeta$-eigenvector for $T$.

We compute
\[ Tw_\zeta = \sum_{i=1}^{n} \zeta^{-i}\, Tv_i = \sum_{i=1}^{n-1} \zeta^{-i} v_{i+1} + \zeta^{-n} v_1 = \sum_{j=1}^{n} \zeta^{-(j-1)} v_j = \zeta \sum_{j=1}^{n} \zeta^{-j} v_j \]
(using $\zeta^{-n} = 1$ in the third equality). Thus $Tw_\zeta = \zeta \cdot w_\zeta$, so indeed $w_\zeta$ is a $\zeta$-eigenvector.
(d) More generally, let $0 \le j \le n-1$ and let
\[ w_j = \sum_{i=1}^{n} \zeta^{-ij} v_i. \]
Show that $w_j$ is a $\zeta^j$-eigenvector for $T$.

The calculation in the solution to part (c) is valid for any $n$-th root of unity $\lambda$: the vector $\sum_{i=1}^{n} \lambda^{-i} v_i$ is a $\lambda$-eigenvector for $T$. Applying this calculation for $\lambda = \zeta^j$ shows that $w_j$ is a $\zeta^j$-eigenvector for $T$, as required.
is a basis of consisting of eigenvectors for , so that is diagonalizable.Hint: You need to use the fact that eigenvectors for distinct eigenvalues are linearly independent.
What is the matrix of
in this basis?Since eigenvectors for distinct eigenvalues are linearly independent, conclude that the vectors
are linearly independent. Since there vectors in and since , conclude that is a basis for .The matrix of
in the basis is given by(This form explains why an
matrix is diagonalizable iff has a basis of eigenvectors for ).
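Numerically, the matrix $P$ whose columns are $w_0, \dots, w_{n-1}$ (for $k = \mathbb{C}$ and $\zeta = e^{2\pi i/n}$ this is, up to normalization and reindexing, the discrete Fourier transform matrix) conjugates the matrix of $T$ into the diagonal matrix above. A minimal sketch (assuming NumPy, with $n = 4$ an arbitrary choice):

```python
import numpy as np

n = 4
zeta = np.exp(2j * np.pi / n)  # a primitive n-th root of unity in C

# Matrix of T: T v_i = v_{(i+1) mod n}.
T = np.zeros((n, n))
for i in range(n):
    T[(i + 1) % n, i] = 1.0

# Column j of P holds the eigenvector w_j = sum_i zeta^{-ij} v_i.
P = zeta ** (-np.outer(np.arange(1, n + 1), np.arange(n)))

# P^{-1} T P is the diagonal matrix diag(1, zeta, ..., zeta^{n-1}).
D = np.linalg.inv(P) @ T @ P
assert np.allclose(D, np.diag(zeta ** np.arange(n)))
print(np.round(D, 10))
```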
Let $G = \mathbb{Z}/3\mathbb{Z}$ be the additive group of order $3$, and let $\zeta$ be a primitive $3$rd root of unity in $\mathbb{C}$.

To define a representation of $G$ on $\mathbb{C}^3$, it is enough to find a matrix $X \in \mathrm{GL}_3(\mathbb{C})$ with $X^3 = I$; in turn, $X$ determines a representation $\rho_X$ by the rule $\rho_X(j) = X^j$.

Consider the representation $\rho = \rho_A$ given by the matrix
\[ A = \begin{pmatrix} 1 & 0 & 0 \\ 0 & \zeta & 0 \\ 0 & 0 & \zeta^2 \end{pmatrix}, \]
and consider the representation $\sigma = \rho_B$ given by the matrix
\[ B = \begin{pmatrix} 0 & 0 & 1 \\ 1 & 0 & 0 \\ 0 & 1 & 0 \end{pmatrix}. \]

Show that the representations $\rho$ and $\sigma$ are equivalent (alternative terminology: are isomorphic). In other words, find a linear bijection $\varphi : \mathbb{C}^3 \to \mathbb{C}^3$ with the property that $\varphi(\rho(g)v) = \sigma(g)\varphi(v)$ for every $g \in G$ and every $v \in \mathbb{C}^3$.

Hint: First find a basis of $\mathbb{C}^3$ consisting of eigenvectors for the matrix $B$.
The matrix $A$ is diagonal, which is to say that the standard basis vectors $e_1, e_2, e_3$ are eigenvectors for $A$ with respective eigenvalues $1, \zeta, \zeta^2$.

By the work in problem 2 (applied with $n = 3$ and $v_i = e_i$), we see that the vectors
\[ w_0 = e_1 + e_2 + e_3, \qquad w_1 = \zeta^{-1} e_1 + \zeta^{-2} e_2 + e_3, \qquad w_2 = \zeta^{-2} e_1 + \zeta^{-1} e_2 + e_3 \]
are eigenvectors for $B$ with respective eigenvalues $1, \zeta, \zeta^2$.

Now let $\varphi : \mathbb{C}^3 \to \mathbb{C}^3$ be the linear transformation for which $\varphi(e_1) = w_0$, $\varphi(e_2) = w_1$, and $\varphi(e_3) = w_2$.

We claim that $\varphi$ defines an isomorphism of $G$-representations. We must check that
\[ (\ast) \qquad \varphi(\rho(g)v) = \sigma(g)\varphi(v) \]
for all $g \in G$ and all $v \in \mathbb{C}^3$.

Since $G$ is cyclic, it suffices to check that $\varphi \circ A = B \circ \varphi$. (Indeed, this amounts to “checking on a generator”. If $\varphi \circ A = B \circ \varphi$ holds, then for every natural number $j$ a straightforward induction argument shows for every $v$ that $\varphi(A^j v) = B^j \varphi(v)$.)

In turn, it suffices to verify that the identity $\varphi \circ A = B \circ \varphi$ holds on the basis vectors $e_1, e_2, e_3$ for $\mathbb{C}^3$.
Since $e_1$ and $w_0$ are $1$-eigenvectors for $A$ resp. $B$, we have
\[ \varphi(Ae_1) = \varphi(e_1) = w_0 = Bw_0 = B\varphi(e_1). \]
Since $e_2$ and $w_1$ are $\zeta$-eigenvectors for $A$ resp. $B$, we have
\[ \varphi(Ae_2) = \varphi(\zeta e_2) = \zeta w_1 = Bw_1 = B\varphi(e_2). \]
Since $e_3$ and $w_2$ are $\zeta^2$-eigenvectors for $A$ resp. $B$, we have
\[ \varphi(Ae_3) = \varphi(\zeta^2 e_3) = \zeta^2 w_2 = Bw_2 = B\varphi(e_3). \]
Thus $(\ast)$ holds and the proof is complete.
Alternatively, note that the matrix of $\varphi$ in the standard basis is given by
\[ P = \begin{pmatrix} 1 & \zeta^2 & \zeta \\ 1 & \zeta & \zeta^2 \\ 1 & 1 & 1 \end{pmatrix}, \]
whose columns are $w_0, w_1, w_2$ (recall $\zeta^{-1} = \zeta^2$ and $\zeta^{-2} = \zeta$). Now, to prove that $\varphi \circ A = B \circ \varphi$, it suffices to check that $PA = BP$, i.e. that
\[ \begin{pmatrix} 1 & \zeta^2 & \zeta \\ 1 & \zeta & \zeta^2 \\ 1 & 1 & 1 \end{pmatrix} \begin{pmatrix} 1 & 0 & 0 \\ 0 & \zeta & 0 \\ 0 & 0 & \zeta^2 \end{pmatrix} = \begin{pmatrix} 0 & 0 & 1 \\ 1 & 0 & 0 \\ 0 & 1 & 0 \end{pmatrix} \begin{pmatrix} 1 & \zeta^2 & \zeta \\ 1 & \zeta & \zeta^2 \\ 1 & 1 & 1 \end{pmatrix}. \]
In fact, using $\zeta^3 = 1$, both products yield the matrix
\[ \begin{pmatrix} 1 & 1 & 1 \\ 1 & \zeta^2 & \zeta \\ 1 & \zeta & \zeta^2 \end{pmatrix}. \]
Let $V$ be an $n$-dimensional $k$-vector space for $n \ge 1$.

Let $\mathrm{GL}(V)$ denote the group of invertible linear transformations $V \to V$, where the group operation is composition of linear transformations. Recall that $\mathrm{GL}_n(k)$ denotes the group of all invertible $n \times n$ matrices over $k$.

If $\mathcal{B} = \{v_1, \dots, v_n\}$ is a choice of basis, show that the assignment $T \mapsto [T]_{\mathcal{B}}$ determines an isomorphism
\[ \mathrm{GL}(V) \xrightarrow{\ \sim\ } \mathrm{GL}_n(k). \]
Here $[T]_{\mathcal{B}} = (a_{ij})$ denotes the matrix of $T$ in the basis $\mathcal{B}$, defined by the equations
\[ Tv_j = \sum_{i=1}^{n} a_{ij} v_i \quad \text{for } 1 \le j \le n. \]
Let's write $\Phi$ for the mapping defined above, so that $\Phi(T) = [T]_{\mathcal{B}}$.

An important property – proved in Linear Algebra – is that for $S, T \in \mathrm{End}(V)$ we have
\[ \Phi(S \circ T) = \Phi(S)\,\Phi(T). \]
In words: “once you choose a basis, composition of linear transformations corresponds to multiplication of the corresponding matrices”.

Now, since the matrix of an endomorphism $U$ is equal to the identity matrix if and only if $U = \mathrm{id}_V$, this property shows at once that a linear transformation $T$ is invertible if and only if $\Phi(T)$ is an invertible matrix (here we also use that every $n \times n$ matrix is the matrix of some endomorphism of $V$). This confirms that $\Phi$ is indeed a group homomorphism $\mathrm{GL}(V) \to \mathrm{GL}_n(k)$.
To show that $\Phi$ is an isomorphism, we exhibit its inverse. Namely, we define a group homomorphism $\Psi : \mathrm{GL}_n(k) \to \mathrm{GL}(V)$ and check that $\Psi$ is the inverse to $\Phi$.

To define $\Psi$, we introduce the linear isomorphism $c : k^n \to V$ defined by the rule
\[ c(x_1, \dots, x_n) = \sum_{i=1}^{n} x_i v_i. \]
For an invertible matrix $M \in \mathrm{GL}_n(k)$, we define $\Psi(M) \in \mathrm{GL}(V)$ by the rule
\[ \Psi(M)(v) = c\bigl(M \cdot c^{-1}(v)\bigr) \quad \text{for } v \in V. \]
If $M, N \in \mathrm{GL}_n(k)$, then for every $v \in V$ we have
\[ \Psi(MN)(v) = c\bigl(MN \cdot c^{-1}(v)\bigr) = c\Bigl(M \cdot c^{-1}\bigl(c(N \cdot c^{-1}(v))\bigr)\Bigr) = \Psi(M)\bigl(\Psi(N)(v)\bigr). \]
This confirms that $\Psi$ is a group homomorphism.

It remains to observe that for $T \in \mathrm{GL}(V)$ we have $\Psi(\Phi(T)) = T$, which amounts to the fact that $\Phi(T)$ is the matrix of $T$; and we must observe for $M \in \mathrm{GL}_n(k)$ that $\Phi(\Psi(M)) = M$, which amounts to the observation that the transformation $\Psi(M)$ is determined by its effect on the basis vectors and hence by the matrix $M$.
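Both constructions can be made concrete in coordinates. In the sketch below (assuming NumPy; the basis and the maps are arbitrary choices), the columns of $C$ hold the basis vectors $v_1, v_2$ of $V = \mathbb{R}^2$, so $C$ plays the role of $c$; we then check the homomorphism property of $\Phi$ and that $\Phi$, $\Psi$ are mutually inverse:

```python
import numpy as np

# V = R^2 with a non-standard basis B = {v1, v2}; the columns of C are v1, v2,
# so C is the coordinate isomorphism c : k^n -> V in standard coordinates.
C = np.array([[1.0, 1.0],
              [0.0, 2.0]])

# Two invertible linear maps V -> V, written in standard coordinates.
T = np.array([[2.0, 1.0], [1.0, 1.0]])
S = np.array([[0.0, 1.0], [1.0, 3.0]])

def Phi(T_std):
    """Matrix of T in the basis B: column j holds the B-coordinates of T v_j."""
    return np.linalg.inv(C) @ T_std @ C

def Psi(M):
    """Inverse construction: Psi(M) = c o (multiplication by M) o c^{-1}."""
    return C @ M @ np.linalg.inv(C)

# Composition corresponds to matrix multiplication ...
assert np.allclose(Phi(S @ T), Phi(S) @ Phi(T))
# ... and Phi, Psi are mutually inverse.
assert np.allclose(Psi(Phi(T)), T)
assert np.allclose(Phi(Psi(T)), T)
print(Phi(T))
```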