Problem Set 1 -- Linear algebra and representations **Solutions**

Posted on 2024-01-29 by George McNinch

F denotes an algebraically closed field of characteristic 0. If you like, you can suppose that F=C is the field of complex numbers.

  1. Let V be a finite-dimensional vector space over the field F. Suppose that ϕ, ψ : V → V are linear maps. Let λ ∈ F be an eigenvalue of ϕ and write W for the λ-eigenspace of ϕ; i.e. W = {v ∈ V : ϕ(v) = λv}. If ϕ∘ψ = ψ∘ϕ, show that W is invariant under ψ – i.e. show that ψ(W) ⊆ W.

    Solution:
    Let w ∈ W. We must show that x = ψ(w) ∈ W. To do this, we must establish that x = ψ(w) is a λ-eigenvector for ϕ.

    We have

    ϕ(x) = ϕ(ψ(w)) = ψ(ϕ(w))   (since ϕ∘ψ = ψ∘ϕ)
         = ψ(λw)               (since w is a λ-eigenvector)
         = λψ(w)               (since ψ is linear)
         = λx.

    This completes the proof.
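The statement can be sanity-checked numerically. Below is a minimal sketch (a hypothetical example, not part of the posted solution), using numpy over F = C: a diagonal ϕ and a commuting matrix ψ, where ψ must map the 2-eigenspace of ϕ into itself.

```python
import numpy as np

# Hypothetical example: phi is diagonal, and psi commutes with phi because
# it preserves phi's block structure; the claim says psi must then map each
# eigenspace of phi into itself.
phi = np.diag([2.0, 2.0, 5.0])        # eigenvalue 2 has eigenspace span(e1, e2)
psi = np.array([[1.0, 3.0, 0.0],
                [4.0, 1.0, 0.0],
                [0.0, 0.0, 7.0]])

assert np.allclose(phi @ psi, psi @ phi)   # phi and psi commute

# w lies in the 2-eigenspace W of phi; check that x = psi(w) stays in W.
w = np.array([1.0, -2.0, 0.0])
x = psi @ w
assert np.allclose(phi @ x, 2 * x)
```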

  2. Let n ∈ N be a non-zero natural number, and let V be an n-dimensional F-vector space with a given basis e1, e2, …, en.

    Consider the linear transformation T : V → V given by the rule T(ei) = e_{i+1 (mod n)}. In other words, T(ei) = e_{i+1} for i < n, and T(en) = e1.

    1. Show that T is invertible and that T^n = id_V.

      To check that T^n = id_V, we check that T^n(ei) = ei for 1 ≤ i ≤ n.

      From the definition, it follows by induction on the natural number m that T^m(ei) = e_{i+m (mod n)}. Thus T^n(ei) = e_{i+n (mod n)} = ei. Since this holds for every i, we conclude T^n = id_V.

      Now T is invertible, since its inverse is given by T^{n-1}: indeed T∘T^{n-1} = T^{n-1}∘T = T^n = id_V.

    2. Consider the vector v0 = ∑_{i=1}^{n} ei. Show that v0 is a 1-eigenvector for T.

      We compute

      T(v0) = T(∑_{i=1}^{n} ei) = ∑_{i=1}^{n} T(ei) = ∑_{i=1}^{n} e_{i+1 (mod n)}
            = ∑_{j=2}^{n+1} e_{j (mod n)}   (let j = i+1)
            = ∑_{j=1}^{n} ej = v0.

      Thus T(v0) = v0, so indeed v0 is a 1-eigenvector.
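For readers who want to experiment, here is a short numpy check (assuming F = C; not part of the posted solution) that builds the matrix of T in the standard basis and verifies parts (a) and (b):

```python
import numpy as np

n = 5
# Column i of T is e_{i+1 (mod n)}: rolling the rows of the identity
# matrix down by one produces exactly this cyclic-shift matrix.
T = np.roll(np.eye(n), 1, axis=0)

# Part (a): T^n = id_V, and T^{n-1} is the inverse of T.
assert np.allclose(np.linalg.matrix_power(T, n), np.eye(n))
assert np.allclose(np.linalg.matrix_power(T, n - 1) @ T, np.eye(n))

# Part (b): v0 = e1 + ... + en is a 1-eigenvector.
v0 = np.ones(n)
assert np.allclose(T @ v0, v0)
```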

    Let ζ ∈ F be a primitive n-th root of unity. (E.g. if you assume F = C, you may as well take ζ = e^{2πi/n}.)

    3. Let v1 = ∑_{i=1}^{n} ζ^i ei. Show that v1 is a ζ^{-1}-eigenvector for T.

      We compute

      T(v1) = T(∑_{i=1}^{n} ζ^i ei) = ∑_{i=1}^{n} ζ^i T(ei) = ∑_{i=1}^{n} ζ^i e_{i+1 (mod n)}
            = ∑_{j=2}^{n+1} ζ^{j-1} e_{j (mod n)}   (let j = i+1)
            = ζ^{-1} ∑_{j=2}^{n+1} ζ^j e_{j (mod n)}
            = ζ^{-1} ∑_{j=1}^{n} ζ^j ej   (since ζ^j = ζ^{j (mod n)} for all j, because ζ^n = 1)
            = ζ^{-1} v1.

      Thus T(v1) = ζ^{-1} v1, so indeed v1 is a ζ^{-1}-eigenvector.

    4. More generally, let 0 ≤ j < n and let vj = ∑_{i=1}^{n} ζ^{ij} ei. Show that vj is a ζ^{-j}-eigenvector for T.

      The calculation in the solution to part (c) is valid for any n-th root of unity ζ. Applying that calculation with ζ^j in place of ζ shows that vj is a (ζ^j)^{-1} = ζ^{-j}-eigenvector for T, as required.
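The eigenvector claim for every vj can likewise be confirmed numerically (a sketch assuming F = C with ζ = e^{2πi/n}; not part of the posted solution):

```python
import numpy as np

n = 6
zeta = np.exp(2j * np.pi / n)          # a primitive n-th root of unity
T = np.roll(np.eye(n), 1, axis=0)      # the shift T(ei) = e_{i+1 (mod n)}

# v_j = sum_{i=1}^{n} zeta^{ij} e_i should be a zeta^{-j}-eigenvector of T.
for j in range(n):
    i = np.arange(1, n + 1)
    vj = zeta ** (i * j)               # coordinates of v_j in the standard basis
    assert np.allclose(T @ vj, zeta ** (-j) * vj)
```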

    5. Conclude that v0, v1, …, v_{n-1} is a basis of V consisting of eigenvectors for T, so that T is diagonalizable.

      Hint: You need to use the fact that eigenvectors for distinct eigenvalues are linearly independent.

      What is the matrix of T in this basis?

      Since ζ is primitive, the eigenvalues 1, ζ^{-1}, …, ζ^{-(n-1)} are pairwise distinct. Since eigenvectors for distinct eigenvalues are linearly independent, we conclude that the vectors B = {v0, v1, …, v_{n-1}} are linearly independent. Since there are n vectors in B and since dim V = n, we conclude that B is a basis for V.

      The matrix of T in the basis B is the diagonal matrix

      [T]_B = diag(1, ζ^{-1}, ζ^{-2}, …, ζ^{-(n-1)}).

      (This form explains why an n×n matrix M is diagonalizable iff F^n has a basis of eigenvectors for M.)
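Concretely, assembling the vj as the columns of a matrix P diagonalizes T. A quick numpy illustration (assuming F = C; not part of the posted solution):

```python
import numpy as np

n = 4
zeta = np.exp(2j * np.pi / n)
T = np.roll(np.eye(n), 1, axis=0)      # the shift matrix from problem 2

# Columns of P are the eigenvectors v_0, ..., v_{n-1}.
i = np.arange(1, n + 1)
P = np.column_stack([zeta ** (i * j) for j in range(n)])

# Conjugating T by P yields diag(1, zeta^{-1}, ..., zeta^{-(n-1)}).
D = np.linalg.inv(P) @ T @ P
assert np.allclose(D, np.diag(zeta ** (-np.arange(n))))
```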

  3. Let G=Z/3Z be the additive group of order 3, and let ζ be a primitive 3rd root of unity in F.

    To define a representation ρ : G → GLn(F), it is enough to find a matrix M ∈ GLn(F) with M^3 = 1; in turn, M determines a representation ρ by the rule ρ(i + 3Z) = M^i.

    Consider the representation ρ1 : G → GL3(F) given by the matrix

    ρ1(1 + 3Z) = M1 = [ 1  0  0   ]
                      [ 0  ζ  0   ]
                      [ 0  0  ζ^2 ]

    and consider the representation ρ2 : G → GL3(F) given by the matrix

    ρ2(1 + 3Z) = M2 = [ 0  0  1 ]
                      [ 1  0  0 ]
                      [ 0  1  0 ].

    Show that the representations ρ1 and ρ2 are equivalent (alternative terminology: isomorphic). In other words, find a linear bijection Φ : F^3 → F^3 with the property that Φ(ρ2(g)v) = ρ1(g)Φ(v) for every g ∈ G and v ∈ F^3.

    Hint: First find a basis of F^3 consisting of eigenvectors for the matrix M2.

    The matrix M1 is diagonal, which is to say that the standard basis vectors e1 = (1, 0, 0)^T, e2 = (0, 1, 0)^T, e3 = (0, 0, 1)^T are eigenvectors for M1 with respective eigenvalues 1, ζ, ζ^2.

    By the work in problem 2, we see that v1 = e1 + e2 + e3, v2 = e1 + ζe2 + ζ^2 e3, v3 = e1 + ζ^2 e2 + ζe3 are eigenvectors for M2 with respective eigenvalues 1, ζ^2, ζ.

    Now let Φ : F^3 → F^3 be the linear transformation for which Φ(e1) = v1, Φ(e2) = v3, Φ(e3) = v2.

    We claim that Φ defines an isomorphism of G-representations (ρ1, F^3) → (ρ2, F^3).

    We must check that Φ(ρ1(g)v) = ρ2(g)Φ(v) for all g ∈ G and all v ∈ F^3. (Since Φ is a bijection, Φ^{-1} is then an intertwiner in the opposite direction, which gives the required equivalence.)

    Since G is cyclic, it suffices to check that

    (∗)  Φ(M1 v) = M2 Φ(v)  for all v ∈ F^3.

    (Indeed, (∗) amounts to “checking on a generator”. If (∗) holds, then for every natural number i a straightforward induction argument shows, for every v ∈ F^3, that

    Φ(ρ1(i + 3Z)v) = Φ(ρ1(1 + 3Z)^i v) = Φ(M1^i v) = M2^i Φ(v) = ρ2(1 + 3Z)^i Φ(v) = ρ2(i + 3Z)Φ(v). )

    In turn, it suffices to verify that (∗) holds for the basis vectors e1, e2, e3 of V = F^3.

    Since e1 and v1 are 1-eigenvectors for M1 resp. M2, we have Φ(M1 e1) = Φ(e1) = v1 = M2 v1 = M2 Φ(e1).

    Since e2 and v3 are ζ-eigenvectors for M1 resp. M2, we have Φ(M1 e2) = Φ(ζe2) = ζΦ(e2) = ζv3 = M2 v3 = M2 Φ(e2).

    Since e3 and v2 are ζ^2-eigenvectors for M1 resp. M2, we have Φ(M1 e3) = Φ(ζ^2 e3) = ζ^2 Φ(e3) = ζ^2 v2 = M2 v2 = M2 Φ(e3). Thus (∗) holds and the proof is complete.


    Alternatively, note that the matrix of Φ in the standard basis is given by

    [Φ] = [ 1  1    1   ]
          [ 1  ζ^2  ζ   ]
          [ 1  ζ    ζ^2 ]

    Now, to prove that Φ∘ρ1(g) = ρ2(g)∘Φ, it suffices to check that M2 [Φ] = [Φ] M1, i.e. that

    [ 0 0 1 ] [ 1  1    1   ]   [ 1  1    1   ] [ 1  0  0   ]
    [ 1 0 0 ] [ 1  ζ^2  ζ   ] = [ 1  ζ^2  ζ   ] [ 0  ζ  0   ]
    [ 0 1 0 ] [ 1  ζ    ζ^2 ]   [ 1  ζ    ζ^2 ] [ 0  0  ζ^2 ]

    In fact, both products yield the matrix

    [ 1  ζ    ζ^2 ]
    [ 1  1    1   ]
    [ 1  ζ^2  ζ   ]
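Both facts used above (M1 and M2 cube to the identity, and the intertwining relation M2[Φ] = [Φ]M1) can be verified numerically; here is a sketch taking F = C (not part of the posted solution):

```python
import numpy as np

zeta = np.exp(2j * np.pi / 3)           # primitive 3rd root of unity in C

M1 = np.diag([1, zeta, zeta ** 2])
M2 = np.array([[0., 0., 1.],
               [1., 0., 0.],
               [0., 1., 0.]])

# Both matrices cube to the identity, so each defines a representation of Z/3Z.
assert np.allclose(np.linalg.matrix_power(M1, 3), np.eye(3))
assert np.allclose(np.linalg.matrix_power(M2, 3), np.eye(3))

# Matrix of Phi in the standard basis: columns are v1, v3, v2.
Phi = np.array([[1, 1,         1        ],
                [1, zeta ** 2, zeta     ],
                [1, zeta,      zeta ** 2]])

# The intertwining identity M2 Phi = Phi M1, and Phi is invertible.
assert np.allclose(M2 @ Phi, Phi @ M1)
assert abs(np.linalg.det(Phi)) > 1e-9
```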

  4. Let V be an n-dimensional F-vector space, for n ∈ N.

    Let GL(V) denote the group GL(V) = {all invertible F-linear transformations ϕ : V → V}, where the group operation is composition of linear transformations.

    Recall that GLn(F) denotes the group of all invertible n×n matrices.

    If B = {b1, b2, …, bn} is a choice of basis, show that the assignment ϕ ↦ [ϕ]_B determines an isomorphism GL(V) → GLn(F).

    Here [ϕ]_B = [M_{ij}] denotes the matrix of ϕ in the basis B, defined by the equations

    ϕ(bi) = ∑_{k=1}^{n} M_{ki} bk.

    Let's write Φ for the mapping Φ : GL(V) → GLn(F) defined above.

    An important property – proved in Linear Algebra – is that for ϕ, ψ : V → V we have

    (∗)  [ϕ∘ψ]_B = [ϕ]_B [ψ]_B.

    In words: “once you choose a basis, composition of linear transformations corresponds to multiplication of the corresponding matrices”.

    Now, since the matrix of an endomorphism ϕ : V → V is equal to the identity matrix In if and only if ϕ = id_V, (∗) shows at once that a linear transformation ϕ : V → V is invertible if and only if [ϕ]_B is an invertible matrix.

    In particular Φ does take values in GLn(F), and (∗) confirms that Φ is indeed a group homomorphism.

    To show that Φ is an isomorphism, we exhibit its inverse. Namely, we define a group homomorphism Ψ : GLn(F) → GL(V) and check that Ψ is inverse to Φ.

    To define Ψ, we introduce the linear isomorphism β : F^n → V defined by the rule β(a1, a2, …, an) = ∑_{i=1}^{n} ai bi.

    For an invertible matrix M, we define Ψ(M) : V → V by the rule Ψ(M)(v) = β(M(β^{-1}(v))), i.e. Ψ(M) = β∘M∘β^{-1}.

    If M1, M2 ∈ GLn(F), then for every v ∈ V we have

    Ψ(M1 M2)(v) = (β∘M1 M2∘β^{-1})(v) = (β∘M1∘β^{-1})∘(β∘M2∘β^{-1})(v) = Ψ(M1)(Ψ(M2)(v)).

    This confirms that Ψ is a group homomorphism.

    It remains to observe that for M ∈ GLn(F) we have Φ(Ψ(M)) = M, which amounts to the fact that M is the matrix of Ψ(M) in the basis B; and we must observe for g ∈ GL(V) that Ψ(Φ(g)) = g, which amounts to the observation that the transformation g : V → V is determined by its effect on the basis vectors bi, and hence by the matrix Φ(g).
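The round-trip argument can be illustrated with a small numpy sketch (a hypothetical example with V = F^n and a randomly chosen basis; not part of the posted solution):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 3

# Take V = F^n, and let the columns of beta be the chosen basis b_1, ..., b_n,
# so beta itself plays the role of the isomorphism F^n -> V from the solution.
beta = rng.standard_normal((n, n)) + n * np.eye(n)   # (almost surely) invertible

def Phi(g):
    """The matrix [g]_B of g in the basis B, namely beta^{-1} o g o beta."""
    return np.linalg.inv(beta) @ g @ beta

def Psi(M):
    """Psi(M) = beta o M o beta^{-1}, a linear map on V = F^n."""
    return beta @ M @ np.linalg.inv(beta)

M = rng.standard_normal((n, n)) + n * np.eye(n)
N = rng.standard_normal((n, n)) + n * np.eye(n)
g = rng.standard_normal((n, n)) + n * np.eye(n)

# Psi is a homomorphism, and Phi and Psi are mutually inverse.
assert np.allclose(Psi(M @ N), Psi(M) @ Psi(N))
assert np.allclose(Phi(Psi(M)), M)
assert np.allclose(Psi(Phi(g)), g)
```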