2.5.8 Let U be a vector space over a field F and T ∈ L(U). If λ1, λ2 ∈ F are distinct eigenvalues and u1, u2 are the respectively associated eigenvectors of T , show that, for any nonzero a1, a2 ∈ F, the vector u = a1u1 + a2u2 cannot be an eigenvector of T .
If λ1, λ2 ∈ F are distinct eigenvalues and u1, u2 are the respectively associated eigenvectors of T, then u1, u2 are linearly independent vectors.
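For completeness, here is a sketch of the standard argument behind that independence claim (written in LaTeX; the scalars c1, c2 are auxiliary symbols introduced only for this sketch):

```latex
% Claim: eigenvectors u_1, u_2 for distinct eigenvalues \lambda_1 \neq \lambda_2
% are linearly independent. Suppose c_1 u_1 + c_2 u_2 = 0 for scalars c_1, c_2.
% Applying T and using T u_i = \lambda_i u_i gives
\[
  c_1 \lambda_1 u_1 + c_2 \lambda_2 u_2 = 0 .
\]
% Subtracting \lambda_1 times the relation c_1 u_1 + c_2 u_2 = 0 eliminates u_1:
\[
  c_2 (\lambda_2 - \lambda_1) u_2 = 0 .
\]
% Since \lambda_2 - \lambda_1 \neq 0 and u_2 \neq 0, we get c_2 = 0, and then
% c_1 u_1 = 0 with u_1 \neq 0 forces c_1 = 0. Hence u_1, u_2 are independent.
```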
Now, let us assume that u = a1u1 + a2u2 is an eigenvector of T corresponding to some eigenvalue λ ∈ F. Then Tu = λu, i.e. T(a1u1 + a2u2) = λ(a1u1 + a2u2). By linearity, a1T(u1) + a2T(u2) = λ(a1u1 + a2u2), so a1λ1u1 + a2λ2u2 = λ(a1u1 + a2u2), which rearranges to a1(λ1 − λ)u1 + a2(λ2 − λ)u2 = 0. Since u1, u2 are linearly independent, this forces a1(λ1 − λ) = 0 and a2(λ2 − λ) = 0.
Since a1, a2 are given to be nonzero and F is a field, this means λ1 − λ = 0 = λ2 − λ, i.e. λ1 = λ = λ2, which is a contradiction as λ1, λ2 are distinct eigenvalues of T.
This implies that we cannot have T(a1u1 + a2u2) = λ(a1u1 + a2u2) for any λ ∈ F, i.e. a1u1 + a2u2 cannot be an eigenvector of T.
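As a concrete illustration (this particular choice of T, u1, u2 is my own example, not part of the exercise), take F = R, U = R^2, and T(x, y) = (x, 2y), so λ1 = 1 and λ2 = 2 with eigenvectors u1 = (1, 0) and u2 = (0, 1):

```latex
% With a_1 = a_2 = 1, the vector u = u_1 + u_2 = (1, 1) satisfies
\[
  T u = T(1, 1) = (1, 2),
\]
% which is not a scalar multiple of u: \lambda (1, 1) = (1, 2) would require
% \lambda = 1 and \lambda = 2 simultaneously, so u is not an eigenvector of T.
```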