Description
An eigenvector of a linear transformation is a non-trivial vector that is only scaled when a linear transformation is applied to it. This means that the direction of the eigenvector is preserved after the operation.
In mathematical notation: Let v be an eigenvector of the matrix A. Then Av=λv for a unique λ∈R.
This equation says that the transformed vector Av is just a scaled version of the original vector v.
We also require the eigenvector to be non-trivial (non-zero) because A0=0 for every matrix A, so the zero vector would satisfy the equation for any λ.
The eigenvalue associated with this eigenvector is the scalar quantity by which the eigenvector is scaled. In the above equation, λ would be the eigenvalue associated with the vector v. Note that while an eigenvector must be non-zero, its eigenvalue can be 0.
The eigenspace of an eigenvalue is the span of all eigenvectors with this eigenvalue, because any linear combination of eigenvectors sharing an eigenvalue is still an eigenvector with that same eigenvalue. In the above equation, if v is the only eigenvector (up to scaling) corresponding to λ, then the eigenspace is Eλ=span{v}.
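The definitions above can be checked numerically. Below is a minimal sketch using NumPy's `np.linalg.eig`, with a hypothetical 2×2 matrix chosen only for illustration; it verifies that each computed eigenpair satisfies Av=λv.

```python
import numpy as np

# A hypothetical symmetric 2x2 matrix, chosen only for illustration.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# np.linalg.eig returns the eigenvalues and a matrix whose
# columns are the corresponding eigenvectors.
eigenvalues, eigenvectors = np.linalg.eig(A)

# Verify Av = λv for each eigenpair: applying A only scales v.
for lam, v in zip(eigenvalues, eigenvectors.T):
    assert np.allclose(A @ v, lam * v)
```

Note that `eig` returns unit-length eigenvectors, but any non-zero scalar multiple of them would pass the same check.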
Proof: Let v be an eigenvector of the matrix A. Then Av=λv for a unique λ. Now, scale v by a non-zero constant c: A(cv)=c(Av)=c(λv)=λ(cv). Therefore cv is also an eigenvector of A, and λ is still its associated eigenvalue.
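The scaling argument in the proof can be illustrated concretely. In this sketch, the matrix A and the constant c are hypothetical choices; v=[1, 1] is an eigenvector of A with eigenvalue 3, and the check confirms that cv is still an eigenvector with the same eigenvalue.

```python
import numpy as np

# A hypothetical upper-triangular matrix for which
# v = [1, 1] is an eigenvector with eigenvalue 3.
A = np.array([[2.0, 1.0],
              [0.0, 3.0]])
v = np.array([1.0, 1.0])
assert np.allclose(A @ v, 3.0 * v)  # Av = λv with λ = 3

# Scale v by an arbitrary non-zero constant c.
c = -5.0
w = c * v

# A(cv) = λ(cv): the scaled vector is still an eigenvector
# with the same eigenvalue.
assert np.allclose(A @ w, 3.0 * w)
```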