Eigenvalue & Eigenvector Calculator
Compute eigenvalues and eigenvectors of square matrices using the characteristic polynomial method. Step-by-step solutions for 2×2 and 3×3 matrices with PCA and spectral decomposition applications.
The Eigenvalue Equation
An eigenvalue λ (lambda) of a square matrix A is a scalar that satisfies the equation:
Av = λv
where v is a non-zero vector called the eigenvector. The word "eigen" comes from German, meaning "own" or "characteristic." Geometrically, multiplying matrix A by the eigenvector v only stretches or compresses it by the factor λ without changing its direction. If λ is negative, the direction is reversed. If λ = 1, the eigenvector is unchanged by the transformation.
To find eigenvalues, we rearrange the equation: Av - λv = 0, which gives (A - λI)v = 0. For a non-zero solution v to exist, the matrix (A - λI) must be singular, meaning its determinant must be zero:
det(A - λI) = 0
This equation produces the characteristic polynomial, whose roots are the eigenvalues.
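For a 2×2 matrix the characteristic polynomial is simply λ² − tr(A)·λ + det(A), so the eigenvalues can be found as its roots. A minimal sketch with numpy (using the same matrix as the worked example below):

```python
import numpy as np

def char_poly_2x2(A):
    """Coefficients of det(A - λI) = λ² - tr(A)·λ + det(A) for a 2×2 matrix."""
    return np.array([1.0, -np.trace(A), np.linalg.det(A)])

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])
coeffs = char_poly_2x2(A)        # [1, -7, 10], i.e. λ² - 7λ + 10
eigenvalues = np.roots(coeffs)   # roots of the characteristic polynomial
print(np.sort(eigenvalues.real)) # [2. 5.]
```

In practice `np.linalg.eigvals(A)` does this directly (and more robustly for larger matrices); the polynomial route is shown here because it mirrors the hand calculation.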
Step-by-Step Example: 2×2 Matrix
Find the eigenvalues of A = [[4, 1], [2, 3]].
Step 1: Form A - λI = [[4-λ, 1], [2, 3-λ]]
Step 2: Compute the determinant: det(A - λI) = (4-λ)(3-λ) - (1)(2) = 12 - 4λ - 3λ + λ² - 2 = λ² - 7λ + 10
Step 3: Solve λ² - 7λ + 10 = 0. Factor: (λ - 5)(λ - 2) = 0.
Eigenvalues: λ₁ = 5, λ₂ = 2.
Verification: The sum of eigenvalues (5 + 2 = 7) equals the trace (4 + 3 = 7). The product of eigenvalues (5 × 2 = 10) equals the determinant (4·3 - 1·2 = 10). Both checks pass.
Finding eigenvectors: For λ₁ = 5: solve (A - 5I)v = 0, giving [[-1, 1], [2, -2]]v = 0. Row reduce to get v₁ = [1, 1]. For λ₂ = 2: solve (A - 2I)v = 0, giving [[2, 1], [2, 1]]v = 0. Row reduce to get v₂ = [1, -2].
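The hand calculation above can be checked numerically. A sketch using `np.linalg.eig` (note that numpy returns unit-length eigenvectors, so they are scalar multiples of the [1, 1] and [1, -2] found by row reduction):

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])
eigenvalues, eigenvectors = np.linalg.eig(A)  # eigenvectors are the COLUMNS

# Verify Av = λv for each eigenpair.
for lam, v in zip(eigenvalues, eigenvectors.T):
    assert np.allclose(A @ v, lam * v)

print(np.sort(eigenvalues))  # [2. 5.]
```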
Diagonalization and Applications
If a matrix A has n linearly independent eigenvectors, it can be diagonalized: A = PDP⁻¹, where P is the matrix whose columns are the eigenvectors and D is the diagonal matrix of eigenvalues. Diagonalization makes matrix powers trivial: Aᵏ = PDᵏP⁻¹, and Dᵏ simply raises each diagonal entry to the kth power.
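A quick sketch of computing a matrix power through diagonalization, using the same 2×2 matrix as above:

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])
eigvals, P = np.linalg.eig(A)  # columns of P are the eigenvectors

# A = P D P⁻¹, so A^k = P D^k P⁻¹, and D^k is just an entrywise power.
k = 5
A_k = P @ np.diag(eigvals**k) @ np.linalg.inv(P)

assert np.allclose(A_k, np.linalg.matrix_power(A, k))
```

For a single power this is no faster than repeated multiplication, but the same decomposition gives every power (and matrix functions like exp(A)) at once.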
The spectral theorem guarantees that every real symmetric matrix (A = Aᵀ) can be diagonalized with an orthogonal matrix: A = QΛQᵀ. The eigenvalues are always real, and the eigenvectors can be chosen to be orthonormal. This is the mathematical foundation of Principal Component Analysis (PCA).
In PCA, you compute the covariance matrix of your data (which is symmetric), find its eigenvalues and eigenvectors, and sort them by eigenvalue magnitude. The eigenvector with the largest eigenvalue points in the direction of maximum variance. By projecting data onto the top k eigenvectors, you reduce dimensionality while preserving as much variance as possible. If the top 3 eigenvalues capture 95% of the total variance (sum of all eigenvalues), a 3D projection loses only 5% of the information.
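The PCA recipe above can be sketched in a few lines of numpy. The data here is a hypothetical toy set purely for illustration; `np.linalg.eigh` is used because the covariance matrix is symmetric:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))           # toy data: 200 samples, 3 features
X -= X.mean(axis=0)                     # center before computing covariance

cov = np.cov(X, rowvar=False)           # symmetric 3×3 covariance matrix
eigvals, eigvecs = np.linalg.eigh(cov)  # eigh: for symmetric matrices

order = np.argsort(eigvals)[::-1]       # sort by eigenvalue, descending
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

k = 2
X_reduced = X @ eigvecs[:, :k]          # project onto top-k principal components
explained = eigvals[:k].sum() / eigvals.sum()  # fraction of variance kept
print(X_reduced.shape)  # (200, 2)
```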
Eigenvalues also determine the stability of dynamical systems. In a linear system dx/dt = Ax, the eigenvalues of A determine whether the system converges to equilibrium (all eigenvalues have negative real parts), diverges (any eigenvalue has a positive real part), or oscillates (purely imaginary eigenvalues). Google's PageRank algorithm computes the dominant eigenvector of a modified web adjacency matrix to rank pages. Quantum mechanics represents observables as Hermitian matrices whose eigenvalues are the possible measurement outcomes.
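The dominant eigenvector that PageRank needs can be approximated by power iteration: repeatedly multiply a vector by the matrix and renormalize. A sketch with a hypothetical 3-page link matrix (the damping factor 0.85 follows the standard PageRank formulation; the link structure itself is made up for illustration):

```python
import numpy as np

def power_iteration(M, iters=100):
    """Approximate the dominant eigenvector of M by repeated multiplication."""
    v = np.ones(M.shape[0]) / M.shape[0]
    for _ in range(iters):
        v = M @ v
        v /= np.linalg.norm(v)
    return v

# Hypothetical column-stochastic link matrix for 3 pages.
L = np.array([[0.0, 0.5, 1.0],
              [0.5, 0.0, 0.0],
              [0.5, 0.5, 0.0]])
d, n = 0.85, 3
G = d * L + (1 - d) / n * np.ones((n, n))  # damped "Google matrix"
rank = power_iteration(G)
print(rank / rank.sum())  # relative importance of each page
```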
Frequently Asked Questions
What is an eigenvalue?
An eigenvalue λ of a matrix A is a scalar such that Av = λv for some non-zero vector v. In other words, multiplying the matrix by the eigenvector v only scales it by λ without changing its direction. Eigenvalues reveal the fundamental scaling behavior of a linear transformation.
How do you find eigenvalues from the characteristic polynomial?
Form the matrix A - λI (subtract λ from each diagonal entry), then compute its determinant and set it equal to zero: det(A - λI) = 0. This produces the characteristic polynomial, whose roots are the eigenvalues. For a 2×2 matrix, this gives a quadratic equation. For 3×3, a cubic.
What is the relationship between eigenvalues and the determinant?
The determinant of a matrix equals the product of all its eigenvalues: det(A) = λ₁ · λ₂ · ⋯ · λₙ. Similarly, the trace (sum of diagonal entries) equals the sum of eigenvalues. These relationships provide quick consistency checks when computing eigenvalues.
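These consistency checks are easy to confirm numerically on any example matrix (the 3×3 below is arbitrary):

```python
import numpy as np

A = np.array([[2.0, 0.0, 1.0],
              [0.0, 3.0, 0.0],
              [1.0, 0.0, 2.0]])
eigvals = np.linalg.eigvals(A)

# Product of eigenvalues = det(A); sum of eigenvalues = trace(A).
assert np.isclose(np.prod(eigvals), np.linalg.det(A))
assert np.isclose(np.sum(eigvals), np.trace(A))
```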
How are eigenvalues used in PCA (Principal Component Analysis)?
PCA computes the eigenvalues and eigenvectors of the covariance matrix of the data. The eigenvectors define the principal component directions (axes of maximum variance), and the eigenvalues indicate how much variance each component captures. Components with the largest eigenvalues carry the most information, allowing dimensionality reduction by discarding components with small eigenvalues.
Can eigenvalues be complex numbers?
Yes. Even if a matrix has only real entries, its eigenvalues can be complex. For real matrices, complex eigenvalues always occur in conjugate pairs: if a + bi is an eigenvalue, then a - bi is also an eigenvalue. Complex eigenvalues indicate rotation in the transformation. A 2×2 rotation matrix, for example, has eigenvalues cos(θ) ± i·sin(θ).
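The rotation-matrix claim is easy to verify numerically (θ = π/3 here is an arbitrary choice):

```python
import numpy as np

theta = np.pi / 3
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
eigvals = np.linalg.eigvals(R)

# Eigenvalues are the conjugate pair cos(θ) ± i·sin(θ), on the unit circle.
expected = np.array([np.cos(theta) + 1j * np.sin(theta),
                     np.cos(theta) - 1j * np.sin(theta)])
assert np.allclose(np.sort_complex(eigvals), np.sort_complex(expected))
assert np.allclose(np.abs(eigvals), 1.0)  # rotations preserve length
```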
Related Tools
- Determinant Calculator — det(A) equals the product of eigenvalues
- Matrix Transpose Calculator — symmetric matrices (A = Aᵀ) have real eigenvalues
- Inverse Matrix Calculator — eigenvalues of A⁻¹ are 1/λ
Built by Michael Lip. Try the ML3X Matrix Calculator for interactive step-by-step solutions.