The crux of the idea is that when a matrix multiplies a vector, the matrix transforms the vector into a new space with different basis vectors.1
So, are there vectors that only get scaled, rather than fully transformed, when multiplied by a matrix A?
Eigenvectors are nonzero vectors that only get scaled, by factors called eigenvalues, when transformed by A.2 So for an eigenvector, the transformation is essentially just a scalar multiplication:

$$
A v = \lambda v
$$
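As a concrete illustration (a hypothetical example matrix, not taken from the text), one eigenvector of a 2×2 matrix is only scaled, while a generic vector also changes direction:

```python
import numpy as np

# Hypothetical example matrix.
A = np.array([[2.0, 1.0],
              [0.0, 3.0]])

# v = [1, 0] is an eigenvector of A with eigenvalue 2:
v = np.array([1.0, 0.0])
print(A @ v)   # -> [2. 0.], the same direction as v, scaled by 2

# A generic vector is rotated as well as scaled:
w = np.array([0.0, 1.0])
print(A @ w)   # -> [1. 3.], not a multiple of w
```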
Eigendecomposition ‘decomposes’ the transformation matrix into its eigenvectors and eigenvalues. For the i-th eigenvector and eigenvalue, the defining equation can be written as:

$$
A v_i = \lambda_i v_i
$$
This can be generalized to all eigenvectors at once, represented by a matrix Q (each eigenvector is a column of Q):

$$
A Q = Q \Lambda
$$
where Λ is a diagonal matrix with the eigenvalues as its diagonal elements.
If A has n linearly independent eigenvectors, Q is invertible and the eigendecomposition can be written as:

$$
A = Q \Lambda Q^{-1}
$$
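This decomposition can be checked numerically with NumPy's `np.linalg.eig` (a sketch, using an arbitrary example matrix):

```python
import numpy as np

# Arbitrary example matrix with distinct eigenvalues (5 and 2).
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

# Columns of Q are the eigenvectors; lam holds the eigenvalues.
lam, Q = np.linalg.eig(A)
Lam = np.diag(lam)

# A Q = Q Lam: each column of Q is only scaled by its eigenvalue.
print(np.allclose(A @ Q, Q @ Lam))                  # True

# A = Q Lam Q^{-1}: the eigendecomposition reconstructs A.
print(np.allclose(A, Q @ Lam @ np.linalg.inv(Q)))   # True
```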
If the eigenvectors are linearly independent, the columns of Q form a basis for the space. If A is a normal matrix, the eigenvectors can be chosen so that Q forms an orthonormal basis, in which case Q⁻¹ = Qᵀ. In both cases, one can think of multiplication by A as a change of basis into eigenvector coordinates, a scaling by the eigenvalues, and a change back to the original basis.
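The three-step view of multiplication by A can be sketched for a symmetric (hence normal) example matrix, where the eigenvector basis is orthonormal and the basis change is just a transpose:

```python
import numpy as np

# Symmetric (hence normal) example matrix: eigenvectors are orthonormal.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
lam, Q = np.linalg.eigh(A)   # eigh: for symmetric/Hermitian matrices

x = np.array([1.0, -2.0])

# Multiplying by A, viewed as three steps:
coords = Q.T @ x             # 1. change to the eigenvector basis (Q^-1 = Q^T)
scaled = lam * coords        # 2. scale each coordinate by its eigenvalue
y = Q @ scaled               # 3. change back to the original basis

print(np.allclose(y, A @ x))  # True
```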