Eigenvalue and (non-trivial) Eigenvector: \lambda \in \mathbb{R} is an eigenvalue of the matrix A^{n \times n} if Av = \lambda v for some non-zero vector v, where v is an eigenvector
v \in N(A - \lambda I) \land v \neq 0 \iff v \text{ is a non-trivial eigenvector}
Find Eigenvalue given Matrix: solve the characteristic equation \det(A - \lambda I) = 0; its roots \lambda are the eigenvalues
Eigenspace (E_\lambda corresponding to A): the nullspace of (A - \lambda I), i.e. all the eigenvectors of A corresponding to \lambda (together with the zero vector)
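A minimal numerical sketch of the definitions above (the 2 \times 2 matrix A is an arbitrary illustrative choice, not an example from these notes), using numpy and scipy:

```python
import numpy as np
from scipy.linalg import null_space

# Arbitrary example matrix; its characteristic polynomial is
# (lambda - 5)(lambda - 2), so the eigenvalues are 5 and 2.
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

# Eigenvalues are the roots of det(A - lambda*I) = 0
eigenvalues, eigenvectors = np.linalg.eig(A)

# Verify A v = lambda v for each eigenpair (columns of `eigenvectors`)
for lam, v in zip(eigenvalues, eigenvectors.T):
    assert np.allclose(A @ v, lam * v)

# Eigenspace E_lambda = N(A - lambda*I): a basis via the nullspace
lam = eigenvalues[0]
E_basis = null_space(A - lam * np.eye(2))
print(E_basis)   # one basis column, since dim(E_lambda) = 1 here
```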
Eigenfunction: an eigenfunction of a linear operator (linear transform) D is any non-zero function f such that Df = \lambda f for some scalar eigenvalue \lambda. Here we treat the eigenfunction f as a vector.
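A standard worked example (not from these notes): take the differentiation operator D = \frac{d}{dx} on differentiable functions.

```latex
D = \frac{d}{dx}, \qquad f(x) = e^{\lambda x} \neq 0
\quad\Longrightarrow\quad
Df = \frac{d}{dx} e^{\lambda x} = \lambda e^{\lambda x} = \lambda f
```

so f is an eigenfunction of D with eigenvalue \lambda.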
Multiplicity:
Algebraic Multiplicity (\text{mult}_{a}): the algebraic multiplicity of \lambda is its multiplicity as a root of the characteristic polynomial (how many times \lambda appears as a repeated root).
Algebraic Multiplicities: the multiset of roots of the characteristic polynomial, counted with repetition
Geometric Multiplicity (\text{mult}_{g}): Geometric Multiplicity of \lambda is the dimension of its eigenspace. (\text{mult}_{g}(\lambda) =_A \text{dim}(E_{\lambda}) = \text{dim}(N(A - \lambda I)))
Theorem: \text{mult}_{g}(\lambda) \leq \text{mult}_{a}(\lambda) for every eigenvalue \lambda // QUESTION: why
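A small sketch of \text{mult}_{g} < \text{mult}_{a} using the standard shear example (not from these notes):

```python
import numpy as np
from scipy.linalg import null_space

# Characteristic polynomial (lambda - 2)^2: lambda = 2 has
# algebraic multiplicity 2 ...
A = np.array([[2.0, 1.0],
              [0.0, 2.0]])

# ... but its eigenspace E_2 = N(A - 2I) is only one-dimensional,
# so the geometric multiplicity is 1 <= 2.
E_basis = null_space(A - 2.0 * np.eye(2))
print(E_basis.shape[1])   # 1
```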
Diagonalizable: A^{n \times n} is diagonalizable iff \text{mult}_{a}(\lambda) = \text{mult}_{g}(\lambda) for every eigenvalue \lambda (with the characteristic polynomial having n roots counted with multiplicity) // QUESTION: why
Diagonalize Matrix as a scale (backward): S^{-1}AS = D \implies \text{the diagonal entries of } D \text{ are the eigenvalues, and the columns of } S \text{ are the corresponding eigenvectors}
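A sketch of reading eigenvalues and eigenvectors off a diagonalization (same arbitrary 2 \times 2 matrix as above):

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

# Columns of S are eigenvectors; S^{-1} A S is diagonal with the
# eigenvalues on its diagonal.
eigenvalues, S = np.linalg.eig(A)
D = np.linalg.inv(S) @ A @ S

assert np.allclose(D, np.diag(eigenvalues))
```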
Linearly Independent Basis: Let \{\lambda_1, \lambda_2, ..., \lambda_k\}, \{E_1, E_2, ..., E_k\}, \{B_1, B_2, ..., B_k\} be the distinct eigenvalues, their eigenspaces, and bases of those eigenspaces; then all the vectors from all the bases taken together are linearly independent. (the union of the bases of all eigenspaces is a linearly independent set) // QUESTION: why, what if A is a scaling matrix?
Corollary: \text{dim}(E_1) + \text{dim}(E_2) + ... + \text{dim}(E_k) \leq n (that is \text{mult}_g(\lambda_1) + \text{mult}_g(\lambda_2) + ... + \text{mult}_g(\lambda_k) \leq n)
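A quick check of the theorem and corollary (matrix chosen for illustration: \lambda = 1 and \lambda = 2 each have a one-dimensional eigenspace):

```python
import numpy as np
from scipy.linalg import null_space

A = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 0.0],
              [0.0, 0.0, 2.0]])

# Bases for the eigenspaces of the distinct eigenvalues 1 and 2
B1 = null_space(A - 1.0 * np.eye(3))   # dim(E_1) = 1
B2 = null_space(A - 2.0 * np.eye(3))   # dim(E_2) = 1

# The union of the bases is linearly independent ...
combined = np.hstack([B1, B2])
assert np.linalg.matrix_rank(combined) == B1.shape[1] + B2.shape[1]

# ... and dim(E_1) + dim(E_2) = 2 <= n = 3
```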
// TODO: understand the Properties of Eigenvectors and Eigenvalues below
Trace: the trace of A^{n \times n} is the sum of its diagonal entries (\text{tr}(A) = \sum_i a_{ii} given A = (a_{ij})_{1 \leq i, j \leq n})
Assume that, counted with multiplicities, A^{n \times n} has n eigenvalues; then \text{tr}(A) = \lambda_1 + \lambda_2 + ... + \lambda_n and \det A = \lambda_1 \cdot \lambda_2 \cdot ... \cdot \lambda_n // TODO: 3b1b proof
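A quick numerical check of these two identities (same illustrative matrix as above):

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])
eigenvalues = np.linalg.eigvals(A)

# tr(A) = sum of eigenvalues, det(A) = product of eigenvalues
assert np.isclose(np.trace(A), eigenvalues.sum())
assert np.isclose(np.linalg.det(A), eigenvalues.prod())
```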
A \simeq B \implies \text{tr}(A) = \text{tr}(B) (but not vice versa)
Similar: A^{n \times n} \simeq B^{n \times n} \iff (\exists \text{ invertible } S^{n \times n})(S^{-1}AS = B)
the effect of transformation B is the same as the effect of transformation A, just expressed in a different basis given by the change-of-basis matrix S.
It is an equivalence relation: reflexive (A \simeq A), symmetric (A \simeq B \implies B \simeq A), and transitive (A \simeq B \land B \simeq C \implies A \simeq C)
Theorem: A^{n \times n} \simeq B^{n \times n} \implies A and B have the same characteristic polynomial, hence the same eigenvalues (with the same algebraic multiplicities), the same trace, and the same determinant
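A sketch checking these invariants for a similar pair B = S^{-1}AS (A and S are arbitrary illustrative choices; S only needs to be invertible):

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])
S = np.array([[1.0, 2.0],
              [0.0, 1.0]])          # any invertible matrix works

B = np.linalg.inv(S) @ A @ S        # B is similar to A

assert np.isclose(np.trace(A), np.trace(B))
assert np.isclose(np.linalg.det(A), np.linalg.det(B))
assert np.allclose(np.sort(np.linalg.eigvals(A)),
                   np.sort(np.linalg.eigvals(B)))
```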
Diagonalizable Matrix: A^{n \times n} is diagonalizable if (\exists \text{ diagonal matrix } D)(A \simeq D)
Theorem: A is diagonalizable iff A has n linearly independent eigenvectors. // TODO: proof
Corollary: if A^{n\times n} has n distinct real eigenvalues, then A is diagonalizable
Application: given diagonalizable A^{n \times n}, calculate A^k: since A = SDS^{-1}, we get A^k = SD^kS^{-1}, where D^k just raises each diagonal entry to the k-th power
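A sketch of the A^k computation, assuming A is diagonalizable (same illustrative matrix as above):

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])
k = 5

eigenvalues, S = np.linalg.eig(A)      # A = S D S^{-1}
Dk = np.diag(eigenvalues ** k)         # D^k: raise each diagonal entry to k
Ak = S @ Dk @ np.linalg.inv(S)         # A^k = S D^k S^{-1}

assert np.allclose(Ak, np.linalg.matrix_power(A, k))
```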