# Lecture 007

## Eigenvalue and Eigenvector

Eigenvalue and (non-trivial) Eigenvector: $\lambda \in \mathbb{R}$ is an eigenvalue of the matrix $A^{n \times n}$ if

$(\exists v \in \mathbb{R}^n)(v \neq 0 \land Av = \lambda v)$

where $v$ is an eigenvector corresponding to $\lambda$

• Find Eigenvector given Eigenvalue:
\begin{align*} Av &= \lambda v\\ Av - \lambda v &= 0\\ Av - \lambda I v &= 0\\ (A - \lambda I) v &= 0\\ \end{align*}
• $v \in N(A - \lambda I) \land v \neq 0 \iff v \text{ is a non-trivial eigenvector}$

• Find Eigenvalue given Matrix:

\begin{align*} &\lambda \text{ is eigenvalue of } A\\ \iff &(A - \lambda I)x = 0 \text{ has non-trivial solution} \\ \iff &(A - \lambda I) \text{ is singular}\\ \iff &\det(A - \lambda I) = 0\\ \end{align*}
• Characteristic polynomial: $\det (A - \lambda I)$ (expanding the determinant and setting it to $0$ reduces finding eigenvalues to finding the roots of a polynomial)
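The two procedures above can be sketched numerically, a minimal sketch using NumPy (the $2 \times 2$ matrix is an example chosen here for illustration): eigenvalues come out as roots of the characteristic polynomial, and for each eigenvalue a basis of $N(A - \lambda I)$ is read off from the SVD.

```python
import numpy as np

# Example matrix (an assumption for illustration).
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# Coefficients of the characteristic polynomial det(A - lambda*I),
# highest degree first: here lambda^2 - 4*lambda + 3.
coeffs = np.poly(A)

# Eigenvalues are the roots of the characteristic polynomial.
eigenvalues = np.roots(coeffs)

# For each eigenvalue, eigenvectors span N(A - lambda*I); a basis is
# given by the right-singular vectors of A - lambda*I whose singular
# value is (numerically) zero.
for lam in eigenvalues:
    _, s, Vt = np.linalg.svd(A - lam * np.eye(2))
    null_basis = Vt[s < 1e-10]          # rows spanning the nullspace
    v = null_basis[0]
    assert np.allclose(A @ v, lam * v)  # check Av = lambda*v
```

For this matrix the roots are $\lambda = 3$ and $\lambda = 1$, with eigenvectors along $(1, 1)$ and $(1, -1)$ respectively.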

Eigenspace ($E_\lambda$ corresponding to $A$): the nullspace of $(A - \lambda I)$, i.e., all the eigenvectors corresponding to $A, \lambda$ (together with the zero vector)

Eigenfunction: an eigenfunction of a linear operator (linear transformation) $D$ is any non-zero function $f$ such that, for some scalar eigenvalue $\lambda$,

$Df = \lambda f$

Here we treat the eigenfunction $f$ as a vector.
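A classic example (the choice of $D$ as the differentiation operator is an assumption for illustration): on differentiable functions, $D = \frac{d}{dx}$ is linear, and every exponential is an eigenfunction:

```latex
% For any scalar \lambda, the function f(x) = e^{\lambda x} satisfies
\[
  Df = \frac{d}{dx}\, e^{\lambda x} = \lambda\, e^{\lambda x} = \lambda f,
\]
% so f is an eigenfunction of D with eigenvalue \lambda.
```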

Multiplicity:

• Algebraic Multiplicity ($\text{mult}_{a}$): the algebraic multiplicity of $\lambda$ (as a solution to the characteristic polynomial) is its multiplicity as a root of the characteristic equation (how many times this value appears as a non-distinct root).

• Algebraic Multiplicities: the multiset of non-distinct roots of the characteristic polynomial

• Geometric Multiplicity ($\text{mult}_{g}$): the geometric multiplicity of $\lambda$ (with respect to $A$) is the dimension of its eigenspace: $\text{mult}_{g}(\lambda) = \dim(E_{\lambda}) = \dim(N(A - \lambda I))$

• Theorem: $\text{mult}_{g} \leq \text{mult}_{a}$ // QUESTION: why

• Diagonalizable: $A^{n \times n}$ is diagonalizable iff $\text{mult}_{a}(\lambda) = \text{mult}_{g}(\lambda)$ for every eigenvalue $\lambda$ of $A$ // QUESTION: why

• Reading off a diagonalization (backward): $S^{-1}AS = D \implies \text{the diagonal entries of } D \text{ are the eigenvalues, and the columns of } S \text{ are corresponding eigenvectors}$

• Linearly Independent Basis: let $\{\lambda_1, \lambda_2, ..., \lambda_k\}$ be distinct eigenvalues with eigenspaces $\{E_1, E_2, ..., E_k\}$ and bases $\{B_1, B_2, ..., B_k\}$; then the union of all the basis vectors is linearly independent (all vectors in all the bases of all eigenspaces together are linearly independent) // QUESTION: why, what if A is a scaling matrix?

• Corollary: $\text{dim}(E_1) + \text{dim}(E_2) + ... + \text{dim}(E_k) \leq n$ (that is $\text{mult}_g(\lambda_1) + \text{mult}_g(\lambda_2) + ... + \text{mult}_g(\lambda_k) \leq n$)
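A concrete instance of the gap in the theorem $\text{mult}_{g} \leq \text{mult}_{a}$ above, as a minimal NumPy sketch (the shear matrix is an example chosen here): for $\begin{pmatrix}1 & 1\\ 0 & 1\end{pmatrix}$, $\lambda = 1$ has $\text{mult}_a = 2$ but $\text{mult}_g = 1$, so this matrix is not diagonalizable.

```python
import numpy as np

# Shear matrix: characteristic polynomial (1 - lambda)^2,
# so lambda = 1 has algebraic multiplicity 2.
A = np.array([[1.0, 1.0],
              [0.0, 1.0]])

# Geometric multiplicity = dim N(A - 1*I) = n - rank(A - I)
# by the rank-nullity theorem.
n = A.shape[0]
mult_g = n - np.linalg.matrix_rank(A - np.eye(n))
print(mult_g)  # 1, strictly less than mult_a = 2
```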

// TODO: understand the properties of eigenvectors and eigenvalues below

1. $0$ is an eigenvalue of $A$ iff $A$ is singular (determinant zero, squashing space)
2. If $\lambda_1, \lambda_2, ..., \lambda_n$ are different eigenvalues of $A \in \mathbb{R}^{n \times n}$ and $v_1, v_2, ..., v_n$ are non-trivial eigenvectors corresponding to them, then $v_1, v_2, ..., v_n$ are linearly independent // TODO: proof
3. By above, $A^{n \times n}$ can have at most $n$ different eigenvalues (by algebraic multiplicities)
4. If $\lambda$ is an eigenvalue of $A$, then $\lambda^k$ is an eigenvalue of $A^k$ (by induction)
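Property 4 can be spot-checked numerically, a sketch with an example matrix chosen here:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])     # example matrix (an assumption)
lams = np.linalg.eigvals(A)    # eigenvalues of A

k = 5
lams_k = np.linalg.eigvals(np.linalg.matrix_power(A, k))

# Eigenvalues of A^k are the k-th powers of the eigenvalues of A.
assert np.allclose(sorted(lams ** k), sorted(lams_k))
```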

Trace: the trace of $A^{n \times n}$ is the sum of its entries on the diagonal ($\text{tr}(A) = \sum_i a_{ii}$ given $A = (a_{ij})_{1 \leq i, j \leq n}$)

1. Assume that together with multiplicities $A^{n \times n}$ has $n$ eigenvalues, then $\text{tr}(A) = \lambda_1 + \lambda_2 + ... + \lambda_n$ and $\det A = \lambda_1 \cdot \lambda_2 \cdot ... \cdot \lambda_n$ // TODO: 3b1b proof

2. $A \simeq B \implies \text{tr}(A) = \text{tr}(B)$ (but not vice versa)
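Both identities in item 1 can be checked on a small example, a minimal NumPy sketch (the matrix is an assumption for illustration):

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])     # eigenvalues are 5 and 2
lams = np.linalg.eigvals(A)

assert np.isclose(np.trace(A), lams.sum())        # tr(A)  = 5 + 2 = 7
assert np.isclose(np.linalg.det(A), lams.prod())  # det(A) = 5 * 2 = 10
```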

### Diagonalization

Similar: $A^{n \times n} \simeq B^{n \times n} \iff (\exists \text{ invertible } S^{n \times n})(S^{-1}AS = B)$

• $B$ describes the same transformation as $A$, expressed in a different basis: $S$ acts as a change-of-basis matrix.

• It is an equivalence relation:

• $A \simeq A$ (reflexive)
• $A \simeq B \implies B \simeq A$ (symmetric)
• $A \simeq B \land B \simeq C \implies A \simeq C$ (transitive)

Theorem: $A^{n \times n} \simeq B^{n \times n} \implies$

1. $\det A = \det B$
2. $A \text{ is invertible} \iff B \text{ is invertible}$
3. $\text{rank}(A) = \text{rank}(B)$
4. $A, B$ have the same characteristic polynomial
5. $A, B$ have the same eigenvalues
6. $(\forall m \geq 0)(A^m \simeq B^m)$ // TODO: proof above
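The invariants in the theorem can be verified on a random similar pair, a sketch (the random matrices are assumptions for illustration; a random $S$ is invertible with probability 1):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))
S = rng.standard_normal((3, 3))        # invertible with probability 1
B = np.linalg.inv(S) @ A @ S           # B is similar to A by construction

assert np.isclose(np.linalg.det(A), np.linalg.det(B))        # same determinant
assert np.linalg.matrix_rank(A) == np.linalg.matrix_rank(B)  # same rank
assert np.allclose(np.poly(A), np.poly(B))                   # same characteristic polynomial
assert np.allclose(np.sort_complex(np.linalg.eigvals(A)),
                   np.sort_complex(np.linalg.eigvals(B)))    # same eigenvalues
```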

Diagonalizable Matrix: $A^{n \times n}$ is diagonalizable if $(\exists \text{ diagonal matrix } D)(A \simeq D)$

• Theorem: $A$ is diagonalizable iff $A$ has $n$ linearly independent eigenvectors. // TODO: proof

• Corollary: if $A^{n\times n}$ has $n$ real different eigenvalues, then $A$ is diagonalizable

Application: given diagonalizable $A^{n \times n}$, calculate $A^k$

1. find $\lambda_1, \lambda_2, ..., \lambda_n$
2. find $v_1, v_2, ..., v_n$
3. Put $v_1, v_2, ..., v_n$ into columns of $S$
4. find $S^{-1}$
5. calculate $D = S^{-1}AS$
6. then $A = SDS^{-1}$
7. then $A^k = SD^kS^{-1}$
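The steps above can be sketched with NumPy, a minimal sketch (assumes $A$ is diagonalizable, so `np.linalg.eig` returns $n$ independent eigenvectors; the example matrix is an assumption):

```python
import numpy as np

def power_via_diagonalization(A, k):
    lams, S = np.linalg.eig(A)   # steps 1-3: eigenvalues; eigenvectors are columns of S
    S_inv = np.linalg.inv(S)     # step 4
    # Steps 5-7: D = S^{-1} A S is diagonal, so D^k just powers the diagonal.
    D_k = np.diag(lams ** k)
    return S @ D_k @ S_inv       # A^k = S D^k S^{-1}

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])       # example diagonalizable matrix
assert np.allclose(power_via_diagonalization(A, 10),
                   np.linalg.matrix_power(A, 10))
```

Powering the diagonal entry-wise is the whole payoff: $D^k$ costs $O(n)$ per power instead of repeated matrix multiplication.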
