Planar Vector: \mathbb{R}^2: \begin{bmatrix} x\\ y\\ \end{bmatrix} \in \mathbb{R}^2 n-Dimensional Vector: \mathbb{R}^n: \begin{bmatrix} x_1\\ ...\\ x_n\\ \end{bmatrix} \in \mathbb{R}^n
Linear Combination: for v_1, ..., v_k \in \mathbb{R}^n and scalars c_1, ..., c_k \in \mathbb{R}, the vector c_1v_1 + ... + c_kv_k \in \mathbb{R}^n is a linear combination of v_1, ..., v_k.
A linear combination of vectors is formed by scaling each vector by a constant and adding the results.
Scalar multiplication acts element-wise: every entry of the vector is multiplied by the scalar.
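A minimal numpy sketch of a linear combination (the vectors v1, v2 and coefficients c1, c2 are made-up values, not from the notes):

```python
import numpy as np

# Two vectors in R^3 and two scalar coefficients (illustrative values).
v1 = np.array([1.0, 2.0, 3.0])
v2 = np.array([0.0, 1.0, 4.0])
c1, c2 = 2.0, -1.0

# The linear combination c1*v1 + c2*v2 is again a vector in R^3;
# the scalar multiplication happens element-wise.
w = c1 * v1 + c2 * v2
print(w)  # [2. 3. 2.]
```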
\overrightarrow{u} \cdot \overrightarrow{v} = \overrightarrow{u} \cdot \overrightarrow{w} \iff (\overrightarrow{v} - \overrightarrow{w}) \perp \overrightarrow{u}
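This follows from linearity of the dot product: \overrightarrow{u} \cdot \overrightarrow{v} = \overrightarrow{u} \cdot \overrightarrow{w} \iff \overrightarrow{u} \cdot \overrightarrow{v} - \overrightarrow{u} \cdot \overrightarrow{w} = 0 \iff \overrightarrow{u} \cdot (\overrightarrow{v} - \overrightarrow{w}) = 0 \iff (\overrightarrow{v} - \overrightarrow{w}) \perp \overrightarrow{u}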
A matrix times a column vector is a linear combination of the columns of the matrix, with the entries of the vector as the coefficients.
Example 1: \begin{pmatrix} 2&0\\ 1&3\\ \end{pmatrix} \cdot \begin{pmatrix} 1\\ 2\\ \end{pmatrix} = 1\begin{pmatrix} 2\\ 1\\ \end{pmatrix} + 2\begin{pmatrix} 0\\ 3\\ \end{pmatrix} = \begin{pmatrix} 2 \cdot 1 + 0 \cdot 2\\ 1 \cdot 1 + 3 \cdot 2\\ \end{pmatrix} = \begin{pmatrix} 2\\ 7\\ \end{pmatrix}
Example 2: \begin{bmatrix} 1&1\\ 2&3\\ 3&4\\ \end{bmatrix} \begin{bmatrix} c\\ d\\ \end{bmatrix} = c \begin{bmatrix} 1\\ 2\\ 3\\ \end{bmatrix} + d \begin{bmatrix} 1\\ 3\\ 4\\ \end{bmatrix}
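Example 1 can be checked numerically; a small numpy sketch (the names as_product and as_combination are just for illustration):

```python
import numpy as np

# A @ x equals the linear combination of A's columns with x's entries as coefficients.
A = np.array([[2.0, 0.0],
              [1.0, 3.0]])
x = np.array([1.0, 2.0])

as_product = A @ x                                  # matrix-vector product
as_combination = x[0] * A[:, 0] + x[1] * A[:, 1]    # 1*(first column) + 2*(second column)

print(as_product)                               # [2. 7.]
print(np.allclose(as_product, as_combination))  # True
```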
Let A \in \mathbb{R}^{M\times N} (written A = (a_{ij})_{M \times N}) and B \in \mathbb{R}^{N\times E}; then AB \in \mathbb{R}^{M \times E}.
Matrix Multiplication Properties:
Associative: ABC = (AB)C = A(BC)
Distributive: A(B+C) = AB + AC
Note: AB = 0 does not imply A = 0 \lor B = 0
Corollary: each column of AB is a linear combination of the columns of A
Corollary: each row of AB is a linear combination of the rows of B (since row i of AB equals (row i of A) times B)
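A small numpy sketch of these facts (the matrices A, B, C are made-up examples):

```python
import numpy as np

A = np.array([[1.0, 1.0],
              [0.0, 0.0]])
B = np.array([[ 1.0, 0.0],
              [-1.0, 0.0]])

# AB = 0 even though neither A nor B is the zero matrix.
print(A @ B)                                      # [[0. 0.] [0. 0.]]

C = np.random.default_rng(0).normal(size=(2, 3))  # a 2x3 matrix, so AC is 2x3

# Column j of AC is A times column j of C (a combination of A's columns).
print(np.allclose((A @ C)[:, 1], A @ C[:, 1]))    # True

# Row i of AC is row i of A times C (a combination of C's rows).
print(np.allclose((A @ C)[0, :], A[0, :] @ C))    # True
```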
I_n: the n \times n identity matrix
Elementary Matrices: matrices that perform a single row operation (obtained by applying that row operation to the identity matrix)
Diagonal Matrices (D): only the diagonal entries may be non-zero; obtained by scaling the rows of the identity matrix.
Permutation Matrices (P): obtained by swapping rows of the identity matrix (every row and every column contains exactly one 1, all other entries are 0)
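A numpy sketch of the three special matrices acting as row operations on a made-up 2x2 matrix A:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])

# Elementary matrix: add 2 * (row 1) to row 2 of the identity;
# left-multiplying A by it performs the same row operation on A.
E = np.array([[1.0, 0.0],
              [2.0, 1.0]])
print(E @ A)   # row 2 becomes [5. 8.]

# Diagonal matrix: row multiples of the identity; scales each row of A.
D = np.diag([10.0, 1.0])
print(D @ A)   # first row scaled by 10

# Permutation matrix: the identity with its rows swapped; swaps the rows of A.
P = np.array([[0.0, 1.0],
              [1.0, 0.0]])
print(P @ A)   # rows of A exchanged
```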
Inverse (square matrix only): let A, B \in \mathbb{R}^{n \times n}; if AB = BA = I, then B = A^{-1}
Note: not all matrices have an inverse
\exists A^{-1} \implies A^{-1} \text{ is unique}
AB = I \implies BA = I
if A has an inverse, (A^{-1})^{-1} = A
Left Inverse: B is a left inverse of A iff BA = I. Right Inverse: B is a right inverse of A iff AB = I.
For square matrices, a left inverse is the same as a right inverse
If a one-sided inverse exists, so does the other: AB = I \implies BA = I (the chain AB = I \implies BAB = B \implies BA = I cancels B on the right, which is only justified once B is known to be invertible; a rank argument fills that gap)
Invertible: if the inverse of a matrix exists
A = \begin{pmatrix} 0 & 0\\ 0 & 0\\ \end{pmatrix} is not invertible because AB = BA = 0 \neq I for every B
The inverse of \begin{pmatrix} a & b\\ c & d\\ \end{pmatrix} is \frac{1}{ad-bc} \begin{pmatrix} d & -b\\ -c & a\\ \end{pmatrix}, provided ad - bc \neq 0
Product: if both A, B \in \mathbb{R}^{n \times n} are invertible, then AB is invertible and (AB)^{-1} = B^{-1}A^{-1} (note the reversed order)
Uniqueness: if an inverse of a matrix exists, it is unique.
Left inverse is unique: XA = I \land YA = I \implies X = Y
Right inverse is unique: AX = I \land AY = I \implies X = Y
Matrix inverse in a System: if Ax = b with A \in \mathbb{R}^{n \times n} invertible and b \in \mathbb{R}^n, then x = A^{-1}b is the unique solution (a numerical check follows below)
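A numpy sketch of the inverse facts above; the helper inv2x2 is an illustrative implementation of the 2x2 formula, not a library routine:

```python
import numpy as np

def inv2x2(M):
    """2x2 inverse formula; assumes ad - bc != 0."""
    a, b, c, d = M[0, 0], M[0, 1], M[1, 0], M[1, 1]
    det = a * d - b * c
    return np.array([[ d, -b],
                     [-c,  a]]) / det

A = np.array([[2.0, 0.0],
              [1.0, 3.0]])
b = np.array([2.0, 7.0])

print(np.allclose(inv2x2(A) @ A, np.eye(2)))   # True: the formula gives A^{-1}

# x = A^{-1} b solves Ax = b (in practice np.linalg.solve is preferred
# over forming the inverse explicitly).
x = inv2x2(A) @ b
print(np.allclose(A @ x, b))                   # True
print(np.allclose(x, np.linalg.solve(A, b)))   # True

# (AB)^{-1} = B^{-1} A^{-1}: note the reversed order.
B = np.array([[1.0, 1.0],
              [0.0, 1.0]])
print(np.allclose(np.linalg.inv(A @ B),
                  np.linalg.inv(B) @ np.linalg.inv(A)))  # True
```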
Power of a Matrix: A^n = AA \cdots A with n factors of A.
Transpose: the transpose of A \in \mathbb{R}^{m \times n} is A^\intercal \in \mathbb{R}^{n \times m}, obtained by reflecting A across its main diagonal
B is the transpose of A if the i-th column of A has the same entries in the same order as the i-th row of B
A \text{ is invertible } \implies A^\intercal \text{ is invertible}
(AB)^\intercal = B^\intercal A^\intercal
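A quick numpy check of the transpose rules; the identity (A^\intercal)^{-1} = (A^{-1})^\intercal is a standard fact, noted here to make the invertibility claim concrete:

```python
import numpy as np

A = np.array([[2.0, 0.0],
              [1.0, 3.0]])
B = np.array([[1.0, 1.0],
              [0.0, 1.0]])

# (AB)^T = B^T A^T: the order reverses under transposition.
print(np.allclose((A @ B).T, B.T @ A.T))                    # True

# A invertible implies A^T invertible, with (A^T)^{-1} = (A^{-1})^T.
print(np.allclose(np.linalg.inv(A.T), np.linalg.inv(A).T))  # True
```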
Theorem: A \in \mathbb{R}^{n \times n}, the following are equivalent
A is invertible
Ax = b has a unique solution for every b \in \mathbb{R}^n (A is a non-singular matrix)
The only solution for Ax = 0 is x = 0
Row operations reduce Ax = 0 to a diagonal system d_1x_1 = 0, \ldots, d_nx_n = 0 with every d_i \neq 0 (equivalently, A row-reduces to the identity), forcing x = 0
Row operations reduce Ax = b to a diagonal system d_ix_i = c_i with every d_i \neq 0, so each x_i is uniquely determined
A is a product of elementary matrices
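A numpy sketch contrasting an invertible matrix with a singular one (A_inv and A_sing are made-up examples):

```python
import numpy as np

A_inv  = np.array([[2.0, 0.0],
                   [1.0, 3.0]])   # invertible
A_sing = np.array([[1.0, 2.0],
                   [2.0, 4.0]])   # singular: second row = 2 * first row

# Invertible: Ax = b has a unique solution and the only solution of Ax = 0 is x = 0.
b = np.array([1.0, 1.0])
print(np.linalg.solve(A_inv, b))      # the unique solution
print(np.linalg.matrix_rank(A_inv))   # 2: a pivot in every column, row-reduces to I

# Singular: Ax = 0 has nonzero solutions, so no inverse exists.
print(np.linalg.matrix_rank(A_sing))  # 1: a pivot is missing
x0 = np.array([2.0, -1.0])
print(A_sing @ x0)                    # [0. 0.] even though x0 != 0
```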
Gauss Jordan Method: to find A^{-1}, row-reduce the augmented matrix [A \mid I] until the left half becomes I; the right half is then A^{-1} (a sketch follows below)
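A minimal sketch of Gauss-Jordan inversion by row reduction; gauss_jordan_inverse is an illustrative name, and the routine assumes A is square and invertible:

```python
import numpy as np

def gauss_jordan_inverse(A):
    """Row-reduce the augmented matrix [A | I] to [I | A^{-1}]."""
    n = A.shape[0]
    M = np.hstack([A.astype(float), np.eye(n)])        # augmented matrix [A | I]
    for col in range(n):
        # Swap in the row with the largest pivot (a permutation row operation).
        pivot_row = col + np.argmax(np.abs(M[col:, col]))
        M[[col, pivot_row]] = M[[pivot_row, col]]
        # Scale the pivot row so the pivot becomes 1 (a diagonal row operation).
        M[col] /= M[col, col]
        # Clear the pivot column in every other row (elementary row operations).
        for row in range(n):
            if row != col:
                M[row] -= M[row, col] * M[col]
    return M[:, n:]                                     # the right half is A^{-1}

A = np.array([[2.0, 0.0],
              [1.0, 3.0]])
print(np.allclose(gauss_jordan_inverse(A) @ A, np.eye(2)))   # True
```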
Lower Triangular Matrix: \begin{pmatrix} \circ & 0 & 0 & 0\\ x & \circ & 0 & 0\\ x & x & \circ & 0\\ x & x & x & \circ\\ \end{pmatrix}
Upper Triangular Matrix: \begin{pmatrix} \circ & x & x & x\\ 0 & \circ & x & x\\ 0 & 0 & \circ & x\\ 0 & 0 & 0 & \circ\\ \end{pmatrix}
LU factorization: every invertible matrix A whose elimination requires no row exchanges can be written as the product of a lower and an upper triangular matrix, A = LU (in general PA = LU for some permutation matrix P)
E_{a, b} is the elementary matrix that uses row b to modify row a.
A = LU (equivalently L^{-1}A = U), where L^{-1} = E_k \cdots E_2E_1 is the product of the elimination matrices applied during forward elimination and U is upper triangular
Using LU to find x: to solve A \overrightarrow{x} = \overrightarrow{b}, first solve L\overrightarrow{y} = \overrightarrow{b} by forward substitution, then U\overrightarrow{x} = \overrightarrow{y} by back substitution; equivalently \overrightarrow{x} = U^{-1}(L^{-1}\overrightarrow{b})
LU factorization is not unique
To Find LU: run forward elimination on A; the eliminated matrix is U, and L records the multiplier used at each step (with 1s on its diagonal); see the sketch below.
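A minimal sketch of both steps, assuming no row exchanges are needed (the names lu_no_pivot and solve_lu are illustrative):

```python
import numpy as np

def lu_no_pivot(A):
    """Forward elimination without row exchanges: U is the eliminated matrix,
    L records the multipliers and has 1s on its diagonal."""
    n = A.shape[0]
    L = np.eye(n)
    U = A.astype(float).copy()
    for col in range(n):
        for row in range(col + 1, n):
            m = U[row, col] / U[col, col]   # multiplier that clears U[row, col]
            U[row] -= m * U[col]            # the row operation E_{row, col}
            L[row, col] = m                 # record it in L
    return L, U

def solve_lu(L, U, b):
    """Solve Ax = b given A = LU: forward substitution for Ly = b,
    then back substitution for Ux = y."""
    n = len(b)
    y = np.zeros(n)
    for i in range(n):
        y[i] = b[i] - L[i, :i] @ y[:i]      # L has a unit diagonal
    x = np.zeros(n)
    for i in reversed(range(n)):
        x[i] = (y[i] - U[i, i + 1:] @ x[i + 1:]) / U[i, i]
    return x

A = np.array([[2.0, 1.0],
              [4.0, 5.0]])
b = np.array([3.0, 6.0])
L, U = lu_no_pivot(A)
print(np.allclose(L @ U, A))                   # True: A = LU
print(np.allclose(A @ solve_lu(L, U, b), b))   # True
```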
Optimization using LU (operation counts):
finding L and U costs on the order of n^3 operations
computing L^{-1}\overrightarrow{b} (forward substitution) costs on the order of n^2
computing \overrightarrow{x} (back substitution) costs on the order of n^2
if the same A is reused with k different right-hand sides b, the total cost is about n^3 + 2kn^2, compared to kn^3 when each system is solved from scratch
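A sketch of this reuse pattern with SciPy's LU routines (the sizes n and k are arbitrary):

```python
import numpy as np
from scipy.linalg import lu_factor, lu_solve

rng = np.random.default_rng(0)
n, k = 200, 50
A  = rng.normal(size=(n, n))    # one coefficient matrix
Bs = rng.normal(size=(k, n))    # k different right-hand sides

lu, piv = lu_factor(A)          # the ~n^3 factorization, done once
xs = [lu_solve((lu, piv), b) for b in Bs]   # ~n^2 work per right-hand side

print(all(np.allclose(A @ x, b) for x, b in zip(xs, Bs)))   # True
```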