Lecture 004

Vectors

Planar Vector: \begin{bmatrix} x\\ y\\ \end{bmatrix} \in \mathbb{R}^2; n-Dim Vector: \begin{bmatrix} x_1\\ ...\\ x_n\\ \end{bmatrix} \in \mathbb{R}^n

Linear Combination: for v_1, ..., v_k \in \mathbb{R}^n and c_1, ..., c_k \in \mathbb{R}, the vector c_1v_1 + ... + c_kv_k \in \mathbb{R}^n is a linear combination of v_1, ..., v_k.
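
For example, in \mathbb{R}^2: 2\begin{bmatrix} 1\\ 0\\ \end{bmatrix} + 3\begin{bmatrix} 0\\ 1\\ \end{bmatrix} = \begin{bmatrix} 2\\ 3\\ \end{bmatrix} is the linear combination of the standard basis vectors with coefficients c_1 = 2, c_2 = 3.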

\overrightarrow{u} \cdot \overrightarrow{v} = \overrightarrow{u} \cdot \overrightarrow{w} \iff (\overrightarrow{v} - \overrightarrow{w}) \perp \overrightarrow{u}

A matrix times a column vector is a linear combination of the columns of the matrix, with the entries of the vector as the coefficients.
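
Concretely: \begin{bmatrix} a & b\\ c & d\\ \end{bmatrix} \begin{bmatrix} x\\ y\\ \end{bmatrix} = x\begin{bmatrix} a\\ c\\ \end{bmatrix} + y\begin{bmatrix} b\\ d\\ \end{bmatrix}, so the entries of the vector are the coefficients of the combination.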

Matrix

Let A \in \mathbb{R}^{M\times N} (denoted A = (a_{ij})_{M \times N}) and B \in \mathbb{R}^{N\times E}, then AB \in \mathbb{R}^{M \times E}.

Matrix Multiplication Identity: I is the identity for matrix multiplication: A I_N = A and I_M A = A for A \in \mathbb{R}^{M \times N}.

Matrix Multiplication

Matrix Multiplication Definition 1: dot

Definition 1: C = AB = (c_{ij}) where c_{ij} = (\text{i-th row of } A) \cdot (\text{j-th column of } B) = \sum_{l=1}^{N} a_{il} b_{lj}
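
For example: \begin{bmatrix} 1 & 2\\ 3 & 4\\ \end{bmatrix} \begin{bmatrix} 5 & 6\\ 7 & 8\\ \end{bmatrix} = \begin{bmatrix} 1 \cdot 5 + 2 \cdot 7 & 1 \cdot 6 + 2 \cdot 8\\ 3 \cdot 5 + 4 \cdot 7 & 3 \cdot 6 + 4 \cdot 8\\ \end{bmatrix} = \begin{bmatrix} 19 & 22\\ 43 & 50\\ \end{bmatrix}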

Matrix Multiplication Definition 2: Column = Matrix times Vector

Definition 2: j-th column of AB is A \cdot \text{ j-th column of }B

Matrix Multiplication Definition 3: Row = Vector times Matrix

Definition 3: let v be a row vector and A a matrix. vA is a linear combination of the ROWS of A: vA = v_1A[1,:] + v_2A[2,:] + ... + v_mA[m,:] (since (vA)_j = v \cdot A[:, j])

Corollary: the columns of AB are linear combinations of the columns of A

Corollary: the rows of AB are linear combinations of the rows of B (since (AB)[i,:] = A[i,:] \cdot B)
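
A quick numerical check of the column and row views above (a small sketch in numpy, which the lecture itself does not use):

import numpy as np

A = np.array([[1., 2.], [3., 4.]])
B = np.array([[5., 6.], [7., 8.]])
C = A @ B

# Definition 2: the j-th column of AB equals A times the j-th column of B.
assert np.allclose(C[:, 0], A @ B[:, 0])

# Definition 3: a row vector times A is a combination of the rows of A.
v = np.array([2., -1.])
assert np.allclose(v @ A, v[0] * A[0, :] + v[1] * A[1, :])

# Corollary: each row of AB equals (that row of A) times B.
assert np.allclose(C[1, :], A[1, :] @ B)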

Special Matrices

I_n: the n \times n identity matrix

Elementary Matrices

Elementary Matrices (E): matrices that describe a single row operation; obtained by applying that row operation to the identity matrix.
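
For example, the elementary matrix that adds 2 times row 1 to row 2 (obtained by applying this row operation to I_2) is E = \begin{pmatrix} 1 & 0\\ 2 & 1\\ \end{pmatrix}, and EA performs the same row operation on any matrix A with two rows.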

Diagonal Matrices

Diagonal Matrices (D): all off-diagonal entries are zero (only the diagonal can be non-zero); obtained from the identity matrix by scaling rows.
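
For example: \begin{pmatrix} 2 & 0\\ 0 & 3\\ \end{pmatrix} \begin{pmatrix} a & b\\ c & d\\ \end{pmatrix} = \begin{pmatrix} 2a & 2b\\ 3c & 3d\\ \end{pmatrix}, i.e. left-multiplying by D scales row i by d_i.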

Permutation Matrices

Permutation Matrices (P): row swaps of the identity matrix (every row and every column contains exactly one 1; all other entries are 0).
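
For example: \begin{pmatrix} 0 & 1\\ 1 & 0\\ \end{pmatrix} \begin{pmatrix} a & b\\ c & d\\ \end{pmatrix} = \begin{pmatrix} c & d\\ a & b\\ \end{pmatrix}, i.e. left-multiplying by this P swaps the two rows.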

Inverse Matrices

Inverse (square matrices only): let A, B \in \mathbb{R}^{n \times n}; if AB = BA = I, then B = A^{-1}.

Left Inverse: B is a left inverse of A iff BA = I. Right Inverse: B is a right inverse of A iff AB = I.

Invertible: a matrix is invertible if its inverse exists.

Uniqueness: if an inverse of a matrix exists, it is unique (if B and C are both inverses of A, then B = B(AC) = (BA)C = C).

Matrix inverse in a system: if Ax = b with A \in \mathbb{R}^{n \times n} invertible and b \in \mathbb{R}^n, then x = A^{-1}b is the unique solution.
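
A small sketch of this in numpy (numpy is not part of the lecture); in practice np.linalg.solve is preferred over forming A^{-1} explicitly:

import numpy as np

A = np.array([[2., 1.], [1., 3.]])
b = np.array([3., 5.])

x_via_inverse = np.linalg.inv(A) @ b   # x = A^{-1} b
x_via_solve = np.linalg.solve(A, b)    # solves Ax = b directly

assert np.allclose(x_via_inverse, x_via_solve)
assert np.allclose(A @ x_via_solve, b)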

Power of Matrix: A^n = A A \cdots A with n factors of A.

Transpose: the transpose of A \in \mathbb{R}^{n \times n} is A^\intercal, obtained by reflecting A across the main diagonal ((A^\intercal)_{ij} = a_{ji}).

Theorem: for A \in \mathbb{R}^{n \times n}, the following are equivalent:

Gauss Jordan Method: to compute A^{-1}, row-reduce the augmented matrix [A \mid I_n]; if A is invertible, this yields [I_n \mid A^{-1}].

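A minimal numpy sketch of this procedure (numpy and the name gauss_jordan_inverse are my own choices, not from the lecture), assuming A is square and invertible, with partial pivoting to avoid dividing by zero:

import numpy as np

def gauss_jordan_inverse(A):
    # Row-reduce the augmented matrix [A | I] until it becomes [I | A^{-1}].
    n = A.shape[0]
    M = np.hstack([A.astype(float), np.eye(n)])
    for col in range(n):
        # Partial pivoting: bring up the row with the largest entry in this column.
        pivot_row = col + np.argmax(np.abs(M[col:, col]))
        M[[col, pivot_row]] = M[[pivot_row, col]]
        # Scale the pivot row so the pivot entry becomes 1.
        M[col] = M[col] / M[col, col]
        # Eliminate this column from every other row.
        for row in range(n):
            if row != col:
                M[row] -= M[row, col] * M[col]
    return M[:, n:]

A = np.array([[2., 1.], [1., 3.]])
assert np.allclose(gauss_jordan_inverse(A) @ A, np.eye(2))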

Lower Triangular Matrix: \begin{pmatrix} \circ & 0 & 0 & 0\\ x & \circ & 0 & 0\\ x & x & \circ & 0\\ x & x & x & \circ\\ \end{pmatrix}

Upper Triangular Matrix: \begin{pmatrix} \circ & x & x & x\\ 0 & \circ & x & x\\ 0 & 0 & \circ & x\\ 0 & 0 & 0 & \circ\\ \end{pmatrix}

LU factorization: every invertible matrix A can be written as A = LU with L lower triangular and U upper triangular, provided elimination requires no row exchanges; in general PA = LU for some permutation matrix P.

Matrix Optimization using LU: to solve Ax = b, factor A = LU once, then solve Lc = b by forward substitution and Ux = c by back substitution; the same factorization can be reused for many right-hand sides b.
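
A short sketch of this workflow using scipy (scipy is not part of the lecture):

import numpy as np
from scipy.linalg import lu_factor, lu_solve

A = np.array([[2., 1.], [1., 3.]])
lu, piv = lu_factor(A)              # factor A once (with row pivoting)

# Reuse the same factorization for several right-hand sides.
for b in (np.array([3., 5.]), np.array([1., 0.])):
    x = lu_solve((lu, piv), b)      # forward substitution, then back substitution
    assert np.allclose(A @ x, b)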
