# Lecture 005

## Vector Space

Vector Space: a vector space is a set of vectors $V$ together with an addition and scalar multiplication operation, satisfying the following 10 axioms:

1. Closed Under Addition: $(\forall v, w \in V)(v + w \in V)$
2. Commutative Under Addition: $(\forall v, w \in V)(v + w = w + v)$
3. Associative Under Addition: $(\forall v, w, z \in V)(v + (w + z) = (v + w) + z)$
4. Exists Zero Vector: $(\exists 0 \in V)(\forall v \in V)(v + 0 = v)$
5. Additive Inverse Vector: $(\forall v \in V)(\exists w \in V)(v + w = 0)$
6. Closed Under Scalar Multiplication: $(\forall v \in V, c \in \mathbb{R})(cv \in V)$
• $\mathbb{Z}^2$ is not a vector space over $\mathbb{R}$ because it is not closed under scalar multiplication: e.g. $\frac{1}{2}\begin{pmatrix}1\\0\\\end{pmatrix} \notin \mathbb{Z}^2$
7. Vector Distributive Under Scalar Multiplication: $(\forall v,w \in V, c \in \mathbb{R})(c(w+v)=cw+cv)$
8. Scalar Distributive Under Scalar Multiplication: $(\forall v \in V, c,d \in \mathbb{R})((c+d)v = cv + dv)$
9. Associative Under Scalar Multiplication: $(\forall v \in V, c,d \in \mathbb{R})(c(dv) = (cd)v)$
10. Scalar Identity: $(\forall v \in V)(1v = v)$ (where $1 \in \mathbb{R}$ is the scalar one)

// QUESTION: why not closed under multiplication
// QUESTION: why not 0 * A = 0

Example: For $a, b, c, d, ... \in \mathbb{R}$

• $\mathbb{R}^{2 \times 2} = \{\begin{pmatrix} a & b\\ c & d\\ \end{pmatrix} \mid a, b, c, d \in \mathbb{R}\}$ is a vector space.

• The polynomials of degree at most $2$ form a vector space, with addition $(a_1x^2 + b_1x + c_1)+(a_2x^2 + b_2x + c_2)=(a_1+a_2)x^2 + (b_1+b_2)x + (c_1 + c_2)$

• and scalar multiplication $d(a_1x^2 + b_1x + c_1)=da_1x^2 + db_1x + dc_1$.

• $\forall n \in \mathbb{N}$, $\mathbb{R}^n$ is a vector space (assumed without proof)

Scalar Vector Theorem: For any $v \in V$ and scalar $c\in\mathbb{R}$

1. $0 \cdot v = 0$
2. $c \cdot 0 = 0$
3. $(-1)v = -v$
4. $c \cdot v = 0 \implies c = 0 \lor v = 0$

Subspace: A subset $W \subseteq V$ is a subspace if $W$ is itself a vector space under the operations of $V$

• $W = \{kv \mid k\in\mathbb{R}\}$ for a fixed $v \neq 0$ is a subspace (note: a line or plane must go through the origin to be a subspace)

• $W = \{\begin{pmatrix} x\\ y\\ z\\ \end{pmatrix} \mid ax + by + cz = 0\}$ is a subspace (note: a line or plane must go through the origin to be a subspace)
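The subspace conditions can be sanity-checked numerically; a minimal NumPy sketch where the plane coefficients are illustrative assumptions, not values from the lecture:

```python
import numpy as np

# Hypothetical plane x + 2y - z = 0 through the origin (coefficients assumed)
normal = np.array([1.0, 2.0, -1.0])

def in_W(v):
    """Membership test for W = {(x, y, z) : x + 2y - z = 0}."""
    return np.isclose(normal @ v, 0.0)

# Two vectors that lie in W
u = np.array([1.0, 0.0, 1.0])
v = np.array([0.0, 1.0, 2.0])

assert in_W(u) and in_W(v)
assert in_W(u + v)        # closed under addition
assert in_W(3.5 * u)      # closed under scalar multiplication
assert in_W(np.zeros(3))  # contains the zero vector
```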

Example:

• $V = \mathbb{R}^{3 \times 3}$, the space of all matrices $\begin{pmatrix} a & b & c\\ d & e & f\\ g & h & i\\ \end{pmatrix}$

• then the set of lower triangular matrices $W = \{\begin{pmatrix} x & 0 & 0\\ y & z & 0\\ w & p & q\\ \end{pmatrix}\}$ is a subspace of $V$

Subspace Theorem: If $V$ is a vector space, then $W$ is a subspace of $V$ iff

1. $W \subseteq V$
2. $W \neq \emptyset$
3. $W$ is closed under addition, and
4. $W$ is closed for scalar multiplication

Homogeneous Solution Subspace: let $A \in \mathbb{R}^{m \times n}$, then the set of solutions to $Ax = 0$, $\{x \in \mathbb{R}^n \mid Ax = 0\}$, is a subspace of $\mathbb{R}^n$

Nullspace: nullspace / kernel of $A \in \mathbb{R}^{m \times n}$ is $N(A) = \{x \in \mathbb{R}^n | Ax = 0\}$
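The nullspace can be computed numerically from the SVD: the right singular vectors whose singular values are zero span $N(A)$. A minimal NumPy sketch with an assumed example matrix:

```python
import numpy as np

def null_space(A, tol=1e-10):
    """Orthonormal basis of N(A) = {x : Ax = 0} from the SVD:
    right singular vectors past the numerical rank span the nullspace."""
    A = np.atleast_2d(A)
    _, s, Vt = np.linalg.svd(A)
    rank = int(np.sum(s > tol))
    return Vt[rank:].T  # columns span N(A)

A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0]])  # rank 1, so dim N(A) = 3 - 1 = 2
N = null_space(A)
print(N.shape)                # (3, 2)
print(np.allclose(A @ N, 0))  # True: every column solves Ax = 0
```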

Column Space: the set of all possible outputs of the transformation $x \mapsto Ax$; the space spanned by the column vectors

Trivial Subspace: A subspace $W \subseteq V$ is a trivial subspace of $V$ if $W = \{0\} \lor W = V$

Linear Combination: A linear combination of vectors $v_1, ..., v_n \in V$ ($V$ a vector space) is an expression of the form $c_1v_1+c_2v_2+...+c_nv_n$ for some scalars $c_1, ..., c_n \in \mathbb{R}$

Trivial Linear Combination: the linear combination in which every coefficient is zero ($c_1 = c_2 = ... = c_n = 0$)

Span: the span of $S \subseteq V$ ($V$ a vector space), denoted $\text{span}(S)$, is the set of all linear combinations of vectors in $S$

• $\text{span}(\{v_1, ..., v_n\}) = \{c_1v_1 + ... + c_nv_n \mid c_1, ..., c_n \in \mathbb{R}\}$

Span of a single vector: for $s \in S \subseteq V$, $\text{span}(\{s\}) = \{cs \mid c \in \mathbb{R}\}$.

Hanke's Conjecture of Subspace: Given a subspace $A \subseteq \mathbb{R}^{n}$ and $(n-1)$-many vectors $v_1, v_2, ..., v_{n-1} \in A$, the hyper-plane formed by those vectors is a non-trivial subspace.

• Span-Subspace Theorem: for any $v_1, ..., v_n \in V$ ($n \in \mathbb{N}$), $\text{span}(\{v_1, ..., v_n\})$ is a subspace of $V$.

Spanning Set: $S$ is a spanning set of $V$ if $V = \text{span}(S)$. ($S$ spans $V$)

Column Space: The column space of $A \in \mathbb{R}^{m \times n}$ is $C(A) = \text{span}(\{a_1, ..., a_n\})$, where $a_i$ is the $i$-th column of $A$ (the subspace of $\mathbb{R}^m$ spanned by the columns of $A$).

• $Ax = b$ has a solution iff $b \in C(A)$
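The membership test $b \in C(A)$ can be checked by comparing ranks: appending $b$ as an extra column does not increase the rank exactly when $b$ is a combination of the columns. A small sketch with an assumed example matrix:

```python
import numpy as np

def has_solution(A, b):
    """Ax = b is solvable iff b lies in C(A), i.e. appending b to the
    columns of A does not increase the rank."""
    return np.linalg.matrix_rank(A) == np.linalg.matrix_rank(np.column_stack([A, b]))

A = np.array([[1.0, 2.0],
              [2.0, 4.0]])  # columns are parallel: C(A) is a line
print(has_solution(A, np.array([3.0, 6.0])))  # True:  (3, 6) lies on that line
print(has_solution(A, np.array([1.0, 0.0])))  # False: (1, 0) does not
```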

Linear independence: a set of vectors is linearly independent iff $c_1v_1 + c_2v_2 + ... + c_nv_n = 0 \implies c_1 = c_2 = ... = c_n = 0$ (or $CV = 0 \implies C = \overrightarrow{0}$)

• Let $w \in \text{span}(V)$, then $\{w\} \cup V$ is linearly dependent. (Let $w \notin \text{span}(V)$, then $\{w\} \cup V$ is linearly independent.)

• If $V$ is linearly dependent, there exists $w \in V$ such that $w \in \text{span}(V \setminus \{w\})$

• $v_1, v_2, ..., v_n \in V$ are linearly independent iff $\forall w \in \text{span}(V)$, $w$ can be expressed as a unique linear combination of $V$.

• If $v_1, v_2, ..., v_n$ are linearly independent, but $v_1, v_2, ..., v_n, m$ are linearly dependent, then $m \in \text{span}(\{v_1, ..., v_n\})$

Let $A \in \mathbb{R}^{n \times n}$, then the following are equivalent:

• $A$ is invertible

• $N(A) = \{0\}$ (nullspace)

• columns of $A$ are linearly independent

• rows of $A$ are linearly independent

• $Ax = b$ has a unique solution for every $b \in \mathbb{R}^n$ ($A$ is Non-Singular Matrix)

• The only solution for $Ax = 0$ is $x = 0$

• Using row operations, $Ax = 0$ reduces to $Ix = 0$ (i.e. $\text{rref}(A) = I$)

• Using row operations, $Ax = b$ reduces to $Ix = \text{something}$ (the unique solution)

• $A$ is a product of elementary matrices
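A few of these equivalences can be sanity-checked numerically on a small assumed example:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 1.0]])  # small invertible example (det = 1)

# invertible <=> full rank
assert np.linalg.matrix_rank(A) == A.shape[0]

# the only solution of Ax = 0 is x = 0
assert np.allclose(np.linalg.solve(A, np.zeros(2)), np.zeros(2))

# Ax = b has a (unique) solution for any b; check one b
b = np.array([3.0, 2.0])
x = np.linalg.solve(A, b)
assert np.allclose(A @ x, b)
print(x)  # [1. 1.]
```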

Basis (inverse span): $B \subseteq V$ is a basis of $V$ if 1. the vectors in $B$ are linearly independent and 2. $\text{span}(B) = V$

• ($|B| = \text{dim}(V)$)

• $B$ is a basis of $V$ iff every $v \in V$ can be written as a unique linear combination of the vectors in $B$

• Any two bases of a vector space $V$ have the same number of vectors (the cardinality of any basis of $\mathbb{R}^n$ is $n$)

• Find Basis: choose one free variable to be $1$ and the others to be $0$, subject to the constraints. Repeat for each free variable. Alternatively, write the space in explicit form.

• Explicit Vector Space: write the space in a form where the number of free variables equals the dimension.

• A basis is a minimal spanning set and a maximal linearly independent set.

Rank: the maximum number of dimensions of the output after applying the transformation

Dimension: number of vectors in any of $V$'s basis

• $\text{dim}(\mathbb{R}^{n \times n}) = n^2$

• $\text{dim}(\{0\}) = 0$

Dimensional Theorem:

• Any linearly independent set of vectors can be extended to a basis by adding vectors to it (each added independent vector increases the dimension spanned).

• Any spanning set can be reduced to a basis by removing vectors from it.

Isomorphism: the vector space of $2 \times 2$ matrices is isomorphic to the space of $4$-dimensional vectors. (proof needed)

Four Fundamental Subspaces: Let $A \in \mathbb{R}^{m \times n}$

• columnspace: $C(A)$ is the space spanned by the columns of $A$

• rowspace: $RS(A)$ of $A$ is the space spanned by the rows of $A$

• nullspace: $N(A)$ of $A$ is the nullspace of $A$ ($Ax = 0$ where $x \in \mathbb{R}^n$)

• left nullspace: $N(A^T) = \{x | A^T x = 0\} = \{x | x^TA = 0\}$ where $x \in \mathbb{R}^m$

Column-Rank: the dimension of the column space of $A \in \mathbb{R}^{m \times n}$ (the maximum possible number of linearly independent columns of $A$)

Row-Rank: the dimension of $RS(A)$ (the maximum possible number of linearly independent rows of $A$)

• For any $A \in \mathbb{R}^{m \times n}$, $\text{row-rank}(A) = \text{column-rank}(A)$

• Rank: the common value (the column-rank), written $r_k(A)$, $\text{rank}(A)$, or $r(A)$

Gaussian Elimination: let $A \rightarrow U \rightarrow R$, then $RS(A) = RS(U) = RS(R)$

Rank Theorems:

1. $\text{row-rank}(A) = \text{column-rank}(A) = \text{num-pivots}(\text{rref}(A))$
2. Rank–nullity theorem: $\text{dim}(N(A)) + \text{rank}(A) = \text{num-column}(A)$
3. $\text{row-rank}(\text{rref}(A)) = \text{num-pivots}(\text{rref}(A)) = \text{num-non-zero-row}(\text{rref}(A)) = \text{column-rank}(\text{rref}(A))$ (proof: the pivot columns are linearly independent by definition)
4. $\text{row-rank}(A) = \text{row-rank}(\text{rref}(A))$
5. $RS(A) = RS(\text{rref}(A))$ (column space are not the same for $A$ and $rref(A)$, but the rank will not change)
6. $\text{column-rank}(A) = \text{column-rank}(\text{rref}(A))$
7. $N(A) = N(\text{rref}(A))$
8. $\text{dim}(N(A)) + \text{column-rank}(\text{rref}(A)) = \text{num-column}(A)$ (proof: construct the nullspace explicitly; number of free variables = number of columns − number of pivots)
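The rank–nullity theorem (theorem 2 above) can be verified numerically; the matrix below is an illustrative assumption:

```python
import numpy as np

def nullity(A, tol=1e-10):
    """dim N(A) via the SVD: columns minus the number of non-zero
    singular values (the numerical rank)."""
    s = np.linalg.svd(A, compute_uv=False)
    rank = int(np.sum(s > tol))
    return A.shape[1] - rank

A = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0],
              [5.0, 7.0, 9.0]])  # row 3 = row 1 + row 2, so rank 2

# dim N(A) + rank(A) == number of columns
assert nullity(A) + np.linalg.matrix_rank(A) == A.shape[1]  # 1 + 2 == 3
```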

Corollary:

1. $Ax = 0 \iff \text{rref}(A)x = 0$
2. $(\text{rref}(A)x = 0 \implies x = 0) \implies \text{the columns of } A \text{ are linearly independent}$ // QUESTION

Find $RS(A)$: pick the non-zero rows of $\text{rref}(A)$ (those beginning with a leading $1$)

Find $C(A)$: pick the pivot columns of $\text{rref}(A)$ and take the corresponding columns of the original $A$
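The recipe above can be sketched with SymPy's exact `rref` (assuming SymPy is available; the matrix is illustrative):

```python
import sympy as sp

A = sp.Matrix([[1, 2, 3],
               [2, 4, 6],
               [1, 1, 1]])

R, pivots = A.rref()  # R = rref(A), pivots = indices of pivot columns
print(pivots)         # (0, 1)

# RS(A): the non-zero rows of rref(A)
row_basis = [R.row(i) for i in range(R.rows) if any(x != 0 for x in R.row(i))]

# C(A): the pivot columns, read from the ORIGINAL A (not from rref(A))
col_basis = [A.col(j) for j in pivots]
```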

Extension of theorem of non-singular matrices: Let $A \in \mathbb{R}^{n \times n}$, then the following are equivalent

1. $A$ is invertible
2. $\text{rank}(A) = n$

Linear Transformation: a linear transformation is a function $T: V \rightarrow W$ where $V, W$ are vector spaces such that

1. $(\forall v, v' \in V)(T(v + v') = T(v) + T(v'))$
2. $(\forall v \in V, \alpha \in \mathbb{R})(T(\alpha v) = \alpha T(v))$

• To check whether $T$ is a linear transformation, it suffices to check $T(\alpha v + \beta w) = \alpha T(v) + \beta T(w)$

Simple Linear Transform:

• Scaling: $\begin{pmatrix} c & 0\\ 0 & c\\ \end{pmatrix}$

• Reflection (across the line $y = x$): $\begin{pmatrix} 0 & 1\\ 1 & 0\\ \end{pmatrix}$

• Rotation by $90°$: $\begin{pmatrix} 0 & -1\\ 1 & 0\\ \end{pmatrix}$

• Projection onto the x-axis: $\begin{pmatrix} 1 & 0\\ 0 & 0\\ \end{pmatrix}$

• Projection onto the x-axis that reduces dimension ($T: \mathbb{R}^2 \rightarrow \mathbb{R}$): $\begin{pmatrix} 1 & 0\\ \end{pmatrix}$

• Rotation: $\begin{pmatrix} \cos \theta & -\sin \theta\\ \sin \theta & \cos \theta\\ \end{pmatrix}$

(you have to check if it fits the definition of linear transformation)
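The transforms above act by ordinary matrix multiplication; a minimal NumPy sketch applying a few of them:

```python
import numpy as np

v = np.array([1.0, 0.0])  # unit vector along the x-axis

theta = np.pi / 2
rotate = np.array([[np.cos(theta), -np.sin(theta)],
                   [np.sin(theta),  np.cos(theta)]])  # rotation by 90 degrees
reflect = np.array([[0.0, 1.0],
                    [1.0, 0.0]])                      # reflection across y = x
project_x = np.array([[1.0, 0.0],
                      [0.0, 0.0]])                    # projection onto x-axis

print(np.round(rotate @ v))              # [0. 1.]
print(reflect @ np.array([2.0, 5.0]))    # [5. 2.]  (swaps x and y)
print(project_x @ np.array([3.0, 4.0]))  # [3. 0.]  (drops the y component)
```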

Matrix is Transform Theorem: for any $A \in \mathbb{R}^{m \times n}$, the map $T_A : \mathbb{R}^n \rightarrow \mathbb{R}^m$, $T_A(x) = Ax$, is a linear transformation.

Weak Transform Basis Theorem: If $T: V \rightarrow W$ is a linear transformation and $B = \{v_1, ..., v_n\} \subseteq V$ is a basis for $V$, and $T(v_1), ... T(v_n)$ are determined such that $T(v_1) = w_1, ..., T(v_n) = w_n$, then $T$ is determined on the whole $V$.

Change Basis Transform Theorem: Let $\{v_1, ..., v_n\}$ be a basis of $V$, and take arbitrary $v = c_1v_1 + ... + c_nv_n$ and $w_1, ..., w_n \in W$; then $T(v) = c_1w_1 + ... + c_nw_n$ is a linear transformation.

Linear Transformation and Matrix:

• every matrix $A \in \mathbb{R}^{m \times n}$ describes a linear transformation $T : \mathbb{R}^n \rightarrow \mathbb{R}^m$.

• every linear transformation $T : \mathbb{R}^n \rightarrow \mathbb{R}^m$ can be described as a matrix multiplication with $A \in \mathbb{R}^{m \times n}$

Coordinate System:

• given vector space $V$

• given $V$'s basis $B(V) = \{v_1, ..., v_n\}$

• then coordinate system can be represented by matrix $B = \begin{pmatrix} v_1 & ... & v_n\\ \end{pmatrix}$

• the coordinate vector $C$ of a vector $v$ satisfies $BC = v$ (the actual numerical vector)

Find Transformation:

• given $B = \{\begin{pmatrix}2\\3\\\end{pmatrix}, \begin{pmatrix}1\\1\\\end{pmatrix}\}, C = \{\begin{pmatrix}1\\1\\2\\\end{pmatrix}, \begin{pmatrix}2\\2\\3\\\end{pmatrix}, \begin{pmatrix}3\\2\\1\\\end{pmatrix}\}$ and $\begin{cases}T(B_1) = 2C_1 + C_2 - C_3\\T(B_2) = C_1 - C_2 + 3C_3\\\end{cases}$

• to use $T$ in normal coordinate:

1. write normal coordinate system as B coordinate system: $\begin{pmatrix}1\\2\\\end{pmatrix} = \begin{pmatrix}2\\3\\\end{pmatrix} - \begin{pmatrix}1\\1\\\end{pmatrix} = B_1 - B_2 = \begin{pmatrix}1\\-1\\\end{pmatrix}_B$
2. write $T: B \rightarrow C$ as $T = \begin{pmatrix}2 & 1\\1 & -1\\-1 & 3\\\end{pmatrix}$ by applying transfer using unit vectors in $B$
3. compute $T(\begin{pmatrix}1\\-1\\\end{pmatrix}_B) = \begin{pmatrix}2 & 1\\1 & -1\\-1 & 3\\\end{pmatrix}\begin{pmatrix}1\\-1\\\end{pmatrix}$, giving the result in $C$-coordinates
• to find $T$ for normal coordinate: do the same thing for unit vectors for normal coordinate system

Change of Basis:

• given basis $B = \{v_1, ..., v_n\}, C = \{w_1, ..., w_n\}$

• Change of basis matrix $P_{B \rightarrow C} = \begin{pmatrix}(v_1)_C & (v_2)_C & ... & (v_n)_C\\\end{pmatrix}$

• $P_{B \rightarrow C}P_{C \rightarrow B} = I$, and a change of basis matrix is always invertible (the two bases have the same size).
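A sketch of computing $P_{B \rightarrow C}$ numerically: each column $(v_i)_C$ solves $Wc = v_i$, where $W$ holds the $C$-basis vectors as columns. The bases below are illustrative assumptions:

```python
import numpy as np

# Hypothetical bases of R^2, stored as columns (values assumed for illustration)
V = np.column_stack([[2.0, 3.0], [1.0, 1.0]])  # B = {v1, v2}
W = np.column_stack([[1.0, 0.0], [1.0, 1.0]])  # C = {w1, w2}

# (v_i)_C solves W c = v_i, so P_{B->C} = W^{-1} V, column by column
P_B_to_C = np.linalg.solve(W, V)
P_C_to_B = np.linalg.solve(V, W)

# The two change-of-basis matrices are inverses of each other
print(np.allclose(P_B_to_C @ P_C_to_B, np.eye(2)))  # True
```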

Composing Linear Transformation:

• If $T_A : \mathbb{R}^a \rightarrow \mathbb{R}^b$ (with $A \in \mathbb{R}^{b \times a}$) and $T_B : \mathbb{R}^b \rightarrow \mathbb{R}^c$ (with $B \in \mathbb{R}^{c \times b}$), then $T_{B} \circ T_{A} = T_{BA}$.

Inverse of Transformation: The inverse of an invertible linear transformation $T: V \rightarrow W$ is the map $T^{-1}: W \rightarrow V$ with $T^{-1}(T(v)) = v$.

• Transformation is invertible iff matrix is invertible

• Projections are generally not invertible

• Projection onto a line not through the origin is not a linear transformation

Projection onto the line at angle $\theta$: $\begin{pmatrix} \cos^2 \theta & \sin \theta \cos \theta \\ \sin \theta \cos \theta & \sin^2 \theta \\ \end{pmatrix}$
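This closed form agrees with the rank-one projection formula $P = \frac{aa^T}{\|a\|^2}$ for the unit vector $a = (\cos\theta, \sin\theta)$; a quick numerical check (the angle is an arbitrary illustrative choice):

```python
import numpy as np

theta = np.pi / 6
a = np.array([np.cos(theta), np.sin(theta)])  # unit vector along the line

# Rank-one projection matrix P = a a^T / (a . a)
P = np.outer(a, a) / (a @ a)

closed_form = np.array([[np.cos(theta)**2,              np.sin(theta) * np.cos(theta)],
                        [np.sin(theta) * np.cos(theta), np.sin(theta)**2]])

print(np.allclose(P, closed_form))  # True
print(np.allclose(P @ P, P))        # True: projecting twice changes nothing
```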

Kernel: $\text{Ker}(T) = \{v | T(v) = 0\}$

• $\text{Ker}(T_A) = N(A)$ if $T_A$ is given by a matrix $A$

• $\text{Ker}(T)$ is a subspace of the domain $V$ of $T: V \rightarrow W$

Image: $\text{Img}(T) = \{w \in W | (\exists v \in V)(T(v) = w)\}$

• $\text{Img}(T_A) = C(A)$ if $T_A$ is given by a matrix $A$

• $\text{Img}(T)$ is a subspace of the codomain $W$ of $T: V \rightarrow W$

• for an orthogonal projection onto a plane, the kernel (a line) is orthogonal to the image (the plane), and both pass through the origin

Theorem: $\text{dim}(\text{Img}(T_A)) + \text{dim}(\text{Ker}(T_A)) = n$ (where n is the dimension of starting space)

Injective: $T(v) = T(w) \implies v = w$

• rotation and scaling (by $c \neq 0$) are injective

• projection is not injective

• Increasing dimension (e.g. $T: \mathbb{R}^2 \rightarrow \mathbb{R}^3$) can be injective, but not surjective

Surjective: a transformation $T: V \rightarrow W$ is surjective if $(\forall w \in W)(\exists v \in V)(T(v) = w)$

• rotation and scaling (by $c \neq 0$) are surjective

• projection that keeps the same dimension ($T: \mathbb{R}^3 \rightarrow \mathbb{R}^3$) is not surjective

• setting the last component to $0$ cannot reach $(x, y, 1)$
• projection that reduces dimension ($T: \mathbb{R}^3 \rightarrow \mathbb{R}^2$) is surjective, but not injective

| $T^{m \times n} : \mathbb{R}^n \rightarrow \mathbb{R}^m$ | injective | not injective |
|---|---|---|
| surjective | $\text{rank}(T) = n = m$ | $\text{rank}(T) = m$ ($m < n$) |
| not surjective | $\text{rank}(T) = n$ ($m > n$) | $\text{rank}(T) < m, n$ |

Orthogonality:

• $x \perp y \iff x \cdot y = \|x\| \cdot \|y\| \cdot \cos \theta = 0$

• orthogonal set: $(\forall v_i, v_j \in V)(i \neq j \implies v_i \cdot v_j = 0)$

• orthogonal basis of $V$: an orthogonal set that is also a basis of $V$

• orthogonal, but not basis: space too small
• basis, but not orthogonal: bad, uncomfortable basis

Orthonormal Basis: an orthogonal basis in which every vector is a unit vector.

• Any set $\{v_1, ..., v_n\} \subseteq \mathbb{R}^n$ of orthogonal vectors (with $v_i \neq 0$) is linearly independent.

• $v \perp W$: $v \in \mathbb{R}^n$ is orthogonal to subspace $W \subseteq \mathbb{R}^n$ if $(\forall w \in W)(v \perp w)$

• $V \perp W$: $V \subseteq \mathbb{R}^n$ is orthogonal to subspace $W \subseteq \mathbb{R}^n$ if $(\forall v \in V)(\forall w \in W)(v \perp w)$

• (the x-y-plane and the y-z-plane are not orthogonal because their intersection is larger than $\{\overrightarrow{0}\}$; orthogonal subspaces intersect only in $\{\overrightarrow{0}\}$, but the converse is not true)

• If $V = \text{span}(\{v_1, ..., v_m\})$, $W = \text{span}(\{w_1, ..., w_n\})$ and $(\forall 1 \leq i \leq m, 1 \leq j \leq n)(v_i \perp w_j)$, then $V \perp W$ (proof: $v \cdot w = (\sum_{i=1}^m \alpha_i v_i) \cdot (\sum_{j=1}^n \beta_j w_j) = \sum_{1 \leq i \leq m, 1 \leq j \leq n} \alpha_i \beta_j (v_i \cdot w_j) = 0$)

Orthogonal Complement: the orthogonal complement of a subspace $V \subseteq \mathbb{R}^n$ is $V^\perp = \{w \in \mathbb{R}^n \mid (\forall v \in V)(w \perp v)\}$

• the dimensions of a subspace and its complement sum to $n$: $\text{dim}(V) + \text{dim}(V^\perp) = n$

• $V^\perp = W \iff W^\perp = V$

Theorem: For any matrix $A \in \mathbb{R}^{m \times n}$

• $RS(A)^\perp = N(A)$

• $C(A)^\perp = \text{left-nullspace}(A)$

• Lemma: $N(A) \perp RS(A)$
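The identity $RS(A)^\perp = N(A)$ can be checked numerically: every row of $A$ (hence every vector in $RS(A)$) should be orthogonal to every nullspace vector. The matrix is an illustrative assumption:

```python
import numpy as np

A = np.array([[1.0, 2.0, 3.0],
              [0.0, 1.0, 1.0]])  # rank 2, so dim N(A) = 1

# Nullspace basis from the SVD: right singular vectors past the rank
_, s, Vt = np.linalg.svd(A)
rank = int(np.sum(s > 1e-10))
N = Vt[rank:].T  # columns span N(A)

# Each row of A dotted with each nullspace vector is zero: RS(A) ⊥ N(A)
print(N.shape)                # (3, 1)
print(np.allclose(A @ N, 0))  # True
```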

Finding Orthogonal Complement:

1. Find a basis of $V$
2. Find a set $W$ of linearly independent vectors that are perpendicular to every $v \in V$
3. $\text{span}(W) = V^\perp$

Projection to Line: the projection of $b$ onto $a$ is $T(b) = \frac{a \cdot b}{a \cdot a} a$

• projection matrix: $P = \frac{a a^T}{a^T a} = \frac{a a^T}{\|a\|^2}$

Projection to plane: let $\{b_1, b_2\}$ be an orthogonal basis of the plane. $T(v) = \frac{b_1 \cdot v}{b_1 \cdot b_1} b_1 + \frac{b_2 \cdot v}{b_2 \cdot b_2} b_2$

• in general $T(v) = \frac{b_1 \cdot v}{b_1 \cdot b_1} b_1 + \frac{b_2 \cdot v}{b_2 \cdot b_2} b_2 + ... + \frac{b_k \cdot v}{b_k \cdot b_k} b_k$

• in general, for matrices: $P_W = P_{b_1} + P_{b_2} + ... + P_{b_k}$ (for an orthogonal basis $\{b_1, ..., b_k\}$ of $W$)

Gram-Schmidt Theorem: any subspace of $\mathbb{R}^n$ has an orthogonal basis.

Gram-Schmidt process

1. take the next original basis vector $v$
2. add a new vector $b$ to the collection (containing $b_1, ..., b_n$) that is perpendicular to $b_1, ..., b_n$, so that the collection spans $\text{span}(\{b_1, ..., b_n, v\})$
• For the first step: $b_2 = v_2 - \text{proj}_{b_1}(v_2) = v_2 - \frac{b_1 \cdot v_2}{b_1 \cdot b_1} b_1 \perp b_1$
• For later steps: $b = v - \text{proj}_{\text{span}(b_1, ..., b_n)}(v) = v - \frac{b_1 \cdot v}{b_1 \cdot b_1}b_1 - \frac{b_2 \cdot v}{b_2 \cdot b_2}b_2 - ... - \frac{b_n \cdot v}{b_n \cdot b_n}b_n$
3. repeat from step 1 until every original basis vector has been processed
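The steps above can be sketched as a short function (classical Gram-Schmidt; the input vectors are illustrative assumptions):

```python
import numpy as np

def gram_schmidt(vectors):
    """Classical Gram-Schmidt: subtract from each v its projection onto
    every orthogonal basis vector found so far, and keep what is left."""
    basis = []
    for v in vectors:
        b = v.astype(float)          # working copy
        for q in basis:
            b = b - (q @ v) / (q @ q) * q  # subtract proj_q(v)
        if not np.allclose(b, 0):    # skip linearly dependent vectors
            basis.append(b)
    return basis

vs = [np.array([1.0, 1.0, 0.0]),
      np.array([1.0, 0.0, 1.0])]
b1, b2 = gram_schmidt(vs)
print(np.isclose(b1 @ b2, 0))  # True: the output vectors are orthogonal
```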
