Lecture 005

Vector Space

Vector Space: a vector space is a set of vectors V together with an addition operation and a scalar multiplication operation, satisfying the following 10 axioms:

  1. Closed Under Addition: (\forall v, w \in V)(v + w \in V)
  2. Commutative Under Addition: (\forall v, w \in V)(v + w = w + v)
  3. Associative Under Addition: (\forall v, w, z \in V)(v + (w + z) = (v + w) + z)
  4. Exists Zero Vector: (\exists 0 \in V)(\forall v \in V)(v + 0 = v)
  5. Additive Inverse Vector: (\forall v \in V)(\exists w \in V)(v + w = 0)
  6. Closed Under Scalar Multiplication: (\forall v \in V, c \in \mathbb{R})(cv \in V)
    • \mathbb{Z}^2 is not a vector space because it is not closed under scalar multiplication: there exist z \in \mathbb{Z}^2 and c \in \mathbb{R} with cz \notin \mathbb{Z}^2, e.g. c = \frac{1}{2} and z = (1, 1) (see the sketch after this list)
  7. Vector Distributive Under Scalar Multiplication: (\forall v,w \in V, c \in \mathbb{R})(c(w+v)=cw+cv)
  8. Scalar Distributive Under Scalar Multiplication: (\forall v \in V, c,d \in \mathbb{R})((c+d)v = cv + dv)
  9. Associative Under Scalar Multiplication: (\forall v \in V, c,d \in \mathbb{R})(c(dv) = (cd)v)
  10. Scalar Identity: (\forall v \in V)(1v = v), where 1 is the scalar one
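
A minimal numeric sketch (not a proof) of the closure axioms, using plain Python tuples as vectors; the matrices and values are made up for illustration, and it also shows the \mathbb{Z}^2 counterexample from the bullet under axiom 6:

```python
# Sketch: spot-check closure (axioms 1 and 6) for R^2, and show why Z^2
# fails closure under real scalar multiplication. Vectors are plain tuples.
def add(v, w):
    return (v[0] + w[0], v[1] + w[1])

def scale(c, v):
    return (c * v[0], c * v[1])

v, w = (1.0, 2.0), (3.0, -4.0)
print(add(v, w))        # (4.0, -2.0)  -- still in R^2
print(scale(0.5, v))    # (0.5, 1.0)   -- still in R^2

# Z^2 is not closed under scalar multiplication (axiom 6):
z = (1, 1)
print(scale(0.5, z))    # (0.5, 0.5)   -- not in Z^2
```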

// QUESTION: why not closed under multiplication
// QUESTION: why not 0 * A = 0

Example: For a, b, c, d, ... \in \mathbb{R}

Scalar Vector Theorem: For any v \in V and scalar c\in\mathbb{R}

  1. 0 \cdot v = 0
  2. c \cdot 0 = 0
  3. (-1)v = -v
  4. c \cdot v = 0 \implies c = 0 \lor v = 0
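
A short derivation of item 1 from the axioms (items 2 and 3 follow by similar arguments); this is a sketch of the standard argument:

  • 0 \cdot v = (0 + 0) \cdot v = 0 \cdot v + 0 \cdot v (scalar distributivity, axiom 8)
  • adding the additive inverse -(0 \cdot v) to both sides and using axioms 3, 4, 5 gives 0 = 0 \cdot v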

Subspace: A subset W \subseteq V is a subspace of V if W is itself a vector space under the operations of V

Example:

Subspace Theorem: If V is a vector space, then W is a subspace of V iff

  1. W \subseteq V
  2. W \neq \emptyset
  3. W is closed under addition, and
  4. W is closed under scalar multiplication

Homogeneous Solution Subspace: let A \in \mathbb{R}^{m \times n}; then the set of solutions to Ax = 0, \{x \in \mathbb{R}^n | Ax = 0\}, is a subspace of \mathbb{R}^n (a sketch of why follows)
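
A sketch of why the solution set passes the Subspace Theorem above, using linearity of matrix-vector multiplication:

  • A0 = 0, so the zero vector is in the set and the set is non-empty
  • Ax = 0 and Ay = 0 imply A(x + y) = Ax + Ay = 0 (closed under addition)
  • Ax = 0 and c \in \mathbb{R} imply A(cx) = c(Ax) = 0 (closed under scalar multiplication)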

Nullspace: nullspace / kernel of A \in \mathbb{R}^{m \times n} is N(A) = \{x \in \mathbb{R}^n | Ax = 0\}

Column Space: the set of all possible outputs of the transformation x \mapsto Ax; the space in which the column vectors of A live

Trivial Subspace: A subspace W \subseteq V is a trivial subspace of V if W = \{0\} \lor W = V

Linear Combination: A linear combination of a set of vectors V' = \{v_1, ..., v_k\} \subseteq V \subseteq \mathbb{R}^n (V is a vector space) is an expression of the form c_1v_1 + c_2v_2 + ... + c_kv_k for some scalars c_1, ..., c_k \in \mathbb{R} (in shorthand, C \cdot V' with coefficient vector C)

Trivial Linear Combination: the linear combination in which every coefficient is zero, i.e. (\forall c \in C)(c = 0)

Span: the span of S \subseteq V \subseteq \mathbb{R}^n (V is a vector space), denoted \text{span}(S), is the set of all linear combinations of vectors in S

Span of a single vector: the span of s \in S for S \subseteq V is \text{span}(s) = \{cs | c \in \mathbb{R}\}.

Hanke's Conjecture of Subspace: given a subspace A \subseteq \mathbb{R}^n and (n-1)-many vectors v_1, v_2, ..., v_{n-1} \in A, the hyperplane spanned by those vectors is a non-trivial subspace.

Spanning Set: S is a spanning set of V if V = \text{span}(S). (S spans V)

Column Space: The column space of A \in \mathbb{R}^{m \times n} is C(A) = \text{span}(\{A^T[i] | 1 \leq i \leq n\}) (the subspace of \mathbb{R}^m spanned by the columns of A).

Linear independence: a set of vectors is linearly independent iff c_1v_1 + c_2v_2 + ... + c_nv_n = 0 \implies c_1 = c_2 = ... = c_n = 0 (or CV = 0 \implies C = \overrightarrow{0})
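
A quick numeric spot-check of this condition (a sketch, not a proof): the vectors are independent exactly when the matrix C with them as columns has rank equal to the number of vectors. The example vectors are made up for illustration:

```python
# Sketch: check linear independence via the rank of the column matrix C
# (Cx = 0 has only the trivial solution iff rank(C) == number of vectors).
import numpy as np

def looks_independent(vectors):
    C = np.column_stack(vectors)
    return np.linalg.matrix_rank(C) == len(vectors)

print(looks_independent([[1, 0, 0], [0, 1, 0]]))   # True
print(looks_independent([[1, 2, 3], [2, 4, 6]]))   # False: second = 2 * first
```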

Let A \in \mathbb{R}^{n \times n}; then the following are equivalent:

Basis (inverse span): B \subseteq V is a basis of V if 1. the vectors in B are linearly independent and 2. \text{span}(B) = V

Rank: the maximum number of dimensions the output can have after applying the transformation (the dimension of the image)

Dimension: the number of vectors in any basis of V

Dimensional Theorem: every basis of a vector space V has the same number of vectors, so \text{dim}(V) is well-defined.

Isomorphism: the vector space of 2 \times 2 matrices is the same as (isomorphic to) the space of 4-dimensional vectors \mathbb{R}^4. (proof needed; a sketch follows)
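
A sketch of the missing proof: the map below is linear and bijective, which is exactly what an isomorphism requires.

\varphi\begin{pmatrix} a & b \\ c & d \end{pmatrix} = (a, b, c, d), \quad \text{with} \quad \varphi(M + M') = \varphi(M) + \varphi(M') \quad \text{and} \quad \varphi(cM) = c\,\varphi(M)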

Four Fundamental Subspaces (where R(\cdot) stands for range, not row space, and A^+ is the pseudo-inverse of A). Intuition: A^T undoes the rotation of A (as A^{-1} does) but keeps the stretch the same. For more info: Visual Explanation of Four Fundamental Subspace

Four Fundamental Subspaces: Let A \in \mathbb{R}^{m \times n}. The four subspaces are the column space C(A) \subseteq \mathbb{R}^m, the nullspace N(A) \subseteq \mathbb{R}^n, the row space RS(A) = C(A^T) \subseteq \mathbb{R}^n, and the left nullspace N(A^T) \subseteq \mathbb{R}^m.

Column-Rank: the dimension of the column space of A \in \mathbb{R}^{m \times n} (the maximum number of linearly independent columns of A)

Row-Rank: the dimension of RS(A) (the maximum number of linearly independent rows of A)

Gaussian Elimination: let A \rightarrow U \rightarrow R (U a row echelon form, R the reduced row echelon form); then RS(A) = RS(U) = RS(R)

rref and ref

Rank Theorems:

  1. \text{row-rank}(A) = \text{column-rank}(A) = \text{num-pivots}(\text{rref}(A))
  2. Rank–nullity theorem: \text{dim}(N(A)) + \text{rank}(A) = \text{num-column}(A)
  3. \text{row-rank}(\text{rref}(A)) = \text{num-pivots}(\text{rref}(A)) = \text{num-non-zero-row}(\text{rref}(A)) = \text{column-rank}(\text{rref}(A)) (proof: the pivot columns of \text{rref}(A) are standard basis vectors, hence linearly independent)
  4. \text{row-rank}(A) = \text{row-rank}(\text{rref}(A))
  5. RS(A) = RS(\text{rref}(A)) (the column spaces of A and \text{rref}(A) are generally not the same, but the column rank does not change)
  6. \text{column-rank}(A) = \text{column-rank}(\text{rref}(A))
  7. N(A) = N(\text{rref}(A))
  8. \text{dim}(N(A)) + \text{column-rank}(\text{rref}(A)) = \text{num-column}(A) (proof: construct a basis of the null space from the free variables; number of free variables = number of columns - number of pivots)
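
A concrete check of item 2 (the rank–nullity theorem) on one small made-up matrix, using SymPy for exact arithmetic; a sketch, not a proof:

```python
# Verify rank + nullity == number of columns for one concrete matrix.
from sympy import Matrix

A = Matrix([[1, 2, 3],
            [2, 4, 6],
            [1, 0, 1]])

rank = A.rank()                 # column-rank = row-rank
nullity = len(A.nullspace())    # dim(N(A))
print(rank, nullity, A.cols)    # 2 1 3
assert rank + nullity == A.cols
```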

Corollary:

  1. Ax = 0 \iff rref(A)x = 0
  2. (rref(A)x = 0 \implies x = 0) \implies \text{the columns of } A \text{ are linearly independent} // QUESTION

Find RS(A): take the non-zero rows of \text{rref}(A) (each begins with a leading 1).

Find C(A): take the pivot columns of \text{rref}(A) and look up the corresponding columns in A. (Both procedures are sketched below.)
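
A sketch of both procedures with SymPy (the matrix is made up for illustration): rref(A) reports the pivot column indices; the non-zero rows of rref(A) give a basis of RS(A), and the pivot columns read from the original A give a basis of C(A).

```python
from sympy import Matrix

A = Matrix([[1, 2, 1],
            [2, 4, 0],
            [3, 6, 1]])

R, pivot_cols = A.rref()                                          # reduced row echelon form
rs_basis = [R.row(i) for i in range(R.rows) if any(R.row(i))]     # non-zero rows of rref(A)
cs_basis = [A.col(j) for j in pivot_cols]                         # pivot columns, read from A itself
print(rs_basis)    # [Matrix([[1, 2, 0]]), Matrix([[0, 0, 1]])]
print(cs_basis)    # [Matrix([[1], [2], [3]]), Matrix([[1], [0], [1]])]
```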

Extension of the theorem of non-singular matrices: Let A \in \mathbb{R}^{n \times n}; then the following are equivalent

  1. A is invertible
  2. \text{rank}(A) = n

Linear Transformation: a linear transformation is a function T: V \rightarrow W where V, W are vector spaces such that

  1. (\forall v, v' \in V)(T(v + v') = T(v) + T(v'))
  2. (\forall v \in V, \alpha \in \mathbb{R})(T(\alpha v) = \alpha T(v))

  3. To check if T is a linear transformation, check T(\alpha v + \beta w) = \alpha T(v) + \beta T(w)
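
A small numeric spot-check of the condition in item 3 (random samples, so this is evidence rather than a proof); the matrix A and the two test maps are made up for illustration:

```python
# Spot-check T(a*v + b*w) == a*T(v) + b*T(w) on random vectors and scalars.
import numpy as np

rng = np.random.default_rng(0)

def looks_linear(T, n, trials=100):
    for _ in range(trials):
        v, w = rng.standard_normal(n), rng.standard_normal(n)
        a, b = rng.standard_normal(2)
        if not np.allclose(T(a * v + b * w), a * T(v) + b * T(w)):
            return False
    return True

A = np.array([[1.0, 2.0], [0.0, 1.0]])
print(looks_linear(lambda x: A @ x, 2))     # True:  x -> Ax is linear
print(looks_linear(lambda x: x + 1.0, 2))   # False: translation is not linear
```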

Simple Linear Transform:

(you have to check if it fits the definition of linear transformation)

Matrix is Transform Theorem: for any A \in \mathbb{R}^{n \times n}, the map T_A : \mathbb{R}^n \rightarrow \mathbb{R}^n defined by T_A(x) = Ax is a linear transformation.

Weak Transform Basis Theorem: if T: V \rightarrow W is a linear transformation and B = \{v_1, ..., v_n\} \subseteq V is a basis for V, then the values T(v_1) = w_1, ..., T(v_n) = w_n determine T on the whole of V.

Change Basis Transform Theorem: let \{v_1, ..., v_n\} be a basis of V and w_1, ..., w_n \in W; then, writing an arbitrary v as v = c_1v_1 + ... + c_nv_n, the map T(v) = c_1w_1 + ... + c_nw_n is a linear transformation.

Linear Transformation and Matrix:

Coordinate System:

Find Transformation:

Change of Basis:

Composing Linear Transformation:

Inverse of Transformation: the inverse of a bijective linear transformation T: V \rightarrow W is the map T^{-1}: W \rightarrow V with T^{-1}(T(v)) = v for all v \in V and T(T^{-1}(w)) = w for all w \in W.

Projection onto the line at angle \theta: \begin{pmatrix} \cos^2 \theta & \sin \theta \cos \theta \\ \sin \theta \cos \theta & \sin^2 \theta \\ \end{pmatrix}
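
One way to see where this matrix comes from (a sketch): take the unit vector u = (\cos \theta, \sin \theta) along the line and apply the projection-to-line formula below with a = u, so the denominator a \cdot a = 1.

T(b) = (u \cdot b)\, u = (u u^T)\, b, \quad \text{and} \quad u u^T = \begin{pmatrix} \cos^2 \theta & \sin \theta \cos \theta \\ \sin \theta \cos \theta & \sin^2 \theta \end{pmatrix}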

Kernel: \text{Ker}(T) = \{v | T(v) = 0\}

Image: \text{Img}(T) = \{w \in W | (\exists v \in V)(T(v) = w)\}

Theorem: \text{dim}(\text{Img}(T_A)) + \text{dim}(\text{Ker}(T_A)) = n (where n is the dimension of starting space)

Injective: a transformation T is injective if T(v) = T(w) \implies v = w

Surjective: a transformation T: V \rightarrow W is surjective if (\forall w \in W)(\exists v \in V)(T(v) = w)

For T : \mathbb{R}^n \rightarrow \mathbb{R}^m given by an m \times n matrix:

                   injective                 not injective
  surjective       rank(T) = n (= m)         rank(T) = m (m < n)
  not surjective   rank(T) = n (m > n)       rank(T) < m, n

Orthogonality: vectors v, w \in \mathbb{R}^n are orthogonal (v \perp w) if v \cdot w = 0; an orthogonal basis is a basis whose vectors are pairwise orthogonal.

Orthonormal Basis: an orthogonal basis in which every vector is a unit vector.

Orthogonal Complement: the orthogonal complement of a subspace V \subseteq \mathbb{R}^n is the subspace V^\perp = \{w \in \mathbb{R}^n | (\forall v \in V)(w \perp v)\}

Theorem: For any matrix A \in \mathbb{R}^{m \times n}, RS(A)^\perp = N(A) and C(A)^\perp = N(A^T)

Finding Orthogonal Complement:

  1. Find a basis of V
  2. Find a set of linearly independent vectors W that are each perpendicular to every v \in V
  3. \text{span}(W) = V^\perp
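
An equivalent computation, sketched with SymPy: stack a basis of V as the rows of a matrix B; then V^\perp = N(B), because Bw = 0 exactly when w is perpendicular to every row (hence to every vector of V). The basis below is made up for illustration:

```python
from sympy import Matrix

B = Matrix([[1, 0, 1],
            [0, 1, 1]])           # rows: a basis of V inside R^3

complement_basis = B.nullspace()  # basis of V^perp
print(complement_basis)           # [Matrix([[-1], [-1], [1]])]
```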

Projection to Line: the projection of b onto a is T(b) = \frac{a \cdot b}{a \cdot a} a

Projection to Plane: let \{b_1, b_2\} be an orthogonal basis of the plane; then T(v) = \frac{b_1 \cdot v}{b_1 \cdot b_1} b_1 + \frac{b_2 \cdot v}{b_2 \cdot b_2} b_2

Gram-Schmidt (orthogonal basis existence): any subspace of \mathbb{R}^n has an orthogonal basis.

Gram-Schmidt process

  1. take the next vector v from a basis \{v_1, ..., v_k\} of the subspace
  2. add a new vector b to the collection (containing b_1, ..., b_n) so that b is perpendicular to b_1, ..., b_n and \{b_1, ..., b_n, b\} spans \text{span}(\{b_1, ..., b_n, v\})
    • For the first projection: b_2 = v_2 - \text{proj}_{b_1}(v_2) = v_2 - \frac{b_1 \cdot v_2}{b_1 \cdot b_1} b_1 \perp b_1
    • For later projections: b = v - \text{proj}_{\text{span}(b_1, ..., b_n)}(v) = v - \frac{b_1 \cdot v}{b_1 \cdot b_1}b_1 - \frac{b_2 \cdot v}{b_2 \cdot b_2}b_2 - ... - \frac{b_n \cdot v}{b_n \cdot b_n}b_n
  3. repeat from step 1 until every basis vector has been processed (a code sketch follows this list)
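
A code sketch of the process above (the modified form, which subtracts one projection at a time); it produces an orthogonal, not normalized, basis and assumes the input vectors are linearly independent. The input vectors are made up for illustration:

```python
import numpy as np

def gram_schmidt(vectors):
    basis = []
    for v in vectors:                     # steps 1 and 3: process each basis vector in turn
        b = np.array(v, dtype=float)
        for prev in basis:                # step 2: subtract the projections onto b_1, ..., b_n
            b = b - (prev @ b) / (prev @ prev) * prev
        basis.append(b)
    return basis

B = gram_schmidt([[1, 1, 0], [1, 0, 1], [0, 1, 1]])
print(B)
print(B[0] @ B[1], B[0] @ B[2], B[1] @ B[2])   # all (numerically) 0
```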
