Sample Space (\Omega): the set of all possible outcomes
Discrete Sample Space: the number of outcomes is countable (finite or countably infinite)
Continuous Sample Space: the number of outcomes is uncountable
Mutually Exclusive (disjoint): two events are mutually exclusive if E_1 \cap E_2 = \emptyset
therefore Pr\{E_1 \cup E_2\} = Pr\{E_1\} + Pr\{E_2\}
therefore Pr\{E_1 \cap E_2\} = 0
therefore E_1, E_2 are dependent (assuming both have positive probability: knowing E_1 happened tells you E_2 did not)
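A quick check of the additivity rule for disjoint events, using exact fractions (the single-die setup is assumed for illustration):

```python
from fractions import Fraction

# Sample space of one fair die; every outcome has probability 1/6.
omega = {1, 2, 3, 4, 5, 6}
pr = lambda event: Fraction(len(event & omega), len(omega))

E1 = {1, 2}   # roll is 1 or 2
E2 = {5, 6}   # roll is 5 or 6 (disjoint from E1)

assert E1 & E2 == set()                   # mutually exclusive
assert pr(E1 & E2) == 0                   # Pr{E1 ∩ E2} = 0
assert pr(E1 | E2) == pr(E1) + pr(E2)     # Pr{E1 ∪ E2} = Pr{E1} + Pr{E2}
```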
Partition: events E_1, ..., E_n partition a set F iff they are pairwise disjoint and \bigcup_{i=1}^n E_i = F
Event (E): a subset of the sample space \Omega. An event either happens or does not happen (E and \bar{E} are mutually exclusive)
Pr\{E\}: the probability that the outcome of the experiment lies in the set E
Note: additivity only applies to countably many events E_1, ..., E_n
Inclusion-Exclusion (possibly intersecting events): Pr\{E \cup F\} = Pr\{E\} + Pr\{F\} - Pr\{E \cap F\}
Union Bound: Pr\{E \cup F\} \leq Pr\{E\} + Pr\{F\}
Union Bound (together with inverting the probability of \forall, \exists quantifiers) is very useful when you have \max(\cdot) or \min(\cdot)
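A small exact check of the union bound on a \max(\cdot) event (the two-dice setup is assumed for illustration): \{\max \geq 5\} is exactly the union of the per-die events \{X_i \geq 5\}.

```python
from fractions import Fraction
from itertools import product

# Two fair dice; enumerate the full sample space of 36 outcomes.
omega = list(product(range(1, 7), repeat=2))
pr = lambda pred: Fraction(sum(pred(w) for w in omega), len(omega))

p_union = pr(lambda w: max(w) >= 5)                        # Pr{E1 ∪ E2}
p_bound = pr(lambda w: w[0] >= 5) + pr(lambda w: w[1] >= 5)  # Pr{E1} + Pr{E2}

assert p_union <= p_bound   # union bound: the sum over-counts the overlap
```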
Conditional Probability of event E given F: assume Pr\{F\} > 0, Pr\{E | F\} = \frac{Pr\{E \cap F\}}{Pr\{F\}}
Chain Rule: Pr\{\bigcap_{i=1}^n E_i\} = Pr\{E_1\} \cdot Pr\{E_2 | E_1\} \cdot Pr\{E_3 | E_1 \cap E_2\} \cdots Pr\{E_n | E_1 \cap E_2 \cap \cdots \cap E_{n-1}\}
Base case: Pr\{E \cap F\} = Pr\{F\} \cdot Pr\{E | F\}: think of being in F first, and then choosing E with probability Pr\{E | F\}
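The chain rule can be verified exactly on a small example (the card-drawing setup here is assumed for illustration): the probability of drawing 3 aces in a row without replacement, versus a brute-force count over all ordered draws.

```python
from fractions import Fraction
from itertools import permutations

# Chain rule: Pr{A1 ∩ A2 ∩ A3} = Pr{A1} · Pr{A2 | A1} · Pr{A3 | A1 ∩ A2}.
# Drawing 3 aces in a row from a standard 52-card deck with 4 aces:
chain = Fraction(4, 52) * Fraction(3, 51) * Fraction(2, 50)

# Brute-force check: count all ordered 3-card draws that are all aces.
deck = ['A'] * 4 + ['x'] * 48
count = sum(1 for draw in permutations(range(52), 3)
            if all(deck[i] == 'A' for i in draw))
assert chain == Fraction(count, 52 * 51 * 50)
```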
Independent Events: E \perp F; the two definitions below are equivalent
Definition 1: if Pr\{E \cap F\} = Pr\{E\} \cdot Pr\{F\}
Definition 2: if Pr\{E | F\} = Pr\{E\} when Pr\{F\} > 0
Claim: E \perp G \implies E \perp \bar{G}. Proof: Pr\{E \cap \bar{G}\} = Pr\{E\} - Pr\{E \cap G\} = Pr\{E\} - Pr\{E\} \cdot Pr\{G\} = Pr\{E\} \cdot Pr\{\bar{G}\}
Multiple Independent Events:
Wrong definition: Pr\{E \cap F \cap G\} = Pr\{E\} \cdot Pr\{F\} \cdot Pr\{G\} (does not imply pairwise independence)
Wrong definition: pairwise independence, or 3-way independence alone (neither implies the other)
Good definition: Events A_1, A_2, ..., A_n are Fully Independent if (\forall S \subseteq \{1, 2, ..., n\})(Pr\{\bigcap_{i \in S} A_i\} = \prod_{i \in S}Pr\{A_i\})
Pairwise Independence: (\forall i, j)(Pr\{A_i \cap A_j\} = Pr\{A_i\} \cdot Pr\{A_j\})
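The classic separating example (assumed here for illustration) is two fair coins plus the event that they agree: all three pairs are independent, but the triple is not, so pairwise independence does not imply full independence.

```python
from fractions import Fraction
from itertools import product

# Two fair coins; A1 = "coin 1 is heads", A2 = "coin 2 is heads",
# A3 = "the coins agree".
omega = list(product('HT', repeat=2))
pr = lambda event: Fraction(len(event), len(omega))

A1 = {w for w in omega if w[0] == 'H'}
A2 = {w for w in omega if w[1] == 'H'}
A3 = {w for w in omega if w[0] == w[1]}

for X, Y in [(A1, A2), (A1, A3), (A2, A3)]:
    assert pr(X & Y) == pr(X) * pr(Y)        # every pair is independent

# But any two of the events determine the third, so full independence fails:
assert pr(A1 & A2 & A3) != pr(A1) * pr(A2) * pr(A3)
```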
Conditional Independence (of E and F given G): Pr\{E \cap F | G\} = Pr\{E | G\} \cdot Pr\{F | G\}
Independence does not imply Conditional Independence: let E be "1st coin is head", let F be "2nd coin is head", let G be "both coins are the same"
Conditional Independence does not imply Independence
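The coin example above can be checked exactly: E and F are independent on their own, but conditioning on G = "both coins are the same" couples them.

```python
from fractions import Fraction
from itertools import product

# E = "1st coin is heads", F = "2nd coin is heads", G = "both coins the same".
omega = list(product('HT', repeat=2))
pr = lambda event: Fraction(len(event), len(omega))

E = {w for w in omega if w[0] == 'H'}
F = {w for w in omega if w[1] == 'H'}
G = {w for w in omega if w[0] == w[1]}

assert pr(E & F) == pr(E) * pr(F)            # E ⊥ F unconditionally

# Given G, knowing coin 1 determines coin 2, so independence breaks:
cond = lambda event: pr(event & G) / pr(G)   # Pr{· | G}
assert cond(E & F) != cond(E) * cond(F)      # not conditionally independent
```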
Law of Total Probability: E = (E \cap F) \cup (E \cap \bar{F}) (Pr\{E\} = Pr\{E | F\} \cdot Pr\{F\} + Pr\{E | \bar{F}\} \cdot Pr\{\bar{F}\})
General Law of Total Probability: for F_1, F_2, ..., F_n partitioning the whole space \Omega, Pr\{E\} = \sum_{i = 1}^n Pr\{E | F_i\} \cdot Pr\{F_i\}
Conditional Law of Total Probability: for F_1, F_2, ..., F_n partitioning the whole space \Omega, Pr\{E | G\} = \sum_{i = 1}^n Pr\{E | F_i \cap G\} \cdot Pr\{F_i | G\}
Bayes Law: Pr\{F | E\} = \frac{Pr\{E | F\} \cdot Pr\{F\}}{Pr\{E\}}
Extended Bayes Law: Suppose F_1, F_2, ..., F_n partition \Omega, then Pr\{F | E\} = \frac{Pr\{F\} \cdot Pr\{E | F\}}{\sum_{j = 1}^n Pr\{E | F_j\} \cdot Pr\{F_j\}}
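A sketch of Extended Bayes over a partition, with exact fractions (the priors and likelihoods below are made-up numbers for illustration):

```python
from fractions import Fraction

# A partition F1, F2, F3 of Ω with assumed priors Pr{Fj} and
# likelihoods Pr{E | Fj}.
prior      = [Fraction(1, 2), Fraction(1, 3), Fraction(1, 6)]
likelihood = [Fraction(1, 10), Fraction(3, 10), Fraction(1, 2)]

# Denominator is the law of total probability: Pr{E} = Σ Pr{E|Fj}·Pr{Fj}.
total = sum(l * p for l, p in zip(likelihood, prior))

# Pr{Fj | E} = Pr{Fj}·Pr{E|Fj} / Pr{E}
posterior = [l * p / total for l, p in zip(likelihood, prior)]

assert sum(posterior) == 1   # posteriors over a partition sum to 1
```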
Random Variable: a mapping from experiment outcomes to a number we care about
a random variable is a mapping
but once the mapping's output is set to a specific value, it is an event
Example:
Experiment: roll 2 dice
Outcome: (1, 4)
Random Variable: the larger of 2 rolls
X = 4: event
X < 4: event
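The example above can be computed directly: enumerating the 36 outcomes gives the full distribution of X = max of the two rolls, and \{X = 4\} is just a set of outcomes, i.e. an event.

```python
from fractions import Fraction
from itertools import product
from collections import Counter

# X maps each outcome (a, b) to max(a, b); {X = k} is an event.
omega = list(product(range(1, 7), repeat=2))
counts = Counter(max(w) for w in omega)
pmf = {k: Fraction(c, len(omega)) for k, c in counts.items()}

# {X = 4} = {(1,4), (2,4), (3,4), (4,4), (4,3), (4,2), (4,1)}: 7 outcomes.
assert pmf[4] == Fraction(7, 36)
assert sum(pmf.values()) == 1
```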
The result of a probability calculation depends on how you define "random".