Discrete Random Variable: countable number of values
Distribution of Random Variables
Probability Mass Function (p.m.f.): P_x(i) = Pr\{X = i\}
Cumulative Distribution Function (c.d.f.): F_x(i) = Pr\{X \le i\}
Tail: \bar{F}_x(i) = Pr\{X > i\}
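A minimal sketch in Python of how the c.d.f. and tail follow from a p.m.f. table (the fair 4-sided die below is a made-up example, not from the notes):

```python
from fractions import Fraction

# A discrete random variable as an explicit p.m.f. table (fair 4-sided die).
pmf = {1: Fraction(1, 4), 2: Fraction(1, 4), 3: Fraction(1, 4), 4: Fraction(1, 4)}

def cdf(i):
    """F_x(i) = Pr{X <= i}: accumulate the p.m.f. up to and including i."""
    return sum(p for k, p in pmf.items() if k <= i)

def tail(i):
    """Tail: Pr{X > i} = 1 - F_x(i)."""
    return 1 - cdf(i)

for i in range(0, 5):
    assert cdf(i) + tail(i) == 1   # c.d.f. and tail always sum to 1
    print(i, cdf(i), tail(i))
```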
X \sim \text{Bernoulli}(p):
Experiment: flip p-coin (P_x(1) = p) once
Random Variable X = \begin{cases} 1 &\text{if head}\\ 0 &\text{if tail}\\ \end{cases}
p.m.f.: P_x(i) = \begin{cases} p &\text{if } i = 1\\ 1-p &\text{if } i = 0\\ \end{cases}
Summation: p+(1-p) = 1
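A hedged one-liner for sampling a Bernoulli variable (p = 0.3 is an arbitrary choice); the empirical frequency of heads should approach p:

```python
import random

def bernoulli(p):
    """One flip of a p-coin: 1 (head) with probability p, else 0 (tail)."""
    return 1 if random.random() < p else 0

p = 0.3
samples = [bernoulli(p) for _ in range(100_000)]
print(sum(samples) / len(samples))  # ≈ 0.3
```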
X \sim \text{Binomial}(n, p):
Experiment: flip p-coin (P_x(1) = p) n times
Random Variable X = \text{number of heads}
p.m.f.: P_x(i) = {n \choose i} p^i(1-p)^{n-i}
Summation: Binomial Series \sum_{i = 0}^n {n \choose i}p^i(1 - p)^{n - i} = (p + (1 - p))^n = 1
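A small sketch checking that the binomial p.m.f. sums to 1, i.e. the binomial series above (n = 10 and p = 0.3 are arbitrary):

```python
from math import comb

def binomial_pmf(n, p, i):
    """P_x(i) = C(n, i) * p^i * (1-p)^(n-i)."""
    return comb(n, i) * p**i * (1 - p)**(n - i)

n, p = 10, 0.3
total = sum(binomial_pmf(n, p, i) for i in range(n + 1))
print(total)  # ≈ 1.0, since the series collapses to (p + (1-p))^n
```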
X \sim \text{Geometric}(p):
Experiment: flip p-coin (P_x(1) = p) until getting a head
Random Variable X = \text{number of flips until getting a head (including the flip that lands heads)}
p.m.f.: P_x(i) = (1-p)^{i-1}p where i = 1, 2, 3, ...
Tail: \bar{F}_x(i) = (1-p)^i
Summation: Geometric Series \sum_{i = 1}^{\infty} (1 - p)^{i - 1} \cdot p = p \sum_{i = 0}^{\infty} (1 - p)^i = p \cdot \frac{1}{1 - (1 - p)} = 1
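A Monte Carlo sketch comparing the empirical tail against the closed form \bar{F}_x(i) = (1-p)^i (p = 0.4 and i = 3 are arbitrary):

```python
import random

def geometric(p):
    """Flip a p-coin until the first head; return the number of flips."""
    flips = 1
    while random.random() >= p:   # tail: keep flipping
        flips += 1
    return flips

p, i = 0.4, 3
samples = [geometric(p) for _ in range(200_000)]
empirical_tail = sum(x > i for x in samples) / len(samples)
print(empirical_tail, (1 - p)**i)  # both ≈ Pr{X > i} = (1-p)^i
```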
X \sim \text{Poisson}(\lambda):
Experiment: count occurrences arising from the superposition of a very large number of independent sources, each with a very small individual probability; \lambda is the expected number of occurrences (and roughly where the p.m.f. peaks)
p.m.f.: P_x(i) = \frac{e^{-\lambda}\lambda^i}{i!} for i = 0, 1, 2, ...
Summation: Taylor Series e^{-\lambda} \sum_{i = 0}^{\infty} \frac{\lambda^i}{i!} = e^{-\lambda}e^{\lambda} = 1
When n is large and p is small, \text{Binomial}(n, p) \simeq \text{Poisson}(np)
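A numeric sketch of this approximation (n = 10000 and p = 0.0002, so np = 2, are arbitrary choices):

```python
from math import comb, exp, factorial

def binomial_pmf(n, p, i):
    return comb(n, i) * p**i * (1 - p)**(n - i)

def poisson_pmf(lam, i):
    return exp(-lam) * lam**i / factorial(i)

# Large n, small p: Binomial(n, p) should be close to Poisson(np).
n, p = 10_000, 0.0002
for i in range(6):
    print(i, round(binomial_pmf(n, p, i), 6), round(poisson_pmf(n * p, i), 6))
```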
Joint Probability Mass Function: P_{x,y}(i, j) = Pr\{X = i, Y = j\}
Marginal Probability: probability obtained by summing the joint p.m.f. along the other dimension: P_x(i) = \sum_{j} P_{x,y}(i, j)
Independence: X and Y are independent if and only if the joint p.m.f. factors: P_{x,y}(i, j) = P_x(i) \cdot P_y(j) for all i, j
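A minimal sketch with a made-up 2x2 joint p.m.f. (the marginals px, py are arbitrary), illustrating marginalization and the independence factorization:

```python
# Build a joint p.m.f. as a product of marginals, so X and Y are independent.
px = {0: 0.7, 1: 0.3}
py = {0: 0.4, 1: 0.6}
joint = {(i, j): px[i] * py[j] for i in px for j in py}

# Marginal: sum the joint p.m.f. along the other dimension.
marginal_x = {i: sum(joint[(i, j)] for j in py) for i in px}
marginal_y = {j: sum(joint[(i, j)] for i in px) for j in py}

# Independence check: P_{x,y}(i, j) == P_x(i) * P_y(j) for every cell.
assert all(abs(joint[(i, j)] - marginal_x[i] * marginal_y[j]) < 1e-12
           for i in px for j in py)
print(marginal_x, marginal_y)
```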
Example: Each day, event A happens with probability p_1 and event B happens with probability p_2, independently. What is the probability that A happens (strictly) before B?
Brute Force: Let X_1, X_2 denote the first day on which event A, B happens, respectively, so X_1 \sim \text{Geometric}(p_1) and X_2 \sim \text{Geometric}(p_2) are independent. Then Pr\{X_1 < X_2\} = \sum_{i = 1}^{\infty} (1 - p_1)^{i - 1} p_1 \cdot (1 - p_2)^i = \frac{p_1(1 - p_2)}{1 - (1 - p_1)(1 - p_2)}
Another way of thinking: Pr\{A\text{ before }B\} = Pr\{A \cap \bar{B} \mid A \cup B\} = \frac{p_1(1 - p_2)}{p_1 + p_2 - p_1 p_2} (condition on the first day on which at least one event happens: days on which nothing happens change nothing, so we only ask which configuration that first interesting day is in)
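A Monte Carlo sketch of this race against the closed form derived above (p_1 = 0.2 and p_2 = 0.3 are arbitrary):

```python
import random

def first_day(p):
    """Day on which an event with per-day probability p first happens."""
    day = 1
    while random.random() >= p:
        day += 1
    return day

# Estimate Pr{A strictly before B} vs. p1(1-p2) / (p1 + p2 - p1*p2).
p1, p2, trials = 0.2, 0.3, 200_000
wins = sum(first_day(p1) < first_day(p2) for _ in range(trials))
print(wins / trials, p1 * (1 - p2) / (p1 + p2 - p1 * p2))
```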
Law of Total Probability for Discrete Random Variables: for an event E and a discrete random variable Y: Pr\{E\} = \sum_{j} Pr\{E \mid Y = j\} \cdot Pr\{Y = j\}
Conditioning: P_{x|y}(i \mid j) = Pr\{X = i \mid Y = j\} = \frac{P_{x,y}(i, j)}{P_y(j)}
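A toy numeric check of the law of total probability (the distributions py and pr_e_given_y below are made up for illustration):

```python
# Verify Pr{E} = sum_j Pr{E | Y = j} * Pr{Y = j} on a toy model.
py = {0: 0.2, 1: 0.5, 2: 0.3}                 # p.m.f. of Y
pr_e_given_y = {0: 0.9, 1: 0.1, 2: 0.5}       # arbitrary conditional probabilities

pr_e = sum(pr_e_given_y[j] * py[j] for j in py)
print(pr_e)  # 0.9*0.2 + 0.1*0.5 + 0.5*0.3 = 0.38
```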