Lecture 002

Discrete Random Variable: a random variable that takes a countable number of values

Distribution of Random Variables

Bernoulli

p.m.f. of Bernoulli(p) with p = 0.76

X \sim \text{Bernoulli}(p):

P_X(x) = \begin{cases} p & \text{if } x = 1\\ 1 - p & \text{if } x = 0 \end{cases}

Binomial

p.m.f. of Binomial(n, p) with n = 4, p = 0.3

X \sim \text{Binomial}(n, p):

P_X(k) = \binom{n}{k} p^k (1-p)^{n - k}, \quad k = 0, 1, \ldots, n

Geometric

p.m.f. of Geometric(p) with p = 0.5

X \sim \text{Geometric}(p):

P_X(k) = (1-p)^{k - 1} p, \quad k = 1, 2, \ldots \quad \text{(the trial on which the first success occurs)}

Poisson

p.m.f. of Poisson(\lambda) with \lambda = 2

X \sim \text{Poisson}(\lambda):

P_X(k) = \frac{\lambda^k e^{-\lambda}}{k!}, \quad k = 0, 1, 2, \ldots
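
To make the four p.m.f.s concrete, here is a minimal sketch in plain Python (the function names are mine; the parameter values echo the figure captions above):

```python
from math import comb, exp, factorial

def bernoulli_pmf(x, p):
    return p if x == 1 else 1 - p                # support {0, 1}

def binomial_pmf(k, n, p):
    return comb(n, k) * p**k * (1 - p)**(n - k)  # support {0, 1, ..., n}

def geometric_pmf(k, p):
    return (1 - p)**(k - 1) * p                  # first success on trial k >= 1

def poisson_pmf(k, lam):
    return lam**k * exp(-lam) / factorial(k)     # support {0, 1, 2, ...}

print(bernoulli_pmf(1, 0.76))   # 0.76
print(binomial_pmf(2, 4, 0.3))  # 6 * 0.3^2 * 0.7^2 = 0.2646
print(geometric_pmf(3, 0.5))    # 0.5^3 = 0.125
print(poisson_pmf(2, 2))        # 2 e^{-2} ≈ 0.2707
```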

When n is large and p is small, \text{Binomial}(n, p) \simeq \text{Poisson}(np)
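
A quick numerical check of this approximation (a sketch with arbitrary n = 1000 and p = 0.002 of my choosing):

```python
from math import comb, exp, factorial

n, p = 1000, 0.002          # large n, small p (arbitrary illustrative values)
lam = n * p                 # Poisson rate np = 2

for k in range(6):
    b = comb(n, k) * p**k * (1 - p)**(n - k)   # Binomial(n, p) p.m.f.
    q = lam**k * exp(-lam) / factorial(k)      # Poisson(np) p.m.f.
    print(f"k={k}: Binomial={b:.6f}  Poisson={q:.6f}")
```

The two columns agree to about three decimal places, as the approximation predicts.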

Joint Probability

Joint Probability Mass Function:

P_{X, Y}(x, y) = Pr\{X = x \cap Y = y\} \text{ where }\sum_x \sum_y Pr\{X = x \cap Y = y\} = 1

Marginal Probability: the probability obtained by summing the joint p.m.f. over the other variable

P_X(x) = \sum_y P_{X, Y}(x, y)

Independence of Random Variables:

X \perp Y \iff (\forall x)(\forall y)\left(Pr\{X = x \cap Y = y\} = Pr\{X = x\} \cdot Pr\{Y = y\}\right)
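
These three definitions can be checked mechanically. Below is a minimal sketch (the joint table is made up for illustration) that builds a joint p.m.f., computes its marginals, and tests independence:

```python
from itertools import product

# Joint p.m.f. P_{X,Y}(x, y) stored as a dict; this particular table is
# constructed as a product, so X ⊥ Y by construction.
p_xy = {(x, y): px * py
        for (x, px), (y, py) in product([(0, 0.3), (1, 0.7)],
                                        [(0, 0.4), (1, 0.6)])}

assert abs(sum(p_xy.values()) - 1) < 1e-12   # joint p.m.f. sums to 1

# Marginals: sum the joint p.m.f. over the other variable.
p_x = {x: sum(p for (xx, _), p in p_xy.items() if xx == x) for x in (0, 1)}
p_y = {y: sum(p for (_, yy), p in p_xy.items() if yy == y) for y in (0, 1)}

# Independence: P_{X,Y}(x, y) == P_X(x) * P_Y(y) for every (x, y).
independent = all(abs(p_xy[(x, y)] - p_x[x] * p_y[y]) < 1e-12
                  for x, y in p_xy)
print(independent)   # True for this table
```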

Example: Each day, event A happens with probability p_1 and event B happens with probability p_2, independently across events and days. What is the probability that A first happens strictly before B?

Brute Force: Let X_1, X_2 denote the first days on which events A and B happen, respectively, so X_1 \sim \text{Geometric}(p_1) and X_2 \sim \text{Geometric}(p_2).

\begin{align*} Pr\{X_1 < X_2\} &= \sum_{k_1 = 1}^{\infty} \sum_{k_2 = k_1 + 1}^{\infty} (1-p_1)^{k_1 - 1}p_1(1-p_2)^{k_2 - 1}p_2\\ &= p_1p_2 \sum_{k_1 = 1}^{\infty} (1-p_1)^{k_1 - 1} \left(\sum_{k_2 = k_1 + 1}^{\infty} (1-p_2)^{k_2 - 1}\right)\\ &= p_1p_2 \sum_{k_1 = 1}^{\infty} (1-p_1)^{k_1 - 1} \left((1-p_2)^{k_1} \sum_{k_2 = 1}^{\infty} (1-p_2)^{k_2 - 1}\right)\\ &= p_1p_2 \sum_{k_1 = 1}^{\infty} (1-p_1)^{k_1 - 1} \left((1-p_2)^{k_1} \frac{1}{1-(1-p_2)}\right)\\ &= p_1p_2 \sum_{k_1 = 1}^{\infty} (1-p_1)^{k_1 - 1} \left((1-p_2)^{k_1} \frac{1}{p_2}\right)\\ &= p_1 \sum_{k_1 = 1}^{\infty} (1-p_1)^{k_1 - 1} (1-p_2)^{k_1}\\ &= p_1 (1-p_2) \sum_{k_1 = 1}^{\infty} (1-p_1)^{k_1 - 1} (1-p_2)^{k_1 - 1}\\ &= p_1 (1-p_2) \sum_{k_1 = 1}^{\infty} ((1-p_1)(1-p_2))^{k_1 - 1}\\ &= \frac{p_1 (1-p_2)}{1 - (1-p_1)(1-p_2)}\\ \end{align*}
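
A Monte Carlo sketch (my own check, not part of the lecture) that the closed form is right; p_1 = 0.3 and p_2 = 0.2 are arbitrary:

```python
import random

def first_day(p):
    """Sample Geometric(p): the first day the event happens."""
    day = 1
    while random.random() >= p:   # event occurs each day with probability p
        day += 1
    return day

p1, p2, trials = 0.3, 0.2, 200_000
hits = sum(first_day(p1) < first_day(p2) for _ in range(trials))

print(hits / trials)                               # empirical estimate
print(p1 * (1 - p2) / (1 - (1 - p1) * (1 - p2)))   # closed form: 0.24/0.44 ≈ 0.5455
```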

Another way of thinking: Pr\{A\text{ before }B\} = Pr\{A \cap \bar{B} \mid A \cup B\} (look only at the first day on which at least one of the two events happens; given that, what is the probability that the day's configuration is "A happened and B did not"?). This conditional probability is \frac{p_1(1-p_2)}{1 - (1-p_1)(1-p_2)}, matching the brute-force answer.

Law of Total Probability for Discrete Random Variables: for an event E and a discrete random variable Y:

Pr\{E\} = \sum_{y}Pr\{E \cap Y = y\} = \sum_y Pr\{E | Y = y\} \cdot Pr\{Y = y\}

Conditioning:

\begin{align*} Pr\{X_1 < X_2\} = &\sum_{k = 1}^\infty Pr\{X_1 < X_2 | X_1 = k\} \cdot Pr\{X_1 = k\}\\ = &\sum_{k = 1}^\infty Pr\{k < X_2 | X_1 = k\} \cdot Pr\{X_1 = k\}\\ = &\sum_{k = 1}^\infty Pr\{k < X_2\} \cdot Pr\{X_1 = k\} \tag{by $X_1, X_2$ independence}\\ = &\sum_{k = 1}^\infty (1 - p_2)^k \cdot (1-p_1)^{k-1}p_1\\ = &p_1(1- p_2)\sum_{k = 1}^\infty ((1-p_2)(1-p_1))^{k - 1}\\ = &p_1(1- p_2)\frac{1}{1 - (1 - p_2)(1 - p_1)}\\ \end{align*}
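
Truncating the conditioning series and comparing against the closed form (same arbitrary p_1, p_2 as in the simulation above) confirms the two derivations agree:

```python
p1, p2 = 0.3, 0.2

# Partial sum of  sum_{k>=1} (1 - p2)^k (1 - p1)^(k-1) p1 ; the terms decay
# geometrically, so 200 terms is far more than enough here.
series = sum((1 - p2)**k * (1 - p1)**(k - 1) * p1 for k in range(1, 201))
closed = p1 * (1 - p2) / (1 - (1 - p2) * (1 - p1))

print(series, closed)   # both ≈ 0.545454...
```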
