We would like to generate a random variable with any given distribution from a uniform random variable u \sim \text{Uniform}(0, 1). We also make the following assumptions:
To generate a random variable X: since F_X(X) \sim \text{Uniform}(0, 1), we can set X = F_X^{-1}(u), provided F_X is invertible.
Example: generate X \sim \text{Exp}(\lambda)
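A minimal sketch of this inverse transform (the function name `sample_exponential` is my own): since F_X(x) = 1 - e^{-\lambda x}, solving u = F_X(x) gives x = -\frac{\ln(1 - u)}{\lambda}.

```python
import math
import random

def sample_exponential(lam):
    """Inverse transform: F(x) = 1 - exp(-lam * x), so F^{-1}(u) = -ln(1 - u) / lam."""
    u = random.random()              # u ~ Uniform(0, 1)
    return -math.log(1.0 - u) / lam
```

Since 1 - u is also Uniform(0, 1), returning -\frac{\ln(u)}{\lambda} would work equally well.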
To generate a discrete random variable X with \Pr\{X = x_i\} = p_i:
We could do: pick the smallest l such that u \leq \sum_{i = 0}^{l} p_i and output x_l.
But computing the partial sums \sum_{i = 0}^{l} p_i for every l is inefficient, so we still prefer F_X to have a closed-form inverse and use the same method as in the continuous case.
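A minimal sketch of the linear-scan version for the discrete case (names are my own); it walks the partial sums until they exceed u, which is why the expected cost is O(n):

```python
import random

def sample_discrete(values, probs):
    """Return values[l] for the smallest l with u <= p_0 + ... + p_l."""
    u = random.random()
    cumulative = 0.0
    for x, p in zip(values, probs):
        cumulative += p
        if u <= cumulative:
            return x
    return values[-1]                # guard against floating-point round-off
```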
Motivation: generate random variables for distributions whose F_X has no closed form, and hence no closed-form inverse (like X \sim \text{Normal}(\mu, \sigma^2)). Idea: generate numbers from a distribution we can already sample and throw away some of them.
Suppose we have a uniform Q = \begin{cases} 1 & \text{with probability } q_1 = \frac{1}{3}\\ 2 & \text{with probability } q_2 = \frac{1}{3}\\ 3 & \text{with probability } q_3 = \frac{1}{3} \end{cases}, and want to generate P = \begin{cases} 1 & \text{with probability } p_1 = 0.20\\ 2 & \text{with probability } p_2 = 0.30\\ 3 & \text{with probability } p_3 = 0.50 \end{cases}.
Requirements:
Both P, Q are discrete.
We can efficiently generate random values for Q.
P, Q should take on the same set of values: (\forall j)(q_j > 0 \iff p_j > 0)
Note that Q is uniform in this example, but that is not required in general.
Proposal 1:
Concerns with Proposal 1:
Efficiency: The above example works fine when P, Q have only n = 3 possible values. But with more values, the probabilities q_j, p_j \approx \frac{1}{n} become very small, so the probability of accepting on any given attempt is very low (it tends to 0 as n \rightarrow \infty), and the expected running time is O(n).
Limitation: We require Q to be uniform for it to work.
Proposal 2:
Concerns with Proposal 2:
Proposal 3:
Proof of Correctness:
Note that the probability that any given iteration accepts is \frac{1}{c}, where c = \max_j \frac{p_j}{q_j}. Therefore c is the expected number of iterations needed to generate one random value.
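The notes do not spell Proposal 3 out here, but it is presumably the standard accept/reject scheme: sample an index j from Q, then accept it with probability \frac{p_j}{c\,q_j}. A minimal sketch under that assumption (function names are my own):

```python
import random

def accept_reject_discrete(sample_q, p, q, c):
    """Sample an index j ~ Q and accept it with probability p[j] / (c * q[j]).
    The expected number of iterations per sample is c = max_j p[j] / q[j]."""
    while True:
        j = sample_q()                           # one draw from Q
        if random.random() <= p[j] / (c * q[j]):
            return j

# Example from above (0-indexed): Q uniform on three values, P = (0.20, 0.30, 0.50).
p = [0.20, 0.30, 0.50]
q = [1 / 3, 1 / 3, 1 / 3]
c = max(pj / qj for pj, qj in zip(p, q))         # 0.50 / (1/3) = 1.5
x = accept_reject_discrete(lambda: random.randrange(3), p, q, c)
```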
Method: to generate X given that we can generate Y
Idea:
By the Linear Transformation Property, we only need to generate X \sim \text{Normal}(0, 1).
By symmetry, we only need to generate the right half of the distribution over 0 < t < \infty (and then multiply by -1 with probability \frac{1}{2}).
To make c small, we need to find a similar distribution Y that we know how to generate. Let it be Y \sim \text{Exponential}(1) (which we can generate since its c.d.f. has a closed-form inverse).
Note that for distributions with unbounded support, we can no longer use Y \sim \text{Uniform}(a, b), because a uniform distribution cannot assign non-zero probability out to infinity.
We find c = \max_t \frac{f_X(t)}{f_Y(t)}, where f_X(t) = \sqrt{\frac{2}{\pi}} e^{-t^2/2} is the half-normal density and f_Y(t) = e^{-t}, so \frac{f_X(t)}{f_Y(t)} = \sqrt{\frac{2}{\pi}} e^{t - t^2/2}.
Taking the derivative to find the maximum gives t = 1.
Which gives c = \frac{f_X(1)}{f_Y(1)} = \sqrt{\frac{2e}{\pi}} \approx 1.32.
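A minimal sketch of the whole pipeline (names are my own): generate Y \sim \text{Exponential}(1) by inverse transform, accept with probability \frac{f_X(Y)}{c\, f_Y(Y)}, and attach a random sign by symmetry; \text{Normal}(\mu, \sigma^2) then follows from the linear transformation \mu + \sigma X.

```python
import math
import random

C = math.sqrt(2.0 * math.e / math.pi)        # c = max_t f_X(t) / f_Y(t) ~= 1.32

def sample_standard_normal():
    """Accept/reject for |X|, X ~ Normal(0, 1), against Y ~ Exponential(1)."""
    while True:
        y = -math.log(1.0 - random.random())                      # Y ~ Exponential(1)
        f_x = math.sqrt(2.0 / math.pi) * math.exp(-y * y / 2.0)   # half-normal density
        f_y = math.exp(-y)                                        # Exponential(1) density
        if random.random() <= f_x / (C * f_y):
            # accepted: attach a random sign by symmetry
            return y if random.random() < 0.5 else -y

def sample_normal(mu, sigma):
    """Normal(mu, sigma^2) via the linear transformation property."""
    return mu + sigma * sample_standard_normal()
```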
Challenges:
Cannot use Inverse Transform: There is no closed form for F(i) = Pr\{X \leq i\}.
Cannot use Accept/Reject: there are infinitely many possible values p_i. Unlike the Normal case, it is hard to find the right distribution to match it up against.