Suppose I sample values uniformly at random from the unit interval. How many samples should I expect to take before the sum of my sampled values exceeds unity?[^1]
Let $N$ be the (random) number of samples taken when the sum first exceeds unity. Then $N$ has expected value equal to Euler’s number $e$. This can be verified approximately via simulation:
simulate <- function(run) {  # 'run' is an index supplied by sapply; it is unused
  tot <- 0  # running sum of the sampled values
  N <- 0    # number of samples taken so far
  while (tot < 1) {
    tot <- tot + runif(1)  # draw one value uniformly from the unit interval
    N <- N + 1
  }
  N  # the number of samples when the sum first exceeds unity
}
set.seed(0)
mean(sapply(1:1e5, simulate))
## [1] 2.7183
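As an aside, the explicit loop can also be vectorized. The sketch below is one way to do it, not part of the original simulation: it caps each run at 20 draws, an assumption that is safe because $\Pr(N > 20) = 1/20!$ is vanishingly small (the variable names are arbitrary).

set.seed(1)
runs <- 1e5
cap <- 20  # assumed cap on draws per run; Pr(N > 20) = 1/20! is negligible
draws <- matrix(runif(runs * cap), nrow = runs)
totals <- t(apply(draws, 1, cumsum))  # running totals, one row per run
N <- apply(totals > 1, 1, which.max)  # position of the first total exceeding 1
mean(N)                               # should again be close to exp(1)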
To see why $\mathbb{E}[N] = e$, let $U_1, U_2, U_3, \ldots$ be an infinite sequence of independent random variables with uniform distributions over the unit interval. Then the probability that $N$ exceeds any non-negative integer $n$ is

$$\Pr(N > n) = \Pr(U_1 + U_2 + \cdots + U_n \le 1).$$

Consider the unit (hyper)cube $[0, 1]^n$ in $\mathbb{R}^n$. Its vertices comprise the origin, the standard basis vectors $\mathbf{e}_1, \mathbf{e}_2, \ldots, \mathbf{e}_n$, and the sums of two or more of these basis vectors. The convex hull of $\{0, \mathbf{e}_1, \mathbf{e}_2, \ldots, \mathbf{e}_n\}$ forms an $n$-simplex with volume $1/n!$. The interior of this simplex is precisely the set

$$\{(u_1, u_2, \ldots, u_n) \in (0, 1)^n : u_1 + u_2 + \cdots + u_n < 1\}.$$

It follows that $\Pr(U_1 + U_2 + \cdots + U_n \le 1) = 1/n!$ and therefore $\Pr(N > n) = 1/n!$ from above. Now $\mathbb{E}[N] = \sum_{n=0}^{\infty} \Pr(N > n)$, since $N$ takes values in the non-negative integers. Thus, since $\Pr(N > n) = 1/n!$ for each $n$ (and, by convention, $0! = 1$), we have

$$\mathbb{E}[N] = \sum_{n=0}^{\infty} \frac{1}{n!} = e.$$

The final equality comes from evaluating the Maclaurin series expansion of $e^x$ at $x = 1$.
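As a quick sanity check (not part of the proof), both key facts can be verified numerically: the sketch below estimates $\Pr(U_1 + U_2 + \cdots + U_n \le 1)$ by Monte Carlo and compares it with $1/n!$, then prints partial sums of $\sum_{n \ge 0} 1/n!$, which converge rapidly to $e$. The sample size and range of $n$ are arbitrary choices.

set.seed(2)
runs <- 1e5

# Monte Carlo check of the simplex-volume fact: Pr(U_1 + ... + U_n <= 1) = 1/n!
for (n in 2:5) {
  sums <- colSums(matrix(runif(runs * n), nrow = n))  # 'runs' sums of n uniforms
  cat(sprintf("n = %d: estimate = %.5f, 1/n! = %.5f\n",
              n, mean(sums <= 1), 1 / factorial(n)))
}

# Partial sums of the series sum_{n >= 0} 1/n!, converging to exp(1) = 2.718282...
cumsum(1 / factorial(0:8))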
[^1]: Grant Sanderson mentions this problem in this Numberphile video.