
Nice fact! How about this for a proof: let X_j be a sequence of unif(0,1) random variables and S_n the sequence of partial sums (so S_4 = X_1 + X_2 + X_3 + X_4).

If N is the number of uniforms needed to have a sum bigger than 1, you can see that P(N>n)=P(S_n<1).

P(S_n<1) is not too bad to compute directly (it's the volume of the simplex x_1+...+x_n < 1 inside the unit cube), or you can wikipedia it (Irwin-Hall distribution) to find the expression P(S_n<1) = 1/n!
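
If you want to convince yourself of that 1/n! numerically, here's a throwaway Monte Carlo sketch of my own (just numpy, nothing from the thread):

    # Monte Carlo check that P(S_n < 1) = 1/n! for a few n
    import math
    import numpy as np

    rng = np.random.default_rng(0)
    trials = 1_000_000
    for n in range(1, 6):
        # S_n = sum of n iid unif(0,1) draws, one sum per trial
        s_n = rng.random((trials, n)).sum(axis=1)
        est = (s_n < 1).mean()
        print(n, est, 1 / math.factorial(n))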

Combining these with the tail-sum formula for a nonnegative integer random variable, E[N] = sum(P(N>n), n=0..infinity) = sum(1/n!, n=0..infinity) = exp(1).
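
The expectation checks out in simulation too; here's a quick sketch (the count_until_exceed helper is just a name I made up):

    # Estimate E[N], where N = number of unif(0,1) draws needed
    # for the running sum to exceed 1
    import math
    import random

    def count_until_exceed(rng):
        total, n = 0.0, 0
        while total <= 1.0:
            total += rng.random()
            n += 1
        return n

    rng = random.Random(0)
    trials = 1_000_000
    mean_n = sum(count_until_exceed(rng) for _ in range(trials)) / trials
    print(mean_n, math.e)  # mean_n should land close to e = 2.718...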



Thank you, I will look into this. I've taken discrete math and some graph theory, but never stats, so all those families of distributions are alien to me :)


You can use the relation f(x) = 1 + integral_0^x f(t) dt, where f(x) is the expected number of uniforms needed for the running sum to exceed x (condition on the first draw U to get f(x) = 1 + integral_0^x f(x-u) du). Then just check that e^x is a solution, or build one up via Taylor expansion; the answer is f(1) = e.
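
If it helps, a small sympy sketch confirming that e^x satisfies that relation, assuming f(x) is read as the expected number of draws needed to push the sum past x:

    # Check that f(x) = e^x satisfies f(x) = 1 + integral_0^x f(t) dt
    import sympy as sp

    x, t = sp.symbols('x t', nonnegative=True)
    f = sp.exp
    lhs = f(x)
    rhs = 1 + sp.integrate(f(t), (t, 0, x))
    print(sp.simplify(lhs - rhs))  # 0, so e^x solves the relation
    print(f(1).evalf())            # e = 2.718..., the value at x = 1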



