
The average of an infinite set isn't necessarily a meaningful thing to consider. For example, what is the average integer? Well, you could argue that the answer is 0 because you can pair off the positives and the negatives. On the other hand, you could also say it's 1: pair off 0 and 2, -1 and 3, -2 and 4, etc.

Instead, the rigorous way to generalize the average is essentially to assign a probability to each member of the set and compute the sum over the set of n * p(n), where p(n) is the probability of picking n and the p(n) sum to 1. This is generally known as the 'expected value', and it has the property you want: the expected value under any probability distribution is less than or equal to the maximum of the set.
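As a quick sketch of that definition, here's the expected value of a small hypothetical set {1, 2, 3} under a made-up non-uniform distribution (the values and probabilities are just for illustration):

```python
from fractions import Fraction

# Expected value = sum of n * p(n), where the p(n) sum to 1.
values = [1, 2, 3]
probs = [Fraction(1, 2), Fraction(1, 4), Fraction(1, 4)]
assert sum(probs) == 1  # a valid probability distribution

expected = sum(n * p for n, p in zip(values, probs))
print(expected)                  # 7/4
print(expected <= max(values))   # True: the expectation never exceeds the max
```

Using exact fractions avoids floating-point noise and makes the bound against the maximum easy to verify directly.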

Note that you can't set p(n) to the same value for every natural number: the p(n) have to sum to 1 (because they form a probability distribution), but the sum of infinitely many copies of the same nonnegative number is either 0 (if that number is 0) or infinite.

So instead, you have to have some varying probability. For example, you might have probability 1/2 of picking 1, 1/4 of picking 2, 1/8 of picking 3, and so on, halving the probability each time. In that case, the average number of bits needed to represent the number you pick works out to be about 1.7.
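A quick numeric check of that example, under the assumption that "number of bits" means the binary length floor(log2 n) + 1 (Python's int.bit_length); other conventions, such as log2(n) + 1, give a slightly different figure in the same ballpark:

```python
# p(n) = 2**-n for n = 1, 2, 3, ... (the halving distribution above).
# Truncating at 60 terms is fine: the tail contributes less than 2**-60.
N = 60
expected_bits = sum((2.0 ** -n) * n.bit_length() for n in range(1, N + 1))
print(round(expected_bits, 3))  # ~1.633 under the bit_length convention
```

The key point survives either convention: the probabilities shrink fast enough that the weighted sum converges to a small finite value.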

In fact, any probability distribution on the naturals must assign vanishingly small probability to large numbers, and for rapidly decaying distributions like the one above the expected number of bits is finite. (It isn't finite for every distribution, though: a heavy-tailed choice such as p(n) proportional to 1/(n * (log n)^2) is a valid distribution whose expected bit length diverges.)

