# Counting large numbers of events in small registers

“What does the logarithm do?”, a professor asked a bunch of us back in 1990. “It is the inverse of $e^{x}$,” I said. “No,” he replied, “I ask again: what does the logarithm do?” A friend started to say something along the lines of the Wikipedia definition. “You tell me how it is calculated, but not what it stands for; why do we need it?” He then proceeded to ask us the value of the logarithm of some numbers:

| $x$ | $\log_{10}(x)$ |
|-------|----------------|
| 1     | 0 |
| 10    | 1 |
| 100   | 2 |
| 1000  | 3 |
| 10000 | 4 |

– Do you see a pattern now?

“The length of the number is $\lfloor \log_{10}(x) \rfloor + 1$,” one of us observed.

– Exactly!
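That observation is easy to check with a few lines of throwaway Python (my own sketch, not part of the original discussion):

```python
import math

# The decimal length of x is floor(log10(x)) + 1.
for x in (1, 7, 10, 99, 100, 1000, 10000):
    assert math.floor(math.log10(x)) + 1 == len(str(x))
```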

I was reminded of this discussion when I stumbled on Counting large numbers of events in small registers, written by Robert Morris (father of Robert T. Morris). What Morris observes in this paper is that with $n$ bits available, one can count up to $2^{n}-1$. He only had 8-bit registers available, so counting was limited to 255, which did not suit the application he had at hand. So he began by simply having the value of a register represent $v(n) = \log_{2}(1+n)$ and arrived at the generalized result $v(n) = \frac{\log(1+\frac{n}{a})}{\log(1+\frac{1}{a})}$, where $a$ controls both the maximum count that can be held in the register and the expected error.

The count represented by register value $v$ is $n_{v} = a((1+\frac{1}{a})^{v}-1)$. The variance of the estimate after $n$ counts is $\sigma^{2} = \frac{n(n-1)}{2a}$.
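A minimal sketch of such a counter in Python (my own illustration, not Morris's code; the class name and the choice $a = 10$ are mine): each event bumps the register only with probability $1/(n_{v+1}-n_{v}) = (1+\frac{1}{a})^{-v}$, so the register grows logarithmically while the formula above recovers the estimate.

```python
import random

class MorrisCounter:
    """Approximate event counter in the spirit of Morris; a sketch."""

    def __init__(self, a=10.0):
        self.a = a   # trades maximum count against expected error
        self.v = 0   # the small register value

    def increment(self):
        # Register value v stands for n_v = a*((1 + 1/a)**v - 1) events,
        # so consecutive values are n_{v+1} - n_v = (1 + 1/a)**v apart.
        # Bumping v with probability 1/(n_{v+1} - n_v) keeps the
        # estimate unbiased while v grows only logarithmically in n.
        if random.random() < (1.0 + 1.0 / self.a) ** (-self.v):
            self.v += 1

    def estimate(self):
        # Invert v(n) = log(1 + n/a) / log(1 + 1/a).
        return self.a * ((1.0 + 1.0 / self.a) ** self.v - 1.0)
```

With $a = 10$, an 8-bit register value of 255 represents roughly $10((1.1)^{255}-1) \approx 3.6 \times 10^{11}$ events, at the cost of a relative standard deviation of about $1/\sqrt{2a} \approx 22\%$.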

This is a really nice hack if you need to count large numbers of events and accuracy is not a must. It may be helpful to people working with embedded systems or other resource-constrained environments.

## 4 thoughts on “Counting large numbers of events in small registers”

1. The question was from Ζάχος (in my year, at least)

2. Ζάχος asked us that too. But not in the lecture hall; in his office, when we went to ask him about the average height of the binary tree.

3. I was under the impression that all “these” things are used so that Numb3rs have things to reference

This is obviously another way to say: “I didn’t understand shit!”

4. @Dimitrios Stergiou:
I think the notation scared you. Things are very simple:

log(10000) = 4 and the length of 10000 is 5 (4 + 1). log(9999) ≈ 3.9999, which floors to 3, and the length of 9999 is 4 (3 + 1). So one can say that the logarithm of a number gives you an estimate of its length in digits.

Now for the Morris trick. Assuming that you have an 8-bit register that you use as a counter, the maximum number you can count to is 255 (11111111). What if you want to count more events, accepting some error? Counting every other event gives you up to 510, with a 50% chance of having the exact number: 00001000 (8) means that you have counted either 15 or 16 events (e.g. people crossing the street).

Assuming that v(n) is the value of the register, using the $log_{2}(1+n)$ trick you can estimate counts up to $2^{255}-1$. A register value of 00001111 (15) means that n = 32767, while v(n) = 14 means n = 16383. To smooth out this large gap between the estimates for consecutive register values, he arrives at the more general formula. So when you need estimates of quantities and not exact values, this is a good trick (assuming that you cannot count otherwise).
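The figures above follow directly from the base-2 scheme, where a register value v stands for $n = 2^{v}-1$ (plain arithmetic, easy to verify):

```python
# For v(n) = log2(1 + n), a register value v stands for n = 2**v - 1.
assert 2 ** 15 - 1 == 32767  # register 00001111
assert 2 ** 14 - 1 == 16383  # register 00001110
# The gap between consecutive estimates doubles with each step of v;
# the general formula with parameter a replaces base 2 by the
# smaller base 1 + 1/a, giving finer-grained estimates.
```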

I was impressed by this technique because usually we are taught stuff, but not why stuff was invented. If you know why, you may use it in the most unexpected places.