I briefly wrote about this in a footnote of a different post, and yet I felt it deserved proper treatment. Here we go.

In the past, I have had discussions about what it exactly means when you say "program X is faster than program Y by so much". Something I noticed is that the speed of a program is often not well understood.

For example, if you say Program A is 25% faster than Program B, what do you mean exactly? Here are some of the interpretations I have encountered:

- If Program B's time is 100%, then Program A finished in 75% of that time.
- Take Program B's time, divide by Program A's time, and you obtain 1.25.
- Take Program B's time, subtract Program A's time, divide by Program B's time, multiply by 100%, and you obtain 25%.
- Same as above, divide by Program A's time instead.
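To see how these interpretations differ, here is a quick sketch in Python. The timings (10 seconds for Program B, 8 seconds for Program A) are made up for illustration:

```python
# Hypothetical timings: Program B takes 10 s, Program A takes 8 s.
time_b = 10.0
time_a = 8.0

# Interpretation 1: A's time as a percentage of B's time.
fraction_of_b = time_a / time_b * 100          # 80.0 (%)

# Interpretation 2: B's time divided by A's time.
ratio = time_b / time_a                        # 1.25

# Interpretation 3: time saved, relative to B's time.
saved_vs_b = (time_b - time_a) / time_b * 100  # 20.0 (%)

# Interpretation 4: time saved, relative to A's time.
saved_vs_a = (time_b - time_a) / time_a * 100  # 25.0 (%)

print(fraction_of_b, ratio, saved_vs_b, saved_vs_a)
```

Note that the same pair of measurements yields four different numbers, which is exactly why these conversations go in circles.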

What is interesting is that each of these appears to be consistent in its own context, sometimes with surprising consequences. For example, under one of the definitions above, a program can never be more than 100% faster than another one. I've run into all of these definitions in the wild, and in my experience each of them has strong believers.

I also strongly believe in one of the interpretations above because it is based on the definition of speed. So what is the speed of a program in the first place?

We are all familiar with what miles or kilometers per hour mean in terms of speed: some unit of distance traveled per unit of time. But what would be the distance to measure for a program? I think what makes sense is to treat the distance a program travels as its progress toward completing its assigned task: how much of its job it has finished.

Let's call the unit of measurement of distance traveled by a program an *iteration*. Therefore, our unit of speed is *iterations per second*.

So now, what does it mean exactly to say that Program A runs 25% faster than Program B? That Program A completes 25% more iterations than Program B in the same amount of time. Note that this means that each iteration of Program A will complete in 80% (not 75%) of the time it takes for Program B to complete an iteration, because 1/1.25 = 0.8.

Furthermore, what does it mean exactly to say that Program A runs, for example, 2 times faster than Program B? That Program A completes twice as many iterations as Program B in the same amount of time. Note that this means that each iteration of Program A will complete in 50% of the time it takes for Program B to complete an iteration, because 1/2 = 0.5.

As you can see, the assertions "runs 100% faster" and "runs 2x faster" are equivalent under these conditions.
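The conversion from speed ratio to time ratio can be sketched as a one-line helper, with the two examples from above:

```python
def time_ratio(speed_ratio):
    """Time per iteration of A relative to B, given A's speed relative to B."""
    return 1 / speed_ratio

print(time_ratio(1.25))  # 0.8 -> "25% faster" means 80% of the time per iteration
print(time_ratio(2.0))   # 0.5 -> "2x faster" (= "100% faster") means half the time
```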

Finally, what happens in edge cases, such as when Program B is 100 null operations and Program A has zero instructions? Then Program A is *infinitely faster* than Program B, because Program A can complete an infinite amount of zero-cost iterations with ease.

To summarize then,

- The speed of a program is the number of iterations per unit of time (e.g. seconds) it completes.
- Program A is x times faster than Program B if and only if Speed A / Speed B = x.
- Program A is x% faster than Program B if and only if (Speed A / Speed B) - 1 = x/100.
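The summary above can be sketched directly as code. The measurements here (500 iterations in 4 seconds versus 500 iterations in 5 seconds) are hypothetical:

```python
def speed(iterations, seconds):
    """Iterations completed per second."""
    return iterations / seconds

def times_faster(speed_a, speed_b):
    """A is x times faster than B iff Speed A / Speed B = x."""
    return speed_a / speed_b

def percent_faster(speed_a, speed_b):
    """A is x% faster than B iff (Speed A / Speed B) - 1 = x/100."""
    return (speed_a / speed_b - 1) * 100

# Hypothetical measurements: A does 500 iterations in 4 s, B does 500 in 5 s.
a = speed(500, 4)  # 125 iterations/s
b = speed(500, 5)  # 100 iterations/s
print(times_faster(a, b))    # 1.25
print(percent_faster(a, b))  # 25.0
```

With these definitions, "1.25 times faster" and "25% faster" are two spellings of the same measurement, which is the whole point.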

Have a nice day!