Hacker News

This is an extremely common mistake in reporting performance numbers. That the old version is 70% slower does not make the new version 70% faster.
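A quick worked example with hypothetical timings (170 s for the old version, 100 s for the new) makes the gap concrete: the same pair of measurements yields different percentages depending on which reading you pick.

```python
old_time = 170.0  # hypothetical: seconds for the old version
new_time = 100.0  # hypothetical: seconds for the new version

# "The old version is 70% slower": extra time, relative to the new time.
slower_pct = (old_time - new_time) / new_time * 100          # 70.0

# Naively flipping that into "the new version is 70% faster" is wrong
# if "faster" is read as time saved relative to the old time:
time_saved_pct = (old_time - new_time) / old_time * 100      # ~41.2

# It only comes out to 70% under the throughput reading
# (work done per unit time):
throughput_gain_pct = (old_time / new_time - 1) * 100        # 70.0

print(slower_pct, time_saved_pct, throughput_gain_pct)
```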


"70% slower" is a bit ambiguous, though: it could mean 70% extra runtime, or it could mean running at 30% of the new speed. Whereas "70% faster" would always suggest to me that it can do 70% more work in the same amount of time, i.e. a 1.7x increase in speed.


I do not agree. When you benchmark, you usually measure the difference in time to completion, because it is highly non-obvious that doubling the workload also doubles the time. Perhaps the algorithm is not linear. Perhaps with more data you suddenly need to swap memory. Perhaps something (like disk access happening in parallel) means it actually takes less than 2x the time. So a concept of "speed per unit of work" is undefined, and the only reasonable interpretation of "70% faster" is "takes 30% of the original time".


I can categorically state I've never thought of, or understood, "70% faster" as meaning that, and certainly not "100% faster" as meaning "completes instantly".

I see the OP has solved the problem by removing any reference to how much faster it is from the article title!

You're right about non-linear algorithms, though. If an O(n^2) algorithm is 2x / 100% faster, it can't process 100% more items in the same time, but I'd understand it to mean taking half the time for the same n.
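A sketch of that point, using a hypothetical O(n^2) algorithm whose constant factor is halved: with twice the speed, the same time budget only buys about 41% more items (a factor of sqrt(2)), not 100% more.

```python
import math

def items_in_budget(time_budget, cost_per_item_sq):
    """Largest n an O(n^2) algorithm (time = c * n**2) fits in the budget."""
    return math.floor(math.sqrt(time_budget / cost_per_item_sq))

c = 1e-6        # hypothetical constant factor (seconds per item^2)
budget = 10.0   # hypothetical time budget in seconds

n_old = items_in_budget(budget, c)
n_new = items_in_budget(budget, c / 2)  # "2x faster": constant halved

print(n_new / n_old)  # ~1.414, i.e. only ~41% more items
```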


But under your definition, if something becomes "three times as fast" (i.e., 200% faster), it would have to finish its task in negative time!
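Under the time-reduction reading ("X% faster" = "takes X% less time"), the arithmetic does indeed go negative past 100%, while the throughput reading stays sensible at any percentage. A small sketch with illustrative numbers:

```python
def new_time_time_reduction(old_time, faster_pct):
    # "faster_pct% faster" read as "takes faster_pct% less time"
    return old_time * (1 - faster_pct / 100)

def new_time_throughput(old_time, faster_pct):
    # "faster_pct% faster" read as "does faster_pct% more work per unit time"
    return old_time / (1 + faster_pct / 100)

# "Three times as fast", i.e. 200% faster, starting from 60 seconds:
print(new_time_time_reduction(60, 200))  # -60.0: negative time, absurd
print(new_time_throughput(60, 200))      # 20.0: one third of the time
```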



