The mean misleads: why the minimum is the true measure of a function’s run time

David Gilbertson
Published in Better Programming · 16 min read · May 24, 2023


Imagine: you’ve got a function and you’d like to know how long it takes to run, on average. So you run the function a few times, measuring how long it takes each time, then take the average of those times.
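For concreteness, here is a minimal sketch of that naive approach in Python (the article doesn't prescribe a language; `my_function`, the workload inside it, and the repeat count are placeholders of my own, not from the article):

```python
import statistics
import time

def my_function():
    # Stand-in for whatever function you want to measure.
    sum(i * i for i in range(10_000))

durations = []
for _ in range(100):
    start = time.perf_counter()
    my_function()
    durations.append(time.perf_counter() - start)

# The "obvious" answer: report the average of the measured times.
print(f"mean: {statistics.mean(durations) * 1e6:.1f} µs")
```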

That couldn’t possibly be wrong, could it?

In this post I will make the case that taking the mean is indeed the wrong approach most of the time, and that taking the minimum will give superior results. The reasoning behind this assertion is quite straightforward, and…
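To preview the intuition with a toy simulation (my own sketch, not the article's code): if each measurement is some fixed true cost plus noise that can only add time (the OS scheduling something else, garbage collection, cache misses), then the mean gets dragged upward by that noise, while the minimum stays close to the true cost. The numbers below are made up purely for illustration:

```python
import random
import statistics

TRUE_TIME_MS = 10.0  # hypothetical "real" run time of the function

# Each sample is the true time plus non-negative noise.
samples = [TRUE_TIME_MS + abs(random.gauss(0, 3)) for _ in range(1_000)]

print(f"mean: {statistics.mean(samples):.2f} ms")  # noticeably above 10 ms
print(f"min:  {min(samples):.2f} ms")              # very close to 10 ms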
