This work proposes a novel framework for comparing distinct iterative procedures with known rates of convergence in terms of the computational effort required to reach a prescribed vicinity of the optimal solution of a given problem. An algorithm is introduced that decides, for a prescribed error, which of two competing algorithms makes better use of the available computational resources. Several examples illustrate the trade-offs involved in this choice and demonstrate that preferring an algorithm with a lower rate of convergence can be perfectly justifiable in terms of the overall computational effort.
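As a minimal sketch of the trade-off (not the paper's actual decision procedure), suppose one method converges linearly, with error bound eps_k <= rho**k * eps0 and per-iteration cost cost_fast, while a competitor converges sublinearly, with eps_k <= C / k and cheaper per-iteration cost cost_slow. All rates, constants, and costs below are illustrative assumptions; the comparison simply multiplies the iteration count needed to reach a tolerance eps by the cost of one iteration.

```python
import math

def iters_linear(eps, eps0=1.0, rho=0.9):
    # Iterations needed under an assumed linear rate: eps_k <= rho**k * eps0.
    return math.ceil(math.log(eps / eps0) / math.log(rho))

def iters_sublinear(eps, C=1.0):
    # Iterations needed under an assumed sublinear rate: eps_k <= C / k.
    return math.ceil(C / eps)

def cheaper(eps, cost_fast=100.0, cost_slow=1.0):
    # Compare total effort = (per-iteration cost) * (iterations to reach eps).
    total_fast = cost_fast * iters_linear(eps)
    total_slow = cost_slow * iters_sublinear(eps)
    winner = "linear" if total_fast <= total_slow else "sublinear"
    return winner, total_fast, total_slow

for eps in (1e-1, 1e-3, 1e-6):
    print(eps, cheaper(eps))
```

Under these hypothetical parameters the sublinear method wins at loose tolerances (e.g., eps = 1e-1 or 1e-3), while the linearly convergent method wins at eps = 1e-6, illustrating why the faster rate is not always the cheaper choice overall.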