Author vstinner
Recipients brett.cannon, fijall, lemburg, ned.deily, pitrou, serhiy.storchaka, steven.daprano, tim.peters, vstinner
Date 2016-09-23.08:21:39
SpamBayes Score -1.0
Marked as misclassified Yes
Message-id <1474618900.19.0.83615071376.issue28240@psf.upfronthosting.co.za>
In-reply-to
Content
Marc-Andre: "Consensus then was to use the minimum as basis for benchmarking: (...) There are arguments both pro and con using min or avg values."

To be honest, I expected that most developers were already aware that the minimum is evil, so I wouldn't have to convince you. I already posted two links explaining the rationale. Since you are not convinced yet, it seems I have to prepare a better rationale :-)

Quick rationale: the purpose of displaying the average rather than the minimum in timeit is to make timeit more *reliable*. My goal is that running timeit 5 times gives exactly the same result. With the current design of timeit (only running one process), that is simply impossible (for the different reasons listed in my first article linked on this issue). But displaying the average is less bad than displaying the minimum for making results more reproducible.
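A minimal sketch of the difference being discussed (not the actual timeit patch; the benchmarked statement is an arbitrary example chosen for illustration): run the same micro-benchmark several times in one process, then report both the traditional minimum and the proposed average with its spread.

```python
import statistics
import timeit

# Run the same micro-benchmark 5 times in this process.
timings = timeit.repeat("sorted(range(1000))", repeat=5, number=1000)

# Traditional timeit report: only the single fastest run.
fastest = min(timings)

# Proposed report: the average over all runs, plus the standard
# deviation, which is less sensitive to which run happened to be
# luckiest on this particular machine at this particular moment.
mean = statistics.mean(timings)
stdev = statistics.stdev(timings)

print(f"min:  {fastest:.6f} s")
print(f"mean: {mean:.6f} s +- {stdev:.6f} s")
```

Re-running this script shows the point: the minimum jumps around with system noise from run to run, while the mean (reported together with its standard deviation) is more stable and also tells you how noisy the measurement was.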
History
Date                 User      Action  Args
2016-09-23 08:21:40  vstinner  set     recipients: + vstinner, lemburg, tim.peters, brett.cannon, pitrou, ned.deily, steven.daprano, fijall, serhiy.storchaka
2016-09-23 08:21:40  vstinner  set     messageid: <1474618900.19.0.83615071376.issue28240@psf.upfronthosting.co.za>
2016-09-23 08:21:40  vstinner  link    issue28240 messages
2016-09-23 08:21:39  vstinner  create