Message310821
> Likewise, on the same builds, running _decimal/tests/bench.py does not show a significant difference: https://gist.github.com/elprans/fb31510ee28a3aa091aee3f42fe65e00
Note: it may be interesting to rewrite this benchmark with my perf module, to be able to easily check whether a benchmark result is significant.
http://perf.readthedocs.io/en/latest/cli.html#perf-compare-to
"perf determines whether two samples differ significantly using a Student’s two-sample, two-tailed t-test with alpha equals to 0.95."
=> https://en.wikipedia.org/wiki/Student's_t-test
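As a rough illustration (this is not perf's actual implementation, just a sketch of the idea), a two-sample, two-tailed Student's t-test over two lists of timings might look like this; the `t_score` and `is_significant` helpers and the `critical` threshold are hypothetical names chosen for the example:

```python
import math
from statistics import mean, stdev

def t_score(a, b):
    """Two-sample t statistic with a pooled standard deviation."""
    n1, n2 = len(a), len(b)
    s1, s2 = stdev(a), stdev(b)
    pooled = math.sqrt(((n1 - 1) * s1 ** 2 + (n2 - 1) * s2 ** 2)
                       / (n1 + n2 - 2))
    return (mean(a) - mean(b)) / (pooled * math.sqrt(1 / n1 + 1 / n2))

def is_significant(a, b, critical=2.0):
    # critical ~2.0 roughly matches a two-tailed test at the usual
    # 95% confidence level for a few dozen degrees of freedom; a real
    # tool would look the exact value up in a t-distribution table.
    return abs(t_score(a, b)) > critical
```

For example, `is_significant(timings_before, timings_after)` would return True when the two runs clearly differ, and False when the difference is lost in the noise.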
Usually, I consider that anything between 5% slower and 5% faster is not significant. But it depends on how the benchmark was run, on the type of benchmark, etc. Here I don't know bench.py, so I cannot judge.
For example, for an optimization, I'm more interested in a change making a benchmark 10% faster ;-)
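The rule of thumb above can be sketched as a tiny helper; the `verdict` function and its `threshold` parameter are hypothetical names for this example, not part of perf:

```python
def verdict(old_mean, new_mean, threshold=0.05):
    """Classify a mean-timing change using a +/-5% 'noise band'."""
    change = (new_mean - old_mean) / old_mean
    if abs(change) < threshold:
        return "not significant"
    # Lower timings mean the new build is faster.
    return "faster" if change < 0 else "slower"
```

For instance, a move from 1.00s to 1.02s falls inside the band, while 1.00s to 0.90s would count as a meaningful speedup.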
| Date | User | Action | Args |
| --- | --- | --- | --- |
| 2018-01-26 23:16:30 | vstinner | set | recipients: + vstinner, gvanrossum, mark.dickinson, ned.deily, methane, skrah, yselivanov |
| 2018-01-26 23:16:30 | vstinner | set | messageid: <1517008590.46.0.467229070634.issue32630@psf.upfronthosting.co.za> |
| 2018-01-26 23:16:30 | vstinner | link | issue32630 messages |
| 2018-01-26 23:16:30 | vstinner | create | |