Message257880
So min is the fastest time across a benchmark's executions, average is the mean across all executions, and the t-value comes from a two-sample Student's t-test, which is what decides whether a difference is significant. Anything marked insignificant had such a small average difference that it isn't worth reporting.
If you want to smooth out the numbers, you should do a rigorous run that uses more loops per benchmark, which helps dampen outliers. And if you are doing this on Linux, there is a flag to measure memory usage as well.
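As a rough illustration of what that t-value represents, here is a sketch of a two-sample Student's t statistic computed over two sets of benchmark timings. The function name and the sample timings are hypothetical, not taken from the benchmark suite itself:

```python
import math

def t_value(a, b):
    # Two-sample Student's t statistic for equal-sized samples:
    # difference of means divided by the pooled standard error.
    # A large |t| suggests the timing difference is significant.
    n = len(a)
    mean_a = sum(a) / n
    mean_b = sum(b) / n
    var_a = sum((x - mean_a) ** 2 for x in a) / (n - 1)
    var_b = sum((x - mean_b) ** 2 for x in b) / (n - 1)
    se = math.sqrt((var_a + var_b) / n)  # pooled standard error
    return (mean_a - mean_b) / se

# Hypothetical timings (seconds) from two runs of the same benchmark.
base = [1.02, 1.05, 1.01, 1.04, 1.03]
changed = [0.91, 0.93, 0.92, 0.95, 0.90]

print("min:", min(changed))
print("average:", sum(changed) / len(changed))
print("t-value:", t_value(base, changed))
```

With more loops per benchmark (a larger sample), the variance terms shrink and outliers pull the mean around less, which is why a rigorous run produces steadier numbers.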
Date                | User         | Action | Args
2016-01-10 01:07:19 | brett.cannon | set    | recipients: + brett.cannon, pitrou, vstinner
2016-01-10 01:07:19 | brett.cannon | set    | messageid: <1452388039.55.0.280642550006.issue26058@psf.upfronthosting.co.za>
2016-01-10 01:07:19 | brett.cannon | link   | issue26058 messages
2016-01-10 01:07:18 | brett.cannon | create |