Author yselivanov
Recipients gvanrossum, mark.dickinson, methane, ned.deily, skrah, yselivanov
Date 2018-01-26.21:06:38
SpamBayes Score -1.0
Marked as misclassified Yes
Message-id <>
> However, I could not find any tests for the added feature (safe
> use with async) though. We would be adding a new feature without
> tests.

This is no problem; I can add a few async/await tests.
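For reference, a minimal sketch of what such a test could check (the names and structure here are my own illustration, not the actual tests that will land): under PEP 567, asyncio.Task copies the current contextvars.Context at creation time, so a decimal context installed in one task should not leak into a sibling task.

```python
import asyncio
import decimal

async def worker(prec):
    # Each task installs its own decimal context.  Since asyncio.Task
    # copies the contextvars.Context at creation time, this setcontext()
    # should be invisible to sibling tasks.
    decimal.setcontext(decimal.Context(prec=prec))
    await asyncio.sleep(0)  # yield, letting the other task run
    return decimal.getcontext().prec

async def main():
    # Both tasks run concurrently; each should see only its own precision.
    return await asyncio.gather(worker(4), worker(12))

print(asyncio.run(main()))  # with contextvars-aware decimal: [4, 12]
```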

> I'm getting a large slowdown:
> ./python Modules/_decimal/tests/
> [..]
> patched:    [0.199, 0.206, 0.198, 0.199, 0.197, 0.202, 0.198, 0.201, 0.213, 0.199]
> status-quo: [0.187, 0.184, 0.185, 0.183, 0.184, 0.188, 0.184, 0.183, 0.183, 0.185]

I'd like you to elaborate a bit more here.  First, the benchmark produces completely different output from what you've quoted.  How exactly did you compile these results?  Are those numbers from the pi calculation or from the factorial?  Can you upload the actual script you used (if there is one)?

Second, here's my run of the benchmark with contextvars and without:

I don't see any difference, let alone a 10% slowdown.

> --------
> patched:    [0.535, 0.541, 0.523]
> status-quo: [0.412, 0.393, 0.375]

This benchmark is specially constructed to profile creating decimal contexts and doing almost nothing with them.

I've optimized PEP 567 for the ContextVar.get() operation, not ContextVar.set() (it's hard to make hamt.set() as fast as dict.set()).  That way, if you have some decimal code that performs actual calculations with decimal objects, looking up the current context is cheap.
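To illustrate that asymmetry, here is a hypothetical micro-benchmark (mine, not the script discussed above): get() on a ContextVar is a fast lookup, while set() has to produce a new node in the immutable HAMT mapping backing the context.

```python
import timeit
from contextvars import ContextVar

var = ContextVar('ctx', default=None)
var.set(object())

# get() is a cheap lookup; set() must build new HAMT state on every call.
get_time = timeit.timeit(var.get, number=100_000)
set_time = timeit.timeit(lambda: var.set(None), number=100_000)
print(f"get: {get_time:.4f}s  set: {set_time:.4f}s")
```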

It's hard to imagine a situation where real decimal-related code just creates decimal contexts and does nothing else with them.
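For contrast, a sketch (my own example, not the benchmark quoted above) of the shape real decimal code usually has: one context setup followed by many arithmetic operations, each of which only needs the cheap current-context lookup.

```python
import decimal

def approx_factorial(n, prec):
    # One context creation, then n multiplications; every operation
    # looks up the current context, which PEP 567 makes cheap.
    with decimal.localcontext() as ctx:
        ctx.prec = prec
        result = decimal.Decimal(1)
        for i in range(2, n + 1):
            result *= i
        return +result  # round to the context's precision

print(approx_factorial(50, 20))
```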
Date User Action Args
2018-01-26 21:06:38yselivanovsetrecipients: + yselivanov, gvanrossum, mark.dickinson, ned.deily, methane, skrah
2018-01-26 21:06:38yselivanovsetmessageid: <>
2018-01-26 21:06:38yselivanovlinkissue32630 messages
2018-01-26 21:06:38yselivanovcreate