
Author ncoghlan
Recipients abarry, brett.cannon, ncoghlan, rhettinger, serhiy.storchaka, vstinner, xiang.zhang
Date 2016-06-03.00:22:41
As Raymond notes, the main downside here is the added code complexity. However, the concrete gain is that APIs that rely on callable pickling, such as concurrent.futures with a ProcessPoolExecutor, would be consistently compatible with functools.partial:

>>> from concurrent.futures import ProcessPoolExecutor
>>> from functools import partial
>>> with ProcessPoolExecutor() as pool:
...     pool.submit(print, "hello")
...     pool.submit(partial(print, "hello"))
<Future at 0x7f4fdb47ce48 state=running>
<Future at 0x7f4fd4f9cb00 state=pending>

At the moment, such code will fail if _functools is unavailable, since closures don't support pickling (unpickling functions involves looking them up by name, which isn't possible with closures).
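The pickling difference is easy to demonstrate directly: the class-based partial round-trips through pickle, while an equivalent closure-based emulation (the `closure_partial` helper below is a hypothetical stand-in for the fallback implementation, not actual functools code) fails, because the inner function's qualified name contains `<locals>` and so can't be looked up by name on unpickling:

```python
import pickle
from functools import partial

def closure_partial(func, *args):
    """Hypothetical closure-based emulation of functools.partial."""
    def newfunc(*more_args, **keywords):
        return func(*args, *more_args, **keywords)
    return newfunc

# The class-based partial pickles and round-trips cleanly:
p = partial(max, 10)
restored = pickle.loads(pickle.dumps(p))
print(restored(3, 7))  # max(10, 3, 7) -> 10

# The closure-based version cannot be pickled at all:
try:
    pickle.dumps(closure_partial(max, 10))
except Exception as exc:
    # e.g. "Can't pickle local object 'closure_partial.<locals>.newfunc'"
    print("not picklable:", exc)
```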

The other main benefit is retaining the custom __repr__ when falling back to the Python implementation:

>>> partial(print, "hello")
functools.partial(<built-in function print>, 'hello')

At the moment, the closure-based version instead gives:

>>> partial(print, "hello")
<function functools.partial.<locals>.newfunc at 0x7f4fd6e0aea0>

Preserving those two capabilities seems sufficiently worthwhile to me to justify the extra code complexity and the greater speed penalty when the accelerator module isn't available. (I'm assuming that in runtimes using a JIT compiler, the speed difference should be negligible in practice.)