This issue tracker has been migrated to GitHub, and is currently read-only.
For more information, see the GitHub FAQs in Python's Developer Guide.

classification
Title: Cannot use ProcessPoolExecutor if in a decorator?
Type: behavior
Components: Library (Lib)
Versions: Python 3.6, Python 3.7, Python 3.8, Python 3.9

process
Status: open
Nosy List: aeros, alexandre.vassalotti, bob.fang.london, pitrou
Priority: normal

Created on 2020-08-18 17:21 by bob.fang.london, last changed 2022-04-11 14:59 by admin.

Messages (2)
msg375614 - (view) Author: Bob Fang (bob.fang.london) Date: 2020-08-18 17:21
I have this minimal example:

```
from functools import wraps
from concurrent import futures
import random

def decorator(func):
    num_process = 4

    def impl(*args, **kwargs):
        with futures.ProcessPoolExecutor() as executor:
            fs = []
            for i in range(num_process):
                fut = executor.submit(func, *args, **kwargs)
                fs.append(fut)
            result = []
            for f in futures.as_completed(fs):
                result.append(f.result())
        return result
    return impl

@decorator
def get_random_int():
    return random.randint(0, 100)


if __name__ == "__main__":
    result = get_random_int()
    print(result)
```
If we try to run this script, we get the following error:
```
_pickle.PicklingError: Can't pickle <function get_random_int at 0x7f06cee666a8>: it's not the same object as __main__.get_random_int
```
I think the main issue here is that the `wraps` decorator itself alters the `func` object and thus makes it impossible to pickle. I found this rather strange. Is there any way to get around this behavior? I would like to keep using `wraps` if possible. Thanks!
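[Editor's note: the error can be reproduced without an executor at all. pickle serializes plain functions by reference (module plus qualified name) and then checks that looking that name up again yields the same object; after decoration, the module-level name `get_random_int` is bound to `impl`, not to the original function. A minimal sketch of this, assuming top-level definitions as in the report:]

```
import pickle

def decorator(func):
    def impl(*args, **kwargs):
        return func(*args, **kwargs)
    return impl

@decorator
def get_random_int():
    return 7

# impl's closure still holds the original, undecorated function --
# the object that executor.submit() would try to pickle.
original = get_random_int.__closure__[0].cell_contents

try:
    pickle.dumps(original)
except pickle.PicklingError as exc:
    # "Can't pickle <function get_random_int ...>: it's not the same
    # object as ..." -- the name lookup now finds impl instead.
    print(exc)
```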
msg375632 - (view) Author: Kyle Stanley (aeros) * (Python committer) Date: 2020-08-19 03:12
Due to the way pickle works, it's not presently possible to serialize wrapped functions directly, at least not in a way that allows you to pass them to executor.submit() within the inner function (AFAICT). I'm also not certain what providing that would involve, or whether it could be done in a backwards-compatible manner.

In the meantime, this is a decent work-around:

```
from concurrent import futures
import random

class PPEWrapper:
    def __init__(self, func, num_proc=4):
        self.fn = func
        self.num_proc = num_proc
    
    def __call__(self, *args, **kwargs):
        with futures.ProcessPoolExecutor() as executor:
            fs = []
            for i in range(self.num_proc):
                fut = executor.submit(self.fn, *args, **kwargs)
                fs.append(fut)
            result = []
            for f in futures.as_completed(fs):
                result.append(f.result())
        return result


def _get_random_int():
    return random.randint(0, 100)

# it's often quite useful anyways to have a parallel and non-parallel version
# (for testing and devices that don't support MP)
get_random_int = PPEWrapper(_get_random_int, num_proc=4)

if __name__ == "__main__":
    result = get_random_int()
    print(result)
```

This doesn't allow you to use the decorator syntax, but it largely provides the same functionality. That being said, my familiarity with the pickle protocol isn't overly strong, so the above is mostly based on my own recent investigation. There could very well be a way to accomplish what you're looking for in a way that I was unable to determine.
History
2022-04-11 14:59:34  admin  set  github: 85749
2020-08-19 03:12:40  aeros  set  nosy: + pitrou, alexandre.vassalotti, aeros; messages: + msg375632
2020-08-18 17:34:53  bob.fang.london  set  versions: + Python 3.6, Python 3.7, Python 3.9
2020-08-18 17:21:46  bob.fang.london  create