
Classification
Title: Possible race condition in Pool
Type: behavior
Stage:
Components: Library (Lib)
Versions: Python 3.4, Python 3.5, Python 3.8

Process
Status: open
Resolution:
Dependencies:
Superseder:
Assigned To:
Nosy List: Stanislaw Izaak Pitucha, anilb, davin
Priority: normal
Keywords:

Created on 2015-09-10 11:33 by Stanislaw Izaak Pitucha, last changed 2022-04-11 14:58 by admin.

Messages (9)
msg250362 - (view) Author: Stanislaw Izaak Pitucha (Stanislaw Izaak Pitucha) Date: 2015-09-10 11:33
This is something that happened once and I cannot reproduce any more. In an IPython session I ran the lines below, and pool.map raised its internal exceptions in a case where it shouldn't have. Then it worked again (see In/Out 29).

Running exactly the same lines again did not result in the same behaviour. I tried multiple times, in the same session as well as in new sessions. It looks like a weird race / corruption.

I'm on Ubuntu's Python '3.4.3 (default, Mar 26 2015, 22:03:40) \n[GCC 4.9.2]', running with 2 virtualised cores (VMware) on a 64-bit system.

In [21]: import multiprocessing
In [22]: p=multiprocessing.Pool()
... (unrelated, checked p.map? and other helps)
In [26]: class A(object):
    def a(self, i):
        print("boo", i, multiprocessing.current_process())
   ....: 
In [27]: obj = A()
In [28]: p.map(obj.a, [1,2,3,4])
Process ForkPoolWorker-1:
Process ForkPoolWorker-2:
Traceback (most recent call last):
Traceback (most recent call last):
  File "/usr/lib/python3.4/multiprocessing/process.py", line 254, in _bootstrap
    self.run()
  File "/usr/lib/python3.4/multiprocessing/process.py", line 254, in _bootstrap
    self.run()
  File "/usr/lib/python3.4/multiprocessing/process.py", line 93, in run
    self._target(*self._args, **self._kwargs)
  File "/usr/lib/python3.4/multiprocessing/process.py", line 93, in run
    self._target(*self._args, **self._kwargs)
  File "/usr/lib/python3.4/multiprocessing/pool.py", line 108, in worker
    task = get()
  File "/usr/lib/python3.4/multiprocessing/pool.py", line 108, in worker
    task = get()
  File "/usr/lib/python3.4/multiprocessing/queues.py", line 357, in get
    return ForkingPickler.loads(res)
AttributeError: Can't get attribute 'A' on <module '__main__'>
  File "/usr/lib/python3.4/multiprocessing/queues.py", line 357, in get
    return ForkingPickler.loads(res)
AttributeError: Can't get attribute 'A' on <module '__main__'>
boo 3 <ForkProcess(ForkPoolWorker-3, started daemon)>
boo 4 <ForkProcess(ForkPoolWorker-3, started daemon)>

^CProcess ForkPoolWorker-4:
Process ForkPoolWorker-3:
Traceback (most recent call last):
Traceback (most recent call last):
  File "/usr/lib/python3.4/multiprocessing/process.py", line 254, in _bootstrap
    self.run()
  File "/usr/lib/python3.4/multiprocessing/process.py", line 93, in run
    self._target(*self._args, **self._kwargs)
  File "/usr/lib/python3.4/multiprocessing/process.py", line 254, in _bootstrap
    self.run()
  File "/usr/lib/python3.4/multiprocessing/process.py", line 93, in run
    self._target(*self._args, **self._kwargs)
  File "/usr/lib/python3.4/multiprocessing/pool.py", line 108, in worker
    task = get()
  File "/usr/lib/python3.4/multiprocessing/pool.py", line 108, in worker
    task = get()
  File "/usr/lib/python3.4/multiprocessing/queues.py", line 354, in get
    with self._rlock:
  File "/usr/lib/python3.4/multiprocessing/queues.py", line 355, in get
    res = self._reader.recv_bytes()
  File "/usr/lib/python3.4/multiprocessing/synchronize.py", line 96, in __enter__
    return self._semlock.__enter__()
KeyboardInterrupt
  File "/usr/lib/python3.4/multiprocessing/connection.py", line 216, in recv_bytes
    buf = self._recv_bytes(maxlength)
  File "/usr/lib/python3.4/multiprocessing/connection.py", line 416, in _recv_bytes
    buf = self._recv(4)
  File "/usr/lib/python3.4/multiprocessing/connection.py", line 383, in _recv
    chunk = read(handle, remaining)
KeyboardInterrupt
---------------------------------------------------------------------------
KeyboardInterrupt                         Traceback (most recent call last)
<ipython-input-28-3be6fbcc359b> in <module>()
----> 1 p.map(obj.a, [1,2,3,4])

/usr/lib/python3.4/multiprocessing/pool.py in map(self, func, iterable, chunksize)
    258         in a list that is returned.
    259         '''
--> 260         return self._map_async(func, iterable, mapstar, chunksize).get()
    261 
    262     def starmap(self, func, iterable, chunksize=None):

/usr/lib/python3.4/multiprocessing/pool.py in get(self, timeout)
    591 
    592     def get(self, timeout=None):
--> 593         self.wait(timeout)
    594         if not self.ready():
    595             raise TimeoutError

/usr/lib/python3.4/multiprocessing/pool.py in wait(self, timeout)
    588 
    589     def wait(self, timeout=None):
--> 590         self._event.wait(timeout)
    591 
    592     def get(self, timeout=None):

/usr/lib/python3.4/threading.py in wait(self, timeout)
    551             signaled = self._flag
    552             if not signaled:
--> 553                 signaled = self._cond.wait(timeout)
    554             return signaled
    555         finally:

/usr/lib/python3.4/threading.py in wait(self, timeout)
    288         try:    # restore state no matter what (e.g., KeyboardInterrupt)
    289             if timeout is None:
--> 290                 waiter.acquire()
    291                 gotit = True
    292             else:

KeyboardInterrupt: 

In [29]: p.map(obj.a, [1,2,3,4])
boo 1 <ForkProcess(ForkPoolWorker-5, started daemon)>
boo 2 <ForkProcess(ForkPoolWorker-6, started daemon)>
boo 3 <ForkProcess(ForkPoolWorker-5, started daemon)>
boo 4 <ForkProcess(ForkPoolWorker-6, started daemon)>
Out[29]: [None, None, None, None]
msg250393 - (view) Author: Davin Potts (davin) * (Python committer) Date: 2015-09-10 17:09
I have been able to reproduce the behavior you described under 3.5 under one set of circumstances so far.

When attempting to use classes, functions, or other objects not defined in an importable module, an AttributeError is expected, as you observed.  After that, the viability of the existing Pool to perform further useful work has been compromised, and no promises are made regarding its continued behavior.

In general, following the recommendations in https://docs.python.org/3.4/library/multiprocessing.html#multiprocessing-programming avoids these situations.
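For illustration, a minimal sketch of that guidance (not from this report), assuming a hypothetical module file worker_mod.py; because the class is defined in an importable module, the pool workers can unpickle references to it:

# worker_mod.py -- hypothetical importable module holding the worker class
class A:
    def a(self, i):
        return ("boo", i)

# main.py, or any session that can import worker_mod
import multiprocessing
from worker_mod import A

if __name__ == '__main__':
    obj = A()
    with multiprocessing.Pool() as p:
        print(p.map(obj.a, [1, 2, 3, 4]))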

Understanding the exact circumstances under which viable work can continue with a locally defined function/class (one not in an importable module) is perhaps worth further investigation.

That said, multiple places in the multiprocessing docs (including the introductory paragraphs) provide the guidance to always use functions/classes whose definitions are importable.  Hence, I'm inclined to mark this mentally as "interesting" but on the bug tracker as "not a bug".  Any objections?
msg250436 - (view) Author: Stanislaw Izaak Pitucha (Stanislaw Izaak Pitucha) Date: 2015-09-11 02:27
I mostly agree with what you said. I wonder, though, whether the message in the docs could be clearer. I remembered the rule about the module having to be importable without side effects, but didn't even consider that it applied to an interactive session.

If I understand your analysis correctly, you're basically saying that "don't use multiprocessing" could be a general rule for interactive interpreters, which would also affect any IPython session, including IPython notebooks? (This seems to be confirmed if you google for "multiprocessing ipython crash"...) If that's true, maybe that specific case should be spelled out more explicitly in the docs.
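For illustration, a minimal sketch (not from the original session) of that distinction under the fork start method, mirroring the ordering used above, where the Pool's workers are forked before the local definition exists:

import multiprocessing
from math import sqrt              # importable: the workers can always look this up

p = multiprocessing.Pool()         # the worker processes are forked here

def local_f(x):                    # defined only in the session's __main__, after the fork
    return x * x

print(p.map(sqrt, [1, 4, 9, 16]))  # fine: sqrt is resolved by import in the workers
# p.map(local_f, [1, 2, 3])        # the analogous call raised AttributeError in the session above
p.close()
p.join()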
msg381029 - (view) Author: Anil Bishnoie (anilb) Date: 2020-11-15 20:16
I still get this error while processing multiple tasks in Python 3.8.
Has this bug been resolved, and where can I find the resolution?
msg381030 - (view) Author: Anil Bishnoie (anilb) Date: 2020-11-15 20:19
Traceback (most recent call last):
  File "C:\Python38\lib\multiprocessing\process.py", line 315, in _bootstrap
    self.run()
  File "C:\Python38\lib\multiprocessing\process.py", line 108, in run
    self._target(*self._args, **self._kwargs)
  File "C:\Python38\lib\multiprocessing\pool.py", line 114, in worker
    task = get()
  File "C:\Python38\lib\multiprocessing\queues.py", line 358, in get
    return _ForkingPickler.loads(res)
AttributeError: Can't get attribute 'ld_df1' on <module '__main__' (built-in)>
msg381031 - (view) Author: Anil Bishnoie (anilb) Date: 2020-11-15 20:21
Before running the Pool, I checked whether my function existed in memory; it was there, well defined at an address, and it works fine otherwise:

>>> ld_df1
<function ld_df1 at 0x062D6DF0>
msg381032 - (view) Author: Anil Bishnoie (anilb) Date: 2020-11-15 20:23
from multiprocessing import Pool

l_adt = ldir()            # ldir(), l_stk and ld_df1 are defined elsewhere in the poster's session
l_ln = len(l_stk)
p = Pool(processes=l_ln)
df = p.map(ld_df1, [i for i in l_adt])

Here l_ln is 12, so 12 worker processes are used. The above is the code snippet; it should be self-explanatory.

Is there a resolution?
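A minimal sketch of how such a snippet is usually restructured on Windows, where the spawn start method makes each worker re-import __main__; ld_df1 and ldir below are placeholders standing in for the original, undefined helpers:

# load_frames.py -- hypothetical script; run as a file rather than pasted into a REPL
from multiprocessing import Pool

def ld_df1(path):
    # placeholder for the original loader; it must live at module level
    # so the spawned workers can find it when they import __main__
    return path.upper()

def ldir():
    # placeholder for the original directory-listing helper
    return ["a.csv", "b.csv", "c.csv"]

if __name__ == '__main__':
    # the guard is required with spawn: workers re-import this module
    # and must not re-run the Pool setup themselves
    l_adt = ldir()
    with Pool(processes=len(l_adt)) as p:
        df = p.map(ld_df1, l_adt)
    print(df)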
msg381035 - (view) Author: Anil Bishnoie (anilb) Date: 2020-11-15 21:28
Process SpawnPoolWorker-36:
Process SpawnPoolWorker-28:
Process SpawnPoolWorker-33:
Process SpawnPoolWorker-30:
Process SpawnPoolWorker-34:
Process SpawnPoolWorker-32:
Process SpawnPoolWorker-35:
Process SpawnPoolWorker-38:
Process SpawnPoolWorker-37:
Process SpawnPoolWorker-31:
Process SpawnPoolWorker-29:
Process SpawnPoolWorker-26:
Traceback (most recent call last):
  File "C:\Python38\lib\multiprocessing\process.py", line 315, in _bootstrap
    self.run()
  File "C:\Python38\lib\multiprocessing\process.py", line 108, in run
    self._target(*self._args, **self._kwargs)
  File "C:\Python38\lib\multiprocessing\pool.py", line 114, in worker
    task = get()
  File "C:\Python38\lib\multiprocessing\queues.py", line 356, in get
    res = self._reader.recv_bytes()
  File "C:\Python38\lib\multiprocessing\connection.py", line 216, in recv_bytes
    buf = self._recv_bytes(maxlength)
  File "C:\Python38\lib\multiprocessing\connection.py", line 305, in _recv_bytes
    waitres = _winapi.WaitForMultipleObjects(
KeyboardInterrupt
Traceback (most recent call last):
Traceback (most recent call last):
Traceback (most recent call last):
Traceback (most recent call last):
Traceback (most recent call last):
  File "C:\Python38\lib\multiprocessing\process.py", line 315, in _bootstrap
    self.run()
Traceback (most recent call last):
Traceback (most recent call last):
  File "C:\Python38\lib\multiprocessing\process.py", line 315, in _bootstrap
    self.run()
Traceback (most recent call last):
  File "C:\Python38\lib\multiprocessing\process.py", line 315, in _bootstrap
    self.run()
  File "C:\Python38\lib\multiprocessing\process.py", line 108, in run
    self._target(*self._args, **self._kwargs)
  File "C:\Python38\lib\multiprocessing\process.py", line 315, in _bootstrap
    self.run()
  File "C:\Python38\lib\multiprocessing\process.py", line 315, in _bootstrap
    self.run()
  File "C:\Python38\lib\multiprocessing\process.py", line 108, in run
    self._target(*self._args, **self._kwargs)
Traceback (most recent call last):
  File "C:\Python38\lib\multiprocessing\process.py", line 108, in run
    self._target(*self._args, **self._kwargs)
  File "C:\Python38\lib\multiprocessing\pool.py", line 114, in worker
    task = get()
  File "C:\Python38\lib\multiprocessing\process.py", line 315, in _bootstrap
    self.run()
  File "C:\Python38\lib\multiprocessing\pool.py", line 114, in worker
    task = get()
  File "C:\Python38\lib\multiprocessing\process.py", line 108, in run
    self._target(*self._args, **self._kwargs)
  File "C:\Python38\lib\multiprocessing\pool.py", line 114, in worker
    task = get()
  File "C:\Python38\lib\multiprocessing\queues.py", line 355, in get
    with self._rlock:
  File "C:\Python38\lib\multiprocessing\synchronize.py", line 95, in __enter__
    return self._semlock.__enter__()
KeyboardInterrupt
  File "C:\Python38\lib\multiprocessing\queues.py", line 355, in get
    with self._rlock:
  File "C:\Python38\lib\multiprocessing\synchronize.py", line 95, in __enter__
    return self._semlock.__enter__()
KeyboardInterrupt
  File "C:\Python38\lib\multiprocessing\process.py", line 108, in run
    self._target(*self._args, **self._kwargs)
  File "C:\Python38\lib\multiprocessing\pool.py", line 114, in worker
    task = get()
  File "C:\Python38\lib\multiprocessing\queues.py", line 355, in get
    with self._rlock:
  File "C:\Python38\lib\multiprocessing\synchronize.py", line 95, in __enter__
    return self._semlock.__enter__()
  File "C:\Python38\lib\multiprocessing\process.py", line 108, in run
    self._target(*self._args, **self._kwargs)
  File "C:\Python38\lib\multiprocessing\pool.py", line 114, in worker
    task = get()
  File "C:\Python38\lib\multiprocessing\queues.py", line 355, in get
    with self._rlock:
  File "C:\Python38\lib\multiprocessing\synchronize.py", line 95, in __enter__
    return self._semlock.__enter__()
KeyboardInterrupt
KeyboardInterrupt
msg381037 - (view) Author: Anil Bishnoie (anilb) Date: 2020-11-15 22:27
Fatal Python error: init_sys_streams: can't initialize sys standard streams
Python runtime state: core initialized

  File "C:\Python38\lib\multiprocessing\process.py", line 315, in _bootstrap


During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "<string>", line 1, in <module>
  File "C:\Python38\lib\multiprocessing\spawn.py", line 116, in spawn_main
    exitcode = _main(fd, parent_sentinel)
  File "C:\Python38\lib\multiprocessing\spawn.py", line 129, in _main
    return self._bootstrap(parent_sentinel)
  File "C:\Python38\lib\multiprocessing\process.py", line 331, in _bootstrap
    traceback.print_exc()
  File "C:\Python38\lib\traceback.py", line 163, in print_exc
    print_exception(*sys.exc_info(), limit=limit, file=file, chain=chain)
  File "C:\Python38\lib\traceback.py", line 105, in print_exception
    print(line, file=file, end="")
KeyboardInterrupt
    addpackage(sitedir, name, known_paths)
  File "C:\Python38\lib\site.py", line 169, in addpackage
    exec(line)
  File "<string>", line 1, in <module>
  File "C:\Python38\lib\importlib\util.py", line 14, in <module>
    from contextlib import contextmanager
  File "C:\Python38\lib\contextlib.py", line 5, in <module>
    from collections import deque
  File "C:\Python38\lib\collections\__init__.py", line 22, in <module>
    from keyword import iskeyword as _iskeyword
  File "<frozen importlib._bootstrap>", line 991, in _find_and_load
  File "<frozen importlib._bootstrap>", line 975, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 671, in _load_unlocked
  File "<frozen importlib._bootstrap_external>", line 779, in exec_module
  File "<frozen importlib._bootstrap_external>", line 874, in get_code
  File "<frozen importlib._bootstrap_external>", line 972, in get_data
History
Date                  User                      Action   Args
2022-04-11 14:58:20   admin                     set      github: 69240
2020-11-15 22:27:31   anilb                     set      messages: + msg381037
2020-11-15 21:28:28   anilb                     set      messages: + msg381035
2020-11-15 20:23:56   anilb                     set      messages: + msg381032
2020-11-15 20:21:36   anilb                     set      messages: + msg381031
2020-11-15 20:19:28   anilb                     set      messages: + msg381030
2020-11-15 20:16:40   anilb                     set      nosy: + anilb; messages: + msg381029; versions: + Python 3.8
2015-09-11 02:27:22   Stanislaw Izaak Pitucha   set      status: pending -> open; messages: + msg250436
2015-09-10 19:33:51   davin                     set      status: open -> pending; versions: + Python 3.5
2015-09-10 17:09:55   davin                     set      nosy: + davin; messages: + msg250393
2015-09-10 11:33:42   Stanislaw Izaak Pitucha   create