This issue tracker has been migrated to GitHub, and is currently read-only.
For more information, see the GitHub FAQs in the Python Developer's Guide.

classification
Title: Possible deadlock on sys.stdout/stderr when combining multiprocessing with threads
Type: behavior Stage:
Components: Library (Lib) Versions: Python 3.7, Python 3.6, Python 3.5
process
Status: open Resolution:
Dependencies: Superseder:
Assigned To: Nosy List: Hadhoke, davin, pitrou
Priority: normal Keywords:

Created on 2016-10-06 23:20 by Hadhoke, last changed 2022-04-11 14:58 by admin.

Messages (3)
msg278221 - (view) Author: Alexis (Hadhoke) Date: 2016-10-06 23:20
I am launching a process inside a pool worker, using the multiprocessing module.
After a while, a deadlock happens when I try to join the process.

Here is a simple version of the code:

import sys, time, multiprocessing
from multiprocessing.pool import ThreadPool

def main():
    # Launch 8 workers
    pool = ThreadPool(8)
    it = pool.imap(run, range(500))
    while True:
        try:
            it.next()
        except StopIteration:
            break

def run(value):
    # Each worker launches its own Process
    process = multiprocessing.Process(target=run_and_might_segfault,
                                      args=(value,))
    process.start()

    while process.is_alive():
        sys.stdout.write('.')
        sys.stdout.flush()
        time.sleep(0.1)

    # After a while, this never returns, because of a mystery deadlock
    process.join()

def run_and_might_segfault(value):
    print(value)

if __name__ == '__main__':
    main()

And here is a possible output:

~ python m.py
..0
1
........8
.9
.......10
......11
........12
13
........14
........16
........................................................................................
As you can see, process.is_alive() is always true after a few iterations, and the process never joins.

If I Ctrl-C the script I get this stack trace:

Traceback (most recent call last):
  File "/usr/local/Cellar/python3/3.5.1/Frameworks/Python.framework/Versions/3.5/lib/python3.5/multiprocessing/pool.py", line 680, in next
    item = self._items.popleft()
IndexError: pop from an empty deque

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "m.py", line 30, in <module>
    main()
  File "m.py", line 9, in main
    it.next()
  File "/usr/local/Cellar/python3/3.5.1/Frameworks/Python.framework/Versions/3.5/lib/python3.5/multiprocessing/pool.py", line 684, in next
    self._cond.wait(timeout)
  File "/usr/local/Cellar/python3/3.5.1/Frameworks/Python.framework/Versions/3.5/lib/python3.5/threading.py", line 293, in wait
    waiter.acquire()
KeyboardInterrupt

Error in atexit._run_exitfuncs:
Traceback (most recent call last):
  File "/usr/local/Cellar/python3/3.5.1/Frameworks/Python.framework/Versions/3.5/lib/python3.5/multiprocessing/popen_fork.py", line 29, in poll
    pid, sts = os.waitpid(self.pid, flag)
KeyboardInterrupt

Using Python 3.5.1 on macOS; also tried 3.5.2, with the same issue.
Same result on Debian.
I tried Python 2.7 and it works fine. Maybe a Python 3.5-only issue?

Here is the link of the stackoverflow question:
http://stackoverflow.com/questions/39884898/large-amount-of-multiprocessing-process-causing-deadlock
msg298875 - (view) Author: Antoine Pitrou (pitrou) * (Python committer) Date: 2017-07-22 21:52
Ok, after a bit of diagnosing, the issue is the combination of multi-threading with use of fork().  The problem is that file objects (such as sys.stdout) have locks, and those locks may be held at the exact point where fork() happens, in which case the child will block when trying to take the lock again.

This is mostly a duplicate of issue 6721, but perhaps multiprocessing could at least improve things for sys.stdout and sys.stderr (though I'm not sure how).

This is also compounded by the fact that Process._bootstrap() flushes the standard streams at the end.
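The hazard Antoine describes can be illustrated with a minimal sketch (my own illustration, not code from CPython, and Unix-only since it calls os.fork() directly): a thread holds a lock at the moment of the fork, so the child inherits the lock in its locked state while the thread that would release it does not exist in the child.

```python
import os, threading, time

lock = threading.Lock()

def hold_lock():
    # Simulate a thread caught mid-write to sys.stdout: it holds the
    # lock across the moment the fork happens.
    with lock:
        time.sleep(1.0)

t = threading.Thread(target=hold_lock)
t.start()
time.sleep(0.1)  # let the thread acquire the lock first

pid = os.fork()  # same mechanism as multiprocessing's "fork" start method
if pid == 0:
    # Child: the lock was copied in its locked state, and the thread
    # holding it does not exist here, so nothing will ever release it.
    # A bare lock.acquire() would hang forever; a timeout lets us observe it.
    acquired = lock.acquire(timeout=0.5)
    os._exit(0 if acquired else 42)
else:
    t.join()
    _, status = os.waitpid(pid, 0)
    print('child saw the lock released:', os.WEXITSTATUS(status) == 0)
    # prints: child saw the lock released: False
```

The same thing happens inside the repro script: a pool thread may be holding sys.stdout's lock when another thread's Process.start() forks, and the child then hangs in its final stream flush.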
msg298900 - (view) Author: Antoine Pitrou (pitrou) * (Python committer) Date: 2017-07-23 12:16
Also, a reliable fix is to use the "forkserver" (or "spawn", but it is much slower) method:
https://docs.python.org/3/library/multiprocessing.html#contexts-and-start-methods
History
Date                 User     Action  Args
2022-04-11 14:58:37  admin    set     github: 72568
2017-07-23 12:16:39  pitrou   set     messages: + msg298900
2017-07-22 21:52:54  pitrou   set     nosy: + pitrou, davin
                                      title: Possible deadlock after many multiprocessing.Process are launch -> Possible deadlock on sys.stdout/stderr when combining multiprocessing with threads
                                      messages: + msg298875
                                      versions: + Python 3.6, Python 3.7
2016-10-06 23:20:37  Hadhoke  create