Title: Possible deadlock on sys.stdout/stderr when combining multiprocessing with threads
Type: behavior Stage:
Components: Library (Lib) Versions: Python 3.7, Python 3.6, Python 3.5
Status: open Resolution:
Dependencies: Superseder:
Assigned To: Nosy List: Hadhoke, davin, pitrou
Priority: normal Keywords:

Created on 2016-10-06 23:20 by Hadhoke, last changed 2017-07-23 12:16 by pitrou.

Messages (3)
msg278221 - (view) Author: Alexis (Hadhoke) Date: 2016-10-06 23:20
I am launching a process inside a pool worker, using the multiprocessing module.
After a while, a deadlock happens when I try to join the process.

Here is a simple version of the code:

import sys, time, multiprocessing
from multiprocessing.pool import ThreadPool

def main():
    # Launch 8 workers
    pool = ThreadPool(8)
    it = pool.imap(run, range(500))
    while True:
        try:
            next(it)
        except StopIteration:
            break

def run(value):
    # Each worker launches its own Process
    process = multiprocessing.Process(target=run_and_might_segfault, args=(value,))
    process.start()

    while process.is_alive():
        # Report progress on stdout while waiting for the child
        sys.stdout.write('.')
        sys.stdout.flush()
        time.sleep(0.1)

    # Will never join after a while, because of a mystery deadlock
    process.join()

def run_and_might_segfault(value):
    # Child workload (may crash in the real application)
    pass

if __name__ == '__main__':
    main()

And here is a possible output:

~ python
As you can see, process.is_alive() is always true after a few iterations; the process never joins.

If I CTRL-C the script, I get this stack trace:

Traceback (most recent call last):
  File "/usr/local/Cellar/python3/3.5.1/Frameworks/Python.framework/Versions/3.5/lib/python3.5/multiprocessing/", line 680, in next
    item = self._items.popleft()
IndexError: pop from an empty deque

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "", line 30, in <module>
  File "", line 9, in main
  File "/usr/local/Cellar/python3/3.5.1/Frameworks/Python.framework/Versions/3.5/lib/python3.5/multiprocessing/", line 684, in next
  File "/usr/local/Cellar/python3/3.5.1/Frameworks/Python.framework/Versions/3.5/lib/python3.5/", line 293, in wait

Error in atexit._run_exitfuncs:
Traceback (most recent call last):
  File "/usr/local/Cellar/python3/3.5.1/Frameworks/Python.framework/Versions/3.5/lib/python3.5/multiprocessing/", line 29, in poll
    pid, sts = os.waitpid(, flag)

Using Python 3.5.1 on macOS; I also tried 3.5.2 with the same issue.
Same result on Debian.
With Python 2.7 it works well. Maybe a Python 3.5-only issue?

Here is the link to the Stack Overflow question:
msg298875 - (view) Author: Antoine Pitrou (pitrou) * (Python committer) Date: 2017-07-22 21:52
Ok, after a bit of diagnosing, the issue is combining multi-threading with use of fork().  The problem is that file objects (such as sys.stdout) have locks, but those locks may be taken at the exact point where fork() happens, in which case the child blocks when trying to take the lock again.
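The hazard can be demonstrated in isolation. A minimal sketch (POSIX-only, since it uses os.fork() directly; an explicit threading.Lock stands in for the file object's internal lock):

```python
import os
import threading
import time

lock = threading.Lock()

def hold_lock():
    # Hold the lock long enough for the fork to happen while it is taken
    with lock:
        time.sleep(1.0)

t = threading.Thread(target=hold_lock)
t.start()
time.sleep(0.2)  # make sure hold_lock() has acquired the lock

pid = os.fork()
if pid == 0:
    # Child: the "locked" state was copied by fork(), but the thread that
    # would release it does not exist here, so acquire() can block forever.
    # A timeout is used here only so the demo terminates.
    acquired = lock.acquire(timeout=0.3)
    os._exit(0 if acquired else 1)

_, status = os.waitpid(pid, 0)
child_got_lock = (os.WEXITSTATUS(status) == 0)
print("child acquired the inherited lock:", child_got_lock)
t.join()
```

In the real bug there is no timeout: the child in Process._bootstrap() simply blocks forever on the inherited stdout lock, which is why process.is_alive() stays true.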

This is mostly a duplicate of issue 6721, but perhaps multiprocessing could at least improve things for sys.stdout and sys.stderr (though I'm not sure how).

This is also compounded by the fact that Process._bootstrap() flushes the standard streams at the end.
msg298900 - (view) Author: Antoine Pitrou (pitrou) * (Python committer) Date: 2017-07-23 12:16
Also, a reliable fix is to use the "forkserver" (or "spawn", but it is much slower) method:
Date                 User     Action  Args
2017-07-23 12:16:39  pitrou   set     messages: + msg298900
2017-07-22 21:52:54  pitrou   set     nosy: + pitrou, davin
                                      title: Possible deadlock after many multiprocessing.Process are launched -> Possible deadlock on sys.stdout/stderr when combining multiprocessing with threads
                                      messages: + msg298875
                                      versions: + Python 3.6, Python 3.7
2016-10-06 23:20:37  Hadhoke  create