This issue tracker has been migrated to GitHub, and is currently read-only.
For more information, see the GitHub FAQs in the Python Developer's Guide.

Classification
Title: multiprocessing deadlock on Mac OS X when queue collected before process terminates
Type: behavior Stage: resolved
Components: Library (Lib) Versions: Python 3.1, Python 3.2, Python 2.7
Process
Status: closed Resolution: not a bug
Dependencies: Superseder:
Assigned To: jnoller Nosy List: asksol, bquinlan, davin, jnoller
Priority: normal Keywords:

Created on 2009-10-24 23:10 by bquinlan, last changed 2022-04-11 14:56 by admin. This issue is now closed.

Messages (4)
msg94440 - (view) Author: Brian Quinlan (bquinlan) * (Python committer) Date: 2009-10-24 23:10
This code:

import multiprocessing
import queue

def _process_worker(q):
    while True:
        try:
            something = q.get(block=True, timeout=0.1)
        except queue.Empty:
            return
        else:
            pass
            # print('Grabbed item from queue:', something)


def _make_some_processes(q):
    processes = []
    for _ in range(10):
        p = multiprocessing.Process(target=_process_worker, args=(q,))
        p.start()
        processes.append(p)
    return processes

#p = []
def _do(i):
    print('Run:', i)
    q = multiprocessing.Queue()
#    p.append(q)
    print('Created queue')
    for j in range(30):
        q.put(i*30+j)
    processes = _make_some_processes(q)
    print('Created processes')

    while not q.empty():
        pass
    print('Q is empty')

for i in range(100):
    _do(i)

Produces this output on Mac OS X (it produces the expected output on
Linux and Windows):

Run: 0
Created queue
Grabbed item from queue: 0
...
Grabbed item from queue: 29
Created processes
Q is empty
Run: 1
Created queue
Grabbed item from queue: 30
...
Grabbed item from queue: 59
Created processes
Q is empty
Run: 2
Created queue
Created processes
<no further output>

Changing the code as follows:

+ p = []
def _do(i):
    print('Run:', i)
    q = multiprocessing.Queue()
+   p.append(q)
    print('Created queue')
    for j in range(30):
        q.put(i*30+j)
    processes = _make_some_processes(q)
    print('Created processes')

    while not q.empty():
        pass
    print('Q is empty')

fixes the deadlock. So it looks like, if a multiprocessing.Queue is
collected while sub-processes are still using it, calling some methods on
other multiprocessing.Queue instances will deadlock.
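A workaround consistent with the observation above (a sketch of mine, not from the original report) is to guarantee that no child process is still using the queue when it goes out of scope, e.g. by joining the workers before the function returns instead of busy-waiting on q.empty():

```python
import multiprocessing
import queue

def _process_worker(q):
    # Drain the queue; exit once it has been empty for 0.1 s.
    while True:
        try:
            q.get(block=True, timeout=0.1)
        except queue.Empty:
            return

def _do(i):
    q = multiprocessing.Queue()
    for j in range(10):
        q.put(i * 10 + j)
    processes = [multiprocessing.Process(target=_process_worker, args=(q,))
                 for _ in range(4)]
    for p in processes:
        p.start()
    # Joining here ensures no child still holds the queue's pipe
    # when q is garbage-collected at the end of this call.
    for p in processes:
        p.join()

def run_demo(runs=3):
    for i in range(runs):
        _do(i)
    return True

if __name__ == '__main__':
    print(run_demo())
```

This keeps the queue alive for the full lifetime of its consumers, which is what appending the queue to the module-level list p also achieves, but without leaking one queue per run.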
msg119498 - (view) Author: Ask Solem (asksol) (Python committer) Date: 2010-10-24 08:29
Queue uses multiprocessing.util.Finalize, which uses weakrefs to track when the object is out of scope, so this is actually expected behavior.

IMHO it is not a very good approach, but changing the API to use explicit close methods is a little late at this point, I guess.
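To illustrate the mechanism described here: multiprocessing.util.Finalize registers a weakref callback on the tracked object, so cleanup fires whenever the object happens to be collected rather than at an explicit close() call. A minimal sketch (the Resource class and calls list are illustrative, not part of the issue):

```python
import gc
from multiprocessing import util

class Resource:
    """Stand-in for an object whose cleanup is tied to its lifetime."""
    pass

calls = []
r = Resource()
# Register a finalizer; it runs when r becomes unreachable.
util.Finalize(r, calls.append, args=('finalized',))

assert calls == []   # nothing has run yet
del r                # drop the last reference...
gc.collect()         # ...and make sure the object is actually collected
print(calls)
```

With a Queue, the analogous finalizer tears down the feeder thread and pipe, which is why collection order relative to the consuming processes matters.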
msg200327 - (view) Author: Brian Quinlan (bquinlan) * (Python committer) Date: 2013-10-18 23:15
OK, working as intended.
msg235902 - (view) Author: Davin Potts (davin) * (Python committer) Date: 2015-02-13 16:38
This issue was marked as "not a bug" by OP a while back but for whatever reason it did not also get marked as "closed".  Going ahead with closing it now.
History
Date                 User         Action  Args
2022-04-11 14:56:54  admin        set     github: 51449
2015-02-13 16:38:43  davin        set     status: open -> closed
                                          nosy: + davin
                                          messages: + msg235902
                                          stage: needs patch -> resolved
2013-10-18 23:15:40  bquinlan     set     resolution: not a bug
                                          messages: + msg200327
2010-10-24 08:29:48  asksol       set     nosy: + asksol
                                          messages: + msg119498
2010-07-11 09:59:44  BreamoreBoy  set     assignee: jnoller
                                          stage: needs patch
                                          nosy: + jnoller
                                          versions: + Python 3.1, Python 2.7
2009-10-24 23:10:35  bquinlan     create