Message94440
This code:

    import multiprocessing
    import queue

    def _process_worker(q):
        while True:
            try:
                something = q.get(block=True, timeout=0.1)
            except queue.Empty:
                return
            else:
                pass
                # print('Grabbed item from queue:', something)

    def _make_some_processes(q):
        processes = []
        for _ in range(10):
            p = multiprocessing.Process(target=_process_worker, args=(q,))
            p.start()
            processes.append(p)
        return processes

    # p = []
    def _do(i):
        print('Run:', i)
        q = multiprocessing.Queue()
        # p.append(q)
        print('Created queue')
        for j in range(30):
            q.put(i*30+j)
        processes = _make_some_processes(q)
        print('Created processes')
        while not q.empty():
            pass
        print('Q is empty')

    for i in range(100):
        _do(i)
Produces this output on Mac OS X (it produces the expected output on
Linux and Windows):
Run: 0
Created queue
Grabbed item from queue: 0
...
Grabbed item from queue: 29
Created processes
Q is empty
Run: 1
Created queue
Grabbed item from queue: 30
...
Grabbed item from queue: 59
Created processes
Q is empty
Run: 2
Created queue
Created processes
<no further output>
Changing the code as follows:

    + p = []
      def _do(i):
          print('Run:', i)
          q = multiprocessing.Queue()
    +     p.append(q)
          print('Created queue')
          for j in range(30):
              q.put(i*30+j)
          processes = _make_some_processes(q)
          print('Created processes')
          while not q.empty():
              pass
          print('Q is empty')
fixes the deadlock. So it looks like if a multiprocessing.Queue is
collected while sub-processes are still using it, then calling some
methods on other multiprocessing.Queues will deadlock.
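An alternative workaround (a sketch of my own, not code from this report; the helper names are made up) is to keep the queue alive explicitly until the workers are done, by joining the processes before the queue goes out of scope and then closing the queue's feeder thread:

    import multiprocessing
    import queue

    def _worker(q):
        # Drain the queue; exit once it has been empty for 0.1s.
        while True:
            try:
                q.get(block=True, timeout=0.1)
            except queue.Empty:
                return

    def _do(i):
        q = multiprocessing.Queue()
        for j in range(30):
            q.put(i * 30 + j)
        workers = [multiprocessing.Process(target=_worker, args=(q,))
                   for _ in range(10)]
        for w in workers:
            w.start()
        for w in workers:
            w.join()        # queue cannot be collected while workers run
        q.close()           # flush this process's feeder thread...
        q.join_thread()     # ...and wait for it to finish

This avoids keeping every queue reachable in a module-level list, at the cost of blocking in _do() until the workers exit.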
Date                | User     | Action | Args
2009-10-24 23:10:37 | bquinlan | set    | recipients: + bquinlan
2009-10-24 23:10:37 | bquinlan | set    | messageid: <1256425837.07.0.628633816678.issue7200@psf.upfronthosting.co.za>
2009-10-24 23:10:35 | bquinlan | link   | issue7200 messages
2009-10-24 23:10:35 | bquinlan | create |