This issue tracker has been migrated to GitHub, and is currently read-only.
For more information, see the GitHub FAQs in Python's Developer Guide.

Author sbraz
Recipients sbraz
Date 2021-02-05.12:03:05
I have noticed that when multiple processes simultaneously try to get items from a multiprocessing queue with block=False, it can raise queue.Empty even though items remain in the queue.
Adding a multiprocessing lock around the calls to Queue.get fixes the problem.
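The workaround can be sketched roughly as follows, assuming worker processes that drain the queue with non-blocking gets; the `worker`/`drain_with_lock` names and the shared counter are illustrative, not taken from the attached script:

```python
import multiprocessing as mp
import queue
import time

def worker(q, lock, counter):
    # Serialize the non-blocking gets: without the lock, concurrent
    # q.get(block=False) calls can raise queue.Empty spuriously even
    # while items remain in the queue.
    while True:
        with lock:
            try:
                q.get(block=False)
            except queue.Empty:
                return
        with counter.get_lock():
            counter.value += 1

def drain_with_lock(n_items=100, n_workers=4):
    q = mp.Queue()
    lock = mp.Lock()
    counter = mp.Value("i", 0)  # total items consumed across workers
    for i in range(n_items):
        q.put(i)
    time.sleep(0.5)  # give the feeder thread time to flush items into the pipe
    procs = [mp.Process(target=worker, args=(q, lock, counter))
             for _ in range(n_workers)]
    for p in procs:
        p.start()
    for p in procs:
        p.join()
    return counter.value

if __name__ == "__main__":
    # With the lock, all items should be consumed before any worker
    # sees queue.Empty.
    print(drain_with_lock())
```
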

Please consider the attached reproducer script and its output:
$ ./
using processes
2021-02-05T12:48:21.742728 worker 0 got 0, queue size was 100
2021-02-05T12:48:21.743702 worker 1 got 1, queue size was 99
2021-02-05T12:48:21.744059 worker 2 got 2, queue size was 98
2021-02-05T12:48:21.745352 worker 3 got 3, queue size was 97
2021-02-05T12:48:22.743905 worker 1 queue is EMPTY, size was 96
2021-02-05T12:48:22.744064 worker 0 got 4, queue size was 96
2021-02-05T12:48:22.746525 worker 3 queue is EMPTY, size was 95
2021-02-05T12:48:22.749573 worker 2 got 5, queue size was 95
2021-02-05T12:48:23.744474 worker 0 got 6, queue size was 94
2021-02-05T12:48:23.750728 worker 2 got 7, queue size was 93
2021-02-05T12:48:24.745852 worker 0 got 8, queue size was 92
2021-02-05T12:48:24.751827 worker 2 got 9, queue size was 91

I have been able to reproduce this problem with Python 2.7 and 3.5 through 3.9 (3.10 untested).

When using threads and queue.Queue instead of their multiprocessing counterparts, the problem is not present ("./ thread" → no spurious exceptions until the queue is actually empty).
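The thread-based variant can be sketched like this (names are illustrative, not from the attached script); with queue.Queue, non-blocking gets from several threads only raise queue.Empty once the queue is actually empty:

```python
import queue
import threading

def drain_with_threads(n_items=100, n_workers=4):
    q = queue.Queue()
    for i in range(n_items):
        q.put(i)
    counts = [0] * n_workers  # items consumed per worker

    def worker(idx):
        while True:
            try:
                # queue.Queue.get is properly synchronized, so a
                # non-blocking get never fails while items remain.
                q.get(block=False)
            except queue.Empty:
                return
            counts[idx] += 1

    threads = [threading.Thread(target=worker, args=(i,))
               for i in range(n_workers)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return sum(counts)
```

Here the total consumed always equals the number of items put, with no spurious exceptions.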