
Author Roberto Martínez
Recipients Roberto Martínez
Date 2015-03-04.10:15:47
SpamBayes Score -1.0
Marked as misclassified Yes
Message-id <1425464148.59.0.542661724106.issue23582@psf.upfronthosting.co.za>
In-reply-to
Content
We faced a bug in a project yesterday, and after a few hours of investigation we found a bad/undocumented behavior in multiprocessing.Queue.

If you put one or more large items in a queue, there is a delay between the moment put() returns and the moment the item is finally available in the queue. This is reasonable because of the underlying feeder thread and the pipe, but that is not the problem.

The problem is that, during this window, Queue.qsize() reports the number of items put, while Queue.empty() returns True and Queue.get_nowait() raises Empty.
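A minimal sketch of the inconsistency (my own illustration, not the attached file; it assumes an item large enough that the feeder thread has not yet flushed it to the pipe, so on a fast machine the window may be missed):

```python
import multiprocessing
from queue import Empty  # multiprocessing.Queue raises queue.Empty

q = multiprocessing.Queue()
q.put("x" * 1_000_000)       # a large item

try:
    print(q.qsize())         # 1 -- the put is counted immediately
except NotImplementedError:
    pass                     # qsize() is not implemented on macOS

# The feeder thread may not have flushed the item to the pipe yet,
# so the reading end can simultaneously look empty:
print(q.empty())             # may be True
try:
    item = q.get_nowait()    # may raise Empty for the same reason
except Empty:
    # The item does arrive once the feeder thread has flushed it:
    item = q.get(timeout=10)
```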

So, the only safe method to get all of the items is something like:

from queue import Empty  # "from Queue import Empty" on Python 2

while q.qsize():
    try:
        item = q.get_nowait()
    except Empty:
        pass

Which is not very nice.

I attach a sample file reproducing the behavior: a single process puts 100 elements in a Queue and then tries to get all of them. I tested on Python 2.7.9 and 3.4 with the same result (Python 3 seems a little faster, so you need to enlarge the items). Also, if you wait between get()'s, the process is able to retrieve all of the items.
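For reference, the reproduction is roughly shaped like this (a sketch along the lines of the attached file, not the file itself):

```python
# One process puts 100 large items and then immediately inspects the queue.
import multiprocessing
import time

q = multiprocessing.Queue()
for _ in range(100):
    q.put("x" * 100_000)   # large items exaggerate the feeder-thread lag

print(q.empty())           # may be True right after the loop
time.sleep(1)              # give the feeder thread time to flush
print(q.empty())           # False -- items are now retrievable
items = [q.get(timeout=5) for _ in range(100)]
```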
History
Date                 User              Action  Args
2015-03-04 10:15:48  Roberto Martínez  set     recipients: + Roberto Martínez
2015-03-04 10:15:48  Roberto Martínez  set     messageid: <1425464148.59.0.542661724106.issue23582@psf.upfronthosting.co.za>
2015-03-04 10:15:48  Roberto Martínez  link    issue23582 messages
2015-03-04 10:15:48  Roberto Martínez  create