This issue tracker has been migrated to GitHub, and is currently read-only.
For more information, see the GitHub FAQs in Python's Developer Guide.

Author Martin.Teichmann
Recipients Martin.Teichmann, asvetlov, cjrh, yselivanov
Date 2019-06-24.13:13:58
SpamBayes Score -1.0
Marked as misclassified Yes
Message-id <1561382038.28.0.980136594988.issue37334@roundup.psfhosted.org>
In-reply-to
Content
Given the reactions I gather "close" is a better name for the method, so I changed it accordingly.

In the current implementation, items that had been put on the queue but not yet processed still get processed after the close, and I think this is the desired behavior. I added a test so that this won't change unexpectedly in the future.
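To make the intended semantics concrete, here is a minimal sketch of a closeable queue with the behavior described above: after close(), put() fails immediately, but get() keeps delivering the backlog and only raises once the queue is drained. This is not the actual patch; the names CloseableQueue and QueueClosed are hypothetical, and the sketch uses a plain asyncio.Condition rather than the stdlib Queue internals.

```python
import asyncio
from collections import deque


class QueueClosed(Exception):
    """Raised once the queue is closed and fully drained (hypothetical name)."""


class CloseableQueue:
    """Sketch: close() rejects new producers, but consumers still
    receive every item that was enqueued before the close."""

    def __init__(self):
        self._items = deque()
        self._closed = False
        self._cond = asyncio.Condition()

    async def put(self, item):
        async with self._cond:
            if self._closed:
                raise QueueClosed
            self._items.append(item)
            self._cond.notify()

    async def get(self):
        async with self._cond:
            # Drain the backlog first; only an empty, closed queue raises.
            while not self._items:
                if self._closed:
                    raise QueueClosed
                await self._cond.wait()
            return self._items.popleft()

    async def close(self):
        async with self._cond:
            self._closed = True
            # Wake all blocked consumers so they can observe the close.
            self._cond.notify_all()
```

With this sketch, two items put before close() are still returned by subsequent get() calls, and only the third get() raises QueueClosed.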

To be precise, the current implementation of the queue does not even put new items on the queue if there is already a waiting consumer. The item is handed directly to that consumer, which may hang around on the event loop for a bit longer, but during this time the item is not in the queue. This also answers the questions about catching the CancelledError: if there are waiting consumers, there is nothing on the queue, so the problem of processing leftover items does not arise. The same holds for task_done.
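The hand-off described above can be modeled with a toy queue, in which put_nowait() resolves a waiting consumer's future directly, so the item never enters the internal buffer and qsize() stays zero. This is an illustrative sketch of the mechanism being discussed, not the stdlib code; HandoffQueue is a hypothetical name.

```python
import asyncio
from collections import deque


class HandoffQueue:
    """Toy model of direct hand-off: items given to a waiting
    consumer bypass the buffer entirely."""

    def __init__(self):
        self._buffer = deque()
        self._getters = deque()  # futures of consumers blocked in get()

    def put_nowait(self, item):
        # If a consumer is already waiting, hand the item straight to it;
        # it is never stored in the buffer.
        while self._getters:
            getter = self._getters.popleft()
            if not getter.done():
                getter.set_result(item)
                return
        self._buffer.append(item)

    async def get(self):
        if self._buffer:
            return self._buffer.popleft()
        # No item available: park a future for put_nowait() to resolve.
        getter = asyncio.get_running_loop().create_future()
        self._getters.append(getter)
        return await getter

    def qsize(self):
        return len(self._buffer)
```

Running a consumer that blocks in get() before the producer calls put_nowait() shows qsize() remaining 0: the item went straight to the consumer's future without ever residing in the queue.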

As for the "why don't I just cancel the task?": that only works if you know which task to cancel. There may be many consumer or producer tasks waiting for their turn. Sure, you could keep a list of all those tasks, but that is exactly the point of the proposed change: the Queue already knows all the waiting tasks, so there is no need to keep another list up to date!
History
Date User Action Args
2019-06-24 13:13:58  Martin.Teichmann  set  recipients: + Martin.Teichmann, asvetlov, cjrh, yselivanov
2019-06-24 13:13:58  Martin.Teichmann  set  messageid: <1561382038.28.0.980136594988.issue37334@roundup.psfhosted.org>
2019-06-24 13:13:58  Martin.Teichmann  link  issue37334 messages
2019-06-24 13:13:58  Martin.Teichmann  create