
Author: aparamon
Recipients: andybalaam, aparamon, asvetlov, glin, yselivanov
Date: 2019-02-21 09:00:15
Content
as_completed() might be considered a low-level API, but as of Python 3.7 there is seemingly no ready alternative that achieves the proposed behavior. asyncio.gather(), asyncio.wait(), and asyncio.as_completed() all expect a list of awaitables of limited size; doing something like https://www.artificialworlds.net/blog/2017/06/12/making-100-million-requests-with-python-aiohttp is not straightforward.

A function that takes an iterator/async iterator of tasks and is itself a generator/async generator would be very welcome, something in the spirit of (but perhaps more efficient than) the following:
----
import asyncio
from itertools import islice

async def igather(tasks, limit=None):
    # `tasks` is an iterable of awaitables; at most `limit` run concurrently.
    tasks = iter(tasks)
    pending = set()
    while True:
        # Top up `pending` from the iterator, up to the concurrency limit.
        for task in islice(tasks, limit - len(pending) if limit else None):
            pending.add(asyncio.ensure_future(task))
        if not pending:
            break
        done, pending = await asyncio.wait(pending, return_when=asyncio.FIRST_COMPLETED)
        for task in done:
            yield task
----
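
For illustration, a minimal usage sketch consuming the generator above (the fetch() coroutine is a stand-in assumed for the sketch, not part of the proposal):
----
async def fetch(url):
    # Stand-in for a real request, e.g. via aiohttp (assumption for the sketch).
    await asyncio.sleep(0.01)
    return url

async def main():
    # A lazy stream of URLs; igather() keeps at most 1000 tasks in flight.
    urls = ("https://example.com/%d" % i for i in range(10**6))
    async for task in igather((fetch(url) for url in urls), limit=1000):
        task.result()  # process each result as soon as it is available

asyncio.run(main())
----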

It is an open question whether such a function should yield results in task-submission order. Albeit useful, that is a bit harder to implement and (most importantly) has terrible worst-case memory behavior.
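
To illustrate where the memory problem comes from, here is a rough sketch of an order-preserving variant (my own illustration, reusing the imports above, not a concrete proposal): a task that finishes out of order has to sit in a buffer until every earlier task has completed, so if the first submitted task happens to finish last, nearly all results accumulate in memory at once.
----
async def igather_ordered(tasks, limit=None):
    # Same contract as igather(), but results are yielded in submission order.
    tasks = iter(tasks)
    pending = {}        # future -> submission index
    buffer = {}         # submission index -> finished future (unbounded!)
    submitted = next_to_yield = 0
    while True:
        for coro in islice(tasks, limit - len(pending) if limit else None):
            pending[asyncio.ensure_future(coro)] = submitted
            submitted += 1
        if not pending:
            break
        done, _ = await asyncio.wait(set(pending), return_when=asyncio.FIRST_COMPLETED)
        for fut in done:
            buffer[pending.pop(fut)] = fut
        # Drain the buffer only up to the first still-unfinished task.
        while next_to_yield in buffer:
            yield buffer.pop(next_to_yield)
            next_to_yield += 1
----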

See also:
https://bugs.python.org/issue33533
https://github.com/h2non/paco/issues/38