
Author decaz
Recipients asvetlov, decaz, yselivanov
Date 2018-03-21.17:46:30
SpamBayes Score -1.0
Marked as misclassified Yes
Message-id <1521654390.62.0.467229070634.issue33115@psf.upfronthosting.co.za>
In-reply-to
Content
But why, if I use multiprocessing (running 100 tasks in each of 100 workers), does the loop still get reported as blocked within some workers? Are 100 tasks "a lot of work" for an asyncio loop?

```python
import asyncio
from multiprocessing import Process

worker_count = 100
task_count = 100

def worker_main(worker_id):
    async def main():
        # Schedule all tasks without awaiting them.
        for x in range(1, task_count + 1):
            asyncio.ensure_future(f(x))

    async def f(x):
        # Note: with task_count = 100, only x == task_count ever prints.
        if x % 1000 == 0 or x == task_count:
            print(f'[WORKER-{worker_id}] Run f({x})')
        await asyncio.sleep(1)
        # Reschedule this task to run again one second later.
        loop.call_later(1, lambda: asyncio.ensure_future(f(x)))

    loop = asyncio.get_event_loop()
    loop.set_debug(True)  # enables slow-callback warnings
    loop.run_until_complete(main())
    loop.run_forever()

if __name__ == '__main__':
    for worker_id in range(worker_count):
        worker = Process(target=worker_main, args=(worker_id,), daemon=True)
        worker.start()
    while True:  # busy-wait keeps one CPU core fully occupied
        pass
```
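As a side note on where such warnings come from: this is a minimal sketch, not taken from the report above, illustrating asyncio's debug-mode slow-callback detection. In debug mode the loop logs a warning whenever a single callback keeps it busy longer than `loop.slow_callback_duration` (0.1 s by default); the `blocking_callback` name and the log-capturing handler here are my own additions for illustration.

```python
import asyncio
import logging
import time

# Capture warnings emitted by the 'asyncio' logger so we can inspect them.
records = []

class ListHandler(logging.Handler):
    def emit(self, record):
        records.append(record.getMessage())

logging.getLogger('asyncio').addHandler(ListHandler())

def blocking_callback():
    # Synchronous sleep holds the loop well past the 0.1 s threshold.
    time.sleep(0.2)

loop = asyncio.new_event_loop()
loop.set_debug(True)  # turns on slow-callback reporting
loop.call_soon(blocking_callback)
loop.call_later(0.3, loop.stop)
loop.run_forever()
loop.close()

# Debug mode logged an "Executing ... took N seconds" warning for the callback.
print(any('Executing' in message for message in records))
```

Any coroutine or callback that blocks like this stalls every other task on the same loop, which is what the debug warnings in the workers above are pointing at.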
History
Date                 User   Action           Args
2018-03-21 17:46:30  decaz  set recipients:  + decaz, asvetlov, yselivanov
2018-03-21 17:46:30  decaz  set messageid:   <1521654390.62.0.467229070634.issue33115@psf.upfronthosting.co.za>
2018-03-21 17:46:30  decaz  link             issue33115 messages
2018-03-21 17:46:30  decaz  create