Message229778
Sorry for the confusion: yes, I do use yield from. The stdout stream for the process is actually producing data as it should. The subprocess (pbzip2) produces a large amount of data, but it is consumed only slowly.
Normally, when the buffer limit of a stream reader is reached, feed_data calls pause_reading on the transport (see https://code.google.com/p/tulip/source/browse/asyncio/streams.py#365).
Here, however, that never happens, because the returned reader has no transport set (p.stdout._transport == None), so the buffer fills up all available memory.
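To illustrate the flow-control mechanism in question, here is a small sketch (not the original reproducer; RecordingTransport is a hypothetical stand-in for a real transport) showing that feed_data only exerts back-pressure when the reader has a transport attached. With a transport set, exceeding the buffer limit triggers pause_reading; without one, the internal buffer simply grows unbounded, which is the behavior described above:

```python
import asyncio


class RecordingTransport:
    """Hypothetical stand-in transport that records flow-control calls."""

    def __init__(self):
        self.paused = False

    def pause_reading(self):
        self.paused = True

    def resume_reading(self):
        self.paused = False


async def main():
    # Reader with a tiny limit so the pause threshold is easy to hit.
    reader = asyncio.StreamReader(limit=16)
    transport = RecordingTransport()
    reader.set_transport(transport)

    # Feeding well past the limit makes feed_data pause the transport.
    reader.feed_data(b"x" * 64)
    print("with transport, paused:", transport.paused)

    # A reader with no transport (the situation described above) has
    # nothing to pause, so the data just accumulates in the buffer.
    orphan = asyncio.StreamReader(limit=16)
    orphan.feed_data(b"x" * 64)
    print("without transport, buffered bytes:", len(orphan._buffer))


asyncio.run(main())
```

Running this prints that the transport-backed reader was paused, while the orphan reader silently buffered all 64 bytes, mirroring the unbounded memory growth seen with the subprocess stdout reader.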
History:
Date                | User | Action | Args
2014-10-21 19:55:56 | wabu | set    | recipients: + wabu, gvanrossum, vstinner
2014-10-21 19:55:56 | wabu | set    | messageid: <1413921356.28.0.458315236992.issue22685@psf.upfronthosting.co.za>
2014-10-21 19:55:56 | wabu | link   | issue22685 messages
2014-10-21 19:55:55 | wabu | create |