
Author: wabu
Recipients: gvanrossum, vstinner, wabu, yselivanov
Date: 2014-10-21 19:55:55
SpamBayes Score: -1.0
Marked as misclassified: Yes
Message-id: <1413921356.28.0.458315236992.issue22685@psf.upfronthosting.co.za>
In-reply-to:
Content:
Sorry for the confusion; yes, I do use yield from. The stdout stream of the process is actually producing data as it should. The subprocess (pbzip2) produces data at a high rate, but it is consumed only slowly.

Normally, when a stream reader's buffer reaches its limit, feed_data calls pause_reading on the transport (see https://code.google.com/p/tulip/source/browse/asyncio/streams.py#365). Here that never happens, because the returned reader has no transport set (p.stdout._transport == None), so the buffer grows until it fills up all the memory.
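For illustration, here is a minimal sketch of the fast-producer/slow-consumer pattern described above, written in modern async/await syntax rather than the yield from style of the original report. The child command is a stand-in for pbzip2, and the read size and byte counts are arbitrary choices for the example; with working flow control, feed_data pausing the transport is what keeps the reader's buffer bounded while the consumer lags behind.

```python
import asyncio
import sys

async def consume_slowly() -> int:
    # Spawn a child process that writes ~1 MiB quickly (stand-in for pbzip2).
    proc = await asyncio.create_subprocess_exec(
        sys.executable, "-c",
        "import sys\n"
        "for _ in range(1000): sys.stdout.write('x' * 1024)\n",
        stdout=asyncio.subprocess.PIPE,
    )
    total = 0
    while True:
        # The consumer drains stdout in small chunks, yielding to the
        # event loop between reads. With flow control intact, the reader's
        # internal buffer stays near its limit; without it (the bug above),
        # the buffer would grow with everything the child produces.
        chunk = await proc.stdout.read(4096)
        if not chunk:
            break
        total += len(chunk)
        await asyncio.sleep(0)
    await proc.wait()
    return total

total = asyncio.run(consume_slowly())
print(total)
```

Because the pipe between the processes is itself bounded, the child blocks once the OS buffer and the reader's buffer are full, which is the backpressure behaviour the missing transport reference defeats.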
History
Date                 User  Action  Args
2014-10-21 19:55:56  wabu  set     recipients: + wabu, gvanrossum, vstinner, yselivanov
2014-10-21 19:55:56  wabu  set     messageid: <1413921356.28.0.458315236992.issue22685@psf.upfronthosting.co.za>
2014-10-21 19:55:56  wabu  link    issue22685 messages
2014-10-21 19:55:55  wabu  create