
Author gvanrossum
Recipients gvanrossum, vstinner, wabu, yselivanov
Date 2014-10-21.20:07:39
SpamBayes Score -1.0
Marked as misclassified Yes
Message-id <CAP7+vJLkku=YP1w+Mm0t8n3fJ8veAzyi+tGnGc9_Oa7Jq-btsg@mail.gmail.com>
In-reply-to <1413921356.28.0.458315236992.issue22685@psf.upfronthosting.co.za>
Content
Hm... It does look like there's nothing that tells stdout (which is a
StreamReader) about its transport. Wabu, could you experiment with a change
to asyncio/subprocess.py where SubprocessStreamProtocol.connection_made()
calls self.stdout.set_transport(transport) right after creating self.stdout?
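The suggested change can be modeled without touching asyncio internals. Below is a minimal sketch: `FakePipeTransport`, `FakeSubprocessTransport`, and `SketchProtocol` are hypothetical stand-ins (not asyncio code) that mimic the shape of `SubprocessStreamProtocol.connection_made()`, with the proposed `set_transport()` call added right after the reader is created.

```python
import asyncio

class FakePipeTransport:
    """Hypothetical stand-in for a read-pipe transport."""
    def __init__(self):
        self.paused = False
    def pause_reading(self):
        self.paused = True
    def resume_reading(self):
        self.paused = False

class FakeSubprocessTransport:
    """Hypothetical stand-in: maps child fds to pipe transports."""
    def __init__(self, pipes):
        self._pipes = pipes
    def get_pipe_transport(self, fd):
        return self._pipes.get(fd)

class SketchProtocol:
    """Simplified model of SubprocessStreamProtocol.connection_made()."""
    def __init__(self, limit, loop):
        self._limit = limit
        self._loop = loop
        self.stdout = None

    def connection_made(self, transport):
        stdout_transport = transport.get_pipe_transport(1)
        if stdout_transport is not None:
            self.stdout = asyncio.StreamReader(limit=self._limit,
                                               loop=self._loop)
            # The suggested change: tell the reader about its transport
            # right after creating it, so feed_data() can pause it later.
            self.stdout.set_transport(stdout_transport)

loop = asyncio.new_event_loop()
proto = SketchProtocol(limit=16, loop=loop)
pipe = FakePipeTransport()
proto.connection_made(FakeSubprocessTransport({1: pipe}))
print(proto.stdout._transport is pipe)  # → True: reader knows its transport
loop.close()
```

With the transport wired in, the reader's internal flow control (the `pause_reading` call the next message describes) can actually reach the pipe.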

On Tue, Oct 21, 2014 at 12:55 PM, wabu <report@bugs.python.org> wrote:

>
> wabu added the comment:
>
> Sorry for the confusion; yes, I do use the yield from. The stdout stream
> for the process is actually producing data as it should. The subprocess
> (pbzip2) produces a large amount of data, but it is consumed only slowly.
>
> Normally, when the buffer limit is reached for a stream reader, it calls
> pause_reading on the transport inside the feed_data method (see
> https://code.google.com/p/tulip/source/browse/asyncio/streams.py#365),
> but here this does not happen, because the returned reader has no
> transport set (p.stdout._transport == None). So it fills up all the memory.
>
> ----------
>
> _______________________________________
> Python tracker <report@bugs.python.org>
> <http://bugs.python.org/issue22685>
> _______________________________________
>
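The flow-control failure described above can be reproduced with a bare StreamReader and a fake transport (`FakePipeTransport` is a hypothetical stand-in): without a transport set, feeding data past the limit cannot apply back-pressure, which is the unbounded-memory symptom; once `set_transport()` is called, the same feed pauses the producer.

```python
import asyncio

class FakePipeTransport:
    """Hypothetical stand-in recording pause/resume calls."""
    def __init__(self):
        self.paused = False
    def pause_reading(self):
        self.paused = True
    def resume_reading(self):
        self.paused = False

loop = asyncio.new_event_loop()
reader = asyncio.StreamReader(limit=16, loop=loop)
pipe = FakePipeTransport()

# No transport set: feed_data() past the limit cannot pause the source,
# so the internal buffer just keeps growing.
reader.feed_data(b"x" * 64)
print(pipe.paused)  # → False: nothing to pause, buffer grows unbounded

# With the transport wired in, the same feed pauses the producer.
reader.set_transport(pipe)
reader.feed_data(b"x" * 64)
print(pipe.paused)  # → True: back-pressure now works
loop.close()
```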
History
Date User Action Args
2014-10-21 20:07:39  gvanrossum  set     recipients: + gvanrossum, vstinner, yselivanov, wabu
2014-10-21 20:07:39  gvanrossum  link    issue22685 messages
2014-10-21 20:07:39  gvanrossum  create