
Author gvanrossum
Recipients christian.heimes, draghuram, gvanrossum
Date 2007-12-13.22:03:36
Message-id <ca471dc20712131403k5b2103f3rc1f41e78ef70dd67@mail.gmail.com>
In-reply-to <4761A0A6.4010207@cheimes.de>
Content
> I believe so, too. The subprocess docs aren't warning about the problem.
> I've seen a fair share of programmers who fall for the trap - including
> me a few weeks ago.

Yes, the docs should definitely address this.

> Consider yet another example
>
> >>> p = Popen(someprogram, stdin=PIPE, stdout=PIPE)
> >>> p.stdin.write(10MB of data)
>
> someprogram processes the incoming data in small blocks, say 1KB at a
> time, with 1MB stdin and stdout buffers. It reads 1KB from stdin and
> writes 1KB to stdout until the stdout buffer is full. The program then
> stops and waits for Python to drain the stdout buffer. However, the
> Python code is still writing data to the limited stdin buffer.

Hm. I thought this would be handled using threads or select, but that
doesn't seem to be quite the case. communicate() does the right thing,
but if you use p.stdin.write() directly you may indeed hang.
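
A minimal sketch of the safe pattern under discussion: rather than calling
p.stdin.write() with a large payload, let communicate() feed stdin and drain
stdout concurrently so neither pipe buffer can fill up and wedge the other.
The echoing child process here is a stand-in assumption for "someprogram";
any filter that copies stdin to stdout would do.

```python
import sys
from subprocess import Popen, PIPE

data = b"x" * (10 * 1024 * 1024)  # 10MB of input, as in the example above

# Hypothetical stand-in for someprogram: a child that copies stdin to stdout.
child = [sys.executable, "-c",
         "import sys, shutil; "
         "shutil.copyfileobj(sys.stdin.buffer, sys.stdout.buffer)"]

p = Popen(child, stdin=PIPE, stdout=PIPE)

# Deadlock-prone alternative: p.stdin.write(data) can block once the OS
# pipe buffers fill, while the child in turn blocks writing its output,
# because nothing on our side is draining p.stdout.

# communicate() writes stdin and reads stdout concurrently, so the two
# pipe buffers never wedge each other; it returns once the child exits.
out, _ = p.communicate(data)
```

Note this uses the modern io attributes (sys.stdin.buffer); on the 2.x line
of the era the child one-liner would read the byte streams directly.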