This issue tracker has been migrated to GitHub, and is currently read-only.

Author eryksun
Recipients Ramin Farajpour Cami, eryksun, methane, paul.moore, steve.dower, tim.golden, zach.ware
Date 2021-02-20.05:20:01
Message-id <1613798401.99.0.464984107092.issue43260@roundup.psfhosted.org>
Content
> In your code, huge data passed to .write(huge) may
> remain in the internal buffer.

If you mean the buffered writer, then I don't see the problem. A large bytes object in pending_bytes gets temporarily referenced by _textiowrapper_writeflush(), and self->pending_bytes is cleared. If the buffer's write() method then fails, the bytes object is simply deallocated; nothing stays behind in the wrapper.
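This is observable from Python. A minimal sketch of the success path, noting that _CHUNK_SIZE is a CPython implementation detail (default 8192), not public API:

```python
import io

raw = io.BytesIO()
tw = io.TextIOWrapper(raw, encoding='ascii')

# One write larger than the wrapper's chunk size: the encoded bytes
# are flushed straight down to the binary layer by write() itself.
big = 'x' * (tw._CHUNK_SIZE + 1)
tw.write(big)

# Nothing that large lingers in the wrapper's pending buffer; the
# payload already reached the underlying stream without flush().
assert raw.getvalue() == big.encode('ascii')
```

Only the success path is shown here; the failure path (buffer.write() raising) can't easily be observed from pure Python since the dropped bytes object is internal.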

If you mean pending_bytes in the text wrapper, then I also don't see the problem. It always gets flushed once pending_bytes_count exceeds the chunk size. If pending_bytes is a list, it never exceeds the chunk size; it gets pre-flushed to avoid growing beyond it. If pending_bytes isn't a list, then it can exceed the chunk size only when it's a single bytes object -- never for an ASCII str() object -- and _textiowrapper_writeflush() does not call PyBytes_AsStringAndSize() in that case.
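The accumulate-then-flush behavior can be demonstrated from Python as well. A sketch, again relying on the internal _CHUNK_SIZE attribute, so this shows current CPython behavior rather than a documented contract:

```python
import io

raw = io.BytesIO()
tw = io.TextIOWrapper(raw, encoding='ascii')

# A small write is held in the wrapper's pending buffer; nothing is
# pushed down to the binary layer yet.
tw.write('a' * 10)
assert raw.getvalue() == b''

# Once pending_bytes_count would exceed the chunk size, the wrapper
# flushes automatically -- no explicit flush() call is needed.
tw.write('b' * (tw._CHUNK_SIZE + 1))
assert len(raw.getvalue()) == 10 + tw._CHUNK_SIZE + 1
```

The second write exceeds the chunk size by one byte on purpose: that triggers the flush both in releases that pre-flush the pending list and in releases that only check the total after appending.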