
Author methane
Recipients Ramin Farajpour Cami, eryksun, methane, paul.moore, steve.dower, tim.golden, zach.ware
Date 2021-02-20.00:57:33
SpamBayes Score -1.0
Marked as misclassified Yes
Message-id <1613782654.15.0.473565996545.issue43260@roundup.psfhosted.org>
In-reply-to
Content
This problem is not limited to the optimization. _textiowrapper_writeflush needs to create the buffer bytes object anyway, and if that raises MemoryError, the TextIOWrapper can never flush its internal buffer.

  stdout.write("small text")
  stdout.write("very large text")  # Calls writeflush, but cannot allocate the buffer.

This example would get stuck in the same situation even without the optimization.


But the optimization made the problem easier to trigger. Now it can happen with only one large text.

Idea to fix this problem:

* If the input text is large (>1M?):
  * Flush the buffer before adding the large text to it.
  * Encode the text and write it to self->buffer immediately. Do not put it into the internal buffer (self->pending_bytes).
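The idea above can be sketched in Python (the real fix would live in C, in _textiowrapper_writeflush and friends). The class, the threshold constant, and the method names here are illustrative assumptions for this sketch, not CPython's actual identifiers:

```python
import io

LARGE_THRESHOLD = 1 << 20  # assumed ~1 MB cutoff, the ">1M?" suggested above

class TextWrapperSketch:
    """Illustrative sketch of the proposed write path, not CPython code."""

    def __init__(self, buffer, encoding="utf-8"):
        self.buffer = buffer    # underlying binary stream (like self->buffer)
        self.encoding = encoding
        self.pending = []       # small pending chunks (like self->pending_bytes)

    def _writeflush(self):
        # Encode and flush the small pending chunks in one allocation.
        if self.pending:
            data = "".join(self.pending).encode(self.encoding)
            self.pending.clear()
            self.buffer.write(data)

    def write(self, text):
        if len(text) > LARGE_THRESHOLD:
            # Flush small pending data first, then encode the large text and
            # write it straight to the underlying buffer, so it never joins
            # the pending list and never inflates the flush-time allocation.
            self._writeflush()
            self.buffer.write(text.encode(self.encoding))
        else:
            self.pending.append(text)
        return len(text)
```

With this shape, a MemoryError while encoding the large text leaves the small pending data already flushed, so the wrapper is not stuck holding an unflushable buffer.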
History
Date                 User     Action          Args
2021-02-20 00:57:34  methane  set recipients: + methane, paul.moore, tim.golden, zach.ware, eryksun, steve.dower, Ramin Farajpour Cami
2021-02-20 00:57:34  methane  set messageid:  <1613782654.15.0.473565996545.issue43260@roundup.psfhosted.org>
2021-02-20 00:57:34  methane  link            issue43260 messages
2021-02-20 00:57:33  methane  create