Message233747
For what it’s worth, it would be better if compressed streams did limit the amount of data they decompressed, so that they are not susceptible to decompression bombs; see Issue 15955. But having a flexible-sized buffer could be useful in other cases.
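The kind of limit suggested here can already be approximated at the zlib level, since `Decompress.decompress()` accepts a `max_length` argument. A minimal sketch, assuming zlib-compressed input (the helper name and the default limit are illustrative, not part of any proposal):

```python
import zlib

def decompress_limited(data, limit=10 * 1024 * 1024):
    """Decompress at most 'limit' bytes, raising if more data remains.

    This guards against decompression bombs: the decompressor is told to
    stop after 'limit' output bytes, and we refuse the input if the
    stream was not fully decoded within that budget.
    """
    decompressor = zlib.decompressobj()
    out = decompressor.decompress(data, limit)
    if not decompressor.eof or decompressor.unconsumed_tail:
        raise ValueError("decompressed data exceeds %d bytes" % limit)
    return out

payload = zlib.compress(b"x" * 1000)
print(len(decompress_limited(payload)))  # 1000
```

A compressed-stream class would presumably apply the same budget per `read()` call rather than over the whole stream, but the principle is the same.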
I haven’t looked closely at the code, but I wonder if there is much difference from the existing BufferedReader. Perhaps just that the underlying raw stream in this case can deliver data in arbitrary-sized chunks, but BufferedReader expects its raw stream to deliver data in limited-sized chunks?
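For what the comparison is worth, `io.BufferedReader` already copes with a raw stream that delivers data in arbitrary-sized pieces, because the raw `readinto()` contract allows short reads. A rough illustration (the `ChunkyRaw` class is hypothetical, made up for this sketch):

```python
import io

class ChunkyRaw(io.RawIOBase):
    """Raw stream whose readinto() hands out preset, arbitrary-sized chunks."""

    def __init__(self, chunks):
        self._chunks = list(chunks)

    def readable(self):
        return True

    def readinto(self, b):
        if not self._chunks:
            return 0  # EOF for a raw stream
        chunk = self._chunks.pop(0)
        n = min(len(b), len(chunk))
        b[:n] = chunk[:n]
        if n < len(chunk):
            # Keep whatever did not fit for the next call
            self._chunks.insert(0, chunk[n:])
        return n

reader = io.BufferedReader(ChunkyRaw([b"ab", b"cdefg", b"h"]))
print(reader.read())  # b'abcdefgh'
```

So if the new class differs, it is presumably in how the buffer grows or is exposed, not in tolerating variable chunk sizes.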
If you exposed the buffer, many things could be done more efficiently:
* readline() with custom newline or end-of-record codes, solving Issue 1152248, Issue 17083
* scan the buffer using string operations or regular expressions etc, e.g. to skip whitespace, read a run of unescaped symbols
* tentatively read data to see if a keyword is present, but roll back if the data doesn’t match the keyword
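The last item can already be approximated with the existing `BufferedReader.peek()` method, which returns buffered bytes without consuming them (though it may return fewer bytes than asked for, which an exposed buffer would presumably fix). A sketch, with a made-up helper name:

```python
import io

def read_keyword(reader, keyword):
    """Consume and return 'keyword' if the stream starts with it, else None.

    peek() lets us inspect upcoming bytes without advancing the stream,
    so a failed match leaves the reader position untouched.
    """
    head = reader.peek(len(keyword))[:len(keyword)]
    if head == keyword:
        reader.read(len(keyword))  # commit: actually consume the keyword
        return keyword
    return None  # roll back: nothing was consumed

stream = io.BufferedReader(io.BytesIO(b"BEGIN data"))
print(read_keyword(stream, b"BEGIN"))  # b'BEGIN'
print(read_keyword(stream, b"BEGIN"))  # None
print(stream.read())  # b' data'
```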
Date | User | Action | Args
2015-01-09 12:22:56 | martin.panter | set | recipients: + martin.panter, alanmcintyre, pitrou, nadeem.vawda, benjamin.peterson, stutzbach, serhiy.storchaka
2015-01-09 12:22:56 | martin.panter | set | messageid: <1420806176.11.0.957684109278.issue19051@psf.upfronthosting.co.za>
2015-01-09 12:22:56 | martin.panter | link | issue19051 messages
2015-01-09 12:22:55 | martin.panter | create |