Author quentel
Recipients christian.heimes, martin.panter, quentel, r.david.murray, terry.reedy, v+python, vstinner
Date 2017-07-26.10:08:34
Message-id <1501063714.47.0.282540689063.issue30576@psf.upfronthosting.co.za>
Content
@martin.panter

For HTTP/1.0, since chunked transfer is not supported, and storage in a temporary file is also not an option, I see 2 possible solutions:
- give up compressing big files - it would be a pity, since compression is precisely most useful for them...
- compress the file 2 times: a first pass just to compute the Content-Length, without storing or sending anything, and a second pass to send the gzipped data after all the headers have been sent (see the sketch below)

If there is a 3rd solution I'd be glad to know; otherwise I prefer the second one, in spite of the wasted CPU.
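
To make the second option concrete, here is a minimal sketch (the helper names are mine, not part of any patch), assuming a handler that writes to self.wfile: gzip the file once into a counter just to learn the compressed size, then gzip it again into the socket once the headers have been sent.

import gzip

class _CountingWriter:
    """File-like object that discards the data but counts the bytes written."""
    def __init__(self):
        self.size = 0
    def write(self, data):
        self.size += len(data)
        return len(data)

def gzip_size(path, chunk_size=64 * 1024):
    """First pass: gzip the file into a counter just to measure the compressed size."""
    counter = _CountingWriter()
    # mtime=0 so both passes produce identical gzip headers
    with open(path, 'rb') as src, \
            gzip.GzipFile(fileobj=counter, mode='wb', mtime=0) as gz:
        while True:
            chunk = src.read(chunk_size)
            if not chunk:
                break
            gz.write(chunk)
    return counter.size

def send_gzipped(path, wfile, chunk_size=64 * 1024):
    """Second pass: gzip the same file again, this time into the client socket."""
    with open(path, 'rb') as src, \
            gzip.GzipFile(fileobj=wfile, mode='wb', mtime=0) as gz:
        while True:
            chunk = src.read(chunk_size)
            if not chunk:
                break
            gz.write(chunk)

In the handler this would be used roughly as:

    size = gzip_size(path)
    self.send_header('Content-Encoding', 'gzip')
    self.send_header('Content-Length', str(size))
    self.end_headers()
    send_gzipped(path, self.wfile)

Since the deflate output is deterministic for a given input and compression level, both passes produce the same number of bytes, so the announced Content-Length matches what is actually sent.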