Message299217
@martin.panter
For HTTP/1.0, since chunked transfer encoding is not supported, and storing the compressed data in a temporary file is also not an option, I see two possible solutions:
- give up compressing big files - it would be a pity, since compression is most useful precisely for them...
- compress the file twice: a first pass just to compute the Content-Length, without storing or sending anything, and a second pass to send the gzipped data after all the headers have been sent

If there is a third solution I'd be glad to hear it; otherwise I prefer the second one, in spite of the wasted CPU.
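To make the two-pass idea concrete, here is a minimal sketch (not the actual patch for this issue): it uses `zlib.compressobj` with a gzip wrapper, whose header carries a zero mtime, so both passes produce byte-identical output and the length counted in the first pass is valid for the second. The helper names `gzip_size` and `send_gzipped` are hypothetical.

```python
import zlib

CHUNK = 64 * 1024  # read the source file in 64 KiB chunks

def _gzip_compressor(level=9):
    # wbits = 16 + MAX_WBITS selects the gzip container; zlib writes
    # MTIME=0 in the header, so repeated passes are byte-identical
    return zlib.compressobj(level, zlib.DEFLATED, 16 + zlib.MAX_WBITS)

def gzip_size(path, level=9):
    """First pass: compress and count output bytes, discarding the data."""
    comp = _gzip_compressor(level)
    size = 0
    with open(path, "rb") as f:
        while chunk := f.read(CHUNK):
            size += len(comp.compress(chunk))
    size += len(comp.flush())
    return size

def send_gzipped(path, wfile, level=9):
    """Second pass: compress again and stream straight to the client socket."""
    comp = _gzip_compressor(level)
    with open(path, "rb") as f:
        while chunk := f.read(CHUNK):
            wfile.write(comp.compress(chunk))
    wfile.write(comp.flush())
```

A handler would call `gzip_size()` before emitting the `Content-Length` header, then `send_gzipped()` after `end_headers()`; the file is read and deflated twice, which is the CPU cost mentioned above.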
Date: 2017-07-26 10:08:34 | User: quentel | Issue: issue30576