Message 254972
Thanks for your testing Serhiy. I can reproduce this on 32-bit Linux (also by building with CC="gcc -m32"). It is easy to produce the bug with flush(), but with Decompress.decompress() it would need over 2 GiB of memory and data to expand the buffer enough to trigger the error.
>>> zlib.decompressobj().flush(sys.maxsize + 1)
SystemError: Negative size passed to PyBytes_FromStringAndSize
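The error message points to truncation at the C level: on a 32-bit build, Py_ssize_t is a signed 32-bit integer, so a length just past sys.maxsize wraps to a negative value before it reaches PyBytes_FromStringAndSize. A pure-Python sketch of that wrap-around (the helper name is mine, for illustration only):

```python
def to_ssize_t_wrapped(n, bits=32):
    # Simulate storing an arbitrarily large Python int into a signed
    # fixed-width C integer (e.g. a 32-bit Py_ssize_t): keep only the
    # low `bits` bits, then reinterpret as two's-complement signed.
    mask = (1 << bits) - 1
    v = n & mask
    if v >= 1 << (bits - 1):
        v -= 1 << bits
    return v

# On 32-bit, sys.maxsize is 2**31 - 1; one past it wraps negative,
# which is exactly what PyBytes_FromStringAndSize then rejects.
print(to_ssize_t_wrapped(2**31 - 1))  # 2147483647 (still fits)
print(to_ssize_t_wrapped(2**31))      # -2147483648 (wrapped negative)
```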
In zlib-Py_ssize_t.patch, I added the bigmem tests, but I am not able to actually run them; the gzip test alone would require 8 GiB of memory. I also changed the converter to Py_ssize_t to fix this latest bug.
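Until running on an interpreter with the converter fix, an oversized length can be rejected in Python before it ever reaches the C-level size conversion. This is only an illustrative sketch (flush_checked is a hypothetical wrapper, not part of the patch; the real fix lives in zlibmodule.c):

```python
import sys
import zlib

def flush_checked(decomp, length):
    """Hypothetical guard around Decompress.flush().

    Rejects lengths outside 1..sys.maxsize so a too-large Python int
    never hits the C argument conversion that wrapped negative.
    """
    if not (1 <= length <= sys.maxsize):
        raise ValueError("length must be in 1..sys.maxsize")
    return decomp.flush(length)

d = zlib.decompressobj()
data = d.decompress(zlib.compress(b"hello world"))
print(data)                    # b'hello world'
print(flush_checked(d, 4096))  # b'' -- nothing left to flush
```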
History
Date                | User          | Action | Args
--------------------+---------------+--------+------------------------------------------
2015-11-20 12:51:36 | martin.panter | set    | recipients: + martin.panter, twouters, nadeem.vawda, serhiy.storchaka, Matthew.Brett, Ben Cipollini
2015-11-20 12:51:35 | martin.panter | set    | messageid: <1448023895.87.0.574047445591.issue25626@psf.upfronthosting.co.za>
2015-11-20 12:51:35 | martin.panter | link   | issue25626 messages
2015-11-20 12:51:35 | martin.panter | create |