
Author martin.panter
Recipients Ben Cipollini, Matthew.Brett, martin.panter, nadeem.vawda, serhiy.storchaka, twouters
Date 2015-11-20.12:51:34
SpamBayes Score -1.0
Marked as misclassified Yes
Message-id <1448023895.87.0.574047445591.issue25626@psf.upfronthosting.co.za>
In-reply-to
Content
Thanks for your testing, Serhiy. I can reproduce this on 32-bit Linux (also by building with CC="gcc -m32"). The bug is easy to trigger with flush(), but triggering it via Decompress.decompress() would need over 2 GiB of memory and data, to expand the buffer enough to cause the error.

>>> zlib.decompressobj().flush(sys.maxsize + 1)
SystemError: Negative size passed to PyBytes_FromStringAndSize
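For illustration (this snippet is not from the patch itself), the failure mode can be demonstrated portably with ctypes: on a 32-bit build, a length of 2**31 (which is sys.maxsize + 1 there) wraps to a negative number when stored in a signed 32-bit C integer, and that negative value is what reached PyBytes_FromStringAndSize.

```python
import ctypes

# Simulate storing sys.maxsize + 1 (on a 32-bit build, 2**31) into a
# signed 32-bit C integer: the value wraps around to a negative number.
wrapped = ctypes.c_int32(2**31).value
print(wrapped)  # -2147483648
```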

In zlib-Py_ssize_t.patch, I added the bigmem tests, but I am not able to actually run them; the gzip test alone would require 8 GiB of memory. I also changed the argument converter to Py_ssize_t to fix this latest bug.
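For reference, the ordinary decompress-then-flush round trip, which uses the default buffer length and never hits the oversized-length path, is unaffected; a minimal check:

```python
import zlib

# Round-trip through a Decompress object; flush() with no argument
# uses the default buffer length, so no overflow is possible here.
d = zlib.decompressobj()
out = d.decompress(zlib.compress(b"example payload"))
out += d.flush()
assert out == b"example payload"
```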
History
Date User Action Args
2015-11-20 12:51:36  martin.panter  set  recipients: + martin.panter, twouters, nadeem.vawda, serhiy.storchaka, Matthew.Brett, Ben Cipollini
2015-11-20 12:51:35  martin.panter  set  messageid: <1448023895.87.0.574047445591.issue25626@psf.upfronthosting.co.za>
2015-11-20 12:51:35  martin.panter  link  issue25626 messages
2015-11-20 12:51:35  martin.panter  create