
Author martin.panter
Recipients Klamann, martin.panter, xiang.zhang
Date 2016-06-02.00:11:45
Message-id <1464826306.02.0.316899089693.issue27130@psf.upfronthosting.co.za>
Content
You should be able to use a compression (or decompression) object (zlib.compressobj() or zlib.decompressobj()) as a workaround, feeding the data in chunks. By contrast, calling zlib.compress() once per chunk would generate multiple separate deflate streams, which is different output.
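As a sketch of the workaround described above (the helper name and chunk size here are illustrative, not part of the zlib API), a compression object can be fed slices that each stay well under the 4 GiB limit while still producing a single deflate stream:

```python
import zlib

def compress_large(data, chunk_size=1 << 30):
    """Compress a large bytes-like object as ONE deflate stream by
    feeding it to a compression object in chunks, each well under
    the platform's UINT_MAX limit.  (Illustrative helper, not a
    stdlib function.)"""
    co = zlib.compressobj()
    parts = []
    view = memoryview(data)
    for start in range(0, len(view), chunk_size):
        # Each compress() call may return b"" while data is buffered.
        parts.append(co.compress(view[start:start + chunk_size]))
    # flush() emits the remaining buffered data and the stream trailer.
    parts.append(co.flush())
    return b"".join(parts)
```

The result decompresses with a single zlib.decompress() call, unlike the concatenation of several independent zlib.compress() outputs.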

I think it is reasonable for Python to handle larger data sizes for zlib. (In theory, the 4 GiB UINT_MAX limit is platform-dependent.) IMO it is a matter of writing the patch(es) and then, perhaps depending on their complexity, deciding whether to apply them to maintenance branches such as 2.7 or only to the next version of Python 3 (risk vs reward).

Alternatively (or in the meantime), I guess we could document the limitation.