
Author Ben Cipollini
Recipients Ben Cipollini, Matthew.Brett, martin.panter, nadeem.vawda, twouters
Date 2015-11-15.12:20:34
@Martin: yes, that reproduces the problem.

>I think the ideal fix would be to cap the limit at 2**32 - 1 in the zlib library.

If this cap is implemented, is there a workaround that would still let us open gzipped files larger than 4 GB efficiently? Such files exist, including in high-priority, government-funded datasets, and neuroinformatics in Python requires a way to open them efficiently.
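One way to stay under such a cap would be to read the decompressed stream in bounded chunks instead of requesting the whole file in a single call. A minimal sketch, assuming the standard `gzip` module and a hypothetical helper name `read_gzip_chunked` (the chunk size is an arbitrary choice, not anything mandated by the library):

```python
import gzip

def read_gzip_chunked(path, chunk_size=1 << 24):
    """Yield the decompressed contents of a gzip file in bounded chunks.

    Hypothetical helper: each read() call requests at most chunk_size
    bytes, so no single decompression buffer ever needs to approach
    the 2**32 - 1 byte zlib limit, even for files whose decompressed
    size exceeds 4 GB.
    """
    with gzip.open(path, "rb") as f:
        while True:
            chunk = f.read(chunk_size)  # bounded request per call
            if not chunk:               # b"" signals end of stream
                break
            yield chunk
```

Memory use then stays proportional to `chunk_size` rather than to the decompressed file size.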

>another option would be to cap the limit in the gzip library

Again, these limitations are new in Python 3.5 and currently block us from using Python 3.5 for neuroinformatics work efficiently. Is there any chance this new behavior could be reverted, or is there a known memory-efficient workaround?
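For completeness, a lower-level memory-efficient workaround is to drive `zlib.decompressobj` directly and cap each output buffer with the `max_length` argument, which the `zlib` module supports; this is a sketch under that assumption, with `decompress_gzip_stream` and `max_out` being illustrative names:

```python
import zlib

def decompress_gzip_stream(chunks, max_out=1 << 24):
    """Incrementally decompress a gzip byte stream (illustrative sketch).

    Each decompress() call is capped with max_length, so no single
    output buffer ever needs to exceed the zlib size limit; leftover
    input is carried in unconsumed_tail until it is drained.
    """
    # wbits = MAX_WBITS | 16 tells zlib to expect a gzip header/trailer.
    d = zlib.decompressobj(wbits=zlib.MAX_WBITS | 16)
    for chunk in chunks:
        data = chunk
        while data:
            out = d.decompress(data, max_out)  # bounded output buffer
            if out:
                yield out
            data = d.unconsumed_tail  # input not yet consumed, if any
    tail = d.flush()
    if tail:
        yield tail
```

This keeps peak memory near `max_out` regardless of the total decompressed size, at the cost of handling the stream chunk by chunk.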