
Author nikratio
Recipients Arfrever, christian.heimes, eric.araujo, martin.panter, nadeem.vawda, nikratio, pitrou, serhiy.storchaka
Date 2014-01-25.04:56:52
SpamBayes Score -1.0
Marked as misclassified Yes
Message-id <1390625813.48.0.0129741881093.issue15955@psf.upfronthosting.co.za>
In-reply-to
Content
Is there any reason why unconsumed_tail needs to be exposed?

I would instead suggest introducing a boolean attribute data_ready that indicates that more decompressed data can be provided without additional compressed input.

Example:

# decomp = decompressor object
# infh = compressed input stream
# outfh = decompressed output stream
while not decomp.eof:
    if decomp.data_ready:
        buf = decomp.decompress(max_size=BUFSIZE)
        # or maybe:
        #buf = decomp.decompress(b'', max_size=BUFSIZE)
    else:
        buf = infh.read(BUFSIZE)
        if not buf:
            raise RuntimeError('Unexpected end of compressed stream')
        buf = decomp.decompress(buf, max_size=BUFSIZE)

    assert len(buf) > 0
    outfh.write(buf)

This is short, easily readable (in my opinion), and it also avoids the problem where the decompressor blocks waiting for more input even though there is still an unconsumed tail.
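As a rough, runnable sketch of this pattern against the existing zlib API: data_ready and max_size are only proposals and do not exist there, so the flag is emulated with zlib's unconsumed_tail attribute, and the size limit is spelled max_length in zlib.

```python
import io
import zlib

BUFSIZE = 64

data = b"example payload " * 200
compressed = zlib.compress(data)

infh = io.BytesIO(compressed)   # compressed input stream
outfh = io.BytesIO()            # decompressed output stream
decomp = zlib.decompressobj()

while not decomp.eof:
    if decomp.unconsumed_tail:
        # Emulates the proposed data_ready: decompressed output is
        # still available without reading more compressed input.
        buf = decomp.decompress(decomp.unconsumed_tail, BUFSIZE)
    else:
        chunk = infh.read(BUFSIZE)
        if not chunk:
            raise RuntimeError('Unexpected end of compressed stream')
        buf = decomp.decompress(chunk, BUFSIZE)
    outfh.write(buf)

assert outfh.getvalue() == data
```

With a dedicated data_ready flag the caller would not need to juggle unconsumed_tail itself; the emulation above is just to show the loop shape works with a bounded output buffer.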
History
Date User Action Args
2014-01-25 04:56:53nikratiosetrecipients: + nikratio, pitrou, christian.heimes, nadeem.vawda, eric.araujo, Arfrever, martin.panter, serhiy.storchaka
2014-01-25 04:56:53nikratiosetmessageid: <1390625813.48.0.0129741881093.issue15955@psf.upfronthosting.co.za>
2014-01-25 04:56:53nikratiolinkissue15955 messages
2014-01-25 04:56:52nikratiocreate