This issue tracker has been migrated to GitHub, and is currently read-only.
For more information, see the GitHub FAQs in the Python's Developer Guide.

Author liad100
Recipients liad100, rushter
Date 2018-08-13.12:24:07
SpamBayes Score -1.0
Marked as misclassified Yes
Message-id <1534163048.03.0.56676864532.issue34393@psf.upfronthosting.co.za>
In-reply-to
Content
The gzip module may work for saving files locally, but consider this example:

This uploads JSON to Google Storage:

import json
import datalab.storage as storage

# path and response come from the surrounding code
storage.Bucket('mybucket').item(path).write_to(json.dumps(response), 'application/json')

Your suggestion won't work here unless I save the file locally first and only then upload it. That's a bit of a problem when the files are 100 GB+.
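For what it's worth, the JSON can be gzip-compressed entirely in memory with the standard library before uploading, without touching the local disk. A minimal sketch: the `response` payload below is a placeholder, and whether `write_to` accepts the resulting bytes is an assumption about the datalab API.

```python
import gzip
import json

# Placeholder payload standing in for the real `response` object
response = {"status": "ok", "items": [1, 2, 3]}

# Serialize and gzip-compress in memory -- no local file needed
payload = gzip.compress(json.dumps(response).encode("utf-8"))

# The bytes could then be handed to an upload call such as
# storage.Bucket('mybucket').item(path).write_to(payload, 'application/gzip')
# (assuming write_to accepts binary content)
```

For 100 GB+ payloads this still holds everything in memory at once; a streaming approach (e.g. wrapping a writable upload stream in gzip.GzipFile) would be needed instead, which is part of why built-in compression support in json.dump() would help.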

I still think json.dump() should support compression.
History
Date User Action Args
2018-08-13 12:24:08liad100setrecipients: + liad100, rushter
2018-08-13 12:24:08liad100setmessageid: <1534163048.03.0.56676864532.issue34393@psf.upfronthosting.co.za>
2018-08-13 12:24:08liad100linkissue34393 messages
2018-08-13 12:24:07liad100create