Message323477
The gzip module may work for saving files locally, but not in other cases. For example, this uploads JSON to Google Storage:
import json
import datalab.storage as storage

# Serialize the response in memory and upload it straight to the bucket.
storage.Bucket('mybucket').item(path).write_to(json.dumps(response), 'application/json')
Your suggestion won't work here unless I save the file locally and only then upload it... That's a real problem when your files are 100 GB+.
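For what it's worth, compression without a local file is possible today, though it doesn't solve the size problem. A minimal sketch, assuming `response` is JSON-serializable, `path` is defined as above, and write_to() accepts bytes content:

import gzip
import json
import datalab.storage as storage

# Compress the serialized JSON entirely in memory; no local file is needed,
# but the whole payload still has to fit in RAM, so this doesn't help at 100 GB+.
payload = gzip.compress(json.dumps(response).encode('utf-8'))
storage.Bucket('mybucket').item(path).write_to(payload, 'application/gzip')

The remaining gap is streaming, which is exactly what a compression option in json.dump() could address.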
I still think json.dump() should support compression.
Date | User | Action | Args
2018-08-13 12:24:08 | liad100 | set | recipients: + liad100, rushter
2018-08-13 12:24:08 | liad100 | set | messageid: <1534163048.03.0.56676864532.issue34393@psf.upfronthosting.co.za>
2018-08-13 12:24:08 | liad100 | link | issue34393 messages
2018-08-13 12:24:07 | liad100 | create |