Author: nneonneo
Recipients: ned.deily, nneonneo, ronaldoussoren, xiang.zhang
Date: 2018-10-29 18:21:14
Message-id: <1540837274.83.0.788709270274.issue33074@psf.upfronthosting.co.za>
Content
I just started a new project, thoughtlessly decided to use `shelve` to store data, and lost it all again thanks to this bug.

To reiterate: although `gdbm` might fix this issue, it is not installed by default. The real problem is in `dbm` itself: Python allows me to insert entries into the database that exceed its internal size limits, silently corrupting it; the corruption only becomes visible on retrieval. This is an unacceptable situation: a very ordinary, non-complex use of the standard library is causing data loss without any indication that the loss is occurring.
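To make the failure mode concrete, here is a minimal reproduction sketch. It assumes the platform's `dbm.ndbm` backend (e.g. the BSD ndbm shipped with macOS, which is what `dbm.open` falls back to when `gdbm` is absent); the exact size threshold and the exact symptom on read-back vary by platform.

```python
# Minimal reproduction sketch (assumes a platform where dbm.ndbm is the
# BSD ndbm, e.g. macOS; the size limit and failure mode vary by system).
import dbm.ndbm

with dbm.ndbm.open("testdb", "c") as db:
    # Insert a value far larger than ndbm's internal per-entry limit.
    # This succeeds with no warning or error.
    db["key"] = b"x" * 100000

with dbm.ndbm.open("testdb", "r") as db:
    # On affected systems this read fails or returns corrupted data,
    # and the database file itself may be left unusable.
    print(len(db["key"]))
```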

At the very least there should be a warning or error when the data being inserted exceeds dbm's limits, and in an ideal world dbm would not fall over when a few KB of data is inserted into a single entry (though I understand that is a third-party problem at that point).

Can't we just ship a dbm that is backed by a more robust engine, like a SQLite key-value table?
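Purely as an illustration of feasibility, here is a rough sketch of a SQLite-backed key/value store with a dbm-like mapping interface; the class name `SQLiteDBM` and the `kv` schema are hypothetical, not a proposed API.

```python
# Hypothetical sketch of a SQLite-backed dbm replacement; the name
# SQLiteDBM and the "kv" table schema are illustrative only.
import sqlite3
from collections.abc import MutableMapping

class SQLiteDBM(MutableMapping):
    def __init__(self, path):
        self._conn = sqlite3.connect(path)
        self._conn.execute(
            "CREATE TABLE IF NOT EXISTS kv (key BLOB PRIMARY KEY, value BLOB)"
        )

    def __getitem__(self, key):
        row = self._conn.execute(
            "SELECT value FROM kv WHERE key = ?", (key,)
        ).fetchone()
        if row is None:
            raise KeyError(key)
        return row[0]

    def __setitem__(self, key, value):
        # Each write is a real transaction: it either commits intact or
        # fails loudly, regardless of the size of the value.
        with self._conn:
            self._conn.execute(
                "INSERT OR REPLACE INTO kv (key, value) VALUES (?, ?)",
                (key, value),
            )

    def __delitem__(self, key):
        with self._conn:
            cur = self._conn.execute("DELETE FROM kv WHERE key = ?", (key,))
        if cur.rowcount == 0:
            raise KeyError(key)

    def __iter__(self):
        for (key,) in self._conn.execute("SELECT key FROM kv"):
            yield key

    def __len__(self):
        return self._conn.execute("SELECT COUNT(*) FROM kv").fetchone()[0]

    def close(self):
        self._conn.close()
```

SQLite's default blob limit is on the order of a gigabyte, so the few-KB values that corrupt ndbm are a non-issue, and anything over that limit raises an error instead of silently damaging the file.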