This issue tracker has been migrated to GitHub, and is currently read-only.
For more information, see the GitHub FAQs in Python's Developer Guide.

classification
Title: Implement dumps/loads for lru_cache
Type: enhancement Stage:
Components: Library (Lib) Versions: Python 3.5
process
Status: closed Resolution: rejected
Dependencies: Superseder:
Assigned To: rhettinger Nosy List: BreamoreBoy, anacrolix, ezio.melotti, frafra, pitrou, rhettinger
Priority: normal Keywords:

Created on 2013-03-23 10:59 by frafra, last changed 2022-04-11 14:57 by admin. This issue is now closed.

Messages (6)
msg185036 - (view) Author: Francesco Frassinelli (frafra) Date: 2013-03-23 10:59
Hi,
I propose to change the public API of functools.lru_cache in order to make the cache persistent across program restarts.
It could be implemented using two different functions (dumps/loads), where the cache is exported into a plain dictionary and restored in the same way.
A third argument could also be added to provide the initial cache (calling the loads function internally after initialization).

I think this could be an important enhancement because lru_cache will probably be implemented in C, and exporting the cache contents will then no longer be possible (see: http://bugs.python.org/issue14373).
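
A rough sketch of the kind of API being proposed (nothing here is a real functools API; persistent_lru_cache, dump_cache, and load_cache are hypothetical names, and real LRU eviction is omitted for brevity):

    # Hypothetical sketch of the proposed dumps/loads idea; not a real
    # functools API.  Eviction logic is omitted, so this is not a true LRU.
    import functools

    def persistent_lru_cache(maxsize=128, initial=None):
        def decorating_function(func):
            cache = dict(initial or {})  # the proposed third argument seeds the cache

            @functools.wraps(func)
            def wrapper(*args):
                if args not in cache:
                    cache[args] = func(*args)
                return cache[args]

            wrapper.dump_cache = lambda: dict(cache)        # export as a plain dict
            wrapper.load_cache = lambda d: cache.update(d)  # restore from a dict
            return wrapper
        return decorating_function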
msg223438 - (view) Author: Mark Lawrence (BreamoreBoy) * Date: 2014-07-18 21:40
https://pypi.python.org/pypi/fastcache/0.4.0 also seems relevant.
msg223456 - (view) Author: Raymond Hettinger (rhettinger) * (Python committer) Date: 2014-07-19 01:23
Have you seen something like this done for other implementations of LRU caches?  

To me, the idea seems at odds with the premise of retaining only the last n calls in memory -- it suggests that refreshing an outdated entry is cheaper than retiring it to disk and retrieving it later (as you might do with a shelf or database).
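
A sketch of the disk-backed alternative alluded to here, using the stdlib shelve module (the decorator name and the repr-based key scheme are illustrative assumptions, not an existing API):

    # Illustrative only: memoize to a shelf on disk rather than making
    # lru_cache itself persistent.
    import functools
    import shelve

    def shelved(path):
        def decorating_function(func):
            @functools.wraps(func)
            def wrapper(*args):
                key = repr(args)  # shelve keys must be strings
                with shelve.open(path) as db:
                    if key not in db:
                        db[key] = func(*args)
                    return db[key]
            return wrapper
        return decorating_function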
msg223466 - (view) Author: Antoine Pitrou (pitrou) * (Python committer) Date: 2014-07-19 14:50
That doesn't sound like a good idea. lru_cache is a decorator that acts as transparently as possible (i.e. the decorated function has the same metadata and appearance as the original function). Therefore, the lru_cache'd function should also pickle as a regular function - which it currently does.

Having a pickle carry all the cache contents would actually be a regression - instead of a couple of bytes representing the function's global name, you could now get kilobytes (or more) of data representing the whole cache contents.
(It would also fail if the cache contained any non-picklable data.)
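
To illustrate the point (assuming a module-level function; the size figure is approximate):

    # Demonstrates that a decorated function pickles by its global name,
    # so the payload stays tiny and carries no cache contents.
    import functools
    import pickle

    @functools.lru_cache(maxsize=None)
    def square(x):
        return x * x

    square(3)
    square(4)  # populate the cache

    data = pickle.dumps(square)          # pickled by reference, not by value
    print(len(data))                     # a few dozen bytes, however large the cache
    print(pickle.loads(data) is square)  # True: resolves back to the same global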
msg281540 - (view) Author: Raymond Hettinger (rhettinger) * (Python committer) Date: 2016-11-23 06:59
Marking this as rejected for the reasons mentioned by me and Antoine.
msg281541 - (view) Author: Raymond Hettinger (rhettinger) * (Python committer) Date: 2016-11-23 07:06
FWIW, the idea did have some merit and I appreciate your submitting it.

The issues are that the net gain likely isn't worth the API complication, that it opens a can of worms (about manipulating the cache contents beyond load and store), and that it is at odds with the core notion of being transparent.

FWIW, the C version of OrderedDict now makes it trivially easy to roll your own highly performant variants of the LRU cache.
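
For example, a minimal variant along those lines (a sketch, not stdlib code; exposing the cache attribute is what makes dump/restore straightforward):

    # A minimal LRU cache rolled from OrderedDict.
    from collections import OrderedDict
    import functools

    def lru_cache_dict(maxsize=128):
        def decorating_function(func):
            cache = OrderedDict()

            @functools.wraps(func)
            def wrapper(*args):
                if args in cache:
                    cache.move_to_end(args)    # mark as most recently used
                    return cache[args]
                result = func(*args)
                cache[args] = result
                if len(cache) > maxsize:
                    cache.popitem(last=False)  # evict the least recently used entry
                return result

            wrapper.cache = cache              # exposed, so dump/restore is trivial
            return wrapper
        return decorating_function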
History
Date                 User          Action  Args
2022-04-11 14:57:43  admin         set     github: 61730
2016-11-23 07:06:37  rhettinger    set     messages: + msg281541
2016-11-23 06:59:18  rhettinger    set     status: open -> closed
                                           resolution: rejected
                                           messages: + msg281540
2014-07-19 14:50:54  pitrou        set     nosy: + pitrou
                                           messages: + msg223466
2014-07-19 01:23:30  rhettinger    set     assignee: rhettinger
                                           messages: + msg223456
2014-07-18 21:40:11  BreamoreBoy   set     nosy: + BreamoreBoy
                                           messages: + msg223438
                                           versions: + Python 3.5, - Python 3.4
2013-03-23 12:29:36  ezio.melotti  set     nosy: + rhettinger, ezio.melotti, anacrolix
                                           components: + Library (Lib)
                                           versions: + Python 3.4
2013-03-23 10:59:41  frafra        create