This issue tracker has been migrated to GitHub, and is currently read-only.
For more information, see the GitHub FAQs in the Python Developer's Guide.

classification
Title: lru_cache manual get/put
Type: enhancement Stage:
Components: Library (Lib) Versions: Python 3.5
process
Status: closed Resolution: rejected
Dependencies: Superseder:
Assigned To: rhettinger Nosy List: ConnyOnny, ethan.furman, josh.r, ncoghlan, r.david.murray, rhettinger
Priority: low Keywords: patch

Created on 2014-12-11 13:59 by ConnyOnny, last changed 2022-04-11 14:58 by admin. This issue is now closed.

Files
File name          Uploaded                     Description
lru_get_put.patch  ConnyOnny, 2014-12-11 13:59  patch to add get/put support to lru_cache + tests for it
Messages (7)
msg232476 - (view) Author: Constantin (ConnyOnny) * Date: 2014-12-11 13:59
In an effort to improve communication between stacked decorators, I would like to propose that all decorators providing caching mechanisms provide the functions cache_info, cache_clear, cache_get and cache_put. The standard library provides only functools.lru_cache as a caching decorator, and it already implements the former two. I have attached a patch to provide the latter two.

On python-ideas there was also the idea of creating a cache data structure and then having a decorator take that data structure as an argument. But it was argued that this could make some optimizations of the caching implementation impossible.

I think my suggested approach should not pose a problem for highly optimized caching code, because every cache - no matter how optimized - must have some functionality to add something to the cache and to look up something in the cache.
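
To make the proposed interface concrete, here is a minimal self-contained sketch of a caching decorator exposing cache_get and cache_put, with recursion base cases seeded through cache_put. This is illustrative only, not the attached patch; the NotInCache exception and the tuple-of-positional-args key convention are assumptions.

    import functools

    class NotInCache(Exception):
        """Raised by cache_get when no result is cached for the key."""

    def simple_cache(func):
        cache = {}  # unbounded memoization store

        @functools.wraps(func)
        def wrapper(*args):
            if args not in cache:
                cache[args] = func(*args)
            return cache[args]

        def cache_get(args):
            try:
                return cache[args]
            except KeyError:
                raise NotInCache(args) from None

        def cache_put(args, result):
            cache[args] = result

        wrapper.cache_get = cache_get
        wrapper.cache_put = cache_put
        return wrapper

    @simple_cache
    def fib(n):
        return fib(n - 1) + fib(n - 2)

    fib.cache_put((0,), 0)   # seed the base cases through the cache interface
    fib.cache_put((1,), 1)
    print(fib(10))           # 55 -- no base cases coded inside fib itself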
msg232482 - (view) Author: R. David Murray (r.david.murray) * (Python committer) Date: 2014-12-11 18:50
It would be helpful (and make for a better record) if you could provide a link to the python-ideas discussion that endorses your idea.
msg232485 - (view) Author: Ethan Furman (ethan.furman) * (Python committer) Date: 2014-12-11 19:58
When reading this thread, keep in mind that most of it was taken up with rejecting exposing the underlying data structure, which is not what this patch does.

https://mail.python.org/pipermail/python-ideas/2014-December/030230.html
msg232507 - (view) Author: Josh Rosenberg (josh.r) * (Python triager) Date: 2014-12-12 02:16
Manually adding to the cache seems of limited utility for the proposed recursion-base-case approach when the cache is actually operating in LRU mode (maxsize isn't None). You can inject your base cases all you want, but unless you wrap every call to the decorated function with another function that reinserts the base cases every time (and even that won't always work), you can't actually rely on the cache containing your base cases; they could be aged off at any time.

I've been bad and (for fun) used inspect to find the cache closure variable directly in an lru_cache-wrapped function in order to inject base cases, but that only works if the cache is unbounded. Making it work reliably in LRU mode would complicate matters. I suppose a collections.ChainMap (where one map is the LRU map and the other is an unbounded dictionary solely for injected entries) might work, but it also makes the cache more expensive to use in general, for a really limited use case.
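
For illustration, a rough sketch of that ChainMap layering: lookups fall through from a bounded LRU layer to an unbounded layer holding only manually injected entries, so injected base cases are never aged off. The decorator and its inject() helper are hypothetical names, not anything in functools.

    import functools
    from collections import ChainMap, OrderedDict

    def chained_cache(maxsize=128):
        def decorator(func):
            lru = OrderedDict()  # bounded layer, LRU-evicted
            injected = {}        # unbounded layer for manual entries
            cache = ChainMap(lru, injected)

            @functools.wraps(func)
            def wrapper(*args):
                if args in cache:
                    if args in lru:           # refresh recency for LRU hits only
                        lru.move_to_end(args)
                    return cache[args]
                result = func(*args)
                lru[args] = result
                if len(lru) > maxsize:
                    lru.popitem(last=False)   # evict the least recently used
                return result

            def inject(args, result):
                injected[args] = result       # never evicted
            wrapper.inject = inject
            return wrapper
        return decorator

    @chained_cache(maxsize=2)
    def fib(n):
        return fib(n - 1) + fib(n - 2)

    fib.inject((0,), 0)
    fib.inject((1,), 1)
    print(fib(10))  # 55, although the LRU layer never holds more than two entries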
msg232603 - (view) Author: Constantin (ConnyOnny) * Date: 2014-12-13 11:24
It may be the case that an lru_cache does not provide the best strategy for reliably caching many base cases in recursively written code. I suggest that someday we consider a different caching paradigm that fits this purpose and add it to functools, e.g. as functools.recopt_cache. This cache would implement the same interface as lru_cache, so all code currently using lru_cache could benefit from recopt_cache with a one-line change.

Furthermore, designing this interface would make it more likely that user-defined caching decorators are compatible with one another.

Please remember: My suggestion isn't just about lru_cache, but about an interface for caching decorators.
msg232627 - (view) Author: Raymond Hettinger (rhettinger) * (Python committer) Date: 2014-12-14 01:13
Sorry Constantin, I am rejecting this proposal and any variants of it.

* As Nick pointed out in the referenced thread, we provide two tools: a functools caching decorator that is tightly focused on the task of caching function calls, and a collections OrderedDict that is a general-purpose data store suitable for implementing custom LRU logic when needed.

* As Josh pointed out, this proposal isn't even fit for the original use case.  Any speed benefit of a tail-recursion optimization gets immediately wiped out by the overhead of an LRU cache decorator, and the clarity of the original code starts to get lost in the extra code to call cache_get() and cache_put() -- remember, the decorator was designed to wrap around a function without having to change the logic inside it.  Also, the optimization itself is fundamentally suspect because it changes the semantics of the language (thereby defeating all decorators that supply wrapper functions, including memoization decorators, call loggers, precondition/postcondition checkers, subscription notifiers, type checkers, etc.).

* The essence of this proposal is at odds with what the functools caching decorator is all about -- providing a cache for function calls.  The proposal bypasses the function call itself, making it harder to reason about what is in the cache (i.e. the results of previous function calls) and mucking up the cache hit/miss statistics, which stop being meaningful.

* The proposal also adds API complexity (making it less like a function decorator and more like an exposed data store such as an ordered dictionary).  And it creates a new control-flow exception, NotInCache.  Collectively, these changes make the decorator slower, harder to maintain, harder to learn, harder to keep free of reentrancy and locking problems, and harder to reason about, while providing little if any incremental benefit over using an OrderedDict directly.

* To the extent there are valid niche use cases, we shouldn't try to cater to all of them.  Good standard-library API design stays focused on serving the common case as well as possible and leaves the rest to OrderedDict or a third-party package (you're welcome to publish one to see if it actually serves a real need).

* In designing the cache, I surveyed previously published memoization decorators and did not find the proposed feature.  That means it is false to state that every memoization decorator *must have* some functionality to insert or look up entries while bypassing the function call that was intended to be cached.  If it really were "must have" behavior, it would already be commonplace.  IMO, cache_put() is bad design (would you want your database cache to return something that was never in the database, or your disk cache to return values that had never been written to disk?)

* Hopefully this message explains my reasoning clearly, so you will understand why I'm closing this one.  That said, judging by the insistent wording of your posts, I don't expect that you will be convinced.  Your mental model of caching tools is very different from what functools.lru_cache was intended to accomplish.  To the extent that you don't really want a transparent function decorator and would prefer a full-featured class (with methods for getting, setting, deleting, re-ordering, listing, monitoring, etc.), I recommend that you write one and post it to the Python Package Index.  The collections.OrderedDict() class should easily handle the LRU storage logic, so all you have to do is specify your API and decide which parts of the store you want to expose.
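
For reference, here is a minimal sketch of the OrderedDict-based approach Raymond describes: explicit get/put with LRU eviction, entirely under the caller's control. The class and method names are illustrative, not a published API.

    from collections import OrderedDict

    class LRUStore:
        """A small explicit LRU store built on OrderedDict."""

        def __init__(self, maxsize=128):
            self.maxsize = maxsize
            self._data = OrderedDict()

        def get(self, key, default=None):
            if key not in self._data:
                return default
            self._data.move_to_end(key)  # mark as most recently used
            return self._data[key]

        def put(self, key, value):
            self._data[key] = value
            self._data.move_to_end(key)
            if len(self._data) > self.maxsize:
                self._data.popitem(last=False)  # evict least recently used

    store = LRUStore(maxsize=2)
    store.put("a", 1)
    store.put("b", 2)
    store.put("c", 3)        # "a" is evicted
    print(store.get("a"))    # None
    print(store.get("c"))    # 3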
msg232636 - (view) Author: Constantin (ConnyOnny) * Date: 2014-12-14 10:37
I understand your decision. Even though it makes my life a little bit harder, it is definitely not the end of the world, the end of Python or even the end for my libtco project.
History
Date User Action Args
2022-04-11 14:58:11  admin           set     github: 67219
2014-12-14 10:37:48  ConnyOnny       set     messages: + msg232636
2014-12-14 01:13:17  rhettinger      set     status: open -> closed
                                             resolution: rejected
                                             messages: + msg232627
2014-12-13 11:24:50  ConnyOnny       set     messages: + msg232603
2014-12-12 05:53:42  rhettinger      set     priority: normal -> low
                                             assignee: rhettinger
2014-12-12 02:16:48  josh.r          set     nosy: + josh.r
                                             messages: + msg232507
2014-12-11 19:58:08  ethan.furman    set     messages: + msg232485
2014-12-11 18:50:09  r.david.murray  set     nosy: + r.david.murray
                                             messages: + msg232482
2014-12-11 17:02:40  ethan.furman    set     nosy: + rhettinger, ncoghlan
2014-12-11 17:01:26  ethan.furman    set     nosy: + ethan.furman
2014-12-11 13:59:36  ConnyOnny       create