
Author josh.r
Recipients ConnyOnny, ethan.furman, josh.r, ncoghlan, r.david.murray, rhettinger
Date 2014-12-12.02:16:47
SpamBayes Score -1.0
Marked as misclassified Yes
Message-id <>
Manually adding entries to the cache seems of limited utility for the proposed recursion-base-case approach when the cache is actually operating in LRU mode (maxsize is not None). You can inject your base cases all you want, but unless you wrap every call to the decorated function with another function that reinserts the base cases each time (and even that won't always work), you can't rely on the cache containing your base cases; they could be aged out at any time.
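To make the aging-out problem concrete, here is a small sketch (using cache priming via ordinary calls, since lru_cache has no public insertion API): with a tiny maxsize, a primed entry is evicted by later unrelated calls and must be recomputed.

```python
from functools import lru_cache

calls = []  # record every actual (non-cached) invocation

@lru_cache(maxsize=2)
def f(x):
    calls.append(x)
    return x * x

f(0)          # "inject" a base case by priming the cache
f(1); f(2)    # two newer entries evict f(0) from the size-2 cache
f(0)          # miss: the primed entry was aged out, so f runs again

assert calls.count(0) == 2  # the base case was recomputed
```

The same effect occurs with any bounded maxsize; the injected entries only survive as long as they stay recently used.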

I've been bad and (for fun) used inspect to directly find the cache closure variable in an lru_cache-wrapped function and inject base cases, but that only works when the cache is unbounded. Making it work reliably in LRU mode would complicate matters. I suppose a collections.ChainMap (where one map is the LRU cache and the other is an unbounded dictionary solely for injected entries) might work, but that also starts making the cache more expensive to use in general for a really limited use case.
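The two-map idea can be sketched with a hypothetical pure-Python decorator (call it pinned_lru_cache; the name and API are made up here, and this ignores the keyword-argument and typed-key handling of the real functools.lru_cache): pinned entries live in an unbounded dict consulted first, while everything else goes through a plain OrderedDict-based LRU.

```python
from collections import OrderedDict
from functools import wraps

def pinned_lru_cache(maxsize=128, pinned=None):
    """Hypothetical cache decorator: an unbounded dict of pinned
    (never-evicted) entries checked ahead of a bounded LRU map."""
    pinned = dict(pinned or {})

    def decorator(func):
        lru = OrderedDict()  # insertion order tracks recency of use

        @wraps(func)
        def wrapper(*args):
            if args in pinned:            # injected entries: never aged out
                return pinned[args]
            if args in lru:
                lru.move_to_end(args)     # mark as most recently used
                return lru[args]
            result = func(*args)
            lru[args] = result
            if len(lru) > maxsize:
                lru.popitem(last=False)   # evict the least recently used
            return result
        return wrapper
    return decorator

# Recursion with injected base cases: no base case in the function body.
@pinned_lru_cache(maxsize=2, pinned={(0,): 0, (1,): 1})
def fib(n):
    return fib(n - 1) + fib(n - 2)

assert fib(10) == 55
```

The extra dict lookup on every call is exactly the kind of general-case overhead mentioned above, which this limited use case probably doesn't justify.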
Date | User | Action | Args
2014-12-12 02:16:48 | josh.r | set | recipients: + josh.r, rhettinger, ncoghlan, r.david.murray, ethan.furman, ConnyOnny
2014-12-12 02:16:48 | josh.r | set | messageid: <>
2014-12-12 02:16:48 | josh.r | link | issue23030 messages
2014-12-12 02:16:47 | josh.r | create |