
Author ConnyOnny
Recipients ConnyOnny, ethan.furman, josh.r, ncoghlan, r.david.murray, rhettinger
Date 2014-12-13.11:24:50
SpamBayes Score -1.0
Marked as misclassified Yes
Message-id <1418469890.47.0.697918243731.issue23030@psf.upfronthosting.co.za>
In-reply-to
Content
It may be the case that an lru_cache does not provide the best strategy for reliably caching the many base cases that arise in recursively written code. I suggest that we someday think about a different caching paradigm that fits this purpose and add it to functools, e.g. as functools.recopt_cache. This cache would implement the same interface as lru_cache, so all code currently using lru_cache could benefit from recopt_cache with a one-line change.

Furthermore, by designing this interface explicitly, it becomes more likely that user-defined caching decorators will be compatible with it.

Please remember: My suggestion isn't just about lru_cache, but about an interface for caching decorators.
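As a rough illustration only (this is not part of the original proposal, and recopt_cache is just the name suggested above): a decorator that exposes the same public surface as functools.lru_cache, namely cache_info() and cache_clear(), might look like the following unbounded, dict-backed sketch. The implementation details here are assumptions, not an actual functools API; the point is only that a caller could switch decorators by changing a single line.

    # Hypothetical sketch of an lru_cache-compatible caching decorator.
    # Unbounded dict-backed cache; positional, hashable arguments only.
    from collections import namedtuple
    from functools import wraps

    CacheInfo = namedtuple("CacheInfo", ["hits", "misses", "maxsize", "currsize"])

    def recopt_cache(func):
        cache = {}
        hits = misses = 0

        @wraps(func)
        def wrapper(*args):
            nonlocal hits, misses
            try:
                result = cache[args]
                hits += 1
                return result
            except KeyError:
                misses += 1
                result = func(*args)
                cache[args] = result
                return result

        def cache_info():
            # Unbounded, so maxsize is reported as None, mirroring
            # lru_cache(maxsize=None).
            return CacheInfo(hits, misses, None, len(cache))

        def cache_clear():
            nonlocal hits, misses
            cache.clear()
            hits = misses = 0

        wrapper.cache_info = cache_info
        wrapper.cache_clear = cache_clear
        return wrapper

    # Usage: the only change from lru_cache is the decorator line.
    @recopt_cache
    def fib(n):
        return n if n < 2 else fib(n - 1) + fib(n - 2)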
History
Date                 User       Action  Args
2014-12-13 11:24:50  ConnyOnny  set     recipients: + ConnyOnny, rhettinger, ncoghlan, r.david.murray, ethan.furman, josh.r
2014-12-13 11:24:50  ConnyOnny  set     messageid: <1418469890.47.0.697918243731.issue23030@psf.upfronthosting.co.za>
2014-12-13 11:24:50  ConnyOnny  link    issue23030 messages
2014-12-13 11:24:50  ConnyOnny  create