
Author rhettinger
Recipients ncoghlan, rhettinger, serhiy.storchaka, thesheep
Date 2013-12-04.09:32:47
Message-id <1386149567.55.0.184984400702.issue19859@psf.upfronthosting.co.za>
Content
> Limiting the cache size is also not a solution in the
> practical example with request that I linked to in the
> previous comment, because we can't know in advance how
> many times per request the function is going to be called;
> picking an arbitrary number feels wrong and may lead to
> unexpected behaviors.

This suggests that you don't really want an LRU cache, which is specifically
designed to limit cache size by evicting the least recently used entry.
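
For illustration, here is a minimal sketch of that eviction behavior with
functools.lru_cache (the function square is just a made-up example):

    from functools import lru_cache

    @lru_cache(maxsize=2)               # keep only the two most recently used results
    def square(x):
        return x * x

    square(1); square(2); square(3)     # computing square(3) evicts the entry for square(1)
    square(1)                           # recomputed: it was the least recently used
    print(square.cache_info())          # CacheInfo(hits=0, misses=4, maxsize=2, currsize=2)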

At its heart, the cache decorator is all about mapping fixed inputs to fixed outputs.  The memory conservation comes from the replacement strategy and from the option to clear the cache entirely.
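
To make that concrete, a minimal sketch of the unbounded form together with
cache_clear() (config_value and its print statement are stand-ins for a
hypothetical expensive lookup):

    from functools import lru_cache

    @lru_cache(maxsize=None)                # unbounded: every distinct input stays cached
    def config_value(key):
        print("loading", key)               # stands in for a hypothetical expensive lookup
        return key.upper()

    config_value("a"); config_value("a")    # second call is a hit; "loading a" prints once
    config_value.cache_clear()              # the option to empty the cache entirely
    config_value("a")                       # reloads after the clear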

The reason that my answer and Serhiy's answer don't fit your needs is that it isn't clear what you really want to do.  I think you should move this discussion to StackOverflow so others can help you tease out your actual needs and suggest appropriate solutions.  Ideally, you should start with real use cases rather than focusing on hacking up the LRU cache implementation.