Author thesheep
Recipients ncoghlan, rhettinger, serhiy.storchaka, thesheep
Date 2013-12-03.23:43:02
SpamBayes Score -1.0
Marked as misclassified Yes
Message-id <1386114182.74.0.17581807865.issue19859@psf.upfronthosting.co.za>
In-reply-to
Content
The method example is just the most common case where this problem is easy to see, but not the only one. We do indeed use a @cached_property decorator on properties (similar to https://github.com/mitsuhiko/werkzeug/blob/master/werkzeug/utils.py#L35), and I've actually written a @memoized_method decorator for methods that does pretty much what Serhiy suggests (except the logic isn't repeated all over the place, copy-pasta style, but kept in one decorator). That is fine, and @cached_property is actually superior, as it avoids a lookup and a check once the value has been calculated.
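For reference, a minimal sketch of such a cached_property descriptor, in the same spirit as the linked werkzeug code (this is an illustration, not the werkzeug implementation itself): because it is a non-data descriptor, storing the computed value in the instance __dict__ makes later attribute lookups bypass the descriptor entirely, which is the "avoids a lookup and a check" advantage mentioned above.

```python
class cached_property:
    """Compute the property once, then cache it on the instance.

    A non-data descriptor: after the first access the value sits in the
    instance __dict__ and shadows the descriptor on later lookups.
    """

    def __init__(self, func):
        self.func = func
        self.__doc__ = func.__doc__

    def __get__(self, obj, objtype=None):
        if obj is None:
            return self
        value = self.func(obj)
        obj.__dict__[self.func.__name__] = value
        return value


class Circle:
    """Hypothetical example class to exercise the descriptor."""
    computations = 0

    def __init__(self, r):
        self.r = r

    @cached_property
    def area(self):
        Circle.computations += 1
        return 3.14159 * self.r ** 2
```

The cached value lives on the instance, so it is garbage collected together with the instance, which is exactly why this pattern has no leak problem for methods and properties.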

However, this still doesn't solve the problems that are encountered in practice in actual code, like here: https://github.com/openstack/horizon/blob/master/openstack_dashboard/api/neutron.py#L735

Here we have a normal function, not a method, that calls a remote API through HTTP (the call is relatively slow, so we definitely want to cache it for multiple invocations). The function takes a ``request`` parameter, because it needs it for authentication with the remote service. The problem we had is that this will keep every single request in memory, because it's referenced by the cache.
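The leak can be demonstrated in a few lines (Request and list_networks here are hypothetical stand-ins for the horizon code, not the actual API): functools.lru_cache keys its entries on the arguments and therefore holds a strong reference to every request it has ever seen.

```python
# Sketch of the leak, assuming a hypothetical Request class and a
# hypothetical cached function mirroring the linked horizon code.
import functools


class Request:
    """Stand-in for a web request object (hypothetical)."""


@functools.lru_cache(maxsize=None)
def list_networks(request):
    # Imagine a slow HTTP call to a remote API here; caching it is
    # clearly desirable, but the cache now owns a reference to request.
    return ["net-1", "net-2"]
```

Even after the caller drops the request, a weakref to it stays alive, because the cache's key tuple still references it.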

Somehow it feels wrong to store the cache as an attribute of an arbitrary argument, like the request in this case, and it's easy to imagine a function that takes two such critical arguments.

This is the code that actually made me write the weakref version of the @memoized decorator that I linked initially, and I thought that it could also be useful to have that in Python's caching decorator as an option.
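The idea can be sketched roughly as follows (this is a simplified illustration of the weakref approach, not the exact decorator linked earlier): key the cache weakly on the first argument, so that once the request is garbage collected, all of its cache entries disappear with it.

```python
# Sketch of a weakref-keyed memoizing decorator, assuming the first
# positional argument (e.g. a request) is weak-referenceable and the
# remaining arguments are hashable.
import functools
import weakref


def memoized(func):
    # Outer cache holds only weak references to the first argument;
    # each entry maps the remaining argument tuple to the result.
    cache = weakref.WeakKeyDictionary()

    @functools.wraps(func)
    def wrapper(first, *args):
        try:
            inner = cache[first]
        except KeyError:
            inner = cache[first] = {}
        try:
            return inner[args]
        except KeyError:
            result = inner[args] = func(first, *args)
            return result

    return wrapper
```

With this scheme the cache never extends the lifetime of the request: dropping the last strong reference to it also drops its cached results.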

I can understand if you think that this is too much, and that in such tricky situations the programmer should either write their own caching, or rewrite the code to avoid a memory leak. But I am sure that most programmers will not even think about this caveat. I think that we should at least add a note to lru_cache's documentation warning about this scenario and advising them to write their own caching decorators.
History
Date User Action Args
2013-12-03 23:43:02thesheepsetrecipients: + thesheep, rhettinger, ncoghlan, serhiy.storchaka
2013-12-03 23:43:02thesheepsetmessageid: <1386114182.74.0.17581807865.issue19859@psf.upfronthosting.co.za>
2013-12-03 23:43:02thesheeplinkissue19859 messages
2013-12-03 23:43:02thesheepcreate