
Classification
Title: Document that lru_cache uses hard references
Type: resource usage
Stage: resolved
Components: Library (Lib)
Versions: Python 3.11, Python 3.10, Python 3.9

Process
Status: closed
Resolution: fixed
Dependencies:
Superseder: functools.lru_cache keeps objects alive forever (issue 19859)
Assigned To: rhettinger
Nosy List: Wouter De Borger2, cryvate, kj, miss-islington, nanjekyejoannah, pablogsal, python-dev, rhettinger, serhiy.storchaka
Priority: normal
Keywords: patch

Created on 2021-06-04 11:45 by Wouter De Borger2, last changed 2022-04-11 14:59 by admin. This issue is now closed.

Pull Requests
URL        Status   Linked by         Date
PR 26528   closed   python-dev        2021-06-04 12:22
PR 26715   merged   rhettinger        2021-06-14 05:29
PR 26716   merged   miss-islington    2021-06-14 05:47
PR 26731   merged   rhettinger        2021-06-15 06:52
PR 26777   merged   miss-islington    2021-06-17 20:39
PR 26778   closed   miss-islington    2021-06-17 20:39
PR 26789   merged   rhettinger        2021-06-18 19:27
Messages (21)
msg395075 - (view) Author: Wouter De Borger (Wouter De Borger2) Date: 2021-06-04 11:45
# Problem

The functools.lru_cache decorator keeps hard references to all arguments of the decorated function (including self) in memory, causing hard-to-find memory leaks.

# Expected  

I had assumed that lru_cache would keep weak references and that, when an object is garbage collected, all of its cache entries would expire as unreachable. This is not the case.

# Solutions 

1. I think it is worth at least mentioning this behavior in the documentation.
2. I also think it would be good if the LRU cache actually used weak references.

I will try to make a PR for this.
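
A minimal sketch of the behavior (the Resource class and describe() function are hypothetical, just for illustration):

import functools
import gc
import weakref

class Resource:
    """Stand-in for a large object."""

@functools.lru_cache(maxsize=None)
def describe(obj):
    return repr(obj)

r = Resource()
describe(r)                  # the cache now holds a hard reference to r
probe = weakref.ref(r)
del r
gc.collect()
print(probe() is not None)   # True: the cache entry keeps the instance alive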
msg395087 - (view) Author: Pablo Galindo Salgado (pablogsal) * (Python committer) Date: 2021-06-04 14:46
Using a weak dictionary is not a correct solution, as the cache must take strong ownership of the arguments and the return value to do its job properly. Moreover, there are many types in Python that don't support weak references, so this would be a backwards-incompatible change and would limit the cache quite a lot.
msg395088 - (view) Author: Ken Jin (kj) * (Python committer) Date: 2021-06-04 14:55
@Wouter
Hmm, I thought most use cases of lru_cache benefit from strong references for predictable hit rates? I'm not an expert in this area, so I nosied in someone else who is.

However, I noticed that the current docs don't mention the strong-reference behavior anywhere. So I think your suggestion to amend the docs is an improvement, thanks!
msg395125 - (view) Author: Raymond Hettinger (rhettinger) * (Python committer) Date: 2021-06-04 21:31
Also note that many important objects in Python are not weakly referenceable; tuples, for example.
msg395144 - (view) Author: Raymond Hettinger (rhettinger) * (Python committer) Date: 2021-06-05 00:06
I'm thinking of a more minimal and targeted edit than what is in the PR.   

Per the dev guide, we usually word the docs in an affirmative and specific manner (here is what the tool does and an example of how to use it).  Recounting a specific debugging case or misassumption usually isn't worthwhile unless it is a common misconception.

For strong versus weak references, we've had no previous reports even though the lru_cache() has been around for a long time.  Likely, that is because the standard library uses strong references everywhere unless specifically documented to the contrary.  Otherwise, we would have to add a strong-reference note to every stateful object in the language.

Another reason that it likely hasn't mattered to other users is that an lru cache automatically purges old entries.  If an object is no longer used, it cycles out as new items are added to the cache.  Arguably, a key feature of an LRU algorithm is that you don't have to think about the lifetime of objects.

I'll think on it for a while and will propose an alternate edit that focuses on how the cache works with methods.  The essential point is that the instance is included in the cache key (which is usually what people want).  Discussing weak vs. strong references is likely just a distraction.
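
A rough sketch (the Weather class is hypothetical) of what "the instance is included in the cache key" means in practice:

import functools

class Weather:
    @functools.lru_cache(maxsize=128)
    def forecast(self, day):
        print("computing", day)
        return f"{day}: sunny"

w1, w2 = Weather(), Weather()
w1.forecast("mon")    # computed; the cache key is (w1, "mon")
w1.forecast("mon")    # cache hit: same instance, same argument
w2.forecast("mon")    # computed again: a different instance means a different key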
msg395149 - (view) Author: Pablo Galindo Salgado (pablogsal) * (Python committer) Date: 2021-06-05 02:13
Agreed! I will leave the PR to you :)
msg395152 - (view) Author: Raymond Hettinger (rhettinger) * (Python committer) Date: 2021-06-05 04:34
It may be useful to link back to @cached_property() for folks wanting method caching tied to the lifespan of an instance rather than actual LRU logic.
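
A small sketch of that alternative (the Dataset class is made up for illustration); cached_property stores the computed value on the instance itself, so it is released together with the instance:

import functools

class Dataset:
    def __init__(self, path):
        self.path = path

    @functools.cached_property
    def stats(self):
        # computed once per instance and stored in the instance's __dict__
        print("computing stats for", self.path)
        return {"rows": 0}

d = Dataset("a.csv")
d.stats     # computed
d.stats     # reuses the value stored on the instance
del d       # nothing else keeps the cached value alive

Note that cached_property only covers zero-argument methods; for methods that take arguments, a per-instance lru_cache is the closer analogue.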
msg395157 - (view) Author: Serhiy Storchaka (serhiy.storchaka) * (Python committer) Date: 2021-06-05 09:22
This is a full duplicate of issue19859. Both ideas, using weak references and changing the documentation, were rejected.
msg395158 - (view) Author: Joannah Nanjekye (nanjekyejoannah) * (Python committer) Date: 2021-06-05 09:31
I saw the thread, but the idea was rejected by @rhettinger, who this time seems to be suggesting the documentation changes himself.

Maybe he has changed his mind, in which case he can explain the reasoning behind his decision if he wants.
msg395336 - (view) Author: Henk-Jaap Wagenaar (cryvate) * Date: 2021-06-08 15:42
Reading this bug thread last week made me realize we had made the following error in our code:


import functools

class SomethingView:
    @functools.lru_cache()
    def get_object(self):
        # the cache lives on the function object shared by the class,
        # so every instance passed in as `self` is kept alive in it
        return self._object


Now, since this class was instantiated for every (particular kind of) request to the web server and this method was called a few times per request, the lru_cache just kept filling up and up. We had been having a memory leak we couldn't track down, and this was it.

I think this is an easy mistake to make, and it was rooted not so much in hard references (though without them it would not have leaked memory) as in the fact that the cache lives on the class and not on the instance.
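
One possible fix, sketched here with a hypothetical _get_object helper (since get_object takes no arguments, functools.cached_property would work just as well), is to create the cache in __init__ so that it lives and dies with the instance:

import functools

class SomethingView:
    def __init__(self, obj):
        self._object = obj
        # per-instance cache: it is stored on the instance and collected
        # together with it, instead of living on the class
        self.get_object = functools.lru_cache(maxsize=None)(self._get_object)

    def _get_object(self):
        return self._object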
msg395774 - (view) Author: Raymond Hettinger (rhettinger) * (Python committer) Date: 2021-06-14 05:47
New changeset fafcfff9262ae9dee03a00006638dfcbcfc23a7b by Raymond Hettinger in branch 'main':
bpo-44310:  Note that lru_cache keep references to both arguments and results (GH-26715)
https://github.com/python/cpython/commit/fafcfff9262ae9dee03a00006638dfcbcfc23a7b
msg395796 - (view) Author: Raymond Hettinger (rhettinger) * (Python committer) Date: 2021-06-14 13:43
New changeset 809c3faa032d32bc45a0fa54d0400fcbc42a618f by Miss Islington (bot) in branch '3.10':
bpo-44310:  Note that lru_cache keep references to both arguments and results (GH-26715) (GH-26716)
https://github.com/python/cpython/commit/809c3faa032d32bc45a0fa54d0400fcbc42a618f
msg395887 - (view) Author: Raymond Hettinger (rhettinger) * (Python committer) Date: 2021-06-15 17:50
See PR 26731 for a draft FAQ entry.  Let me know what you think.
msg395912 - (view) Author: Henk-Jaap Wagenaar (cryvate) * Date: 2021-06-16 08:58
PR 26731 looks very good to me. My only comment, which I am not sure is worth adding (or is just a general lru_cache thing), is about the sentence "instances are kept alive until they age out of the cache or until the cache is cleared": if you are creating instances and calling this method all the time, this will lead to an infinite memory leak.

Not sure whether that's too specific to the problem we encountered (and we are all consenting adults who should infer this) or whether it is helpful; I'll leave it up to your and other people's judgement.

P.S. In programming.rst there is also the "Why are default values shared between objects?" section, which actually uses default values to build its own poor version of a cache. It should probably at least mention that lru_cache could be used (unless you particularly need callers to be able to pass their own cache).
msg396016 - (view) Author: Raymond Hettinger (rhettinger) * (Python committer) Date: 2021-06-17 20:14
> if you are creating instances and calling this method 
> all the time, will lead to an infinite memory leak.

Your words aren't making any sense to me.  The default 
lru_cache will never hold more than maxsize entries.
The default maxsize is 128.  How is that "infinite"?
msg396017 - (view) Author: Henk-Jaap Wagenaar (cryvate) * Date: 2021-06-17 20:28
I was clearly missing some words there, Raymond. I meant: if one has set maxsize=None.
msg396018 - (view) Author: Henk-Jaap Wagenaar (cryvate) * Date: 2021-06-17 20:33
(But, consenting adults: if you set maxsize=None for "efficiency", you had better be sure of what you are doing in a long-running process and make sure the cache cannot grow unbounded.)
msg396019 - (view) Author: Raymond Hettinger (rhettinger) * (Python committer) Date: 2021-06-17 20:37
> I meant, if one has set maxsize=None.

The docs already say, "If maxsize is set to None, the LRU feature is disabled and the cache can grow without bound."
msg396020 - (view) Author: Raymond Hettinger (rhettinger) * (Python committer) Date: 2021-06-17 20:39
New changeset 7f01f77f8fabcfd7ddb5d99f12d6fc99af9af384 by Raymond Hettinger in branch 'main':
bpo-44310: Add a FAQ entry for caching method calls (GH-26731)
https://github.com/python/cpython/commit/7f01f77f8fabcfd7ddb5d99f12d6fc99af9af384
msg396021 - (view) Author: Raymond Hettinger (rhettinger) * (Python committer) Date: 2021-06-17 21:14
New changeset 77eaf14d278882857e658f83681e5b9a52cf14ac by Miss Islington (bot) in branch '3.10':
bpo-44310: Add a FAQ entry for caching method calls (GH-26731) (GH-26777)
https://github.com/python/cpython/commit/77eaf14d278882857e658f83681e5b9a52cf14ac
msg396178 - (view) Author: Raymond Hettinger (rhettinger) * (Python committer) Date: 2021-06-20 15:11
Adding a weak referencing recipe here just so I can find it in the future.

--------------------------------------------------------------------------

import functools
import weakref

def weak_lru(maxsize=128, typed=False):
    """LRU Cache decorator that keeps a weak reference to "self".

    Only provides benefit if the instances are so large that
    it is impractical to wait for them to age out of the cache.

    When the instance is freed, the cache entry still remains
    but will be unreachable.

    If new instances will be created that are equal to the ones
    retired by the weak reference, we lose all the benefits of
    having cached the previous call.  

    If the class defines __slots__, be sure to add '__weakref__'
    to make the instances weak referenceable.

    """

    def decorator(func):

        ref = weakref.ref

        @functools.lru_cache(maxsize, typed)
        def _func(_self, /, *args, **kwargs):
            return func(_self(), *args, **kwargs)

        @functools.wraps(func)
        def wrapper(self, /, *args, **kwargs):
            return _func(ref(self), *args, **kwargs)

        return wrapper

    return decorator
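
A hypothetical usage sketch (the BigImage class is made up; ordinary classes are weak referenceable by default):

class BigImage:
    def __init__(self, pixels):
        self.pixels = pixels

    @weak_lru(maxsize=32)
    def brightness(self, scale):
        # cached per (instance, scale) without keeping the instance alive
        return scale * sum(self.pixels) / len(self.pixels)

img = BigImage([10, 20, 30])
img.brightness(2.0)    # computed
img.brightness(2.0)    # cache hit
del img                # the instance can be freed; its stale entry ages out later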
History
Date                 User               Action  Args
2022-04-11 14:59:46  admin              set     github: 88476
2021-06-20 15:11:01  rhettinger         set     messages: + msg396178
2021-06-18 19:27:32  rhettinger         set     pull_requests: + pull_request25373
2021-06-17 21:14:49  rhettinger         set     messages: + msg396021
2021-06-17 20:39:56  miss-islington     set     pull_requests: + pull_request25364
2021-06-17 20:39:52  rhettinger         set     messages: + msg396020
2021-06-17 20:39:51  miss-islington     set     pull_requests: + pull_request25363
2021-06-17 20:37:23  rhettinger         set     messages: + msg396019
2021-06-17 20:33:05  cryvate            set     messages: + msg396018
2021-06-17 20:28:07  cryvate            set     messages: + msg396017
2021-06-17 20:14:30  rhettinger         set     messages: + msg396016
2021-06-16 08:58:33  cryvate            set     messages: + msg395912
2021-06-15 17:50:41  rhettinger         set     messages: + msg395887
2021-06-15 06:52:56  rhettinger         set     pull_requests: + pull_request25319
2021-06-14 13:43:56  rhettinger         set     messages: + msg395796
2021-06-14 05:54:45  rhettinger         set     status: open -> closed; resolution: duplicate -> fixed; stage: patch review -> resolved
2021-06-14 05:47:38  miss-islington     set     nosy: + miss-islington; pull_requests: + pull_request25304
2021-06-14 05:47:35  rhettinger         set     messages: + msg395774
2021-06-14 05:29:22  rhettinger         set     pull_requests: + pull_request25303
2021-06-08 15:42:07  cryvate            set     nosy: + cryvate; messages: + msg395336
2021-06-05 09:31:58  nanjekyejoannah    set     messages: + msg395158
2021-06-05 09:22:10  serhiy.storchaka   set     nosy: + serhiy.storchaka; messages: + msg395157; resolution: duplicate; superseder: functools.lru_cache keeps objects alive forever
2021-06-05 04:34:04  rhettinger         set     messages: + msg395152
2021-06-05 02:13:28  pablogsal          set     messages: + msg395149
2021-06-05 00:06:37  rhettinger         set     messages: + msg395144
2021-06-04 21:31:51  rhettinger         set     messages: + msg395125; title: lru_cache memory leak -> Document that lru_cache uses hard references
2021-06-04 21:27:57  rhettinger         set     assignee: rhettinger
2021-06-04 21:19:57  terry.reedy        set     versions: - Python 3.6, Python 3.7, Python 3.8
2021-06-04 14:55:49  kj                 set     nosy: + rhettinger, kj; messages: + msg395088
2021-06-04 14:46:56  pablogsal          set     messages: + msg395087
2021-06-04 12:39:59  nanjekyejoannah    set     nosy: + pablogsal, nanjekyejoannah
2021-06-04 12:22:20  python-dev         set     keywords: + patch; nosy: + python-dev; pull_requests: + pull_request25122; stage: patch review
2021-06-04 11:53:01  Wouter De Borger2  set     type: resource usage
2021-06-04 11:45:58  Wouter De Borger2  create