classification
Title: functools.lru_cache user specified cachedict support
Type: enhancement
Stage: resolved
Components: Library (Lib)
Versions: Python 3.6

process
Status: closed
Resolution: rejected
Dependencies:
Superseder:
Assigned To: rhettinger
Nosy List: josh.r, rhettinger, wdv4758h
Priority: normal
Keywords: patch

Created on 2016-01-11 19:23 by wdv4758h, last changed 2022-04-11 14:58 by admin. This issue is now closed.

Files
File name: functools.lru_cache-user-specified-cachedict.patch
Uploaded by: wdv4758h, 2016-01-11 19:23
Description: add user specified cachedict support
Messages (3)
msg258001 - (view) Author: Chiu-Hsiang Hsu (wdv4758h) * Date: 2016-01-11 19:23
Currently, lru_cache automatically constructs a Python dictionary to use as its cachedict. IMHO, it would be much more flexible to let users specify their own cachedict, so they can use any kind of dict-like class as the cache. That way, users could plug in any dictionary implementation and save the results in whatever form they want.

For example:

Using an OrderedDict:

.. code-block:: python

    from functools import lru_cache
    from collections import OrderedDict

    @lru_cache(maxsize=None, cache=OrderedDict())
    def func(*args, **kwargs):
        pass


Saving the cache with pickle:

.. code-block:: python

    import atexit
    import os
    import pickle
    from functools import lru_cache

    filename = "cache.pickle"
    cache = {}

    def load_cache():
        global cache
        if os.path.isfile(filename):
            with open(filename, "rb") as f:
                cache = pickle.load(f)

    def store_cache():
        with open(filename, "wb") as f:
            pickle.dump(cache, f)

    load_cache()
    atexit.register(store_cache)    # persist the cache when the interpreter exits

    @lru_cache(maxsize=None, cache=cache)
    def func(*args, **kwargs):
        pass
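
Since the current lru_cache has no such argument, here is a rough pure-Python sketch of the behaviour this request asks for; the decorator name ``memoize_with`` and its signature are hypothetical, not part of the attached patch.

.. code-block:: python

    from functools import wraps

    def memoize_with(cache):
        """Hypothetical decorator: memoize into a caller-supplied mapping."""
        sentinel = object()              # distinguishes "not cached" from a cached None
        def decorator(func):
            @wraps(func)
            def wrapper(*args):          # keyword arguments omitted for brevity
                result = cache.get(args, sentinel)
                if result is sentinel:
                    result = func(*args)
                    cache[args] = result
                return result
            return wrapper
        return decorator

Any mutable mapping with a ``get`` method (the OrderedDict above, or the pickled ``cache`` dict) could then be passed in.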
msg258049 - (view) Author: Josh Rosenberg (josh.r) * (Python triager) Date: 2016-01-12 01:37
Given that lru_cache uses the cache dict in very specific ways, supporting arbitrary mapping types would be extremely hard. Among other things:

1. The C code uses the concrete dict APIs (including private APIs), which would not work on arbitrary mappings that don't directly inherit from dict (and often wouldn't work properly even if they do inherit from dict). Switching to the abstract mapping APIs would slow down every caller to serve an extremely uncommon use case.

2. The C code operates under the assumption that specific operations cannot release the GIL (e.g. dict insertion and deletion are done after precomputing the hash of the key, so it's impossible for Python byte code to be executed), so it can safely ignore thread safety issues. If a non-dict mapping implemented in Python rather than C were provided, these assumptions could easily be violated (see the sketch after this list).

3. This is basically a superset of the request from #23030, which rhettinger has rejected (you can read the rationale there).
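
A minimal illustration of point 2 (the class below is hypothetical, not taken from the patch):

.. code-block:: python

    class PyMapping(dict):
        """Hypothetical cache type whose __setitem__ is Python code."""
        def __setitem__(self, key, value):
            # Arbitrary Python byte code runs on every insertion, so the GIL
            # can be released here and another thread may touch the cache
            # mid-update -- exactly what the C code assumes cannot happen.
            # (Per point 1, the current C implementation uses the concrete
            # dict API, which would bypass this override entirely.)
            print("caching", key)
            super().__setitem__(key, value)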
msg292261 - (view) Author: Raymond Hettinger (rhettinger) * (Python committer) Date: 2017-04-25 14:36
Marking as closed/rejected for the reasons listed by Josh.  The functools.lru_cache() decorator is somewhat tightly focused and is trying to do one thing well.  Another reason is that at some point we want to be able to change the internals (perhaps using the new compact/ordered dict), and that would be precluded by this feature request.
History
Date                  User              Action  Args
2022-04-11 14:58:26   admin             set     github: 70270
2017-04-25 14:36:07   rhettinger        set     status: open -> closed; resolution: rejected; messages: + msg292261; stage: resolved
2017-04-25 06:03:55   serhiy.storchaka  set     assignee: rhettinger
2016-01-12 01:37:49   josh.r            set     nosy: + rhettinger, josh.r; messages: + msg258049
2016-01-11 19:23:57   wdv4758h          create