This issue tracker has been migrated to GitHub, and is currently read-only.
For more information, see the GitHub FAQs in Python's Developer Guide.

Title: Make lru_cache(maxsize=None) more discoverable
Type: enhancement Stage: resolved
Components: Library (Lib) Versions: Python 3.9
Status: closed Resolution: fixed
Dependencies: Superseder:
Assigned To: Nosy List: rhettinger
Priority: normal Keywords: patch

Created on 2020-05-08 21:22 by rhettinger, last changed 2022-04-11 14:59 by admin. This issue is now closed.

Pull Requests
URL Status Linked Edit
PR 20019 merged rhettinger, 2020-05-09 23:51
Messages (3)
msg368469 - (view) Author: Raymond Hettinger (rhettinger) * (Python committer) Date: 2020-05-08 21:22
The recent discussions on python-ideas showed that people have a hard time finding the infinity-cache option for lru_cache().  Also, in the context of straight caching without limits, the name *lru_cache()* makes the tool seem complex and heavy when in fact, it is simple, lightweight, and fast (doing no more than a simple dictionary lookup).

We could easily solve both problems with a helper function:

    def cache(func):
        'Simple unbounded cache.  Sometimes called "memoize".'
        return lru_cache(maxsize=None, typed=False)(func)

It would be used like this:

     @cache
     def configure_server():
         return server_instance

There was some discussion about a completely new decorator with different semantics (holding a lock across a call to an arbitrary user function and being limited to zero-argument functions).  In all the examples that were presented, this @cache decorator would suffice.  None of the examples presented actually needed locking behavior.
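A minimal runnable sketch of the helper proposed in this message, defined inline here for illustration (the `square` function and `calls` counter are hypothetical, used only to show that repeated calls are served from the cache):

```python
from functools import lru_cache

def cache(func):
    'Simple unbounded cache.  Sometimes called "memoize".'
    return lru_cache(maxsize=None, typed=False)(func)

calls = 0

@cache
def square(x):
    # Count how many times the body actually executes.
    global calls
    calls += 1
    return x * x

square(3)
square(3)  # second call hits the cache; the body does not run again
print(calls)  # → 1
```

Because the helper simply delegates to lru_cache(maxsize=None), the decorated function keeps the usual lru_cache extras such as cache_info() and cache_clear().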
msg368471 - (view) Author: Raymond Hettinger (rhettinger) * (Python committer) Date: 2020-05-08 21:31
FWIW, this doesn't preclude the other proposal if it ever gains traction and moves forward.  This just takes existing functionality and improves clarity and discoverability.

The core issue is that if you only want a simple unbounded cache, it isn't obvious that that behavior is buried in the lru_cache() API.  In hindsight, this always should have been separate functionality.  And some day we may deprecate the maxsize=None option, which is somewhat opaque.
msg368684 - (view) Author: Raymond Hettinger (rhettinger) * (Python committer) Date: 2020-05-12 00:00
New changeset 21cdb711e3b1975398c54141e519ead02670610e by Raymond Hettinger in branch 'master':
bpo-40571: Make lru_cache(maxsize=None) more discoverable (GH-20019)
Date User Action Args
2022-04-11 14:59:30adminsetgithub: 84751
2020-05-12 00:01:23rhettingersetstatus: open -> closed
resolution: fixed
stage: patch review -> resolved
2020-05-12 00:00:57rhettingersetmessages: + msg368684
2020-05-09 23:51:14rhettingersetkeywords: + patch
stage: patch review
pull_requests: + pull_request19330
2020-05-08 21:31:50rhettingersetmessages: + msg368471
2020-05-08 21:22:40rhettingercreate