
Author rhettinger
Recipients rhettinger
Date 2020-05-08.21:22:40
Message-id <1588972960.42.0.924669050455.issue40571@roundup.psfhosted.org>
In-reply-to
Content
The recent discussions on python-ideas showed that people have a hard time finding the infinity-cache option for lru_cache().  Also, in the context of straight caching without limits, the name *lru_cache()* makes the tool seem complex and heavy when in fact, it is simple, lightweight, and fast (doing no more than a simple dictionary lookup).

We could easily solve both problems with a helper function:

    def cache(func):
        'Simple unbounded cache.  Sometimes called "memoize".'
        return lru_cache(maxsize=None, typed=False)(func)

It would be used like this:

     @cache
     def configure_server():
         ...
         return server_instance
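For illustration, here is the helper in runnable form. Note that lru_cache(maxsize=None) returns a decorator, which must then be applied to func; the configure_server() body below is a hypothetical stand-in for expensive one-time setup:

```python
from functools import lru_cache

def cache(func):
    'Simple unbounded cache.  Sometimes called "memoize".'
    return lru_cache(maxsize=None, typed=False)(func)

call_count = 0

@cache
def configure_server():
    # Hypothetical stand-in for expensive one-time setup work.
    global call_count
    call_count += 1
    return {"host": "localhost", "port": 8080}

first = configure_server()
second = configure_server()
assert first is second   # the same cached object is returned
assert call_count == 1   # the underlying function ran only once
```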

There was some discussion about a completely new decorator with different semantics (holding a lock across a call to an arbitrary user function and being limited to zero-argument functions).  In all the examples that were presented, this @cache decorator would suffice.  None of the examples presented actually needed locking behavior.
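To illustrate the semantic difference, here is a small sketch (assuming CPython's functools.lru_cache) showing that lru_cache holds no lock across the user function: two threads that miss simultaneously both execute the function, and later calls are still served from the cache.  A lock-holding decorator would deadlock at the barrier instead:

```python
import threading
from functools import lru_cache

calls = []                      # list.append is thread-safe under the GIL
barrier = threading.Barrier(2)  # rendezvous point inside the user function

@lru_cache(maxsize=None)
def setup():
    calls.append(None)
    barrier.wait()  # keep both threads inside the call at the same time
    return object()

t1 = threading.Thread(target=setup)
t2 = threading.Thread(target=setup)
t1.start(); t2.start()
t1.join(); t2.join()

assert len(calls) == 2     # no lock was held: both first calls ran the function
assert setup() is setup()  # subsequent calls hit the cache
assert len(calls) == 2     # ...without re-running the function
```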