Message368469
The recent discussions on python-ideas showed that people have a hard time finding the infinity-cache option for lru_cache(). Also, in the context of straight caching without limits, the name *lru_cache()* makes the tool seem complex and heavy when, in fact, it is simple, lightweight, and fast (doing no more than a simple dictionary lookup).
We could easily solve both problems with a helper function:
def cache(func):
    'Simple unbounded cache.  Sometimes called "memoize".'
    return lru_cache(maxsize=None, typed=False)(func)
It would be used like this:
@cache
def configure_server():
    ...
    return server_instance
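A minimal runnable sketch of the behavior (the configure_server body below is a hypothetical stub standing in for real server setup; the counter just makes the caching visible):

```python
from functools import lru_cache

def cache(func):
    'Simple unbounded cache.  Sometimes called "memoize".'
    return lru_cache(maxsize=None, typed=False)(func)

call_count = 0

@cache
def configure_server():
    global call_count
    call_count += 1          # body runs only on a cache miss
    return "server_instance"  # stand-in for a real server object

configure_server()
configure_server()
print(call_count)  # 1 -- the second call is served from the cache
```

Because the helper is just lru_cache(maxsize=None) underneath, the wrapped function also keeps cache_info() and cache_clear() for introspection.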
There was some discussion about a completely new decorator with different semantics (holding a lock across a call to an arbitrary user function and being limited to zero-argument functions). In all the examples that were presented, this @cache decorator would suffice. None of the examples presented actually needed locking behavior.
Date | User | Action | Args
2020-05-08 21:22:40 | rhettinger | set | recipients: + rhettinger
2020-05-08 21:22:40 | rhettinger | set | messageid: <1588972960.42.0.924669050455.issue40571@roundup.psfhosted.org>
2020-05-08 21:22:40 | rhettinger | link | issue40571 messages
2020-05-08 21:22:40 | rhettinger | create |