Message413790
Thanks, Raymond.
I agree that caching of iterators and generators is outside the scope of this issue.
Also, I agree that a separate async cache decorator should be added. I prefer the name `async_lru_cache` (and maybe `async_cache` for API symmetry). We already have `contextmanager` and `asynccontextmanager` in contextlib, along with `closing` / `aclosing`, `ExitStack` / `AsyncExitStack`, etc.
`async_lru_cache` should accept the same arguments as `lru_cache` but work with async functions.
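As a rough illustration of the idea (not the proposed stdlib implementation, which would reuse the internal C-accelerated machinery), here is a minimal pure-Python sketch: it memoizes the in-flight Task with `functools.lru_cache`, so concurrent callers with the same arguments await the same result. The name `async_lru_cache` is the one proposed above; everything else is an assumption for illustration.

```python
import asyncio
import functools

def async_lru_cache(maxsize=128, typed=False):
    """Sketch of an lru_cache analogue for async functions.

    Wraps each distinct call in an asyncio Task and caches the Task
    itself via functools.lru_cache.  Note: a failed Task caches its
    exception too; a real implementation would want to evict on error.
    """
    def decorator(coro_func):
        @functools.lru_cache(maxsize=maxsize, typed=typed)
        def cached(*args, **kwargs):
            # Schedule the coroutine once per distinct argument tuple.
            return asyncio.ensure_future(coro_func(*args, **kwargs))

        @functools.wraps(coro_func)
        async def wrapper(*args, **kwargs):
            # Awaiting a finished Task simply returns its stored result.
            return await cached(*args, **kwargs)

        # Mirror the lru_cache introspection API.
        wrapper.cache_info = cached.cache_info
        wrapper.cache_clear = cached.cache_clear
        return wrapper
    return decorator
```

Usage would mirror `lru_cache`: decorate an `async def` function, and repeated awaits with the same arguments hit the cache instead of re-running the coroutine.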
I think this function should be a part of the stdlib because the implementation shares the *internal* `_lru_cache_wrapper` that does all the dirty work (and has a C accelerator). A third-party library would have to either copy all these implementation details or import a private function from the stdlib and keep its fingers crossed that the private API remains backward compatible in future Python versions.
Similar reasoning applied to the contextlib async APIs.
Third parties can offer different features (time-to-live, expiration events, etc.) and can be async-framework specific (working with asyncio or trio only) -- those extensions are out of scope here.
My point is: the stdlib has built-in LRU cache support, and I love it. Let's add exactly what we already have for sync functions, but for async ones.
History:

Date                | User     | Action | Args
2022-02-23 13:38:46 | asvetlov | set    | recipients: + asvetlov, rhettinger, serhiy.storchaka, yselivanov, uranusjr
2022-02-23 13:38:46 | asvetlov | set    | messageid: <1645623526.93.0.0314956730582.issue46622@roundup.psfhosted.org>
2022-02-23 13:38:46 | asvetlov | link   | issue46622 messages
2022-02-23 13:38:46 | asvetlov | create |