This issue tracker has been migrated to GitHub, and is currently read-only.
For more information, see the GitHub FAQs in the Python Developer's Guide.

classification
Title: [functools] provide an async-compatible version of functools.lru_cache
Type: enhancement Stage:
Components: asyncio Versions: Python 3.8
process
Status: open Resolution:
Dependencies: Superseder:
Assigned To: Nosy List: Liran Nuna, asvetlov, brett.cannon, xtreak, yselivanov
Priority: normal Keywords:

Created on 2018-10-22 01:39 by Liran Nuna, last changed 2022-04-11 14:59 by admin.

Files
File name: test-case.py
Uploaded by: Liran Nuna, 2018-10-22 01:39
Messages (6)
msg328228 - Author: Liran Nuna (Liran Nuna) * Date: 2018-10-22 01:39
lru_cache is a very useful decorator, but it does not work well with coroutines, since a coroutine object can only be awaited once.

Take, for example, the attached code (test-case.py): it will throw a RuntimeError because you cannot reuse an already awaited coroutine.
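The attached test-case.py is not shown in this page, but a minimal reproduction of the reported failure might look like this (hypothetical `double` function for illustration):

```python
import asyncio
import functools

# Applying lru_cache directly to an async def caches the coroutine
# *object*, not its result -- this is the behavior being reported.
@functools.lru_cache(maxsize=None)
async def double(x):
    await asyncio.sleep(0)
    return x * 2

async def main():
    first = await double(21)       # first call: the cached coroutine is awaited
    try:
        await double(21)           # cache hit returns the SAME coroutine object
    except RuntimeError as exc:    # "cannot reuse already awaited coroutine"
        return first, exc
    return first, None

result, err = asyncio.run(main())
```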

A solution would be to call `asyncio.ensure_future` on the result of the coroutine if detected.
msg328237 - Author: Andrew Svetlov (asvetlov) * (Python committer) Date: 2018-10-22 06:33
Coroutine detection is a relatively slow check.
I don't think we need to do it in `functools.lru_cache`.

There is a specialized asyncio compatible version: https://github.com/aio-libs/async_lru
Please use it.
msg328274 - Author: Brett Cannon (brett.cannon) * (Python committer) Date: 2018-10-22 21:38
Making this a feature request.
msg343687 - Author: Andrew Svetlov (asvetlov) * (Python committer) Date: 2019-05-27 20:20
Brett, please elaborate.
Do you want to incorporate async_lru library into CPython Core?
msg343816 - Author: Brett Cannon (brett.cannon) * (Python committer) Date: 2019-05-28 20:10
I was just saying that this is an enhancement request; no judgment about whether we want to act on it.
msg343818 - Author: Liran Nuna (Liran Nuna) * Date: 2019-05-28 20:44
> Coroutine detection is a relatively slow check.
> I don't think we need to do it in `functools.lru_cache`.

Wouldn't the coroutine check only happen at decoration time? To solve this efficiently, we only need to wrap the result in `asyncio.ensure_future` if the decorated function is a coroutine function; the wrapping happens only when a result comes back from the decorated function, so the impact would be minimal.

Of course, I don't know much about the internals of `lru_cache`, so my assumptions could be wrong. I should familiarize myself with the implementation and figure out how doable it would be.
History
Date User Action Args
2022-04-11 14:59:07  admin  set  github: 79221
2019-05-28 20:44:49  Liran Nuna  set  messages: + msg343818
2019-05-28 20:10:11  brett.cannon  set  messages: + msg343816
2019-05-27 20:20:56  asvetlov  set  messages: + msg343687
2018-10-22 21:38:04  brett.cannon  set  nosy: + brett.cannon
                                        title: functools.lru_cache does not work with coroutines -> [functools] provide an async-compatible version of functools.lru_cache
                                        messages: + msg328274
                                        versions: - Python 3.5, Python 3.6, Python 3.7
                                        type: enhancement
2018-10-22 06:33:07  asvetlov  set  messages: + msg328237
2018-10-22 02:59:48  xtreak  set  nosy: + xtreak
2018-10-22 01:39:51  Liran Nuna  create