This issue tracker has been migrated to GitHub, and is currently read-only.
For more information, see the GitHub FAQs in the Python Developer's Guide.

classification
Title: Expose the value passed for typed to functools.lru_cache
Type: enhancement Stage: resolved
Components: Library (Lib) Versions: Python 3.9
process
Status: closed Resolution: fixed
Dependencies: Superseder:
Assigned To: rhettinger Nosy List: Manjusaka, Scott Sanderson2, rhettinger, serhiy.storchaka
Priority: normal Keywords: patch

Created on 2019-10-23 14:50 by Scott Sanderson2, last changed 2022-04-11 14:59 by admin. This issue is now closed.

Pull Requests
URL       Status  Linked
PR 16915  closed  Manjusaka, 2019-10-24 06:53
PR 16916  merged  Manjusaka, 2019-10-24 08:10
Messages (13)
msg355226 - (view) Author: Scott Sanderson (Scott Sanderson2) Date: 2019-10-23 14:50
In some circumstances, it's useful to be able to inspect the parameters with which an instance of functools.lru_cache was created. It's currently possible to recover the cache's maxsize via the .cache_info() method, but there's no way to recover the value passed for `typed`, which controls whether the lru_cache's cache is partitioned by input type.

This came up in the context of cloudpickle, a library that tries to extend pickle to support more types (in particular, interactively-defined functions and classes) for use-cases like cluster computing. 

It's currently not possible to pickle an lru_cache-decorated function that's defined in __main__ (which includes, e.g., a Jupyter Notebook). We can **almost** fix this with a pure-library solution (see https://github.com/cloudpipe/cloudpickle/pull/309), but we're currently blocked by the fact that there's no way to recover the value that was passed for `typed`. Exposing a .typed attribute on the extension type for lru_cached functions fixes this.
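
For illustration, a minimal sketch of the gap on Python 3.8: cache_info() exposes maxsize, but nothing on the wrapper records typed:

import functools

@functools.lru_cache(maxsize=200, typed=True)
def f(x):
    return x * 2

# cache_info() reports hits, misses, maxsize and currsize ...
print(f.cache_info())        # CacheInfo(hits=0, misses=0, maxsize=200, currsize=0)

# ... but nothing on the wrapper records that typed=True was used, so the
# decorated function cannot be reconstructed faithfully when unpickling.
print(hasattr(f, 'typed'))   # False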

For more discussion, see the above linked PR, along with https://github.com/cloudpipe/cloudpickle/issues/178.
msg355277 - (view) Author: Manjusaka (Manjusaka) * Date: 2019-10-24 05:33
I think it's a good idea to expose the full set of arguments that people pass to lru_cache()
msg355284 - (view) Author: Manjusaka (Manjusaka) * Date: 2019-10-24 06:56
I have already made a PR for this issue,

but here's a problem: adding a new field to cache_info may cause problems for third-party libraries that depend on cache_info.
msg355285 - (view) Author: Raymond Hettinger (rhettinger) * (Python committer) Date: 2019-10-24 07:09
I'm fine with adding a new function:

   >>> f.cache_parameters()
   {'maxsize': 200, 'typed': False}
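
A minimal sketch of how such a call could serve the pickling use case above, assuming it is available (Python 3.9+); _rebuild_lru_cache is a hypothetical helper for illustration, not cloudpickle's actual API:

import functools

def _rebuild_lru_cache(user_function, params):
    # Re-apply the decorator with the recovered parameters.
    return functools.lru_cache(maxsize=params['maxsize'],
                               typed=params['typed'])(user_function)

@functools.lru_cache(maxsize=200)
def f(x):
    return x * 2

rebuilt = _rebuild_lru_cache(f.__wrapped__, f.cache_parameters())
assert rebuilt(3) == f(3)
assert rebuilt.cache_parameters() == f.cache_parameters()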
msg355286 - (view) Author: Manjusaka (Manjusaka) * Date: 2019-10-24 07:21
I think adding a new function would be a better way.

I will make a new PR
msg355288 - (view) Author: Serhiy Storchaka (serhiy.storchaka) * (Python committer) Date: 2019-10-24 07:33
It is easy:

diff --git a/Lib/functools.py b/Lib/functools.py
index 3192bd02d9..52c07db749 100644
--- a/Lib/functools.py
+++ b/Lib/functools.py
@@ -499,7 +499,9 @@ def lru_cache(maxsize=128, typed=False):
         # The user_function was passed in directly via the maxsize argument
         user_function, maxsize = maxsize, 128
         wrapper = _lru_cache_wrapper(user_function, maxsize, typed, _CacheInfo)
-        return update_wrapper(wrapper, user_function)
+        func = update_wrapper(wrapper, user_function)
+        func.cache_parameters = lambda: {'maxsize': maxsize, 'typed': typed}
+        return func
     elif maxsize is not None:
         raise TypeError(
             'Expected first argument to be an integer, a callable, or None')

But there are many design questions. Why a method instead of just an attribute?

        func.cache_parameters = {'maxsize': maxsize, 'typed': typed}

Or maybe just add the "typed" attribute?

        func.typed = typed


Also, I am considering adding a more general "make_key" parameter to lru_cache(). The "typed" parameter would then just specify the default value for "make_key".
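
One practical difference between the method and the plain attribute, sketched here with module-level names rather than the real wrapper (illustrative only): the lambda returns a fresh dict on every call, so callers cannot mutate the recorded parameters in place.

maxsize, typed = 128, False

def wrapper():
    pass

# Method: each call builds a new dict from the closed-over values.
wrapper.cache_parameters = lambda: {'maxsize': maxsize, 'typed': typed}

d = wrapper.cache_parameters()
d['typed'] = True                                     # mutate the returned copy ...
assert wrapper.cache_parameters()['typed'] is False   # ... the recorded value is intact

# Attribute: a single shared dict that any caller could modify in place.
# wrapper.cache_parameters = {'maxsize': maxsize, 'typed': typed}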
msg355291 - (view) Author: Manjusaka (Manjusaka) * Date: 2019-10-24 07:42
I think modifying lru_cache in Python should be fine in normal circumstances.

But some people may modify the wrapped object underlying lru_cache.

So I'd prefer to add a new function in _functools.c?
msg355296 - (view) Author: Manjusaka (Manjusaka) * Date: 2019-10-24 08:06
I have already made a new function like this:

static PyObject *
lru_cache_cache_parameters(lru_cache_object *self)
{
    PyObject *cache_parameters =  PyDict_New();
    PyDict_SetItemString(cache_parameters, "maxsize", PyLong_FromSsize_t(self->maxsize));
    PyDict_SetItemString(cache_parameters, "typed", self->typed == 0 ? Py_False : Py_True);
    return cache_parameters;
}
msg355297 - (view) Author: Serhiy Storchaka (serhiy.storchaka) * (Python committer) Date: 2019-10-24 08:08
The rule used in the lru_cache implementation is: do not write in C what can be written in Python.
msg355304 - (view) Author: Manjusaka (Manjusaka) * Date: 2019-10-24 08:46
Yes, you are right; we shouldn't worry about non-standard usage. I will update my PR.

BTW, what is the meaning of the "make_key" parameter?
msg356418 - (view) Author: Raymond Hettinger (rhettinger) * (Python committer) Date: 2019-11-12 07:30
New changeset 051ff526b5dc2c40c4a53d87089740358822edfa by Raymond Hettinger (Manjusaka) in branch 'master':
bpo-38565: add new cache_parameters method for lru_cache (GH-16916)
https://github.com/python/cpython/commit/051ff526b5dc2c40c4a53d87089740358822edfa
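
For reference, the merged change makes the parameters available like this on Python 3.9+:

from functools import lru_cache

@lru_cache(maxsize=200, typed=True)
def f(x):
    return x * 2

print(f.cache_parameters())   # {'maxsize': 200, 'typed': True}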
msg356419 - (view) Author: Raymond Hettinger (rhettinger) * (Python committer) Date: 2019-11-12 07:32
Scott, thank you for the suggestion.
Serhiy, thanks for pointing to the simplest implementation.
Zheaoli, thanks for the patch.
msg356420 - (view) Author: Manjusaka (Manjusaka) * Date: 2019-11-12 07:34
Raymond, thanks for fixing many errors in my patch!
History
Date                 User              Action  Args
2022-04-11 14:59:22  admin             set     github: 82746
2019-11-12 07:34:21  Manjusaka         set     messages: + msg356420
2019-11-12 07:32:24  rhettinger        set     status: open -> closed; resolution: fixed; messages: + msg356419; stage: patch review -> resolved
2019-11-12 07:30:26  rhettinger        set     messages: + msg356418
2019-10-24 08:46:23  Manjusaka         set     messages: + msg355304
2019-10-24 08:10:13  Manjusaka         set     pull_requests: + pull_request16446
2019-10-24 08:08:50  serhiy.storchaka  set     messages: + msg355297
2019-10-24 08:06:28  Manjusaka         set     messages: + msg355296
2019-10-24 07:42:22  Manjusaka         set     messages: + msg355291
2019-10-24 07:33:27  serhiy.storchaka  set     messages: + msg355288
2019-10-24 07:21:04  Manjusaka         set     messages: + msg355286
2019-10-24 07:15:07  rhettinger        set     nosy: + serhiy.storchaka
2019-10-24 07:09:39  rhettinger        set     assignee: rhettinger; messages: + msg355285
2019-10-24 06:56:45  Manjusaka         set     messages: + msg355284
2019-10-24 06:53:52  Manjusaka         set     keywords: + patch; stage: patch review; pull_requests: + pull_request16445
2019-10-24 05:47:26  xtreak            set     nosy: + rhettinger
2019-10-24 05:33:48  Manjusaka         set     nosy: + Manjusaka; messages: + msg355277
2019-10-23 14:50:54  Scott Sanderson2  create