
Author Itayazolay
Recipients Itayazolay, bar.harel, rhettinger
Date 2020-07-07.07:18:54
SpamBayes Score -1.0
Marked as misclassified Yes
Message-id <1594106334.2.0.469632434429.issue41220@roundup.psfhosted.org>
In-reply-to
Content
Yes, you're right.
It's a bad example; I tried to simplify it and ended up oversimplifying it.
Real-life cases are of course more complicated.
What I wanted to accomplish is very similar to the `key` argument in sorted/min/max/etc.
Let me try to give you an example.

Assume we have a complex data type with a unique timestamp attribute.
A single item looks as follows:
SingleItem = (unique_timestamp, <mutable_data_structure>)
Now assume we have an expensive function, "extensive_computation":

from time import sleep
from typing import List

def extensive_computation(items: List[SingleItem]):
    # very hard work
    sleep(60)

As a developer, I know that every item has a unique timestamp.
So for two lists of N items, when the timestamps are the same, the result of the computation will be the same.

def item_cache_key(items: List[SingleItem]):
    # Return a tuple (not a generator) so the key is hashable and compares by value
    return tuple(timestamp for timestamp, data in items)

I would like to then create:

@lru_cache(128, key=item_cache_key)
def cache_extensive_computation(items):
    return extensive_computation(items)

Does that make more sense?
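For what it's worth, the proposed usage can be emulated today with a small wrapper around the existing functools.lru_cache. This is only a sketch to illustrate the intent: `lru_cache_by_key` and the attribute-stashing trick are my own illustration, not part of the proposal, and the sketch is not thread-safe.

```python
from functools import lru_cache, wraps

def lru_cache_by_key(maxsize, key):
    """Hypothetical helper: cache on key(*args) instead of the args themselves."""
    def decorator(func):
        @lru_cache(maxsize)
        def _cached(cache_key):
            # On a cache miss, call func with the real (possibly unhashable) args,
            # which the wrapper stashed just before the lookup.
            return func(*_cached.current_args)

        @wraps(func)
        def wrapper(*args):
            _cached.current_args = args  # not thread-safe; illustration only
            return _cached(key(*args))

        return wrapper
    return decorator

calls = []

@lru_cache_by_key(128, key=lambda items: tuple(t for t, _ in items))
def computation(items):
    calls.append(1)
    return sum(t for t, _ in items)

computation([(1, ["a"]), (2, ["b"])])  # miss: computed
computation([(1, ["x"]), (2, ["y"])])  # same timestamps -> cache hit
print(len(calls))  # func ran only once
```

Note that the items lists themselves are never hashed, only the timestamp tuples, which is exactly what the `key=` argument would let lru_cache do directly.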
History
Date User Action Args
2020-07-07 07:18:54	Itayazolay	set	recipients: + Itayazolay, rhettinger, bar.harel
2020-07-07 07:18:54	Itayazolay	set	messageid: <1594106334.2.0.469632434429.issue41220@roundup.psfhosted.org>
2020-07-07 07:18:54	Itayazolay	link	issue41220 messages
2020-07-07 07:18:54	Itayazolay	create