
Author: pitrou
Recipients: Aaron.Meurer, BreamoreBoy, amaury.forgeotdarc, anacrolix, asvetlov, brechtm, eric.snow, ezio.melotti, giampaolo.rodola, jcea, josh.r, kachayev, meador.inge, pitrou, poke, rhettinger, scoder, serhiy.storchaka, skrah
Date: 2014-08-07.01:51:22
Message-id: <53E2DB95.5090708@free.fr>
In-reply-to: <1407374341.78.0.713035340894.issue14373@psf.upfronthosting.co.za>
Content
On 06/08/2014 21:19, Raymond Hettinger wrote:
>
> I don't think so at all. The LRU cache we have now is plenty
> efficient for its intended use cases (caching I/O bound functions
> and expensive functions). It is only unsuitable for functions that
> are already blazingly fast.

This is an unrealistic simplification. Many functions can be either
expensive or blazingly fast, depending on their input (typical examples
are re.compile(), math.factorial()...). But the decision to apply the
lru_cache decorator is a compile-time binary decision: it cannot encode
properties of the function that vary with its inputs.
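To make the point concrete, here is a small sketch (the function name and
the argument sizes are purely illustrative): the same decorated function is
trivially cheap for some inputs and expensive for others, yet the decorator
is applied unconditionally, for every input.

    import functools
    import math

    # Illustrative only: factorial is nearly free for small n but costly
    # for very large n. The decorator is applied once, for all inputs; it
    # cannot restrict caching to the calls that are actually expensive.
    @functools.lru_cache(maxsize=None)
    def cached_factorial(n):
        return math.factorial(n)

    cached_factorial(10)      # cache machinery overhead rivals the computation
    cached_factorial(50000)   # here caching a repeated call genuinely pays off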
History
Date                 User    Action  Args
2014-08-07 01:51:22  pitrou  set     recipients: + pitrou, rhettinger, jcea, amaury.forgeotdarc, scoder, giampaolo.rodola, ezio.melotti, asvetlov, poke, skrah, meador.inge, anacrolix, Aaron.Meurer, BreamoreBoy, eric.snow, serhiy.storchaka, brechtm, kachayev, josh.r
2014-08-07 01:51:22  pitrou  link    issue14373 messages
2014-08-07 01:51:22  pitrou  create