
Author brandtbucher
Recipients Mark.Shannon, brandtbucher, neonene, pablogsal
Date 2022-03-02.23:40:07
SpamBayes Score -1.0
Marked as misclassified Yes
Message-id <1646264407.6.0.696072513545.issue46841@roundup.psfhosted.org>
In-reply-to
Content
> What I *think* is happening is that the inline cache takes the size of the function (in code units) from about 4800 to about 5200, crossing our threshold for quickening (currently set to 5000).

Yep, nailed it:

>>> len(list(dis.get_instructions(do_unpacking)))
4827
>>> len(list(dis.get_instructions(do_unpacking, show_caches=True)))
5251
>>> do_unpacking(1_000, range(10))
0.06478393300494645
>>> do_unpacking.__code__._co_quickened is None
True
History
Date                 User          Action  Args
2022-03-02 23:40:07  brandtbucher  set     recipients: + brandtbucher, Mark.Shannon, pablogsal, neonene
2022-03-02 23:40:07  brandtbucher  set     messageid: <1646264407.6.0.696072513545.issue46841@roundup.psfhosted.org>
2022-03-02 23:40:07  brandtbucher  link    issue46841 messages
2022-03-02 23:40:07  brandtbucher  create