
Author: olliemath
Date: 2021-10-08 23:52:32
Message-id: <1633737153.71.0.819400116811.issue45417@roundup.psfhosted.org>
Content
Creating large enums takes a significant amount of time. Moreover, the cost appears to be nonlinear in the number of entries in the enum. Locally, importing a single Python file containing one enum and taking this to the extreme:

 1000 entries - 0.058s
10000 entries - 4.327s
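
For reference, here is a minimal sketch of the kind of measurement I mean, using the functional API instead of a generated file; I'm assuming the functional API exercises the same metaclass machinery as a class-statement enum, and the names `time_enum_creation` / `MEMBER_i` are just illustrative:

```
import time
from enum import Enum

def time_enum_creation(n):
    # Build (name, value) pairs up front and time only the Enum
    # creation itself, so file parsing is excluded.
    names = [(f"MEMBER_{i}", i) for i in range(n)]
    start = time.perf_counter()
    Enum(f"Big{n}", names)
    return time.perf_counter() - start

for n in (1000, 10000):
    print(f"{n:>5} entries - {time_enum_creation(n):.3f}s")
```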

This is partially addressed by https://bugs.python.org/issue38659, and I can confirm that using `@_simple_enum` does not have this problem. But that API appears to be intended only for internal use, so the 'happy path' for user-defined enums is still slow.
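
For anyone wanting to compare, the private helper is used inside CPython roughly like this (a sketch only; `_simple_enum` is undocumented and its signature may change between versions):

```
from enum import IntEnum, _simple_enum

# _simple_enum builds the members on the decorated class directly,
# skipping the normal Enum metaclass machinery and its per-member checks.
@_simple_enum(IntEnum)
class Color:
    RED = 1
    GREEN = 2
    BLUE = 3
```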

Note that it is not simply parsing the file / creating the instances; it is to do with the cardinality. Creating 100 enums with 100 entries each is far faster than creating a single 10000-entry enum.
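
To illustrate the cardinality point in the same style as the sketch above (`make_enum` is again just an illustrative name):

```
import time
from enum import Enum

def make_enum(n, tag=""):
    Enum(f"E{tag}", [(f"M_{i}", i) for i in range(n)])

start = time.perf_counter()
for k in range(100):
    make_enum(100, k)
print(f"100 enums x 100 entries: {time.perf_counter() - start:.3f}s")

start = time.perf_counter()
make_enum(10000)
print(f"1 enum x 10000 entries:  {time.perf_counter() - start:.3f}s")
```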