Message396888
Thank you for your message!
I'm not particularly looking for an implementation, I was just surprised by
this behavior.
It can get even worse. Consider this example:
```
d = large_dict()
# remove all but one element of d, runs in quadratic time as explained above
while len(d) > 1:
del d[next(iter(d))]
# now iterating over d still takes O(N), where N is the original size,
# even though d has only one item: the following prints one key in O(N) time
for key in d:
print(key)
```
I think this example is better: a person would expect iterating over a
singleton dictionary to be really fast, but it can be made as slow as
one wants. A "print_keys()" function would reasonably be expected to take
O(current size of the dictionary), but instead it takes time proportional
to the largest size the dictionary ever had.
Am I right to think that this is a performance bug?
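To make the effect concrete, here is a minimal timing sketch (the size N and the `timeit` setup are arbitrary choices for illustration; exact numbers will vary by machine and CPython version):

```
import timeit

N = 10_000

# Build a large dict, then delete all but one key.
# Each next(iter(d)) rescans past the deleted slots, so the
# whole loop is quadratic in N.
d = {i: i for i in range(N)}
while len(d) > 1:
    del d[next(iter(d))]

# A fresh one-element dict for comparison.
fresh = {N - 1: N - 1}

# Iterating the shrunken dict still scans the large backing
# table, so it is much slower than iterating the fresh one.
slow = timeit.timeit(lambda: list(d), number=100)
fast = timeit.timeit(lambda: list(fresh), number=100)
print(f"shrunken: {slow:.6f}s  fresh: {fast:.6f}s")
```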
On Fri, Jul 2, 2021, 8:10 PM Dennis Sweeney <report@bugs.python.org> wrote:
>
> Dennis Sweeney <sweeney.dennis650@gmail.com> added the comment:
>
> For what it's worth, using collections.OrderedDict gives the
> constant-time behavior you're looking for.
>
> ----------
> nosy: +Dennis Sweeney
>
> _______________________________________
> Python tracker <report@bugs.python.org>
> <https://bugs.python.org/issue44555>
> _______________________________________
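For completeness, a sketch of that OrderedDict workaround (N is again an arbitrary size picked for illustration):

```
from collections import OrderedDict

N = 10_000

# Same deletion pattern as above, but with OrderedDict: entries are
# kept in a doubly linked list, so finding and removing the first
# key is O(1), and the whole loop is linear rather than quadratic.
od = OrderedDict((i, i) for i in range(N))
while len(od) > 1:
    del od[next(iter(od))]

# Iterating the singleton only walks the linked list: O(len(od)).
for key in od:
    print(key)
```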
Date                | User             | Action | Args
2021-07-03 04:03:28 | danielfleischman | set    | recipients: + danielfleischman, Dennis Sweeney
2021-07-03 04:03:28 | danielfleischman | link   | issue44555 messages
2021-07-03 04:03:27 | danielfleischman | create |