Message205013
Yes, the old memory argument.
But is it valid? Is there a conceivable application where a dict of weak references stores a large chunk of the application's memory?
Remember, all of the data must be referenced from elsewhere, or else the weak refs would not exist. An extra list of pointers is unlikely to make a difference.
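To make that concrete, here is a minimal sketch (the Resource class and names are purely illustrative, and the immediate collection relies on CPython's reference counting):

    import weakref

    class Resource:
        """Stand-in for an application object (hypothetical)."""
        def __init__(self, name):
            self.name = name

    # The application owns the only strong references to the objects.
    resources = [Resource("a"), Resource("b"), Resource("c")]

    # The weak dict adds no ownership; it just maps names to live objects.
    cache = weakref.WeakValueDictionary((r.name, r) for r in resources)

    # Snapshotting the keys costs one pointer per entry -- negligible next
    # to the objects themselves, which must stay alive anyway for the
    # entries to exist at all.
    keys = list(cache.keys())

    # Drop the only strong reference: the object is collected and its
    # entry vanishes from the weak dict on its own.
    del resources[0]
    print(sorted(cache.keys()))  # ['b', 'c']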
I think the chief reason to use iterators is performance, by avoiding the creation of temporary objects, not saving memory per se.
Before the invention of "iteritems()" and friends, all such iteration went through lists (and hence used extra memory). We should stay nimble enough to undo a previous optimization if the requirements merit it.
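For illustration, the two styles look roughly like this (Python 3 spelling; in 2.x, d.items() built the list implicitly and d.iteritems() gave you the lazy iterator):

    d = {i: i * i for i in range(5)}

    # List-based: materializes every (key, value) pair up front, the way
    # Python 2's d.items() did.  Costs O(n) temporaries, but the snapshot
    # is immune to mutation of d while you loop.
    for k, v in list(d.items()):
        pass

    # Iterator-based: no intermediate list, the way d.iteritems() worked.
    # Saves the temporaries, but the dict must not change size while it
    # is being walked.
    for k, v in d.items():
        pass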
As a completely unrelated example of such nimbleness: faced with stricter regulations in the '70s, American car makers had to sell their muscle cars with increasingly less powerful engines, effectively rolling back previous optimizations :)
Anyway, it's not for me to decide. We currently have three options:
a) my first patch, which duplicates the 3.x work but is non-trivial and could introduce stability issues;
b) my second patch, which will increase memory use, but to no more than previous versions of Python used while iterating;
c) do nothing, and let iteration over weak dicts break randomly whenever an underlying cycle is unraveled during iteration (sketched below).
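To show the failure mode of (c) concretely, here is a sketch using a plain dict (explicit deletion stands in for the cycle collector reaping a value mid-iteration; current 3.x weak dicts already carry the fix being discussed, so they no longer reproduce it directly):

    d = {1: "a", 2: "b", 3: "c"}

    # Option (c): iterate directly.  In a weak dict, the GC unravelling
    # a cycle deletes entries under our feet; deleting explicitly here
    # triggers the same mid-iteration mutation.
    try:
        for k in d:
            del d[k]
    except RuntimeError as exc:
        print(exc)  # dictionary changed size during iteration

    # Option (b): snapshot the keys first.  Costs one temporary list --
    # no more than pre-iteritems() Python used -- but never breaks.
    d = {1: "a", 2: "b", 3: "c"}
    for k in list(d.keys()):
        d.pop(k, None)  # the entry may already be gone in the weak case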
Cheers!