
Author Matt.Mackall
Recipients Matt.Mackall, benjamin.peterson, josh.r, larry, marmoute, mpm, rhettinger
Date 2015-04-17.17:17:02
We already presize the output dict, and have for ages. The question here is what size to use. In the current state we quite often use twice as much memory and CPU as necessary, because the dict spills past its fill threshold and grows anyway, even though we apparently sized the object to fit intentionally.
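The spill-and-grow behavior is observable from Python with `sys.getsizeof`: as entries are inserted, the allocated size jumps at the points where the table exceeds its fill ratio and is resized. This is a rough sketch, not part of the original report, and the exact byte values and resize thresholds vary across CPython versions; only the pattern of jumps is the point.

```python
import sys

def dict_size(n):
    # Bytes allocated for a dict built by inserting n entries, as reported
    # by sys.getsizeof (CPython-specific; exact numbers vary by version).
    d = {}
    for i in range(n):
        d[i] = None
    return sys.getsizeof(d)

# The size stays flat for a while, then jumps when an insertion pushes
# the table past its fill ratio and forces a grow-and-rehash.
for n in range(0, 50):
    print(n, dict_size(n))
```

Running this shows long flat runs punctuated by sudden size jumps, which is the "spilling and growing" cost described above: a dict sized exactly to its final entry count can still trigger one of those resizes.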

In the sparse case, if we allocate a 1GB dictionary but use only two of its entries, we will have consumed 1GB of address space but only about 1kB of actual (resident) memory.
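The address-space-versus-resident-memory distinction can be demonstrated with an anonymous `mmap`, which is how large allocations behave on a demand-paging OS. This is an illustration I am adding, not something from the original message, and it assumes an OS that allocates physical pages lazily (typical on Linux with overcommit; Windows commits more eagerly).

```python
import mmap

# Reserve 1 GB of anonymous memory. With demand paging, this consumes
# 1 GB of address space up front, but physical pages are only allocated
# when they are first touched.
buf = mmap.mmap(-1, 1 << 30)

# Touch just two small "entries": only the pages containing these bytes
# (a few kB) become resident; the rest of the 1 GB stays untouched.
buf[0:5] = b"hello"
buf[4096:4101] = b"world"

print(buf[0:5], buf[4096:4101])
buf.close()
```

This mirrors the sparse-dict scenario: a huge reservation is cheap as long as only a few slots are ever written.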
History
2015-04-17 17:17:02  Matt.Mackall  set recipients: + Matt.Mackall, rhettinger, larry, benjamin.peterson, mpm, josh.r, marmoute
2015-04-17 17:17:02  Matt.Mackall  set messageid: <>
2015-04-17 17:17:02  Matt.Mackall  link issue23971 messages
2015-04-17 17:17:02  Matt.Mackall  create