
Author sbromberger
Recipients josh.r, lemburg, ncoghlan, pmoody, r.david.murray, rhettinger, sbromberger, serhiy.storchaka
Date 2014-12-24.01:59:28
>I'm just pointing out that if he thinks 56 bytes per object is too large, he's going to be disappointed with what Python has to offer. General purpose user-defined Python objects don't optimize for low memory usage, and even __slots__ only gets you so far.

"He" thinks that 1300% overhead is a bit too much for a simple object that can be represented by a 32-bit number, and "he" has been using Python for several years and understands, generally, what the language has to offer (though not nearly so well as some of the respondents here). The overhead may be lost in the roundoff at n < 1e5, but when you start using these objects for real-world analysis, it becomes problematic. I note with some amusement that the overhead is several orders of magnitude greater than that of the protocols these objects represent :)
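The overhead ratio is easy to check directly. This is a minimal sketch (the exact byte count varies by CPython version and platform, so the figures below are illustrative, not guaranteed):

```python
import sys
from ipaddress import IPv4Address

# An IPv4 address is logically a 32-bit number: 4 bytes of payload.
addr = IPv4Address("192.0.2.1")
obj_size = sys.getsizeof(addr)  # ~56 bytes on a typical 64-bit CPython

# Overhead relative to the 4-byte payload (~1300% at 56 bytes).
overhead_pct = (obj_size - 4) / 4 * 100
print(obj_size, overhead_pct)
```

Note that `sys.getsizeof` reports only the object itself, not anything it references, so this is if anything an undercount.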

In any case, I have no issue with the decision not to make these objects memory-efficient in the stdlib, nor, consequently, with closing this out. Rolling my own package appears to be the best solution.
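For what it's worth, one way to roll this yourself is to store the addresses as packed 32-bit integers and only materialize `IPv4Address` objects on demand. A sketch using the stdlib `array` module (the address range here is just example data):

```python
from array import array
from ipaddress import IPv4Address

# Pack each address into an unsigned 32-bit slot: ~4 bytes apiece,
# versus ~56 bytes for a full IPv4Address object.
addrs = array("I", (int(IPv4Address("10.0.0.%d" % i)) for i in range(256)))

# Reconstruct a rich object only when one is actually needed.
addr = IPv4Address(addrs[42])
print(addr, addrs.itemsize)
```

This trades object-creation cost on access for a roughly order-of-magnitude reduction in resident memory for large collections.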

Thanks for the discussion.
Linked issue: issue23103