
Author Robert.Elsner
Recipients Robert.Elsner, mark.dickinson
Date 2012-04-16.12:35:43
SpamBayes Score -1.0
Marked as misclassified Yes
Message-id <1334579744.55.0.286784173297.issue14596@psf.upfronthosting.co.za>
In-reply-to
Content
Well, the problem is that performance is severely degraded when calling unpack multiple times. I do not know the size of the files in advance, and they vary from 1M to 1G. I could use a fixed-size buffer, but that is inefficient depending on the file size (too big or too small), and if I change the buffer size on the fly I end up with the memory leak. I think the caching should take the available memory on the system into account. The no_leak function has comparable performance without the leak. And I think there is no point in caching Struct instances once they have gone out of scope and can no longer be accessed; if I let one slip out of scope, I do not want to use it afterwards. Especially considering that struct.Struct behaves as expected, as do array.fromfile and numpy.fromfile.
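(The no_leak function from earlier in this thread is not reproduced here. As a minimal sketch of the pattern being contrasted with module-level unpack, assuming the file is a flat array of native doubles and using a hypothetical read_doubles helper:

    import struct

    def read_doubles(path):
        # File size is not known in advance, so read it all and size the format to fit.
        with open(path, "rb") as f:
            data = f.read()
        count = len(data) // 8                # number of 8-byte native doubles
        s = struct.Struct("%dd" % count)      # compiled here, freed once it goes out of scope
        return s.unpack(data[:s.size])

Calling struct.unpack("%dd" % count, data) instead goes through the module's internal format-string cache, so in the affected versions each new file size leaves another compiled Struct behind, which is the growth being reported.)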
History
Date                 User           Action  Args
2012-04-16 12:35:44  Robert.Elsner  set     recipients: + Robert.Elsner, mark.dickinson
2012-04-16 12:35:44  Robert.Elsner  set     messageid: <1334579744.55.0.286784173297.issue14596@psf.upfronthosting.co.za>
2012-04-16 12:35:43  Robert.Elsner  link    issue14596 messages
2012-04-16 12:35:43  Robert.Elsner  create