
Author mmokrejs
Recipients amaury.forgeotdarc, mmokrejs, tim.peters, vstinner
Date 2013-08-26.21:48:59
SpamBayes Score -1.0
Marked as misclassified Yes
Message-id <1377553739.74.0.999262716969.issue18843@psf.upfronthosting.co.za>
In-reply-to
Content
Yes, I have rebuilt all Python modules, but even gdb exited on startup due to the Python ABI change. I am using Gentoo Linux (https://bugs.gentoo.org/show_bug.cgi?id=482348), and unless python-updater forgot to include some package in the list of those that need to be recompiled, I should be fine. And because gdb could not even start up, I *hope* that anything not yet recompiled would NOT work AT ALL.

Thanks for the clarification; I thought I might be hitting some internal limit (the serial number, as you say) in Python and running out of some internal counter on objects. Actually, I hit these issues because I wondered why some of my application tests fail. Although all the tests crunch a lot of data, they ultimately do the same thing: draw charts using matplotlib, which uses numpy. I have huge lists which I recently converted to generators (where possible), and now I even use imap(), izip() and ifilter() from itertools. One of the crashed tests has 153 frames in the gdb stack trace, and a few frames from the very top already involve the izip() objects. But the tests that crashed are not as huge as the others, maybe just 1/10 of their size, so I wonder why these failed.
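
To make concrete what kind of conversion I mean, here is a minimal Python 2 sketch (the names and the pipeline are made up for illustration, not taken from my real code): the old version materializes full intermediate lists, the new one streams the same data lazily through itertools.

    from itertools import imap, izip, ifilter

    def chart_points_list(xs, ys):
        """Old style: builds full intermediate lists in memory."""
        pairs = zip(xs, ys)                              # whole list materialized
        good = [p for p in pairs if p[1] is not None]    # another full list
        return [(x, y * 2.0) for (x, y) in good]         # and a third one

    def chart_points_iter(xs, ys):
        """New style: the same pipeline, but lazy, one item at a time."""
        pairs = izip(xs, ys)                             # no intermediate list
        good = ifilter(lambda p: p[1] is not None, pairs)
        return imap(lambda p: (p[0], p[1] * 2.0), good)

The lazy version never builds the intermediate lists, it only holds one item at a time, which is what brought the memory use of my bigger runs down.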

I think some crashes are related to me explicitly deleting a huge list in my code even before leaving a function. Or maybe to returning such lists between child and parent functions?

Could valgrind or something else help to find out who is overwriting other objects' data? I don't have experience with using it, though.
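
From what I can tell from Misc/README.valgrind in the CPython source tree, something like the sketch below should be the way to run it; I have not actually tried it, and the script name is just a placeholder.

    # Untried sketch based on Misc/README.valgrind: run the failing test under
    # memcheck, using CPython's suppressions file to hide the known pymalloc
    # false positives. "failing_test.py" is a placeholder for my real test.
    import subprocess

    subprocess.call([
        "valgrind",
        "--tool=memcheck",
        "--suppressions=Misc/valgrind-python.supp",  # shipped with CPython
        "--log-file=valgrind.log",
        "python", "failing_test.py",
    ])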

I think this _figure.clear() crash could be a manifestation of Python deleting a wrong object/pointer. Some ugly for loops over lists took ... I don't know how much, but in total as much as 26 GB of RAM was reserved by the process (most of it also resident in memory). With itertools I got that down about 10x.
History
Date                 User      Action  Args
2013-08-26 21:48:59  mmokrejs  set     recipients: + mmokrejs, tim.peters, amaury.forgeotdarc, vstinner
2013-08-26 21:48:59  mmokrejs  set     messageid: <1377553739.74.0.999262716969.issue18843@psf.upfronthosting.co.za>
2013-08-26 21:48:59  mmokrejs  link    issue18843 messages
2013-08-26 21:48:59  mmokrejs  create