
Author vstinner
Recipients eric.snow, ncoghlan, vstinner
Date 2018-05-22.20:18:53
SpamBayes Score -1.0
Marked as misclassified Yes
Message-id <1527020333.43.0.682650639539.issue33607@psf.upfronthosting.co.za>
In-reply-to
Content
"Either we'd add 2 pointer-size fields to PyObject or we would keep a separate hash table (or two) pointing from each object to the info (...)"

I expect a huge impact on the memory footprint. I dislike the idea.

Currently, the smallest Python object is:

>>> sys.getsizeof(object())
16

It's just two pointers. Adding two additional pointers would simply double the size of the object.
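The arithmetic can be checked directly. This is a quick sketch: the sizes assume a standard CPython build, where the base object header is exactly two pointer-sized fields (ob_refcnt and ob_type):

```python
import ctypes
import sys

# The smallest Python object is just the PyObject header:
# ob_refcnt + ob_type, i.e. two pointer-sized fields.
pointer_size = ctypes.sizeof(ctypes.c_void_p)
smallest = sys.getsizeof(object())

print(smallest == 2 * pointer_size)  # True on CPython
# Adding two more pointer-sized fields doubles the size:
print((smallest + 2 * pointer_size) // smallest)  # 2
```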

"Now the C-API offers a way to switch the allocator, so there's no guarantee that the right allocator is used in PyMem_Free()."

I would expect that either all interpreters use the same memory allocator, or that each interpreter uses its own. With one allocator per interpreter, calling PyMem_Free() from the wrong interpreter would simply crash, just as you get a crash when you call free() on memory allocated by PyMem_Malloc(). PYTHONMALLOC=debug could be extended to detect such bugs: this built-in debugger is already able to catch misuses of allocators. Adding extra pointers to the debugger is acceptable, since it doesn't change the memory footprint of the default mode.
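To illustrate, PYTHONMALLOC=debug is a real CPython environment variable that installs the allocator debug hooks (detecting buffer over/underwrites and some API misuse). A minimal sketch of enabling it for a child interpreter; the child command here is just a placeholder:

```python
import os
import subprocess
import sys

# PYTHONMALLOC=debug installs CPython's built-in allocator debug hooks.
# It must be set before the interpreter starts, hence the subprocess.
env = dict(os.environ, PYTHONMALLOC="debug")
proc = subprocess.run(
    [sys.executable, "-c", "print('ok')"],
    env=env,
    capture_output=True,
    text=True,
)
print(proc.returncode, proc.stdout.strip())
```

An allocator misuse in the child would abort it with a fatal error report instead of silently corrupting memory.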