
Author tim.peters
Recipients ian_osh, tim.peters
Date 2020-07-24.21:13:11
SpamBayes Score -1.0
Marked as misclassified Yes
Message-id <1595625191.6.0.662546316914.issue41389@roundup.psfhosted.org>
In-reply-to
Content
What makes you think that? Your own output shows that the number of "Active" objects does NOT monotonically increase across output lines. It goes up sometimes and down sometimes. Whether it goes up or down is entirely an accident of when your monitoring thread happens to wake up relative to the program's gc activity.
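
For reference, here is a minimal sketch of the kind of test program under discussion. The original script isn't reproduced in this message, so the class bodies, the "Active" label, and the monitoring interval below are assumptions; only the names A, B, COUNT, and TOTAL come from the interpreter session shown later.

import gc
import threading
import time

class A:
    COUNT = 0          # A instances currently alive
    TOTAL = 0          # A instances ever created
    def __init__(self):
        A.COUNT += 1
        A.TOTAL += 1
        self.b = B(self)       # the B points back at this A, forming a cycle
    def __del__(self):
        A.COUNT -= 1

class B:
    COUNT = 0
    TOTAL = 0
    def __init__(self, a):
        B.COUNT += 1
        B.TOTAL += 1
        self.a = a
    def __del__(self):
        B.COUNT -= 1

def monitor(stop):
    # What this thread observes depends entirely on where the cyclic
    # collector happens to be in its work when the thread wakes up.
    while not stop.is_set():
        print("Active:", A.COUNT)
        time.sleep(0.5)

if __name__ == "__main__":
    stop = threading.Event()
    threading.Thread(target=monitor, args=(stop,), daemon=True).start()
    for _ in range(10_000_000):
        a = A()        # each pair is cyclic garbage; `a` stays bound after the loop
    stop.set()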

I boosted the loop count to 10 million on my box just now. It had no significant effect on peak memory use. At the end:

>>> A.COUNT, A.TOTAL, B.COUNT, B.TOTAL
(298, 10000000, 298, 10000000)
>>> gc.collect()
1188
>>> A.COUNT, A.TOTAL, B.COUNT, B.TOTAL
(1, 10000000, 1, 10000000)

There is no leak. One A and one B object survive collect() because the last A created in the loop remains bound to the loop variable `a`, so it (and the B it references) is still reachable.
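
To make that concrete (this follow-up continues the hypothetical sketch above and is not part of the original session): once that last binding is dropped, a collection reclaims the final pair as well.

del a                      # drop the only remaining reference to the last A
gc.collect()               # the final A/B cycle is now unreachable and is reclaimed
print(A.COUNT, B.COUNT)    # expected: 0 0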

So I thank you for creating a nice test program, but I'm closing this since it doesn't demonstrate a real problem.
History
Date                 User        Action  Args
2020-07-24 21:13:11  tim.peters  set     recipients: + tim.peters, ian_osh
2020-07-24 21:13:11  tim.peters  set     messageid: <1595625191.6.0.662546316914.issue41389@roundup.psfhosted.org>
2020-07-24 21:13:11  tim.peters  link    issue41389 messages
2020-07-24 21:13:11  tim.peters  create