
Author: ag6502
Date: 2006-12-16.17:13:38
Content
I am experimenting with long long timestamps (based on "unsigned long long" where available, or explicitly on two "unsigned long" values otherwise). An alternative to slowing down lookups by using 64 bits on 32-bit machines is to keep a single 32-bit counter and trigger a "cache flush" when the timestamp rolls over, visiting all live dictionaries and code objects.
In my opinion this would be a *very* infrequent operation: for example, on my P4/2.4GHz, doing nothing but dict updates, reaching 2**32 updates would take about 14 minutes.
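A minimal C sketch of the rollover idea, for illustration only: the struct, list, and function names below are hypothetical and not taken from the actual patch. A global 32-bit counter is bumped on every dict update; when it wraps to zero, every cached timestamp is reset so a reused counter value can never match a stale cache entry.

#include <stdint.h>
#include <stddef.h>

typedef struct CachedDict {
    uint32_t timestamp;          /* global counter value at last update */
    struct CachedDict *next;     /* all live dicts kept on one list */
    /* ... actual dict fields ... */
} CachedDict;

static uint32_t global_timestamp = 0;
static CachedDict *all_dicts = NULL;   /* head of the live-dict list */

/* Invalidate every cached timestamp; runs once per 2**32 updates. */
static void flush_all_caches(void)
{
    for (CachedDict *d = all_dicts; d != NULL; d = d->next)
        d->timestamp = 0;
    /* code objects caching lookup results would be reset here as well */
}

/* Called on every dict update: bump the counter, flush on rollover. */
static void touch_dict(CachedDict *d)
{
    if (++global_timestamp == 0)   /* 32-bit counter wrapped around */
        flush_all_caches();
    d->timestamp = global_timestamp;
}

The flush walk is O(number of live dicts and code objects), but since it happens at most once every 2**32 updates its amortized cost per lookup is negligible.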
History
Date                 User   Action  Args
2007-08-23 15:55:45  admin  link    issue1616125 messages
2007-08-23 15:55:45  admin  create