Author Yonatan Goldschmidt
Recipients Yonatan Goldschmidt
Date 2020-10-23.14:51:35
Key-sharing dictionaries, defined by PEP 412, require that any resizing of the shared dictionary keys happens before a second instance of the class is created.

cached_property inserts its resolved result into the instance dict when the property is first accessed. This is likely to happen *after* a second instance has been created, and it is also likely to cause a resize of the instance dict, as demonstrated by this snippet:

    from functools import cached_property
    import sys

    def dict_size(o):
        return sys.getsizeof(o.__dict__)

    class X:
        def __init__(self):
            self.a = 1
            self.b = 2
            self.c = 3
            self.d = 4
            self.e = 5

        @cached_property
        def f(self):
            return id(self)

    x1 = X()
    x2 = X()

    print(dict_size(x1))  # small: x1 and x2 still share their keys
    x1.f  # resolving the cached property inserts "f" into x1.__dict__
    print(dict_size(x1))  # larger: the insertion resized and unshared the dict

    x3 = X()
    print(dict_size(x3))  # new instances still start with the shared keys

Essentially this means that types using cached_property are less likely to enjoy the benefits of shared keys. It may also incur a performance hit, because a resize + unshare will happen on every instance whose property gets resolved.

A simple way I've thought of to let cached_property play more nicely with shared keys is to first create a single object of the class and set the cached_property attribute to some value (so the key is added to the shared keys). In the snippet above, if you add "x0 = X(); x0.f = None" before creating x1 and x2, you'll see that resolving the cached_property does not unshare the dicts.
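A minimal sketch of that priming workaround, reusing the class and helper from the snippet above (the exact getsizeof values depend on the CPython version, so the prints are only illustrative):

```python
from functools import cached_property
import sys

def dict_size(o):
    return sys.getsizeof(o.__dict__)

class X:
    def __init__(self):
        self.a = 1
        self.b = 2
        self.c = 3
        self.d = 4
        self.e = 5

    @cached_property
    def f(self):
        return id(self)

# Prime the shared keys: the very first instance adds "f" while it is
# still safe to grow the class's shared keys.
x0 = X()
x0.f = None  # cached_property is a non-data descriptor, so this simply stores None

x1 = X()
x2 = X()

before = dict_size(x1)
x1.f  # resolving the cached property fills the pre-existing "f" slot
after = dict_size(x1)
# On CPython 3.8, where this was observed, the two sizes match,
# i.e. x1's dict was not unshared by the insertion.
print(before, after)
```

Note that x0.f stays None: because cached_property defines no __set__, the primed instance attribute shadows the descriptor on x0 from then on.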

But I wonder if there's a way to do so without requiring user code changes.