functools.cached_property does not maintain the wrapped method's __isabstractmethod__ #79176
Comments
What is the use case of this? |
This allows a developer to add @cached_property to a method that also carries the @abstractmethod decorator, without breaking the check for abstract methods on ABC instantiation. That is, if you tried to instantiate an ABC with a method decorated with both @cached_property and @abstractmethod today, it would succeed instead of raising a TypeError as you might expect. As for why you'd put @cached_property on an abstract method: it's useful for IDEs and type checkers to know that the method is implemented as a property, and that users of the method (and its implementations) can reasonably call it multiple times knowing that caching is occurring. |
A comment can serve the same purpose. The caching affects the behavior at run time, and abstract methods should not be executed. The cached_property decorator is not inherited by overriding properties. I don't think that combining the cached_property and @abstractmethod decorators should be supported. |
I agree, a comment can serve the same purpose. But for the same reasons it's useful to express type hints in Python with real syntax, and for the reasons it's useful to have the abc module in the first place, I think it is useful to be able to annotate an abstract method with @cached_property. The purpose of this change is not to add code developers would try to use at runtime; it is to allow developers to communicate their intentions to static code analysis tools and to other developers, using standard library decorators that do not break one another. Indeed, abstract methods should not be executed (on purpose), but Python still supports abstract methods to protect developers who want to explicitly label their code as "do not execute". Therefore, please do not think of this change as something someone would try to execute. It is for hinting functionality, and it protects developers who would decorate a method with @cached_property and @abstractmethod and find that abc's functionality was unexpectedly broken. |
To add some more context, the existing Python documentation is a little misleading in this situation: https://docs.python.org/3/library/abc.html#abc.abstractmethod It shows that a method ought to be decorated with @property above @abstractmethod, in that order, but this ordering currently breaks the abstract method check if you swap out @property for @cached_property. The documentation also says "In order to correctly interoperate with the abstract base class machinery, the descriptor must identify itself as abstract using __isabstractmethod__.", which @cached_property does not do, but, as the documentation then shows, @property does do properly. |
Well, you can combine @abstractmethod with @property in an abstract class. This will give you type hints etc. @cached_property is a detail of the concrete implementation. |
I agree with Serhiy. For static hinting:

```python
from abc import ABCMeta, abstractmethod
from functools import cached_property

class MyABC(metaclass=ABCMeta):
    @property
    @abstractmethod
    def myprop(self):
        ...

class MyConcrete(MyABC):
    @cached_property
    def myprop(self):
        return self._some_heavy_work()
```
|
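The recommended split above can be run end to end. This is a minimal sketch: `_some_heavy_work` from the snippet is hypothetical, so a stand-in computation is used here instead.

```python
import abc
from functools import cached_property

class MyABC(abc.ABC):
    # The ABC declares the interface: an abstract read-only property.
    @property
    @abc.abstractmethod
    def myprop(self):
        ...

class MyConcrete(MyABC):
    # The concrete class is free to pick cached_property as an
    # implementation detail; the first access computes the value and
    # stores it in the instance __dict__.
    @cached_property
    def myprop(self):
        # stand-in for an expensive computation
        return sum(range(1000))

obj = MyConcrete()
print(obj.myprop)  # 499500, computed once and cached on the instance
```

Note that MyABC itself still cannot be instantiated, because @property correctly forwards the getter's __isabstractmethod__. |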
Example 1:

```python
import abc
import functools

class AbstractExpensiveCalculator(abc.ABC):
    @abc.abstractmethod
    @functools.cached_property
    def calculate(self):
        pass

AbstractExpensiveCalculator()
```

Example 2:

```python
import abc
import functools

class AbstractExpensiveCalculator(abc.ABC):
    @functools.cached_property
    @abc.abstractmethod
    def calculate(self):
        pass

AbstractExpensiveCalculator()
```

Current behavior: Example 1 raises an exception as expected, but Example 2 instantiates the object: no exception is raised. PR 9904 looks like a reasonable change to me.
Well, maybe we can hack something to make Example 2 fail as well, but I like the idea of using @functools.cached_property in an abstract class as "documentation": to announce that the property will be cached, even if technically it will not be cached. It's more about using the code as documentation than executing any code. PR 9904 is just 4 lines of code to make the code work as expected. |
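For illustration, here is a sketch of the kind of small change being discussed: forwarding the wrapped function's __isabstractmethod__. It is done here in a hypothetical subclass rather than in functools itself, and the actual contents of PR 9904 may differ.

```python
import abc
import functools

class abstract_friendly_cached_property(functools.cached_property):
    # Hypothetical sketch: expose the wrapped function's
    # __isabstractmethod__ so that ABCMeta still registers the
    # decorated method as abstract.
    @property
    def __isabstractmethod__(self):
        return getattr(self.func, "__isabstractmethod__", False)

class AbstractExpensiveCalculator(abc.ABC):
    @abstract_friendly_cached_property
    @abc.abstractmethod
    def calculate(self):
        pass

try:
    AbstractExpensiveCalculator()
except TypeError as exc:
    print(exc)  # instantiation now fails, matching Example 1
```
|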
Context: @functools.cached_property has been added by bpo-21145. |
FWIW, it seems to me (author of |
When thinking about "code as documentation", we should think not only of the Python interpreter, but also of IDEs and static analysis tools. If we support abstractmethod + cached_property, it means we encourage all IDEs & tools to support it. I don't think a "this property is likely (but not necessarily) cached" hint is worth enough to add such complexity to all IDEs / static analysis tools. |
I want to hear mypy developer's comment. |
This is runtime behavior, mypy doesn't care either way. It triggers on the presence of the decorator, not on what attributes it sets at runtime on the object. |
But its only motivation is static hinting. |
In mypy there would be no difference between a cached property and a normal one -- the type is the same either way. Caching is the quintessential runtime behavior -- it shouldn't matter for semantics, only for performance. (True, sometimes there are more subtle semantics that require caching past the first call, but this is beyond the realm of a type checker, which cannot typically tell whether a call is the first or not.) |
INADA-san: do you prefer to raise an exception? If yes, can we ensure that |
I worried that people would think we recommend it if we support it, while this is just a request from one person, not a recommendation after wide discussion. But it affects static tools less than I suspected. Tools like autocompletion should support cached_property in normal classes anyway. And this pull request doesn't promote cached_property support in the documentation. Generally speaking, I don't like this type of change. |
I am with Inada-san actually. I would go as far as saying that

```python
@cached_property
@abstractmethod
def something(): ...
```

should unconditionally raise on definition, mostly because this is just misleading: the declaration doesn't guarantee that the implementation will use caching (neither Python nor mypy can enforce this). Caching is an _implementation_ detail, while ABCs are used to specify an _interface_. |
Inada-san, I think it's fair to ask for a broader vision about how ABCs are used. In that respect, I'm wondering about some inconsistencies in the existing functools module about whether wrappers should maintain the wrapped function's __isabstractmethod__ attribute. Both singledispatchmethod and partialmethod implement their own way of copying __isabstractmethod__ from the wrapped function. singledispatchmethod does this in addition to calling update_wrapper, whereas partialmethod doesn't seem to use update_wrapper at all.

In terms of a broader vision, then, should the decorators in functools be more consistent about whether the __isabstractmethod__ attribute of a wrapped function is preserved in the wrapper? Should update_wrapper be the way to standardize this? If so, it would make sense either to remove the __isabstractmethod__ copying from those decorators, or to add __isabstractmethod__ to the list of attributes that update_wrapper copies from the wrapped function.

Hopefully viewing the problem in this way avoids the pitfalls of adding random code to the codebase, as you are pointing out. Let me know if this discussion belongs in a broader issue than this one. Thank you :) |
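The partialmethod behavior described above can be observed directly. A minimal sketch (the class name `Base` is made up for illustration):

```python
import abc
from functools import partialmethod

class Base(abc.ABC):
    @abc.abstractmethod
    def set_foo(self, v):
        pass

    # partialmethod forwards __isabstractmethod__ from the wrapped
    # function, so reset_foo is also registered as abstract.
    reset_foo = partialmethod(set_foo, None)

print(sorted(Base.__abstractmethods__))            # ['reset_foo', 'set_foo']
print(Base.__dict__['reset_foo'].__isabstractmethod__)  # True
```

By contrast, functools.cached_property sets no such attribute, which is exactly the inconsistency in question. |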
My personal opinion is: support abstractmethod only when the descriptor is useful for defining an interface with an ABC.

In the case of cached_property, it is not: at the interface level it is just a property, and caching is just an implementation detail.

In the case of update_wrapper, it is too generic. A decorated function may be used while defining an interface, but that is a rare case, so there is no need to support it.

In the case of singledispatch, I think it is not used in ABCs, like cached_property. But it has already shipped in Python, and we shouldn't remove it easily.

In the case of partialmethod... it is worth considering. I don't use ABCs much, and I never use partialmethod, so this example is very artificial:

```python
import abc
from functools import partialmethod

class MyABC(abc.ABC):
    @abc.abstractmethod
    def set_foo(self, v):
        pass

    reset_foo = partialmethod(set_foo, None)
```

When they subclass MyABC, they need to override both of them. On the other hand, this example can be written as:

```python
class MyABC(abc.ABC):
    @abc.abstractmethod
    def set_foo(self, v):
        pass

    @abc.abstractmethod
    def reset_foo(self):
        pass
```

Or it can use late binding too:

```python
class MyABC(abc.ABC):
    @abc.abstractmethod
    def set_foo(self, v):
        pass

    # No need to override if the default implementation is OK
    def reset_foo(self):
        self.set_foo(None)
```

I am not sure the __isabstractmethod__ support in partialmethod is really useful. |
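A quick check of the "need to override both" point in the partialmethod example. This sketch assumes current CPython behavior, where the partialmethod keeps wrapping the abstract set_foo it captured at class-definition time:

```python
import abc
from functools import partialmethod

class MyABC(abc.ABC):
    @abc.abstractmethod
    def set_foo(self, v):
        pass

    reset_foo = partialmethod(set_foo, None)

class Concrete(MyABC):
    # Overriding only set_foo is not enough: reset_foo still wraps the
    # original abstract set_foo, so it is still reported as abstract.
    def set_foo(self, v):
        self._foo = v

try:
    Concrete()
except TypeError as exc:
    print(exc)  # Concrete is still abstract because of reset_foo
```
|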