Message412073
I was working on improving coverage and test quality of `typing.py` when I noticed that `@no_type_check` behaves quite strangely.
Let's dive into this!
## Everything is correct
We will start with a basic example that illustrates that everything works fine:
```
from typing import no_type_check, get_type_hints

class A:
    x: int = 1

class B:
    y: str = 'a'

print(get_type_hints(A))  # ok: {'x': <class 'int'>}
print(get_type_hints(B))  # ok: {'y': <class 'str'>}
```
Ok, let's move on.
## Adding `@no_type_check`
Now, adding `@no_type_check` to `B` makes `get_type_hints()` return an empty `dict` for it:
```
from typing import no_type_check, get_type_hints

class A:
    x: int = 1

@no_type_check
class B:
    y: str = 'a'

print(get_type_hints(A))  # ok: {'x': <class 'int'>}
print(get_type_hints(B))  # ok: {}
```
This is still ok.
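For context, the mechanism here is a simple flag: `no_type_check` sets `__no_type_check__ = True` on the decorated object, and `get_type_hints()` returns an empty `dict` for anything carrying a truthy `__no_type_check__`. A minimal check:
```
from typing import no_type_check, get_type_hints

@no_type_check
class B:
    y: str = 'a'

# The decorator just marks the class; get_type_hints() honours the mark.
print(B.__no_type_check__)  # True
print(get_type_hints(B))    # {}
```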
## Broken?
And now we can add a class-level constant to `B`, like `delegate`, to show how it breaks `A`:
```
from typing import no_type_check, get_type_hints

class A:
    x: int = 1

@no_type_check
class B:
    y: str = 'a'
    delegate = A  # adding this line will make `A` have `__no_type_check__` as well

print(get_type_hints(A))  # {}, wait, what?
print(get_type_hints(B))  # {}, ok
```
Why is that important?
It introduces an unfortunate side effect: by adding a class-level assignment to a class that has `@no_type_check` (or another `no_type_check_decorator`), you can make a totally unrelated (!) class completely ignore `get_type_hints()`, which breaks things like `pydantic`, `beartype`, `attrs`, etc. that rely on type hints.
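You can verify that `A` itself gets mutated, not just reported differently: after `B` above is defined, the undecorated `A` suddenly carries the flag too. A quick check (same classes as in the previous snippet):
```
from typing import no_type_check, get_type_hints

class A:
    x: int = 1

@no_type_check
class B:
    y: str = 'a'
    delegate = A

# `A` was never decorated, yet decorating `B` flagged it as well:
print(A.__no_type_check__)  # True
print(get_type_hints(A))    # {}
```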
Why does this happen?
It happens because `no_type_check` has this logic:
```
if isinstance(arg, type):
    arg_attrs = arg.__dict__.copy()
    for attr, val in arg.__dict__.items():
        if val in arg.__bases__ + (arg,):
            arg_attrs.pop(attr)
    for obj in arg_attrs.values():
        if isinstance(obj, types.FunctionType):
            obj.__no_type_check__ = True
        if isinstance(obj, type):
            no_type_check(obj)
```
Source: https://github.com/python/cpython/blob/8b1b27f1939cc4060531d198fdb09242f247ca7c/Lib/typing.py#L1952-L1975
As you can see above, we traverse all `__dict__` values of the given class and recurse into every value that happens to be a type, whether or not it was defined there.
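To make that concrete, here is a small demo that replays the same traversal by hand on the `B` from the broken example and shows which `__dict__` entries end up being recursed into:
```
import types
from typing import no_type_check

class A:
    x: int = 1

@no_type_check
class B:
    y: str = 'a'
    delegate = A

# Replay the traversal from `no_type_check`: every value in B.__dict__ that
# is a class (and is not B itself or one of its bases) gets recursed into --
# including the completely unrelated `A` stored under `delegate`.
for attr, val in B.__dict__.items():
    if val in B.__bases__ + (B,):
        continue
    if isinstance(val, types.FunctionType):
        print(f'would flag function {attr!r}')
    if isinstance(val, type):
        print(f'would recurse into class {attr!r} -> {val!r}')
# Prints: would recurse into class 'delegate' -> <class '__main__.A'>
```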
I think that the original goal was to handle cases like:
```
@no_type_check
class Outer:
    class Inner: ...
```
And now it also affects regular assignments.
So, what can we do?
0. Nothing, it works correctly (I disagree)
1. Do not cover nested classes at all with `@no_type_check`, only cover methods
2. Only cover types that are **defined** in this class, like my `Inner` class example
3. Something else?
I think that `(2)` is more in line with the current implementation, so my vote is for it.
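To make `(2)` concrete, here is a rough sketch of how it could work. This is only an illustration under my own assumption that "defined in this class" can be detected via `__qualname__`/`__module__`; it is not a final implementation:
```
import types

def _no_type_check_sketch(arg):
    # Sketch of option (2): only recurse into objects that are *defined*
    # inside `arg`, skipping values that were merely assigned there
    # (e.g. `delegate = A`). The qualname/module check is an assumption
    # about how "defined here" could be detected.
    if isinstance(arg, type):
        for obj in vars(arg).values():
            defined_here = (
                getattr(obj, '__module__', None) == arg.__module__
                and getattr(obj, '__qualname__', '').startswith(arg.__qualname__ + '.')
            )
            if not defined_here:
                continue
            if isinstance(obj, types.FunctionType):
                obj.__no_type_check__ = True
            if isinstance(obj, type):
                _no_type_check_sketch(obj)
    arg.__no_type_check__ = True
    return arg
```
With a check like this, the `Inner` example above is still covered (its qualname is `Outer.Inner`), while `delegate = A` is skipped.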
I would like to implement this once we reach a final agreement :)