New submission from Nikita Sobolev <m...@sobolevn.me>:
I was working on improving coverage and test quality of `typing.py` when I noticed that `@no_type_check` behaves strangely. Let's dive into this!

## Everything is correct

We will start with a basic example that illustrates that everything works fine:

```
from typing import no_type_check, get_type_hints

class A:
    x: int = 1

class B:
    y: str = 'a'

print(get_type_hints(A))  # ok: {'x': <class 'int'>}
print(get_type_hints(B))  # ok: {'y': <class 'str'>}
```

Ok, let's move on.

## Adding `@no_type_check`

Now, adding `@no_type_check` to `B` makes `get_type_hints()` return an empty `dict` for it:

```
from typing import no_type_check, get_type_hints

class A:
    x: int = 1

@no_type_check
class B:
    y: str = 'a'

print(get_type_hints(A))  # ok: {'x': <class 'int'>}
print(get_type_hints(B))  # ok: {}
```

This is still ok.

## Broken?

And now we can add some class-level constant to `B`, like `delegate`, to show how it breaks `A`:

```
from typing import no_type_check, get_type_hints

class A:
    x: int = 1

@no_type_check
class B:
    y: str = 'a'
    delegate = A  # adding this line makes `A` get `__no_type_check__` as well

print(get_type_hints(A))  # {}, wait, what?
print(get_type_hints(B))  # {}, ok
```

Why is this important? Adding a plain class-level assignment to a class decorated with `@no_type_check` (or another `no_type_check_decorator`) introduces an unfortunate side effect: some totally unrelated (!) class silently starts to be ignored by `get_type_hints()`, which breaks things like `pydantic`, `beartype`, `attrs`, etc. that rely on type hints.

Why does this happen?
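Before looking at the implementation, we can observe the side effect directly by inspecting the `__no_type_check__` marker attribute that the decorator sets (a minimal sketch of the same reproduction; the comments about `A` describe behavior on the affected versions):

```python
from typing import no_type_check, get_type_hints

class A:
    x: int = 1

@no_type_check
class B:
    y: str = 'a'
    delegate = A  # plain class-level reference to an unrelated class

# B itself is marked, as expected:
print(B.__no_type_check__)                    # True
print(get_type_hints(B))                      # {}

# On the affected versions, A is mutated too, which is the bug:
print(getattr(A, '__no_type_check__', False)) # True there, despite A being unrelated
print(get_type_hints(A))                      # {} there, A's annotations are ignored
```

The `{}` result follows from the early exit in `get_type_hints()`, which returns an empty dict for any object carrying a truthy `__no_type_check__`.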
It happens because `no_type_check` has this logic:

```
if isinstance(arg, type):
    arg_attrs = arg.__dict__.copy()
    for attr, val in arg.__dict__.items():
        if val in arg.__bases__ + (arg,):
            arg_attrs.pop(attr)
    for obj in arg_attrs.values():
        if isinstance(obj, types.FunctionType):
            obj.__no_type_check__ = True
        if isinstance(obj, type):
            no_type_check(obj)
```

Source: https://github.com/python/cpython/blob/8b1b27f1939cc4060531d198fdb09242f247ca7c/Lib/typing.py#L1952-L1975

As you can see above, we traverse all `__dict__` values of the given class and, for some reason, recurse into all nested types. I think the original goal was to handle cases like:

```
@no_type_check
class Outer:
    class Inner: ...
```

And now it also affects regular assignments.

So, what can we do?

0. Nothing, it works correctly (I disagree)
1. Do not cover nested classes at all with `@no_type_check`, only cover methods
2. Only cover types that are **defined** in this class, like my `Inner` class example
3. Something else?

I think that `(2)` is more in line with the current implementation, so my vote is for it. I would like to implement this once we reach final agreement :)

----------
components: Library (Lib)
messages: 412073
nosy: AlexWaygood, Jelle Zijlstra, gvanrossum, kj, sobolevn
priority: normal
severity: normal
status: open
title: Strange `@typing.no_type_check` behavior for class variables
versions: Python 3.10, Python 3.11, Python 3.9

_______________________________________
Python tracker <rep...@bugs.python.org>
<https://bugs.python.org/issue46571>
_______________________________________