This is an automated email from the ASF dual-hosted git repository.

potiuk pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/airflow.git


The following commit(s) were added to refs/heads/main by this push:
     new f7b663d9af Run mypy checks for full packages in CI (#36638)
f7b663d9af is described below

commit f7b663d9aff472d0a419e16c262fbae2a8a69ce1
Author: Jarek Potiuk <ja...@potiuk.com>
AuthorDate: Sun Jan 7 13:53:23 2024 +0100

    Run mypy checks for full packages in CI (#36638)
    
    MyPy as used in our static checks has slightly different heuristics
    when running on individual files vs. whole packages. This sometimes
    causes semi-random failures when a different set of files is produced
    as pre-commit splits the files between parallel processes.
    
    The regular `mypy-*` pre-commits work by passing filenames to the mypy
    checks, and when the `--all-files` flag is used, this means
    that 2700 files are passed. In this case pre-commit will split such
    a long list of files into several sequential mypy executions. This
    is not very good because, depending on the list of files passed,
    pre-commit can split the list differently and the results will differ
    when just the list of files changes - so mypy might start detecting
    problems that were not present before.
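
    The splitting behaviour described above can be sketched roughly as
    follows (illustrative only - `batch_args`, the 120-character limit and
    the file names are made up, not the actual pre-commit implementation):

```python
def batch_args(files, max_chars=120):
    # Greedily pack file names into batches with a bounded total argument
    # length, roughly how pre-commit/xargs split an over-long command line.
    batches, current, size = [], [], 0
    for name in files:
        if current and size + len(name) + 1 > max_chars:
            batches.append(current)
            current, size = [], 0
        current.append(name)
        size += len(name) + 1
    if current:
        batches.append(current)
    return batches

files = [f"airflow/models/file_{i}.py" for i in range(10)]
before = batch_args(files)
# Inserting a single file shifts every batch boundary after it, so each
# subsequent mypy invocation sees a different set of files - and mypy,
# run per batch, can then report different results.
after = batch_args(files[:3] + ["airflow/models/new.py"] + files[3:])
```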
    
    This PR introduces a new `mypy` check that runs mypy for whole
    packages rather than individual files. We cannot use it for local
    pre-commit runs because, in many cases, such a package-based
    mypy check will run for minutes when a single file changes,
    due to cache invalidation rules - and we do not want to penalise
    commits that change common airflow code (such PRs would
    invalidate a lot of the mypy cache every time such a common file
    changes). So we still want to run file-based mypy for local
    commits. But in CI we do not want to pass 2700 files; instead
    we want to run the mypy checks "per package".
    
    This PR introduces a new "manual"-stage mypy pre-commit check that
    runs "package"-based mypy checks, adds selective-check rules
    that decide properly when to run such checks, and adds a separate,
    matrix-based CI job that runs these mypy checks - separately
    for each of the packages: "airflow", "providers", "docs", "dev".
    
    This job also skips the providers checks on non-main branches and
    runs all checks when "full tests needed" is requested.
    
    This PR also ignores some errors resulting from the 3rd-party
    libraries used that randomly appear when some files are modified
    (and fixes the current main failures).
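
    For reference, the new manual hook can be exercised locally the same
    way the new CI job invokes it (a sketch assuming you run it from an
    airflow repository checkout; the command and `MYPY_PACKAGES` variable
    mirror what this PR adds):

```shell
# Run the new manual-stage "mypy" pre-commit hook for a single package,
# mirroring the invocation used by the new CI job. "airflow" is one of
# the supported values: airflow, providers, docs, dev.
pip install pre-commit
MYPY_PACKAGES="airflow" pre-commit run --color always --verbose \
    --hook-stage manual mypy --all-files
```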
---
 .github/workflows/ci.yml                           |  56 ++++-
 .pre-commit-config.yaml                            |  13 +-
 STATIC_CODE_CHECKS.rst                             |  32 ++-
 airflow/decorators/__init__.pyi                    |  20 +-
 airflow/models/taskreschedule.py                   |   3 +-
 airflow/operators/latest_only.py                   |   2 +-
 airflow/providers/apache/druid/hooks/druid.py      |   4 +-
 .../providers/databricks/hooks/databricks_sql.py   |   4 +-
 airflow/providers/exasol/hooks/exasol.py           |   6 +-
 airflow/providers/google/cloud/hooks/cloud_run.py  |   2 +-
 .../providers/google/cloud/operators/bigquery.py   |   2 +-
 .../providers/google/cloud/triggers/cloud_run.py   |   2 +-
 .../google/cloud/utils/credentials_provider.py     |   2 +-
 .../providers/google/common/hooks/base_google.py   |   2 +-
 .../google/common/utils/id_token_credentials.py    |   4 +-
 airflow/providers/grpc/hooks/grpc.py               |   2 +-
 airflow/providers/postgres/hooks/postgres.py       |   4 +-
 airflow/providers/snowflake/hooks/snowflake.py     |   6 +-
 airflow/providers/trino/operators/trino.py         |   8 +-
 airflow/providers/vertica/hooks/vertica.py         |   2 +-
 airflow/providers_manager.py                       |  10 +-
 airflow/sensors/base.py                            |   4 +-
 airflow/utils/operator_helpers.py                  |   2 +-
 dev/breeze/src/airflow_breeze/pre_commit_ids.py    |   1 +
 .../src/airflow_breeze/utils/selective_checks.py   |  63 ++++--
 dev/breeze/tests/test_selective_checks.py          | 234 +++++++++++++++++----
 images/breeze/output_static-checks.svg             |  24 +--
 images/breeze/output_static-checks.txt             |   2 +-
 scripts/ci/pre_commit/common_precommit_utils.py    |  26 ++-
 scripts/ci/pre_commit/pre_commit_mypy.py           |  31 ++-
 30 files changed, 443 insertions(+), 130 deletions(-)

diff --git a/.github/workflows/ci.yml b/.github/workflows/ci.yml
index a622e74111..9e9d7a788a 100644
--- a/.github/workflows/ci.yml
+++ b/.github/workflows/ci.yml
@@ -104,6 +104,8 @@ jobs:
       ci-image-build: ${{ steps.selective-checks.outputs.ci-image-build }}
       prod-image-build: ${{ steps.selective-checks.outputs.prod-image-build }}
       docs-build: ${{ steps.selective-checks.outputs.docs-build }}
+      mypy-packages: ${{ steps.selective-checks.outputs.mypy-packages }}
+      needs-mypy: ${{ steps.selective-checks.outputs.needs-mypy }}
       needs-helm-tests: ${{ steps.selective-checks.outputs.needs-helm-tests }}
       needs-api-tests: ${{ steps.selective-checks.outputs.needs-api-tests }}
      needs-api-codegen: ${{ steps.selective-checks.outputs.needs-api-codegen }}
@@ -510,7 +512,6 @@ jobs:
           retention-days: 7
           if-no-files-found: error
 
-
   static-checks:
     timeout-minutes: 45
     name: "Static checks"
@@ -550,6 +551,57 @@ jobs:
           DEFAULT_BRANCH: ${{ needs.build-info.outputs.default-branch }}
           RUFF_FORMAT: "github"
 
+  # Runs static checks for groups of files in the repository in a single process
+  # without passing the list of files
+  mypy:
+    timeout-minutes: 45
+    name: "MyPy checks"
+    runs-on: ${{fromJSON(needs.build-info.outputs.runs-on)}}
+    needs: [build-info, wait-for-ci-images]
+    strategy:
+      fail-fast: false
+      matrix:
+        mypy-package: ${{fromJson(needs.build-info.outputs.mypy-packages)}}
+    env:
+      RUNS_ON: "${{needs.build-info.outputs.runs-on}}"
+      PYTHON_MAJOR_MINOR_VERSION: "${{needs.build-info.outputs.default-python-version}}"
+      UPGRADE_TO_NEWER_DEPENDENCIES: "${{ needs.build-info.outputs.upgrade-to-newer-dependencies }}"
+    steps:
+      - name: Cleanup repo
+        run: docker run -v "${GITHUB_WORKSPACE}:/workspace" -u 0:0 bash -c "rm -rf /workspace/*"
+        if: needs.build-info.outputs.needs-mypy == 'true'
+      - name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
+        uses: actions/checkout@v4
+        with:
+          persist-credentials: false
+        if: needs.build-info.outputs.needs-mypy == 'true'
+      - name: >
+          Prepare breeze & CI image: ${{needs.build-info.outputs.default-python-version}}:${{env.IMAGE_TAG}}
+        uses: ./.github/actions/prepare_breeze_and_image
+        id: breeze
+        if: needs.build-info.outputs.needs-mypy == 'true'
+      - name: Cache pre-commit envs
+        uses: actions/cache@v3
+        with:
+          path: ~/.cache/pre-commit
+          # yamllint disable-line rule:line-length
+          key: "pre-commit-${{steps.breeze.outputs.host-python-version}}-${{ hashFiles('.pre-commit-config.yaml') }}"
+          restore-keys: |
+            pre-commit-${{steps.breeze.outputs.host-python-version}}-
+        if: needs.build-info.outputs.needs-mypy == 'true'
+      - name: "MyPy checks for ${{ matrix.mypy-package }}"
+        run: |
+          pip install pre-commit
+          pre-commit run --color always --verbose --hook-stage manual mypy --all-files
+        env:
+          VERBOSE: "false"
+          COLUMNS: "250"
+          SKIP_GROUP_OUTPUT: "true"
+          DEFAULT_BRANCH: ${{ needs.build-info.outputs.default-branch }}
+          RUFF_FORMAT: "github"
+          MYPY_PACKAGES: ${{ matrix.mypy-package }}
+        if: needs.build-info.outputs.needs-mypy == 'true'
+
  # Those checks are run if no image needs to be built for checks. This is for simple changes that
  # Do not touch any of the python code or any of the important files that might require building
  # The CI Docker image and they can be run entirely using the pre-commit virtual environments on host
@@ -2041,6 +2093,7 @@ jobs:
       - wait-for-ci-images
       - wait-for-prod-images
       - static-checks
+      - mypy
       - tests-sqlite
       - tests-mysql
       - tests-postgres
@@ -2195,6 +2248,7 @@ jobs:
       - build-docs
       - spellcheck-docs
       - static-checks
+      - mypy
       - tests-sqlite
       - tests-mysql
       - tests-postgres
diff --git a/.pre-commit-config.yaml b/.pre-commit-config.yaml
index fcd9284002..4923e4e7ea 100644
--- a/.pre-commit-config.yaml
+++ b/.pre-commit-config.yaml
@@ -736,7 +736,7 @@ repos:
         # Keep dependency versions in sync w/ airflow/www/package.json
        additional_dependencies: ['stylelint@13.3.1', 'stylelint-config-standard@20.0.0', 'stylelint-config-prettier@9.0.5']
       - id: compile-www-assets
-        name: Compile www assets
+        name: Compile www assets (manual)
         language: node
         stages: ['manual']
         'types_or': [javascript, ts, tsx]
@@ -745,7 +745,7 @@ repos:
         pass_filenames: false
         additional_dependencies: ['yarn@1.22.19']
       - id: compile-www-assets-dev
-        name: Compile www assets in dev mode
+        name: Compile www assets in dev mode (manual)
         language: node
         stages: ['manual']
         'types_or': [javascript, ts, tsx]
@@ -1105,6 +1105,15 @@ repos:
         exclude: ^docs/rtd-deprecation
         require_serial: true
         additional_dependencies: ['rich>=12.4.4']
+      - id: mypy
+        stages: ['manual']
+        name: Run mypy for specified packages (manual)
+        language: python
+        entry: ./scripts/ci/pre_commit/pre_commit_mypy.py
+        pass_filenames: false
+        files: ^.*\.py$
+        require_serial: true
+        additional_dependencies: ['rich>=12.4.4']
       - id: check-provider-yaml-valid
         name: Validate provider.yaml files
         entry: ./scripts/ci/pre_commit/pre_commit_check_provider_yaml_files.py
diff --git a/STATIC_CODE_CHECKS.rst b/STATIC_CODE_CHECKS.rst
index bfe03692ca..7e093aa0c4 100644
--- a/STATIC_CODE_CHECKS.rst
+++ b/STATIC_CODE_CHECKS.rst
@@ -114,6 +114,13 @@ Available pre-commit checks
 This table lists pre-commit hooks used by Airflow. The ``Image`` column indicates which hooks
 require Breeze Docker image to be built locally.
 
+.. note:: Manual pre-commits
+
+  Most of the checks we run are configured to run automatically when you commit the code. However,
+  there are some checks that are not run automatically and you need to run them manually. Those
+  checks are marked with ``manual`` in the ``Description`` column in the table below. You can run
+  them manually by running ``pre-commit run --hook-stage manual <hook-id>``.
+
 .. note:: Disabling particular checks
 
  In case you have a problem with running particular ``pre-commit`` check you can still continue using the
@@ -127,6 +134,24 @@ require Breeze Docker image to be built locally.
  the image by setting ``SKIP_BREEZE_PRE_COMMITS`` to "true". This will mark the tests as "green" automatically
  when run locally (note that those checks will anyway run in CI).
 
+.. note:: Mypy checks
+
+  When we run mypy checks locally when committing a change, one of the ``mypy-*`` checks is run, ``mypy-core``,
+  ``mypy-dev``, ``mypy-providers``, ``mypy-docs``, depending on the files you are changing. The mypy checks
+  are run by passing those changed files to mypy. This is way faster than running checks for all files (even
+  if mypy cache is used - especially when you change a file in airflow core that is imported and used by many
+  files). However, in some cases, it produces different results than when running checks for the whole set
+  of files, because ``mypy`` does not even know that some types are defined in other files and it might not
+  be able to follow imports properly if they are dynamic. Therefore in CI we run ``mypy`` check for whole
+  directories (``airflow`` - excluding providers, ``airflow/providers``, ``dev`` and ``docs``) to make sure
+  that we catch all ``mypy`` errors - so you can experience different results when running mypy locally and
+  in CI. If you want to run mypy checks for all files locally, you can do it by running the following
+  command (example for ``airflow`` files):
+
+     .. code-block:: bash
+
+        MYPY_PACKAGES="airflow" pre-commit run --hook-stage manual mypy --all-files
+
 .. note:: Mypy volume cache
 
  MyPy uses a separate docker-volume (called ``mypy-cache-volume``) that keeps the cache of last MyPy
@@ -135,6 +160,7 @@ require Breeze Docker image to be built locally.
  is the hard problem in computer science). This might happen for example when we upgrade MyPY. In such
  cases you might need to manually remove the cache volume by running ``breeze down --cleanup-mypy-cache``.
 
+
   .. BEGIN AUTO-GENERATED STATIC CHECK LIST
 
 
+-----------------------------------------------------------+--------------------------------------------------------------+---------+
@@ -258,9 +284,9 @@ require Breeze Docker image to be built locally.
 
+-----------------------------------------------------------+--------------------------------------------------------------+---------+
 | codespell                                                 | Run codespell to check for common misspellings in files      |         |
 
+-----------------------------------------------------------+--------------------------------------------------------------+---------+
-| compile-www-assets                                        | Compile www assets                                           |         |
+| compile-www-assets                                        | Compile www assets (manual)                                  |         |
 
+-----------------------------------------------------------+--------------------------------------------------------------+---------+
-| compile-www-assets-dev                                    | Compile www assets in dev mode                               |         |
+| compile-www-assets-dev                                    | Compile www assets in dev mode (manual)                      |         |
 
+-----------------------------------------------------------+--------------------------------------------------------------+---------+
 | create-missing-init-py-files-tests                        | Create missing init.py files in tests                        |         |
 
+-----------------------------------------------------------+--------------------------------------------------------------+---------+
@@ -316,6 +342,8 @@ require Breeze Docker image to be built locally.
 
+-----------------------------------------------------------+--------------------------------------------------------------+---------+
 | mixed-line-ending                                         | Detect if mixed line ending is used (\r vs. \r\n)            |         |
 
+-----------------------------------------------------------+--------------------------------------------------------------+---------+
+| mypy                                                      | Run mypy for specified packages (manual)                     | *       |
++-----------------------------------------------------------+--------------------------------------------------------------+---------+
 | mypy-core                                                 | Run mypy for core                                            | *       |
 
+-----------------------------------------------------------+--------------------------------------------------------------+---------+
 | mypy-dev                                                  | Run mypy for dev                                             | *       |
diff --git a/airflow/decorators/__init__.pyi b/airflow/decorators/__init__.pyi
index 9c6f19c311..5574fa361e 100644
--- a/airflow/decorators/__init__.pyi
+++ b/airflow/decorators/__init__.pyi
@@ -62,7 +62,7 @@ __all__ = [
 
 class TaskDecoratorCollection:
     @overload
-    def python(
+    def python(  # type: ignore[misc]
         self,
         *,
         multiple_outputs: bool | None = None,
@@ -90,7 +90,7 @@ class TaskDecoratorCollection:
     def python(self, python_callable: Callable[FParams, FReturn]) -> Task[FParams, FReturn]: ...
     # [END mixin_for_typing]
     @overload
-    def __call__(
+    def __call__(  # type: ignore[misc]
         self,
         *,
         multiple_outputs: bool | None = None,
@@ -103,7 +103,7 @@ class TaskDecoratorCollection:
     def __call__(self, python_callable: Callable[FParams, FReturn]) -> Task[FParams, FReturn]:
         """Aliasing ``python``; signature should match exactly."""
     @overload
-    def virtualenv(
+    def virtualenv(  # type: ignore[misc]
         self,
         *,
         multiple_outputs: bool | None = None,
@@ -189,7 +189,9 @@ class TaskDecoratorCollection:
             such as transmission a large amount of XCom to TaskAPI.
         """
     @overload
-    def branch(self, *, multiple_outputs: bool | None = None, **kwargs) -> TaskDecorator:
+    def branch(  # type: ignore[misc]
+        self, *, multiple_outputs: bool | None = None, **kwargs
+    ) -> TaskDecorator:
        """Create a decorator to wrap the decorated callable into a BranchPythonOperator.

        For more information on how to use this decorator, see :ref:`concepts:branching`.
@@ -201,7 +203,7 @@ class TaskDecoratorCollection:
     @overload
     def branch(self, python_callable: Callable[FParams, FReturn]) -> Task[FParams, FReturn]: ...
     @overload
-    def branch_virtualenv(
+    def branch_virtualenv(  # type: ignore[misc]
         self,
         *,
         multiple_outputs: bool | None = None,
@@ -294,7 +296,7 @@ class TaskDecoratorCollection:
         self, python_callable: Callable[FParams, FReturn]
     ) -> Task[FParams, FReturn]: ...
     @overload
-    def short_circuit(
+    def short_circuit(  # type: ignore[misc]
         self,
         *,
         multiple_outputs: bool | None = None,
@@ -629,7 +631,7 @@ class TaskDecoratorCollection:
         :param progress_callback: Callback function for receiving k8s container logs.
         """
     @overload
-    def sensor(
+    def sensor(  # type: ignore[misc]
         self,
         *,
         poke_interval: float = ...,
@@ -666,7 +668,7 @@ class TaskDecoratorCollection:
     @overload
     def sensor(self, python_callable: Callable[FParams, FReturn] | None = None) -> Task[FParams, FReturn]: ...
     @overload
-    def pyspark(
+    def pyspark(  # type: ignore[misc]
         self,
         *,
         multiple_outputs: bool | None = None,
@@ -688,7 +690,7 @@ class TaskDecoratorCollection:
         self, python_callable: Callable[FParams, FReturn] | None = None
     ) -> Task[FParams, FReturn]: ...
     @overload
-    def bash(
+    def bash(  # type: ignore[misc]
         self,
         *,
         env: dict[str, str] | None = None,
diff --git a/airflow/models/taskreschedule.py b/airflow/models/taskreschedule.py
index a098001773..69356661e2 100644
--- a/airflow/models/taskreschedule.py
+++ b/airflow/models/taskreschedule.py
@@ -38,6 +38,7 @@ if TYPE_CHECKING:
 
     from airflow.models.operator import Operator
     from airflow.models.taskinstance import TaskInstance
+    from airflow.serialization.pydantic.taskinstance import TaskInstancePydantic
 
 
 class TaskReschedule(TaskInstanceDependencies):
@@ -103,7 +104,7 @@ class TaskReschedule(TaskInstanceDependencies):
     @classmethod
     def stmt_for_task_instance(
         cls,
-        ti: TaskInstance,
+        ti: TaskInstance | TaskInstancePydantic,
         *,
         try_number: int | None = None,
         descending: bool = False,
diff --git a/airflow/operators/latest_only.py b/airflow/operators/latest_only.py
index bca66dc2d4..7b4341b101 100644
--- a/airflow/operators/latest_only.py
+++ b/airflow/operators/latest_only.py
@@ -45,7 +45,7 @@ class LatestOnlyOperator(BaseBranchOperator):
     def choose_branch(self, context: Context) -> str | Iterable[str]:
         # If the DAG Run is externally triggered, then return without
         # skipping downstream tasks
-        dag_run: DagRun = context["dag_run"]
+        dag_run: DagRun = context["dag_run"]  # type: ignore[assignment]
         if dag_run.external_trigger:
            self.log.info("Externally triggered DAG_Run: allowing execution to proceed.")
            return list(context["task"].get_direct_relative_ids(upstream=False))
diff --git a/airflow/providers/apache/druid/hooks/druid.py b/airflow/providers/apache/druid/hooks/druid.py
index 455c780f76..92d4aa6408 100644
--- a/airflow/providers/apache/druid/hooks/druid.py
+++ b/airflow/providers/apache/druid/hooks/druid.py
@@ -200,7 +200,7 @@ class DruidDbApiHook(DbApiHook):
         endpoint = conn.extra_dejson.get("endpoint", "druid/v2/sql")
         return f"{conn_type}://{host}/{endpoint}"
 
-    def set_autocommit(self, conn: connect, autocommit: bool) -> NotImplementedError:
+    def set_autocommit(self, conn: connect, autocommit: bool) -> None:
         raise NotImplementedError()
 
     def insert_rows(
@@ -211,5 +211,5 @@ class DruidDbApiHook(DbApiHook):
         commit_every: int = 1000,
         replace: bool = False,
         **kwargs: Any,
-    ) -> NotImplementedError:
+    ) -> None:
         raise NotImplementedError()
diff --git a/airflow/providers/databricks/hooks/databricks_sql.py b/airflow/providers/databricks/hooks/databricks_sql.py
index 6c31691c45..d18d804b04 100644
--- a/airflow/providers/databricks/hooks/databricks_sql.py
+++ b/airflow/providers/databricks/hooks/databricks_sql.py
@@ -174,7 +174,7 @@ class DatabricksSqlHook(BaseDatabricksHook, DbApiHook):
             )
         return self._sql_conn
 
-    @overload
+    @overload  # type: ignore[override]
     def run(
         self,
         sql: str | Iterable[str],
@@ -249,7 +249,7 @@ class DatabricksSqlHook(BaseDatabricksHook, DbApiHook):
                 self.set_autocommit(conn, autocommit)
 
                 with closing(conn.cursor()) as cur:
-                    self._run_command(cur, sql_statement, parameters)
+                    self._run_command(cur, sql_statement, parameters)  # type: ignore[attr-defined]
                     if handler is not None:
                         raw_result = handler(cur)
                         if self.return_tuple:
diff --git a/airflow/providers/exasol/hooks/exasol.py b/airflow/providers/exasol/hooks/exasol.py
index 6d52b5122b..81c9381b3a 100644
--- a/airflow/providers/exasol/hooks/exasol.py
+++ b/airflow/providers/exasol/hooks/exasol.py
@@ -162,7 +162,7 @@ class ExasolHook(DbApiHook):
             )
         return cols
 
-    @overload
+    @overload  # type: ignore[override]
     def run(
         self,
         sql: str | Iterable[str],
@@ -232,7 +232,9 @@ class ExasolHook(DbApiHook):
                with closing(conn.execute(sql_statement, parameters)) as exa_statement:
                    self.log.info("Running statement: %s, parameters: %s", sql_statement, parameters)
                    if handler is not None:
-                        result = self._make_common_data_structure(handler(exa_statement))
+                        result = self._make_common_data_structure(  # type: ignore[attr-defined]
+                            handler(exa_statement)
+                        )
                        if return_single_query_results(sql, return_last, split_statements):
                             _last_result = result
                             _last_columns = self.get_description(exa_statement)
diff --git a/airflow/providers/google/cloud/hooks/cloud_run.py b/airflow/providers/google/cloud/hooks/cloud_run.py
index 8741aa1d4d..61a6fadb76 100644
--- a/airflow/providers/google/cloud/hooks/cloud_run.py
+++ b/airflow/providers/google/cloud/hooks/cloud_run.py
@@ -31,7 +31,7 @@ from google.cloud.run_v2 import (
     RunJobRequest,
     UpdateJobRequest,
 )
-from google.longrunning import operations_pb2
+from google.longrunning import operations_pb2  # type: ignore[attr-defined]
 
 from airflow.exceptions import AirflowException
 from airflow.providers.google.common.consts import CLIENT_INFO
diff --git a/airflow/providers/google/cloud/operators/bigquery.py b/airflow/providers/google/cloud/operators/bigquery.py
index c9fdee102e..18bd065cb2 100644
--- a/airflow/providers/google/cloud/operators/bigquery.py
+++ b/airflow/providers/google/cloud/operators/bigquery.py
@@ -449,7 +449,7 @@ class BigQueryValueCheckOperator(_BigQueryDbHookMixin, SQLValueCheckOperator):
             # job.result() returns a RowIterator. Mypy expects an instance of SupportsNext[Any] for
             # the next() call which the RowIterator does not resemble to. Hence, ignore the arg-type error.
             records = next(job.result())  # type: ignore[arg-type]
-            self.check_value(records)
+            self.check_value(records)  # type: ignore[attr-defined]
             self.log.info("Current state of job %s is %s", job.job_id, job.state)
 
     @staticmethod
diff --git a/airflow/providers/google/cloud/triggers/cloud_run.py b/airflow/providers/google/cloud/triggers/cloud_run.py
index f47a7ac1b3..c0d3458c12 100644
--- a/airflow/providers/google/cloud/triggers/cloud_run.py
+++ b/airflow/providers/google/cloud/triggers/cloud_run.py
 from airflow.providers.google.cloud.hooks.cloud_run import CloudRunAsyncHook
 from airflow.triggers.base import BaseTrigger, TriggerEvent
 
 if TYPE_CHECKING:
-    from google.longrunning import operations_pb2
+    from google.longrunning import operations_pb2  # type: ignore[attr-defined]
 
 DEFAULT_BATCH_LOCATION = "us-central1"
 
diff --git a/airflow/providers/google/cloud/utils/credentials_provider.py b/airflow/providers/google/cloud/utils/credentials_provider.py
index 18c00b0844..df83358e4b 100644
--- a/airflow/providers/google/cloud/utils/credentials_provider.py
+++ b/airflow/providers/google/cloud/utils/credentials_provider.py
@@ -30,7 +30,7 @@ from urllib.parse import urlencode
 import google.auth
 import google.auth.credentials
 import google.oauth2.service_account
-from google.auth import impersonated_credentials
+from google.auth import impersonated_credentials  # type: ignore[attr-defined]
 from google.auth.environment_vars import CREDENTIALS, LEGACY_PROJECT, PROJECT
 
 from airflow.exceptions import AirflowException
diff --git a/airflow/providers/google/common/hooks/base_google.py b/airflow/providers/google/common/hooks/base_google.py
index 72c51212ac..b03eaf7547 100644
--- a/airflow/providers/google/common/hooks/base_google.py
+++ b/airflow/providers/google/common/hooks/base_google.py
@@ -36,7 +36,7 @@ import requests
 import tenacity
 from asgiref.sync import sync_to_async
 from google.api_core.exceptions import Forbidden, ResourceExhausted, TooManyRequests
-from google.auth import _cloud_sdk, compute_engine
+from google.auth import _cloud_sdk, compute_engine  # type: ignore[attr-defined]
 from google.auth.environment_vars import CLOUD_SDK_CONFIG_DIR, CREDENTIALS
 from google.auth.exceptions import RefreshError
 from google.auth.transport import _http_client
diff --git a/airflow/providers/google/common/utils/id_token_credentials.py b/airflow/providers/google/common/utils/id_token_credentials.py
index 6a41438a22..c2b78bb2e1 100644
--- a/airflow/providers/google/common/utils/id_token_credentials.py
+++ b/airflow/providers/google/common/utils/id_token_credentials.py
@@ -36,7 +36,7 @@ from typing import TYPE_CHECKING
 
 import google.auth.transport
 from google.auth import credentials as google_auth_credentials, environment_vars, exceptions
-from google.oauth2 import credentials as oauth2_credentials, service_account
+from google.oauth2 import credentials as oauth2_credentials, service_account  # type: ignore[attr-defined]
 
 if TYPE_CHECKING:
     import google.oauth2
@@ -146,7 +146,7 @@ def _get_gcloud_sdk_credentials(
     target_audience: str | None,
 ) -> google_auth_credentials.Credentials | None:
     """Gets the credentials and project ID from the Cloud SDK."""
-    from google.auth import _cloud_sdk
+    from google.auth import _cloud_sdk  # type: ignore[attr-defined]
 
     # Check if application default credentials exist.
    credentials_filename = _cloud_sdk.get_application_default_credentials_path()
diff --git a/airflow/providers/grpc/hooks/grpc.py b/airflow/providers/grpc/hooks/grpc.py
index b0bec1b9ce..a6a3ca1271 100644
--- a/airflow/providers/grpc/hooks/grpc.py
+++ b/airflow/providers/grpc/hooks/grpc.py
@@ -21,7 +21,7 @@ from typing import Any, Callable, Generator
 
 import grpc
 from google import auth as google_auth
-from google.auth import jwt as google_auth_jwt
+from google.auth import jwt as google_auth_jwt  # type: ignore[attr-defined]
 from google.auth.transport import (
     grpc as google_auth_transport_grpc,
     requests as google_auth_transport_requests,
diff --git a/airflow/providers/postgres/hooks/postgres.py b/airflow/providers/postgres/hooks/postgres.py
index 76ce4dacef..481733ece9 100644
--- a/airflow/providers/postgres/hooks/postgres.py
+++ b/airflow/providers/postgres/hooks/postgres.py
@@ -331,7 +331,9 @@ class PostgresHook(DbApiHook):
         if is_redshift:
            authority = self._get_openlineage_redshift_authority_part(connection)
         else:
-            authority = DbApiHook.get_openlineage_authority_part(connection, default_port=5432)
+            authority = DbApiHook.get_openlineage_authority_part(  # type: ignore[attr-defined]
+                connection, default_port=5432
+            )
 
         return DatabaseInfo(
             scheme="postgres" if not is_redshift else "redshift",
diff --git a/airflow/providers/snowflake/hooks/snowflake.py b/airflow/providers/snowflake/hooks/snowflake.py
index 4b0d13b5e5..cd1dc87165 100644
--- a/airflow/providers/snowflake/hooks/snowflake.py
+++ b/airflow/providers/snowflake/hooks/snowflake.py
@@ -300,7 +300,7 @@ class SnowflakeHook(DbApiHook):
     def get_autocommit(self, conn):
         return getattr(conn, "autocommit_mode", False)
 
-    @overload
+    @overload  # type: ignore[override]
     def run(
         self,
         sql: str | Iterable[str],
@@ -385,10 +385,10 @@ class SnowflakeHook(DbApiHook):
             with self._get_cursor(conn, return_dictionaries) as cur:
                 results = []
                 for sql_statement in sql_list:
-                    self._run_command(cur, sql_statement, parameters)
+                    self._run_command(cur, sql_statement, parameters)  # type: ignore[attr-defined]
 
                     if handler is not None:
-                        result = self._make_common_data_structure(handler(cur))
+                        result = self._make_common_data_structure(handler(cur))  # type: ignore[attr-defined]
                         if return_single_query_results(sql, return_last, split_statements):
                             _last_result = result
                             _last_description = cur.description
diff --git a/airflow/providers/trino/operators/trino.py b/airflow/providers/trino/operators/trino.py
index 7f90bf9947..20798977b6 100644
--- a/airflow/providers/trino/operators/trino.py
+++ b/airflow/providers/trino/operators/trino.py
@@ -65,11 +65,11 @@ class TrinoOperator(SQLExecuteQueryOperator):
         )
 
     def on_kill(self) -> None:
-        if self._hook is not None and isinstance(self._hook, TrinoHook):
-            query_id = "'" + self._hook.query_id + "'"
+        if self._hook is not None and isinstance(self._hook, TrinoHook):  # type: ignore[attr-defined]
+            query_id = "'" + self._hook.query_id + "'"  # type: ignore[attr-defined]
             try:
-                self.log.info("Stopping query run with queryId - %s", self._hook.query_id)
-                self._hook.run(
+                self.log.info("Stopping query run with queryId - %s", self._hook.query_id)  # type: ignore[attr-defined]
+                self._hook.run(  # type: ignore[attr-defined]
                     sql=f"CALL system.runtime.kill_query(query_id => {query_id},message => 'Job "
                     f"killed by "
                     f"user');",
diff --git a/airflow/providers/vertica/hooks/vertica.py b/airflow/providers/vertica/hooks/vertica.py
index 91672e2aec..191fa10a8b 100644
--- a/airflow/providers/vertica/hooks/vertica.py
+++ b/airflow/providers/vertica/hooks/vertica.py
@@ -132,7 +132,7 @@ class VerticaHook(DbApiHook):
         conn = connect(**conn_config)
         return conn
 
-    @overload
+    @overload  # type: ignore[override]
     def run(
         self,
         sql: str | Iterable[str],
diff --git a/airflow/providers_manager.py b/airflow/providers_manager.py
index 162f2d32af..979bd2ecf1 100644
--- a/airflow/providers_manager.py
+++ b/airflow/providers_manager.py
@@ -404,7 +404,7 @@ class ProvidersManager(LoggingMixin, metaclass=Singleton):
         return ProvidersManager._initialized
 
     @staticmethod
-    def initialization_stack_trace() -> str:
+    def initialization_stack_trace() -> str | None:
         return ProvidersManager._initialization_stack_trace
 
     def __init__(self):
@@ -418,7 +418,7 @@ class ProvidersManager(LoggingMixin, metaclass=Singleton):
         # Keeps dict of hooks keyed by connection type
         self._hooks_dict: dict[str, HookInfo] = {}
         self._fs_set: set[str] = set()
-        self._taskflow_decorators: dict[str, Callable] = LazyDictWithCache()
+        self._taskflow_decorators: dict[str, Callable] = LazyDictWithCache()  # type: ignore[assignment]
         # keeps mapping between connection_types and hook class, package they come from
         self._hook_provider_dict: dict[str, HookClassProvider] = {}
         # Keeps dict of hooks keyed by connection type. They are lazy evaluated at access time
@@ -1120,7 +1120,9 @@ class ProvidersManager(LoggingMixin, metaclass=Singleton):
         """Retrieve all configs defined in the providers."""
         for provider_package, provider in self._provider_dict.items():
             if provider.data.get("config"):
-                self._provider_configs[provider_package] = provider.data.get("config")
+                self._provider_configs[provider_package] = (
+                    provider.data.get("config")  # type: ignore[assignment]
+                )
 
     def _discover_plugins(self) -> None:
         """Retrieve all plugins defined in the providers."""
@@ -1196,7 +1198,7 @@ class ProvidersManager(LoggingMixin, metaclass=Singleton):
     @property
     def taskflow_decorators(self) -> dict[str, TaskDecorator]:
         self.initialize_providers_taskflow_decorator()
-        return self._taskflow_decorators
+        return self._taskflow_decorators  # type: ignore[return-value]
 
     @property
     def extra_links_class_names(self) -> list[str]:
diff --git a/airflow/sensors/base.py b/airflow/sensors/base.py
index 8bf40c913b..d1a01ac87e 100644
--- a/airflow/sensors/base.py
+++ b/airflow/sensors/base.py
@@ -212,7 +212,9 @@ class BaseSensorOperator(BaseOperator, SkipMixin):
         if self.reschedule:
             # If reschedule, use the start date of the first try (first try can be either the very
             # first execution of the task, or the first execution after the task was cleared.)
-            first_try_number = context["ti"].max_tries - self.retries + 1
+            max_tries: int = context["ti"].max_tries or 0
+            retries: int = self.retries or 0
+            first_try_number = max_tries - retries + 1
             with create_session() as session:
                 start_date = session.scalar(
                     TaskReschedule.stmt_for_task_instance(
diff --git a/airflow/utils/operator_helpers.py b/airflow/utils/operator_helpers.py
index ef1f05e304..5f92467147 100644
--- a/airflow/utils/operator_helpers.py
+++ b/airflow/utils/operator_helpers.py
@@ -172,7 +172,7 @@ class KeywordParameters:
 
     def unpacking(self) -> Mapping[str, Any]:
         """Dump the kwargs mapping to unpack with ``**`` in a function call."""
-        if self._wildcard and isinstance(self._kwargs, Context):
+        if self._wildcard and isinstance(self._kwargs, Context):  # type: ignore[misc]
             return lazy_mapping_from_context(self._kwargs)
         return self._kwargs
 
diff --git a/dev/breeze/src/airflow_breeze/pre_commit_ids.py b/dev/breeze/src/airflow_breeze/pre_commit_ids.py
index 00b94a86e7..469ecd9cc9 100644
--- a/dev/breeze/src/airflow_breeze/pre_commit_ids.py
+++ b/dev/breeze/src/airflow_breeze/pre_commit_ids.py
@@ -103,6 +103,7 @@ PRE_COMMIT_LIST = [
     "lint-markdown",
     "lint-openapi",
     "mixed-line-ending",
+    "mypy",
     "mypy-core",
     "mypy-dev",
     "mypy-docs",
diff --git a/dev/breeze/src/airflow_breeze/utils/selective_checks.py b/dev/breeze/src/airflow_breeze/utils/selective_checks.py
index 9a184b822b..79bae9f8ce 100644
--- a/dev/breeze/src/airflow_breeze/utils/selective_checks.py
+++ b/dev/breeze/src/airflow_breeze/utils/selective_checks.py
@@ -563,6 +563,43 @@ class SelectiveChecks:
             )
             return False
 
+    @cached_property
+    def mypy_packages(self) -> list[str]:
+        packages_to_run: list[str] = []
+        if (
+            self._matching_files(
+                FileGroupForCi.ALL_AIRFLOW_PYTHON_FILES, CI_FILE_GROUP_MATCHES, CI_FILE_GROUP_EXCLUDES
+            )
+            or self.full_tests_needed
+        ):
+            packages_to_run.append("airflow")
+        if (
+            self._matching_files(
+                FileGroupForCi.ALL_PROVIDERS_PYTHON_FILES, CI_FILE_GROUP_MATCHES, CI_FILE_GROUP_EXCLUDES
+            )
+            or self._are_all_providers_affected()
+        ) and self._default_branch == "main":
+            packages_to_run.append("airflow/providers")
+        if (
+            self._matching_files(
+                FileGroupForCi.ALL_DOCS_PYTHON_FILES, CI_FILE_GROUP_MATCHES, CI_FILE_GROUP_EXCLUDES
+            )
+            or self.full_tests_needed
+        ):
+            packages_to_run.append("docs")
+        if (
+            self._matching_files(
+                FileGroupForCi.ALL_DEV_PYTHON_FILES, CI_FILE_GROUP_MATCHES, CI_FILE_GROUP_EXCLUDES
+            )
+            or self.full_tests_needed
+        ):
+            packages_to_run.append("dev")
+        return packages_to_run
+
+    @cached_property
+    def needs_mypy(self) -> bool:
+        return self.mypy_packages != []
+
     @cached_property
     def needs_python_scans(self) -> bool:
         return self._should_be_run(FileGroupForCi.PYTHON_PRODUCTION_FILES)
@@ -802,6 +839,14 @@ class SelectiveChecks:
     def skip_pre_commits(self) -> str:
         pre_commits_to_skip = set()
         pre_commits_to_skip.add("identity")
+        # Skip all mypy "individual" file checks if we are running mypy checks in CI
+        # In the CI we always run mypy for the whole "package" rather than for `--all-files` because
+        # The pre-commit will semi-randomly split such list of files into several groups and we want
+        # to make sure that such checks are always run in CI for whole "group" of files - i.e.
+        # whole package rather than for individual files. That's why we skip those checks in CI
+        # and run them via `mypy-all` command instead and dedicated CI job in matrix
+        # This will also speed up static-checks job usually as the jobs will be running in parallel
+        pre_commits_to_skip.update({"mypy-providers", "mypy-core", "mypy-docs", "mypy-dev"})
         if self._default_branch != "main":
             # Skip those tests on all "release" branches
             pre_commits_to_skip.update(
@@ -810,29 +855,13 @@ class SelectiveChecks:
                     "check-extra-packages-references",
                     "check-provider-yaml-valid",
                     "lint-helm-chart",
-                    "mypy-providers",
                 )
             )
+
         if self.full_tests_needed:
             # when full tests are needed, we do not want to skip any checks and we should
             # run all the pre-commits just to be sure everything is ok when some structural changes occurred
             return ",".join(sorted(pre_commits_to_skip))
-        if not self._matching_files(
-            FileGroupForCi.ALL_PROVIDERS_PYTHON_FILES, CI_FILE_GROUP_MATCHES, CI_FILE_GROUP_EXCLUDES
-        ):
-            pre_commits_to_skip.add("mypy-providers")
-        if not self._matching_files(
-            FileGroupForCi.ALL_AIRFLOW_PYTHON_FILES, CI_FILE_GROUP_MATCHES, CI_FILE_GROUP_EXCLUDES
-        ):
-            pre_commits_to_skip.add("mypy-core")
-        if not self._matching_files(
-            FileGroupForCi.ALL_DOCS_PYTHON_FILES, CI_FILE_GROUP_MATCHES, CI_FILE_GROUP_EXCLUDES
-        ):
-            pre_commits_to_skip.add("mypy-docs")
-        if not self._matching_files(
-            FileGroupForCi.ALL_DEV_PYTHON_FILES, CI_FILE_GROUP_MATCHES, CI_FILE_GROUP_EXCLUDES
-        ):
-            pre_commits_to_skip.add("mypy-dev")
         if not self._matching_files(FileGroupForCi.WWW_FILES, CI_FILE_GROUP_MATCHES, CI_FILE_GROUP_EXCLUDES):
             pre_commits_to_skip.add("ts-compile-format-lint-www")
         if not self._matching_files(
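Conceptually, the new `mypy_packages` / `needs_mypy` properties added above reduce to a mapping from changed-file groups to package paths. The following is a simplified standalone sketch with hypothetical helper names, ignoring the `_are_all_providers_affected()` case and the main-branch gate on providers, not the actual `SelectiveChecks` implementation:

```python
def mypy_packages(changed_files: set[str], full_tests_needed: bool) -> list[str]:
    """Return the package paths mypy should check, in a stable order."""
    groups = [
        # (predicate for "a relevant .py file changed", package to check)
        (lambda f: f.startswith("airflow/") and not f.startswith("airflow/providers/"), "airflow"),
        (lambda f: f.startswith("airflow/providers/"), "airflow/providers"),
        (lambda f: f.startswith("docs/"), "docs"),
        (lambda f: f.startswith("dev/"), "dev"),
    ]
    packages: list[str] = []
    for matches, package in groups:
        if full_tests_needed or any(f.endswith(".py") and matches(f) for f in changed_files):
            packages.append(package)
    return packages

def needs_mypy(changed_files: set[str], full_tests_needed: bool) -> bool:
    # The dedicated CI mypy job runs only when at least one package is selected.
    return mypy_packages(changed_files, full_tests_needed) != []
```

With such a rule, a change to a single provider file selects only `airflow/providers`, while `full_tests_needed` selects every package, which matches the expected outputs asserted in the test changes below.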
diff --git a/dev/breeze/tests/test_selective_checks.py b/dev/breeze/tests/test_selective_checks.py
index 8c11e2628e..667342267e 100644
--- a/dev/breeze/tests/test_selective_checks.py
+++ b/dev/breeze/tests/test_selective_checks.py
@@ -75,7 +75,7 @@ def assert_outputs_are_printed(expected_outputs: dict[str, str], stderr: str):
                     print_in_color("\nOutput received:")
                     print_in_color(received_output_as_dict)
                     print_in_color()
-                    assert expected_value == received_value
+                    assert received_value == expected_value
                 else:
                     print(
                         f"\n[red]ERROR: The key '{expected_key}' missing but "
@@ -111,6 +111,8 @@ def assert_outputs_are_printed(expected_outputs: dict[str, str], stderr: str):
                     "mypy-docs,mypy-providers,ts-compile-format-lint-www",
                     "upgrade-to-newer-dependencies": "false",
                     "parallel-test-types-list-as-string": None,
+                    "needs-mypy": "false",
+                    "mypy-packages": "[]",
                 },
                 id="No tests on simple change",
             )
@@ -130,10 +132,12 @@ def assert_outputs_are_printed(expected_outputs: dict[str, str], stderr: str):
                     "run-tests": "true",
                     "run-amazon-tests": "false",
                     "docs-build": "true",
-                    "skip-pre-commits": "check-provider-yaml-valid,identity,lint-helm-chart,mypy-dev,"
+                    "skip-pre-commits": "check-provider-yaml-valid,identity,lint-helm-chart,mypy-core,mypy-dev,"
                     "mypy-docs,mypy-providers,ts-compile-format-lint-www",
                     "upgrade-to-newer-dependencies": "false",
                     "parallel-test-types-list-as-string": "API Always",
+                    "needs-mypy": "true",
+                    "mypy-packages": "['airflow']",
                 },
                 id="Only API tests and DOCS should run",
             )
@@ -153,10 +157,12 @@ def assert_outputs_are_printed(expected_outputs: dict[str, str], stderr: str):
                     "run-tests": "true",
                     "run-amazon-tests": "false",
                     "docs-build": "true",
-                    "skip-pre-commits": "check-provider-yaml-valid,identity,lint-helm-chart,mypy-dev,"
+                    "skip-pre-commits": "check-provider-yaml-valid,identity,lint-helm-chart,mypy-core,mypy-dev,"
                     "mypy-docs,mypy-providers,ts-compile-format-lint-www",
                     "upgrade-to-newer-dependencies": "false",
                     "parallel-test-types-list-as-string": "Always Operators",
+                    "needs-mypy": "true",
+                    "mypy-packages": "['airflow']",
                 },
                 id="Only Operator tests and DOCS should run",
             )
@@ -176,11 +182,13 @@ def assert_outputs_are_printed(expected_outputs: dict[str, str], stderr: str):
                     "run-tests": "true",
                     "run-amazon-tests": "false",
                     "docs-build": "true",
-                    "skip-pre-commits": "check-provider-yaml-valid,identity,lint-helm-chart,mypy-dev,"
+                    "skip-pre-commits": "check-provider-yaml-valid,identity,lint-helm-chart,mypy-core,mypy-dev,"
                     "mypy-docs,mypy-providers,ts-compile-format-lint-www",
                     "upgrade-to-newer-dependencies": "false",
                     "parallel-test-types-list-as-string": "Always BranchExternalPython BranchPythonVenv "
                     "ExternalPython Operators PythonVenv",
+                    "needs-mypy": "true",
+                    "mypy-packages": "['airflow']",
                 },
                 id="Only Python tests",
             )
@@ -200,10 +208,12 @@ def assert_outputs_are_printed(expected_outputs: dict[str, str], stderr: str):
                     "run-tests": "true",
                     "run-amazon-tests": "false",
                     "docs-build": "true",
-                    "skip-pre-commits": "check-provider-yaml-valid,identity,lint-helm-chart,mypy-dev,"
+                    "skip-pre-commits": "check-provider-yaml-valid,identity,lint-helm-chart,mypy-core,mypy-dev,"
                     "mypy-docs,mypy-providers,ts-compile-format-lint-www",
                     "upgrade-to-newer-dependencies": "false",
                     "parallel-test-types-list-as-string": "Always Serialization",
+                    "needs-mypy": "true",
+                    "mypy-packages": "['airflow']",
                 },
                 id="Only Serialization tests",
             )
@@ -227,11 +237,13 @@ def assert_outputs_are_printed(expected_outputs: dict[str, str], stderr: str):
                     "run-tests": "true",
                     "run-amazon-tests": "true",
                     "docs-build": "true",
-                    "skip-pre-commits": "identity,lint-helm-chart,mypy-dev,mypy-docs,"
+                    "skip-pre-commits": "identity,lint-helm-chart,mypy-core,mypy-dev,mypy-docs,mypy-providers,"
                     "ts-compile-format-lint-www",
                     "upgrade-to-newer-dependencies": "false",
                     "parallel-test-types-list-as-string": "API Always Providers[amazon] "
                     "Providers[common.sql,openlineage,pgvector,postgres] Providers[google]",
+                    "needs-mypy": "true",
+                    "mypy-packages": "['airflow', 'airflow/providers']",
                 },
                 id="API and providers tests and docs should run",
             )
@@ -251,11 +263,13 @@ def assert_outputs_are_printed(expected_outputs: dict[str, str], stderr: str):
                     "run-tests": "true",
                     "run-amazon-tests": "false",
                     "docs-build": "false",
-                    "skip-pre-commits": "identity,lint-helm-chart,mypy-core,mypy-dev,mypy-docs,"
+                    "skip-pre-commits": "identity,lint-helm-chart,mypy-core,mypy-dev,mypy-docs,mypy-providers,"
                     "ts-compile-format-lint-www",
                     "run-kubernetes-tests": "false",
                     "upgrade-to-newer-dependencies": "false",
                     "parallel-test-types-list-as-string": "Always Providers[apache.beam] Providers[google]",
+                    "needs-mypy": "true",
+                    "mypy-packages": "['airflow/providers']",
                 },
                 id="Selected Providers and docs should run",
             )
@@ -280,6 +294,8 @@ def assert_outputs_are_printed(expected_outputs: dict[str, str], stderr: str):
                     "run-kubernetes-tests": "false",
                     "upgrade-to-newer-dependencies": "false",
                     "parallel-test-types-list-as-string": None,
+                    "needs-mypy": "false",
+                    "mypy-packages": "[]",
                 },
                 id="Only docs builds should run - no tests needed",
             )
@@ -303,11 +319,13 @@ def assert_outputs_are_printed(expected_outputs: dict[str, str], stderr: str):
                     "run-tests": "true",
                     "run-amazon-tests": "true",
                     "docs-build": "true",
-                    "skip-pre-commits": "identity,mypy-core,mypy-dev,mypy-docs,ts-compile-format-lint-www",
+                    "skip-pre-commits": "identity,mypy-core,mypy-dev,mypy-docs,mypy-providers,ts-compile-format-lint-www",
                     "run-kubernetes-tests": "true",
                     "upgrade-to-newer-dependencies": "false",
                     "parallel-test-types-list-as-string": "Always Providers[amazon] "
                     "Providers[common.sql,openlineage,pgvector,postgres] Providers[google]",
+                    "needs-mypy": "true",
+                    "mypy-packages": "['airflow/providers']",
                 },
                 id="Helm tests, providers (both upstream and downstream),"
                 "kubernetes tests and docs should run",
@@ -333,11 +351,13 @@ def assert_outputs_are_printed(expected_outputs: dict[str, str], stderr: str):
                     "run-tests": "true",
                     "run-amazon-tests": "true",
                     "docs-build": "true",
-                    "skip-pre-commits": "identity,mypy-core,mypy-dev,mypy-docs,ts-compile-format-lint-www",
+                    "skip-pre-commits": "identity,mypy-core,mypy-dev,mypy-docs,mypy-providers,ts-compile-format-lint-www",
                     "run-kubernetes-tests": "true",
                     "upgrade-to-newer-dependencies": "false",
                     "parallel-test-types-list-as-string": "Always "
                     "Providers[airbyte,apache.livy,dbt.cloud,dingding,discord,http] Providers[amazon]",
+                    "needs-mypy": "true",
+                    "mypy-packages": "['airflow/providers']",
                 },
                id="Helm tests, http and all relevant providers, kubernetes tests and "
                 "docs should run even if unimportant files were added",
@@ -362,10 +382,12 @@ def assert_outputs_are_printed(expected_outputs: dict[str, str], stderr: str):
                     "run-tests": "true",
                     "run-amazon-tests": "false",
                     "docs-build": "true",
-                    "skip-pre-commits": "identity,mypy-core,mypy-dev,mypy-docs,ts-compile-format-lint-www",
+                    "skip-pre-commits": "identity,mypy-core,mypy-dev,mypy-docs,mypy-providers,ts-compile-format-lint-www",
                     "run-kubernetes-tests": "true",
                     "upgrade-to-newer-dependencies": "false",
                     "parallel-test-types-list-as-string": "Always Providers[airbyte,http]",
+                    "needs-mypy": "true",
+                    "mypy-packages": "['airflow/providers']",
                 },
                 id="Helm tests, airbyte/http providers, kubernetes tests and "
                 "docs should run even if unimportant files were added",
@@ -389,12 +411,14 @@ def assert_outputs_are_printed(expected_outputs: dict[str, str], stderr: str):
                     "needs-helm-tests": "true",
                     "run-tests": "true",
                     "docs-build": "true",
-                    "skip-pre-commits": "check-provider-yaml-valid,identity,mypy-dev,"
+                    "skip-pre-commits": "check-provider-yaml-valid,identity,mypy-core,mypy-dev,"
                     "mypy-docs,mypy-providers,ts-compile-format-lint-www",
                     "run-amazon-tests": "false",
                     "run-kubernetes-tests": "true",
                     "upgrade-to-newer-dependencies": "false",
                     "parallel-test-types-list-as-string": "Always",
+                    "needs-mypy": "true",
+                    "mypy-packages": "['airflow']",
                 },
                id="Docs should run even if unimportant files were added and prod image "
                 "should be build for chart changes",
@@ -416,9 +440,11 @@ def assert_outputs_are_printed(expected_outputs: dict[str, str], stderr: str):
                     "run-amazon-tests": "true",
                     "docs-build": "true",
                     "full-tests-needed": "true",
-                    "skip-pre-commits": "identity",
+                    "skip-pre-commits": "identity,mypy-core,mypy-dev,mypy-docs,mypy-providers",
                     "upgrade-to-newer-dependencies": "true",
                     "parallel-test-types-list-as-string": ALL_CI_SELECTIVE_TEST_TYPES,
+                    "needs-mypy": "true",
+                    "mypy-packages": "['airflow', 'airflow/providers', 'docs', 'dev']",
                 },
                id="Everything should run - including all providers and upgrading to "
                "newer requirements as setup.py changed and all Python versions",
@@ -440,9 +466,11 @@ def assert_outputs_are_printed(expected_outputs: dict[str, str], stderr: str):
                     "run-amazon-tests": "true",
                     "docs-build": "true",
                     "full-tests-needed": "true",
-                    "skip-pre-commits": "identity",
+                    "skip-pre-commits": "identity,mypy-core,mypy-dev,mypy-docs,mypy-providers",
                     "upgrade-to-newer-dependencies": "true",
                     "parallel-test-types-list-as-string": ALL_CI_SELECTIVE_TEST_TYPES,
+                    "needs-mypy": "true",
+                    "mypy-packages": "['airflow', 'airflow/providers', 'docs', 'dev']",
                 },
                id="Everything should run and upgrading to newer requirements as dependencies change",
             )
@@ -462,13 +490,15 @@ def assert_outputs_are_printed(expected_outputs: dict[str, str], stderr: str):
                 "needs-helm-tests": "false",
                 "run-tests": "true",
                 "docs-build": "true",
-                "skip-pre-commits": "identity,lint-helm-chart,mypy-core,mypy-dev,mypy-docs,ts-compile-format-lint-www",
+                "skip-pre-commits": "identity,lint-helm-chart,mypy-core,mypy-dev,mypy-docs,mypy-providers,ts-compile-format-lint-www",
                 "run-kubernetes-tests": "false",
                 "upgrade-to-newer-dependencies": "false",
                 "run-amazon-tests": "true",
                 "parallel-test-types-list-as-string": "Always Providers[amazon] "
                 "Providers[apache.hive,cncf.kubernetes,common.sql,exasol,ftp,http,"
                 "imap,microsoft.azure,mongo,mysql,openlineage,postgres,salesforce,ssh] Providers[google]",
+                "needs-mypy": "true",
+                "mypy-packages": "['airflow/providers']",
             },
            id="Providers tests run including amazon tests if amazon provider files changed",
         ),
@@ -486,10 +516,12 @@ def assert_outputs_are_printed(expected_outputs: dict[str, str], stderr: str):
                 "run-tests": "true",
                 "run-amazon-tests": "false",
                 "docs-build": "false",
-                "skip-pre-commits": "identity,lint-helm-chart,mypy-core,mypy-dev,mypy-docs,ts-compile-format-lint-www",
+                "skip-pre-commits": "identity,lint-helm-chart,mypy-core,mypy-dev,mypy-docs,mypy-providers,ts-compile-format-lint-www",
                 "run-kubernetes-tests": "false",
                 "upgrade-to-newer-dependencies": "false",
                 "parallel-test-types-list-as-string": "Always Providers[airbyte,http]",
+                "needs-mypy": "true",
+                "mypy-packages": "['airflow/providers']",
             },
            id="Providers tests run without amazon tests if no amazon file changed",
         ),
@@ -509,12 +541,14 @@ def assert_outputs_are_printed(expected_outputs: dict[str, str], stderr: str):
                 "run-tests": "true",
                 "run-amazon-tests": "true",
                 "docs-build": "true",
-                "skip-pre-commits": "identity,lint-helm-chart,mypy-core,mypy-dev,mypy-docs,ts-compile-format-lint-www",
+                "skip-pre-commits": "identity,lint-helm-chart,mypy-core,mypy-dev,mypy-docs,mypy-providers,ts-compile-format-lint-www",
                 "run-kubernetes-tests": "false",
                 "upgrade-to-newer-dependencies": "false",
                 "parallel-test-types-list-as-string": "Always Providers[amazon] "
                 "Providers[apache.hive,cncf.kubernetes,common.sql,exasol,ftp,http,"
                 "imap,microsoft.azure,mongo,mysql,openlineage,postgres,salesforce,ssh] Providers[google]",
+                "needs-mypy": "true",
+                "mypy-packages": "['airflow/providers']",
             },
            id="Providers tests run including amazon tests if amazon provider files changed",
         ),
@@ -537,8 +571,11 @@ def assert_outputs_are_printed(expected_outputs: dict[str, str], stderr: str):
                 "run-amazon-tests": "false",
                 "docs-build": "false",
                 "run-kubernetes-tests": "false",
+                "skip-pre-commits": "identity,lint-helm-chart,mypy-core,mypy-dev,mypy-docs,mypy-providers,ts-compile-format-lint-www",
                 "upgrade-to-newer-dependencies": "false",
                 "parallel-test-types-list-as-string": "Always Providers[common.io,openlineage]",
+                "needs-mypy": "true",
+                "mypy-packages": "['airflow', 'airflow/providers']",
             },
            id="Only Always and Common.IO tests should run when only common.io and tests/always changed",
         ),
@@ -578,9 +615,11 @@ def test_expected_output_pull_request_main(
                     "docs-build": "true",
                     "docs-list-as-string": ALL_DOCS_SELECTED_FOR_BUILD,
                     "full-tests-needed": "true",
-                    "skip-pre-commits": "identity",
+                    "skip-pre-commits": "identity,mypy-core,mypy-dev,mypy-docs,mypy-providers",
                     "upgrade-to-newer-dependencies": "false",
                     "parallel-test-types-list-as-string": ALL_CI_SELECTIVE_TEST_TYPES,
+                    "needs-mypy": "true",
+                    "mypy-packages": "['airflow', 'airflow/providers', 'docs', 'dev']",
                 },
                id="Everything should run including all providers when full tests are needed",
             )
@@ -605,9 +644,11 @@ def test_expected_output_pull_request_main(
                     "docs-build": "true",
                     "docs-list-as-string": ALL_DOCS_SELECTED_FOR_BUILD,
                     "full-tests-needed": "true",
-                    "skip-pre-commits": "identity",
+                    "skip-pre-commits": "identity,mypy-core,mypy-dev,mypy-docs,mypy-providers",
                     "upgrade-to-newer-dependencies": "false",
                     "parallel-test-types-list-as-string": ALL_CI_SELECTIVE_TEST_TYPES,
+                    "needs-mypy": "true",
+                    "mypy-packages": "['airflow', 'airflow/providers', 'docs', 'dev']",
                 },
                 id="Everything should run including full providers when full "
                 "tests are needed even with different label set as well",
@@ -630,9 +671,11 @@ def test_expected_output_pull_request_main(
                     "docs-build": "true",
                     "docs-list-as-string": ALL_DOCS_SELECTED_FOR_BUILD,
                     "full-tests-needed": "true",
-                    "skip-pre-commits": "identity",
+                    "skip-pre-commits": "identity,mypy-core,mypy-dev,mypy-docs,mypy-providers",
                     "upgrade-to-newer-dependencies": "false",
                     "parallel-test-types-list-as-string": ALL_CI_SELECTIVE_TEST_TYPES,
+                    "needs-mypy": "true",
+                    "mypy-packages": "['airflow', 'airflow/providers', 'docs', 'dev']",
                 },
                 id="Everything should run including full providers when"
                 "full tests are needed even if no files are changed",
@@ -655,14 +698,14 @@ def test_expected_output_pull_request_main(
                     "docs-build": "true",
                     "docs-list-as-string": "apache-airflow docker-stack",
                     "full-tests-needed": "true",
-                    "skip-pre-commits": "check-airflow-provider-compatibility,"
-                    "check-extra-packages-references,check-provider-yaml-valid,identity,"
-                    "lint-helm-chart,mypy-providers",
+                    "skip-pre-commits": "check-airflow-provider-compatibility,check-extra-packages-references,check-provider-yaml-valid,identity,lint-helm-chart,mypy-core,mypy-dev,mypy-docs,mypy-providers",
                     "skip-provider-tests": "true",
                     "upgrade-to-newer-dependencies": "false",
                     "parallel-test-types-list-as-string": "API Always BranchExternalPython "
                     "BranchPythonVenv CLI Core ExternalPython Operators Other PlainAsserts "
                     "PythonVenv Serialization WWW",
+                    "needs-mypy": "true",
+                    "mypy-packages": "['airflow', 'docs', 'dev']",
                 },
                id="Everything should run except Providers and lint pre-commit "
                 "when full tests are needed for non-main branch",
@@ -707,6 +750,8 @@ def test_expected_output_full_tests_needed(
                 "upgrade-to-newer-dependencies": "false",
                 "skip-provider-tests": "true",
                 "parallel-test-types-list-as-string": None,
+                "needs-mypy": "false",
+                "mypy-packages": "[]",
             },
             id="Nothing should run if only non-important files changed",
         ),
@@ -735,6 +780,8 @@ def test_expected_output_full_tests_needed(
                 "upgrade-to-newer-dependencies": "false",
                 "skip-provider-tests": "true",
                 "parallel-test-types-list-as-string": "Always",
+                "needs-mypy": "false",
+                "mypy-packages": "[]",
             },
            id="No Helm tests, No providers no lint charts, should run if "
            "only chart/providers changed in non-main but PROD image should be built",
@@ -759,13 +806,13 @@ def test_expected_output_full_tests_needed(
                 "docs-build": "true",
                 "docs-list-as-string": "apache-airflow docker-stack",
                 "full-tests-needed": "false",
-                "skip-pre-commits": "check-airflow-provider-compatibility,check-extra-packages-references,"
-                "check-provider-yaml-valid,identity,lint-helm-chart,"
-                "mypy-dev,mypy-docs,mypy-providers,ts-compile-format-lint-www",
+                "skip-pre-commits": "check-airflow-provider-compatibility,check-extra-packages-references,check-provider-yaml-valid,identity,lint-helm-chart,mypy-core,mypy-dev,mypy-docs,mypy-providers,ts-compile-format-lint-www",
                 "run-kubernetes-tests": "true",
                 "upgrade-to-newer-dependencies": "false",
                 "skip-provider-tests": "true",
                 "parallel-test-types-list-as-string": "Always CLI",
+                "needs-mypy": "true",
+                "mypy-packages": "['airflow']",
             },
             id="Only CLI tests and Kubernetes tests should run if cli/chart files changed in non-main branch",
         ),
@@ -788,11 +835,11 @@ def test_expected_output_full_tests_needed(
                 "run-kubernetes-tests": "false",
                 "upgrade-to-newer-dependencies": "false",
                 "skip-provider-tests": "true",
-                "skip-pre-commits": "check-airflow-provider-compatibility,check-extra-packages-references,"
-                "check-provider-yaml-valid,identity,lint-helm-chart,"
-                "mypy-dev,mypy-docs,mypy-providers,ts-compile-format-lint-www",
+                "skip-pre-commits": "check-airflow-provider-compatibility,check-extra-packages-references,check-provider-yaml-valid,identity,lint-helm-chart,mypy-core,mypy-dev,mypy-docs,mypy-providers,ts-compile-format-lint-www",
                "parallel-test-types-list-as-string": "API Always BranchExternalPython BranchPythonVenv "
                "CLI Core ExternalPython Operators Other PlainAsserts PythonVenv Serialization WWW",
+                "needs-mypy": "true",
+                "mypy-packages": "['airflow']",
             },
             id="All tests except Providers and helm lint pre-commit "
             "should run if core file changed in non-main branch",
@@ -832,6 +879,8 @@ def test_expected_output_pull_request_v2_7(
                "mypy-core,mypy-dev,mypy-docs,mypy-providers,ts-compile-format-lint-www",
                 "skip-provider-tests": "true",
                 "parallel-test-types-list-as-string": None,
+                "needs-mypy": "false",
+                "mypy-packages": "[]",
             },
             id="Nothing should run if only non-important files changed",
         ),
@@ -847,11 +896,12 @@ def test_expected_output_pull_request_v2_7(
                 "run-tests": "true",
                 "docs-build": "true",
                 "docs-list-as-string": ALL_DOCS_SELECTED_FOR_BUILD,
-                "skip-pre-commits": "check-provider-yaml-valid,identity,lint-helm-chart,"
-                "mypy-dev,mypy-docs,mypy-providers,ts-compile-format-lint-www",
+                "skip-pre-commits": "check-provider-yaml-valid,identity,lint-helm-chart,mypy-core,mypy-dev,mypy-docs,mypy-providers,ts-compile-format-lint-www",
                 "upgrade-to-newer-dependencies": "false",
                 "skip-provider-tests": "true",
                 "parallel-test-types-list-as-string": "Always",
+                "needs-mypy": "true",
+                "mypy-packages": "['airflow']",
             },
             id="Only Always and docs build should run if only system tests changed",
         ),
@@ -877,7 +927,7 @@ def test_expected_output_pull_request_v2_7(
                "cncf.kubernetes common.sql facebook google hashicorp microsoft.azure "
                 "microsoft.mssql mysql openlineage oracle postgres "
                 "presto salesforce samba sftp ssh trino",
-                "skip-pre-commits": "identity,mypy-dev,mypy-docs,ts-compile-format-lint-www",
+                "skip-pre-commits": "identity,mypy-core,mypy-dev,mypy-docs,mypy-providers,ts-compile-format-lint-www",
                 "run-kubernetes-tests": "true",
                 "upgrade-to-newer-dependencies": "false",
                 "skip-provider-tests": "false",
@@ -885,6 +935,8 @@ def test_expected_output_pull_request_v2_7(
                "Providers[apache.beam,apache.cassandra,cncf.kubernetes,common.sql,facebook,hashicorp,"
                "microsoft.azure,microsoft.mssql,mysql,openlineage,oracle,postgres,presto,salesforce,"
                 "samba,sftp,ssh,trino] Providers[google]",
+                "needs-mypy": "true",
+                "mypy-packages": "['airflow', 'airflow/providers']",
             },
             id="CLI tests and Google-related provider tests should run if cli/chart files changed but "
             "prod image should be build too and k8s tests too",
@@ -906,12 +958,13 @@ def test_expected_output_pull_request_v2_7(
                 "run-tests": "true",
                 "docs-build": "true",
                 "docs-list-as-string": "apache-airflow",
-                "skip-pre-commits": "check-provider-yaml-valid,identity,lint-helm-chart,mypy-dev,"
-                "mypy-docs,mypy-providers,ts-compile-format-lint-www",
+                "skip-pre-commits": "check-provider-yaml-valid,identity,lint-helm-chart,mypy-core,mypy-dev,mypy-docs,mypy-providers,ts-compile-format-lint-www",
                 "run-kubernetes-tests": "false",
                 "upgrade-to-newer-dependencies": "false",
                 "skip-provider-tests": "true",
                "parallel-test-types-list-as-string": "API Always CLI Operators WWW",
+                "needs-mypy": "true",
+                "mypy-packages": "['airflow']",
             },
             id="No providers tests should run if only CLI/API/Operators/WWW file changed",
         ),
@@ -927,12 +980,13 @@ def test_expected_output_pull_request_v2_7(
                 "run-tests": "true",
                 "docs-build": "true",
                 "docs-list-as-string": ALL_DOCS_SELECTED_FOR_BUILD,
-                "skip-pre-commits": "check-provider-yaml-valid,identity,lint-helm-chart,mypy-dev,"
-                "mypy-docs,mypy-providers,ts-compile-format-lint-www",
+                "skip-pre-commits": "check-provider-yaml-valid,identity,lint-helm-chart,mypy-core,mypy-dev,mypy-docs,mypy-providers,ts-compile-format-lint-www",
                 "run-kubernetes-tests": "false",
                 "upgrade-to-newer-dependencies": "false",
                 "skip-provider-tests": "false",
                "parallel-test-types-list-as-string": ALL_CI_SELECTIVE_TEST_TYPES,
+                "needs-mypy": "true",
+                "mypy-packages": "['airflow', 'airflow/providers']",
             },
             id="Tests for all providers should run if model file changed",
         ),
@@ -948,12 +1002,13 @@ def test_expected_output_pull_request_v2_7(
                 "run-tests": "true",
                 "docs-build": "true",
                 "docs-list-as-string": ALL_DOCS_SELECTED_FOR_BUILD,
-                "skip-pre-commits": "check-provider-yaml-valid,identity,lint-helm-chart,mypy-dev,"
-                "mypy-docs,mypy-providers,ts-compile-format-lint-www",
+                "skip-pre-commits": "check-provider-yaml-valid,identity,lint-helm-chart,mypy-core,mypy-dev,mypy-docs,mypy-providers,ts-compile-format-lint-www",
                 "run-kubernetes-tests": "false",
                 "upgrade-to-newer-dependencies": "false",
                 "skip-provider-tests": "false",
                "parallel-test-types-list-as-string": ALL_CI_SELECTIVE_TEST_TYPES,
+                "needs-mypy": "true",
+                "mypy-packages": "['airflow', 'airflow/providers']",
             },
             id="Tests for all providers should run if any other than API/WWW/CLI/Operators file changed.",
         ),
@@ -990,9 +1045,11 @@ def test_expected_output_pull_request_target(
                 "run-tests": "true",
                 "docs-build": "true",
                 "docs-list-as-string": ALL_DOCS_SELECTED_FOR_BUILD,
-                "skip-pre-commits": "identity",
+                "skip-pre-commits": "identity,mypy-core,mypy-dev,mypy-docs,mypy-providers",
                 "upgrade-to-newer-dependencies": "true",
                "parallel-test-types-list-as-string": ALL_CI_SELECTIVE_TEST_TYPES,
+                "needs-mypy": "true",
+                "mypy-packages": "['airflow', 'airflow/providers', 'docs', 'dev']",
             },
             id="All tests run on push even if unimportant file changed",
         ),
@@ -1009,12 +1066,13 @@ def test_expected_output_pull_request_target(
                 "needs-helm-tests": "false",
                 "run-tests": "true",
                 "docs-build": "true",
-                "skip-pre-commits": "check-airflow-provider-compatibility,check-extra-packages-references,"
-                "check-provider-yaml-valid,identity,lint-helm-chart,mypy-providers",
+                "skip-pre-commits": "check-airflow-provider-compatibility,check-extra-packages-references,check-provider-yaml-valid,identity,lint-helm-chart,mypy-core,mypy-dev,mypy-docs,mypy-providers",
                 "docs-list-as-string": "apache-airflow docker-stack",
                 "upgrade-to-newer-dependencies": "true",
                "parallel-test-types-list-as-string": "API Always BranchExternalPython BranchPythonVenv "
                "CLI Core ExternalPython Operators Other PlainAsserts PythonVenv Serialization WWW",
+                "needs-mypy": "true",
+                "mypy-packages": "['airflow', 'docs', 'dev']",
             },
             id="All tests except Providers and Helm run on push"
             " even if unimportant file changed in non-main branch",
@@ -1032,10 +1090,12 @@ def test_expected_output_pull_request_target(
                 "needs-helm-tests": "true",
                 "run-tests": "true",
                 "docs-build": "true",
-                "skip-pre-commits": "identity",
+                "skip-pre-commits": "identity,mypy-core,mypy-dev,mypy-docs,mypy-providers",
                 "docs-list-as-string": ALL_DOCS_SELECTED_FOR_BUILD,
                 "upgrade-to-newer-dependencies": "true",
                "parallel-test-types-list-as-string": ALL_CI_SELECTIVE_TEST_TYPES,
+                "needs-mypy": "true",
+                "mypy-packages": "['airflow', 'airflow/providers', 'docs', 'dev']",
             },
             id="All tests run on push if core file changed",
         ),
@@ -1084,11 +1144,13 @@ def test_no_commit_provided_trigger_full_build_for_any_event_type(github_event):
             "needs-helm-tests": "true",
             "run-tests": "true",
             "docs-build": "true",
-            "skip-pre-commits": "identity",
+            "skip-pre-commits": "identity,mypy-core,mypy-dev,mypy-docs,mypy-providers",
             "upgrade-to-newer-dependencies": "true"
             if github_event in [GithubEvents.PUSH, GithubEvents.SCHEDULE]
             else "false",
             "parallel-test-types-list-as-string": ALL_CI_SELECTIVE_TEST_TYPES,
+            "needs-mypy": "true",
+            "mypy-packages": "['airflow', 'airflow/providers', 'docs', 'dev']",
         },
         str(stderr),
     )
@@ -1577,3 +1639,91 @@ def test_provider_compatibility_checks(labels: tuple[str, ...], expected_outputs
         default_branch="main",
     )
     assert_outputs_are_printed(expected_outputs, str(stderr))
+
+
+@pytest.mark.parametrize(
+    "files, expected_outputs, default_branch, pr_labels",
+    [
+        pytest.param(
+            ("README.md",),
+            {
+                "needs-mypy": "false",
+                "mypy-packages": "[]",
+            },
+            "main",
+            (),
+            id="No mypy checks on non-python files",
+        ),
+        pytest.param(
+            ("airflow/cli/file.py",),
+            {
+                "needs-mypy": "true",
+                "mypy-packages": "['airflow']",
+            },
+            "main",
+            (),
+            id="Airflow mypy checks on airflow regular files",
+        ),
+        pytest.param(
+            ("airflow/models/file.py",),
+            {
+                "needs-mypy": "true",
+                "mypy-packages": "['airflow', 'airflow/providers']",
+            },
+            "main",
+            (),
+            id="Airflow mypy checks on airflow files that can trigger provider tests",
+        ),
+        pytest.param(
+            ("airflow/providers/a_file.py",),
+            {
+                "needs-mypy": "true",
+                "mypy-packages": "['airflow/providers']",
+            },
+            "main",
+            (),
+            id="Airflow mypy checks on provider files",
+        ),
+        pytest.param(
+            ("docs/a_file.py",),
+            {
+                "needs-mypy": "true",
+                "mypy-packages": "['docs']",
+            },
+            "main",
+            (),
+            id="Doc checks on doc files",
+        ),
+        pytest.param(
+            ("dev/a_package/a_file.py",),
+            {
+                "needs-mypy": "true",
+                "mypy-packages": "['airflow', 'airflow/providers', 'docs', 'dev']",
+            },
+            "main",
+            (),
+            id="All mypy checks on dev files changed (full tests needed are implicit)",
+        ),
+        pytest.param(
+            ("readme.md",),
+            {
+                "needs-mypy": "true",
+                "mypy-packages": "['airflow', 'airflow/providers', 'docs', 'dev']",
+            },
+            "main",
+            ("full tests needed",),
+            id="All mypy checks on full tests needed",
+        ),
+    ],
+)
+def test_mypy_matches(
+    files: tuple[str, ...], expected_outputs: dict[str, str], default_branch: str, pr_labels: tuple[str, ...]
+):
+    stderr = SelectiveChecks(
+        files=files,
+        commit_ref="HEAD",
+        default_branch=default_branch,
+        github_event=GithubEvents.PULL_REQUEST,
+        pr_labels=pr_labels,
+    )
+    assert_outputs_are_printed(expected_outputs, str(stderr))
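The parametrized cases above encode a mapping from changed files to the `mypy-packages` output. As a minimal sketch of that mapping (illustrative only — the function name and the exact rules are simplified assumptions; the real logic lives in `SelectiveChecks`):

```python
def mypy_packages_for(files, full_tests_needed=False):
    """Hypothetical, simplified version of the file -> mypy-packages mapping
    exercised by the test cases above."""
    all_packages = ["airflow", "airflow/providers", "docs", "dev"]
    if full_tests_needed:
        # The "full tests needed" label forces every package to be checked.
        return all_packages
    packages = set()
    for file in files:
        if not file.endswith(".py"):
            continue  # non-python files never trigger mypy
        if file.startswith("airflow/providers/"):
            packages.add("airflow/providers")
        elif file.startswith("airflow/models/"):
            # model changes can affect providers, so both packages are checked
            packages.update(["airflow", "airflow/providers"])
        elif file.startswith("airflow/"):
            packages.add("airflow")
        elif file.startswith("docs/"):
            packages.add("docs")
        elif file.startswith("dev/"):
            # dev changes implicitly mean full tests are needed
            return all_packages
    return sorted(packages)
```

For example, `mypy_packages_for(["airflow/cli/file.py"])` yields `['airflow']`, matching the "Airflow mypy checks on airflow regular files" case.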
diff --git a/images/breeze/output_static-checks.svg b/images/breeze/output_static-checks.svg
index 6d16c9a61b..be61aa2829 100644
--- a/images/breeze/output_static-checks.svg
+++ b/images/breeze/output_static-checks.svg
@@ -343,18 +343,18 @@
 </text><text class="breeze-static-checks-r5" x="0" y="922.8" textLength="12.2" 
clip-path="url(#breeze-static-checks-line-37)">│</text><text 
class="breeze-static-checks-r7" x="451.4" y="922.8" textLength="988.2" 
clip-path="url(#breeze-static-checks-line-37)">doctoc&#160;|&#160;end-of-file-fixer&#160;|&#160;fix-encoding-pragma&#160;|&#160;flynt&#160;|&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;</t
 [...]
 </text><text class="breeze-static-checks-r5" x="0" y="947.2" textLength="12.2" 
clip-path="url(#breeze-static-checks-line-38)">│</text><text 
class="breeze-static-checks-r7" x="451.4" y="947.2" textLength="988.2" 
clip-path="url(#breeze-static-checks-line-38)">generate-airflow-diagrams&#160;|&#160;generate-pypi-readme&#160;|&#160;identity&#160;|&#160;insert-license&#160;|&#160;&#160;&#160;</text><text
 class="breeze-static-checks-r5" x="1451.8" y="947.2" textLength="12.2" 
clip-path="url(#bre [...]
 </text><text class="breeze-static-checks-r5" x="0" y="971.6" textLength="12.2" 
clip-path="url(#breeze-static-checks-line-39)">│</text><text 
class="breeze-static-checks-r7" x="451.4" y="971.6" textLength="988.2" 
clip-path="url(#breeze-static-checks-line-39)">lint-chart-schema&#160;|&#160;lint-css&#160;|&#160;lint-dockerfile&#160;|&#160;lint-helm-chart&#160;|&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;</text><text
 class="breeze-static-checks-r5 [...]
-</text><text class="breeze-static-checks-r5" x="0" y="996" textLength="12.2" 
clip-path="url(#breeze-static-checks-line-40)">│</text><text 
class="breeze-static-checks-r7" x="451.4" y="996" textLength="988.2" 
clip-path="url(#breeze-static-checks-line-40)">lint-json-schema&#160;|&#160;lint-markdown&#160;|&#160;lint-openapi&#160;|&#160;mixed-line-ending&#160;|&#160;mypy-core&#160;|</text><text
 class="breeze-static-checks-r5" x="1451.8" y="996" textLength="12.2" 
clip-path="url(#breeze-static- [...]
-</text><text class="breeze-static-checks-r5" x="0" y="1020.4" 
textLength="12.2" clip-path="url(#breeze-static-checks-line-41)">│</text><text 
class="breeze-static-checks-r7" x="451.4" y="1020.4" textLength="988.2" 
clip-path="url(#breeze-static-checks-line-41)">mypy-dev&#160;|&#160;mypy-docs&#160;|&#160;mypy-providers&#160;|&#160;pretty-format-json&#160;|&#160;python-no-log-warn&#160;|</text><text
 class="breeze-static-checks-r5" x="1451.8" y="1020.4" textLength="12.2" 
clip-path="url(#breez [...]
-</text><text class="breeze-static-checks-r5" x="0" y="1044.8" 
textLength="12.2" clip-path="url(#breeze-static-checks-line-42)">│</text><text 
class="breeze-static-checks-r7" x="451.4" y="1044.8" textLength="988.2" 
clip-path="url(#breeze-static-checks-line-42)">replace-bad-characters&#160;|&#160;rst-backticks&#160;|&#160;ruff&#160;|&#160;ruff-format&#160;|&#160;shellcheck&#160;|&#160;&#160;&#160;&#160;&#160;&#160;&#160;</text><text
 class="breeze-static-checks-r5" x="1451.8" y="1044.8" text [...]
-</text><text class="breeze-static-checks-r5" x="0" y="1069.2" 
textLength="12.2" clip-path="url(#breeze-static-checks-line-43)">│</text><text 
class="breeze-static-checks-r7" x="451.4" y="1069.2" textLength="988.2" 
clip-path="url(#breeze-static-checks-line-43)">trailing-whitespace&#160;|&#160;ts-compile-format-lint-www&#160;|&#160;update-black-version&#160;|&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;</text><text
 class="breeze-static-checks-r5" x="1451.8" y="1069.2" textLength="12.2" c [...]
-</text><text class="breeze-static-checks-r5" x="0" y="1093.6" 
textLength="12.2" clip-path="url(#breeze-static-checks-line-44)">│</text><text 
class="breeze-static-checks-r7" x="451.4" y="1093.6" textLength="988.2" 
clip-path="url(#breeze-static-checks-line-44)">update-breeze-cmd-output&#160;|&#160;update-breeze-readme-config-hash&#160;|&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;</text><text
 class="breeze-static-ch [...]
-</text><text class="breeze-static-checks-r5" x="0" y="1118" textLength="12.2" 
clip-path="url(#breeze-static-checks-line-45)">│</text><text 
class="breeze-static-checks-r7" x="451.4" y="1118" textLength="988.2" 
clip-path="url(#breeze-static-checks-line-45)">update-common-sql-api-stubs&#160;|&#160;update-er-diagram&#160;|&#160;update-extras&#160;|&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;</text><text
 class="breeze-static-checks-r5" x="14 [...]
-</text><text class="breeze-static-checks-r5" x="0" y="1142.4" 
textLength="12.2" clip-path="url(#breeze-static-checks-line-46)">│</text><text 
class="breeze-static-checks-r7" x="451.4" y="1142.4" textLength="988.2" 
clip-path="url(#breeze-static-checks-line-46)">update-in-the-wild-to-be-sorted&#160;|&#160;update-inlined-dockerfile-scripts&#160;|&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;</text><text
 class="breeze-static-checks-r5" x="1451.8" y="1142.4" textLengt [...]
-</text><text class="breeze-static-checks-r5" x="0" y="1166.8" 
textLength="12.2" clip-path="url(#breeze-static-checks-line-47)">│</text><text 
class="breeze-static-checks-r7" x="451.4" y="1166.8" textLength="988.2" 
clip-path="url(#breeze-static-checks-line-47)">update-installed-providers-to-be-sorted&#160;|&#160;update-local-yml-file&#160;|&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;</text><text
 class="breeze-static-checks-r5" x="1451.8"  [...]
-</text><text class="breeze-static-checks-r5" x="0" y="1191.2" 
textLength="12.2" clip-path="url(#breeze-static-checks-line-48)">│</text><text 
class="breeze-static-checks-r7" x="451.4" y="1191.2" textLength="988.2" 
clip-path="url(#breeze-static-checks-line-48)">update-migration-references&#160;|&#160;update-providers-dependencies&#160;|&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;</text><text
 class="breeze-static-ch [...]
-</text><text class="breeze-static-checks-r5" x="0" y="1215.6" 
textLength="12.2" clip-path="url(#breeze-static-checks-line-49)">│</text><text 
class="breeze-static-checks-r7" x="451.4" y="1215.6" textLength="988.2" 
clip-path="url(#breeze-static-checks-line-49)">update-spelling-wordlist-to-be-sorted&#160;|&#160;update-supported-versions&#160;|&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;</text><text
 class="breeze-static-checks-r5" x="1451.8" y="1215.6" [...]
-</text><text class="breeze-static-checks-r5" x="0" y="1240" textLength="12.2" 
clip-path="url(#breeze-static-checks-line-50)">│</text><text 
class="breeze-static-checks-r7" x="451.4" y="1240" textLength="988.2" 
clip-path="url(#breeze-static-checks-line-50)">update-vendored-in-k8s-json-schema&#160;|&#160;update-version&#160;|&#160;validate-pyproject&#160;|&#160;&#160;&#160;&#160;&#160;&#160;&#160;</text><text
 class="breeze-static-checks-r5" x="1451.8" y="1240" textLength="12.2" 
clip-path="u [...]
-</text><text class="breeze-static-checks-r5" x="0" y="1264.4" 
textLength="12.2" clip-path="url(#breeze-static-checks-line-51)">│</text><text 
class="breeze-static-checks-r7" x="451.4" y="1264.4" textLength="988.2" 
clip-path="url(#breeze-static-checks-line-51)">yamllint)&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&
 [...]
+</text><text class="breeze-static-checks-r5" x="0" y="996" textLength="12.2" 
clip-path="url(#breeze-static-checks-line-40)">│</text><text 
class="breeze-static-checks-r7" x="451.4" y="996" textLength="988.2" 
clip-path="url(#breeze-static-checks-line-40)">lint-json-schema&#160;|&#160;lint-markdown&#160;|&#160;lint-openapi&#160;|&#160;mixed-line-ending&#160;|&#160;mypy&#160;|&#160;&#160;&#160;&#160;&#160;</text><text
 class="breeze-static-checks-r5" x="1451.8" y="996" textLength="12.2" clip- 
[...]
+</text><text class="breeze-static-checks-r5" x="0" y="1020.4" 
textLength="12.2" clip-path="url(#breeze-static-checks-line-41)">│</text><text 
class="breeze-static-checks-r7" x="451.4" y="1020.4" textLength="988.2" 
clip-path="url(#breeze-static-checks-line-41)">mypy-core&#160;|&#160;mypy-dev&#160;|&#160;mypy-docs&#160;|&#160;mypy-providers&#160;|&#160;pretty-format-json&#160;|&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;</text><text
 class="breeze-static-checks-r5" x="1451.8" y="10 [...]
+</text><text class="breeze-static-checks-r5" x="0" y="1044.8" 
textLength="12.2" clip-path="url(#breeze-static-checks-line-42)">│</text><text 
class="breeze-static-checks-r7" x="451.4" y="1044.8" textLength="988.2" 
clip-path="url(#breeze-static-checks-line-42)">python-no-log-warn&#160;|&#160;replace-bad-characters&#160;|&#160;rst-backticks&#160;|&#160;ruff&#160;|&#160;ruff-format&#160;</text><text
 class="breeze-static-checks-r5" x="1451.8" y="1044.8" textLength="12.2" 
clip-path="url(#breez [...]
+</text><text class="breeze-static-checks-r5" x="0" y="1069.2" 
textLength="12.2" clip-path="url(#breeze-static-checks-line-43)">│</text><text 
class="breeze-static-checks-r7" x="451.4" y="1069.2" textLength="988.2" 
clip-path="url(#breeze-static-checks-line-43)">|&#160;shellcheck&#160;|&#160;trailing-whitespace&#160;|&#160;ts-compile-format-lint-www&#160;|&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;</text><text
 class="breeze-static-checks- [...]
+</text><text class="breeze-static-checks-r5" x="0" y="1093.6" 
textLength="12.2" clip-path="url(#breeze-static-checks-line-44)">│</text><text 
class="breeze-static-checks-r7" x="451.4" y="1093.6" textLength="988.2" 
clip-path="url(#breeze-static-checks-line-44)">update-black-version&#160;|&#160;update-breeze-cmd-output&#160;|&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;
 [...]
+</text><text class="breeze-static-checks-r5" x="0" y="1118" textLength="12.2" 
clip-path="url(#breeze-static-checks-line-45)">│</text><text 
class="breeze-static-checks-r7" x="451.4" y="1118" textLength="988.2" 
clip-path="url(#breeze-static-checks-line-45)">update-breeze-readme-config-hash&#160;|&#160;update-common-sql-api-stubs&#160;|&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;</text><text
 class="breeze-static-checks-r5" x="1451.8" [...]
+</text><text class="breeze-static-checks-r5" x="0" y="1142.4" 
textLength="12.2" clip-path="url(#breeze-static-checks-line-46)">│</text><text 
class="breeze-static-checks-r7" x="451.4" y="1142.4" textLength="988.2" 
clip-path="url(#breeze-static-checks-line-46)">update-er-diagram&#160;|&#160;update-extras&#160;|&#160;update-in-the-wild-to-be-sorted&#160;|&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;</text><text
 class="breeze-static-checks-r5" x="1451.8" y="1142.4" [...]
+</text><text class="breeze-static-checks-r5" x="0" y="1166.8" 
textLength="12.2" clip-path="url(#breeze-static-checks-line-47)">│</text><text 
class="breeze-static-checks-r7" x="451.4" y="1166.8" textLength="988.2" 
clip-path="url(#breeze-static-checks-line-47)">update-inlined-dockerfile-scripts&#160;|&#160;update-installed-providers-to-be-sorted&#160;|&#160;&#160;&#160;&#160;</text><text
 class="breeze-static-checks-r5" x="1451.8" y="1166.8" textLength="12.2" 
clip-path="url(#breeze-static-c [...]
+</text><text class="breeze-static-checks-r5" x="0" y="1191.2" 
textLength="12.2" clip-path="url(#breeze-static-checks-line-48)">│</text><text 
class="breeze-static-checks-r7" x="451.4" y="1191.2" textLength="988.2" 
clip-path="url(#breeze-static-checks-line-48)">update-local-yml-file&#160;|&#160;update-migration-references&#160;|&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#
 [...]
+</text><text class="breeze-static-checks-r5" x="0" y="1215.6" 
textLength="12.2" clip-path="url(#breeze-static-checks-line-49)">│</text><text 
class="breeze-static-checks-r7" x="451.4" y="1215.6" textLength="988.2" 
clip-path="url(#breeze-static-checks-line-49)">update-providers-dependencies&#160;|&#160;update-spelling-wordlist-to-be-sorted&#160;|&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;</text><text
 class="breeze-static-checks-r5" x="1451.8" y="1215.6" textLength="12.2" c [...]
+</text><text class="breeze-static-checks-r5" x="0" y="1240" textLength="12.2" 
clip-path="url(#breeze-static-checks-line-50)">│</text><text 
class="breeze-static-checks-r7" x="451.4" y="1240" textLength="988.2" 
clip-path="url(#breeze-static-checks-line-50)">update-supported-versions&#160;|&#160;update-vendored-in-k8s-json-schema&#160;|&#160;update-version&#160;|</text><text
 class="breeze-static-checks-r5" x="1451.8" y="1240" textLength="12.2" 
clip-path="url(#breeze-static-checks-line-50)"> [...]
+</text><text class="breeze-static-checks-r5" x="0" y="1264.4" 
textLength="12.2" clip-path="url(#breeze-static-checks-line-51)">│</text><text 
class="breeze-static-checks-r7" x="451.4" y="1264.4" textLength="988.2" 
clip-path="url(#breeze-static-checks-line-51)">validate-pyproject&#160;|&#160;yamllint)&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;
 [...]
 </text><text class="breeze-static-checks-r5" x="0" y="1288.8" 
textLength="12.2" clip-path="url(#breeze-static-checks-line-52)">│</text><text 
class="breeze-static-checks-r4" x="24.4" y="1288.8" textLength="12.2" 
clip-path="url(#breeze-static-checks-line-52)">-</text><text 
class="breeze-static-checks-r4" x="36.6" y="1288.8" textLength="61" 
clip-path="url(#breeze-static-checks-line-52)">-show</text><text 
class="breeze-static-checks-r4" x="97.6" y="1288.8" textLength="195.2" 
clip-path="url(# [...]
 </text><text class="breeze-static-checks-r5" x="0" y="1313.2" 
textLength="12.2" clip-path="url(#breeze-static-checks-line-53)">│</text><text 
class="breeze-static-checks-r4" x="24.4" y="1313.2" textLength="12.2" 
clip-path="url(#breeze-static-checks-line-53)">-</text><text 
class="breeze-static-checks-r4" x="36.6" y="1313.2" textLength="134.2" 
clip-path="url(#breeze-static-checks-line-53)">-initialize</text><text 
class="breeze-static-checks-r4" x="170.8" y="1313.2" textLength="146.4" clip-p 
[...]
 </text><text class="breeze-static-checks-r5" x="0" y="1337.6" 
textLength="12.2" clip-path="url(#breeze-static-checks-line-54)">│</text><text 
class="breeze-static-checks-r4" x="24.4" y="1337.6" textLength="12.2" 
clip-path="url(#breeze-static-checks-line-54)">-</text><text 
class="breeze-static-checks-r4" x="36.6" y="1337.6" textLength="48.8" 
clip-path="url(#breeze-static-checks-line-54)">-max</text><text 
class="breeze-static-checks-r4" x="85.4" y="1337.6" textLength="292.8" 
clip-path="url( [...]
diff --git a/images/breeze/output_static-checks.txt b/images/breeze/output_static-checks.txt
index cba6394a9c..9ae25e18c1 100644
--- a/images/breeze/output_static-checks.txt
+++ b/images/breeze/output_static-checks.txt
@@ -1 +1 @@
-cc154a8e6d64f6034782bac9898f3a05
+d5902ce9c52ac5338e019056b557da73
diff --git a/scripts/ci/pre_commit/common_precommit_utils.py b/scripts/ci/pre_commit/common_precommit_utils.py
index 76370b15ad..8926bc1823 100644
--- a/scripts/ci/pre_commit/common_precommit_utils.py
+++ b/scripts/ci/pre_commit/common_precommit_utils.py
@@ -45,12 +45,32 @@ def read_airflow_version() -> str:
     raise RuntimeError("Couldn't find __version__ in AST")
 
 
-def filter_out_providers_on_non_main_branch(files: list[str]) -> list[str]:
-    """When running build on non-main branch do not take providers into account"""
+def pre_process_files(files: list[str]) -> list[str]:
+    """Pre-process files passed to mypy.
+
+    * When running a build on a non-main branch, do not take providers into account.
+    * When running the "airflow/providers" package, add the --namespace-packages flag.
+    * When running the "airflow" package, exclude providers.
+    """
     default_branch = os.environ.get("DEFAULT_BRANCH")
     if not default_branch or default_branch == "main":
         return files
-    return [file for file in files if not file.startswith(f"airflow{os.sep}providers")]
+    result = [file for file in files if not file.startswith(f"airflow{os.sep}providers")]
+    if "airflow/providers" in files:
+        if len(files) > 1:
+            raise RuntimeError(
+                "When running `airflow/providers` package, you cannot run any other packages because only "
+                "airflow/providers package requires --namespace-packages flag to be set"
+            )
+        result.append("--namespace-packages")
+    if "airflow" in files:
+        if len(files) > 1:
+            raise RuntimeError(
+                "When running `airflow` package, you cannot run any other packages because only "
+                "airflow package requires --exclude airflow/providers/.* flag to be set"
+            )
+        result.extend(["--exclude", "airflow/providers/.*"])
+    return result
 
 
 def insert_documentation(file_path: Path, content: list[str], header: str, footer: str):
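In isolation, the per-package flag handling added to `pre_process_files` can be sketched as below (a standalone simplification under stated assumptions: the helper name `add_mypy_flags` is hypothetical, and the real function also drops providers on non-main branches based on the `DEFAULT_BRANCH` environment variable):

```python
def add_mypy_flags(files: list[str]) -> list[str]:
    """Simplified sketch of the flag handling in pre_process_files."""
    result = list(files)
    if "airflow/providers" in files:
        if len(files) > 1:
            # --namespace-packages applies only to airflow/providers,
            # so that package must be the only one in the run
            raise RuntimeError("airflow/providers must be checked on its own")
        result.append("--namespace-packages")
    if "airflow" in files:
        if len(files) > 1:
            # airflow needs providers excluded, so it must also run alone
            raise RuntimeError("airflow must be checked on its own")
        result.extend(["--exclude", "airflow/providers/.*"])
    return result
```

Keeping the two flagged packages in separate runs is what allows the other packages (`docs`, `dev`) to pass through unchanged.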
diff --git a/scripts/ci/pre_commit/pre_commit_mypy.py b/scripts/ci/pre_commit/pre_commit_mypy.py
index 61fc4baede..8f38f27ac4 100755
--- a/scripts/ci/pre_commit/pre_commit_mypy.py
+++ b/scripts/ci/pre_commit/pre_commit_mypy.py
@@ -18,6 +18,7 @@
 from __future__ import annotations
 
 import os
+import shlex
 import sys
 from pathlib import Path
 
@@ -25,15 +26,18 @@ sys.path.insert(0, str(Path(__file__).parent.resolve()))
 
 from common_precommit_utils import (
     console,
-    filter_out_providers_on_non_main_branch,
     initialize_breeze_precommit,
+    pre_process_files,
     run_command_via_breeze_shell,
 )
 
 initialize_breeze_precommit(__name__, __file__)
 
-files_to_test = filter_out_providers_on_non_main_branch(sys.argv[1:])
-if files_to_test == ["--namespace-packages"]:
+files_to_test = pre_process_files(sys.argv[1:])
+mypy_packages = os.environ.get("MYPY_PACKAGES")
+if mypy_packages:
+    files_to_test += shlex.split(mypy_packages)
+if files_to_test == ["--namespace-packages"] or files_to_test == []:
     print("No files to tests. Quitting")
     sys.exit(0)
 
@@ -51,19 +55,26 @@ res = run_command_via_breeze_shell(
         "MOUNT_SOURCES": "selected",
     },
 )
+ci_environment = os.environ.get("CI")
 if res.returncode != 0:
+    if mypy_packages and ci_environment:
+        console.print(
+            "[yellow]You are running mypy with the packages selected. If you want to"
+            " reproduce it locally, you need to run the following command:\n"
+        )
+        console.print(
+            f'MYPY_PACKAGES="{mypy_packages}" pre-commit run --hook-stage manual mypy --all-files\n'
+        )
     upgrading = os.environ.get("UPGRADE_TO_NEWER_DEPENDENCIES", "false") != "false"
     if upgrading:
         console.print(
-            "[yellow]You are running mypy with the image that has dependencies upgraded automatically."
+            "[yellow]You are running mypy with the image that has dependencies upgraded automatically.\n"
         )
     flag = " --upgrade-to-newer-dependencies" if upgrading else ""
     console.print(
-        "[yellow]If you see strange stacktraces above, "
-        f"run `breeze ci-image build --python 3.8{flag}` and try again. "
-        "You can also run `breeze down --cleanup-mypy-cache` to clean up the cache used. "
-        "Still sometimes diff heuristic in mypy is behaving abnormal, to double check you can "
-        "call `breeze static-checks --type mypy-[dev|core|providers|docs] --all-files` "
-        'and then commit via `git commit --no-verify -m "commit message"`. CI will do a full check.'
+        "[yellow]If you see strange stacktraces above, and can't reproduce it, please run"
+        " this command and try again:\n"
     )
+    console.print(f"breeze ci-image build --python 3.8{flag}\n")
+    console.print("[yellow]You can also run `breeze down --cleanup-mypy-cache` to clean up the cache used.\n")
 sys.exit(res.returncode)
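
The `MYPY_PACKAGES` handling in the script above relies on `shlex.split`, so a space-separated (and optionally shell-quoted) package list survives the round trip through an environment variable. A minimal sketch of that step, with a made-up example value for the variable:

```python
import os
import shlex

# Hypothetical value, as a user might export it before `pre-commit run`:
os.environ["MYPY_PACKAGES"] = "airflow/providers"

files_to_test: list[str] = []
mypy_packages = os.environ.get("MYPY_PACKAGES")
if mypy_packages:
    # shlex.split honours shell-style quoting, so a quoted path
    # containing spaces still arrives as a single argument.
    files_to_test += shlex.split(mypy_packages)

print(files_to_test)  # ['airflow/providers']
```

Using `shlex.split` rather than `str.split` is what makes the CI reproduction command printed on failure copy-pasteable even when a package path needs quoting.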

