ferruzzi commented on code in PR #61153: URL: https://github.com/apache/airflow/pull/61153#discussion_r2843555171
##########
airflow-core/src/airflow/executors/workloads/callback.py:
##########
@@ -0,0 +1,158 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements. See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership. The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License. You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied. See the License for the
+# specific language governing permissions and limitations
+# under the License.
+"""Callback workload schemas for executor communication."""
+
+from __future__ import annotations
+
+from enum import Enum
+from importlib import import_module
+from pathlib import Path
+from typing import TYPE_CHECKING, Literal
+from uuid import UUID
+
+import structlog
+from pydantic import BaseModel, Field, field_validator
+
+from airflow.executors.workloads.base import BaseDagBundleWorkload, BundleInfo
+
+if TYPE_CHECKING:
+    from airflow.api_fastapi.auth.tokens import JWTGenerator
+    from airflow.models import DagRun
+    from airflow.models.callback import Callback as CallbackModel, CallbackKey
+
+log = structlog.get_logger(__name__)
+
+
+class CallbackFetchMethod(str, Enum):
+    """Methods used to fetch callback at runtime."""
+
+    # For future use once Dag Processor callbacks (on_success_callback/on_failure_callback) get moved to executors
+    DAG_ATTRIBUTE = "dag_attribute"
+
+    # For deadline callbacks since they import callbacks through the import path
+    IMPORT_PATH = "import_path"
+
+
+class CallbackDTO(BaseModel):

Review Comment:
   "Two hard things", for sure.
How strongly do you feel about this? I think "TaskInstance" by itself is confusingly overloaded (I count four different classes/objects named just TaskInstance, IIRC), and we should start differentiating them instead of relying on the reader checking the import module name to know which TI object they are dealing with. That gets compounded by using import aliases in some places and not others: we have `import TaskInstance as TI` in some places, `import TaskInstance as TIModel` in others, `import TaskInstance as TaskInstanceSDK`, etc.

We already have `Response`, `Body`, `Key`, and `State` suffixes, so adding a new one isn't a stretch. I like DTO; it's short, simple, and not uncommon outside of Airflow. We may not be using it anywhere else yet, but I don't think that should stop us from introducing it.

I looked at the `datamodels` module and it doesn't appear to be related, so naming it that might lead to even more confusion. In fact, I might argue the opposite: it makes more sense to rename `execution_api/datamodels/taskinstance.TaskInstance` to `execution_api/datamodels/taskinstance.TaskInstanceDataModel` than to reuse that name for the Pydantic data transfer object here; using it here would just add confusion.

--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]
