olegkachur-e commented on code in PR #52005:
URL: https://github.com/apache/airflow/pull/52005#discussion_r2231262067
##########
providers/google/src/airflow/providers/google/cloud/triggers/dataproc.py:
##########
@@ -214,6 +216,167 @@ async def run(self):
                 raise e


+class DataprocSubmitJobTrigger(DataprocBaseTrigger):
+    """DataprocSubmitJobTrigger runs on the trigger worker to perform the submit-job operation."""
+
+    def __init__(
+        self,
+        job: dict,
+        request_id: str | None = None,
+        retry: Retry | _MethodDefault = DEFAULT,
+        timeout: float | None = None,
+        metadata: Sequence[tuple[str, str]] = (),
+        **kwargs,
+    ):
+        super().__init__(**kwargs)
+        self.job = job
+        self.request_id = request_id
+        self.retry = retry
+        self.timeout = timeout
+        self.metadata = metadata
+        self.job_id = None  # Initialize job_id to None
+
+    def _normalize_retry_value(self, retry_value):
+        """
+        Normalize the retry value for serialization and API calls.
+
+        Since DEFAULT and Retry objects don't serialize well, we convert them to None.
+        """
+        if retry_value is DEFAULT or retry_value is None:
+            return None
+        # For other retry objects (such as Retry instances), also fall back to None,
+        # since they are complex objects that don't serialize well.
+        return None

Review Comment:
   The [docs](https://airflow.apache.org/docs/apache-airflow/stable/authoring-and-scheduling/deferring.html#triggering-deferral-from-task-start) mention this limitation: `trigger_kwargs: Keyword arguments to pass to the trigger_cls when it's initialized. Note that all the arguments need to be serializable by Airflow. It's the main limitation of this feature.`
   If we don't need it, then maybe just avoid passing it?

--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org
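The limitation the reviewer points at is that everything a trigger stores must round-trip through `serialize()`. A minimal sketch of the pattern (the class name, classpath string, and field names here are illustrative, not the PR's actual code): keep only JSON-friendly values in the serialized kwargs and drop the non-serializable retry object entirely, rather than carrying it and normalizing it to `None` later.

```python
import json


class SubmitJobTriggerSketch:
    """Hypothetical trigger-like class showing serializable kwargs only."""

    def __init__(self, job, request_id=None, retry=None, timeout=None, metadata=()):
        self.job = job
        self.request_id = request_id
        self.retry = retry  # may be a complex Retry object; never serialized
        self.timeout = timeout
        self.metadata = metadata

    def serialize(self):
        # Return (classpath, kwargs) as Airflow triggers do; the retry object
        # is deliberately omitted so deferral cannot fail on serialization.
        return (
            "example.triggers.SubmitJobTriggerSketch",  # placeholder classpath
            {
                "job": self.job,
                "request_id": self.request_id,
                "timeout": self.timeout,
                "metadata": list(self.metadata),
            },
        )


trigger = SubmitJobTriggerSketch(job={"placement": {}}, retry=object(), timeout=30.0)
classpath, kwargs = trigger.serialize()
json.dumps(kwargs)  # succeeds because "retry" was never included
```

This mirrors the reviewer's suggestion: if the trigger cannot meaningfully use the retry object on the trigger worker, not passing it at all is simpler than normalizing every value to `None`.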