kacpermuda commented on code in PR #44477:
URL: https://github.com/apache/airflow/pull/44477#discussion_r1863608223
##########
providers/src/airflow/providers/google/cloud/operators/dataproc.py:
##########

@@ -2060,6 +2066,36 @@ def on_kill(self):
         if self.job_id and self.cancel_on_kill:
             self.hook.cancel_job(job_id=self.job_id, project_id=self.project_id, region=self.region)
 
+    def _inject_openlineage_properties_into_spark_job_config(self, job: dict, context: Context) -> dict:
+        """
+        Inject OpenLineage properties into the Spark job configuration.
+
+        Note:
+            This function will modify the job configuration ONLY
+            when the automatic injection of OpenLineage properties is enabled.
+            If You are not using OpenLineage integration, you can safely ignore this function.
+            # TODO Add more information on what this function does and when it's not doing anything
+
+        Read more about this feature at: # TODO: Add link to the documentation
+
+        Args:
+            job: The original Dataproc job definition.
+            context: The Airflow context in which the job is running.
+
+        Returns:
+            The modified job configuration with OpenLineage properties injected, if applicable.
+        """
+        from airflow.providers.google.cloud.openlineage.utils import (

Review Comment:
   That makes sense. The local import was an oversight, but thinking it through again, I realize I can eliminate the entire method and directly call the utils function instead.
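   For reference, a minimal, self-contained sketch of what that refactor could look like: the private `_inject_openlineage_properties_into_spark_job_config` wrapper goes away and the operator calls a utils-level helper directly from `execute()`. The helper name, its `(job, context, inject_parent_job_info)` signature, and the `spark.openlineage.*` property shown are assumptions for illustration only, not the confirmed API of `airflow.providers.google.cloud.openlineage.utils`:

   ```python
   # Sketch of the proposed refactor: no private wrapper method, the operator
   # calls a shared utils helper directly. The helper below is a stand-in with
   # an assumed name and signature, not the real provider API.
   from __future__ import annotations

   from typing import Any


   def inject_openlineage_properties_into_spark_job(
       job: dict[str, Any], context: dict[str, Any], inject_parent_job_info: bool
   ) -> dict[str, Any]:
       """Stand-in helper: returns the job untouched unless injection is enabled."""
       if not inject_parent_job_info:
           return job
       pyspark_job = dict(job.get("pyspark_job", {}))
       properties = dict(pyspark_job.get("properties", {}))
       # Illustrative property only; the real integration derives values from the context.
       properties.setdefault("spark.openlineage.parentJobNamespace", "default")
       pyspark_job["properties"] = properties
       return {**job, "pyspark_job": pyspark_job}


   class ExampleSubmitJobOperator:
       """Minimal stand-in for the operator's execute() flow after the refactor."""

       def __init__(self, job: dict[str, Any], inject_parent_job_info: bool = False) -> None:
           self.job = job
           self.openlineage_inject_parent_job_info = inject_parent_job_info

       def execute(self, context: dict[str, Any]) -> dict[str, Any]:
           # Direct call to the shared helper replaces the removed private method.
           self.job = inject_openlineage_properties_into_spark_job(
               self.job, context, self.openlineage_inject_parent_job_info
           )
           return self.job


   if __name__ == "__main__":
       op = ExampleSubmitJobOperator(
           {"pyspark_job": {"main_python_file_uri": "gs://bucket/app.py"}},
           inject_parent_job_info=True,
       )
       print(op.execute(context={}))
   ```

   Keeping the injection logic in the utils module also keeps the operator body short and leaves the helper reusable by other Dataproc operators, which seems to be the intent behind dropping the wrapper.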