jj-ian commented on a change in pull request #4083: [AIRFLOW-3211] Reattach to GCP Dataproc jobs upon Airflow restart URL: https://github.com/apache/airflow/pull/4083#discussion_r246891824
##########
File path: airflow/contrib/hooks/gcp_dataproc_hook.py
##########

@@ -33,12 +33,82 @@ def __init__(self, dataproc_api, project_id, job, region='global',
         self.dataproc_api = dataproc_api
         self.project_id = project_id
         self.region = region
+
+        # Check if the job to submit is already running on the cluster.
+        # If so, don't resubmit the job.
+        try:
+            cluster_name = job['job']['placement']['clusterName']
+        except KeyError:
+            self.log.error('Job to submit is incorrectly configured.')
+            raise
+
+        jobs_on_cluster_response = dataproc_api.projects().regions().jobs().list(
+            projectId=self.project_id,
+            region=self.region,
+            clusterName=cluster_name).execute()
+
+        UUID_LENGTH = 9

Review comment:
   I'm going to go ahead with implementing the solution I proposed above, @fenglu-g; if you have any reservations, please let me know.

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org

With regards,
Apache Git Services
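For context, the reattach logic under discussion (skip resubmission when a matching job is already active on the cluster) can be sketched as a small pure helper over the `jobs().list()` response. This is a hedged illustration, not the PR's final implementation: the response shape, the set of "active" states, and the prefix-match rule on `jobId` (the diff's `UUID_LENGTH` suggests Airflow appends a UUID suffix to job ids) are all assumptions here. The helper name `find_active_job` is hypothetical.

```python
# Sketch of the "is this job already running?" check, operating on a dict
# shaped like the Dataproc projects.regions.jobs.list response.
# ASSUMPTIONS: state names, response layout, and prefix matching are
# illustrative, not taken from the PR's final code.

# States in which a job should be considered still in flight.
ACTIVE_STATES = {'PENDING', 'SETUP_DONE', 'RUNNING'}


def find_active_job(list_response, task_id_prefix):
    """Return the first active job whose jobId starts with task_id_prefix,
    or None if no such job exists in the listing."""
    for job in list_response.get('jobs', []):
        job_id = job.get('reference', {}).get('jobId', '')
        state = job.get('status', {}).get('state')
        if job_id.startswith(task_id_prefix) and state in ACTIVE_STATES:
            return job
    return None


# Example payload mimicking a jobs().list().execute() result.
response = {
    'jobs': [
        {'reference': {'jobId': 'my_task_abc123def'},
         'status': {'state': 'RUNNING'}},
        {'reference': {'jobId': 'other_task_9f8e7d6c'},
         'status': {'state': 'DONE'}},
    ],
}

print(find_active_job(response, 'my_task')['reference']['jobId'])
# prints: my_task_abc123def  (running job found -> reattach, don't resubmit)
print(find_active_job(response, 'other_task'))
# prints: None  (only a DONE job matches -> safe to submit a new one)
```

A pure helper like this keeps the network call (`jobs().list(...).execute()`) separate from the decision logic, which makes the reattach behavior easy to unit-test without mocking the Google API client.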