korolkevich commented on code in PR #37518:
URL: https://github.com/apache/airflow/pull/37518#discussion_r1494086360


##########
airflow/providers/google/cloud/transfers/s3_to_gcs.py:
##########
@@ -269,6 +269,7 @@ def transfer_files_async(self, files: list[str], gcs_hook: GCSHook, s3_hook: S3H
         self.defer(
             trigger=CloudStorageTransferServiceCreateJobsTrigger(
                 project_id=gcs_hook.project_id,
+                gcp_conn_id=self.gcp_conn_id,

Review Comment:
   @pankajastro Thank you :)
   But I still do not understand.
   - In the current scenario, we are using both a compatible Amazon provider and the Google provider, and we hit an error from CloudStorageTransferServiceCreateJobsTrigger as it exists today.
   - Now, if we update only the Google provider, will there be any problem with a compatible Amazon provider? I don't think there will be.
   - With a new version of the Amazon provider and an old version of the Google provider, we still hit the current error in CloudStorageTransferServiceCreateJobsTrigger as it exists today.
   
   
   As far as I can see, S3ToGCSOperator depends on the Amazon provider, but not the other way around, and the problem lies inside this class.
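   To illustrate why the missing argument matters (the class and names below are hypothetical, not the actual provider code): a deferred task hands its trigger to the triggerer process via the trigger's serialize() output, so any constructor argument that is not passed in and serialized is silently replaced by the default when the trigger is rebuilt.

   ```python
   # Minimal sketch, assuming a hypothetical trigger class. The triggerer
   # reconstructs the trigger from serialize(), so gcp_conn_id must be
   # both accepted by __init__ and included in the serialized kwargs.

   class ExampleTransferJobsTrigger:
       def __init__(self, project_id: str, gcp_conn_id: str = "google_cloud_default"):
           self.project_id = project_id
           self.gcp_conn_id = gcp_conn_id

       def serialize(self):
           # If gcp_conn_id were omitted here, the rebuilt trigger would
           # fall back to the default connection id, losing the custom one.
           return (
               "example.ExampleTransferJobsTrigger",
               {"project_id": self.project_id, "gcp_conn_id": self.gcp_conn_id},
           )


   # Round-trip the trigger the way the triggerer would rebuild it.
   _, kwargs = ExampleTransferJobsTrigger(
       "my-project", gcp_conn_id="my_gcp_conn"
   ).serialize()
   rebuilt = ExampleTransferJobsTrigger(**kwargs)
   print(rebuilt.gcp_conn_id)  # the custom connection id survives the round-trip
   ```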



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org
