Kobe-Wang opened a new issue #10895:
URL: https://github.com/apache/airflow/issues/10895


   **Description**
   
   Could the S3ToRedshiftTransfer operator support copying an S3 file whose 
name differs from the target table name in Redshift?
   
   **Use case / motivation**
   
   I want to use the S3ToRedshiftTransfer operator to copy S3 files into 
Redshift, but when I traced the S3ToRedshiftTransfer source code I found that 
the S3 file name must match the Redshift table name: the operator appends the 
table name to the S3 key when building the COPY statement. In other words, if 
my S3 file name differs from the target table name, I cannot use the 
S3ToRedshiftTransfer operator. Could you consider making this part of the 
operator more flexible?
   
   Here is the copy_query source code from the S3ToRedshiftTransfer operator 
(link: 
https://airflow.apache.org/docs/stable/_modules/airflow/operators/s3_to_redshift_operator.html
 ):
   
   ```python
   copy_query = """
               COPY {schema}.{table}
               FROM 's3://{s3_bucket}/{s3_key}/{table}'
               with credentials
               'aws_access_key_id={access_key};aws_secret_access_key={secret_key}'
               {copy_options};
           """.format(schema=self.schema,
                      table=self.table,
                      s3_bucket=self.s3_bucket,
                      s3_key=self.s3_key,
                      access_key=credentials.access_key,
                      secret_key=credentials.secret_key,
                      copy_options=copy_options)
   ```
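   As a rough illustration of the flexibility I mean, here is a hypothetical 
sketch (not Airflow's actual API) where the full S3 object key, including the 
file name, is passed independently of the Redshift table name, so the two no 
longer have to match:

   ```python
   # Hypothetical sketch: the caller supplies the complete S3 object key
   # (e.g. "raw/2020/events.csv"); the table name is used only for the
   # COPY target, not appended to the S3 path.
   def build_copy_query(schema, table, s3_bucket, s3_key,
                        access_key, secret_key, copy_options=""):
       return """
           COPY {schema}.{table}
           FROM 's3://{s3_bucket}/{s3_key}'
           with credentials
           'aws_access_key_id={access_key};aws_secret_access_key={secret_key}'
           {copy_options};
       """.format(schema=schema,
                  table=table,
                  s3_bucket=s3_bucket,
                  s3_key=s3_key,
                  access_key=access_key,
                  secret_key=secret_key,
                  copy_options=copy_options)
   ```

   With this shape, `build_copy_query("public", "my_table", "my-bucket", 
"raw/events.csv", ...)` copies `raw/events.csv` into `public.my_table` even 
though the file name and table name differ.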
   
   **Related Issues**
   
   No


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
[email protected]

