mik-laj commented on a change in pull request #7688: [AIRFLOW-6794] Allow AWS 
Operator RedshiftToS3Transfer To Run a Custom Query
URL: https://github.com/apache/airflow/pull/7688#discussion_r390993206
 
 

 ##########
 File path: UPDATING.md
 ##########
 @@ -1047,6 +1047,24 @@ If the DAG relies on tasks with other trigger rules (i.e. `all_done`) being skip
 
 The goal of this change is to achieve a more consistent and configurable cascading behaviour based on the `BaseBranchOperator` (see [AIRFLOW-2923](https://jira.apache.org/jira/browse/AIRFLOW-2923) and [AIRFLOW-1784](https://jira.apache.org/jira/browse/AIRFLOW-1784)).
 
+
+### `RedshiftToS3Transfer` signature changed
+
+Previous versions of the `RedshiftToS3Transfer` operator required `schema` and `table` as the first two
+positional arguments. This signature was changed in 2.0: `s3_bucket` and `s3_key` are now the first two
+positional arguments, followed by either the `schema` and `table` arguments or a single
+`custom_select_query` argument.
+The `schema` and `table` combination unloads everything from the given table into an S3 object, while
+`custom_select_query` lets you specify your own Redshift `SELECT` query to filter the data that is
+unloaded to S3.
+
+In order to use this operator:
+```python
+result = RedshiftToS3Transfer('schema', 'table', 's3_bucket', 's3_key')  # Pre-2.0 call
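+
+# A sketch of the 2.0-style calls described above, shown for illustration and
+# not part of the original diff. `custom_select_query` is the parameter this PR
+# introduces; the argument order follows the paragraph above, and `task_id` is
+# included because BaseOperator requires it.
+result = RedshiftToS3Transfer('s3_bucket', 's3_key', 'schema', 'table', task_id='unload_table')
+result = RedshiftToS3Transfer('s3_bucket', 's3_key', custom_select_query='SELECT * FROM schema.table WHERE id > 10', task_id='unload_query')
+```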
 
 Review comment:
   This code is always invalid.

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services
