potiuk commented on code in PR #37055:
URL: https://github.com/apache/airflow/pull/37055#discussion_r1468901595


##########
airflow/providers/amazon/aws/transfers/sql_to_s3.py:
##########
@@ -194,13 +197,32 @@ def execute(self, context: Context) -> None:
 
     def _partition_dataframe(self, df: pd.DataFrame) -> Iterable[tuple[str, pd.DataFrame]]:
         """Partition dataframe using pandas groupby() method."""
+        try:
+            import secrets
+            import string
+
+            import numpy as np
+        except ImportError:
+            pass
+        # if max_rows_per_file argument is specified, a temporary column with a random unusual name will be
+        # added to the dataframe. This column is used to dispatch the dataframe into smaller ones using groupby()
+        random_column_name = ""
+        if self.max_rows_per_file and not self.groupby_kwargs:
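Outside the quoted diff, a minimal sketch of the mechanism those new comments describe, assuming a standalone helper (`_split_by_row_count` is a hypothetical name) and that `max_rows_per_file` caps the row count per output file:

```python
import secrets
import string

import numpy as np
import pandas as pd


def _split_by_row_count(df: pd.DataFrame, max_rows_per_file: int):
    """Yield (suffix, chunk) pairs with at most max_rows_per_file rows each."""
    # Random, unlikely-to-collide name so the temporary column never shadows user data.
    random_column_name = "_split_" + "".join(
        secrets.choice(string.ascii_lowercase) for _ in range(8)
    )
    # Rows 0..max-1 get chunk 0, the next max rows get chunk 1, and so on.
    df[random_column_name] = np.arange(len(df)) // max_rows_per_file
    for chunk_id, chunk in df.groupby(random_column_name):
        yield str(chunk_id), chunk.drop(columns=[random_column_name])
```

With `max_rows_per_file=1000`, a 2500-row frame splits into chunks of 1000, 1000 and 500 rows.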

Review Comment:
   Could you please fail instead when both parameters are specified? I think it's better to fail explicitly rather than silently swallow `max_rows_per_file`.
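   A minimal sketch of the explicit check being asked for, assuming it runs up front (for example in the operator's `__init__`); the helper name and message wording are illustrative, not the PR's final code:

   ```python
   from airflow.exceptions import AirflowException


   def _validate_partition_args(max_rows_per_file: int | None, groupby_kwargs: dict | None) -> None:
       # Fail loudly instead of silently ignoring one of the two options.
       if max_rows_per_file and groupby_kwargs:
           raise AirflowException(
               "Only one of 'max_rows_per_file' and 'groupby_kwargs' may be specified."
           )
   ```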


