ferruzzi commented on code in PR #35037:
URL: https://github.com/apache/airflow/pull/35037#discussion_r1364689097


##########
airflow/providers/amazon/aws/hooks/s3.py:
##########
@@ -912,14 +912,27 @@ def get_key(self, key: str, bucket_name: str | None = None) -> S3ResourceObject:
         :param bucket_name: the name of the bucket
         :return: the key object from the bucket
         """
+
+        def __sanitize_extra_args() -> dict[str, str]:
+            """Parse extra_args and return a dict with only the args listed in 
ALLOWED_DOWNLOAD_ARGS."""
+            return {
+                arg_name: arg_value
+                for (arg_name, arg_value) in self.extra_args.items()
+                if arg_name in S3Transfer(self.conn).ALLOWED_DOWNLOAD_ARGS
+            }
+
         s3_resource = self.get_session().resource(
             "s3",
             endpoint_url=self.conn_config.endpoint_url,
             config=self.config,
             verify=self.verify,
         )
         obj = s3_resource.Object(bucket_name, key)
-        obj.load()
+
+        # TODO inline this after debugging
+        new_args = __sanitize_extra_args()
+
+        obj.load(**new_args)

Review Comment:
   Yeah, considering how much documentation boto* has, a surprising amount is still undocumented.
   
   Pretty sure base64 was one I tried, but I went through a bunch and didn't write down all of them.
   
   
   Either way, I guess we need to sort out a way to actually try this now.
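   
   For reference, here is a minimal standalone sketch (not part of the PR) of what the new filter does, assuming boto3's `S3Transfer.ALLOWED_DOWNLOAD_ARGS` class attribute; the extra_args values are made up:
   
   ```python
   # Hypothetical check: which configured extra_args survive the download filter?
   from boto3.s3.transfer import S3Transfer
   
   extra_args = {"VersionId": "abc123", "ACL": "private"}  # example values; ACL is an upload-only arg
   kept = {k: v for k, v in extra_args.items() if k in S3Transfer.ALLOWED_DOWNLOAD_ARGS}
   dropped = set(extra_args) - set(kept)
   print(f"kept={kept}, dropped={dropped}")  # only VersionId is kept; ACL is filtered out
   ```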


