ferruzzi commented on code in PR #35037: URL: https://github.com/apache/airflow/pull/35037#discussion_r1364679959
##########
airflow/providers/amazon/aws/hooks/s3.py:
##########

```diff
@@ -912,14 +912,27 @@ def get_key(self, key: str, bucket_name: str | None = None) -> S3ResourceObject:
         :param bucket_name: the name of the bucket
         :return: the key object from the bucket
         """
+
+        def __sanitize_extra_args() -> dict[str, str]:
+            """Parse extra_args and return a dict with only the args listed in ALLOWED_DOWNLOAD_ARGS."""
+            return {
+                arg_name: arg_value
+                for (arg_name, arg_value) in self.extra_args.items()
+                if arg_name in S3Transfer(self.conn).ALLOWED_DOWNLOAD_ARGS
+            }
+
         s3_resource = self.get_session().resource(
             "s3",
             endpoint_url=self.conn_config.endpoint_url,
             config=self.config,
             verify=self.verify,
         )
         obj = s3_resource.Object(bucket_name, key)
-        obj.load()
+
+        # TODO inline this after debugging
+        new_args = __sanitize_extra_args()
+
+        obj.load(**new_args)
```

Review Comment:
Hm. When I tried using it manually in a DAG, I kept getting an error that the encryption key format was invalid, but I didn't find any documentation on what the expected format actually was. I tried string, bytestring, raw string, hex, and a handful of other things I could think of. I took that to mean the argument was at least getting that far, and that someone who knew enough to want the feature would know the right key format.
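For context, the filtering step in the diff can be sketched standalone. The allowed-args list below is hardcoded so the snippet runs without boto3; it mirrors what `S3Transfer.ALLOWED_DOWNLOAD_ARGS` contains in recent boto3 versions, but the real hook reads the list from `S3Transfer(self.conn).ALLOWED_DOWNLOAD_ARGS` at runtime, so treat the literal list here as an assumption. On the key-format question: if memory serves, boto3 expects `SSECustomerKey` as the raw 32-byte AES-256 key (str or bytes) and handles the base64 encoding and `SSECustomerKeyMD5` itself, but that is not confirmed in this thread.

```python
# Standalone sketch of the __sanitize_extra_args filtering from the diff above.
# ASSUMPTION: this list mirrors boto3's S3Transfer.ALLOWED_DOWNLOAD_ARGS;
# the actual hook code queries boto3 for it rather than hardcoding it.
ALLOWED_DOWNLOAD_ARGS = [
    "ChecksumMode",
    "VersionId",
    "SSECustomerAlgorithm",
    "SSECustomerKey",
    "SSECustomerKeyMD5",
    "RequestPayer",
    "ExpectedBucketOwner",
]


def sanitize_extra_args(extra_args: dict[str, str]) -> dict[str, str]:
    """Keep only the keys that S3 download-style calls accept."""
    return {k: v for k, v in extra_args.items() if k in ALLOWED_DOWNLOAD_ARGS}


extra = {
    "SSECustomerAlgorithm": "AES256",
    "SSECustomerKey": "0" * 32,          # hypothetical 32-byte SSE-C key
    "ServerSideEncryption": "AES256",    # upload-only arg; gets filtered out
}
print(sanitize_extra_args(extra))
# → {'SSECustomerAlgorithm': 'AES256', 'SSECustomerKey': '00000000000000000000000000000000'}
```

The point of the filter is that `extra_args` on the hook is shared between upload and download paths, so upload-only parameters must be dropped before they reach `obj.load()`.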