dazza-codes commented on a change in pull request #6811: [RFC][AIRFLOW-6245] Add custom waiters for AWS batch jobs
URL: https://github.com/apache/airflow/pull/6811#discussion_r360732567
 
 

 ##########
 File path: airflow/providers/amazon/aws/hooks/batch_client.py
 ##########
 @@ -0,0 +1,503 @@
+# -*- coding: utf-8 -*-
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+"""
+A client for AWS batch services
+
+.. seealso::
+
+    - http://boto3.readthedocs.io/en/latest/guide/configuration.html
+    - http://boto3.readthedocs.io/en/latest/reference/services/batch.html
+    - https://docs.aws.amazon.com/batch/latest/APIReference/Welcome.html
+"""
+
+from random import uniform
+from time import sleep
+from typing import Optional
+
+import botocore.client
+import botocore.exceptions
+from typing_extensions import Protocol
+
+from airflow import AirflowException, LoggingMixin
+from airflow.contrib.hooks.aws_hook import AwsHook
+
+# pylint: disable=invalid-name, unused-argument
+
+
+class BatchProtocol(Protocol):
+    """
+    A typing extension for ``boto3.client('batch') -> botocore.client.Batch``
+    """
+
+    def describe_jobs(self, jobs):
+        """Get job descriptions from AWS batch
+
+        :param jobs: a list of JobId to describe
+        :type jobs: List[str]
+
+        :return: an API response to describe jobs
+        :rtype: dict
+        """
+        ...
+
+    def get_waiter(self, waiter_name):
 
 Review comment:
   Given that `submit_job` already matches the `boto3`/`botocore` parameter names, I've decided to follow that pattern for `get_waiter(waiterName)` as well. If Airflow should stick with snake case instead, that change probably belongs in a larger change-set for this AWS package. The camel case in `AwsBatchProtocol` is acceptable only because it's a typing extension that compensates for the lack of typing in the `botocore` library; for Airflow code itself, snake case is preferred (so anything in `AwsBatchClient` uses `job_id` rather than `jobId` everywhere). Or is Airflow code supposed to bow to the improper use of camel case just because `botocore` does it? IMO, I'd prefer to drop camel case from anything that wraps `botocore` and have the wrappers manage the mapping from Airflow code to `botocore`, but that could be beyond the scope of this PR at this time.
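   
   For what it's worth, a minimal sketch of that "wrapper manages the mapping" idea is below. The class and method names here are hypothetical (not part of this PR); the point is only that the Airflow-facing surface stays snake case while the wrapper translates to the camelCase parameter names that the `boto3` batch client expects:
   
   ```python
   import boto3
   
   
   class SnakeCaseBatchClient:
       """Hypothetical thin wrapper that keeps Airflow-side code in snake case."""
   
       def __init__(self, client=None):
           # the underlying boto3/botocore client keeps its native camelCase API
           self.client = client or boto3.client("batch")
   
       def submit_job(self, job_name, job_queue, job_definition, container_overrides=None):
           # translate snake_case kwargs to the camelCase names botocore expects
           return self.client.submit_job(
               jobName=job_name,
               jobQueue=job_queue,
               jobDefinition=job_definition,
               containerOverrides=container_overrides or {},
           )
   
       def describe_jobs(self, job_ids):
           # describe_jobs takes a plain `jobs` list, so only the wrapper's
           # argument name differs from botocore here
           return self.client.describe_jobs(jobs=job_ids)
   ```
   
   The mapping then lives in exactly one place, and callers never see `jobName`/`jobQueue` at all; whether that indirection is worth it for this package is the open question above.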

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services
