troychen728 commented on a change in pull request #3658: [AIRFLOW-2524] Add Amazon SageMaker Training URL: https://github.com/apache/incubator-airflow/pull/3658#discussion_r206700100
########## File path: airflow/contrib/operators/sagemaker_create_training_job_operator.py
##########
@@ -0,0 +1,98 @@
+# -*- coding: utf-8 -*-
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+from airflow.contrib.hooks.sagemaker_hook import SageMakerHook
+from airflow.models import BaseOperator
+from airflow.utils import apply_defaults
+from airflow.exceptions import AirflowException
+
+
+class SageMakerCreateTrainingJobOperator(BaseOperator):
+
+    """
+    Initiate a SageMaker training job.
+
+    This operator returns the ARN of the model created in Amazon SageMaker.
+
+    :param training_job_config:
+        The configuration necessary to start a training job (templated)
+    :type training_job_config: dict
+    :param region_name: The AWS region_name
+    :type region_name: string
+    :param sagemaker_conn_id: The SageMaker connection ID to use.
+    :type sagemaker_conn_id: string

Review comment:
Hi Fokko, thank you so much for your review; I really appreciate the feedback. I couldn't figure out how to reply to your request directly, so I'll reply here.

The main reason I split this into an operator and a sensor is that the training job succeeds in two stages: first the job is successfully kicked off, and then the job itself finishes successfully. The operator reports the first stage; the sensor reports the second. Because the training job runs on an AWS instance rather than the instance hosting Airflow, other operators that don't depend on the model actually being created can set their upstream dependency on the operator instead of the sensor. Using a sensor also lets users set parameters such as poke_interval, which make more sense on a sensor than on an operator.

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at: us...@infra.apache.org

With regards,
Apache Git Services
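The two-stage split the reviewer describes can be sketched as a small stand-alone example. This is a minimal sketch in pure Python, not the actual SageMakerHook or operator API: the client class, field names, and status strings below are illustrative stand-ins. The "operator" stage succeeds as soon as the job is accepted by the remote service; the "sensor" stage polls until the remote job finishes.

```python
class FakeSageMakerClient:
    """Hypothetical stand-in for the hook/boto3 client: each submitted
    job reports 'InProgress' for a few polls, then 'Completed'."""

    def __init__(self):
        self._polls_left = {}

    def create_training_job(self, config):
        # Stage 1: the service accepts the job and returns immediately,
        # long before the training itself has finished.
        self._polls_left[config["TrainingJobName"]] = 3
        return {"status": 200, "job_name": config["TrainingJobName"]}

    def describe_training_job(self, job_name):
        # Stage 2: the job keeps running remotely until it is done.
        if self._polls_left[job_name] > 0:
            self._polls_left[job_name] -= 1
            return "InProgress"
        return "Completed"


def operator_execute(client, config):
    """Operator stage: kick off the job and fail fast only if the
    submission itself is rejected; do not wait for completion."""
    response = client.create_training_job(config)
    if response["status"] != 200:
        raise RuntimeError("training job submission failed")
    return response["job_name"]


def sensor_poke_until_done(client, job_name, max_pokes=10):
    """Sensor stage: poke repeatedly (in real Airflow, once per
    poke_interval) until the remote job reports completion."""
    for _ in range(max_pokes):
        if client.describe_training_job(job_name) == "Completed":
            return True
    return False
```

With this split, a downstream task that only needs the job to have been submitted can depend on the operator, while tasks that need the trained model depend on the sensor.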