A few comments:

   - I like the rules
   - For Google we are adding additional "google/gcp", "google/gsuite", and
   "google/marketing_platform" packages in "providers". I am not sure if
   Amazon will ever have anything outside of AWS to interact with (but I can
   imagine this might happen), so maybe from the beginning we should use
   "providers/amazon/aws"? That's more "future-proof" (see the import sketch
   after this list).
   - We have a separate discussion about backporting to 1.10, and the
   current proposal there is to move "providers" to an "airflow_integrations"
   (or similar) package to allow forward-compatible backporting to 1.10 - so
   maybe we should finish that discussion first, so that we can move
   everything straight into the "target" package.

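For illustration, here is roughly what imports would look like under the
more "future-proof" layout versus the flat one from Bas's list below (just
a sketch - the exact top-level package name still depends on the backport
discussion above):

    # hypothetical paths, for illustration only
    from airflow.providers.amazon.aws.hooks.s3 import S3Hook
    from airflow.providers.amazon.aws.operators.s3 import RedshiftToS3Operator

    # vs. the flat layout:
    from airflow.providers.aws.hooks.s3 import S3Hook
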
J.

On Wed, Oct 30, 2019 at 9:38 AM Ash Berlin-Taylor <a...@apache.org> wrote:

> +1 to all of those conventions. And although that is a lot of classes in
> airflow/providers/aws/operators/s3.py, I like it that way rather than
> split out.
>
> The only odd one is obviously `lambda_`, but we have to do something about
> that.
>
> -ash
>
> > On 30 Oct 2019, at 08:13, Bas Harenslak <basharens...@godatadriven.com>
> wrote:
> >
> > Hi,
> >
> > How about the following conventions?
> >
> >  *   No AWS (in whatever notation) in the class name, since this is
> > already in the module path (airflow.providers.aws….)
> >  *   The only exception to the above is AwsHook, which is a “base” hook
> > that most other hooks inherit from
> >  *   No “Transfer” in class names, since all transfer operators already
> have “to” in the name, e.g. RedshiftToS3Transfer becomes
> RedshiftToS3Operator
> >  *   For all class names, the naming convention is {service}{component},
> > e.g. “AthenaHook”
> >  *   For all “transfer” names, the naming convention is
> > {service}To{service}{component}, e.g. “RedshiftToS3Operator”
> >  *   Migrate all AWS-related components to /airflow/providers/aws/…
> >  *   Sidenote: lambda is a reserved Python keyword, so name the Lambda
> > service files "lambda_.py", following the Python trailing-underscore
> > convention to avoid the keyword clash (see the short illustration below)
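> >
> > A quick illustration of the keyword clash (hypothetical module paths,
> > just to show why the trailing underscore is needed):
> >
> >     # providers/aws/hooks/lambda.py could never be imported:
> >     # "import airflow.providers.aws.hooks.lambda" is a SyntaxError,
> >     # since "lambda" is a keyword
> >
> >     # providers/aws/hooks/lambda_.py imports fine:
> >     from airflow.providers.aws.hooks.lambda_ import LambdaHook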
> >
> > Applying these rules, I get the following changes (done with some regex,
> > so I might have missed a few):
> >
> > ./contrib/hooks/aws_dynamodb_hook.py:AwsDynamoDBHook ->
> ./providers/aws/hooks/dynamodb.py:DynamoDBHook
> > ./contrib/hooks/aws_firehose_hook.py:AwsFirehoseHook ->
> ./providers/aws/hooks/firehose.py:FirehoseHook
> > ./contrib/hooks/aws_glue_catalog_hook.py:AwsGlueCatalogHook ->
> ./providers/aws/hooks/glue.py:GlueHook
> > ./contrib/hooks/aws_hook.py:AwsHook ->
> ./providers/aws/hooks/base/base.py:AwsHook
> > ./contrib/hooks/aws_lambda_hook.py:AwsLambdaHook ->
> ./providers/aws/hooks/lambda_.py:LambdaHook
> > ./contrib/hooks/aws_logs_hook.py:AwsLogsHook ->
> ./providers/aws/hooks/cloudwatch.py:CloudwatchLogsHook
> > ./contrib/hooks/aws_sns_hook.py:AwsSnsHook ->
> ./providers/aws/hooks/sns.py:SnsHook
> > ./contrib/hooks/aws_sqs_hook.py:SQSHook ->
> ./providers/aws/hooks/sqs.py:SqsHook
> > ./contrib/hooks/emr_hook.py:EmrHook ->
> ./providers/aws/hooks/emr.py:EmrHook
> > ./contrib/hooks/redshift_hook.py:RedshiftHook ->
> ./providers/aws/hooks/redshift.py:RedshiftHook
> > ./contrib/hooks/sagemaker_hook.py:SageMakerHook ->
> ./providers/aws/hooks/sagemaker.py:SagemakerHook
> > ./hooks/S3_hook.py:S3Hook -> ./providers/aws/hooks/s3.py:S3Hook
> > ./providers/aws/hooks/athena.py:AWSAthenaHook ->
> ./providers/aws/hooks/athena.py:AthenaHook
> >
> > ./contrib/operators/aws_sqs_publish_operator.py:SQSPublishOperator ->
> ./providers/aws/operators/sqs.py:SQSPublishOperator
> > ./contrib/operators/dynamodb_to_s3.py:DynamoDBToS3Operator ->
> ./providers/aws/operators/s3.py:DynamoDBToS3Operator
> > ./contrib/operators/hive_to_dynamodb.py:HiveToDynamoDBTransferOperator
> -> ./providers/aws/operators/dynamodb.py:HiveToDynamoDBOperator
> >
> ./contrib/operators/imap_attachment_to_s3_operator.py:ImapAttachmentToS3Operator
> -> ./providers/aws/operators/s3.py:ImapAttachmentToS3Operator
> > ./contrib/operators/mongo_to_s3.py:MongoToS3Operator ->
> ./providers/aws/operators/s3.py:MongoToS3Operator
> > ./contrib/operators/s3_copy_object_operator.py:S3CopyObjectOperator ->
> ./providers/aws/operators/s3.py:S3CopyObjectOperator
> >
> ./contrib/operators/s3_delete_objects_operator.py:S3DeleteObjectsOperator
> -> ./providers/aws/operators/s3.py:S3DeleteObjectsOperator
> > ./contrib/operators/s3_list_operator.py:S3ListOperator ->
> ./providers/aws/operators/s3.py:S3ListOperator
> > ./contrib/operators/s3_to_gcs_operator.py:S3ToGoogleCloudStorageOperator
> -> ./providers/aws/operators/s3.py:S3ToGoogleCloudStorageOperator
> > ./contrib/operators/s3_to_sftp_operator.py:S3ToSFTPOperator ->
> ./providers/aws/operators/s3.py:S3ToSFTPOperator
> > ./contrib/operators/sftp_to_s3_operator.py:SFTPToS3Operator ->
> ./providers/aws/operators/s3.py:SFTPToS3Operator
> > ./operators/gcs_to_s3.py:GoogleCloudStorageToS3Operator ->
> ./providers/aws/operators/s3.py:GoogleCloudStorageToS3Operator
> > ./operators/google_api_to_s3_transfer.py:GoogleApiToS3Transfer ->
> ./providers/aws/operators/s3.py:GoogleApiToS3Operator
> > ./operators/redshift_to_s3_operator.py:RedshiftToS3Transfer ->
> ./providers/aws/operators/s3.py:RedshiftToS3Operator
> > ./operators/s3_file_transform_operator.py:S3FileTransformOperator ->
> ./providers/aws/operators/s3.py:S3FileTransformOperator
> > ./operators/s3_to_hive_operator.py:S3ToHiveTransfer ->
> ./providers/aws/operators/s3.py:S3ToHiveOperator
> > ./operators/s3_to_redshift_operator.py:S3ToRedshiftTransfer ->
> ./providers/aws/operators/s3.py:S3ToRedshiftOperator
> > ./providers/aws/operators/athena.py:AWSAthenaOperator ->
> ./providers/aws/operators/athena.py:AthenaOperator
> > ./providers/aws/operators/batch.py:AWSBatchOperator ->
> ./providers/aws/operators/batch.py:BatchOperator
> >
> >
> ./contrib/sensors/aws_glue_catalog_partition_sensor.py:AwsGlueCatalogPartitionSensor
> -> ./providers/aws/sensors/glue.py:GlueCatalogPartitionSensor
> >
> ./contrib/sensors/aws_redshift_cluster_sensor.py:AwsRedshiftClusterSensor
> -> ./providers/aws/sensors/redshift.py:RedshiftClusterSensor
> > ./contrib/sensors/aws_sqs_sensor.py:SQSSensor ->
> ./providers/aws/sensors/sqs.py:SQSSensor
> > ./providers/aws/sensors/athena.py:AthenaSensor ->
> ./providers/aws/sensors/athena.py:AthenaSensor
> > ./sensors/s3_key_sensor.py:S3KeySensor ->
> ./providers/aws/sensors/s3.py:S3KeySensor
> > ./sensors/s3_prefix_sensor.py:S3PrefixSensor ->
> ./providers/aws/sensors/s3.py:S3PrefixSensor
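> >
> > Sidenote on migration: if the old import paths should keep working for a
> > deprecation period, each old module could be reduced to a thin shim along
> > these lines (just a sketch, not part of the mapping above):
> >
> >     # contrib/hooks/aws_lambda_hook.py - hypothetical compatibility shim
> >     import warnings
> >
> >     from airflow.providers.aws.hooks.lambda_ import LambdaHook as AwsLambdaHook  # noqa
> >
> >     warnings.warn(
> >         "This module is deprecated. "
> >         "Please use airflow.providers.aws.hooks.lambda_ instead.",
> >         DeprecationWarning,
> >         stacklevel=2,
> >     )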
> >
> > WDYT?
> >
> > Bas
> >
> > On 30 Oct 2019, at 08:52, Tomasz Urbaszek <tomasz.urbas...@polidea.com> wrote:
> >
> > I think it's a great idea and I am willing to help.
> >
> > Bests,
> > Tomek
> >
> > On Wed, Oct 30, 2019 at 3:17 AM MinJae Kwon <mingram...@gmail.com> wrote:
> >
> > Hi all.
> >
> > While I was working on AIRFLOW-5803, I found there are inconsistencies
> > between the class names of the AWS operators/hooks.
> >
> > For example, the Athena hook is named 'AWSAthenaHook', but the DynamoDB
> > hook is named 'AwsDynamoDBHook', and the S3 hook is just 'S3Hook' (so
> > AIRFLOW-5803 is trying to rename S3Hook to AWSS3Hook).
> >
> > So we might need some conventions or guidelines for the AWS
> > integrations, like the ones we have for GCP (
> > https://docs.google.com/document/d/1_rTdJSLCt0eyrAylmmgYc3yZr-_h51fVlnvMmWqhCkY/edit?ts=5bb72dfd#
> > )
> >
> > What do you think about it?
> >
> > Thank you.
> >
> >
> >
> > --
> >
> > Tomasz Urbaszek
> > Polidea <https://www.polidea.com/> | Junior Software Engineer
> >
> > M: +48 505 628 493 <+48505628493>
> > E: tomasz.urbas...@polidea.com
> >
> > Unique Tech
> > Check out our projects! <https://www.polidea.com/our-work>
> >
>
>

-- 

Jarek Potiuk
Polidea <https://www.polidea.com/> | Principal Software Engineer

M: +48 660 796 129 <+48660796129>
