mik-laj commented on a change in pull request #18945:
URL: https://github.com/apache/airflow/pull/18945#discussion_r728417941
##########
File path: airflow/providers/google/cloud/example_dags/example_dataproc_metastore.py
##########
@@ -0,0 +1,202 @@
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements. See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership. The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License. You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied. See the License for the
+# specific language governing permissions and limitations
+# under the License.
+"""
+Example Airflow DAG that shows how to use various Dataproc Metastore
+operators to manage a service.
+"""
+
+import os
+
+from airflow import models
+from airflow.providers.google.cloud.operators.dataproc_metastore import (
+    DataprocMetastoreCreateBackupOperator,
+    DataprocMetastoreCreateMetadataImportOperator,
+    DataprocMetastoreCreateServiceOperator,
+    DataprocMetastoreDeleteBackupOperator,
+    DataprocMetastoreDeleteServiceOperator,
+    DataprocMetastoreExportMetadataOperator,
+    DataprocMetastoreGetServiceOperator,
+    DataprocMetastoreListBackupsOperator,
+    DataprocMetastoreRestoreServiceOperator,
+    DataprocMetastoreUpdateServiceOperator,
+)
+from airflow.utils.dates import days_ago
+
+PROJECT_ID = os.environ.get("GCP_PROJECT_ID", "<PROJECT_ID>")
+SERVICE_ID = os.environ.get("GCP_DATAPROC_METASTORE_SERVICE_ID", "dataproc-metastore-system-tests-service-1")
+BACKUP_ID = os.environ.get("GCP_DATAPROC_METASTORE_BACKUP_ID", "dataproc-metastore-system-tests-backup-1")
+REGION = os.environ.get("GCP_REGION", "<REGION>")
+BUCKET = os.environ.get("GCP_DATAPROC_METASTORE_BUCKET", "dataproc-metastore-system-tests")
+METADATA_IMPORT_FILE = os.environ.get("GCS_METADATA_IMPORT_FILE", None)
+METADATA_IMPORT_ID = "dataproc-metastore-system-tests-metadata-import-1"
+TIMEOUT = 1200
+GCS_URI = f"gs://{BUCKET}/{METADATA_IMPORT_FILE}"
+DB_TYPE = "MYSQL"
+DESTINATION_GCS_FOLDER = f"gs://{BUCKET}"
+
+# Service definition
+# Docs: https://cloud.google.com/dataproc-metastore/docs/reference/rest/v1/projects.locations.services#Service
+# [START how_to_cloud_dataproc_metastore_create_service]
+SERVICE = {
+    "name": "test-service",
+}
+# [END how_to_cloud_dataproc_metastore_create_service]
+
+# Update service
+# [START how_to_cloud_dataproc_metastore_update_service]
+SERVICE_TO_UPDATE = {
+    "labels": {
+        "mylocalmachine": "mylocalmachine",
+        "systemtest": "systemtest",
+    }
+}
+UPDATE_MASK = {"paths": ["labels"]}
+# [END how_to_cloud_dataproc_metastore_update_service]
+
+# Backup definition
+# [START how_to_cloud_dataproc_metastore_create_backup]
+BACKUP = {
+    "name": "test-backup",
+}
+# [END how_to_cloud_dataproc_metastore_create_backup]
+
+# Metadata import definition
+# [START how_to_cloud_dataproc_metastore_create_metadata_import]
+METADATA_IMPORT = {
+    "name": "test-metadata-import",
+    "database_dump": {
+        "gcs_uri": GCS_URI,
+        "database_type": DB_TYPE,
+    },
+}
+# [END how_to_cloud_dataproc_metastore_create_metadata_import]
+
+
+with models.DAG("example_gcp_dataproc_metastore", start_date=days_ago(1), schedule_interval="@once") as dag:
+    # [START how_to_cloud_dataproc_metastore_create_service_operator]
+    create_service = DataprocMetastoreCreateServiceOperator(
+        task_id="create_service",
+        region=REGION,
+        project_id=PROJECT_ID,
+        service=SERVICE,
+        service_id=SERVICE_ID,
+        timeout=TIMEOUT,
+    )
+    # [END how_to_cloud_dataproc_metastore_create_service_operator]
+
+    # [START how_to_cloud_dataproc_metastore_get_service_operator]
+    get_service_details = DataprocMetastoreGetServiceOperator(
+        task_id="get_service",
+        region=REGION,
+        project_id=PROJECT_ID,
+        service_id=SERVICE_ID,

Review comment:

This will work as long as the job definition lives in a separate Python file. To add examples to the documentation we use the `exampleinclude` directive, which lets us keep the docs and the example DAGs in sync:
https://github.com/apache/airflow/blob/af4a5e006e4f5c9f203afeac039b22c6adee317f/docs/apache-airflow-providers-google/operators/cloud/dataproc.rst#L54-L58

Alternatively, we can use the `code-block` directive, but then the example is neither tested nor executed on CI. Examples that do not run on CI become out-of-date very quickly and turn into a maintenance burden:
https://github.com/apache/airflow/blob/060345c0d982765e39da5fa8b2e2c6a01e89e394/docs/apache-airflow/tutorial.rst#L205-L231

For example, in [this PR](https://github.com/apache/airflow/pull/18036/files) I deprecated the default value of the `pod_name` parameter in `EKSCreateClusterOperator`, which means that loading a file that still relies on the default displays a `DeprecationWarning`. But we also [detect such problems](https://github.com/apache/airflow/pull/17900) automatically on CI, and we won't allow merging a PR that doesn't update the examples. This makes the examples much easier to maintain: we don't have to worry about whether every example has been updated to follow best practices, because we detect such problems automatically, and if something still needs to change, CI reports it to everyone. If we used the `code-block` directive, we would have to find another way to detect similar problems automatically. This is just one example; we have other automated checks that help us keep the documentation and examples at a high quality.

> Just thought it was an opportunity for some legitimate use around default_args and document it.

I agree that this is a good use for `default_args` (see the sketch at the end of this comment), but given the problems with using such examples in the documentation, we don't necessarily have to do it here.
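For concreteness, here is a minimal sketch of how the `exampleinclude` directive could pick up one of the snippets from this DAG. It assumes the file keeps the `[START ...]`/`[END ...]` markers shown in the diff above; the path and option names follow the pattern in the linked `dataproc.rst`:

```rst
.. The path is resolved from the docs root; the START/END markers select
   only the snippet between them, and :dedent: strips the indentation the
   snippet has inside the DAG's ``with`` block.

.. exampleinclude:: /../../airflow/providers/google/cloud/example_dags/example_dataproc_metastore.py
    :language: python
    :dedent: 4
    :start-after: [START how_to_cloud_dataproc_metastore_create_service_operator]
    :end-before: [END how_to_cloud_dataproc_metastore_create_service_operator]
```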
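By contrast, a `code-block` version of the same snippet would be pasted verbatim into the `.rst` file, so CI never imports or runs it; a sketch for comparison:

```rst
.. code-block:: python

    create_service = DataprocMetastoreCreateServiceOperator(
        task_id="create_service",
        region=REGION,
        project_id=PROJECT_ID,
        service=SERVICE,
        service_id=SERVICE_ID,
        timeout=TIMEOUT,
    )
```

Nothing ties this copy to the real operator, which is why such snippets drift out of date.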
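And on the `default_args` point, a minimal sketch (not part of this PR) of how the `region`/`project_id`/`timeout` arguments repeated across the tasks could be hoisted into `default_args`. It relies on standard Airflow behavior: `default_args` entries fill in any matching operator constructor parameter that a task does not set explicitly.

```python
import os

from airflow import models
from airflow.providers.google.cloud.operators.dataproc_metastore import (
    DataprocMetastoreCreateServiceOperator,
)
from airflow.utils.dates import days_ago

PROJECT_ID = os.environ.get("GCP_PROJECT_ID", "<PROJECT_ID>")
REGION = os.environ.get("GCP_REGION", "<REGION>")
SERVICE = {"name": "test-service"}
SERVICE_ID = "dataproc-metastore-system-tests-service-1"
TIMEOUT = 1200

with models.DAG(
    "example_gcp_dataproc_metastore",
    start_date=days_ago(1),
    schedule_interval="@once",
    # Every task in this DAG inherits these keyword arguments unless it
    # overrides them explicitly.
    default_args={
        "project_id": PROJECT_ID,
        "region": REGION,
        "timeout": TIMEOUT,
    },
) as dag:
    # `project_id`, `region`, and `timeout` come from `default_args`, so the
    # task only declares what is unique to it.
    create_service = DataprocMetastoreCreateServiceOperator(
        task_id="create_service",
        service=SERVICE,
        service_id=SERVICE_ID,
    )
```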