josh-fell commented on a change in pull request #18657:
URL: https://github.com/apache/airflow/pull/18657#discussion_r720854510



##########
File path: airflow/providers/apache/druid/example_dags/example_druid_dag.py
##########
@@ -19,22 +19,19 @@
 """
 Example Airflow DAG to submit Apache Druid json index file using `DruidOperator`
 """
+from datetime import datetime
+
 from airflow.models import DAG
 from airflow.providers.apache.druid.operators.druid import DruidOperator
-from airflow.utils.dates import days_ago
 
 with DAG(
     dag_id='example_druid_operator',
     schedule_interval=None,
-    start_date=days_ago(2),
+    start_date=datetime(2021, 1, 1),
     tags=['example'],
 ) as dag:
     # [START howto_operator_druid_submit]
-    submit_job = DruidOperator(
-        task_id='spark_submit_job',
-        json_index_file='json_index.json',
-        druid_ingest_conn_id='druid_ingest_default',
-    )
+    submit_job = DruidOperator(task_id='spark_submit_job', json_index_file='json_index.json')

Review comment:
       Yes. We've been removing these default connection IDs across the example DAGs. Do you feel differently?
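
       For context, a minimal sketch of why the example still works without the explicit argument (assuming the current provider behavior, where `druid_ingest_conn_id` defaults to `druid_ingest_default` in `DruidOperator`):

       ```python
       from datetime import datetime

       from airflow.models import DAG
       from airflow.providers.apache.druid.operators.druid import DruidOperator

       with DAG(
           dag_id='example_druid_operator',
           schedule_interval=None,
           start_date=datetime(2021, 1, 1),
           tags=['example'],
       ) as dag:
           # The operator falls back to the 'druid_ingest_default' connection ID
           # when druid_ingest_conn_id is not passed, so the example only needs
           # the task_id and the ingestion spec file.
           submit_job = DruidOperator(
               task_id='spark_submit_job',
               json_index_file='json_index.json',
           )
       ```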



