[airflow] branch main updated: Remove turbaszek from CODEOWNERS (#28928)

2023-01-13 Thread turbaszek
This is an automated email from the ASF dual-hosted git repository.

turbaszek pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/airflow.git


The following commit(s) were added to refs/heads/main by this push:
 new 3ec5cc1cea Remove turbaszek from CODEOWNERS (#28928)
3ec5cc1cea is described below

commit 3ec5cc1ceaa490431dc798314fd92412a7a97773
Author: Tomek Urbaszek 
AuthorDate: Fri Jan 13 19:08:33 2023 +0100

Remove turbaszek from CODEOWNERS (#28928)
---
 .github/CODEOWNERS | 3 +--
 1 file changed, 1 insertion(+), 2 deletions(-)

diff --git a/.github/CODEOWNERS b/.github/CODEOWNERS
index e4c16b38d3..c1a5b5fdbb 100644
--- a/.github/CODEOWNERS
+++ b/.github/CODEOWNERS
@@ -59,8 +59,7 @@
 /airflow/secrets  @dstandish @kaxil @potiuk @ashb
 
 # Providers
-/airflow/providers/google/ @turbaszek
-/airflow/providers/snowflake/ @turbaszek @potiuk @mik-laj
+/airflow/providers/snowflake/ @potiuk @mik-laj
 /airflow/providers/cncf/kubernetes @jedcunningham
 /airflow/providers/dbt/cloud/ @josh-fell
 /airflow/providers/tabular/ @Fokko



[airflow] branch main updated: Add doc and sample dag for GCSToS3Operator (#23730)

2022-05-16 Thread turbaszek
This is an automated email from the ASF dual-hosted git repository.

turbaszek pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/airflow.git


The following commit(s) were added to refs/heads/main by this push:
 new ca25436108 Add doc and sample dag for GCSToS3Operator (#23730)
ca25436108 is described below

commit ca2543610872ccf62ccb085c5e0b6f9b8717c1aa
Author: Vincent <97131062+vincb...@users.noreply.github.com>
AuthorDate: Mon May 16 15:16:16 2022 -0400

Add doc and sample dag for GCSToS3Operator (#23730)
---
 .../amazon/aws/example_dags/example_gcs_to_s3.py   | 41 +
 .../providers/amazon/aws/transfers/gcs_to_s3.py|  4 ++
 airflow/providers/amazon/provider.yaml |  1 +
 .../operators/transfer/gcs_to_s3.rst   | 52 ++
 4 files changed, 98 insertions(+)

diff --git a/airflow/providers/amazon/aws/example_dags/example_gcs_to_s3.py 
b/airflow/providers/amazon/aws/example_dags/example_gcs_to_s3.py
new file mode 100644
index 00..d9d04c73ff
--- /dev/null
+++ b/airflow/providers/amazon/aws/example_dags/example_gcs_to_s3.py
@@ -0,0 +1,41 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+import os
+from datetime import datetime
+
+from airflow import DAG
+from airflow.providers.amazon.aws.transfers.gcs_to_s3 import GCSToS3Operator
+
+BUCKET = os.getenv("BUCKET", "bucket")
+S3_KEY = os.getenv("S3_KEY", "s3:///")
+
+with DAG(
+dag_id="example_gcs_to_s3",
+schedule_interval=None,
+start_date=datetime(2021, 1, 1),
+tags=["example"],
+catchup=False,
+) as dag:
+# [START howto_transfer_gcs_to_s3]
+gcs_to_s3 = GCSToS3Operator(
+task_id="gcs_to_s3",
+bucket=BUCKET,
+dest_s3_key=S3_KEY,
+replace=True,
+)
+# [END howto_transfer_gcs_to_s3]
diff --git a/airflow/providers/amazon/aws/transfers/gcs_to_s3.py 
b/airflow/providers/amazon/aws/transfers/gcs_to_s3.py
index 3dc8aa0b87..b521ce5360 100644
--- a/airflow/providers/amazon/aws/transfers/gcs_to_s3.py
+++ b/airflow/providers/amazon/aws/transfers/gcs_to_s3.py
@@ -32,6 +32,10 @@ class GCSToS3Operator(BaseOperator):
 """
 Synchronizes a Google Cloud Storage bucket with an S3 bucket.
 
+.. seealso::
+For more information on how to use this operator, take a look at the 
guide:
+:ref:`howto/operator:GCSToS3Operator`
+
 :param bucket: The Google Cloud Storage bucket to find the objects. 
(templated)
 :param prefix: Prefix string which filters objects whose name begin with
 this prefix. (templated)
diff --git a/airflow/providers/amazon/provider.yaml 
b/airflow/providers/amazon/provider.yaml
index 6cd923d442..413b6dcfed 100644
--- a/airflow/providers/amazon/provider.yaml
+++ b/airflow/providers/amazon/provider.yaml
@@ -482,6 +482,7 @@ transfers:
 python-module: airflow.providers.amazon.aws.transfers.dynamodb_to_s3
   - source-integration-name: Google Cloud Storage (GCS)
 target-integration-name: Amazon Simple Storage Service (S3)
+how-to-guide: 
/docs/apache-airflow-providers-amazon/operators/transfer/gcs_to_s3.rst
 python-module: airflow.providers.amazon.aws.transfers.gcs_to_s3
   - source-integration-name: Amazon Glacier
 target-integration-name: Google Cloud Storage (GCS)
diff --git 
a/docs/apache-airflow-providers-amazon/operators/transfer/gcs_to_s3.rst 
b/docs/apache-airflow-providers-amazon/operators/transfer/gcs_to_s3.rst
new file mode 100644
index 00..f19d005e94
--- /dev/null
+++ b/docs/apache-airflow-providers-amazon/operators/transfer/gcs_to_s3.rst
@@ -0,0 +1,52 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+
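
The guide and sample DAG in this commit cover the default case of syncing a whole bucket. As a complement, a hedged sketch of narrowing the copy with the prefix filter mentioned in the operator docstring above (the delimiter argument and all names/values here are illustrative assumptions, not taken from this commit):

    # Illustrative only: copy just the CSV objects under "logs/2021/",
    # keeping any S3 keys that already exist (replace=False).
    sync_logs = GCSToS3Operator(
        task_id="gcs_to_s3_logs",
        bucket=BUCKET,
        prefix="logs/2021/",
        delimiter=".csv",
        dest_s3_key=S3_KEY,
        replace=False,
    )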

[airflow] branch main updated (9e25bc211f -> 3bf9a1df38)

2022-05-16 Thread turbaszek
This is an automated email from the ASF dual-hosted git repository.

turbaszek pushed a change to branch main
in repository https://gitbox.apache.org/repos/asf/airflow.git


from 9e25bc211f Handle invalid date parsing in webserver views. (#23161)
 add 3bf9a1df38 Add fields to CLOUD_SQL_EXPORT_VALIDATION. (#23724)

No new revisions were added by this update.

Summary of changes:
 airflow/providers/google/cloud/operators/cloud_sql.py | 19 ++-
 .../google/cloud/operators/test_cloud_sql.py  | 16 ++--
 2 files changed, 32 insertions(+), 3 deletions(-)



[airflow] branch main updated: Add Dataplex operators (#20377)

2022-03-14 Thread turbaszek
This is an automated email from the ASF dual-hosted git repository.

turbaszek pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/airflow.git


The following commit(s) were added to refs/heads/main by this push:
 new 87c1246  Add Dataplex operators (#20377)
87c1246 is described below

commit 87c1246b79769f20214a339aadc6a8270d453953
Author: Wojciech Januszek 
AuthorDate: Mon Mar 14 19:07:49 2022 +

Add Dataplex operators (#20377)
---
 .../google/cloud/example_dags/example_dataplex.py  | 122 ++
 airflow/providers/google/cloud/hooks/dataplex.py   | 247 
 airflow/providers/google/cloud/links/dataplex.py   |  76 
 .../providers/google/cloud/operators/dataplex.py   | 428 +
 airflow/providers/google/cloud/sensors/dataplex.py | 119 ++
 airflow/providers/google/provider.yaml |  16 +
 .../operators/cloud/dataplex.rst   | 105 +
 docs/spelling_wordlist.txt |   3 +
 setup.py   |   1 +
 .../providers/google/cloud/hooks/test_dataplex.py  | 120 ++
 .../google/cloud/operators/test_dataplex.py| 179 +
 .../google/cloud/operators/test_dataplex_system.py |  45 +++
 .../google/cloud/sensors/test_dataplex.py  | 103 +
 13 files changed, 1564 insertions(+)

diff --git a/airflow/providers/google/cloud/example_dags/example_dataplex.py 
b/airflow/providers/google/cloud/example_dags/example_dataplex.py
new file mode 100644
index 000..aabe17a
--- /dev/null
+++ b/airflow/providers/google/cloud/example_dags/example_dataplex.py
@@ -0,0 +1,122 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+"""
+Example Airflow DAG that shows how to use Dataplex.
+"""
+
+import datetime
+import os
+
+from airflow import models
+from airflow.models.baseoperator import chain
+from airflow.providers.google.cloud.operators.dataplex import (
+DataplexCreateTaskOperator,
+DataplexDeleteTaskOperator,
+DataplexGetTaskOperator,
+DataplexListTasksOperator,
+)
+from airflow.providers.google.cloud.sensors.dataplex import 
DataplexTaskStateSensor
+
+PROJECT_ID = os.environ.get("GCP_PROJECT_ID", "INVALID PROJECT ID")
+REGION = os.environ.get("GCP_REGION", "INVALID REGION")
+LAKE_ID = os.environ.get("GCP_LAKE_ID", "INVALID LAKE ID")
+SERVICE_ACC = os.environ.get("GCP_DATAPLEX_SERVICE_ACC", 
"x...@developer.gserviceaccount.com")
+BUCKET = os.environ.get("GCP_DATAPLEX_BUCKET", "INVALID BUCKET NAME")
+SPARK_FILE_NAME = os.environ.get("SPARK_FILE_NAME", "INVALID FILE NAME")
+SPARK_FILE_FULL_PATH = f"gs://{BUCKET}/{SPARK_FILE_NAME}"
+DATAPLEX_TASK_ID = "task001"
+TRIGGER_SPEC_TYPE = "ON_DEMAND"
+
+# [START howto_dataplex_configuration]
+EXAMPLE_TASK_BODY = {
+"trigger_spec": {"type_": TRIGGER_SPEC_TYPE},
+"execution_spec": {"service_account": SERVICE_ACC},
+"spark": {"python_script_file": SPARK_FILE_FULL_PATH},
+}
+# [END howto_dataplex_configuration]
+
+with models.DAG(
+"example_dataplex", start_date=datetime.datetime(2021, 1, 1), 
schedule_interval="@once", catchup=False
+) as dag:
+# [START howto_dataplex_create_task_operator]
+create_dataplex_task = DataplexCreateTaskOperator(
+project_id=PROJECT_ID,
+region=REGION,
+lake_id=LAKE_ID,
+body=EXAMPLE_TASK_BODY,
+dataplex_task_id=DATAPLEX_TASK_ID,
+task_id="create_dataplex_task",
+)
+# [END howto_dataplex_create_task_operator]
+
+# [START howto_dataplex_async_create_task_operator]
+create_dataplex_task_async = DataplexCreateTaskOperator(
+project_id=PROJECT_ID,
+region=REGION,
+lake_id=LAKE_ID,
+body=EXAMPLE_TASK_BODY,
+dataplex_task_id=DATAPLEX_TASK_ID,
+asynchronous=True,
+task_id="create_dataplex_task_async",
+)
+# [END howto_dataplex_async_create_task_operator]
+
+# [START howto_dataplex
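
The example DAG excerpt above is cut short by the archive. As a complement, a hedged sketch of wiring in the DataplexTaskStateSensor imported at the top of the file (the argument names are assumed to mirror the create operator's identifiers and are not taken from this excerpt):

    # Illustrative only: wait for the Dataplex task created above to reach a terminal state.
    dataplex_task_state = DataplexTaskStateSensor(
        task_id="dataplex_task_state",
        project_id=PROJECT_ID,
        region=REGION,
        lake_id=LAKE_ID,
        dataplex_task_id=DATAPLEX_TASK_ID,
    )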

[airflow] branch main updated (c75774d -> c8d64c9)

2022-02-26 Thread turbaszek
This is an automated email from the ASF dual-hosted git repository.

turbaszek pushed a change to branch main
in repository https://gitbox.apache.org/repos/asf/airflow.git.


from c75774d  Add `db clean` CLI command for purging old data (#20838)
 add c8d64c9  Add Github integration steps for committers (#21834)

No new revisions were added by this update.

Summary of changes:
 COMMITTERS.rst | 11 +++
 1 file changed, 11 insertions(+)


[airflow] branch main updated: Add Dataproc assets/links (#21756)

2022-02-23 Thread turbaszek
This is an automated email from the ASF dual-hosted git repository.

turbaszek pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/airflow.git


The following commit(s) were added to refs/heads/main by this push:
 new 3b4c26e  Add Dataproc assets/links (#21756)
3b4c26e is described below

commit 3b4c26eb3a1c8d4938be80ab7fa0711561e91f8f
Author: Wojciech Januszek 
AuthorDate: Wed Feb 23 10:59:48 2022 +

Add Dataproc assets/links (#21756)

Co-authored-by: Wojciech Januszek 
---
 airflow/providers/google/cloud/links/dataproc.py   | 112 
 .../providers/google/cloud/operators/dataproc.py   | 143 -
 airflow/providers/google/provider.yaml |   4 +-
 .../google/cloud/operators/test_dataproc.py| 126 +-
 4 files changed, 227 insertions(+), 158 deletions(-)

diff --git a/airflow/providers/google/cloud/links/dataproc.py 
b/airflow/providers/google/cloud/links/dataproc.py
new file mode 100644
index 000..ffa286f
--- /dev/null
+++ b/airflow/providers/google/cloud/links/dataproc.py
@@ -0,0 +1,112 @@
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+"""This module contains Google Dataproc links."""
+
+from datetime import datetime
+from typing import TYPE_CHECKING
+
+from airflow.models import BaseOperator, BaseOperatorLink, XCom
+
+if TYPE_CHECKING:
+from airflow.utils.context import Context
+
+DATAPROC_BASE_LINK = "https://console.cloud.google.com/dataproc;
+DATAPROC_JOB_LOG_LINK = DATAPROC_BASE_LINK + 
"/jobs/{resource}?region={region}={project_id}"
+DATAPROC_CLUSTER_LINK = (
+DATAPROC_BASE_LINK + 
"/clusters/{resource}/monitoring?region={region}={project_id}"
+)
+DATAPROC_WORKFLOW_TEMPLATE_LINK = (
+DATAPROC_BASE_LINK + 
"/workflows/templates/{region}/{resource}?project={project_id}"
+)
+DATAPROC_WORKFLOW_LINK = DATAPROC_BASE_LINK + 
"/workflows/instances/{region}/{resource}?project={project_id}"
+DATAPROC_BATCH_LINK = DATAPROC_BASE_LINK + 
"/batches/{region}/{resource}/monitoring?project={project_id}"
+DATAPROC_BATCHES_LINK = DATAPROC_BASE_LINK + "/batches?project={project_id}"
+
+
+class DataprocLink(BaseOperatorLink):
+"""Helper class for constructing Dataproc resource link"""
+
+name = "Dataproc resource"
+key = "conf"
+
+@staticmethod
+def persist(
+context: "Context",
+task_instance,
+url: str,
+resource: str,
+):
+task_instance.xcom_push(
+context=context,
+key=DataprocLink.key,
+value={
+"region": task_instance.region,
+"project_id": task_instance.project_id,
+"url": url,
+"resource": resource,
+},
+)
+
+def get_link(self, operator: BaseOperator, dttm: datetime):
+conf = XCom.get_one(
+key=DataprocLink.key, dag_id=operator.dag.dag_id, 
task_id=operator.task_id, execution_date=dttm
+)
+return (
+conf["url"].format(
+region=conf["region"], project_id=conf["project_id"], 
resource=conf["resource"]
+)
+if conf
+else ""
+)
+
+
+class DataprocListLink(BaseOperatorLink):
+"""Helper class for constructing list of Dataproc resources link"""
+
+name = "Dataproc resources"
+key = "list_conf"
+
+@staticmethod
+def persist(
+context: "Context",
+task_instance,
+url: str,
+):
+task_instance.xcom_push(
+context=context,
+key=DataprocListLink.key,
+value={
+"project_id": task_instance.project_id,
+"url": url,
+},
+)
+
+def get_link(self, operator: BaseOperator, dttm: datetime):
+list_conf = XCom.get_one(
+key=DataprocListLink.key,
+dag_id=operato
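
The diff above shows DataprocLink persisting the region, project, URL template and resource into XCom so that get_link can rebuild the console URL later. A hedged sketch of how an operator could call the helper from its execute method (the operator class and its region/project_id attributes are illustrative; only the persist signature and DATAPROC_JOB_LOG_LINK come from the diff):

    # Illustrative only: an operator exposing a "Dataproc resource" extra link.
    from airflow.models import BaseOperator
    from airflow.providers.google.cloud.links.dataproc import DATAPROC_JOB_LOG_LINK, DataprocLink

    class SubmitSomethingOperator(BaseOperator):
        operator_extra_links = (DataprocLink(),)

        def __init__(self, *, region, project_id, **kwargs):
            super().__init__(**kwargs)
            self.region = region          # read back by DataprocLink.persist
            self.project_id = project_id  # read back by DataprocLink.persist

        def execute(self, context):
            job_id = "job-123"  # placeholder for the submitted job's id
            DataprocLink.persist(context=context, task_instance=self, url=DATAPROC_JOB_LOG_LINK, resource=job_id)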

[airflow] branch main updated (da485da -> 91014f0)

2022-02-13 Thread turbaszek
This is an automated email from the ASF dual-hosted git repository.

turbaszek pushed a change to branch main
in repository https://gitbox.apache.org/repos/asf/airflow.git.


from da485da  Add support for BeamGoPipelineOperator (#20386)
 add 91014f0  Add str to task_ids typing in BaseOperator.xcom_pull (#21541)

No new revisions were added by this update.

Summary of changes:
 airflow/models/baseoperator.py | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)
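
The typing change above only widens the accepted type of task_ids; a short reminder of the two call shapes it documents (task ids here are illustrative):

    # Inside a task, via the task instance from the context:
    ti = context["ti"]
    one_value = ti.xcom_pull(task_ids="extract")              # a single task id -> a single value
    many_values = ti.xcom_pull(task_ids=["extract", "load"])  # a list of task ids -> a list of values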


[airflow] branch main updated (9815e12 -> 58452f9)

2022-01-03 Thread turbaszek
This is an automated email from the ASF dual-hosted git repository.

turbaszek pushed a change to branch main
in repository https://gitbox.apache.org/repos/asf/airflow.git.


from 9815e12  Update manage-dags-files.rst to fix some inconsistencies 
(#20630)
 add 58452f9  Add hook for integrating with Google Calendar (#20542)

No new revisions were added by this update.

Summary of changes:
 airflow/providers/google/provider.yaml |   7 +
 airflow/providers/google/suite/hooks/calendar.py   | 239 +
 docs/integration-logos/gcp/Google-Calendar.png | Bin 0 -> 21046 bytes
 .../providers/google/suite/hooks/test_calendar.py  | 100 +
 4 files changed, 346 insertions(+)
 create mode 100644 airflow/providers/google/suite/hooks/calendar.py
 create mode 100644 docs/integration-logos/gcp/Google-Calendar.png
 create mode 100644 tests/providers/google/suite/hooks/test_calendar.py


[airflow-site] branch main updated: Correcting URL display name for Brent's GitHub handle (#478)

2021-09-08 Thread turbaszek
This is an automated email from the ASF dual-hosted git repository.

turbaszek pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/airflow-site.git


The following commit(s) were added to refs/heads/main by this push:
 new 7bc4f88  Correcting URL display name for Brent's GitHub handle (#478)
7bc4f88 is described below

commit 7bc4f885a74ac2a11c072974186cdd2ce21cac37
Author: Josh Fell <48934154+josh-f...@users.noreply.github.com>
AuthorDate: Wed Sep 8 04:09:51 2021 -0400

Correcting URL display name for Brent's GitHub handle (#478)
---
 landing-pages/site/content/en/announcements/_index.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/landing-pages/site/content/en/announcements/_index.md 
b/landing-pages/site/content/en/announcements/_index.md
index ff55247..726b9e0 100644
--- a/landing-pages/site/content/en/announcements/_index.md
+++ b/landing-pages/site/content/en/announcements/_index.md
@@ -16,7 +16,7 @@ menu:
 
 # August 27, 2021
 
-Airflow PMC welcomes **Brent Bovenzi** 
([@aneesh-joseph](https://github.com/bbovenzi)) as the newest Airflow 
Committer. 
+Airflow PMC welcomes **Brent Bovenzi** 
([@bbovenzi](https://github.com/bbovenzi)) as the newest Airflow Committer. 
 
 # August 23, 2021
 


[airflow] branch main updated (d83e5f2 -> c3b8212)

2021-07-10 Thread turbaszek
This is an automated email from the ASF dual-hosted git repository.

turbaszek pushed a change to branch main
in repository https://gitbox.apache.org/repos/asf/airflow.git.


from d83e5f2  Update changelog with Python 3.9 support.
 add c3b8212  Added template_fields_renderers for MySQL Operator (#16914)

No new revisions were added by this update.

Summary of changes:
 airflow/providers/mysql/operators/mysql.py | 5 +++--
 1 file changed, 3 insertions(+), 2 deletions(-)
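
template_fields_renderers only tells the web UI how to syntax-highlight rendered template fields. A hedged sketch of the general pattern on a custom operator (the exact renderer keys chosen for MySqlOperator live in the commit and are not reproduced here):

    # Illustrative only: render the templated "sql" field with SQL highlighting in the UI.
    from airflow.models import BaseOperator

    class MyMySqlLikeOperator(BaseOperator):
        template_fields = ("sql",)
        template_fields_renderers = {"sql": "sql"}

        def __init__(self, *, sql: str, **kwargs):
            super().__init__(**kwargs)
            self.sql = sql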


[airflow] branch master updated (85b2ccb -> aa4713e)

2021-05-21 Thread turbaszek
This is an automated email from the ASF dual-hosted git repository.

turbaszek pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/airflow.git.


from 85b2ccb  Add `KubernetesPodOperator` `pod-template-file` jinja template support (#15942)
 add aa4713e  Use api version only in GoogleAdsHook not operators (#15266)

No new revisions were added by this update.

Summary of changes:
 airflow/providers/google/ads/hooks/ads.py   | 11 +++
 airflow/providers/google/ads/operators/ads.py   | 10 +-
 airflow/providers/google/ads/transfers/ads_to_gcs.py| 10 +-
 tests/providers/google/ads/operators/test_ads.py|  8 +++-
 tests/providers/google/ads/transfers/test_ads_to_gcs.py |  8 +++-
 5 files changed, 39 insertions(+), 8 deletions(-)


[airflow] branch master updated (74c1ce0 -> 1543bb7)

2021-05-11 Thread turbaszek
This is an automated email from the ASF dual-hosted git repository.

turbaszek pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/airflow.git.


from 74c1ce0  Add resolution to force dependencies to use patched version 
of lodash (#15777)
 add 1543bb7  Fixed type annotations in DAG decorator (#15778)

No new revisions were added by this update.

Summary of changes:
 airflow/models/dag.py | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)


[airflow] branch master updated (be8d2b1 -> a3b0a27)

2021-04-24 Thread turbaszek
This is an automated email from the ASF dual-hosted git repository.

turbaszek pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/airflow.git.


from be8d2b1  Use pull_request.user, not actor to determine PR user (#15504)
 add a3b0a27  Add code style note: no asserts (#15512)

No new revisions were added by this update.

Summary of changes:
 CONTRIBUTING.rst | 19 +++
 1 file changed, 19 insertions(+)


[airflow] branch master updated (af2d11e -> 95ae24a)

2021-04-03 Thread turbaszek
This is an automated email from the ASF dual-hosted git repository.

turbaszek pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/airflow.git.


from af2d11e  Restore base lineage backend (#14146)
 add 95ae24a  Increase timeout for building the docs (#15157)

No new revisions were added by this update.

Summary of changes:
 docs/exts/docs_build/code_utils.py | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)


[airflow] branch master updated (9ac1d0a -> af2d11e)

2021-04-03 Thread turbaszek
This is an automated email from the ASF dual-hosted git repository.

turbaszek pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/airflow.git.


from 9ac1d0a  Close issues that are pending response from the issue author 
(#15170)
 add af2d11e  Restore base lineage backend (#14146)

No new revisions were added by this update.

Summary of changes:
 airflow/lineage/__init__.py | 22 ++
 airflow/lineage/backend.py  | 47 +++
 docs/apache-airflow/lineage.rst | 21 ++
 tests/lineage/test_lineage.py   | 49 -
 4 files changed, 138 insertions(+), 1 deletion(-)
 create mode 100644 airflow/lineage/backend.py


[airflow] branch master updated: Override project in dataprocSubmitJobOperator (#14981)

2021-03-28 Thread turbaszek
This is an automated email from the ASF dual-hosted git repository.

turbaszek pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/airflow.git


The following commit(s) were added to refs/heads/master by this push:
 new 099c490  Override project in dataprocSubmitJobOperator (#14981)
099c490 is described below

commit 099c490cffae9556e56e141addcb41e9676e0d8f
Author: Sam Wheating 
AuthorDate: Sun Mar 28 10:53:06 2021 -0400

Override project in dataprocSubmitJobOperator (#14981)
---
 .../providers/google/cloud/operators/dataproc.py   |  7 +++--
 .../google/cloud/operators/test_dataproc.py| 32 ++
 2 files changed, 37 insertions(+), 2 deletions(-)

diff --git a/airflow/providers/google/cloud/operators/dataproc.py 
b/airflow/providers/google/cloud/operators/dataproc.py
index bcfb2c7..d578565 100644
--- a/airflow/providers/google/cloud/operators/dataproc.py
+++ b/airflow/providers/google/cloud/operators/dataproc.py
@@ -858,6 +858,9 @@ class DataprocJobBaseOperator(BaseOperator):
 :type job_name: str
 :param cluster_name: The name of the DataProc cluster.
 :type cluster_name: str
+:param project_id: The ID of the Google Cloud project the cluster belongs 
to,
+if not specified the project will be inferred from the provided GCP 
connection.
+:type project_id: str
 :param dataproc_properties: Map for the Hive properties. Ideal to put in
 default arguments (templated)
 :type dataproc_properties: dict
@@ -912,6 +915,7 @@ class DataprocJobBaseOperator(BaseOperator):
 *,
 job_name: str = '{{task.task_id}}_{{ds_nodash}}',
 cluster_name: str = "cluster-1",
+project_id: Optional[str] = None,
 dataproc_properties: Optional[Dict] = None,
 dataproc_jars: Optional[List[str]] = None,
 gcp_conn_id: str = 'google_cloud_default',
@@ -943,9 +947,8 @@ class DataprocJobBaseOperator(BaseOperator):
 
 self.job_error_states = job_error_states if job_error_states is not 
None else {'ERROR'}
 self.impersonation_chain = impersonation_chain
-
 self.hook = DataprocHook(gcp_conn_id=gcp_conn_id, 
impersonation_chain=impersonation_chain)
-self.project_id = self.hook.project_id
+self.project_id = self.hook.project_id if project_id is None else 
project_id
 self.job_template = None
 self.job = None
 self.dataproc_job_id = None
diff --git a/tests/providers/google/cloud/operators/test_dataproc.py 
b/tests/providers/google/cloud/operators/test_dataproc.py
index e1c712e..e66acb4 100644
--- a/tests/providers/google/cloud/operators/test_dataproc.py
+++ b/tests/providers/google/cloud/operators/test_dataproc.py
@@ -781,6 +781,12 @@ class TestDataProcSparkSqlOperator(unittest.TestCase):
 "labels": {"airflow-version": AIRFLOW_VERSION},
 "spark_sql_job": {"query_list": {"queries": [query]}, 
"script_variables": variables},
 }
+other_project_job = {
+"reference": {"project_id": "other-project", "job_id": 
"{{task.task_id}}_{{ds_nodash}}_" + job_id},
+"placement": {"cluster_name": "cluster-1"},
+"labels": {"airflow-version": AIRFLOW_VERSION},
+"spark_sql_job": {"query_list": {"queries": [query]}, 
"script_variables": variables},
+}
 
 @mock.patch(DATAPROC_PATH.format("DataprocHook"))
 def test_deprecation_warning(self, mock_hook):
@@ -815,6 +821,32 @@ class TestDataProcSparkSqlOperator(unittest.TestCase):
 
 @mock.patch(DATAPROC_PATH.format("uuid.uuid4"))
 @mock.patch(DATAPROC_PATH.format("DataprocHook"))
+def test_execute_override_project_id(self, mock_hook, mock_uuid):
+mock_uuid.return_value = self.job_id
+mock_hook.return_value.project_id = GCP_PROJECT
+mock_hook.return_value.wait_for_job.return_value = None
+mock_hook.return_value.submit_job.return_value.reference.job_id = 
self.job_id
+
+op = DataprocSubmitSparkSqlJobOperator(
+project_id="other-project",
+task_id=TASK_ID,
+region=GCP_LOCATION,
+gcp_conn_id=GCP_CONN_ID,
+query=self.query,
+variables=self.variables,
+impersonation_chain=IMPERSONATION_CHAIN,
+)
+op.execute(context={})
+mock_hook.assert_called_once_with(gcp_conn_id=GCP_CONN_ID, 
impersonation_chain=IMPERSONATION_CHAIN)
+mock_hook.return_value.submit_job.assert_called_once_with(
+project_id="other-project", job=self.other_project_job, 
location=GCP_LOCATION
+)
+mock_hook.return_value.wait_for_job.assert_called_once_with(
+job_id=self.job_id, location=GCP_LOCATI

[airflow] branch master updated: Google Dataflow Hook to handle no Job Type (#14914)

2021-03-23 Thread turbaszek
This is an automated email from the ASF dual-hosted git repository.

turbaszek pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/airflow.git


The following commit(s) were added to refs/heads/master by this push:
 new a7e144b  Google Dataflow Hook to handle no Job Type (#14914)
a7e144b is described below

commit a7e144bec855f6ccf0fa5ae8447894195ffe170f
Author: Tobiasz Kędzierski 
AuthorDate: Tue Mar 23 19:48:42 2021 +0100

Google Dataflow Hook to handle no Job Type (#14914)


Co-authored-by: Tomek Urbaszek 
---
 airflow/providers/google/cloud/hooks/dataflow.py   |  2 +-
 .../providers/google/cloud/hooks/test_dataflow.py  | 28 ++
 2 files changed, 29 insertions(+), 1 deletion(-)

diff --git a/airflow/providers/google/cloud/hooks/dataflow.py 
b/airflow/providers/google/cloud/hooks/dataflow.py
index f0986e6..7c53507 100644
--- a/airflow/providers/google/cloud/hooks/dataflow.py
+++ b/airflow/providers/google/cloud/hooks/dataflow.py
@@ -404,7 +404,7 @@ class _DataflowJobsController(LoggingMixin):
 :raise: Exception
 """
 if self._wait_until_finished is None:
-wait_for_running = job['type'] == DataflowJobType.JOB_TYPE_STREAMING
+wait_for_running = job.get('type') == DataflowJobType.JOB_TYPE_STREAMING
 else:
 wait_for_running = not self._wait_until_finished
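
The fix above only turns a direct key lookup into dict.get(); a minimal plain-Python illustration of why that matters for job payloads that report no type:

    # Not Airflow code - just the dict behaviour the one-line fix relies on.
    job = {"id": "id-2", "name": "name-2", "currentState": "JOB_STATE_RUNNING"}  # no "type" key
    # job["type"] == DataflowJobType.JOB_TYPE_STREAMING   -> would raise KeyError
    job.get("type")  # -> None, so the comparison is simply False and the job is treated as non-streaming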
 
diff --git a/tests/providers/google/cloud/hooks/test_dataflow.py 
b/tests/providers/google/cloud/hooks/test_dataflow.py
index 7ceef1f..03d5ce3 100644
--- a/tests/providers/google/cloud/hooks/test_dataflow.py
+++ b/tests/providers/google/cloud/hooks/test_dataflow.py
@@ -1416,6 +1416,34 @@ class TestDataflowJob(unittest.TestCase):
 
 # fmt: off
 @parameterized.expand([
+# RUNNING
+(DataflowJobStatus.JOB_STATE_RUNNING, None, False),
+(DataflowJobStatus.JOB_STATE_RUNNING, True, False),
+(DataflowJobStatus.JOB_STATE_RUNNING, False, True),
+# AWAITING STATE
+(DataflowJobStatus.JOB_STATE_PENDING, None, False),
+(DataflowJobStatus.JOB_STATE_PENDING, True, False),
+(DataflowJobStatus.JOB_STATE_PENDING, False, True),
+])
+# fmt: on
+def test_check_dataflow_job_state_without_job_type(self, job_state, 
wait_until_finished, expected_result):
+job = {"id": "id-2", "name": "name-2", "currentState": job_state}
+dataflow_job = _DataflowJobsController(
+dataflow=self.mock_dataflow,
+project_number=TEST_PROJECT,
+name="name-",
+location=TEST_LOCATION,
+poll_sleep=0,
+job_id=None,
+num_retries=20,
+multiple_jobs=True,
+wait_until_finished=wait_until_finished,
+)
+result = dataflow_job._check_dataflow_job_state(job)
+assert result == expected_result
+
+# fmt: off
+@parameterized.expand([
 (DataflowJobType.JOB_TYPE_BATCH, DataflowJobStatus.JOB_STATE_FAILED,
 "Google Cloud Dataflow job name-2 has failed\\."),
 (DataflowJobType.JOB_TYPE_STREAMING, 
DataflowJobStatus.JOB_STATE_FAILED,


[airflow-site] branch master updated: Add blog post link to its title (#395)

2021-03-21 Thread turbaszek
This is an automated email from the ASF dual-hosted git repository.

turbaszek pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/airflow-site.git


The following commit(s) were added to refs/heads/master by this push:
 new 3c43db0  Add blog post link to its title (#395)
3c43db0 is described below

commit 3c43db077666bf3be2d97e21c2f0e05ee4cee1b5
Author: Ehsan Poursaeed 
AuthorDate: Sun Mar 21 15:27:04 2021 +0330

Add blog post link to its title (#395)
---
 landing-pages/site/layouts/partials/boxes/blogpost.html | 6 +-
 1 file changed, 5 insertions(+), 1 deletion(-)

diff --git a/landing-pages/site/layouts/partials/boxes/blogpost.html 
b/landing-pages/site/layouts/partials/boxes/blogpost.html
index f17e94e..d8b1188 100644
--- a/landing-pages/site/layouts/partials/boxes/blogpost.html
+++ b/landing-pages/site/layouts/partials/boxes/blogpost.html
@@ -29,7 +29,11 @@
 
 {{ .Date.Format "Mon, Jan 2, 2006" }}
 
-{{ .Params.title }}
+
+
+{{ .Params.title }}
+
+
 {{ .Params.author }}
 {{ .Params.description 
}}
 


[airflow-site] branch master updated: Add simple dag editor plugin to ecosystem (#390)

2021-03-18 Thread turbaszek
This is an automated email from the ASF dual-hosted git repository.

turbaszek pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/airflow-site.git


The following commit(s) were added to refs/heads/master by this push:
 new a6f6d8d  Add simple dag editor plugin to ecosystem (#390)
a6f6d8d is described below

commit a6f6d8d727e7fdd1f3cbe078ca55c5ea911484fa
Author: ohadmata 
AuthorDate: Thu Mar 18 10:28:45 2021 +0200

Add simple dag editor plugin to ecosystem (#390)
---
 landing-pages/site/content/en/ecosystem/_index.md | 2 ++
 1 file changed, 2 insertions(+)

diff --git a/landing-pages/site/content/en/ecosystem/_index.md 
b/landing-pages/site/content/en/ecosystem/_index.md
index 9cdcee8..b87945c 100644
--- a/landing-pages/site/content/en/ecosystem/_index.md
+++ b/landing-pages/site/content/en/ecosystem/_index.md
@@ -86,3 +86,5 @@ If you would you like to be included on this page, please 
reach out to the [Apac
 [Pylint-Airflow](https://github.com/BasPH/pylint-airflow) - A Pylint plugin 
for static code analysis on Airflow code.
 
 [whirl](https://github.com/godatadriven/whirl) - Fast iterative local 
development and testing of Apache Airflow workflows.
+
+[simple-dag-editor](https://github.com/ohadmata/simple-dag-editor) - Zero 
configuration Airflow plugin that let you manage your DAG files.



[airflow] branch v1-10-stable updated: Add conf not importable from airflow rule (#14400)

2021-03-14 Thread turbaszek
This is an automated email from the ASF dual-hosted git repository.

turbaszek pushed a commit to branch v1-10-stable
in repository https://gitbox.apache.org/repos/asf/airflow.git


The following commit(s) were added to refs/heads/v1-10-stable by this push:
 new 60991f0  Add conf not importable from airflow rule (#14400)
60991f0 is described below

commit 60991f0fdd23a7a57ce63c10722d93116ee9c20f
Author: Sunday Mgbogu <32062279+digitalsimb...@users.noreply.github.com>
AuthorDate: Sun Mar 14 11:48:19 2021 +0100

Add conf not importable from airflow rule (#14400)

Closes: #13945
---
 .../rules/fix_conf_not_importable_from_airflow.py  | 60 +
 .../test_fix_conf_not_importable_from_airflow.py   | 75 ++
 2 files changed, 135 insertions(+)

diff --git a/airflow/upgrade/rules/fix_conf_not_importable_from_airflow.py 
b/airflow/upgrade/rules/fix_conf_not_importable_from_airflow.py
new file mode 100644
index 000..88dd41f
--- /dev/null
+++ b/airflow/upgrade/rules/fix_conf_not_importable_from_airflow.py
@@ -0,0 +1,60 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+from airflow.upgrade.rules.base_rule import BaseRule
+from airflow import conf
+from airflow.utils.dag_processing import list_py_file_paths
+
+
+class ProperlyImportConfFromAirflow(BaseRule):
+"""
+  ProperlyImportConfFromAirflow class to ensure proper import of conf to 
work in Airflow 2.0
+  """
+title = "Ensure Users Properly Import conf from Airflow"
+description = """\
+In Airflow 2.0, it's not possible to import `conf` from airflow by using
+`from airflow import conf`.
+To ensure your code works in Airflow 2.0, you should use `from
+airflow.configuration import conf`.
+  """
+
+wrong_conf_import = "from airflow import conf"
+proper_conf_import = "from airflow.configuration import conf"
+
+@staticmethod
+def _conf_import_info(file_path, line_number):
+return "Affected file: {} (line {})".format(file_path, line_number)
+
+def _check_file(self, file_path):
+problems = []
+conf_import_check = self.wrong_conf_import
+with open(file_path, "r") as file_pointer:
+try:
+for line_number, line in enumerate(file_pointer, 1):
+if conf_import_check in line:
+problems.append(self._conf_import_info(file_path, 
line_number))
+except UnicodeDecodeError:
+problems.append("Unable to read python file 
{}".format(file_path))
+return problems
+
+def check(self):
+dag_folder = conf.get("core", "dags_folder")
+files = list_py_file_paths(directory=dag_folder, 
include_examples=False)
+problems = []
+for file in files:
+if not file.endswith(".py"):
+continue
+problems.extend(self._check_file(file))
+return problems
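
For reference, the two import forms the rule compares, taken directly from the wrong_conf_import and proper_conf_import constants above:

    # Flagged by the rule (works on Airflow 1.10, not supported in Airflow 2.0):
    from airflow import conf
    # Airflow 2.0-compatible replacement:
    from airflow.configuration import conf
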
diff --git a/tests/upgrade/rules/test_fix_conf_not_importable_from_airflow.py 
b/tests/upgrade/rules/test_fix_conf_not_importable_from_airflow.py
new file mode 100644
index 000..b65906f
--- /dev/null
+++ b/tests/upgrade/rules/test_fix_conf_not_importable_from_airflow.py
@@ -0,0 +1,75 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+from contextl

[airflow] branch master updated: Refactor info command to use AirflowConsole (#14757)

2021-03-14 Thread turbaszek
This is an automated email from the ASF dual-hosted git repository.

turbaszek pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/airflow.git


The following commit(s) were added to refs/heads/master by this push:
 new 7d1eae3  Refactor info command to use AirflowConsole (#14757)
7d1eae3 is described below

commit 7d1eae34348d642437f2392cb5f49ac4f1e7b89b
Author: Tomek Urbaszek 
AuthorDate: Sun Mar 14 11:09:32 2021 +0100

Refactor info command to use AirflowConsole (#14757)

This change unifies the way we render info output. This solves a
few problems:
- users can use the output flag
- because of that, users can use plain output, which can be useful
when working with docker
---
 airflow/cli/cli_parser.py   |   1 +
 airflow/cli/commands/info_command.py| 315 ++--
 airflow/cli/simple_table.py |   9 +-
 tests/cli/commands/test_info_command.py | 107 +--
 4 files changed, 197 insertions(+), 235 deletions(-)

diff --git a/airflow/cli/cli_parser.py b/airflow/cli/cli_parser.py
index c39328f..a1c3370 100644
--- a/airflow/cli/cli_parser.py
+++ b/airflow/cli/cli_parser.py
@@ -1551,6 +1551,7 @@ airflow_commands: List[CLICommand] = [
 ARG_ANONYMIZE,
 ARG_FILE_IO,
 ARG_VERBOSE,
+ARG_OUTPUT,
 ),
 ),
 ActionCommand(
diff --git a/airflow/cli/commands/info_command.py 
b/airflow/cli/commands/info_command.py
index aea6ea8..a0a65b6 100644
--- a/airflow/cli/commands/info_command.py
+++ b/airflow/cli/commands/info_command.py
@@ -22,14 +22,14 @@ import os
 import platform
 import subprocess
 import sys
-from typing import Optional
+from typing import List, Optional
 from urllib.parse import urlsplit, urlunsplit
 
 import requests
 import tenacity
 
 from airflow import configuration
-from airflow.cli.simple_table import AirflowConsole, SimpleTable
+from airflow.cli.simple_table import AirflowConsole
 from airflow.providers_manager import ProvidersManager
 from airflow.typing_compat import Protocol
 from airflow.utils.cli import suppress_logs_and_warning
@@ -91,7 +91,6 @@ class PiiAnonymizer(Anonymizer):
 if url_parts.netloc:
 # unpack
 userinfo = None
-host = None
 username = None
 password = None
 
@@ -179,202 +178,162 @@ _MACHINE_TO_ARCHITECTURE = {
 }
 
 
-class _BaseInfo:
-def info(self, console: AirflowConsole) -> None:
-"""
-Print required information to provided console.
-You should implement this function in custom classes.
-"""
-raise NotImplementedError()
+class AirflowInfo:
+"""Renders information about Airflow instance"""
 
-def show(self) -> None:
-"""Shows info"""
-console = AirflowConsole()
-self.info(console)
+def __init__(self, anonymizer):
+self.anonymizer = anonymizer
 
-def render_text(self) -> str:
-"""Exports the info to string"""
-console = AirflowConsole(record=True)
-with console.capture():
-self.info(console)
-return console.export_text()
-
-
-class AirflowInfo(_BaseInfo):
-"""All information related to Airflow, system and other."""
-
-def __init__(self, anonymizer: Anonymizer):
-self.airflow_version = airflow_version
-self.system = SystemInfo(anonymizer)
-self.tools = ToolsInfo(anonymizer)
-self.paths = PathsInfo(anonymizer)
-self.config = ConfigInfo(anonymizer)
-self.provider = ProvidersInfo()
-
-def info(self, console: AirflowConsole):
-console.print(
-f"[bold][green]Apache Airflow[/bold][/green]: 
{self.airflow_version}\n", highlight=False
-)
-self.system.info(console)
-self.tools.info(console)
-self.paths.info(console)
-self.config.info(console)
-self.provider.info(console)
-
-
-class SystemInfo(_BaseInfo):
-"""Basic system and python information"""
-
-def __init__(self, anonymizer: Anonymizer):
-self.operating_system = OperatingSystem.get_current()
-self.arch = Architecture.get_current()
-self.uname = platform.uname()
-self.locale = locale.getdefaultlocale()
-self.python_location = anonymizer.process_path(sys.executable)
-self.python_version = sys.version.replace("\n", " ")
-
-def info(self, console: AirflowConsole):
-table = SimpleTable(title="System info")
-table.add_column()
-table.add_column(width=100)
-table.add_row("OS", self.operating_system or "NOT AVAILABLE")
-table.add_row("architecture", self.arch or "NOT AVAILABLE"

[airflow-site] branch master updated: Add Apache Airflow Survey 2020 blog post (#385)

2021-03-09 Thread turbaszek
This is an automated email from the ASF dual-hosted git repository.

turbaszek pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/airflow-site.git


The following commit(s) were added to refs/heads/master by this push:
 new 409ceb4  Add Apache Airflow Survey 2020 blog post (#385)
409ceb4 is described below

commit 409ceb40633f127ed0f4fb508bfdae2e7d77b319
Author: Tomek Urbaszek 
AuthorDate: Tue Mar 9 22:55:49 2021 +0100

Add Apache Airflow Survey 2020 blog post (#385)

* Add Apache Airflow Survey 2020 blog post

Co-authored-by: Leah E. Cole <6719667+leahec...@users.noreply.github.com>
Co-authored-by: Kaxil Naik 
Co-authored-by: Xiaodong DENG 
---
 ...What_best_describes_your_current_occupation.png | Bin 0 -> 91268 bytes
 .../What_executor_type_do_you_use.png  | Bin 0 -> 59914 bytes
 ...al_services_do_you_use_in_your_Airflow_DAGs.png | Bin 0 -> 104634 bytes
 .../airflow-survey-2020/Where_are_you_based.png| Bin 0 -> 95584 bytes
 .../content/en/blog/airflow-survey-2020/index.md   | 435 +
 5 files changed, 435 insertions(+)

diff --git 
a/landing-pages/site/content/en/blog/airflow-survey-2020/What_best_describes_your_current_occupation.png
 
b/landing-pages/site/content/en/blog/airflow-survey-2020/What_best_describes_your_current_occupation.png
new file mode 100644
index 000..63ca2fb
Binary files /dev/null and 
b/landing-pages/site/content/en/blog/airflow-survey-2020/What_best_describes_your_current_occupation.png
 differ
diff --git 
a/landing-pages/site/content/en/blog/airflow-survey-2020/What_executor_type_do_you_use.png
 
b/landing-pages/site/content/en/blog/airflow-survey-2020/What_executor_type_do_you_use.png
new file mode 100644
index 000..9c0b3af
Binary files /dev/null and 
b/landing-pages/site/content/en/blog/airflow-survey-2020/What_executor_type_do_you_use.png
 differ
diff --git 
a/landing-pages/site/content/en/blog/airflow-survey-2020/What_external_services_do_you_use_in_your_Airflow_DAGs.png
 
b/landing-pages/site/content/en/blog/airflow-survey-2020/What_external_services_do_you_use_in_your_Airflow_DAGs.png
new file mode 100644
index 000..fa15a74
Binary files /dev/null and 
b/landing-pages/site/content/en/blog/airflow-survey-2020/What_external_services_do_you_use_in_your_Airflow_DAGs.png
 differ
diff --git 
a/landing-pages/site/content/en/blog/airflow-survey-2020/Where_are_you_based.png
 
b/landing-pages/site/content/en/blog/airflow-survey-2020/Where_are_you_based.png
new file mode 100644
index 000..130ed36
Binary files /dev/null and 
b/landing-pages/site/content/en/blog/airflow-survey-2020/Where_are_you_based.png
 differ
diff --git a/landing-pages/site/content/en/blog/airflow-survey-2020/index.md 
b/landing-pages/site/content/en/blog/airflow-survey-2020/index.md
new file mode 100644
index 000..aaed88b
--- /dev/null
+++ b/landing-pages/site/content/en/blog/airflow-survey-2020/index.md
@@ -0,0 +1,435 @@
+---
+title: "Airflow Survey 2020"
+linkTitle: "Airflow Survey 2020"
+author: "Tomek Urbaszek"
+twitter: "turbaszek"
+github: "turbaszek"
+linkedin: "tomaszurbaszek"
+description: "We observe steady growth in number of users as well as in an 
amount of active contributors. So listening and understanding our community is 
of high importance."
+tags: ["community", "survey", "users"]
+date: "2021-03-09"
+---
+# Apache Airflow Survey 2020
+
+The world of data processing tools is growing steadily, and Apache Airflow already seems to be
+considered a crucial component of this complex ecosystem. We observe steady growth in the number
+of users as well as in the number of active contributors, so listening to and understanding our
+community is of high importance.
+
+It's worth noting that the 2020 survey was still mostly about the 1.10.x versions of Apache Airflow,
+and many of the drawbacks reported may already have been addressed in the 2.0 release of
+December 2020. If that is true, we will learn next year!
+
+## Overview of the user
+
+![](What_best_describes_your_current_occupation.png)
+
+**What best describes your current occupation? (single choice)**
+
+| Occupation |   No. | % |
+|-|---|---|
+| Data Engineer   |   115 | 56.65 |
+| Developer   |28 | 13.79 |
+| DevOps  |17 |  8.37 |
+| Solutions Architect |14 |  6.9  |
+| Data Scientist  |12 |  5.91 |
+| Other   |10 |  4.93 |
+| Data Analyst| 4 |  1.97 |
+| Support Engineer| 3 |  1.48 |
+
+Those results are not a surprise, as Airflow is a tool dedicated to data-related tasks. The majority of
+our users are data engineers, scientists or analysts. The 2020 results are similar to
+[those from 2019](https://airflow.apache.org/blog/airflow-survey/), with a slight but visible increase
+in ML use cases.
+
+Additiona

[airflow] branch master updated: Adds new Airbyte provider (#14492)

2021-03-06 Thread turbaszek
This is an automated email from the ASF dual-hosted git repository.

turbaszek pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/airflow.git


The following commit(s) were added to refs/heads/master by this push:
 new 20b72ae  Adds new Airbyte provider (#14492)
20b72ae is described below

commit 20b72aea4dc1e25f2aa3cfe62b45ca1ff29d1cbb
Author: Marcos Marx 
AuthorDate: Sat Mar 6 11:19:30 2021 -0300

Adds new Airbyte provider (#14492)

This commit add hook, operators and sensors to interact with
Airbyte external service.
---
 CONTRIBUTING.rst   |  23 ++--
 INSTALL|  22 ++--
 airflow/providers/airbyte/CHANGELOG.rst|  25 
 airflow/providers/airbyte/__init__.py  |  17 +++
 airflow/providers/airbyte/example_dags/__init__.py |  16 +++
 .../example_dags/example_airbyte_trigger_job.py|  64 +++
 airflow/providers/airbyte/hooks/__init__.py|  17 +++
 airflow/providers/airbyte/hooks/airbyte.py | 109 ++
 airflow/providers/airbyte/operators/__init__.py|  17 +++
 airflow/providers/airbyte/operators/airbyte.py |  85 ++
 airflow/providers/airbyte/provider.yaml|  51 +
 airflow/providers/airbyte/sensors/__init__.py  |  16 +++
 airflow/providers/airbyte/sensors/airbyte.py   |  73 
 airflow/providers/dependencies.json|   3 +
 docs/apache-airflow-providers-airbyte/commits.rst  |  27 +
 .../connections.rst|  36 ++
 docs/apache-airflow-providers-airbyte/index.rst| 121 
 .../operators/airbyte.rst  |  58 ++
 docs/apache-airflow/extra-packages-ref.rst |   2 +
 docs/integration-logos/airbyte/Airbyte.png | Bin 0 -> 7405 bytes
 docs/spelling_wordlist.txt |   2 +
 .../run_install_and_test_provider_packages.sh  |   2 +-
 setup.py   |   1 +
 tests/core/test_providers_manager.py   |   1 +
 tests/providers/airbyte/__init__.py|  16 +++
 tests/providers/airbyte/hooks/__init__.py  |  16 +++
 tests/providers/airbyte/hooks/test_airbyte.py  | 126 +
 tests/providers/airbyte/operators/__init__.py  |  16 +++
 tests/providers/airbyte/operators/test_airbyte.py  |  55 +
 tests/providers/airbyte/sensors/__init__.py|  16 +++
 tests/providers/airbyte/sensors/test_airbyte.py|  93 +++
 31 files changed, 1103 insertions(+), 23 deletions(-)
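
A hedged sketch of how the new pieces listed above fit together; the operator and sensor names and their parameters follow the provider's example DAG module and are assumptions, not shown in this excerpt:

    # Illustrative only: trigger an Airbyte sync asynchronously and wait for it with the sensor.
    from airflow.providers.airbyte.operators.airbyte import AirbyteTriggerSyncOperator
    from airflow.providers.airbyte.sensors.airbyte import AirbyteJobSensor

    trigger_sync = AirbyteTriggerSyncOperator(
        task_id="airbyte_trigger_sync",
        airbyte_conn_id="airbyte_default",
        connection_id="15bc3800-82e4-48c3-a32d-620661273f28",  # placeholder connection UUID
        asynchronous=True,
    )
    wait_for_sync = AirbyteJobSensor(
        task_id="airbyte_wait_for_sync",
        airbyte_conn_id="airbyte_default",
        airbyte_job_id=trigger_sync.output,
    )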

diff --git a/CONTRIBUTING.rst b/CONTRIBUTING.rst
index 466b8d9..90be997 100644
--- a/CONTRIBUTING.rst
+++ b/CONTRIBUTING.rst
@@ -585,17 +585,17 @@ This is the full list of those extras:
 
   .. START EXTRAS HERE
 
-all, all_dbs, amazon, apache.atlas, apache.beam, apache.cassandra, 
apache.druid, apache.hdfs,
-apache.hive, apache.kylin, apache.livy, apache.pig, apache.pinot, 
apache.spark, apache.sqoop,
-apache.webhdfs, async, atlas, aws, azure, cassandra, celery, cgroups, 
cloudant, cncf.kubernetes,
-crypto, dask, databricks, datadog, devel, devel_all, devel_ci, devel_hadoop, 
dingding, discord, doc,
-docker, druid, elasticsearch, exasol, facebook, ftp, gcp, gcp_api, 
github_enterprise, google,
-google_auth, grpc, hashicorp, hdfs, hive, http, imap, jdbc, jenkins, jira, 
kerberos, kubernetes,
-ldap, microsoft.azure, microsoft.mssql, microsoft.winrm, mongo, mssql, mysql, 
neo4j, odbc, openfaas,
-opsgenie, oracle, pagerduty, papermill, password, pinot, plexus, postgres, 
presto, qds, qubole,
-rabbitmq, redis, s3, salesforce, samba, segment, sendgrid, sentry, sftp, 
singularity, slack,
-snowflake, spark, sqlite, ssh, statsd, tableau, telegram, vertica, virtualenv, 
webhdfs, winrm,
-yandex, zendesk
+airbyte, all, all_dbs, amazon, apache.atlas, apache.beam, apache.cassandra, 
apache.druid,
+apache.hdfs, apache.hive, apache.kylin, apache.livy, apache.pig, apache.pinot, 
apache.spark,
+apache.sqoop, apache.webhdfs, async, atlas, aws, azure, cassandra, celery, 
cgroups, cloudant,
+cncf.kubernetes, crypto, dask, databricks, datadog, devel, devel_all, 
devel_ci, devel_hadoop,
+dingding, discord, doc, docker, druid, elasticsearch, exasol, facebook, ftp, 
gcp, gcp_api,
+github_enterprise, google, google_auth, grpc, hashicorp, hdfs, hive, http, 
imap, jdbc, jenkins,
+jira, kerberos, kubernetes, ldap, microsoft.azure, microsoft.mssql, 
microsoft.winrm, mongo, mssql,
+mysql, neo4j, odbc, openfaas, opsgenie, oracle, pagerduty, papermill, 
password, pinot, plexus,
+postgres, presto, qds, qubole, rabbitmq, redis, s3, salesforce, samba, 
segment, sendgrid, sentry,
+sftp, singularity, slack, snowflake, spark, sqlite, ssh, statsd, tableau, 
telegram, vertica,
+virtualenv, webhdfs, winrm, yandex, zendesk
 
   .. END EXTRAS HERE
 
@@ -653,6 +653,7 @@ Here is the list of packages and their ext

[airflow] branch master updated (8801a0c -> 0ef084c)

2021-03-02 Thread turbaszek
This is an automated email from the ASF dual-hosted git repository.

turbaszek pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/airflow.git.


from 8801a0c  remove inline tree js (#14552)
 add 0ef084c  Add plain format output to cli tables (#14546)

No new revisions were added by this update.

Summary of changes:
 airflow/cli/cli_parser.py |  6 +++---
 airflow/cli/simple_table.py   | 11 +++
 docs/apache-airflow/usage-cli.rst |  1 +
 docs/spelling_wordlist.txt|  1 +
 4 files changed, 16 insertions(+), 3 deletions(-)



[airflow] branch master updated: Make airflow info to work with pipes (#14528)

2021-02-28 Thread turbaszek
This is an automated email from the ASF dual-hosted git repository.

turbaszek pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/airflow.git


The following commit(s) were added to refs/heads/master by this push:
 new a1097f6  Make airflow info to work with pipes (#14528)
a1097f6 is described below

commit a1097f6f29796bd11f8ed7b3651dfeb3e40eec09
Author: Tomek Urbaszek 
AuthorDate: Sun Feb 28 16:42:33 2021 +0100

Make airflow info to work with pipes (#14528)

After this change, output from AirflowConsole can be piped in bash
without losing information, thanks to the fixed width of the output.

Closes: #14518


Co-authored-by: Kamil Breguła 
---
 airflow/cli/commands/cheat_sheet_command.py |  7 ++-
 airflow/cli/commands/info_command.py| 21 ++---
 airflow/cli/simple_table.py |  6 ++
 3 files changed, 18 insertions(+), 16 deletions(-)
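
The underlying idea is that rich sizes its console to the terminal and falls back to a narrow default when stdout is a pipe, so wide tables get truncated. A hedged illustration of pinning the width with plain rich (AirflowConsole does the equivalent in airflow/cli/simple_table.py; the width value here is illustrative):

    # Illustrative only, plain rich rather than Airflow's wrapper:
    from rich.console import Console

    console = Console(width=250)  # fixed width: output stays intact when piped, e.g. `airflow info | grep sqlite`
    console.print("Apache Airflow: 2.0.1")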

diff --git a/airflow/cli/commands/cheat_sheet_command.py 
b/airflow/cli/commands/cheat_sheet_command.py
index e77ebad..0dcce3c 100644
--- a/airflow/cli/commands/cheat_sheet_command.py
+++ b/airflow/cli/commands/cheat_sheet_command.py
@@ -14,13 +14,10 @@
 # KIND, either express or implied.  See the License for the
 # specific language governing permissions and limitations
 # under the License.
-
 from typing import Iterable, List, Optional, Union
 
-from rich.console import Console
-
 from airflow.cli.cli_parser import ActionCommand, GroupCommand, 
airflow_commands
-from airflow.cli.simple_table import SimpleTable
+from airflow.cli.simple_table import AirflowConsole, SimpleTable
 from airflow.utils.cli import suppress_logs_and_warning
 from airflow.utils.helpers import partition
 
@@ -44,7 +41,7 @@ def display_commands_index():
 actions_iter, groups_iter = partition(lambda x: isinstance(x, 
GroupCommand), commands)
 actions, groups = list(actions_iter), list(groups_iter)
 
-console = Console()
+console = AirflowConsole()
 if actions:
 table = SimpleTable(title=help_msg or "Miscellaneous commands")
 table.add_column(width=40)
diff --git a/airflow/cli/commands/info_command.py 
b/airflow/cli/commands/info_command.py
index 9a32520..5db357a 100644
--- a/airflow/cli/commands/info_command.py
+++ b/airflow/cli/commands/info_command.py
@@ -27,10 +27,9 @@ from urllib.parse import urlsplit, urlunsplit
 
 import requests
 import tenacity
-from rich.console import Console
 
 from airflow import configuration
-from airflow.cli.simple_table import SimpleTable
+from airflow.cli.simple_table import AirflowConsole, SimpleTable
 from airflow.providers_manager import ProvidersManager
 from airflow.typing_compat import Protocol
 from airflow.utils.cli import suppress_logs_and_warning
@@ -181,7 +180,7 @@ _MACHINE_TO_ARCHITECTURE = {
 
 
 class _BaseInfo:
-def info(self, console: Console) -> None:
+def info(self, console: AirflowConsole) -> None:
 """
 Print required information to provided console.
 You should implement this function in custom classes.
@@ -190,12 +189,12 @@ class _BaseInfo:
 
 def show(self) -> None:
 """Shows info"""
-console = Console()
+console = AirflowConsole()
 self.info(console)
 
 def render_text(self) -> str:
 """Exports the info to string"""
-console = Console(record=True)
+console = AirflowConsole(record=True)
 with console.capture():
 self.info(console)
 return console.export_text()
@@ -212,7 +211,7 @@ class AirflowInfo(_BaseInfo):
 self.config = ConfigInfo(anonymizer)
 self.provider = ProvidersInfo()
 
-def info(self, console: Console):
+def info(self, console: AirflowConsole):
 console.print(
 f"[bold][green]Apache Airflow[/bold][/green]: 
{self.airflow_version}\n", highlight=False
 )
@@ -234,7 +233,7 @@ class SystemInfo(_BaseInfo):
 self.python_location = anonymizer.process_path(sys.executable)
 self.python_version = sys.version.replace("\n", " ")
 
-def info(self, console: Console):
+def info(self, console: AirflowConsole):
 table = SimpleTable(title="System info")
 table.add_column()
 table.add_column(width=100)
@@ -260,7 +259,7 @@ class PathsInfo(_BaseInfo):
 os.path.exists(os.path.join(path_elem, "airflow")) for path_elem 
in system_path
 )
 
-def info(self, console: Console):
+def info(self, console: AirflowConsole):
 table = SimpleTable(title="Paths info")
 table.add_column()
 table.add_column(width=150)
@@ -274,7 +273,7 @@ class PathsInfo(_BaseInfo):
 class ProvidersInfo(_BaseInfo):
 """providers information"""
 
-def i

[airflow] branch master updated (fff3444 -> 1b0851c)

2021-02-28 Thread turbaszek
This is an automated email from the ASF dual-hosted git repository.

turbaszek pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/airflow.git.


from fff3444  Add health-check for celery worker (#14522)
 add 1b0851c  Change HTTP to HTTPS in links (#14527)

No new revisions were added by this update.

Summary of changes:
 INTHEWILD.md | 16 
 README.md|  6 +++---
 2 files changed, 11 insertions(+), 11 deletions(-)



[airflow] branch master updated (77f5629 -> bfef559)

2021-02-27 Thread turbaszek
This is an automated email from the ASF dual-hosted git repository.

turbaszek pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/airflow.git.


from 77f5629  Update docs about tableau and salesforce provider (#14495)
 add bfef559  Corrects order of argument in docstring in GCSHook.download 
method (#14497)

No new revisions were added by this update.

Summary of changes:
 airflow/providers/google/cloud/hooks/gcs.py | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)



[airflow] branch master updated (11d03d2 -> 8ad2f9c)

2021-02-26 Thread turbaszek
This is an automated email from the ASF dual-hosted git repository.

turbaszek pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/airflow.git.


from 11d03d2  Add Azure Data Factory hook (#11015)
 add 8ad2f9c  Removes DigitalOcean from INTHEWILD.md (#14488)

No new revisions were added by this update.

Summary of changes:
 INTHEWILD.md | 1 -
 1 file changed, 1 deletion(-)



[airflow] branch master updated (c4da66c -> 5a3207e)

2021-02-25 Thread turbaszek
This is an automated email from the ASF dual-hosted git repository.

turbaszek pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/airflow.git.


from c4da66c  Add PATH to basic_static_checks. (#14451)
 add 5a3207e  Add Snowflake provider to boring cyborg automation (#14432)

No new revisions were added by this update.

Summary of changes:
 .github/CODEOWNERS| 1 +
 .github/boring-cyborg.yml | 6 ++
 2 files changed, 7 insertions(+)



[airflow] branch master updated (6019c78 -> a48bedf)

2021-02-22 Thread turbaszek
This is an automated email from the ASF dual-hosted git repository.

turbaszek pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/airflow.git.


from 6019c78  Pprint default args and wrap (#14345)
 add a48bedf  Fix spelling in "ignorable" (#14348)

No new revisions were added by this update.

Summary of changes:
 airflow/models/taskinstance.py | 2 +-
 airflow/ti_deps/dep_context.py | 2 +-
 airflow/ti_deps/deps/base_ti_dep.py| 6 +++---
 airflow/ti_deps/deps/dag_ti_slots_available_dep.py | 2 +-
 airflow/ti_deps/deps/dag_unpaused_dep.py   | 2 +-
 airflow/ti_deps/deps/dagrun_exists_dep.py  | 2 +-
 airflow/ti_deps/deps/dagrun_id_dep.py  | 2 +-
 airflow/ti_deps/deps/exec_date_after_start_date_dep.py | 2 +-
 airflow/ti_deps/deps/not_in_retry_period_dep.py| 2 +-
 airflow/ti_deps/deps/not_previously_skipped_dep.py | 2 +-
 airflow/ti_deps/deps/pool_slots_available_dep.py   | 2 +-
 airflow/ti_deps/deps/prev_dagrun_dep.py| 2 +-
 airflow/ti_deps/deps/ready_to_reschedule.py| 2 +-
 airflow/ti_deps/deps/runnable_exec_date_dep.py | 2 +-
 airflow/ti_deps/deps/task_concurrency_dep.py   | 2 +-
 airflow/ti_deps/deps/task_not_running_dep.py   | 2 +-
 airflow/ti_deps/deps/trigger_rule_dep.py   | 2 +-
 airflow/ti_deps/deps/valid_state_dep.py| 2 +-
 docs/spelling_wordlist.txt | 2 +-
 19 files changed, 21 insertions(+), 21 deletions(-)



[airflow] branch master updated (82cb041 -> a7e4266)

2021-02-21 Thread turbaszek
This is an automated email from the ASF dual-hosted git repository.

turbaszek pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/airflow.git.


from 82cb041  Attempts to stabilize and improve speed of static checks 
(#14332)
 add a7e4266  Refactor GoogleDriveToGCSOperator to use common methods 
(#14276)

No new revisions were added by this update.

Summary of changes:
 .../google/cloud/transfers/gdrive_to_gcs.py| 60 ++
 .../google/cloud/transfers/test_gdrive_to_gcs.py   | 60 +-
 .../google/cloud/transfers/test_gdrive_to_local.py |  9 ++--
 3 files changed, 43 insertions(+), 86 deletions(-)



[airflow] branch master updated (b20e394 -> 8c060d5)

2021-02-18 Thread turbaszek
This is an automated email from the ASF dual-hosted git repository.

turbaszek pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/airflow.git.


from b20e394  Fix race in test_celery_executor.py 
test_retry_on_error_sending_task test (#14273)
 add 8c060d5  Don't allow SlackHook.call method accept *args (#14289)

No new revisions were added by this update.

Summary of changes:
 airflow/providers/slack/hooks/slack.py| 4 ++--
 tests/providers/slack/hooks/test_slack.py | 2 +-
 2 files changed, 3 insertions(+), 3 deletions(-)
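
A minimal sketch of the keyword-only style this change enforces; the connection id and payload below are illustrative assumptions, not taken from the commit:

    from airflow.providers.slack.hooks.slack import SlackHook

    hook = SlackHook(slack_conn_id="slack_default")  # hypothetical connection id
    # After this change, extra request data must be passed as keyword arguments
    # rather than positionally via *args:
    hook.call("chat.postMessage", json={"channel": "#airflow", "text": "Task finished"})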



[airflow] branch master updated: Add CODEOWNERS for automated PR review assignment (#14216)

2021-02-16 Thread turbaszek
This is an automated email from the ASF dual-hosted git repository.

turbaszek pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/airflow.git


The following commit(s) were added to refs/heads/master by this push:
 new b23fc13  Add CODEOWNERS for automated PR review assignment (#14216)
b23fc13 is described below

commit b23fc137812f5eabf7834e07e032915e2a504c17
Author: Tomek Urbaszek 
AuthorDate: Tue Feb 16 11:00:44 2021 +0100

Add CODEOWNERS for automated PR review assignment (#14216)


Co-authored-by: Kaxil Naik 
Co-authored-by: Xiaodong DENG 
Co-authored-by: Jarek Potiuk 
Co-authored-by: Ash Berlin-Taylor 
---
 .github/CODEOWNERS | 36 
 1 file changed, 36 insertions(+)

diff --git a/.github/CODEOWNERS b/.github/CODEOWNERS
new file mode 100644
index 000..a3089fc
--- /dev/null
+++ b/.github/CODEOWNERS
@@ -0,0 +1,36 @@
+# Core
+/airflow/executors/ @kaxil @XD-DENG @ashb @turbaszek
+/airflow/jobs/ @kaxil @ashb @XD-DENG
+/airflow/models/ @kaxil @XD-DENG @ashb @turbaszek
+
+# DAG Serialization
+/airflow/serialization/ @kaxil @ashb
+
+# Kubernetes
+/airflow/kubernetes/ @dimberman
+/airflow/kubernetes_executor_templates/ @dimberman
+/airflow/executors/celery_kubernetes_executor.py @dimberman
+/chart/ @dimberman @ashb
+
+# Docs
+/docs/ @mik-laj @kaxil @vikramkoka
+
+# API
+/airflow/api/ @mik-laj @kaxil
+/airflow/api_connexion/ @mik-laj @kaxil
+
+# WWW
+/airflow/www/ @ryanahamilton @ashb
+
+# Providers
+/airflow/providers/google/ @turbaszek @mik-laj
+
+# Dev tools
+/.github/workflows/ @potiuk @ashb @kaxil
+breeze @potiuk
+breeze-complete @potiuk
+Dockerfile @potiuk @ashb
+Dockerfile.ci @potiuk @ashb
+/dev/ @potiuk @ashb @kaxil
+/provider_packages/ @potiuk @ashb
+/scripts/ @potiuk @ashb



[airflow] branch master updated (e4629b6 -> 1ab4066)

2021-02-15 Thread turbaszek
This is an automated email from the ASF dual-hosted git repository.

turbaszek pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/airflow.git.


from e4629b6  docs: CI.rst: Typos and Link Fix (#14250)
 add 1ab4066  Add GoogleDriveToLocalOperator (#14191)

No new revisions were added by this update.

Summary of changes:
 ...gdrive_to_gcs.py => example_gdrive_to_local.py} | 20 
 .../transfers/gdrive_to_local.py}  | 58 --
 airflow/providers/google/provider.yaml |  4 ++
 airflow/providers/google/suite/hooks/drive.py  | 17 ++-
 .../{gcs_to_local.rst => gdrive_to_local.rst}  | 25 +-
 .../google/cloud/transfers/test_gdrive_to_local.py | 47 ++
 6 files changed, 119 insertions(+), 52 deletions(-)
 copy airflow/providers/google/cloud/example_dags/{example_gdrive_to_gcs.py => 
example_gdrive_to_local.py} (73%)
 copy airflow/providers/google/{suite/sensors/drive.py => 
cloud/transfers/gdrive_to_local.py} (69%)
 copy docs/apache-airflow-providers-google/operators/transfer/{gcs_to_local.rst 
=> gdrive_to_local.rst} (59%)
 create mode 100644 
tests/providers/google/cloud/transfers/test_gdrive_to_local.py
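
An illustrative sketch of wiring the new operator into a DAG; the folder id, file name and output path are placeholders, not values from the commit:

    from datetime import datetime

    from airflow import DAG
    from airflow.providers.google.cloud.transfers.gdrive_to_local import GoogleDriveToLocalOperator

    with DAG("gdrive_to_local_sketch", start_date=datetime(2021, 1, 1), schedule_interval=None) as dag:
        download_report = GoogleDriveToLocalOperator(
            task_id="download_report",
            folder_id="my-drive-folder-id",  # hypothetical Google Drive folder id
            file_name="report.csv",          # hypothetical file inside that folder
            output_file="/tmp/report.csv",   # local destination path
        )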



[airflow] branch master updated (af5eb55 -> d411f46)

2021-02-14 Thread turbaszek
This is an automated email from the ASF dual-hosted git repository.

turbaszek pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/airflow.git.


from af5eb55  Fixes regexp in entrypoint to include password-less entries 
(#14221)
 add d411f46  Correct a typo in upgrading-to-2.rst (#14225)

No new revisions were added by this update.

Summary of changes:
 docs/apache-airflow/upgrading-to-2.rst | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)



[airflow] branch master updated (9536953 -> 59c94c6)

2021-02-13 Thread turbaszek
This is an automated email from the ASF dual-hosted git repository.

turbaszek pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/airflow.git.


from 9536953  Restructure COMMITTERS.rst (#14218)
 add 59c94c6  Add `exists_ok` flag to 
BigQueryCreateEmptyTable(Dataset)Operator (#14026)

No new revisions were added by this update.

Summary of changes:
 airflow/providers/google/cloud/hooks/bigquery.py |  2 +-
 airflow/providers/google/cloud/operators/bigquery.py | 12 ++--
 2 files changed, 11 insertions(+), 3 deletions(-)
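
A minimal sketch of the new flag; dataset and table names are placeholders:

    from airflow.providers.google.cloud.operators.bigquery import (
        BigQueryCreateEmptyDatasetOperator,
        BigQueryCreateEmptyTableOperator,
    )

    create_dataset = BigQueryCreateEmptyDatasetOperator(
        task_id="create_dataset",
        dataset_id="my_dataset",  # placeholder
        exists_ok=True,           # do not fail if the dataset already exists
    )
    create_table = BigQueryCreateEmptyTableOperator(
        task_id="create_table",
        dataset_id="my_dataset",
        table_id="my_table",      # placeholder
        exists_ok=True,           # do not fail if the table already exists
    )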



[airflow] branch master updated (e3bcaa3 -> e31b27d)

2021-02-12 Thread turbaszek
This is an automated email from the ASF dual-hosted git repository.

turbaszek pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/airflow.git.


from e3bcaa3  Correct typo in GCSObjectsWtihPrefixExistenceSensor  (#14179)
 add e31b27d  Add materialized view support for BigQuery (#14201)

No new revisions were added by this update.

Summary of changes:
 .../example_dags/example_bigquery_operations.py| 24 +--
 airflow/providers/google/cloud/hooks/bigquery.py   |  6 
 .../providers/google/cloud/operators/bigquery.py   |  8 -
 .../operators/cloud/bigquery.rst   | 17 +++
 .../providers/google/cloud/hooks/test_bigquery.py  | 29 ++
 .../google/cloud/operators/test_bigquery.py| 34 ++
 6 files changed, 115 insertions(+), 3 deletions(-)
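
An illustrative sketch, assuming the operator accepts a materialized_view dict as added in this change; dataset, table and query are placeholders:

    from airflow.providers.google.cloud.operators.bigquery import BigQueryCreateEmptyTableOperator

    create_view = BigQueryCreateEmptyTableOperator(
        task_id="create_materialized_view",
        dataset_id="my_dataset",
        table_id="my_materialized_view",
        materialized_view={
            "query": "SELECT product, SUM(amount) AS total FROM my_dataset.sales GROUP BY product",
            "enableRefresh": True,
            "refreshIntervalMs": 600000,
        },
    )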



[airflow] branch master updated (6dc6339 -> e3bcaa3)

2021-02-12 Thread turbaszek
This is an automated email from the ASF dual-hosted git repository.

turbaszek pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/airflow.git.


from 6dc6339  Update Tree View date ticks (#14141)
 add e3bcaa3  Correct typo in GCSObjectsWtihPrefixExistenceSensor  (#14179)

No new revisions were added by this update.

Summary of changes:
 UPDATING.md  |  2 +-
 airflow/contrib/sensors/gcs_sensor.py|  8 
 airflow/providers/google/BACKPORT_PROVIDER_README.md |  2 +-
 airflow/providers/google/cloud/sensors/gcs.py| 19 ++-
 tests/always/test_project_structure.py   |  1 +
 tests/deprecated_classes.py  |  2 +-
 tests/providers/google/cloud/sensors/test_gcs.py | 10 +-
 7 files changed, 31 insertions(+), 13 deletions(-)



[airflow] branch master updated (84ef24c -> e2a06a3)

2021-02-04 Thread turbaszek
This is an automated email from the ASF dual-hosted git repository.

turbaszek pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/airflow.git.


from 84ef24c  Remove unused 'context' variable in task_instance.py (#14049)
 add e2a06a3  Added json_render method to separate filtering from view 
(#14024)

No new revisions were added by this update.

Summary of changes:
 airflow/www/utils.py | 13 -
 airflow/www/views.py |  2 --
 2 files changed, 12 insertions(+), 3 deletions(-)



[airflow] branch master updated (604a37e -> 84ef24c)

2021-02-03 Thread turbaszek
This is an automated email from the ASF dual-hosted git repository.

turbaszek pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/airflow.git.


from 604a37e  Fix broken SLA Mechanism (#14056)
 add 84ef24c  Remove unused 'context' variable in task_instance.py (#14049)

No new revisions were added by this update.

Summary of changes:
 airflow/models/taskinstance.py | 1 -
 1 file changed, 1 deletion(-)



[airflow] branch v1-10-stable updated: Treat default value in HostnameCallable rule as good one (#13670)

2021-02-02 Thread turbaszek
This is an automated email from the ASF dual-hosted git repository.

turbaszek pushed a commit to branch v1-10-stable
in repository https://gitbox.apache.org/repos/asf/airflow.git


The following commit(s) were added to refs/heads/v1-10-stable by this push:
 new bdae805  Treat default value in HostnameCallable rule as good one 
(#13670)
bdae805 is described below

commit bdae805a4d829f6fa714bf8be3df591d03de5685
Author: Tomek Urbaszek 
AuthorDate: Tue Feb 2 14:29:07 2021 +0100

Treat default value in HostnameCallable rule as good one (#13670)
---
 airflow/upgrade/rules/hostname_callable_rule.py | 7 ++-
 1 file changed, 6 insertions(+), 1 deletion(-)

diff --git a/airflow/upgrade/rules/hostname_callable_rule.py 
b/airflow/upgrade/rules/hostname_callable_rule.py
index fbc571a..a316e52 100644
--- a/airflow/upgrade/rules/hostname_callable_rule.py
+++ b/airflow/upgrade/rules/hostname_callable_rule.py
@@ -24,10 +24,15 @@ from airflow.upgrade.rules.base_rule import BaseRule
 class HostnameCallable(BaseRule):
 title = "Unify hostname_callable option in core section"
 
-description = ""
+description = "hostname_callable option is using now only dots instead of 
dots and colons"
 
 def check(self):
+default = "socket:getfqdn"
 hostname_callable_conf = conf.get("core", "hostname_callable")
+if hostname_callable_conf == default:
+# If users use default value there's nothing they should do
+return None
+
 if ":" in hostname_callable_conf:
 return (
 "Error: hostname_callable `{}` "



[airflow] branch master updated: Added missing return parameter in read function of FileTaskHandler (#14001)

2021-02-01 Thread turbaszek
This is an automated email from the ASF dual-hosted git repository.

turbaszek pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/airflow.git


The following commit(s) were added to refs/heads/master by this push:
 new 2366f86  Added missing return parameter in read function of 
FileTaskHandler (#14001)
2366f86 is described below

commit 2366f861ee97f50e2cff83d557a1ae97030febf9
Author: vikram Jadhav 
AuthorDate: Mon Feb 1 19:03:30 2021 +0530

Added missing return parameter in read function of FileTaskHandler (#14001)

this issue occurs when an invalid try_number value is passed in the get logs API
FIXES: #13638
---
 airflow/utils/log/file_task_handler.py |  2 +-
 tests/utils/test_log_handlers.py   | 48 ++
 2 files changed, 49 insertions(+), 1 deletion(-)

diff --git a/airflow/utils/log/file_task_handler.py 
b/airflow/utils/log/file_task_handler.py
index fd313c2..7617bda 100644
--- a/airflow/utils/log/file_task_handler.py
+++ b/airflow/utils/log/file_task_handler.py
@@ -207,7 +207,7 @@ class FileTaskHandler(logging.Handler):
 logs = [
 [('default_host', f'Error fetching the logs. Try number 
{try_number} is invalid.')],
 ]
-return logs
+return logs, [{'end_of_log': True}]
 else:
 try_numbers = [try_number]
 
diff --git a/tests/utils/test_log_handlers.py b/tests/utils/test_log_handlers.py
index 76115a2..fad5f8b 100644
--- a/tests/utils/test_log_handlers.py
+++ b/tests/utils/test_log_handlers.py
@@ -63,6 +63,54 @@ class TestFileTaskLogHandler(unittest.TestCase):
 handler = handlers[0]
 assert handler.name == FILE_TASK_HANDLER
 
+def test_file_task_handler_when_ti_value_is_invalid(self):
+def task_callable(ti, **kwargs):
+ti.log.info("test")
+
+dag = DAG('dag_for_testing_file_task_handler', start_date=DEFAULT_DATE)
+dag.create_dagrun(run_type=DagRunType.MANUAL, state=State.RUNNING, 
execution_date=DEFAULT_DATE)
+task = PythonOperator(
+task_id='task_for_testing_file_log_handler',
+dag=dag,
+python_callable=task_callable,
+)
+ti = TaskInstance(task=task, execution_date=DEFAULT_DATE)
+
+logger = ti.log
+ti.log.disabled = False
+
+file_handler = next(
+(handler for handler in logger.handlers if handler.name == 
FILE_TASK_HANDLER), None
+)
+assert file_handler is not None
+
+set_context(logger, ti)
+assert file_handler.handler is not None
+# We expect set_context generates a file locally.
+log_filename = file_handler.handler.baseFilename
+assert os.path.isfile(log_filename)
+assert log_filename.endswith("1.log"), log_filename
+
+ti.run(ignore_ti_state=True)
+
+file_handler.flush()
+file_handler.close()
+
+assert hasattr(file_handler, 'read')
+# Return value of read must be a tuple of list and list.
+# passing invalid `try_number` to read function
+logs, metadatas = file_handler.read(ti, 0)
+assert isinstance(logs, list)
+assert isinstance(metadatas, list)
+assert len(logs) == 1
+assert len(logs) == len(metadatas)
+assert isinstance(metadatas[0], dict)
+assert logs[0][0][0] == "default_host"
+assert logs[0][0][1] == "Error fetching the logs. Try number 0 is 
invalid."
+
+# Remove the generated tmp log file.
+os.remove(log_filename)
+
 def test_file_task_handler(self):
 def task_callable(ti, **kwargs):
 ti.log.info("test")



[airflow] branch master updated: Add Google Cloud Workflows Operators (#13366)

2021-01-28 Thread turbaszek
This is an automated email from the ASF dual-hosted git repository.

turbaszek pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/airflow.git


The following commit(s) were added to refs/heads/master by this push:
 new 6d6588f  Add Google Cloud Workflows Operators (#13366)
6d6588f is described below

commit 6d6588fe2b8bb5fa33e930646d963df3e0530f23
Author: Tomek Urbaszek 
AuthorDate: Thu Jan 28 20:35:09 2021 +0100

Add Google Cloud Workflows Operators (#13366)

Add Google Cloud Workflows Operators, system test, example and sensor

Co-authored-by: Tobiasz Kędzierski 
---
 .../google/cloud/example_dags/example_workflows.py | 197 ++
 airflow/providers/google/cloud/hooks/workflows.py  | 401 
 .../providers/google/cloud/operators/workflows.py  | 714 +
 .../providers/google/cloud/sensors/workflows.py| 123 
 airflow/providers/google/provider.yaml |  14 +
 .../operators/cloud/workflows.rst  | 185 ++
 setup.py   |   2 +
 .../providers/google/cloud/hooks/test_workflows.py | 256 
 .../google/cloud/operators/test_workflows.py   | 383 +++
 .../cloud/operators/test_workflows_system.py   |  29 +
 .../google/cloud/sensors/test_workflows.py | 108 
 .../google/cloud/utils/gcp_authenticator.py|   1 +
 12 files changed, 2413 insertions(+)

diff --git a/airflow/providers/google/cloud/example_dags/example_workflows.py 
b/airflow/providers/google/cloud/example_dags/example_workflows.py
new file mode 100644
index 000..0fab435
--- /dev/null
+++ b/airflow/providers/google/cloud/example_dags/example_workflows.py
@@ -0,0 +1,197 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+import os
+
+from airflow import DAG
+from airflow.providers.google.cloud.operators.workflows import (
+WorkflowsCancelExecutionOperator,
+WorkflowsCreateExecutionOperator,
+WorkflowsCreateWorkflowOperator,
+WorkflowsDeleteWorkflowOperator,
+WorkflowsGetExecutionOperator,
+WorkflowsGetWorkflowOperator,
+WorkflowsListExecutionsOperator,
+WorkflowsListWorkflowsOperator,
+WorkflowsUpdateWorkflowOperator,
+)
+from airflow.providers.google.cloud.sensors.workflows import 
WorkflowExecutionSensor
+from airflow.utils.dates import days_ago
+
+LOCATION = os.environ.get("GCP_WORKFLOWS_LOCATION", "us-central1")
+PROJECT_ID = os.environ.get("GCP_PROJECT_ID", "an-id")
+
+WORKFLOW_ID = os.getenv("GCP_WORKFLOWS_WORKFLOW_ID", "airflow-test-workflow")
+
+# [START how_to_define_workflow]
+WORKFLOW_CONTENT = """
+- getCurrentTime:
+call: http.get
+args:
+url: https://us-central1-workflowsample.cloudfunctions.net/datetime
+result: currentTime
+- readWikipedia:
+call: http.get
+args:
+url: https://en.wikipedia.org/w/api.php
+query:
+action: opensearch
+search: ${currentTime.body.dayOfTheWeek}
+result: wikiResult
+- returnResult:
+return: ${wikiResult.body[1]}
+"""
+
+WORKFLOW = {
+"description": "Test workflow",
+"labels": {"airflow-version": "dev"},
+"source_contents": WORKFLOW_CONTENT,
+}
+# [END how_to_define_workflow]
+
+EXECUTION = {"argument": ""}
+
+SLEEP_WORKFLOW_ID = os.getenv("GCP_WORKFLOWS_SLEEP_WORKFLOW_ID", 
"sleep_workflow")
+SLEEP_WORKFLOW_CONTENT = """
+- someSleep:
+call: sys.sleep
+args:
+seconds: 120
+"""
+
+SLEEP_WORKFLOW = {
+"description": "Test workflow",
+"labels": {"airflow-version": "dev"},
+"source_contents": SLEEP_WORKFLOW_CONTENT,
+}
+
+
+with DAG("example_cloud_workflows", start_date=days_ago(1), 
schedule_interval=None) as dag:
+# [START how_to_create_workflow]
+create_workflow = WorkflowsCreateWorkflowOperator(
+task_id="create_workflow",
+location=LOCATION,
+project_id=PROJECT_ID,
+wo

[airflow] branch master updated: Fix and improve GCP BigTable hook and system test (#13896)

2021-01-27 Thread turbaszek
This is an automated email from the ASF dual-hosted git repository.

turbaszek pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/airflow.git


The following commit(s) were added to refs/heads/master by this push:
 new 810c15e  Fix and improve GCP BigTable hook and system test (#13896)
810c15e is described below

commit 810c15ed85d7bcde8d5b8bc44e1cbd4859e29d2e
Author: Tobiasz Kędzierski 
AuthorDate: Wed Jan 27 12:25:40 2021 +0100

Fix and improve GCP BigTable hook and system test (#13896)

Improve environment variables in GCP BigTable system test.
It will help to parametrize system tests.
---
 .../google/cloud/example_dags/example_bigtable.py  | 32 ++--
 airflow/providers/google/cloud/hooks/bigtable.py   |  9 +++-
 .../providers/google/cloud/hooks/test_bigtable.py  | 58 --
 .../google/cloud/operators/test_bigtable_system.py |  7 +--
 4 files changed, 81 insertions(+), 25 deletions(-)

diff --git a/airflow/providers/google/cloud/example_dags/example_bigtable.py 
b/airflow/providers/google/cloud/example_dags/example_bigtable.py
index ce852ed..fc62cdf 100644
--- a/airflow/providers/google/cloud/example_dags/example_bigtable.py
+++ b/airflow/providers/google/cloud/example_dags/example_bigtable.py
@@ -60,22 +60,22 @@ from airflow.providers.google.cloud.sensors.bigtable import 
BigtableTableReplica
 from airflow.utils.dates import days_ago
 
 GCP_PROJECT_ID = getenv('GCP_PROJECT_ID', 'example-project')
-CBT_INSTANCE_ID = getenv('CBT_INSTANCE_ID', 'some-instance-id')
-CBT_INSTANCE_DISPLAY_NAME = getenv('CBT_INSTANCE_DISPLAY_NAME', 
'Human-readable name')
+CBT_INSTANCE_ID = getenv('GCP_BIG_TABLE_INSTANCE_ID', 'some-instance-id')
+CBT_INSTANCE_DISPLAY_NAME = getenv('GCP_BIG_TABLE_INSTANCE_DISPLAY_NAME', 
'Human-readable name')
 CBT_INSTANCE_DISPLAY_NAME_UPDATED = getenv(
-"CBT_INSTANCE_DISPLAY_NAME_UPDATED", "Human-readable name - updated"
+"GCP_BIG_TABLE_INSTANCE_DISPLAY_NAME_UPDATED", 
f"{CBT_INSTANCE_DISPLAY_NAME} - updated"
 )
-CBT_INSTANCE_TYPE = getenv('CBT_INSTANCE_TYPE', '2')
-CBT_INSTANCE_TYPE_PROD = getenv('CBT_INSTANCE_TYPE_PROD', '1')
-CBT_INSTANCE_LABELS = getenv('CBT_INSTANCE_LABELS', '{}')
-CBT_INSTANCE_LABELS_UPDATED = getenv('CBT_INSTANCE_LABELS', '{"env": "prod"}')
-CBT_CLUSTER_ID = getenv('CBT_CLUSTER_ID', 'some-cluster-id')
-CBT_CLUSTER_ZONE = getenv('CBT_CLUSTER_ZONE', 'europe-west1-b')
-CBT_CLUSTER_NODES = getenv('CBT_CLUSTER_NODES', '3')
-CBT_CLUSTER_NODES_UPDATED = getenv('CBT_CLUSTER_NODES_UPDATED', '5')
-CBT_CLUSTER_STORAGE_TYPE = getenv('CBT_CLUSTER_STORAGE_TYPE', '2')
-CBT_TABLE_ID = getenv('CBT_TABLE_ID', 'some-table-id')
-CBT_POKE_INTERVAL = getenv('CBT_POKE_INTERVAL', '60')
+CBT_INSTANCE_TYPE = getenv('GCP_BIG_TABLE_INSTANCE_TYPE', '2')
+CBT_INSTANCE_TYPE_PROD = getenv('GCP_BIG_TABLE_INSTANCE_TYPE_PROD', '1')
+CBT_INSTANCE_LABELS = getenv('GCP_BIG_TABLE_INSTANCE_LABELS', '{}')
+CBT_INSTANCE_LABELS_UPDATED = getenv('GCP_BIG_TABLE_INSTANCE_LABELS_UPDATED', 
'{"env": "prod"}')
+CBT_CLUSTER_ID = getenv('GCP_BIG_TABLE_CLUSTER_ID', 'some-cluster-id')
+CBT_CLUSTER_ZONE = getenv('GCP_BIG_TABLE_CLUSTER_ZONE', 'europe-west1-b')
+CBT_CLUSTER_NODES = getenv('GCP_BIG_TABLE_CLUSTER_NODES', '3')
+CBT_CLUSTER_NODES_UPDATED = getenv('GCP_BIG_TABLE_CLUSTER_NODES_UPDATED', '5')
+CBT_CLUSTER_STORAGE_TYPE = getenv('GCP_BIG_TABLE_CLUSTER_STORAGE_TYPE', '2')
+CBT_TABLE_ID = getenv('GCP_BIG_TABLE_TABLE_ID', 'some-table-id')
+CBT_POKE_INTERVAL = getenv('GCP_BIG_TABLE_POKE_INTERVAL', '60')
 
 
 with models.DAG(
@@ -93,8 +93,8 @@ with models.DAG(
 instance_display_name=CBT_INSTANCE_DISPLAY_NAME,
 instance_type=int(CBT_INSTANCE_TYPE),
 instance_labels=json.loads(CBT_INSTANCE_LABELS),
-cluster_nodes=int(CBT_CLUSTER_NODES),
-cluster_storage_type=CBT_CLUSTER_STORAGE_TYPE,
+cluster_nodes=None,
+cluster_storage_type=int(CBT_CLUSTER_STORAGE_TYPE),
 task_id='create_instance_task',
 )
 create_instance_task2 = BigtableCreateInstanceOperator(
diff --git a/airflow/providers/google/cloud/hooks/bigtable.py 
b/airflow/providers/google/cloud/hooks/bigtable.py
index c5a2fa1..60e309d 100644
--- a/airflow/providers/google/cloud/hooks/bigtable.py
+++ b/airflow/providers/google/cloud/hooks/bigtable.py
@@ -169,7 +169,14 @@ class BigtableHook(GoogleBaseHook):
 instance_labels,
 )
 
-clusters = [instance.cluster(main_cluster_id, main_cluster_zone, 
cluster_nodes, cluster_storage_type)]
+cluster_kwargs = dict(
+cluster_id=main_cluster_id,
+location_id=main_cluster_zone,
+default_storage_type=cluster_storage_type,
+)
+if instance_type != enums.Instance.Type.DEVELOPMENT and cluster_nodes:
+cluster_kwargs["serve_nodes"] = cluster_nodes
+clusters = [instance.cluster(**cluster_kw

[airflow] branch master updated (7f4c88c -> 6616617)

2021-01-27 Thread turbaszek
This is an automated email from the ASF dual-hosted git repository.

turbaszek pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/airflow.git.


from 7f4c88c  Fix docker-compose command to initialize the environment 
(#13914)
 add 6616617  Add env variables to PubSub example dag (#13794)

No new revisions were added by this update.

Summary of changes:
 airflow/providers/google/cloud/example_dags/example_pubsub.py | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)



[airflow] branch master updated (adf7755 -> d0ab7f6)

2021-01-25 Thread turbaszek
This is an automated email from the ASF dual-hosted git repository.

turbaszek pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/airflow.git.


from adf7755  Add extra field to get_connnection REST endpoint (#13885)
 add d0ab7f6  Add ExasolToS3Operator (#13847)

No new revisions were added by this update.

Summary of changes:
 CONTRIBUTING.rst   |   2 +-
 .../providers/amazon/aws/transfers/exasol_to_s3.py | 117 +
 airflow/providers/amazon/provider.yaml |   3 +
 airflow/providers/dependencies.json|   1 +
 airflow/providers/exasol/hooks/exasol.py   |  33 +-
 .../amazon/aws/transfers/test_exasol_to_s3.py  |  68 
 tests/providers/exasol/hooks/test_exasol.py|  20 
 7 files changed, 242 insertions(+), 2 deletions(-)
 create mode 100644 airflow/providers/amazon/aws/transfers/exasol_to_s3.py
 create mode 100644 tests/providers/amazon/aws/transfers/test_exasol_to_s3.py
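
A minimal sketch of the new transfer operator, assuming it takes query_or_table, key and bucket_name as in this commit; the query, key and bucket are placeholders:

    from airflow.providers.amazon.aws.transfers.exasol_to_s3 import ExasolToS3Operator

    export_orders = ExasolToS3Operator(
        task_id="export_orders",
        query_or_table="SELECT * FROM orders WHERE order_date = '{{ ds }}'",  # assumed parameter name
        key="exports/{{ ds }}/orders.csv",  # S3 object key (placeholder)
        bucket_name="my-example-bucket",    # placeholder bucket
    )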



[airflow] branch master updated (910ba25 -> f473ca7)

2021-01-24 Thread turbaszek
This is an automated email from the ASF dual-hosted git repository.

turbaszek pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/airflow.git.


from 910ba25  Fix spellings (#13867)
 add f473ca7  Replace `google_cloud_storage_conn_id` by `gcp_conn_id` when 
using `GCSHook` (#13851)

No new revisions were added by this update.

Summary of changes:
 airflow/providers/amazon/aws/transfers/gcs_to_s3.py  |  2 +-
 airflow/providers/google/cloud/operators/bigquery.py |  4 ++--
 airflow/providers/google/cloud/operators/dataproc.py |  4 +---
 airflow/providers/google/cloud/operators/gcs.py  | 12 ++--
 airflow/providers/google/cloud/operators/text_to_speech.py   |  2 +-
 airflow/providers/google/cloud/sensors/gcs.py|  6 +++---
 airflow/providers/google/cloud/transfers/adls_to_gcs.py  |  2 +-
 .../google/cloud/transfers/azure_fileshare_to_gcs.py |  2 +-
 airflow/providers/google/cloud/transfers/cassandra_to_gcs.py |  2 +-
 airflow/providers/google/cloud/transfers/gcs_to_bigquery.py  |  2 +-
 airflow/providers/google/cloud/transfers/gcs_to_gcs.py   |  2 +-
 airflow/providers/google/cloud/transfers/gcs_to_local.py |  2 +-
 airflow/providers/google/cloud/transfers/local_to_gcs.py |  2 +-
 airflow/providers/google/cloud/transfers/s3_to_gcs.py|  2 +-
 .../google/marketing_platform/operators/campaign_manager.py  |  2 +-
 .../google/marketing_platform/operators/display_video.py |  2 +-
 airflow/providers/google/suite/transfers/gcs_to_gdrive.py|  2 +-
 tests/providers/google/cloud/operators/test_gcs.py   |  2 +-
 .../providers/google/cloud/operators/test_text_to_speech.py  |  2 +-
 tests/providers/google/cloud/sensors/test_gcs.py |  8 
 tests/providers/google/cloud/transfers/test_adls_to_gcs.py   |  2 +-
 .../google/cloud/transfers/test_azure_fileshare_to_gcs.py|  2 +-
 tests/providers/google/cloud/transfers/test_s3_to_gcs.py |  4 ++--
 .../marketing_platform/operators/test_campaign_manager.py|  2 +-
 .../marketing_platform/operators/test_display_video.py   |  2 +-
 tests/providers/google/suite/transfers/test_gcs_to_gdrive.py |  6 +++---
 26 files changed, 40 insertions(+), 42 deletions(-)
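
A minimal sketch of the keyword this change standardizes on; the operator choice and values are placeholders:

    from airflow.providers.google.cloud.transfers.local_to_gcs import LocalFilesystemToGCSOperator

    upload_report = LocalFilesystemToGCSOperator(
        task_id="upload_report",
        src="/tmp/report.csv",               # placeholder local file
        dst="reports/report.csv",            # placeholder object name
        bucket="my-example-bucket",          # placeholder bucket
        gcp_conn_id="google_cloud_default",  # the standardized keyword; older code sometimes used google_cloud_storage_conn_id
    )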



[airflow] branch master updated: Refactor DataprocOperators to support google-cloud-dataproc 2.0 (#13256)

2021-01-18 Thread turbaszek
This is an automated email from the ASF dual-hosted git repository.

turbaszek pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/airflow.git


The following commit(s) were added to refs/heads/master by this push:
 new 309788e  Refactor DataprocOperators to support google-cloud-dataproc 
2.0 (#13256)
309788e is described below

commit 309788e5e2023c598095a4ee00df417d94b6a5df
Author: Tomek Urbaszek 
AuthorDate: Mon Jan 18 17:49:19 2021 +0100

Refactor DataprocOperators to support google-cloud-dataproc 2.0 (#13256)
---
 airflow/providers/google/ADDITIONAL_INFO.md|   2 +
 airflow/providers/google/cloud/hooks/dataproc.py   | 104 -
 .../providers/google/cloud/operators/dataproc.py   |  30 +++--
 airflow/providers/google/cloud/sensors/dataproc.py |  12 +-
 setup.py   |   2 +-
 .../providers/google/cloud/hooks/test_dataproc.py  | 129 -
 .../google/cloud/operators/test_dataproc.py|  14 ++-
 .../google/cloud/sensors/test_dataproc.py  |   8 +-
 8 files changed, 157 insertions(+), 144 deletions(-)

diff --git a/airflow/providers/google/ADDITIONAL_INFO.md 
b/airflow/providers/google/ADDITIONAL_INFO.md
index c696e1b..16a6683 100644
--- a/airflow/providers/google/ADDITIONAL_INFO.md
+++ b/airflow/providers/google/ADDITIONAL_INFO.md
@@ -32,11 +32,13 @@ Details are covered in the UPDATING.md files for each 
library, but there are som
 | [``google-cloud-automl``](https://pypi.org/project/google-cloud-automl/) | 
``>=0.4.0,<2.0.0`` | ``>=2.1.0,<3.0.0``  | 
[`UPGRADING.md`](https://github.com/googleapis/python-bigquery-automl/blob/master/UPGRADING.md)
 |
 | 
[``google-cloud-bigquery-datatransfer``](https://pypi.org/project/google-cloud-bigquery-datatransfer/)
 | ``>=0.4.0,<2.0.0`` | ``>=3.0.0,<4.0.0``  | 
[`UPGRADING.md`](https://github.com/googleapis/python-bigquery-datatransfer/blob/master/UPGRADING.md)
 |
 | 
[``google-cloud-datacatalog``](https://pypi.org/project/google-cloud-datacatalog/)
 | ``>=0.5.0,<0.8`` | ``>=3.0.0,<4.0.0``  | 
[`UPGRADING.md`](https://github.com/googleapis/python-datacatalog/blob/master/UPGRADING.md)
 |
+| [``google-cloud-dataproc``](https://pypi.org/project/google-cloud-dataproc/) 
| ``>=1.0.1,<2.0.0`` | ``>=2.2.0,<3.0.0``  | 
[`UPGRADING.md`](https://github.com/googleapis/python-dataproc/blob/master/UPGRADING.md)
 |
 | [``google-cloud-kms``](https://pypi.org/project/google-cloud-os-login/) | 
``>=1.2.1,<2.0.0`` | ``>=2.0.0,<3.0.0``  | 
[`UPGRADING.md`](https://github.com/googleapis/python-kms/blob/master/UPGRADING.md)
 |
 | [``google-cloud-os-login``](https://pypi.org/project/google-cloud-os-login/) 
| ``>=1.0.0,<2.0.0`` | ``>=2.0.0,<3.0.0``  | 
[`UPGRADING.md`](https://github.com/googleapis/python-oslogin/blob/master/UPGRADING.md)
 |
 | [``google-cloud-pubsub``](https://pypi.org/project/google-cloud-pubsub/) | 
``>=1.0.0,<2.0.0`` | ``>=2.0.0,<3.0.0``  | 
[`UPGRADING.md`](https://github.com/googleapis/python-pubsub/blob/master/UPGRADING.md)
 |
 | [``google-cloud-tasks``](https://pypi.org/project/google-cloud-tasks/) | 
``>=1.2.1,<2.0.0`` | ``>=2.0.0,<3.0.0``  | 
[`UPGRADING.md`](https://github.com/googleapis/python-tasks/blob/master/UPGRADING.md)
 |
 
+
 ### The field names use the snake_case convention
 
 If your DAG uses an object from the above mentioned libraries passed by XCom, 
it is necessary to update the naming convention of the fields that are read. 
Previously, the fields used the CamelSnake convention, now the snake_case 
convention is used.
diff --git a/airflow/providers/google/cloud/hooks/dataproc.py 
b/airflow/providers/google/cloud/hooks/dataproc.py
index 12d5941..35d4786 100644
--- a/airflow/providers/google/cloud/hooks/dataproc.py
+++ b/airflow/providers/google/cloud/hooks/dataproc.py
@@ -26,18 +26,16 @@ from typing import Any, Dict, Iterable, List, Optional, 
Sequence, Tuple, Union
 from google.api_core.exceptions import ServerError
 from google.api_core.retry import Retry
 from google.cloud.dataproc_v1beta2 import (  # pylint: 
disable=no-name-in-module
-ClusterControllerClient,
-JobControllerClient,
-WorkflowTemplateServiceClient,
-)
-from google.cloud.dataproc_v1beta2.types import (  # pylint: 
disable=no-name-in-module
 Cluster,
-Duration,
-FieldMask,
+ClusterControllerClient,
 Job,
+JobControllerClient,
 JobStatus,
 WorkflowTemplate,
+WorkflowTemplateServiceClient,
 )
+from google.protobuf.duration_pb2 import Duration
+from google.protobuf.field_mask_pb2 import FieldMask
 
 from airflow.exceptions import AirflowException
 from airflow.providers.google.common.hooks.base_google import GoogleBaseHook
@@ -291,10 +289,12 @@ class DataprocHook(GoogleBaseHook):
 
 client = self.get_cluster_client(location=region)
 result = client.create_cluster(
-project_id=project_

[airflow] branch master updated: Change render to render_template in plugins.rst (#13560)

2021-01-08 Thread turbaszek
This is an automated email from the ASF dual-hosted git repository.

turbaszek pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/airflow.git


The following commit(s) were added to refs/heads/master by this push:
 new 71bb9f2  Change render to render_template in plugins.rst (#13560)
71bb9f2 is described below

commit 71bb9f298b62b84158ddce62bc06ca9c5311c1da
Author: Muhammad Aqeel 
AuthorDate: Fri Jan 8 17:26:05 2021 +0500

Change render to render_template in plugins.rst (#13560)

Changing render to render_template as BaseView object has no attribute 
'render'.
---
 docs/apache-airflow/plugins.rst | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/docs/apache-airflow/plugins.rst b/docs/apache-airflow/plugins.rst
index 96c4719..80708b9 100644
--- a/docs/apache-airflow/plugins.rst
+++ b/docs/apache-airflow/plugins.rst
@@ -198,7 +198,7 @@ definitions in Airflow.
 
 @expose("/")
 def test(self):
-return self.render("test_plugin/test.html", content="Hello 
galaxy!")
+return self.render_template("test_plugin/test.html", 
content="Hello galaxy!")
 
 v_appbuilder_view = TestAppBuilderBaseView()
 v_appbuilder_package = {"name": "Test View",



[airflow] branch master updated (feb8405 -> 43b2d33)

2021-01-04 Thread turbaszek
This is an automated email from the ASF dual-hosted git repository.

turbaszek pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/airflow.git.


from feb8405  Support google-cloud-datacatalog 3.0.0 (#13224)
 add 43b2d33  Log migrations info in consistent way (#13458)

No new revisions were added by this update.

Summary of changes:
 .../migrations/versions/2c6edca13270_resource_based_permissions.py   | 5 +
 1 file changed, 5 insertions(+)



[airflow] branch master updated (35e4a3b -> 3a3e739)

2021-01-03 Thread turbaszek
This is an automated email from the ASF dual-hosted git repository.

turbaszek pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/airflow.git.


from 35e4a3b  GitHub PROD image build is pushed to GitHub Registry. (#13442)
 add 3a3e739  Fix insert_all method of BigQueryHook to support tables 
without schema (#13138)

No new revisions were added by this update.

Summary of changes:
 airflow/providers/google/cloud/hooks/bigquery.py| 10 +-
 tests/providers/google/cloud/hooks/test_bigquery.py |  7 +++
 2 files changed, 8 insertions(+), 9 deletions(-)



[airflow] branch master updated (e436883 -> f7a1334)

2021-01-02 Thread turbaszek
This is an automated email from the ASF dual-hosted git repository.

turbaszek pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/airflow.git.


from e436883  Removes pip download when installing from local packages 
(#13422)
 add f7a1334  Add 'mongo_collection' to template_fields in 
MongoToS3Operator (#13361)

No new revisions were added by this update.

Summary of changes:
 airflow/providers/amazon/aws/transfers/mongo_to_s3.py| 2 +-
 tests/providers/amazon/aws/transfers/test_mongo_to_s3.py | 2 +-
 2 files changed, 2 insertions(+), 2 deletions(-)



[airflow] branch master updated (04ec45f -> 13a9747)

2020-12-28 Thread turbaszek
This is an automated email from the ASF dual-hosted git repository.

turbaszek pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/airflow.git.


from 04ec45f  Add DataprocCreateWorkflowTemplateOperator (#13338)
 add 13a9747  Revert "Support google-cloud-tasks>=2.0.0 (#13334)" (#13341)

No new revisions were added by this update.

Summary of changes:
 airflow/providers/google/ADDITIONAL_INFO.md|   1 -
 airflow/providers/google/cloud/hooks/tasks.py  | 118 ++---
 airflow/providers/google/cloud/operators/tasks.py  |  39 +++
 setup.py   |   2 +-
 tests/providers/google/cloud/hooks/test_tasks.py   |  86 ---
 .../providers/google/cloud/operators/test_tasks.py |  65 ++--
 6 files changed, 136 insertions(+), 175 deletions(-)



[airflow] branch master updated (97eee35 -> f95b1c9)

2020-12-21 Thread turbaszek
This is an automated email from the ASF dual-hosted git repository.

turbaszek pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/airflow.git.


from 97eee35  Fix typos and minor simplification in TESTING.rst (#13194)
 add f95b1c9  Add regional support to dataproc workflow template operators 
(#12907)

No new revisions were added by this update.

Summary of changes:
 airflow/providers/google/cloud/hooks/dataproc.py   | 16 +
 .../providers/google/cloud/hooks/test_dataproc.py  | 39 +++---
 2 files changed, 36 insertions(+), 19 deletions(-)



[airflow-site] branch master updated: Update _index.md (#358)

2020-12-17 Thread turbaszek
This is an automated email from the ASF dual-hosted git repository.

turbaszek pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/airflow-site.git


The following commit(s) were added to refs/heads/master by this push:
 new 0aabf3f  Update _index.md (#358)
0aabf3f is described below

commit 0aabf3f5df94ef0edbfc72e045dc0c1654fd46a2
Author: Flávio de Assis <34079342+flavio-as...@users.noreply.github.com>
AuthorDate: Thu Dec 17 17:15:00 2020 -0300

Update _index.md (#358)

Fix Typo in `Oracle provider`
---
 landing-pages/site/content/en/docs/_index.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/landing-pages/site/content/en/docs/_index.md 
b/landing-pages/site/content/en/docs/_index.md
index 52959d6..eed5525 100644
--- a/landing-pages/site/content/en/docs/_index.md
+++ b/landing-pages/site/content/en/docs/_index.md
@@ -97,7 +97,7 @@ Providers packages include integrations with third party 
integrations. They are
 
   Opsgenie
 
-  Orcle
+  Oracle
 
   Pagerduty
 



[airflow] branch master updated (4b67b0b -> 12bde09)

2020-12-15 Thread turbaszek
This is an automated email from the ASF dual-hosted git repository.

turbaszek pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/airflow.git.


from 4b67b0b  Remove inapplicable arg 'output' for CLI pools import/export 
(#13071)
 add 12bde09  Add question about PR to feature request template (#13087)

No new revisions were added by this update.

Summary of changes:
 .github/ISSUE_TEMPLATE/feature_request.md | 4 
 1 file changed, 4 insertions(+)



[airflow] branch master updated (26c6854 -> 6bf9acb)

2020-12-14 Thread turbaszek
This is an automated email from the ASF dual-hosted git repository.

turbaszek pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/airflow.git.


from 26c6854  Allows to install Airflow in Breeze from PIP with 
configurable extras (#13055)
 add 6bf9acb  Fix import from core to mysql provider in mysql example DAG 
(#13060)

No new revisions were added by this update.

Summary of changes:
 airflow/providers/mysql/example_dags/example_mysql.py | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)



[airflow] branch master updated (4d3300c -> 1c1ef7e)

2020-12-14 Thread turbaszek
This is an automated email from the ASF dual-hosted git repository.

turbaszek pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/airflow.git.


from 4d3300c  Refactor plugins command output using AirflowConsole (#13036)
 add 1c1ef7e  Add project_id to client inside BigQuery hook update_table 
method (#13018)

No new revisions were added by this update.

Summary of changes:
 airflow/providers/google/cloud/hooks/bigquery.py   |  2 +-
 .../providers/google/cloud/hooks/test_bigquery.py  | 52 ++
 2 files changed, 53 insertions(+), 1 deletion(-)



[airflow] branch v1-10-stable updated: Add possibility to check if upgrade check rule applies (#12981)

2020-12-10 Thread turbaszek
This is an automated email from the ASF dual-hosted git repository.

turbaszek pushed a commit to branch v1-10-stable
in repository https://gitbox.apache.org/repos/asf/airflow.git


The following commit(s) were added to refs/heads/v1-10-stable by this push:
 new f666da6  Add possibility to check if upgrade check rule applies 
(#12981)
f666da6 is described below

commit f666da6123494422d4b6514ba7b474f862b4d3a2
Author: Tomek Urbaszek 
AuthorDate: Thu Dec 10 22:51:33 2020 +

Add possibility to check if upgrade check rule applies (#12981)

closes: #12897
---
 airflow/upgrade/formatters.py   | 30 -
 airflow/upgrade/problem.py  | 10 ++---
 airflow/upgrade/rules/base_rule.py  |  7 ++
 airflow/upgrade/rules/pod_template_file_rule.py |  5 +
 tests/upgrade/test_formattes.py | 15 -
 tests/upgrade/test_problem.py   |  7 +++---
 6 files changed, 56 insertions(+), 18 deletions(-)

diff --git a/airflow/upgrade/formatters.py b/airflow/upgrade/formatters.py
index daeb37f..4b2c5c3 100644
--- a/airflow/upgrade/formatters.py
+++ b/airflow/upgrade/formatters.py
@@ -74,20 +74,23 @@ class ConsoleFormatter(BaseFormatter):
 @staticmethod
 def display_recommendations(rule_statuses):
 for rule_status in rule_statuses:
+# Show recommendations only if there are any messaged
 if not rule_status.messages:
 continue
 
-# Show recommendations only if there are any messaged
 rule = rule_status.rule
-lines = [
-rule.title,
-"-" * len(rule.title),
-rule.description,
-"",
-"Problems:",
-"",
-]
-lines.extend(['{:>3}.  {}'.format(i, m) for i, m in 
enumerate(rule_status.messages, 1)])
+lines = [rule.title, "-" * len(rule.title)]
+if rule_status.skipped:
+lines.extend([rule_status.messages[0]])
+else:
+if rule.description:
+lines.extend([rule.description])
+lines.extend([
+"",
+"Problems:",
+"",
+])
+lines.extend(['{:>3}.  {}'.format(i, m) for i, m in 
enumerate(rule_status.messages, 1)])
 msg = "\n".join(lines)
 
 formatted_msg = pygments.highlight(
@@ -96,7 +99,12 @@ class ConsoleFormatter(BaseFormatter):
 print(formatted_msg)
 
 def on_next_rule_status(self, rule_status):
-status = colorize("green", "SUCCESS") if rule_status.is_success else 
colorize("red", "FAIL")
+if rule_status.skipped:
+status = colorize("yellow", "SKIPPED")
+elif rule_status.is_success:
+status = colorize("green", "SUCCESS")
+else:
+status = colorize("red", "FAIL")
 status_line_fmt = self.prepare_status_line_format()
 print(status_line_fmt.format(rule_status.rule.title, status))
 
diff --git a/airflow/upgrade/problem.py b/airflow/upgrade/problem.py
index 38ee0fa..e26b020 100644
--- a/airflow/upgrade/problem.py
+++ b/airflow/upgrade/problem.py
@@ -25,10 +25,10 @@ class RuleStatus(NamedTuple(
 'RuleStatus',
 [
 ('rule', BaseRule),
-('messages', List[str])
+('messages', List[str]),
+('skipped', bool)
 ]
 )):
-
 @property
 def is_success(self):
 return len(self.messages) == 0
@@ -36,10 +36,14 @@ class RuleStatus(NamedTuple(
 @classmethod
 def from_rule(cls, rule):
 # type: (BaseRule) -> RuleStatus
+msg = rule.should_skip()
+if msg:
+return cls(rule=rule, messages=[msg], skipped=True)
+
 messages = []  # type: List[str]
 result = rule.check()
 if isinstance(result, str):
 messages = [result]
 elif isinstance(result, Iterable):
 messages = list(result)
-return cls(rule=rule, messages=messages)
+return cls(rule=rule, messages=messages, skipped=False)
diff --git a/airflow/upgrade/rules/base_rule.py 
b/airflow/upgrade/rules/base_rule.py
index c80ec77..8cf0e0f 100644
--- a/airflow/upgrade/rules/base_rule.py
+++ b/airflow/upgrade/rules/base_rule.py
@@ -33,5 +33,12 @@ class BaseRule(object):
 """A long description explaining the problem in detail. This can be an 
entry from UPDATING.md file."""
 pass
 
+def should_skip(self):
+"""
+Executes a pre check of configuration. If returned value is
+True then the checking the rule is omitted.
+""&

[airflow] branch master updated (3ff5a35 -> 348ceb1)

2020-12-05 Thread turbaszek
This is an automated email from the ASF dual-hosted git repository.

turbaszek pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/airflow.git.


from 3ff5a35  Add paused column to `dags list` sub-command (#12830)
 add 348ceb1  Add Qliro to INTHEWILD.md (#12833)

No new revisions were added by this update.

Summary of changes:
 INTHEWILD.md | 1 +
 1 file changed, 1 insertion(+)



[airflow] branch master updated: Improve error handling in cli and introduce consistency (#12764)

2020-12-04 Thread turbaszek
This is an automated email from the ASF dual-hosted git repository.

turbaszek pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/airflow.git


The following commit(s) were added to refs/heads/master by this push:
 new 1bd98cd  Improve error handling in cli and introduce consistency 
(#12764)
1bd98cd is described below

commit 1bd98cd54ca46af86465d049e7a1295951413661
Author: Tomek Urbaszek 
AuthorDate: Fri Dec 4 09:41:41 2020 +

Improve error handling in cli and introduce consistency (#12764)

This PR is a follow-up to #12375 and #12704. It improves handling
of some errors in CLI commands to avoid showing users too much traceback
and uses SystemExit consistently.
---
 airflow/cli/commands/celery_command.py|  5 ++-
 airflow/cli/commands/config_command.py|  7 ++---
 airflow/cli/commands/connection_command.py| 44 +++
 airflow/cli/commands/dag_command.py   |  6 ++--
 airflow/cli/commands/db_command.py|  2 +-
 airflow/cli/commands/kubernetes_command.py|  2 +-
 airflow/cli/commands/pool_command.py  | 20 ++--
 airflow/cli/commands/role_command.py  |  2 ++
 airflow/cli/commands/user_command.py  | 36 +-
 airflow/cli/commands/variable_command.py  | 11 ---
 tests/cli/commands/test_celery_command.py |  2 +-
 tests/cli/commands/test_config_command.py | 12 +++-
 tests/cli/commands/test_connection_command.py | 14 -
 13 files changed, 71 insertions(+), 92 deletions(-)

diff --git a/airflow/cli/commands/celery_command.py 
b/airflow/cli/commands/celery_command.py
index 8276895..60b0f88 100644
--- a/airflow/cli/commands/celery_command.py
+++ b/airflow/cli/commands/celery_command.py
@@ -16,7 +16,7 @@
 # specific language governing permissions and limitations
 # under the License.
 """Celery command"""
-import sys
+
 from multiprocessing import Process
 from typing import Optional
 
@@ -95,8 +95,7 @@ def _serve_logs(skip_serve_logs: bool = False) -> 
Optional[Process]:
 def worker(args):
 """Starts Airflow Celery worker"""
 if not settings.validate_session():
-print("Worker exiting... database connection precheck failed! ")
-sys.exit(1)
+raise SystemExit("Worker exiting, database connection precheck 
failed.")
 
 autoscale = args.autoscale
 skip_serve_logs = args.skip_serve_logs
diff --git a/airflow/cli/commands/config_command.py 
b/airflow/cli/commands/config_command.py
index 2eedfa5..1c2674f 100644
--- a/airflow/cli/commands/config_command.py
+++ b/airflow/cli/commands/config_command.py
@@ -16,7 +16,6 @@
 # under the License.
 """Config sub-commands"""
 import io
-import sys
 
 import pygments
 from pygments.lexers.configs import IniLexer
@@ -39,12 +38,10 @@ def show_config(args):
 def get_value(args):
 """Get one value from configuration"""
 if not conf.has_section(args.section):
-print(f'The section [{args.section}] is not found in config.', 
file=sys.stderr)
-sys.exit(1)
+raise SystemExit(f'The section [{args.section}] is not found in 
config.')
 
 if not conf.has_option(args.section, args.option):
-print(f'The option [{args.section}/{args.option}] is not found in 
config.', file=sys.stderr)
-sys.exit(1)
+raise SystemExit(f'The option [{args.section}/{args.option}] is not 
found in config.')
 
 value = conf.get(args.section, args.option)
 print(value)
diff --git a/airflow/cli/commands/connection_command.py 
b/airflow/cli/commands/connection_command.py
index dd96f9f..5426194 100644
--- a/airflow/cli/commands/connection_command.py
+++ b/airflow/cli/commands/connection_command.py
@@ -112,17 +112,13 @@ def _format_connections(conns: List[Connection], fmt: 
str) -> str:
 
 
 def _is_stdout(fileio: io.TextIOWrapper) -> bool:
-if fileio.name == '':
-return True
-return False
+return fileio.name == ''
 
 
 def _valid_uri(uri: str) -> bool:
 """Check if a URI is valid, by checking if both scheme and netloc are 
available"""
 uri_parts = urlparse(uri)
-if uri_parts.scheme == '' or uri_parts.netloc == '':
-return False
-return True
+return uri_parts.scheme != '' and uri_parts.netloc != ''
 
 
 def connections_export(args):
@@ -140,11 +136,10 @@ def connections_export(args):
 _, filetype = os.path.splitext(args.file.name)
 filetype = filetype.lower()
 if filetype not in allowed_formats:
-msg = (
-f"Unsupported file format. "
-f"The file must have the extension {', 
'.join(allowed_formats)}"
+raise SystemExit(
+f"Unsupported file format. 

[airflow] branch master updated: Change DEBUG color to green in coloured logger (#12784)

2020-12-03 Thread turbaszek
This is an automated email from the ASF dual-hosted git repository.

turbaszek pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/airflow.git


The following commit(s) were added to refs/heads/master by this push:
 new 56f82ba  Change DEBUG color to green in coloured logger (#12784)
56f82ba is described below

commit 56f82ba22519b0cf2cb0a1f7c4d083db7f2e3358
Author: Tomek Urbaszek 
AuthorDate: Thu Dec 3 15:07:43 2020 +

Change DEBUG color to green in coloured logger (#12784)

In this way it's easier to see the difference between debug and error.
---
 airflow/utils/log/colored_log.py | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/airflow/utils/log/colored_log.py b/airflow/utils/log/colored_log.py
index 5d4b0af..d1adb08 100644
--- a/airflow/utils/log/colored_log.py
+++ b/airflow/utils/log/colored_log.py
@@ -25,7 +25,7 @@ from colorlog import TTYColoredFormatter
 from colorlog.escape_codes import esc, escape_codes
 
 DEFAULT_COLORS = {
-"DEBUG": "red",
+"DEBUG": "green",
 "INFO": "",
 "WARNING": "yellow",
 "ERROR": "red",



[airflow] branch master updated (0400ee3 -> 67acdbd)

2020-12-02 Thread turbaszek
This is an automated email from the ASF dual-hosted git repository.

turbaszek pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/airflow.git.


from 0400ee3  Allow using _CMD / _SECRET to set `[webserver] secret_key` 
config (#12742)
 add 67acdbd  Remove store_serialized_dags from config (#12754)

No new revisions were added by this update.

Summary of changes:
 airflow/config_templates/config.yml  | 9 -
 airflow/config_templates/default_airflow.cfg | 5 -
 2 files changed, 14 deletions(-)



[airflow] branch master updated (ae0e8f4 -> cba8d62)

2020-12-02 Thread turbaszek
This is an automated email from the ASF dual-hosted git repository.

turbaszek pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/airflow.git.


from ae0e8f4  Move config item 'worker_precheck' from section [core] to 
[celery] (#12746)
 add cba8d62  Refactor list rendering in commands (#12704)

No new revisions were added by this update.

Summary of changes:
 UPDATING.md| 34 
 airflow/cli/cli_parser.py  | 17 ++--
 airflow/cli/commands/cheat_sheet_command.py|  2 +
 airflow/cli/commands/connection_command.py | 82 +++---
 airflow/cli/commands/dag_command.py| 98 --
 airflow/cli/commands/info_command.py   |  2 +
 airflow/cli/commands/plugins_command.py|  3 +
 airflow/cli/commands/pool_command.py   | 45 ++
 airflow/cli/commands/provider_command.py   | 83 --
 airflow/cli/commands/role_command.py   | 11 +--
 airflow/cli/commands/task_command.py   | 70 
 airflow/cli/commands/user_command.py   | 12 +--
 airflow/cli/commands/variable_command.py   |  5 +-
 airflow/cli/simple_table.py| 67 +++
 airflow/utils/cli.py   | 15 +++-
 docs/apache-airflow/usage-cli.rst  | 37 
 .../run_install_and_test_provider_packages.sh  |  4 +-
 tests/cli/commands/test_connection_command.py  | 31 +++
 tests/cli/commands/test_dag_command.py | 15 ++--
 tests/cli/commands/test_pool_command.py|  5 +-
 tests/cli/commands/test_role_command.py|  2 +-
 tests/cli/commands/test_task_command.py| 43 +-
 tests/cli/commands/test_user_command.py|  2 +-
 23 files changed, 413 insertions(+), 272 deletions(-)



[airflow] branch v1-10-stable updated (65a8dbc -> fb63fbc)

2020-11-30 Thread turbaszek
This is an automated email from the ASF dual-hosted git repository.

turbaszek pushed a change to branch v1-10-stable
in repository https://gitbox.apache.org/repos/asf/airflow.git.


from 65a8dbc  Fix test in PR 12663 (#12667)
 add fb63fbc  Add possibility to configure upgrade_check command (#12657)

No new revisions were added by this update.

Summary of changes:
 airflow/upgrade/README.md  |  20 
 airflow/upgrade/checker.py |  38 +++-
 airflow/upgrade/config.py  |  51 ++
 airflow/upgrade/rules/undefined_jinja_varaibles.py |  11 +--
 .../rules/test_undefined_jinja_varaibles.py|   2 +
 tests/upgrade/test_config.py   | 103 +
 tests/upgrade/test_formattes.py|   4 +-
 7 files changed, 213 insertions(+), 16 deletions(-)
 create mode 100644 airflow/upgrade/config.py
 create mode 100644 tests/upgrade/test_config.py



[airflow] branch master updated (02d9434 -> c9d1ea5)

2020-11-29 Thread turbaszek
This is an automated email from the ASF dual-hosted git repository.

turbaszek pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/airflow.git.


from 02d9434  Don't use time.time() or timezone.utcnow() for duration 
calculations (#12353)
 add c9d1ea5  Refactor airflow plugins command (#12697)

No new revisions were added by this update.

Summary of changes:
 airflow/cli/commands/plugins_command.py| 72 +++---
 tests/cli/commands/test_plugins_command.py | 16 +++
 2 files changed, 54 insertions(+), 34 deletions(-)



[airflow] branch master updated (0a1b434 -> 850b74b)

2020-11-28 Thread turbaszek
This is an automated email from the ASF dual-hosted git repository.

turbaszek pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/airflow.git.


from 0a1b434  Move production deployments tips to 
docs/production-deployment.rst (#12686)
 add 850b74b  Use rich to render info and cheat-sheet command (#12689)

No new revisions were added by this update.

Summary of changes:
 airflow/cli/commands/cheat_sheet_command.py|  44 ++--
 airflow/cli/commands/info_command.py   | 228 +
 .../simple_table.py}   |  44 ++--
 docs/modules_management.rst|  68 +++---
 tests/cli/commands/test_cheat_sheet_command.py |  26 +--
 tests/cli/commands/test_info_command.py|  46 +++--
 6 files changed, 209 insertions(+), 247 deletions(-)
 copy airflow/{contrib/operators/oracle_to_oracle_transfer.py => 
cli/simple_table.py} (50%)



[airflow] branch master updated (9a74ee5 -> 456a1c5)

2020-11-27 Thread turbaszek
This is an automated email from the ASF dual-hosted git repository.

turbaszek pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/airflow.git.


from 9a74ee5  Add 1.10.13 to CI, Breeze and Docs (#12652)
 add 456a1c5  Restructure the extras in setup.py and described them (#12548)

No new revisions were added by this update.

Summary of changes:
 .pre-commit-config.yaml|  2 +-
 CONTRIBUTING.rst   | 14 +-
 INSTALL| 14 +-
 UPDATING.md|  7 +
 UPGRADING_TO_2.0.md| 13 +++---
 dev/provider_packages/README.md|  2 +-
 docs/extra-packages-ref.rst| 12 ++---
 .../pre_commit_check_setup_extra_packages_ref.py   | 14 +++---
 .../in_container/run_prepare_provider_readme.sh|  2 +-
 .../run_test_package_import_all_classes.sh |  2 +-
 setup.py   | 30 ++
 11 files changed, 72 insertions(+), 40 deletions(-)



[airflow] branch master updated (c084393 -> e1ebfa6)

2020-11-27 Thread turbaszek
This is an automated email from the ASF dual-hosted git repository.

turbaszek pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/airflow.git.


from c084393  Allows mounting local sources for github run-id images 
(#12650)
 add e1ebfa6  Add DataflowJobMessagesSensor and 
DataflowAutoscalingEventsSensor (#12249)

No new revisions were added by this update.

Summary of changes:
 .../google/cloud/example_dags/example_dataflow.py  |  37 +++-
 airflow/providers/google/cloud/hooks/dataflow.py   | 118 +++-
 airflow/providers/google/cloud/sensors/dataflow.py | 198 +
 .../providers/google/cloud/hooks/test_dataflow.py  | 106 +++
 .../google/cloud/sensors/test_dataflow.py  | 171 +-
 5 files changed, 626 insertions(+), 4 deletions(-)



[airflow] branch master updated (74ed92b -> b57b932)

2020-11-24 Thread turbaszek
This is an automated email from the ASF dual-hosted git repository.

turbaszek pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/airflow.git.


from 74ed92b  Drop random.choice() in BaseHook.get_connection() (#12573)
 add b57b932  Improve code quality of ExternalTaskSensor (#12574)

No new revisions were added by this update.

Summary of changes:
 airflow/sensors/external_task_sensor.py | 81 -
 1 file changed, 39 insertions(+), 42 deletions(-)



[airflow] branch master updated (be8f1ac -> 370e7d0)

2020-11-21 Thread turbaszek
This is an automated email from the ASF dual-hosted git repository.

turbaszek pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/airflow.git.


from be8f1ac  Fix build on RTD (#12529)
 add 370e7d0  Fix Python Docstring parameters (#12513)

No new revisions were added by this update.

Summary of changes:
 airflow/providers/amazon/aws/hooks/datasync.py | 8 
 airflow/providers/apache/hdfs/hooks/webhdfs.py | 2 +-
 airflow/providers/http/hooks/http.py   | 2 +-
 3 files changed, 6 insertions(+), 6 deletions(-)



[airflow] branch v1-10-stable updated: Create UndefinedJinjaVariablesRule (Resolves #11144) (#11241)

2020-11-19 Thread turbaszek
This is an automated email from the ASF dual-hosted git repository.

turbaszek pushed a commit to branch v1-10-stable
in repository https://gitbox.apache.org/repos/asf/airflow.git


The following commit(s) were added to refs/heads/v1-10-stable by this push:
 new 18100a0  Create UndefinedJinjaVariablesRule (Resolves #11144) (#11241)
18100a0 is described below

commit 18100a0ec96692bb4d7c9e80f206b66a30c65e0d
Author: Ashmeet Lamba 
AuthorDate: Thu Nov 19 16:33:06 2020 +0530

Create UndefinedJinjaVariablesRule (Resolves #11144) (#11241)

Adding a rule to check for undefined jinja variables when upgrading to 
Airflow2.0
---
 airflow/models/dag.py  |   4 +-
 airflow/upgrade/rules/undefined_jinja_varaibles.py | 153 
 .../rules/test_undefined_jinja_varaibles.py| 192 +
 3 files changed, 347 insertions(+), 2 deletions(-)

diff --git a/airflow/models/dag.py b/airflow/models/dag.py
index 348e19d..a1908e3 100644
--- a/airflow/models/dag.py
+++ b/airflow/models/dag.py
@@ -224,7 +224,7 @@ class DAG(BaseDag, LoggingMixin):
 end_date=None,  # type: Optional[datetime]
 full_filepath=None,  # type: Optional[str]
 template_searchpath=None,  # type: Optional[Union[str, Iterable[str]]]
-template_undefined=jinja2.Undefined,  # type: Type[jinja2.Undefined]
+template_undefined=None,  # type: Optional[Type[jinja2.Undefined]]
 user_defined_macros=None,  # type: Optional[Dict]
 user_defined_filters=None,  # type: Optional[Dict]
 default_args=None,  # type: Optional[Dict]
@@ -807,7 +807,7 @@ class DAG(BaseDag, LoggingMixin):
 # Default values (for backward compatibility)
 jinja_env_options = {
 'loader': jinja2.FileSystemLoader(searchpath),
-'undefined': self.template_undefined,
+'undefined': self.template_undefined or jinja2.Undefined,
 'extensions': ["jinja2.ext.do"],
 'cache_size': 0
 }
diff --git a/airflow/upgrade/rules/undefined_jinja_varaibles.py b/airflow/upgrade/rules/undefined_jinja_varaibles.py
new file mode 100644
index 000..b97cfbc
--- /dev/null
+++ b/airflow/upgrade/rules/undefined_jinja_varaibles.py
@@ -0,0 +1,153 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+from __future__ import absolute_import
+
+import re
+
+import jinja2
+import six
+
+from airflow import conf
+from airflow.models import DagBag, TaskInstance
+from airflow.upgrade.rules.base_rule import BaseRule
+from airflow.utils import timezone
+
+
+class UndefinedJinjaVariablesRule(BaseRule):
+
+title = "Jinja Template Variables cannot be undefined"
+
+description = """\
+The default behavior for DAG's Jinja templates has changed. Now, more restrictive validation
+of non-existent variables is applied - `jinja2.StrictUndefined`.
+
+The user should do either of the following to fix this -
+1. Fix the Jinja Templates by defining every variable or providing default values
+2. Explicitly declare `template_undefined=jinja2.Undefined` while defining the DAG
+"""
+
+def _check_rendered_content(self, rendered_content, seen_oids=None):
+"""Replicates the logic in BaseOperator.render_template() to
+cover all the cases needed to be checked.
+"""
+if isinstance(rendered_content, six.string_types):
+return set(re.findall(r"{{(.*?)}}", rendered_content))
+
+elif isinstance(rendered_content, (int, float, bool)):
+return set()
+
+elif isinstance(rendered_content, (tuple, list, set)):
+debug_error_messages = set()
+for element in rendered_content:
+debug_error_messages.update(self._check_rendered_content(element))
+return debug_error_messages
+
+elif isinstance(rendered_content, dict):
+debug_error_messages = set()
+for key, value in rendered_content.items():
+debug_error_messages.update(self._check_rendered_content(value))
+return debug_error_messages
+
+else:
+  
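
For reference, the second remediation the rule's description points to - explicitly
declaring `template_undefined=jinja2.Undefined` on the DAG - would look roughly like
the sketch below. This is an illustrative example, not part of the commit: the DAG id,
schedule and operator are made up, and the import paths shown are the Airflow 1.10 ones.

    import jinja2
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.bash_operator import BashOperator

    # Opting back into the permissive pre-2.0 behaviour: undefined Jinja
    # variables render as empty strings instead of raising under StrictUndefined.
    with DAG(
        dag_id="example_permissive_templates",  # illustrative name
        start_date=datetime(2020, 11, 19),
        schedule_interval=None,
        template_undefined=jinja2.Undefined,
    ) as dag:
        # "{{ not_defined }}" is not supplied anywhere; with jinja2.Undefined it
        # renders as an empty string, with jinja2.StrictUndefined it would fail.
        BashOperator(task_id="echo_value", bash_command="echo {{ not_defined }}")

The preferred fix remains option 1: define every variable used in the templates or give
them defaults, so the stricter Airflow 2.0 default can stay in place.
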

[airflow] branch master updated (efdba2c -> a4aa32b)

2020-11-17 Thread turbaszek
This is an automated email from the ASF dual-hosted git repository.

turbaszek pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/airflow.git.


from efdba2c  Add stack overflow link to Github Issues (#12407)
 add a4aa32b  Simplify using XComArg in jinja template string (#12405)

No new revisions were added by this update.

Summary of changes:
 airflow/models/xcom_arg.py|  3 ++-
 tests/models/test_xcom_arg.py | 15 +--
 2 files changed, 11 insertions(+), 7 deletions(-)



[airflow] branch master updated (7eb23db -> efdba2c)

2020-11-17 Thread turbaszek
This is an automated email from the ASF dual-hosted git repository.

turbaszek pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/airflow.git.


from 7eb23db  Update Kaxil's Github handle (#12409)
 add efdba2c  Add stack overflow link to Github Issues (#12407)

No new revisions were added by this update.

Summary of changes:
 .github/ISSUE_TEMPLATE/config.yml | 3 +++
 1 file changed, 3 insertions(+)



[airflow] branch master updated (917e6c4 -> 1623df8)

2020-11-16 Thread turbaszek
This is an automated email from the ASF dual-hosted git repository.

turbaszek pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/airflow.git.


from 917e6c4  Add provide_file_and_upload to GCSHook (#12310)
 add 1623df8  Use different deserialization method in XCom init_on_load 
(#12327)

No new revisions were added by this update.

Summary of changes:
 airflow/models/xcom.py | 29 -
 docs/concepts.rst  |  6 ++
 docs/spelling_wordlist.txt |  1 +
 tests/models/test_xcom.py  | 16 
 4 files changed, 43 insertions(+), 9 deletions(-)



[airflow] branch master updated (823b3aa -> 39ea872)

2020-11-15 Thread turbaszek
This is an automated email from the ASF dual-hosted git repository.

turbaszek pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/airflow.git.


from 823b3aa  Reject 'connections add' CLI request if URI provided is 
invalid (#12370)
 add 39ea872  Check for TaskGroup in _PythonDecoratedOperator (#12312)

No new revisions were added by this update.

Summary of changes:
 airflow/operators/python.py| 17 ++---
 docs/concepts.rst  | 10 ++
 tests/operators/test_python.py | 18 ++
 3 files changed, 42 insertions(+), 3 deletions(-)



[airflow] branch master updated (1b77ebc -> bcb2437)

2020-11-14 Thread turbaszek
This is an automated email from the ASF dual-hosted git repository.

turbaszek pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/airflow.git.


from 1b77ebc  Visually separate pre-commits which require CI image (#12367)
 add bcb2437  Remove redundant method in KubernetesExecutor (#12317)

No new revisions were added by this update.

Summary of changes:
 airflow/executors/kubernetes_executor.py | 32 +---
 1 file changed, 1 insertion(+), 31 deletions(-)



[airflow] branch master updated (458ad93 -> d54f087)

2020-11-13 Thread turbaszek
This is an automated email from the ASF dual-hosted git repository.

turbaszek pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/airflow.git.


from 458ad93  Update & Fix 'Rotate Fernet Key' Doc (#12347)
 add d54f087  Use the backend-configured model (#12336)

No new revisions were added by this update.

Summary of changes:
 airflow/executors/celery_executor.py | 5 +++--
 1 file changed, 3 insertions(+), 2 deletions(-)



[airflow] branch master updated (7c4fe19 -> 1222ebd)

2020-11-13 Thread turbaszek
This is an automated email from the ASF dual-hosted git repository.

turbaszek pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/airflow.git.


from 7c4fe19  For v1-10-test PRs and pushes, use target branch scripts for 
images (#12339)
 add 1222ebd  Create DAG-level cluster policy (#12184)

No new revisions were added by this update.

Summary of changes:
 UPDATING.md|   8 ++
 airflow/models/dagbag.py   |   5 +-
 airflow/settings.py|  46 -
 docs/concepts.rst  | 108 ++---
 tests/cluster_policies/__init__.py |  34 +++
 ..._double_trigger.py => test_dag_with_no_tags.py} |  12 +--
 tests/models/test_dagbag.py|  22 +++--
 7 files changed, 162 insertions(+), 73 deletions(-)
 copy tests/dags/{test_double_trigger.py => test_dag_with_no_tags.py} (80%)
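
As a rough illustration of what the feature added here enables (this sketch is not taken
from the commit itself): a DAG-level cluster policy is a callable placed in
airflow_local_settings.py that receives each DAG as it is loaded and can mutate or reject
it. The hook name `dag_policy` and the `AirflowClusterPolicyViolation` exception are
assumptions based on Airflow's cluster policy mechanism; the no-tags check mirrors the
tests/dags/test_dag_with_no_tags.py fixture this commit adds.

    # airflow_local_settings.py - minimal sketch of a DAG-level cluster policy.
    from airflow.exceptions import AirflowClusterPolicyViolation
    from airflow.models import DAG


    def dag_policy(dag: DAG) -> None:
        """Reject any DAG that does not declare at least one tag."""
        if not dag.tags:
            raise AirflowClusterPolicyViolation(
                "DAG %s must have at least one tag." % dag.dag_id
            )
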



[airflow] branch master updated (571f831 -> 32b59f8)

2020-11-12 Thread turbaszek
This is an automated email from the ASF dual-hosted git repository.

turbaszek pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/airflow.git.


from 571f831  Update automated PR labels (#12326)
 add 32b59f8  Fixes the sending of an empty list to BigQuery `list_rows` 
(#12307)

No new revisions were added by this update.

Summary of changes:
 airflow/providers/google/cloud/hooks/bigquery.py|  8 ++--
 tests/providers/google/cloud/hooks/test_bigquery.py | 21 +
 2 files changed, 27 insertions(+), 2 deletions(-)


