[GitHub] [airflow] Narendra-Neerukonda commented on issue #18133: bulk_sync_to_db got UnicodeDecodeError when Chinese characters in dag code

2021-09-10 Thread GitBox


Narendra-Neerukonda commented on issue #18133:
URL: https://github.com/apache/airflow/issues/18133#issuecomment-917351733


   In Airflow 2.1.3, there is a core option, store_dag_code, which, if set to 
True, stores the DAG code in the DB so that the UI retrieves it from there 
(below snippet from 2.1.3):

   ```python
   @classmethod
   def code(cls, fileloc) -> str:
       """Returns source code for this DagCode object.

       :return: source code as string
       """
       if STORE_DAG_CODE:
           return cls._get_code_from_db(fileloc)
       else:
           return cls._get_code_from_file(fileloc)
   ```

   However, in the current main branch, the store_dag_code option appears to 
have been removed and the code is always loaded from the DB (below snippet 
from main), so I am not sure whether the issue will still occur in future 
releases.

   ```python
   @classmethod
   def code(cls, fileloc) -> str:
       """Returns source code for this DagCode object.

       :return: source code as string
       """
       return cls._get_code_from_db(fileloc)
   ```

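The UnicodeDecodeError reported in this issue is raised while reading a DAG file that contains Chinese characters, so the file-reading side matters as much as where the code is stored. Below is a minimal sketch of decoding the DAG source explicitly as UTF-8 instead of relying on the platform default encoding; the helper name `read_dag_source` and the example path are hypothetical, not Airflow's actual `_get_code_from_file`:

```python
from pathlib import Path


def read_dag_source(fileloc: str) -> str:
    """Read a DAG file as text, decoding it explicitly as UTF-8.

    Relying on the platform default encoding (for example a non-UTF-8
    locale inside a container) is what can raise UnicodeDecodeError when
    the DAG source contains Chinese characters.
    """
    return Path(fileloc).read_text(encoding="utf-8")


if __name__ == "__main__":
    # Hypothetical path; any .py file containing non-ASCII text will do.
    print(read_dag_source("dags/example_unicode_dag.py")[:200])
```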

-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[airflow] branch main updated: Use parameters instead of params (#18143)

2021-09-10 Thread msumit
This is an automated email from the ASF dual-hosted git repository.

msumit pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/airflow.git


The following commit(s) were added to refs/heads/main by this push:
 new 9140ad8  Use parameters instead of params (#18143)
9140ad8 is described below

commit 9140ad8d8f6dadd56bc592f5cdbf5585c2a8ce89
Author: Sumit Maheshwari 
AuthorDate: Sat Sep 11 09:37:46 2021 +0530

Use parameters instead of params (#18143)
---
 .../example_dags/example_facebook_ads_to_gcs.py|  4 +--
 .../google/cloud/transfers/facebook_ads_to_gcs.py  | 27 +---
 .../example_dags/example_display_video.py  |  4 +--
 .../marketing_platform/operators/display_video.py  | 29 +-
 .../cloud/transfers/test_facebook_ads_to_gcs.py|  8 +++---
 .../operators/test_display_video.py|  6 ++---
 6 files changed, 58 insertions(+), 20 deletions(-)

diff --git 
a/airflow/providers/google/cloud/example_dags/example_facebook_ads_to_gcs.py 
b/airflow/providers/google/cloud/example_dags/example_facebook_ads_to_gcs.py
index 0ffe21c..9b6ac50 100644
--- a/airflow/providers/google/cloud/example_dags/example_facebook_ads_to_gcs.py
+++ b/airflow/providers/google/cloud/example_dags/example_facebook_ads_to_gcs.py
@@ -51,7 +51,7 @@ FIELDS = [
 AdsInsights.Field.clicks,
 AdsInsights.Field.impressions,
 ]
-PARAMS = {'level': 'ad', 'date_preset': 'yesterday'}
+PARAMETERS = {'level': 'ad', 'date_preset': 'yesterday'}
 # [END howto_FB_ADS_variables]
 
 with models.DAG(
@@ -90,7 +90,7 @@ with models.DAG(
 start_date=days_ago(2),
 owner='airflow',
 bucket_name=GCS_BUCKET,
-params=PARAMS,
+parameters=PARAMETERS,
 fields=FIELDS,
 gcp_conn_id=GCS_CONN_ID,
 object_name=GCS_OBJ_PATH,
diff --git a/airflow/providers/google/cloud/transfers/facebook_ads_to_gcs.py 
b/airflow/providers/google/cloud/transfers/facebook_ads_to_gcs.py
index ed13fd6..7abee35 100644
--- a/airflow/providers/google/cloud/transfers/facebook_ads_to_gcs.py
+++ b/airflow/providers/google/cloud/transfers/facebook_ads_to_gcs.py
@@ -18,8 +18,10 @@
 """This module contains Facebook Ad Reporting to GCS operators."""
 import csv
 import tempfile
+import warnings
 from typing import Any, Dict, List, Optional, Sequence, Union
 
+from airflow.exceptions import AirflowException
 from airflow.models import BaseOperator
 from airflow.providers.facebook.ads.hooks.ads import FacebookAdsReportingHook
 from airflow.providers.google.cloud.hooks.gcs import GCSHook
@@ -56,9 +58,13 @@ class FacebookAdsReportToGcsOperator(BaseOperator):
 :param fields: List of fields that is obtained from Facebook. Found in 
AdsInsights.Field class.
 
https://developers.facebook.com/docs/marketing-api/insights/parameters/v6.0
 :type fields: List[str]
-:param params: Parameters that determine the query for Facebook
+:param params: Parameters that determine the query for Facebook. This 
keyword is deprecated,
+please use `parameters` keyword to pass the parameters.
 
https://developers.facebook.com/docs/marketing-api/insights/parameters/v6.0
 :type params: Dict[str, Any]
+:param parameters: Parameters that determine the query for Facebook
+
https://developers.facebook.com/docs/marketing-api/insights/parameters/v6.0
+:type parameters: Dict[str, Any]
 :param gzip: Option to compress local file or file data for upload
 :type gzip: bool
 :param impersonation_chain: Optional service account to impersonate using 
short-term
@@ -77,6 +83,7 @@ class FacebookAdsReportToGcsOperator(BaseOperator):
 "bucket_name",
 "object_name",
 "impersonation_chain",
+"parameters",
 )
 
 def __init__(
@@ -85,7 +92,8 @@ class FacebookAdsReportToGcsOperator(BaseOperator):
 bucket_name: str,
 object_name: str,
 fields: List[str],
-params: Dict[str, Any],
+params: Dict[str, Any] = None,
+parameters: Dict[str, Any] = None,
 gzip: bool = False,
 api_version: str = "v6.0",
 gcp_conn_id: str = "google_cloud_default",
@@ -100,15 +108,26 @@ class FacebookAdsReportToGcsOperator(BaseOperator):
 self.facebook_conn_id = facebook_conn_id
 self.api_version = api_version
 self.fields = fields
-self.params = params
+self.parameters = parameters
 self.gzip = gzip
 self.impersonation_chain = impersonation_chain
 
+if params is None and parameters is None:
+raise AirflowException("Argument ['parameters'] is required")
+if params and parameters is None:
+# TODO: Remove in provider version 6.0
+warnings.warn(
+"Please use 'parameters' instead of 'params'",
+DeprecationWarning,
+stacklevel=2,
+)
+self.parameters = params
+
 def ex
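For context, here is a hedged usage sketch of the renamed keyword on the operator changed above. The task id, bucket, and object name are placeholders, the snippet is a fragment meant for use inside a DAG definition, and it assumes the Google provider and the facebook-business package (already used by the example DAG) are installed.

```python
from airflow.providers.google.cloud.transfers.facebook_ads_to_gcs import (
    FacebookAdsReportToGcsOperator,
)
from facebook_business.adobjects.adsinsights import AdsInsights

# 'parameters' replaces the deprecated 'params' keyword; passing 'params'
# still works for now but emits a DeprecationWarning, per the diff above.
upload_ads_report = FacebookAdsReportToGcsOperator(
    task_id="fetch_fb_ads_to_gcs",
    bucket_name="my-gcs-bucket",         # placeholder bucket
    object_name="reports/fb_ads.csv",    # placeholder object path
    fields=[AdsInsights.Field.clicks, AdsInsights.Field.impressions],
    parameters={"level": "ad", "date_preset": "yesterday"},
)
```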

[GitHub] [airflow] msumit merged pull request #18143: Use parameters instead of params

2021-09-10 Thread GitBox


msumit merged pull request #18143:
URL: https://github.com/apache/airflow/pull/18143


   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] subkanthi opened a new pull request #18161: Duplicate Connection: Added logic to query if a connection id exists before creating one

2021-09-10 Thread GitBox


subkanthi opened a new pull request #18161:
URL: https://github.com/apache/airflow/pull/18161


   Added logic to query if a connection exists before trying to create one in 
the duplicate connection flow.
   closes: #18050 
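A minimal sketch of what such an existence check can look like, assuming direct use of Airflow's `create_session` helper and `Connection` model rather than the exact webserver-view code touched by this PR:

```python
from airflow.models import Connection
from airflow.utils.session import create_session


def connection_exists(conn_id: str) -> bool:
    """Return True if a connection with this conn_id is already stored."""
    with create_session() as session:
        # Look the conn_id up first instead of inserting blindly and relying
        # on a unique-constraint error from the database.
        found = (
            session.query(Connection)
            .filter(Connection.conn_id == conn_id)
            .first()
        )
        return found is not None
```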
   
   ---
   **^ Add meaningful description above**
   
   Read the **[Pull Request 
Guidelines](https://github.com/apache/airflow/blob/main/CONTRIBUTING.rst#pull-request-guidelines)**
 for more information.
   In case of fundamental code change, Airflow Improvement Proposal 
([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals))
 is needed.
   In case of a new dependency, check compliance with the [ASF 3rd Party 
License Policy](https://www.apache.org/legal/resolved.html#category-x).
   In case of backwards incompatible changes please leave a note in 
[UPDATING.md](https://github.com/apache/airflow/blob/main/UPDATING.md).
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] github-actions[bot] commented on issue #10210: SQSSensor Dag is not triggering whenever there is new message.

2021-09-10 Thread GitBox


github-actions[bot] commented on issue #10210:
URL: https://github.com/apache/airflow/issues/10210#issuecomment-917297708


   This issue has been closed because it has not received response from the 
issue author.


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] github-actions[bot] commented on issue #9780: DataProcOperator: TypeError: upload() got an unexpected keyword argument 'bucket_name'

2021-09-10 Thread GitBox


github-actions[bot] commented on issue #9780:
URL: https://github.com/apache/airflow/issues/9780#issuecomment-917297718


   This issue has been closed because it has not received response from the 
issue author.


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] github-actions[bot] closed issue #9780: DataProcOperator: TypeError: upload() got an unexpected keyword argument 'bucket_name'

2021-09-10 Thread GitBox


github-actions[bot] closed issue #9780:
URL: https://github.com/apache/airflow/issues/9780


   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] github-actions[bot] commented on pull request #15743: Redirect forked process output to logger

2021-09-10 Thread GitBox


github-actions[bot] commented on pull request #15743:
URL: https://github.com/apache/airflow/pull/15743#issuecomment-917297638


   This pull request has been automatically marked as stale because it has not 
had recent activity. It will be closed in 5 days if no further activity occurs. 
Thank you for your contributions.


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] github-actions[bot] closed issue #10210: SQSSensor Dag is not triggering whenever there is new message.

2021-09-10 Thread GitBox


github-actions[bot] closed issue #10210:
URL: https://github.com/apache/airflow/issues/10210


   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] github-actions[bot] commented on pull request #17258: Fix serialize_operator task_type

2021-09-10 Thread GitBox


github-actions[bot] commented on pull request #17258:
URL: https://github.com/apache/airflow/pull/17258#issuecomment-917297600


   This pull request has been automatically marked as stale because it has not 
had recent activity. It will be closed in 5 days if no further activity occurs. 
Thank you for your contributions.


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[airflow] branch main updated: Fix Airflow version for `[logging] worker_log_server_port` (#18158)

2021-09-10 Thread kaxilnaik
This is an automated email from the ASF dual-hosted git repository.

kaxilnaik pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/airflow.git


The following commit(s) were added to refs/heads/main by this push:
 new 2776e08  Fix Airflow version for `[logging] worker_log_server_port` 
(#18158)
2776e08 is described below

commit 2776e087df0a28c01cc467e457e2f35263601b8b
Author: Kaxil Naik 
AuthorDate: Sat Sep 11 00:10:26 2021 +0100

Fix Airflow version for `[logging] worker_log_server_port` (#18158)

This will be released in 2.2.0 not 2.3.0
---
 airflow/config_templates/config.yml | 2 +-
 airflow/configuration.py| 2 +-
 2 files changed, 2 insertions(+), 2 deletions(-)

diff --git a/airflow/config_templates/config.yml 
b/airflow/config_templates/config.yml
index be014cf..4eba300 100644
--- a/airflow/config_templates/config.yml
+++ b/airflow/config_templates/config.yml
@@ -617,7 +617,7 @@
 web server, who then builds pages and sends them to users. This defines
 the port on which the logs are served. It needs to be unused, and open
 visible from the main web server to connect into the workers.
-  version_added: 2.3.0
+  version_added: 2.2.0
   type: string
   example: ~
   default: "8793"
diff --git a/airflow/configuration.py b/airflow/configuration.py
index 9244f9f..c75dcb4 100644
--- a/airflow/configuration.py
+++ b/airflow/configuration.py
@@ -171,7 +171,7 @@ class AirflowConfigParser(ConfigParser):
 ('core', 'sensitive_var_conn_names'): ('admin', 
'sensitive_variable_fields', '2.1.0'),
 ('core', 'default_pool_task_slot_count'): ('core', 
'non_pooled_task_slot_count', '1.10.4'),
 ('core', 'max_active_tasks_per_dag'): ('core', 'dag_concurrency', 
'2.2.0'),
-('logging', 'worker_log_server_port'): ('celery', 
'worker_log_server_port', '2.3.0'),
+('logging', 'worker_log_server_port'): ('celery', 
'worker_log_server_port', '2.2.0'),
 ('api', 'access_control_allow_origins'): ('api', 
'access_control_allow_origin', '2.2.0'),
 }
 

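For reference, a hedged sketch of reading this option after its move from `[celery]` to `[logging]`; with the `deprecated_options` mapping shown above, a value still set in the old section is resolved with a deprecation warning:

```python
from airflow.configuration import conf

# Reads [logging] worker_log_server_port; if only the legacy
# [celery] worker_log_server_port is set, the deprecated_options
# mapping above falls back to it and warns.
port = conf.getint("logging", "worker_log_server_port")
print(f"Worker log server port: {port}")
```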

[GitHub] [airflow] kaxil merged pull request #18158: Fix Airflow version for `[logging] worker_log_server_port`

2021-09-10 Thread GitBox


kaxil merged pull request #18158:
URL: https://github.com/apache/airflow/pull/18158


   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] github-actions[bot] commented on pull request #18158: Fix Airflow version for `[logging] worker_log_server_port`

2021-09-10 Thread GitBox


github-actions[bot] commented on pull request #18158:
URL: https://github.com/apache/airflow/pull/18158#issuecomment-917263418


   The PR most likely needs to run full matrix of tests because it modifies 
parts of the core of Airflow. However, committers might decide to merge it 
quickly and take the risk. If they don't merge it quickly - please rebase it to 
the latest main at your convenience, or amend the last commit of the PR, and 
push it with --force-with-lease.


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] jhtimmins opened a new pull request #18160: Apply parent dag permissions to subdags.

2021-09-10 Thread GitBox


jhtimmins opened a new pull request #18160:
URL: https://github.com/apache/airflow/pull/18160


   

[airflow] branch main updated: Make auto refresh interval configurable (#18107)

2021-09-10 Thread kaxilnaik
This is an automated email from the ASF dual-hosted git repository.

kaxilnaik pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/airflow.git


The following commit(s) were added to refs/heads/main by this push:
 new a773794  Make auto refresh interval configurable (#18107)
a773794 is described below

commit a77379454c7841bef619523819edfb92795cb597
Author: Rachel Wigell 
AuthorDate: Fri Sep 10 18:35:59 2021 -0400

Make auto refresh interval configurable (#18107)

closes: #18069

We accidentally DDOS'd our own webserver yesterday 😄 More details in the 
issue on how exactly this happened, but the gist is that auto-refresh can be a 
significant strain if there are many active tasks whose statuses must be 
polled. Auto-refresh is a wonderful feature but we wanted to be able to 
lengthen the interval to protect against this.

On main, the interval is hard-coded to 3 seconds. I'm proposing we add a 
new webserver config variable that will allow this interval to be customized.
---
 airflow/config_templates/config.yml  | 8 
 airflow/config_templates/default_airflow.cfg | 4 
 airflow/www/static/js/graph.js   | 5 +++--
 airflow/www/static/js/tree.js| 4 ++--
 airflow/www/templates/airflow/graph.html | 1 +
 airflow/www/templates/airflow/tree.html  | 3 ++-
 airflow/www/views.py | 2 ++
 7 files changed, 22 insertions(+), 5 deletions(-)

diff --git a/airflow/config_templates/config.yml 
b/airflow/config_templates/config.yml
index 6db9469..be014cf 100644
--- a/airflow/config_templates/config.yml
+++ b/airflow/config_templates/config.yml
@@ -1293,6 +1293,14 @@
   type: string
   example: ~
   default:
+- name: auto_refresh_interval
+  description: |
+How frequently, in seconds, the DAG data will auto-refresh in graph or 
tree view
+when auto-refresh is turned on
+  version_added: 2.2.0
+  type: integer
+  example: ~
+  default: "3"
 
 - name: email
   description: |
diff --git a/airflow/config_templates/default_airflow.cfg 
b/airflow/config_templates/default_airflow.cfg
index 1b11bd7..558e355 100644
--- a/airflow/config_templates/default_airflow.cfg
+++ b/airflow/config_templates/default_airflow.cfg
@@ -649,6 +649,10 @@ session_lifetime_minutes = 43200
 # Sets a custom page title for the DAGs overview page and site title for all 
pages
 # instance_name =
 
+# How frequently, in seconds, the DAG data will auto-refresh in graph or tree 
view
+# when auto-refresh is turned on
+auto_refresh_interval = 3
+
 [email]
 
 # Configuration email backend and whether to
diff --git a/airflow/www/static/js/graph.js b/airflow/www/static/js/graph.js
index 69e6d35..c416c6c 100644
--- a/airflow/www/static/js/graph.js
+++ b/airflow/www/static/js/graph.js
@@ -20,7 +20,8 @@
  */
 
 /*
-  global d3, document, nodes, taskInstances, tasks, edges, dagreD3, 
localStorage, $
+  global d3, document, nodes, taskInstances, tasks, edges, dagreD3, 
localStorage, $,
+  autoRefreshInterval
 */
 
 import getMetaValue from './meta_value';
@@ -408,7 +409,7 @@ function startOrStopRefresh() {
   if ($('#auto_refresh').is(':checked')) {
 refreshInterval = setInterval(() => {
   handleRefresh();
-}, 3000); // run refresh every 3 seconds
+}, autoRefreshInterval * 1000);
   } else {
 clearInterval(refreshInterval);
   }
diff --git a/airflow/www/static/js/tree.js b/airflow/www/static/js/tree.js
index 97c27ac..1346a73 100644
--- a/airflow/www/static/js/tree.js
+++ b/airflow/www/static/js/tree.js
@@ -19,7 +19,7 @@
  * under the License.
  */
 
-/* global treeData, document, window, $, d3, moment, localStorage */
+/* global treeData, document, window, $, d3, moment, localStorage, 
autoRefreshInterval */
 import { escapeHtml } from './main';
 import tiTooltip from './task_instances';
 import { callModal, callModalDag } from './dag';
@@ -460,7 +460,7 @@ document.addEventListener('DOMContentLoaded', () => {
 } else {
   $('#auto_refresh').prop('checked', false);
 }
-  }, 3000); // run refresh every 3 seconds
+  }, autoRefreshInterval * 1000);
 } else {
   clearInterval(refreshInterval);
 }
diff --git a/airflow/www/templates/airflow/graph.html 
b/airflow/www/templates/airflow/graph.html
index 1d3489c..d2ff701 100644
--- a/airflow/www/templates/airflow/graph.html
+++ b/airflow/www/templates/airflow/graph.html
@@ -128,6 +128,7 @@
 const edges = {{ edges|tojson }};
 const tasks = {{ tasks|tojson }};
 let taskInstances = {{ task_instances|tojson }};
+const autoRefreshInterval = {{ auto_refresh_interval }};
   
   
   
diff --git a/airflow/www/templates/airflow/tree.html 
b/airflow/www/templates/airflow/tree.html
index 53254b2..3f27a2d 100644
--- a/airflow/www/templates/airflow/tree.html
+++ b/airflow/www/templates/airflow/tree.html
@@ -107,6 +107,7 @@
   
   
   
-const treeData = {{ data|tojson }}
+c
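A hedged sketch of reading the new option from Python, assuming it lives in the `[webserver]` section as the `default_airflow.cfg` hunk above suggests; the JavaScript side multiplies the value by 1000 to turn seconds into milliseconds:

```python
from airflow.configuration import conf

# Assumption: the option sits in [webserver], next to instance_name above.
auto_refresh_interval = conf.getint("webserver", "auto_refresh_interval", fallback=3)
print(f"DAG view auto-refresh every {auto_refresh_interval} seconds")
```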

[GitHub] [airflow] boring-cyborg[bot] commented on pull request #18107: Make auto refresh interval configurable

2021-09-10 Thread GitBox


boring-cyborg[bot] commented on pull request #18107:
URL: https://github.com/apache/airflow/pull/18107#issuecomment-917252712


   Awesome work, congrats on your first merged pull request!
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] kaxil closed issue #18069: Move auto-refresh interval to config variable

2021-09-10 Thread GitBox


kaxil closed issue #18069:
URL: https://github.com/apache/airflow/issues/18069


   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] kaxil merged pull request #18107: Make auto refresh interval configurable

2021-09-10 Thread GitBox


kaxil merged pull request #18107:
URL: https://github.com/apache/airflow/pull/18107


   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] SamWheating opened a new pull request #18159: Adding Variable.update() method and improving detection of variable key collisions

2021-09-10 Thread GitBox


SamWheating opened a new pull request #18159:
URL: https://github.com/apache/airflow/pull/18159


   **Re:** Discussion in https://github.com/apache/airflow/issues/17889
   
   A few changes in this PR:
   
- Update `Variable.set()` method to take a `description` argument. 
- Update `Variable.setdefault()` method to take a `description` argument. 
- Adding a `Variable.update()` method which throws a `KeyError` if the 
Variable doesn't exist, and an `AttributeError` if it exists only in a 
non-metastore backend rather than the Database (since a non-metastore Variable 
can't be modified)
- Improved logging around key collisions between different variable 
backends.
- Updated documentation to warn users about key collisions between variable 
backends.
   
   If a user has a duplicated key in the metastore and an extra secrets 
backend, then updates to the Variable will update the value in the metastore, 
but reads will read the value in the additional backend. 
   
   This is still the case, but I've improved the logging when this happens and 
updated the documentation to warn users about this behaviour.
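A short sketch of how the proposed API reads in practice; the `description` keyword and the `update()` behaviour below follow the PR description rather than a released Airflow version, so treat them as assumptions:

```python
from airflow.models import Variable

# Proposed in this PR: a 'description' argument on set()/setdefault(), and
# an update() that refuses to create a key implicitly.
Variable.set("feature_flag", "on", description="Toggles the new ingestion path")

try:
    Variable.update("missing_key", "value")
except KeyError:
    # update() fails loudly instead of silently creating the key.
    print("Variable does not exist yet; use Variable.set() to create it.")
```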
   
   ---
   **^ Add meaningful description above**
   
   Read the **[Pull Request 
Guidelines](https://github.com/apache/airflow/blob/main/CONTRIBUTING.rst#pull-request-guidelines)**
 for more information.
   In case of fundamental code change, Airflow Improvement Proposal 
([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals))
 is needed.
   In case of a new dependency, check compliance with the [ASF 3rd Party 
License Policy](https://www.apache.org/legal/resolved.html#category-x).
   In case of backwards incompatible changes please leave a note in 
[UPDATING.md](https://github.com/apache/airflow/blob/main/UPDATING.md).
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] kaxil opened a new pull request #18158: Fix Airflow version for `[logging] worker_log_server_port`

2021-09-10 Thread GitBox


kaxil opened a new pull request #18158:
URL: https://github.com/apache/airflow/pull/18158


   This will be released in 2.2.0 not 2.3.0
   
   
   
   ---
   **^ Add meaningful description above**
   
   Read the **[Pull Request 
Guidelines](https://github.com/apache/airflow/blob/main/CONTRIBUTING.rst#pull-request-guidelines)**
 for more information.
   In case of fundamental code change, Airflow Improvement Proposal 
([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals))
 is needed.
   In case of a new dependency, check compliance with the [ASF 3rd Party 
License Policy](https://www.apache.org/legal/resolved.html#category-x).
   In case of backwards incompatible changes please leave a note in 
[UPDATING.md](https://github.com/apache/airflow/blob/main/UPDATING.md).
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] github-actions[bot] commented on pull request #18095: Kubernetes Executor Bug Fix: Set task state to failed when pod is DELETED while running

2021-09-10 Thread GitBox


github-actions[bot] commented on pull request #18095:
URL: https://github.com/apache/airflow/pull/18095#issuecomment-917242783


   The PR most likely needs to run full matrix of tests because it modifies 
parts of the core of Airflow. However, committers might decide to merge it 
quickly and take the risk. If they don't merge it quickly - please rebase it to 
the latest main at your convenience, or amend the last commit of the PR, and 
push it with --force-with-lease.


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] boring-cyborg[bot] commented on pull request #18157: Add C2FO to ``INTHEWILD.md``

2021-09-10 Thread GitBox


boring-cyborg[bot] commented on pull request #18157:
URL: https://github.com/apache/airflow/pull/18157#issuecomment-917242601


   Awesome work, congrats on your first merged pull request!
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] kaxil merged pull request #18157: Add C2FO to ``INTHEWILD.md``

2021-09-10 Thread GitBox


kaxil merged pull request #18157:
URL: https://github.com/apache/airflow/pull/18157


   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[airflow] branch main updated (db5ac64 -> a3c4784)

2021-09-10 Thread kaxilnaik
This is an automated email from the ASF dual-hosted git repository.

kaxilnaik pushed a change to branch main
in repository https://gitbox.apache.org/repos/asf/airflow.git.


from db5ac64  Reorder migrations to be compatible with 2.1.4 (#18153)
 add a3c4784  Add C2FO to ``INTHEWILD.md`` (#18157)

No new revisions were added by this update.

Summary of changes:
 INTHEWILD.md | 1 +
 1 file changed, 1 insertion(+)


[GitHub] [airflow] github-actions[bot] commented on pull request #18157: Add C2FO to ``INTHEWILD.md``

2021-09-10 Thread GitBox


github-actions[bot] commented on pull request #18157:
URL: https://github.com/apache/airflow/pull/18157#issuecomment-917239280


   The PR is likely ready to be merged. No tests are needed as no important 
environment files, nor python files were modified by it. However, committers 
might decide that full test matrix is needed and add the 'full tests needed' 
label. Then you should rebase it to the latest main or amend the last commit of 
the PR, and push it with --force-with-lease.


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[airflow] branch v2-1-test updated: Fixes warm shutdown for celery worker. (#18068)

2021-09-10 Thread kaxilnaik
This is an automated email from the ASF dual-hosted git repository.

kaxilnaik pushed a commit to branch v2-1-test
in repository https://gitbox.apache.org/repos/asf/airflow.git


The following commit(s) were added to refs/heads/v2-1-test by this push:
 new 45eb384  Fixes warm shutdown for celery worker. (#18068)
45eb384 is described below

commit 45eb38431eeeff70c849dc243ebea0c4045a8541
Author: Jarek Potiuk 
AuthorDate: Fri Sep 10 20:13:31 2021 +0200

Fixes warm shutdown for celery worker. (#18068)

The way dumb-init propagated signals by default
prevented the celery worker from handling termination well.

The default behaviour of dumb-init is to propagate signals to the
process group rather than to the single child it runs. This is
protective behaviour, in case a user runs a 'bash -c' command
without 'exec' - in that case signals should be sent not only
to bash but also to the process(es) it creates, otherwise
bash exits without propagating the signal and a second
signal is needed to kill all processes.

However, some airflow processes (in particular airflow celery worker)
behave in a responsible way and handle signals appropriately:
when the first signal is received, the worker switches to offline
mode and lets all workers terminate (until the grace period
expires), resulting in a Warm Shutdown.

Therefore we can disable the protection of dumb-init and let it
propagate the signal to only the single child it spawns in the
Helm Chart. Documentation of the image was also updated to include
explanation of signal propagation. For explicitness the
DUMB_INIT_SETSID variable has been set to 1 in the image as well.

Fixes #18066

(cherry picked from commit 9e13e450032f4c71c54d091e7f80fe685204b5b4)
---
 Dockerfile |  1 +
 chart/templates/workers/worker-deployment.yaml |  3 ++
 docs/docker-stack/entrypoint.rst   | 41 ++
 3 files changed, 45 insertions(+)

diff --git a/Dockerfile b/Dockerfile
index e08a050..de9248c 100644
--- a/Dockerfile
+++ b/Dockerfile
@@ -479,6 +479,7 @@ LABEL org.apache.airflow.distro="debian" \
   org.opencontainers.image.title="Production Airflow Image" \
   org.opencontainers.image.description="Reference, production-ready Apache 
Airflow image"
 
+ENV DUMB_INIT_SETSID="1"
 
 ENTRYPOINT ["/usr/bin/dumb-init", "--", "/entrypoint"]
 CMD []
diff --git a/chart/templates/workers/worker-deployment.yaml 
b/chart/templates/workers/worker-deployment.yaml
index 38e4e6d..7ae2627 100644
--- a/chart/templates/workers/worker-deployment.yaml
+++ b/chart/templates/workers/worker-deployment.yaml
@@ -169,6 +169,9 @@ spec:
   envFrom:
   {{- include "custom_airflow_environment_from" . | default "\n  []" | 
indent 10 }}
   env:
+# Only signal the main process, not the process group, to make 
Warm Shutdown work properly
+- name: DUMB_INIT_SETSID
+  value: "0"
   {{- include "custom_airflow_environment" . | indent 10 }}
   {{- include "standard_airflow_environment" . | indent 10 }}
   {{- if .Values.workers.kerberosSidecar.enabled }}
diff --git a/docs/docker-stack/entrypoint.rst b/docs/docker-stack/entrypoint.rst
index a999892..4b64904 100644
--- a/docs/docker-stack/entrypoint.rst
+++ b/docs/docker-stack/entrypoint.rst
@@ -161,6 +161,47 @@ If there are any other arguments - they are simply passed 
to the "airflow" comma
   > docker run -it apache/airflow:2.1.0-python3.6 version
   2.1.0
 
+Signal propagation
+--
+
+Airflow uses ``dumb-init`` to run as "init" in the entrypoint. This is in 
order to propagate
+signals and reap child processes properly. This means that the process that 
you run does not have
+to install signal handlers to work properly and be killed when the container 
is gracefully terminated.
+The behaviour of signal propagation is configured by ``DUMB_INIT_SETSID`` 
variable which is set to
+``1`` by default - meaning that the signals will be propagated to the whole 
process group, but you can
+set it to ``0`` to enable ``single-child`` behaviour of ``dumb-init`` which 
only propagates the
+signals to only single child process.
+
+The table below summarizes ``DUMB_INIT_SETSID`` possible values and their use 
cases.
+
+++--+
+| Variable value | Use case
 |
+++--+
+| 1 (default)| Propagates signals to all processes in the process group of 
the main |
+|| process running in the container.   
 |
+|| 
 |
+|| If you run your processes via ``["bash", "-c"]`` comman

[airflow] branch v2-2-test updated (e6cb2f7 -> db5ac64)

2021-09-10 Thread kaxilnaik
This is an automated email from the ASF dual-hosted git repository.

kaxilnaik pushed a change to branch v2-2-test
in repository https://gitbox.apache.org/repos/asf/airflow.git.


from e6cb2f7  ECSOperator returns last logs when ECS task fails (#17209)
 add b8926ee  Add a note about no back-compat guarantees for experimental 
features (#18139)
 add 0e3b06b  Mark passing pre/post execute callbacks to operators as 
experimental. (#18140)
 add 68d99bc  [Airflow 16364] Add conn_timeout and cmd_timeout params to 
SSHOperator; add conn_timeout param to SSHHook (#17236)
 add 0df31cd  Change from dynamic date to fixed date in examples (#18071)
 add 3d4bfdc  Add missing __init__.py files for some test packages (#18142)
 add 975a4e0  Fix quarentine tests affected by AIP-39 (#18141)
 add d491afb  Doc: Minor wording tweaks (#18148)
 add 42c835f  Fix typo in StandardTaskRunning log message (#18149)
 add 491d818  Fix bad repository name in pre-commit config (#18151)
 add 9e13e45  Fixes warm shutdown for celery worker. (#18068)
 add 476ae0e  Fixing Vault AppRole authentication with CONN_URI (#18064)
 add 692d744  Fixed log view for deferred tasks (#18154)
 add db5ac64  Reorder migrations to be compatible with 2.1.4 (#18153)

No new revisions were added by this update.

Summary of changes:
 .pre-commit-config.yaml|  10 +-
 BREEZE.rst |  10 +-
 Dockerfile |   1 +
 STATIC_CODE_CHECKS.rst |   2 +-
 airflow/example_dags/example_bash_operator.py  |   7 +-
 .../example_branch_datetime_operator.py|   7 +-
 .../example_branch_day_of_week_operator.py |   6 +-
 airflow/example_dags/example_branch_labels.py  |   7 +-
 airflow/example_dags/example_branch_operator.py|   6 +-
 .../example_branch_python_dop_operator_3.py|   6 +-
 airflow/example_dags/example_complex.py|   6 +-
 airflow/example_dags/example_dag_decorator.py  |   5 +-
 .../example_dags/example_kubernetes_executor.py|   6 +-
 .../example_kubernetes_executor_config.py  |   5 +-
 airflow/example_dags/example_latest_only.py|   5 +-
 .../example_latest_only_with_trigger.py|   5 +-
 airflow/example_dags/example_nested_branch_dag.py  |   8 +-
 .../example_passing_params_via_test_command.py |   7 +-
 airflow/example_dags/example_python_operator.py|   6 +-
 .../example_dags/example_short_circuit_operator.py |   7 +-
 airflow/example_dags/example_skip_dag.py   |   5 +-
 airflow/example_dags/example_task_group.py |   6 +-
 .../example_dags/example_task_group_decorator.py   |   7 +-
 .../example_dags/example_trigger_controller_dag.py |   7 +-
 airflow/example_dags/example_trigger_target_dag.py |   6 +-
 airflow/example_dags/example_xcom.py   |   7 +-
 airflow/example_dags/example_xcomargs.py   |   9 +-
 airflow/example_dags/subdags/subdag.py |   6 +-
 airflow/example_dags/test_utils.py |   1 -
 airflow/example_dags/tutorial.py   |   6 +-
 airflow/example_dags/tutorial_etl_dag.py   |   5 +-
 airflow/example_dags/tutorial_taskflow_api_etl.py  |   4 +-
 .../tutorial_taskflow_api_etl_virtualenv.py|   5 +-
 airflow/jobs/scheduler_job.py  |   6 +-
 ...5d12_add_max_active_runs_column_to_dagmodel_.py |   4 +-
 ...ta_interval_start_end_to_dagmodel_and_dagrun.py |   2 +-
 .../83f031fd9f1c_improve_mssql_compatibility.py|   4 +-
 airflow/models/baseoperator.py |   4 +
 .../providers/google/config_templates}/__init__.py |   0
 airflow/providers/hashicorp/hooks/vault.py |  26 ++-
 airflow/providers/ssh/hooks/ssh.py |  40 +++-
 airflow/providers/ssh/operators/ssh.py |  37 +++-
 airflow/task/task_runner/standard_task_runner.py   |   2 +-
 airflow/www/views.py   |   2 +-
 breeze-complete|   2 +-
 chart/templates/workers/worker-deployment.yaml |   3 +
 .../connections/ssh.rst|   7 +-
 docs/apache-airflow/migrations-ref.rst |  16 +-
 docs/apache-airflow/release-process.rst|  17 +-
 docs/apache-airflow/start/docker.rst   |   6 +-
 docs/conf.py   |   1 +
 docs/docker-stack/entrypoint.rst   |  41 
 ...t_check_providers_subpackages_all_have_init.py} |  20 +-
 tests/cli/commands/test_task_command.py|  24 +--
 tests/executors/test_celery_executor.py|   2 +-
 tests/jobs/test_scheduler_job.py   | 155 ---
 .../aws/config_templates}/__init__.py  |   0
 .../aws/infrastructure}/__init__.py|   0
 .../example_s3_to_redshift}/__init__.py|   0
 .../hooks => amazon/aws/secrets}/__init__.py

[airflow] branch main updated (692d744 -> db5ac64)

2021-09-10 Thread kaxilnaik
This is an automated email from the ASF dual-hosted git repository.

kaxilnaik pushed a change to branch main
in repository https://gitbox.apache.org/repos/asf/airflow.git.


from 692d744  Fixed log view for deferred tasks (#18154)
 add db5ac64  Reorder migrations to be compatible with 2.1.4 (#18153)

No new revisions were added by this update.

Summary of changes:
 ...2435bf5d12_add_max_active_runs_column_to_dagmodel_.py |  4 ++--
 ...add_data_interval_start_end_to_dagmodel_and_dagrun.py |  2 +-
 .../versions/83f031fd9f1c_improve_mssql_compatibility.py |  4 ++--
 docs/apache-airflow/migrations-ref.rst   | 16 
 4 files changed, 13 insertions(+), 13 deletions(-)


[airflow] 05/06: Bump version to 2.1.4

2021-09-10 Thread kaxilnaik
This is an automated email from the ASF dual-hosted git repository.

kaxilnaik pushed a commit to branch v2-1-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 2ef6ab1dc3e31b605a6f8f4ba0699aa2aa1cfca4
Author: Kaxil Naik 
AuthorDate: Fri Sep 10 15:06:48 2021 +0100

Bump version to 2.1.4
---
 README.md | 16 
 setup.py  |  2 +-
 2 files changed, 9 insertions(+), 9 deletions(-)

diff --git a/README.md b/README.md
index 722fad2..8981e3a 100644
--- a/README.md
+++ b/README.md
@@ -82,7 +82,7 @@ Airflow is not a streaming solution, but it is often used to 
process real-time d
 
 Apache Airflow is tested with:
 
-|  | Main version (dev)| Stable version (2.1.3)   |
+|  | Main version (dev)| Stable version (2.1.4)   |
 |  | - |  |
 | Python   | 3.6, 3.7, 3.8, 3.9| 3.6, 3.7, 3.8, 3.9   |
 | Kubernetes   | 1.20, 1.19, 1.18  | 1.20, 1.19, 1.18 |
@@ -142,15 +142,15 @@ them to appropriate format and workflow that your tool 
requires.
 
 
 ```bash
-pip install apache-airflow==2.1.3 \
- --constraint 
"https://raw.githubusercontent.com/apache/airflow/constraints-2.1.3/constraints-3.7.txt";
+pip install apache-airflow==2.1.4 \
+ --constraint 
"https://raw.githubusercontent.com/apache/airflow/constraints-2.1.4/constraints-3.7.txt";
 ```
 
 2. Installing with extras (for example postgres,google)
 
 ```bash
-pip install apache-airflow[postgres,google]==2.1.3 \
- --constraint 
"https://raw.githubusercontent.com/apache/airflow/constraints-2.1.3/constraints-3.7.txt";
+pip install apache-airflow[postgres,google]==2.1.4 \
+ --constraint 
"https://raw.githubusercontent.com/apache/airflow/constraints-2.1.4/constraints-3.7.txt";
 ```
 
 For information on installing provider packages check
@@ -231,7 +231,7 @@ packages:
 * **Airflow Providers**: SemVer rules apply to changes in the particular 
provider's code only.
   SemVer MAJOR and MINOR versions for the packages are independent from 
Airflow version.
   For example `google 4.1.0` and `amazon 3.0.3` providers can happily be 
installed
-  with `Airflow 2.1.3`. If there are limits of cross-dependencies between 
providers and Airflow packages,
+  with `Airflow 2.1.4`. If there are limits of cross-dependencies between 
providers and Airflow packages,
   they are present in providers as `install_requires` limitations. We aim to 
keep backwards
   compatibility of providers with all previously released Airflow 2 versions 
but
   there will be sometimes breaking changes that might make some, or all
@@ -254,7 +254,7 @@ Apache Airflow version life cycle:
 
 | Version | Current Patch/Minor | State | First Release | Limited Support 
| EOL/Terminated |
 
|-|-|---|---|-||
-| 2   | 2.1.3   | Supported | Dec 17, 2020  | Dec 2021
| TBD|
+| 2   | 2.1.4   | Supported | Dec 17, 2020  | Dec 2021
| TBD|
 | 1.10| 1.10.15 | EOL   | Aug 27, 2018  | Dec 17, 2020
| June 17, 2021  |
 | 1.9 | 1.9.0   | EOL   | Jan 03, 2018  | Aug 27, 2018
| Aug 27, 2018   |
 | 1.8 | 1.8.2   | EOL   | Mar 19, 2017  | Jan 03, 2018
| Jan 03, 2018   |
@@ -280,7 +280,7 @@ They are based on the official release schedule of Python 
and Kubernetes, nicely
 
 2. The "oldest" supported version of Python/Kubernetes is the default one. 
"Default" is only meaningful
in terms of "smoke tests" in CI PRs which are run using this default 
version and default reference
-   image available. Currently ``apache/airflow:latest`` and 
``apache/airflow:2.1.3` images
+   image available. Currently ``apache/airflow:latest`` and 
``apache/airflow:2.1.4` images
are both Python 3.6 images, however the first MINOR/MAJOR release of 
Airflow release after 23.12.2021 will
become Python 3.7 images.
 
diff --git a/setup.py b/setup.py
index 33cb4f9..33e1c8d 100644
--- a/setup.py
+++ b/setup.py
@@ -41,7 +41,7 @@ PY39 = sys.version_info >= (3, 9)
 
 logger = logging.getLogger(__name__)
 
-version = '2.1.3'
+version = '2.1.4'
 
 my_dir = dirname(__file__)
 


[airflow] 03/06: Update version added fields in airflow/config_templates/config.yml (#18128)

2021-09-10 Thread kaxilnaik
This is an automated email from the ASF dual-hosted git repository.

kaxilnaik pushed a commit to branch v2-1-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 0ee20fff9f32cdcfe07fc1ab545d9b491c4374ac
Author: Kamil Breguła 
AuthorDate: Fri Sep 10 01:21:06 2021 +0200

Update version added fields in airflow/config_templates/config.yml (#18128)

(cherry picked from commit 2767781b880b0fb03d46950c06e1e44902c25a7c)
---
 airflow/config_templates/config.yml | 128 ++--
 1 file changed, 64 insertions(+), 64 deletions(-)

diff --git a/airflow/config_templates/config.yml 
b/airflow/config_templates/config.yml
index 7abcb06..38be813 100644
--- a/airflow/config_templates/config.yml
+++ b/airflow/config_templates/config.yml
@@ -231,7 +231,7 @@
 but means plugin changes picked up by tasks straight away)
   default: "False"
   example: ~
-  version_added: "2.0.0"
+  version_added: 2.0.0
   see_also: ":ref:`plugins:loading`"
   type: boolean
 - name: fernet_key
@@ -382,7 +382,7 @@
 All the template_fields for each of Task Instance are stored in the 
Database.
 Keeping this number small may cause an error when you try to view 
``Rendered`` tab in
 TaskInstance view for older tasks.
-  version_added: 2.0.0
+  version_added: 1.10.10
   type: integer
   example: ~
   default: "30"
@@ -422,7 +422,7 @@
 Number of times the code should be retried in case of DB Operational 
Errors.
 Not all transactions will be retried as it can cause undesired state.
 Currently it is only used in ``DagFileProcessor.process_file`` to 
retry ``dagbag.sync_to_db``.
-  version_added: ~
+  version_added: 2.0.0
   type: integer
   example: ~
   default: "3"
@@ -431,7 +431,7 @@
 Hide sensitive Variables or Connection extra json keys from UI and 
task logs when set to True
 
 (Connection passwords are always hidden in logs)
-  version_added: ~
+  version_added: 2.1.0
   type: boolean
   example: ~
   default: "True"
@@ -439,7 +439,7 @@
   description: |
 A comma-separated list of extra sensitive keywords to look for in 
variables names or connection's
 extra JSON.
-  version_added: ~
+  version_added: 2.1.0
   type: string
   example: ~
   default: ""
@@ -451,7 +451,7 @@
   description: |
 The folder where airflow should store its log files
 This path must be absolute
-  version_added: ~
+  version_added: 2.0.0
   type: string
   example: ~
   default: "{AIRFLOW_HOME}/logs"
@@ -459,7 +459,7 @@
   description: |
 Airflow can store logs remotely in AWS S3, Google Cloud Storage or 
Elastic Search.
 Set this to True if you want to enable remote logging.
-  version_added: ~
+  version_added: 2.0.0
   type: string
   example: ~
   default: "False"
@@ -467,7 +467,7 @@
   description: |
 Users must supply an Airflow connection id that provides access to the 
storage
 location.
-  version_added: ~
+  version_added: 2.0.0
   type: string
   example: ~
   default: ""
@@ -477,7 +477,7 @@
 Credentials
 
`__
 will
 be used.
-  version_added: ~
+  version_added: 2.0.0
   type: string
   example: ~
   default: ""
@@ -489,14 +489,14 @@
 GCS buckets should start with "gs://"
 WASB buckets should start with "wasb" just to help Airflow select 
correct handler
 Stackdriver logs should start with "stackdriver://"
-  version_added: ~
+  version_added: 2.0.0
   type: string
   example: ~
   default: ""
 - name: encrypt_s3_logs
   description: |
 Use server-side encryption for logs stored in S3
-  version_added: ~
+  version_added: 2.0.0
   type: string
   example: ~
   default: "False"
@@ -505,7 +505,7 @@
 Logging level.
 
 Supported values: ``CRITICAL``, ``ERROR``, ``WARNING``, ``INFO``, 
``DEBUG``.
-  version_added: ~
+  version_added: 2.0.0
   type: string
   example: ~
   default: "INFO"
@@ -514,7 +514,7 @@
 Logging level for Flask-appbuilder UI.
 
 Supported values: ``CRITICAL``, ``ERROR``, ``WARNING``, ``INFO``, 
``DEBUG``.
-  version_added: ~
+  version_added: 2.0.0
   type: string
   example: ~
   default: "WARN"
@@ -523,7 +523,7 @@
 Logging class
 Specify the class that will specify the logging configuration
 This class has to be on the python classpath
-  version_added: ~
+  version_added: 2.0.0
   type: string
   example: "my.path.default_local_settings.LOGGING_CONFIG"
   default: ""
@@ -531,14 +531,14 @@
   description: |
 Flag to enable/disable Co

[airflow] branch v2-1-test updated (e7fc43f -> c81aa2b)

2021-09-10 Thread kaxilnaik
This is an automated email from the ASF dual-hosted git repository.

kaxilnaik pushed a change to branch v2-1-test
in repository https://gitbox.apache.org/repos/asf/airflow.git.


 discard e7fc43f  Add Changelog for 2.1.4
 discard eaca043  Bump version to 2.1.4
 discard a253b10  Update version added fields in 
airflow/config_templates/config.yml (#18128)
 discard 8b09602  Fix deprecation error message rather than silencing it 
(#18126)
 discard bf276ca  Limit the number of queued dagruns created by the Scheduler 
(#18065)
 new 247382f  Limit the number of queued dagruns created by the Scheduler 
(#18065)
 new f5f70e0  Fix deprecation error message rather than silencing it 
(#18126)
 new 0ee20ff  Update version added fields in 
airflow/config_templates/config.yml (#18128)
 new c1e9f80  Do not let create_dagrun overwrite explicit run_id (#17728)
 new 2ef6ab1  Bump version to 2.1.4
 new c81aa2b  Add Changelog for 2.1.4

This update added new revisions after undoing existing revisions.
That is to say, some revisions that were in the old version of the
branch are not in the new version.  This situation occurs
when a user --force pushes a change and generates a repository
containing something like this:

 * -- * -- B -- O -- O -- O   (e7fc43f)
\
 N -- N -- N   refs/heads/v2-1-test (c81aa2b)

You should already have received notification emails for all of the O
revisions, and so the following emails describe only the N revisions
from the common base, B.

Any revisions marked "omit" are not gone; other references still
refer to them.  Any revisions marked "discard" are gone forever.

The 6 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.


Summary of changes:
 CHANGELOG.txt|   1 +
 airflow/models/dag.py|   9 +-
 tests/conftest.py|  18 +-
 tests/jobs/test_scheduler_job.py | 392 ++-
 4 files changed, 112 insertions(+), 308 deletions(-)


[airflow] 06/06: Add Changelog for 2.1.4

2021-09-10 Thread kaxilnaik
This is an automated email from the ASF dual-hosted git repository.

kaxilnaik pushed a commit to branch v2-1-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit c81aa2b72686b235d6ba636f8c892a064b9bf621
Author: Kaxil Naik 
AuthorDate: Fri Sep 10 15:11:27 2021 +0100

Add Changelog for 2.1.4
---
 CHANGELOG.txt | 39 +++
 1 file changed, 39 insertions(+)

diff --git a/CHANGELOG.txt b/CHANGELOG.txt
index ecbb6b6..560abc5 100644
--- a/CHANGELOG.txt
+++ b/CHANGELOG.txt
@@ -1,3 +1,42 @@
+Airflow 2.1.4, 2021-09-15
+-------------------------
+
+Bug Fixes
+"
+
+- Fix deprecation error message rather than silencing it (#18126)
+- Limit the number of queued dagruns created by the Scheduler (#18065)
+- Fix ``DagRun`` execution order from queued to running not being properly 
followed (#18061)
+- Fix ``max_active_runs`` not allowing moving of queued dagruns to running 
(#17945)
+- Avoid redirect loop for users with no permissions (#17838)
+- Avoid endless redirect loop when user has no roles (#17613)
+- Fix log links on graph TI modal (#17862)
+- Hide variable import form if user lacks permission (#18000)
+- Improve dag/task concurrency check (#17786)
+- Fix Clear task instances endpoint resets all DAG runs bug (#17961)
+- Fixes incorrect parameter passed to views (#18083) (#18085)
+- Fix Sentry handler from ``LocalTaskJob`` causing error (#18119)
+- Limit ``colorlog`` version (6.x is incompatible) (#18099)
+- Only show Pause/Unpause tooltip on hover (#17957)
+- Improve graph view load time for dags with open groups (#17821)
+- Increase width for Run column (#17817)
+- Fix wrong query on running tis (#17631)
+- Add root to tree refresh url (#17633)
+- Do not delete running DAG from the UI (#17630)
+- Improve discoverability of Provider packages' functionality
+- Do not let ``create_dagrun`` overwrite explicit ``run_id`` (#17728)
+
+Doc only changes
+""""""""""""""""
+
+- Update version added fields in airflow/config_templates/config.yml (#18128)
+- Improve the description of how to handle dynamic task generation (#17963)
+- Improve cross-links to operators and hooks references (#17622)
+- Doc: Fix replacing Airflow version for Docker stack (#17711)
+- Make the providers operators/hooks reference much more usable (#17768)
+- Update description about the new ``connection-types`` provider meta-data
+- Suggest to use secrets backend for variable when it contains sensitive data 
(#17319)
+
 Airflow 2.1.3, 2021-08-21
 -------------------------
 


[airflow] 02/06: Fix deprecation error message rather than silencing it (#18126)

2021-09-10 Thread kaxilnaik
This is an automated email from the ASF dual-hosted git repository.

kaxilnaik pushed a commit to branch v2-1-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit f5f70e0ae79036e5a322fa1fc96430518c15e052
Author: Ash Berlin-Taylor 
AuthorDate: Fri Sep 10 00:07:19 2021 +0100

Fix deprecation error message rather than silencing it (#18126)

(cherry picked from commit c9d29467f71060f14863ca3508cb1055572479b5)
---
 .../versions/092435bf5d12_add_max_active_runs_column_to_dagmodel_.py  | 2 +-
 .../versions/ccde3e26fe78_add_index_on_state_dag_id_for_queued_.py| 2 +-
 airflow/models/dagrun.py  | 4 ++--
 3 files changed, 4 insertions(+), 4 deletions(-)

diff --git 
a/airflow/migrations/versions/092435bf5d12_add_max_active_runs_column_to_dagmodel_.py
 
b/airflow/migrations/versions/092435bf5d12_add_max_active_runs_column_to_dagmodel_.py
index 82bb4c2..a9f612d 100644
--- 
a/airflow/migrations/versions/092435bf5d12_add_max_active_runs_column_to_dagmodel_.py
+++ 
b/airflow/migrations/versions/092435bf5d12_add_max_active_runs_column_to_dagmodel_.py
@@ -44,7 +44,7 @@ def upgrade():
 batch_op.create_index(
 'idx_dag_run_running_dags',
 ["state", "dag_id"],
-postgres_where=text("state='running'"),
+postgresql_where=text("state='running'"),
 mssql_where=text("state='running'"),
 sqlite_where=text("state='running'"),
 )
diff --git 
a/airflow/migrations/versions/ccde3e26fe78_add_index_on_state_dag_id_for_queued_.py
 
b/airflow/migrations/versions/ccde3e26fe78_add_index_on_state_dag_id_for_queued_.py
index 7326d73..6a1cbe6 100644
--- 
a/airflow/migrations/versions/ccde3e26fe78_add_index_on_state_dag_id_for_queued_.py
+++ 
b/airflow/migrations/versions/ccde3e26fe78_add_index_on_state_dag_id_for_queued_.py
@@ -40,7 +40,7 @@ def upgrade():
 batch_op.create_index(
 'idx_dag_run_queued_dags',
 ["state", "dag_id"],
-postgres_where=text("state='queued'"),
+postgresql_where=text("state='queued'"),
 mssql_where=text("state='queued'"),
 sqlite_where=text("state='queued'"),
 )
diff --git a/airflow/models/dagrun.py b/airflow/models/dagrun.py
index 1e5c2c1..ec4bdfb 100644
--- a/airflow/models/dagrun.py
+++ b/airflow/models/dagrun.py
@@ -103,7 +103,7 @@ class DagRun(Base, LoggingMixin):
 'idx_dag_run_running_dags',
 'state',
 'dag_id',
-postgres_where=text("state='running'"),
+postgresql_where=text("state='running'"),
 mssql_where=text("state='running'"),
 sqlite_where=text("state='running'"),
 ),
@@ -113,7 +113,7 @@ class DagRun(Base, LoggingMixin):
 'idx_dag_run_queued_dags',
 'state',
 'dag_id',
-postgres_where=text("state='queued'"),
+postgresql_where=text("state='queued'"),
 mssql_where=text("state='queued'"),
 sqlite_where=text("state='queued'"),
 ),
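
For reference, a minimal standalone SQLAlchemy sketch (not Airflow's actual model) of why the rename matters: dialect-specific ``Index`` options are keyed by dialect name, so the PostgreSQL partial-index predicate must be spelled ``postgresql_where``.

from sqlalchemy import Column, Index, Integer, MetaData, String, Table, text

metadata = MetaData()

# Standalone table, for illustration only (not Airflow's real DagRun model).
dag_run_demo = Table(
    "dag_run_demo",
    metadata,
    Column("id", Integer, primary_key=True),
    Column("dag_id", String(250)),
    Column("state", String(50)),
)

# Dialect-specific Index options are namespaced by dialect name, which is why
# the partial-index predicate has to be ``postgresql_where`` (and
# ``mssql_where`` / ``sqlite_where`` for the other backends).
Index(
    "idx_dag_run_running_demo",
    dag_run_demo.c.state,
    dag_run_demo.c.dag_id,
    postgresql_where=text("state='running'"),
    mssql_where=text("state='running'"),
    sqlite_where=text("state='running'"),
)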


[airflow] 04/06: Do not let create_dagrun overwrite explicit run_id (#17728)

2021-09-10 Thread kaxilnaik
This is an automated email from the ASF dual-hosted git repository.

kaxilnaik pushed a commit to branch v2-1-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit c1e9f8073193425194fe5d76c46c3c0f0f9af7ff
Author: Tzu-ping Chung 
AuthorDate: Thu Aug 19 22:33:09 2021 +0800

Do not let create_dagrun overwrite explicit run_id (#17728)

Previously DAG.create_dagrun() had a weird behavior that when *all* of
run_id, execution_date, and run_type are provided, the function would
ignore the run_id argument and overwrite it by auto-generating a run_id
with DagRun.generate_run_id(). This fixes the logic to respect the
explicit run_id value.

I don't think any of the "Airflow proper" code would be affected by
this, but the dag_maker fixture used in the test suite needs to be
tweaked a bit to continue working.

(cherry picked from commit 50771e0f66803d0a0a0b552ab77f4e6be7d1088b)
---
 airflow/models/dag.py |  9 +
 tests/conftest.py | 18 +++---
 2 files changed, 16 insertions(+), 11 deletions(-)

diff --git a/airflow/models/dag.py b/airflow/models/dag.py
index 4ac2ace..a1419fe 100644
--- a/airflow/models/dag.py
+++ b/airflow/models/dag.py
@@ -1767,15 +1767,16 @@ class DAG(LoggingMixin):
 :param dag_hash: Hash of Serialized DAG
 :type dag_hash: str
 """
-if run_id and not run_type:
+if run_id:  # Infer run_type from run_id if needed.
 if not isinstance(run_id, str):
 raise ValueError(f"`run_id` expected to be a str is 
{type(run_id)}")
-run_type: DagRunType = DagRunType.from_run_id(run_id)
-elif run_type and execution_date:
+if not run_type:
+run_type = DagRunType.from_run_id(run_id)
+elif run_type and execution_date is not None:  # Generate run_id from 
run_type and execution_date.
 if not isinstance(run_type, DagRunType):
 raise ValueError(f"`run_type` expected to be a DagRunType is 
{type(run_type)}")
 run_id = DagRun.generate_run_id(run_type, execution_date)
-elif not run_id:
+else:
 raise AirflowException(
 "Creating DagRun needs either `run_id` or both `run_type` and 
`execution_date`"
 )
diff --git a/tests/conftest.py b/tests/conftest.py
index 0873ac4..6bee400 100644
--- a/tests/conftest.py
+++ b/tests/conftest.py
@@ -459,13 +459,17 @@ def dag_maker(request):
 
 def create_dagrun(self, **kwargs):
 dag = self.dag
-defaults = dict(
-run_id='test',
-state=State.RUNNING,
-execution_date=self.start_date,
-start_date=self.start_date,
-)
-kwargs = {**defaults, **kwargs}
+kwargs = {
+"state": State.RUNNING,
+"execution_date": self.start_date,
+"start_date": self.start_date,
+"session": self.session,
+**kwargs,
+}
+# Need to provide run_id if the user does not either provide one
+# explicitly, or pass run_type for inference in 
dag.create_dagrun().
+if "run_id" not in kwargs and "run_type" not in kwargs:
+kwargs["run_id"] = "test"
 self.dag_run = dag.create_dagrun(**kwargs)
 return self.dag_run
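
A quick sketch of the fixed behaviour from the caller's side (assumes an initialised Airflow metadata database; names below are illustrative): an explicit ``run_id`` supplied together with ``run_type`` and ``execution_date`` is now kept rather than regenerated.

import pendulum
from airflow.models.dag import DAG
from airflow.utils.state import State
from airflow.utils.types import DagRunType

dag = DAG(dag_id="demo_dag", start_date=pendulum.datetime(2021, 9, 1, tz="UTC"))

# Before this fix, passing all three of run_id, run_type and execution_date
# caused the explicit run_id to be replaced by DagRun.generate_run_id().
run = dag.create_dagrun(
    run_id="manual__my-explicit-id",
    run_type=DagRunType.MANUAL,
    execution_date=pendulum.datetime(2021, 9, 10, tz="UTC"),
    state=State.RUNNING,
)
assert run.run_id == "manual__my-explicit-id"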
 


[airflow] 01/06: Limit the number of queued dagruns created by the Scheduler (#18065)

2021-09-10 Thread kaxilnaik
This is an automated email from the ASF dual-hosted git repository.

kaxilnaik pushed a commit to branch v2-1-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 247382fd0240c371a62748c67bc7a93700af98f0
Author: Ephraim Anierobi 
AuthorDate: Thu Sep 9 14:24:24 2021 +0100

Limit the number of queued dagruns created by the Scheduler (#18065)

There's currently no limit on the number of queued dagruns the scheduler creates,
and it has become a concern with issues raised against it. See #18023 and 
#17979

Co-authored-by: Sam Wheating 
(cherry picked from commit 0eb41b5952c2ce1884594c82bbf05835912b9812)
---
 airflow/config_templates/config.yml|   8 +
 airflow/config_templates/default_airflow.cfg   |   4 +
 airflow/jobs/scheduler_job.py  |  21 +-
 ...26fe78_add_index_on_state_dag_id_for_queued_.py |  52 +++
 airflow/models/dagrun.py   |  10 +
 docs/apache-airflow/migrations-ref.rst |   4 +-
 tests/jobs/test_scheduler_job.py   | 411 ++---
 7 files changed, 214 insertions(+), 296 deletions(-)

diff --git a/airflow/config_templates/config.yml 
b/airflow/config_templates/config.yml
index 9945213..7abcb06 100644
--- a/airflow/config_templates/config.yml
+++ b/airflow/config_templates/config.yml
@@ -191,6 +191,14 @@
   type: string
   example: ~
   default: "16"
+- name: max_queued_runs_per_dag
+  description: |
+The maximum number of queued dagruns for a single DAG. The scheduler 
will not create more DAG runs
+if it reaches the limit. This is not configurable at the DAG level.
+  version_added: 2.1.4
+  type: string
+  example: ~
+  default: "16"
 - name: load_examples
   description: |
 Whether to load the DAG examples that ship with Airflow. It's good to
diff --git a/airflow/config_templates/default_airflow.cfg 
b/airflow/config_templates/default_airflow.cfg
index 03d5e1f..56a1d90 100644
--- a/airflow/config_templates/default_airflow.cfg
+++ b/airflow/config_templates/default_airflow.cfg
@@ -127,6 +127,10 @@ dags_are_paused_at_creation = True
 # which is defaulted as ``max_active_runs_per_dag``.
 max_active_runs_per_dag = 16
 
+# The maximum number of queued dagruns for a single DAG. The scheduler will 
not create more DAG runs
+# if it reaches the limit. This is not configurable at the DAG level.
+max_queued_runs_per_dag = 16
+
 # Whether to load the DAG examples that ship with Airflow. It's good to
 # get started, but you probably want to set this to ``False`` in a production
 # environment
diff --git a/airflow/jobs/scheduler_job.py b/airflow/jobs/scheduler_job.py
index 8d5f888..45083a4 100644
--- a/airflow/jobs/scheduler_job.py
+++ b/airflow/jobs/scheduler_job.py
@@ -985,14 +985,31 @@ class SchedulerJob(BaseJob):
 existing_dagruns = (
 session.query(DagRun.dag_id, 
DagRun.execution_date).filter(existing_dagruns_filter).all()
 )
+max_queued_dagruns = conf.getint('core', 'max_queued_runs_per_dag')
+
+queued_runs_of_dags = defaultdict(
+int,
+session.query(DagRun.dag_id, func.count('*'))
+.filter(  # We use `list` here because SQLA doesn't accept a set
+# We use set to avoid duplicate dag_ids
+DagRun.dag_id.in_(list({dm.dag_id for dm in dag_models})),
+DagRun.state == State.QUEUED,
+)
+.group_by(DagRun.dag_id)
+.all(),
+)
 
 for dag_model in dag_models:
+# Lets quickly check if we have exceeded the number of queued 
dagruns per dags
+total_queued = queued_runs_of_dags[dag_model.dag_id]
+if total_queued >= max_queued_dagruns:
+continue
+
 try:
 dag = self.dagbag.get_dag(dag_model.dag_id, session=session)
 except SerializedDagNotFound:
 self.log.exception("DAG '%s' not found in serialized_dag 
table", dag_model.dag_id)
 continue
-
 dag_hash = self.dagbag.dags_hash.get(dag.dag_id)
 # Explicitly check if the DagRun already exists. This is an edge 
case
 # where a Dag Run is created but `DagModel.next_dagrun` and 
`DagModel.next_dagrun_create_after`
@@ -1003,6 +1020,7 @@ class SchedulerJob(BaseJob):
 # create a new one. This is so that in the next Scheduling loop we 
try to create new runs
 # instead of falling in a loop of Integrity Error.
 if (dag.dag_id, dag_model.next_dagrun) not in existing_dagruns:
+
 dag.create_dagrun(
 run_type=DagRunType.SCHEDULED,
 execution_date=dag_model.next_dagrun,
@@ -1012,6 +1030,7 @@ class SchedulerJob(BaseJob):
 dag_hash=dag_hash,
 creating_job_id=self.id,
 )
+queued_runs_of_dags[dag_model.dag_id] += 1
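
To make the new capping logic concrete, here is a tiny standalone sketch (illustrative names, not the scheduler's actual code) of counting queued runs per DAG once and skipping any DAG already at the limit:

from collections import defaultdict

MAX_QUEUED_RUNS_PER_DAG = 16  # mirrors the new [core] max_queued_runs_per_dag default

def dags_to_create_runs_for(dag_ids, queued_counts):
    """queued_counts maps dag_id -> number of DagRuns currently in the QUEUED state."""
    counters = defaultdict(int, queued_counts)
    selected = []
    for dag_id in dag_ids:
        if counters[dag_id] >= MAX_QUEUED_RUNS_PER_DAG:
            continue  # this DAG already has the maximum number of queued runs
        selected.append(dag_id)
        counters[dag_id] += 1  # count the run we are about to create
    return selected

print(dags_to_create_runs_for(["dag_a", "dag_b"], {"dag_a": 16, "dag_b": 3}))
# -> ['dag_b']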

[GitHub] [airflow] kaxil merged pull request #18153: Reorder migrations to be compatible with 2.1.4

2021-09-10 Thread GitBox


kaxil merged pull request #18153:
URL: https://github.com/apache/airflow/pull/18153


   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] boring-cyborg[bot] commented on pull request #18157: Update INTHEWILD.md

2021-09-10 Thread GitBox


boring-cyborg[bot] commented on pull request #18157:
URL: https://github.com/apache/airflow/pull/18157#issuecomment-917221448


   Congratulations on your first Pull Request and welcome to the Apache Airflow 
community! If you have any issues or are unsure about anything, please check 
our Contribution Guide 
(https://github.com/apache/airflow/blob/main/CONTRIBUTING.rst)
   Here are some useful points:
   - Pay attention to the quality of your code (flake8, mypy and type 
annotations). Our [pre-commits]( 
https://github.com/apache/airflow/blob/main/STATIC_CODE_CHECKS.rst#prerequisites-for-pre-commit-hooks)
 will help you with that.
   - In case of a new feature add useful documentation (in docstrings or in 
`docs/` directory). Adding a new operator? Check this short 
[guide](https://github.com/apache/airflow/blob/main/docs/apache-airflow/howto/custom-operator.rst)
 Consider adding an example DAG that shows how users should use it.
   - Consider using [Breeze 
environment](https://github.com/apache/airflow/blob/main/BREEZE.rst) for 
testing locally, it’s a heavy docker but it ships with a working Airflow and a 
lot of integrations.
   - Be patient and persistent. It might take some time to get a review or get 
the final approval from Committers.
   - Please follow [ASF Code of 
Conduct](https://www.apache.org/foundation/policies/conduct) for all 
communication including (but not limited to) comments on Pull Requests, Mailing 
list and Slack.
   - Be sure to read the [Airflow Coding style]( 
https://github.com/apache/airflow/blob/main/CONTRIBUTING.rst#coding-style-and-best-practices).
   Apache Airflow is a community-driven project and together we are making it 
better 🚀.
   In case of doubts contact the developers at:
   Mailing List: d...@airflow.apache.org
   Slack: https://s.apache.org/airflow-slack
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] AnthonyShipman opened a new pull request #18157: Update INTHEWILD.md

2021-09-10 Thread GitBox


AnthonyShipman opened a new pull request #18157:
URL: https://github.com/apache/airflow/pull/18157


   
   
   ---
   **^ Add meaningful description above**
   
   Read the **[Pull Request 
Guidelines](https://github.com/apache/airflow/blob/main/CONTRIBUTING.rst#pull-request-guidelines)**
 for more information.
   In case of fundamental code change, Airflow Improvement Proposal 
([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals))
 is needed.
   In case of a new dependency, check compliance with the [ASF 3rd Party 
License Policy](https://www.apache.org/legal/resolved.html#category-x).
   In case of backwards incompatible changes please leave a note in 
[UPDATING.md](https://github.com/apache/airflow/blob/main/UPDATING.md).
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] nwalens commented on pull request #18147: Allow airflow standard images to run in openshift utilising the official helm chart #18136

2021-09-10 Thread GitBox


nwalens commented on pull request #18147:
URL: https://github.com/apache/airflow/pull/18147#issuecomment-917202675


   @jedcunningham  I hope I managed to fix all issues you pointed out.
   
   Sorry for the trouble and thanks for the help!


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] john-jac commented on issue #15538: S3KeySensor wildcard fails to match valid unix wildcards

2021-09-10 Thread GitBox


john-jac commented on issue #15538:
URL: https://github.com/apache/airflow/issues/15538#issuecomment-917191275


   I'm happy to take this one if it's still free


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] kaxil commented on a change in pull request #18120: Handle DAGs with Parsing Errors

2021-09-10 Thread GitBox


kaxil commented on a change in pull request #18120:
URL: https://github.com/apache/airflow/pull/18120#discussion_r706449385



##
File path: airflow/dag_processing/processor.py
##
@@ -647,3 +648,17 @@ def process_file(
 self.log.exception("Error logging import errors!")
 
 return len(dagbag.dags), len(dagbag.import_errors)
+
+def delete_serialized_dags_with_parsing_errors(self, session, file_path):
+from airflow.models.serialized_dag import SerializedDagModel
+
+self.log.error("Deleting Serialized DAG(s) in %s file, please fix the 
parsing errors.", file_path)

Review comment:
   Good point, yes




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[airflow] branch main updated: Fixed log view for deferred tasks (#18154)

2021-09-10 Thread kaxilnaik
This is an automated email from the ASF dual-hosted git repository.

kaxilnaik pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/airflow.git


The following commit(s) were added to refs/heads/main by this push:
 new 692d744  Fixed log view for deferred tasks (#18154)
692d744 is described below

commit 692d744e3b8acaf398c5b400eb2264a4e1c915de
Author: Andrew Godwin 
AuthorDate: Fri Sep 10 14:18:14 2021 -0600

Fixed log view for deferred tasks (#18154)

Deferred tasks were not showing up in the log view while they were
deferred, as they decrement the try_number on the way into deferral
status much like rescheduling sensors. This applies the same fix that
rescheduling sensors have to make their log appear.
---
 airflow/www/views.py | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/airflow/www/views.py b/airflow/www/views.py
index 14f7da0..54551f8 100644
--- a/airflow/www/views.py
+++ b/airflow/www/views.py
@@ -1280,7 +1280,7 @@ class Airflow(AirflowBaseView):
 num_logs = 0
 if ti is not None:
 num_logs = ti.next_try_number - 1
-if ti.state == State.UP_FOR_RESCHEDULE:
+if ti.state in (State.UP_FOR_RESCHEDULE, State.DEFERRED):
 # Tasks in reschedule state decremented the try number
 num_logs += 1
 logs = [''] * num_logs
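
Restating the one-line change as a standalone sketch (state names are plain strings here, purely for illustration): states that decrement ``try_number`` on their way out of execution get one extra log slot so the current attempt stays visible.

UP_FOR_RESCHEDULE = "up_for_reschedule"
DEFERRED = "deferred"

def visible_log_count(next_try_number, state):
    num_logs = next_try_number - 1
    if state in (UP_FOR_RESCHEDULE, DEFERRED):
        # These states decremented try_number, so add one slot back.
        num_logs += 1
    return num_logs

assert visible_log_count(2, DEFERRED) == 2
assert visible_log_count(2, "running") == 1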


[GitHub] [airflow] kaxil merged pull request #18154: Fixed log view for deferred tasks

2021-09-10 Thread GitBox


kaxil merged pull request #18154:
URL: https://github.com/apache/airflow/pull/18154


   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] potiuk edited a comment on issue #17310: Got `subprocess.TimeoutExpired` when run `./breeze --python 3.8 buildl-docs` on Apple M1 chip

2021-09-10 Thread GitBox


potiuk edited a comment on issue #17310:
URL: https://github.com/apache/airflow/issues/17310#issuecomment-917184972


   Ah yeah. I believe the Docker Desktop for Mac has fixed the emulation but 
Ubuntu on Parallels on ARM still does not have it :(
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] potiuk commented on issue #17310: Got `subprocess.TimeoutExpired` when run `./breeze --python 3.8 buildl-docs` on Apple M1 chip

2021-09-10 Thread GitBox


potiuk commented on issue #17310:
URL: https://github.com/apache/airflow/issues/17310#issuecomment-917184972


   Ah yeah. I believe the Docker Desktop for Mac has fixed the emulation but 
Parallels on ARM still does not have it :(
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] yavuzmester edited a comment on issue #17507: Task processes killed with WARNING - Recorded pid does not match the current pid

2021-09-10 Thread GitBox


yavuzmester edited a comment on issue #17507:
URL: https://github.com/apache/airflow/issues/17507#issuecomment-917007634


   I am using BashOperator with run_as_user. For me, I think it is related to 
the `eval` line I have in my bash script, because I got the error just after 
the `eval` line executes.
   Previously I tried a workaround and thought that it worked, but it seems the 
issue is still there, so I deleted that comment to avoid confusion.
   
   UPDATE: I replaced my eval usage with ${…} but still the error continues. I 
think I will downgrade.


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] subkanthi commented on issue #17310: Got `subprocess.TimeoutExpired` when run `./breeze --python 3.8 buildl-docs` on Apple M1 chip

2021-09-10 Thread GitBox


subkanthi commented on issue #17310:
URL: https://github.com/apache/airflow/issues/17310#issuecomment-917179575


   `Step 64/90 : RUN ln -sf /usr/bin/dumb-init /usr/local/bin/dumb-init
---> Using cache
---> 8bc2b7e35a64
   Step 65/90 : COPY airflow/www/yarn.lock airflow/www/package.json 
${AIRFLOW_SOURCES}/airflow/www/
---> Using cache
---> 2101286441e1
   Step 66/90 : RUN yarn --cwd airflow/www install --frozen-lockfile --no-cache 
&& yarn cache clean
---> Using cache
---> 9c2480204136
   Step 67/90 : COPY setup.py ${AIRFLOW_SOURCES}/setup.py
---> 806de9b1671c
   Step 68/90 : COPY setup.cfg ${AIRFLOW_SOURCES}/setup.cfg
---> 3b83e41ca3be
   Step 69/90 : COPY airflow/__init__.py ${AIRFLOW_SOURCES}/airflow/__init__.py
---> c69bf4520461
   Step 70/90 : RUN if [[ ${INSTALL_FROM_PYPI} == "true" ]]; then bash 
/scripts/docker/install_airflow.sh; fi
---> [Warning] The requested image's platform (linux/amd64) does not match 
the detected host platform (linux/arm64/v8) and no specific platform was 
requested
---> Running in a890b3ba0c58
   standard_init_linux.go:228: exec user process caused: exec format error
   The command '/bin/bash -o pipefail -e -u -x -c if [[ ${INSTALL_FROM_PYPI} == 
"true" ]]; then bash /scripts/docker/install_airflow.sh; fi' 
returned a non-zero code: 1
   
   ERROR: The previous step completed with error. Please take a look at output 
above 
   
   `


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] potiuk commented on issue #17310: Got `subprocess.TimeoutExpired` when run `./breeze --python 3.8 buildl-docs` on Apple M1 chip

2021-09-10 Thread GitBox


potiuk commented on issue #17310:
URL: https://github.com/apache/airflow/issues/17310#issuecomment-917178360


   BTW. Try --python 3.6 :)


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] potiuk edited a comment on issue #17310: Got `subprocess.TimeoutExpired` when run `./breeze --python 3.8 buildl-docs` on Apple M1 chip

2021-09-10 Thread GitBox


potiuk edited a comment on issue #17310:
URL: https://github.com/apache/airflow/issues/17310#issuecomment-917177863


   Never saw this issue on my linux/chromebook/MacOs :(


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] potiuk commented on issue #17310: Got `subprocess.TimeoutExpired` when run `./breeze --python 3.8 buildl-docs` on Apple M1 chip

2021-09-10 Thread GitBox


potiuk commented on issue #17310:
URL: https://github.com/apache/airflow/issues/17310#issuecomment-917177863


   Never saw this issue on my linux/chromebook/os :(


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] yavuzmester edited a comment on issue #17507: Task processes killed with WARNING - Recorded pid does not match the current pid

2021-09-10 Thread GitBox


yavuzmester edited a comment on issue #17507:
URL: https://github.com/apache/airflow/issues/17507#issuecomment-917007634


   I am using BashOperator with run_as_user. For me, I think it is related to 
the `eval` line I have in my bash script, because I got the error just after 
the `eval` line executes.
   Previously I tried a workaround and thought that it worked, but it seems the 
issue is still there, so I deleted that comment to avoid confusion.
   
   UPDATE: I replaced my eval usage with ${…} but still the error continues. I 
think I will downgrade to 2.1.2.


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] subkanthi commented on issue #17310: Got `subprocess.TimeoutExpired` when run `./breeze --python 3.8 buildl-docs` on Apple M1 chip

2021-09-10 Thread GitBox


subkanthi commented on issue #17310:
URL: https://github.com/apache/airflow/issues/17310#issuecomment-917175263


   Interestingly, there are no issues running Breeze on macOS (M1), but I am getting 
this error in an Ubuntu Linux VM under Parallels.
   Need to check if Docker on Mac is emulating x86.
   
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] kaxil closed pull request #18153: Reorder migrations to be compatible with 2.1.4

2021-09-10 Thread GitBox


kaxil closed pull request #18153:
URL: https://github.com/apache/airflow/pull/18153


   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] bmckallagat-os commented on issue #18155: Upgrade `importlib-resources` version

2021-09-10 Thread GitBox


bmckallagat-os commented on issue #18155:
URL: https://github.com/apache/airflow/issues/18155#issuecomment-917172545


   > Feel free. It is configured in setup.cfg - you need to change it and see 
what happens when all tests run. If all tests pass, we will merge it. Remember 
that importlib_resources is a backport of the 3.9 native library and we need to 
make sure we are compatible with 3.6 - 3.9.
   
   Got it! Will do. 


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] mik-laj commented on a change in pull request #18130: Add script that validate version fields in config.yaml

2021-09-10 Thread GitBox


mik-laj commented on a change in pull request #18130:
URL: https://github.com/apache/airflow/pull/18130#discussion_r706433340



##
File path: scripts/tools/validate_version_added_fields_in_config.py
##
@@ -0,0 +1,112 @@
+#!/usr/bin/env python

Review comment:
   Done. I forgot to push the commit. 




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] potiuk commented on issue #18155: Upgrade `importlib-resources` version

2021-09-10 Thread GitBox


potiuk commented on issue #18155:
URL: https://github.com/apache/airflow/issues/18155#issuecomment-917166743


   Feel free. It is configured in setup.cfg - you need to change it and see 
what happens when all tests run. If all tests pass, we will merge it. Remember 
that importlib_resources is a backport of the 3.9 native library and we need to 
make sure we are compatible with 3.6 - 3.9.
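
   For reference, a common compatibility pattern for such a backport across 3.6 - 3.9 looks roughly like the sketch below (illustrative only, not necessarily how Airflow wires it up):

   import sys

   if sys.version_info >= (3, 9):
       import importlib.resources as importlib_resources  # stdlib from 3.9 onwards
   else:
       import importlib_resources  # the PyPI backport on 3.6 - 3.8

   # Read a packaged data file the same way on every supported Python version.
   text = importlib_resources.files("airflow").joinpath("version.py").read_text()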


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] john-jac opened a new pull request #18156: Add IAM Role Credentials to S3ToRedshiftTransfer and RedshiftToS3Transfer

2021-09-10 Thread GitBox


john-jac opened a new pull request #18156:
URL: https://github.com/apache/airflow/pull/18156


   Add role_arn to S3ToRedshiftTransfer and RedshiftToS3Transfer 
   
   These changes check the S3Hook connection for a role_arn and, if found, use 
that as the credentials for the S3 LOAD and UNLOAD operations.  If role_arn is 
not present, the previous method using boto3.Session is maintained.
   
   Tests have been added for both operators, based upon the existing tests.
   
   Note that when using IAM roles, S3Hook is not instantiated as there is no 
need for the boto3 session object. This means that the IAM principal executing 
the operator does not need to be able to assume the IAM role in the connection, 
as that is not required by Redshift.
   
   See 
https://docs.aws.amazon.com/redshift/latest/dg/copy-parameters-authorization.html#copy-credentials
 for more details.
   
   closes: #9971
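
   For illustration, the credential selection described above might look roughly like this hypothetical helper (not the provider's actual code): prefer a role_arn from the connection extras and only fall back to session credentials when it is absent.

   def build_redshift_credentials(connection_extra, boto3_session=None):
       """Hypothetical sketch: prefer an IAM role ARN from the connection extras."""
       role_arn = connection_extra.get("role_arn")
       if role_arn:
           # Redshift assumes the role itself, so no boto3 session is needed.
           return f"aws_iam_role={role_arn}"
       # Fall back to the previous behaviour: temporary credentials from boto3.
       creds = boto3_session.get_credentials().get_frozen_credentials()
       return (
           f"aws_access_key_id={creds.access_key};"
           f"aws_secret_access_key={creds.secret_key};"
           f"token={creds.token}"
       )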
   
   
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] bmckallagat-os opened a new issue #18155: Upgrade `importlib-resources` version

2021-09-10 Thread GitBox


bmckallagat-os opened a new issue #18155:
URL: https://github.com/apache/airflow/issues/18155


   ### Description
   
   The constraint for `importlib-resources` pins it to 
[v1.5.0](https://github.com/python/importlib_resources/tree/v1.5.0), which is 
over a year old. For compatibility's sake (for instance with something like 
Datapane), I would suggest upgrading it. 
   
   ### Use case/motivation
   
   Upgrade an old dependency to keep the code up to date.
   
   ### Related issues
   
   Not that I am aware of, maybe somewhat #12120, or #15991.
   
   ### Are you willing to submit a PR?
   
   - [X] Yes I am willing to submit a PR!
   
   ### Code of Conduct
   
   - [X] I agree to follow this project's [Code of 
Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] boring-cyborg[bot] commented on issue #18155: Upgrade `importlib-resources` version

2021-09-10 Thread GitBox


boring-cyborg[bot] commented on issue #18155:
URL: https://github.com/apache/airflow/issues/18155#issuecomment-917163462


   Thanks for opening your first issue here! Be sure to follow the issue 
template!
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] potiuk commented on pull request #18104: Adding corresponding labels to pods created by jobs

2021-09-10 Thread GitBox


potiuk commented on pull request #18104:
URL: https://github.com/apache/airflow/pull/18104#issuecomment-917157783


   > Looks like the error is "Cannot kill container: 
8be9d2c03192d171a4292fef3bc098bee2b69a6fa5abd73893e128c045cba9f9: Container 
8be9d2c03192d171a4292fef3bc098bee2b69a6fa5abd73893e128c045cba9f9 is not running"
   > 
   > Before I do this deep dive I just wanted to check if this has ever 
happened before because I don't think the labels should affect this.
   
   Not really. It's an intermittent error.
   
   But I think it would be great if you update some tests in chart/tests 
to validate that the labels are there.


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] potiuk commented on pull request #17068: Influxdb Hook

2021-09-10 Thread GitBox


potiuk commented on pull request #17068:
URL: https://github.com/apache/airflow/pull/17068#issuecomment-917153258


   The only 'real' failure is build docs now :)


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] jlouk commented on pull request #18104: Adding corresponding labels to pods created by jobs

2021-09-10 Thread GitBox


jlouk commented on pull request #18104:
URL: https://github.com/apache/airflow/pull/18104#issuecomment-917150795


   Looks like the error is "Cannot kill container: 
8be9d2c03192d171a4292fef3bc098bee2b69a6fa5abd73893e128c045cba9f9: Container 
8be9d2c03192d171a4292fef3bc098bee2b69a6fa5abd73893e128c045cba9f9 is not running"
   
   Before I do this deep dive I just wanted to check if this has ever happened 
before because I don't think the labels should affect this.


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] potiuk merged pull request #18064: Fixing Vault AppRole authentication with CONN_URI

2021-09-10 Thread GitBox


potiuk merged pull request #18064:
URL: https://github.com/apache/airflow/pull/18064


   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[airflow] branch main updated (9e13e45 -> 476ae0e)

2021-09-10 Thread potiuk
This is an automated email from the ASF dual-hosted git repository.

potiuk pushed a change to branch main
in repository https://gitbox.apache.org/repos/asf/airflow.git.


from 9e13e45  Fixes warm shutdown for celery worker. (#18068)
 add 476ae0e  Fixing Vault AppRole authentication with CONN_URI (#18064)

No new revisions were added by this update.

Summary of changes:
 airflow/providers/hashicorp/hooks/vault.py| 26 ++--
 tests/providers/hashicorp/hooks/test_vault.py | 43 +--
 2 files changed, 57 insertions(+), 12 deletions(-)


[GitHub] [airflow] potiuk closed issue #18053: `VaultHook` AppRole authentication fails when using a conn_uri

2021-09-10 Thread GitBox


potiuk closed issue #18053:
URL: https://github.com/apache/airflow/issues/18053


   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] github-actions[bot] commented on pull request #18154: Fixed log view for deferred tasks

2021-09-10 Thread GitBox


github-actions[bot] commented on pull request #18154:
URL: https://github.com/apache/airflow/pull/18154#issuecomment-917130594


   The PR is likely OK to be merged with just subset of tests for default 
Python and Database versions without running the full matrix of tests, because 
it does not modify the core of Airflow. If the committers decide that the full 
tests matrix is needed, they will add the label 'full tests needed'. Then you 
should rebase to the latest main or amend the last commit of the PR, and push 
it with --force-with-lease.


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] andrewgodwin opened a new pull request #18154: Fixed log view for deferred tasks

2021-09-10 Thread GitBox


andrewgodwin opened a new pull request #18154:
URL: https://github.com/apache/airflow/pull/18154


   Deferred tasks were not showing up in the log view while they were
   deferred, as they decrement the try_number on the way into deferral
   status much like rescheduling sensors. This applies the same fix that
   rescheduling sensors have to make their log appear.
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[airflow] branch main updated: Fixes warm shutdown for celery worker. (#18068)

2021-09-10 Thread potiuk
This is an automated email from the ASF dual-hosted git repository.

potiuk pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/airflow.git


The following commit(s) were added to refs/heads/main by this push:
 new 9e13e45  Fixes warm shutdown for celery worker. (#18068)
9e13e45 is described below

commit 9e13e450032f4c71c54d091e7f80fe685204b5b4
Author: Jarek Potiuk 
AuthorDate: Fri Sep 10 20:13:31 2021 +0200

Fixes warm shutdown for celery worker. (#18068)

The way how dumb-init propagated the signal by default
made celery worker not to handle termination well.

Default behaviour of dumb-init is to propagate signals to the
process group rather than to the single child it uses. This is
protective behaviour, in case a user runs 'bash -c' command
without 'exec' - in this case signals should be sent not only
to the bash but also to the process(es) it creates, otherwise
bash exits without propagating the signal and you need second
signal to kill all processes.

However some airflow processes (in particular airflow celery worker)
behave in a responsible way and handle the signals appropriately
- when the first signal is received, it will switch to offline
mode and let all workers terminate (until the grace period expires),
resulting in Warm Shutdown.

Therefore we can disable the protection of dumb-init and let it
propagate the signal to only the single child it spawns in the
Helm Chart. Documentation of the image was also updated to include
explanation of signal propagation. For explicitness the
DUMB_INIT_SETSID variable has been set to 1 in the image as well.

Fixes #18066
---
 Dockerfile |  1 +
 chart/templates/workers/worker-deployment.yaml |  3 ++
 docs/docker-stack/entrypoint.rst   | 41 ++
 3 files changed, 45 insertions(+)

diff --git a/Dockerfile b/Dockerfile
index 405470d..1890a87 100644
--- a/Dockerfile
+++ b/Dockerfile
@@ -479,6 +479,7 @@ LABEL org.apache.airflow.distro="debian" \
   org.opencontainers.image.title="Production Airflow Image" \
   org.opencontainers.image.description="Reference, production-ready Apache 
Airflow image"
 
+ENV DUMB_INIT_SETSID="1"
 
 ENTRYPOINT ["/usr/bin/dumb-init", "--", "/entrypoint"]
 CMD []
diff --git a/chart/templates/workers/worker-deployment.yaml 
b/chart/templates/workers/worker-deployment.yaml
index 68a0e18..023ffa4 100644
--- a/chart/templates/workers/worker-deployment.yaml
+++ b/chart/templates/workers/worker-deployment.yaml
@@ -180,6 +180,9 @@ spec:
   envFrom:
   {{- include "custom_airflow_environment_from" . | default "\n  []" | 
indent 10 }}
   env:
+# Only signal the main process, not the process group, to make 
Warm Shutdown work properly
+- name: DUMB_INIT_SETSID
+  value: "0"
   {{- include "custom_airflow_environment" . | indent 10 }}
   {{- include "standard_airflow_environment" . | indent 10 }}
   {{- if .Values.workers.kerberosSidecar.enabled }}
diff --git a/docs/docker-stack/entrypoint.rst b/docs/docker-stack/entrypoint.rst
index eb880d9..7db5b5d 100644
--- a/docs/docker-stack/entrypoint.rst
+++ b/docs/docker-stack/entrypoint.rst
@@ -161,6 +161,47 @@ If there are any other arguments - they are simply passed 
to the "airflow" comma
   > docker run -it apache/airflow:2.1.2-python3.6 version
   2.1.2
 
+Signal propagation
+--
+
+Airflow uses ``dumb-init`` to run as "init" in the entrypoint. This is in 
order to propagate
+signals and reap child processes properly. This means that the process that 
you run does not have
+to install signal handlers to work properly and be killed when the container 
is gracefully terminated.
+The behaviour of signal propagation is configured by ``DUMB_INIT_SETSID`` 
variable which is set to
+``1`` by default - meaning that the signals will be propagated to the whole 
process group, but you can
+set it to ``0`` to enable ``single-child`` behaviour of ``dumb-init`` which 
only propagates the
+signals to only single child process.
+
+The table below summarizes ``DUMB_INIT_SETSID`` possible values and their use 
cases.
+
+++--+
+| Variable value | Use case
 |
+++--+
+| 1 (default)| Propagates signals to all processes in the process group of 
the main |
+|| process running in the container.   
 |
+|| 
 |
+|| If you run your processes via ``["bash", "-c"]`` command 
and bash|
+|| spawn  new processes without ``exec``, this will help 

[GitHub] [airflow] potiuk closed issue #18066: Chart version 1.1.0 does not gracefully shutdown workers

2021-09-10 Thread GitBox


potiuk closed issue #18066:
URL: https://github.com/apache/airflow/issues/18066


   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] potiuk merged pull request #18068: Fixes warm shutdown for celery worker.

2021-09-10 Thread GitBox


potiuk merged pull request #18068:
URL: https://github.com/apache/airflow/pull/18068


   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[airflow] branch main updated: Fix bad repository name in pre-commit config (#18151)

2021-09-10 Thread kaxilnaik
This is an automated email from the ASF dual-hosted git repository.

kaxilnaik pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/airflow.git


The following commit(s) were added to refs/heads/main by this push:
 new 491d818  Fix bad repository name in pre-commit config (#18151)
491d818 is described below

commit 491d81893b0afb80b5f9df191369875bce6e2aa0
Author: Andrew Godwin 
AuthorDate: Fri Sep 10 11:54:15 2021 -0600

Fix bad repository name in pre-commit config (#18151)

Having a trailing slash on the `flynt` entry makes some versions of Git (in 
my case, the one under Ubuntu 20.04) unwilling to clone it. Removing the 
trailing slash to match the other entries fixes the problem.
---
 .pre-commit-config.yaml | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/.pre-commit-config.yaml b/.pre-commit-config.yaml
index 212be6f..55989a3 100644
--- a/.pre-commit-config.yaml
+++ b/.pre-commit-config.yaml
@@ -250,7 +250,7 @@ repos:
 exclude: |
   (?x)
   ^airflow/_vendor/
-  - repo: https://github.com/ikamensh/flynt/
+  - repo: https://github.com/ikamensh/flynt
 rev: '0.66'
 hooks:
   - id: flynt


[GitHub] [airflow] kaxil merged pull request #18151: Fix bad repository name in pre-commit config

2021-09-10 Thread GitBox


kaxil merged pull request #18151:
URL: https://github.com/apache/airflow/pull/18151


   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] yavuzmester edited a comment on issue #17507: Task processes killed with WARNING - Recorded pid does not match the current pid

2021-09-10 Thread GitBox


yavuzmester edited a comment on issue #17507:
URL: https://github.com/apache/airflow/issues/17507#issuecomment-917007634


   I am using BashOperator with run_as_user. For me, I think it is related to 
the `eval` line I have in my bash script, because I got the error just after 
the `eval` line executes.
   Previously I tried a workaround and thought that it worked, but it seems the 
issue is still there, so I deleted that comment to avoid confusion.
   
   UPDATE: I was getting the error on the first eval usage. I replaced my eval 
usage with ${…} but this time I got the error after second such usage. I think 
I will downgrade to 2.1.2.


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] github-actions[bot] commented on pull request #18153: Reorder migrations to be compatible with 2.1.4

2021-09-10 Thread GitBox


github-actions[bot] commented on pull request #18153:
URL: https://github.com/apache/airflow/pull/18153#issuecomment-917093278


   The PR most likely needs to run full matrix of tests because it modifies 
parts of the core of Airflow. However, committers might decide to merge it 
quickly and take the risk. If they don't merge it quickly - please rebase it to 
the latest main at your convenience, or amend the last commit of the PR, and 
push it with --force-with-lease.


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] kaxil opened a new pull request #18153: Reorder migrations to be compatible with 2.1.4

2021-09-10 Thread GitBox


kaxil opened a new pull request #18153:
URL: https://github.com/apache/airflow/pull/18153


   This commit makes Airflow 2.2 migrations compatible with 2.1.4 so users can
   easily upgrade.
   
   
   
   ---
   **^ Add meaningful description above**
   
   Read the **[Pull Request 
Guidelines](https://github.com/apache/airflow/blob/main/CONTRIBUTING.rst#pull-request-guidelines)**
 for more information.
   In case of fundamental code change, Airflow Improvement Proposal 
([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals))
 is needed.
   In case of a new dependency, check compliance with the [ASF 3rd Party 
License Policy](https://www.apache.org/legal/resolved.html#category-x).
   In case of backwards incompatible changes please leave a note in 
[UPDATING.md](https://github.com/apache/airflow/blob/main/UPDATING.md).
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] andrewgodwin commented on a change in pull request #18152: Fix stuck "queued" tasks in KubernetesExecutor

2021-09-10 Thread GitBox


andrewgodwin commented on a change in pull request #18152:
URL: https://github.com/apache/airflow/pull/18152#discussion_r706362385



##
File path: .pre-commit-config.yaml
##
@@ -250,7 +250,7 @@ repos:
 exclude: |
   (?x)
   ^airflow/_vendor/
-  - repo: https://github.com/ikamensh/flynt/
+  - repo: https://github.com/ikamensh/flynt

Review comment:
   This is a local copy of https://github.com/apache/airflow/pull/18151 to 
let me get the commit in; it will vanish upon rebase when that lands.




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] andrewgodwin opened a new pull request #18152: Fix stuck "queued" tasks in KubernetesExecutor

2021-09-10 Thread GitBox


andrewgodwin opened a new pull request #18152:
URL: https://github.com/apache/airflow/pull/18152


   There are a set of circumstances where TaskInstances can get "stuck" in the 
QUEUED state when they are running under KubernetesExecutor, where they claim 
to have a pod scheduled (and so are queued) but do not actually have one, and 
so sit there forever.
   
   It appears this happens occasionally with reschedule sensors and now more 
often with deferrable tasks, when the task instance defers/reschedules and then 
resumes before the old pod has vanished. It would also, I believe, happen when 
the Executor hard-exits with items still in its internal queues.
   
   There was a pre-existing method in there to clean up stuck queued tasks, but 
it only ran once, on executor start. I have modified it to be safe to run 
periodically (by teaching it not to touch things that the executor looked at 
recently), and then made it run every so often (30 seconds by default).
   
   This is not a perfect fix - the only real fix would be to have far more 
detailed state tracking as part of TaskInstance or another table, and 
re-architect the KubernetesExecutor. However, this should reduce the number of 
times this happens very significantly, so it should do for now.
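
   Purely as an illustration of the "sweep periodically, but skip anything the executor touched recently" idea described above (hypothetical code, not the executor's implementation):

   import time

   CLEANUP_INTERVAL = 30.0  # seconds; the default sweep interval mentioned above

   class StuckQueuedSweeper:
       """Flags queued task keys the executor has not touched recently."""

       def __init__(self):
           self._last_sweep = 0.0
           self._last_seen = {}  # task key -> monotonic time of last executor activity

       def note_activity(self, key):
           self._last_seen[key] = time.monotonic()

       def sweep(self, queued_keys):
           now = time.monotonic()
           if now - self._last_sweep < CLEANUP_INTERVAL:
               return []  # not time for another sweep yet
           self._last_sweep = now
           return [
               key
               for key in queued_keys
               if now - self._last_seen.get(key, 0.0) >= CLEANUP_INTERVAL
           ]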


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] github-actions[bot] commented on pull request #18151: Fix bad repository name in pre-commit config

2021-09-10 Thread GitBox


github-actions[bot] commented on pull request #18151:
URL: https://github.com/apache/airflow/pull/18151#issuecomment-917088755


   The PR most likely needs to run the full matrix of tests because it modifies 
parts of the core of Airflow. However, committers might decide to merge it 
quickly and take the risk. If they don't merge it quickly, please rebase it to 
the latest main at your convenience, or amend the last commit of the PR, and 
push it with --force-with-lease.


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] andrewgodwin opened a new pull request #18151: Fix bad repository name in pre-commit config

2021-09-10 Thread GitBox


andrewgodwin opened a new pull request #18151:
URL: https://github.com/apache/airflow/pull/18151


   Having a trailing slash on the `flynt` repo makes some versions of Git (in 
my case, the one under Ubuntu 20.04) unwilling to clone it. Removing the 
trailing slash to match the other entries fixes the problem.


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[airflow] 04/05: Bump version to 2.1.4

2021-09-10 Thread kaxilnaik
This is an automated email from the ASF dual-hosted git repository.

kaxilnaik pushed a commit to branch v2-1-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit eaca04301422ea27f0c1ea2550796c5ee2ce6cd9
Author: Kaxil Naik 
AuthorDate: Fri Sep 10 15:06:48 2021 +0100

Bump version to 2.1.4
---
 README.md | 16 
 setup.py  |  2 +-
 2 files changed, 9 insertions(+), 9 deletions(-)

diff --git a/README.md b/README.md
index 722fad2..8981e3a 100644
--- a/README.md
+++ b/README.md
@@ -82,7 +82,7 @@ Airflow is not a streaming solution, but it is often used to 
process real-time d
 
 Apache Airflow is tested with:
 
-|  | Main version (dev)| Stable version (2.1.3)   |
+|  | Main version (dev)| Stable version (2.1.4)   |
 |  | - |  |
 | Python   | 3.6, 3.7, 3.8, 3.9| 3.6, 3.7, 3.8, 3.9   |
 | Kubernetes   | 1.20, 1.19, 1.18  | 1.20, 1.19, 1.18 |
@@ -142,15 +142,15 @@ them to appropriate format and workflow that your tool 
requires.
 
 
 ```bash
-pip install apache-airflow==2.1.3 \
- --constraint 
"https://raw.githubusercontent.com/apache/airflow/constraints-2.1.3/constraints-3.7.txt";
+pip install apache-airflow==2.1.4 \
+ --constraint 
"https://raw.githubusercontent.com/apache/airflow/constraints-2.1.4/constraints-3.7.txt";
 ```
 
 2. Installing with extras (for example postgres,google)
 
 ```bash
-pip install apache-airflow[postgres,google]==2.1.3 \
- --constraint 
"https://raw.githubusercontent.com/apache/airflow/constraints-2.1.3/constraints-3.7.txt";
+pip install apache-airflow[postgres,google]==2.1.4 \
+ --constraint 
"https://raw.githubusercontent.com/apache/airflow/constraints-2.1.4/constraints-3.7.txt";
 ```
 
 For information on installing provider packages check
@@ -231,7 +231,7 @@ packages:
 * **Airflow Providers**: SemVer rules apply to changes in the particular 
provider's code only.
   SemVer MAJOR and MINOR versions for the packages are independent from 
Airflow version.
   For example `google 4.1.0` and `amazon 3.0.3` providers can happily be 
installed
-  with `Airflow 2.1.3`. If there are limits of cross-dependencies between 
providers and Airflow packages,
+  with `Airflow 2.1.4`. If there are limits of cross-dependencies between 
providers and Airflow packages,
   they are present in providers as `install_requires` limitations. We aim to 
keep backwards
   compatibility of providers with all previously released Airflow 2 versions 
but
   there will be sometimes breaking changes that might make some, or all
@@ -254,7 +254,7 @@ Apache Airflow version life cycle:
 
 | Version | Current Patch/Minor | State | First Release | Limited Support 
| EOL/Terminated |
 
|-|-|---|---|-||
-| 2   | 2.1.3   | Supported | Dec 17, 2020  | Dec 2021
| TBD|
+| 2   | 2.1.4   | Supported | Dec 17, 2020  | Dec 2021
| TBD|
 | 1.10| 1.10.15 | EOL   | Aug 27, 2018  | Dec 17, 2020
| June 17, 2021  |
 | 1.9 | 1.9.0   | EOL   | Jan 03, 2018  | Aug 27, 2018
| Aug 27, 2018   |
 | 1.8 | 1.8.2   | EOL   | Mar 19, 2017  | Jan 03, 2018
| Jan 03, 2018   |
@@ -280,7 +280,7 @@ They are based on the official release schedule of Python 
and Kubernetes, nicely
 
 2. The "oldest" supported version of Python/Kubernetes is the default one. 
"Default" is only meaningful
in terms of "smoke tests" in CI PRs which are run using this default 
version and default reference
-   image available. Currently ``apache/airflow:latest`` and 
``apache/airflow:2.1.3` images
+   image available. Currently ``apache/airflow:latest`` and 
``apache/airflow:2.1.4` images
are both Python 3.6 images, however the first MINOR/MAJOR release of 
Airflow release after 23.12.2021 will
become Python 3.7 images.
 
diff --git a/setup.py b/setup.py
index 33cb4f9..33e1c8d 100644
--- a/setup.py
+++ b/setup.py
@@ -41,7 +41,7 @@ PY39 = sys.version_info >= (3, 9)
 
 logger = logging.getLogger(__name__)
 
-version = '2.1.3'
+version = '2.1.4'
 
 my_dir = dirname(__file__)
 


[airflow] 05/05: Add Changelog for 2.1.4

2021-09-10 Thread kaxilnaik
This is an automated email from the ASF dual-hosted git repository.

kaxilnaik pushed a commit to branch v2-1-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit e7fc43f873d3f42b231408b7df742f45cf17fd96
Author: Kaxil Naik 
AuthorDate: Fri Sep 10 15:11:27 2021 +0100

Add Changelog for 2.1.4
---
 CHANGELOG.txt | 38 ++
 1 file changed, 38 insertions(+)

diff --git a/CHANGELOG.txt b/CHANGELOG.txt
index ecbb6b6..5a092cd 100644
--- a/CHANGELOG.txt
+++ b/CHANGELOG.txt
@@ -1,3 +1,41 @@
+Airflow 2.1.4, 2021-09-15
+-
+
+Bug Fixes
+"
+
+- Fix deprecation error message rather than silencing it (#18126)
+- Limit the number of queued dagruns created by the Scheduler (#18065)
+- Fix ``DagRun`` execution order from queued to running not being properly 
followed (#18061)
+- Fix ``max_active_runs`` not allowing moving of queued dagruns to running 
(#17945)
+- Avoid redirect loop for users with no permissions (#17838)
+- Avoid endless redirect loop when user has no roles (#17613)
+- Fix log links on graph TI modal (#17862)
+- Hide variable import form if user lacks permission (#18000)
+- Improve dag/task concurrency check (#17786)
+- Fix Clear task instances endpoint resets all DAG runs bug (#17961)
+- Fixes incorrect parameter passed to views (#18083) (#18085)
+- Fix Sentry handler from ``LocalTaskJob`` causing error (#18119)
+- Limit ``colorlog`` version (6.x is incompatible) (#18099)
+- Only show Pause/Unpause tooltip on hover (#17957)
+- Improve graph view load time for dags with open groups (#17821)
+- Increase width for Run column (#17817)
+- Fix wrong query on running tis (#17631)
+- Add root to tree refresh url (#17633)
+- Do not delete running DAG from the UI (#17630)
+- Improve discoverability of Provider packages' functionality
+
+Doc only changes
+
+
+- Update version added fields in airflow/config_templates/config.yml (#18128)
+- Improve the description of how to handle dynamic task generation (#17963)
+- Improve cross-links to operators and hooks references (#17622)
+- Doc: Fix replacing Airflow version for Docker stack (#17711)
+- Make the providers operators/hooks reference much more usable (#17768)
+- Update description about the new ``connection-types`` provider meta-data
+- Suggest to use secrets backend for variable when it contains sensitive data 
(#17319)
+
 Airflow 2.1.3, 2021-08-21
 -
 


[airflow] 02/05: Fix deprecation error message rather than silencing it (#18126)

2021-09-10 Thread kaxilnaik
This is an automated email from the ASF dual-hosted git repository.

kaxilnaik pushed a commit to branch v2-1-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 8b0960278e62bef80d1899db90752fce56df4809
Author: Ash Berlin-Taylor 
AuthorDate: Fri Sep 10 00:07:19 2021 +0100

Fix deprecation error message rather than silencing it (#18126)

(cherry picked from commit c9d29467f71060f14863ca3508cb1055572479b5)
---
 .../versions/092435bf5d12_add_max_active_runs_column_to_dagmodel_.py  | 2 +-
 .../versions/ccde3e26fe78_add_index_on_state_dag_id_for_queued_.py| 2 +-
 airflow/models/dagrun.py  | 4 ++--
 3 files changed, 4 insertions(+), 4 deletions(-)

diff --git 
a/airflow/migrations/versions/092435bf5d12_add_max_active_runs_column_to_dagmodel_.py
 
b/airflow/migrations/versions/092435bf5d12_add_max_active_runs_column_to_dagmodel_.py
index 82bb4c2..a9f612d 100644
--- 
a/airflow/migrations/versions/092435bf5d12_add_max_active_runs_column_to_dagmodel_.py
+++ 
b/airflow/migrations/versions/092435bf5d12_add_max_active_runs_column_to_dagmodel_.py
@@ -44,7 +44,7 @@ def upgrade():
 batch_op.create_index(
 'idx_dag_run_running_dags',
 ["state", "dag_id"],
-postgres_where=text("state='running'"),
+postgresql_where=text("state='running'"),
 mssql_where=text("state='running'"),
 sqlite_where=text("state='running'"),
 )
diff --git 
a/airflow/migrations/versions/ccde3e26fe78_add_index_on_state_dag_id_for_queued_.py
 
b/airflow/migrations/versions/ccde3e26fe78_add_index_on_state_dag_id_for_queued_.py
index 7326d73..6a1cbe6 100644
--- 
a/airflow/migrations/versions/ccde3e26fe78_add_index_on_state_dag_id_for_queued_.py
+++ 
b/airflow/migrations/versions/ccde3e26fe78_add_index_on_state_dag_id_for_queued_.py
@@ -40,7 +40,7 @@ def upgrade():
 batch_op.create_index(
 'idx_dag_run_queued_dags',
 ["state", "dag_id"],
-postgres_where=text("state='queued'"),
+postgresql_where=text("state='queued'"),
 mssql_where=text("state='queued'"),
 sqlite_where=text("state='queued'"),
 )
diff --git a/airflow/models/dagrun.py b/airflow/models/dagrun.py
index 1e5c2c1..ec4bdfb 100644
--- a/airflow/models/dagrun.py
+++ b/airflow/models/dagrun.py
@@ -103,7 +103,7 @@ class DagRun(Base, LoggingMixin):
 'idx_dag_run_running_dags',
 'state',
 'dag_id',
-postgres_where=text("state='running'"),
+postgresql_where=text("state='running'"),
 mssql_where=text("state='running'"),
 sqlite_where=text("state='running'"),
 ),
@@ -113,7 +113,7 @@ class DagRun(Base, LoggingMixin):
 'idx_dag_run_queued_dags',
 'state',
 'dag_id',
-postgres_where=text("state='queued'"),
+postgresql_where=text("state='queued'"),
 mssql_where=text("state='queued'"),
 sqlite_where=text("state='queued'"),
 ),


[airflow] 01/05: Limit the number of queued dagruns created by the Scheduler (#18065)

2021-09-10 Thread kaxilnaik
This is an automated email from the ASF dual-hosted git repository.

kaxilnaik pushed a commit to branch v2-1-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit bf276ca72024ae10f4d55e8a79366f855e5e25a8
Author: Ephraim Anierobi 
AuthorDate: Thu Sep 9 14:24:24 2021 +0100

Limit the number of queued dagruns created by the Scheduler (#18065)

There's no limit to the amount of queued dagruns to create currently
and it has become a concern with issues raised against it. See #18023 and 
#17979

Co-authored-by: Sam Wheating 
(cherry picked from commit 0eb41b5952c2ce1884594c82bbf05835912b9812)
---
 airflow/config_templates/config.yml|  8 
 airflow/config_templates/default_airflow.cfg   |  4 ++
 airflow/jobs/scheduler_job.py  | 21 -
 ...26fe78_add_index_on_state_dag_id_for_queued_.py | 52 ++
 airflow/models/dagrun.py   | 10 +
 docs/apache-airflow/migrations-ref.rst |  4 +-
 tests/jobs/test_scheduler_job.py   | 25 +++
 7 files changed, 122 insertions(+), 2 deletions(-)

diff --git a/airflow/config_templates/config.yml 
b/airflow/config_templates/config.yml
index 9945213..7abcb06 100644
--- a/airflow/config_templates/config.yml
+++ b/airflow/config_templates/config.yml
@@ -191,6 +191,14 @@
   type: string
   example: ~
   default: "16"
+- name: max_queued_runs_per_dag
+  description: |
+The maximum number of queued dagruns for a single DAG. The scheduler 
will not create more DAG runs
+if it reaches the limit. This is not configurable at the DAG level.
+  version_added: 2.1.4
+  type: string
+  example: ~
+  default: "16"
 - name: load_examples
   description: |
 Whether to load the DAG examples that ship with Airflow. It's good to
diff --git a/airflow/config_templates/default_airflow.cfg 
b/airflow/config_templates/default_airflow.cfg
index 03d5e1f..56a1d90 100644
--- a/airflow/config_templates/default_airflow.cfg
+++ b/airflow/config_templates/default_airflow.cfg
@@ -127,6 +127,10 @@ dags_are_paused_at_creation = True
 # which is defaulted as ``max_active_runs_per_dag``.
 max_active_runs_per_dag = 16
 
+# The maximum number of queued dagruns for a single DAG. The scheduler will 
not create more DAG runs
+# if it reaches the limit. This is not configurable at the DAG level.
+max_queued_runs_per_dag = 16
+
 # Whether to load the DAG examples that ship with Airflow. It's good to
 # get started, but you probably want to set this to ``False`` in a production
 # environment
diff --git a/airflow/jobs/scheduler_job.py b/airflow/jobs/scheduler_job.py
index 8d5f888..45083a4 100644
--- a/airflow/jobs/scheduler_job.py
+++ b/airflow/jobs/scheduler_job.py
@@ -985,14 +985,31 @@ class SchedulerJob(BaseJob):
 existing_dagruns = (
 session.query(DagRun.dag_id, 
DagRun.execution_date).filter(existing_dagruns_filter).all()
 )
+max_queued_dagruns = conf.getint('core', 'max_queued_runs_per_dag')
+
+queued_runs_of_dags = defaultdict(
+int,
+session.query(DagRun.dag_id, func.count('*'))
+.filter(  # We use `list` here because SQLA doesn't accept a set
+# We use set to avoid duplicate dag_ids
+DagRun.dag_id.in_(list({dm.dag_id for dm in dag_models})),
+DagRun.state == State.QUEUED,
+)
+.group_by(DagRun.dag_id)
+.all(),
+)
 
 for dag_model in dag_models:
+# Lets quickly check if we have exceeded the number of queued 
dagruns per dags
+total_queued = queued_runs_of_dags[dag_model.dag_id]
+if total_queued >= max_queued_dagruns:
+continue
+
 try:
 dag = self.dagbag.get_dag(dag_model.dag_id, session=session)
 except SerializedDagNotFound:
 self.log.exception("DAG '%s' not found in serialized_dag 
table", dag_model.dag_id)
 continue
-
 dag_hash = self.dagbag.dags_hash.get(dag.dag_id)
 # Explicitly check if the DagRun already exists. This is an edge 
case
 # where a Dag Run is created but `DagModel.next_dagrun` and 
`DagModel.next_dagrun_create_after`
@@ -1003,6 +1020,7 @@ class SchedulerJob(BaseJob):
 # create a new one. This is so that in the next Scheduling loop we 
try to create new runs
 # instead of falling in a loop of Integrity Error.
 if (dag.dag_id, dag_model.next_dagrun) not in existing_dagruns:
+
 dag.create_dagrun(
 run_type=DagRunType.SCHEDULED,
 execution_date=dag_model.next_dagrun,
@@ -1012,6 +1030,7 @@ class SchedulerJob(BaseJob):
 dag_hash=dag_hash,
 creating_job_id=self.id,
 )
+queue
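
Read on its own, the new cap amounts to: count queued runs per DAG once up
front, then skip any DAG that is already at the limit. A hedged, standalone
sketch of that logic (the helpers `fetch_queued_counts` and
`create_queued_dagrun` are hypothetical stand-ins for the session queries in
the hunk above):

```python
from collections import defaultdict
from typing import Callable, Dict, Iterable


def create_runs_with_cap(
    dag_ids: Iterable[str],
    fetch_queued_counts: Callable[[], Dict[str, int]],
    create_queued_dagrun: Callable[[str], None],
    max_queued_runs_per_dag: int = 16,
) -> None:
    """Create queued DAG runs, skipping DAGs already at the queued-run cap."""
    # One grouped count up front; DAGs with no queued runs default to zero.
    queued_counts = defaultdict(int, fetch_queued_counts())
    for dag_id in dag_ids:
        if queued_counts[dag_id] >= max_queued_runs_per_dag:
            continue  # already at the limit; revisit on the next scheduler loop
        create_queued_dagrun(dag_id)
        queued_counts[dag_id] += 1
```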

[airflow] 03/05: Update version added fields in airflow/config_templates/config.yml (#18128)

2021-09-10 Thread kaxilnaik
This is an automated email from the ASF dual-hosted git repository.

kaxilnaik pushed a commit to branch v2-1-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit a253b10f563b741bc5f2877c86e73869c7f9cd5c
Author: Kamil Breguła 
AuthorDate: Fri Sep 10 01:21:06 2021 +0200

Update version added fields in airflow/config_templates/config.yml (#18128)

(cherry picked from commit 2767781b880b0fb03d46950c06e1e44902c25a7c)
---
 airflow/config_templates/config.yml | 128 ++--
 1 file changed, 64 insertions(+), 64 deletions(-)

diff --git a/airflow/config_templates/config.yml 
b/airflow/config_templates/config.yml
index 7abcb06..38be813 100644
--- a/airflow/config_templates/config.yml
+++ b/airflow/config_templates/config.yml
@@ -231,7 +231,7 @@
 but means plugin changes picked up by tasks straight away)
   default: "False"
   example: ~
-  version_added: "2.0.0"
+  version_added: 2.0.0
   see_also: ":ref:`plugins:loading`"
   type: boolean
 - name: fernet_key
@@ -382,7 +382,7 @@
 All the template_fields for each of Task Instance are stored in the 
Database.
 Keeping this number small may cause an error when you try to view 
``Rendered`` tab in
 TaskInstance view for older tasks.
-  version_added: 2.0.0
+  version_added: 1.10.10
   type: integer
   example: ~
   default: "30"
@@ -422,7 +422,7 @@
 Number of times the code should be retried in case of DB Operational 
Errors.
 Not all transactions will be retried as it can cause undesired state.
 Currently it is only used in ``DagFileProcessor.process_file`` to 
retry ``dagbag.sync_to_db``.
-  version_added: ~
+  version_added: 2.0.0
   type: integer
   example: ~
   default: "3"
@@ -431,7 +431,7 @@
 Hide sensitive Variables or Connection extra json keys from UI and 
task logs when set to True
 
 (Connection passwords are always hidden in logs)
-  version_added: ~
+  version_added: 2.1.0
   type: boolean
   example: ~
   default: "True"
@@ -439,7 +439,7 @@
   description: |
 A comma-separated list of extra sensitive keywords to look for in 
variables names or connection's
 extra JSON.
-  version_added: ~
+  version_added: 2.1.0
   type: string
   example: ~
   default: ""
@@ -451,7 +451,7 @@
   description: |
 The folder where airflow should store its log files
 This path must be absolute
-  version_added: ~
+  version_added: 2.0.0
   type: string
   example: ~
   default: "{AIRFLOW_HOME}/logs"
@@ -459,7 +459,7 @@
   description: |
 Airflow can store logs remotely in AWS S3, Google Cloud Storage or 
Elastic Search.
 Set this to True if you want to enable remote logging.
-  version_added: ~
+  version_added: 2.0.0
   type: string
   example: ~
   default: "False"
@@ -467,7 +467,7 @@
   description: |
 Users must supply an Airflow connection id that provides access to the 
storage
 location.
-  version_added: ~
+  version_added: 2.0.0
   type: string
   example: ~
   default: ""
@@ -477,7 +477,7 @@
 Credentials
 
`__
 will
 be used.
-  version_added: ~
+  version_added: 2.0.0
   type: string
   example: ~
   default: ""
@@ -489,14 +489,14 @@
 GCS buckets should start with "gs://"
 WASB buckets should start with "wasb" just to help Airflow select 
correct handler
 Stackdriver logs should start with "stackdriver://"
-  version_added: ~
+  version_added: 2.0.0
   type: string
   example: ~
   default: ""
 - name: encrypt_s3_logs
   description: |
 Use server-side encryption for logs stored in S3
-  version_added: ~
+  version_added: 2.0.0
   type: string
   example: ~
   default: "False"
@@ -505,7 +505,7 @@
 Logging level.
 
 Supported values: ``CRITICAL``, ``ERROR``, ``WARNING``, ``INFO``, 
``DEBUG``.
-  version_added: ~
+  version_added: 2.0.0
   type: string
   example: ~
   default: "INFO"
@@ -514,7 +514,7 @@
 Logging level for Flask-appbuilder UI.
 
 Supported values: ``CRITICAL``, ``ERROR``, ``WARNING``, ``INFO``, 
``DEBUG``.
-  version_added: ~
+  version_added: 2.0.0
   type: string
   example: ~
   default: "WARN"
@@ -523,7 +523,7 @@
 Logging class
 Specify the class that will specify the logging configuration
 This class has to be on the python classpath
-  version_added: ~
+  version_added: 2.0.0
   type: string
   example: "my.path.default_local_settings.LOGGING_CONFIG"
   default: ""
@@ -531,14 +531,14 @@
   description: |
 Flag to enable/disable Co

[airflow] branch v2-1-test updated (cd4d4fd -> e7fc43f)

2021-09-10 Thread kaxilnaik
This is an automated email from the ASF dual-hosted git repository.

kaxilnaik pushed a change to branch v2-1-test
in repository https://gitbox.apache.org/repos/asf/airflow.git.


 discard cd4d4fd  Add Changelog for 2.1.4
 discard 79d4768  Bump version to 2.1.4
 discard ef4e37b  Update version added fields in 
airflow/config_templates/config.yml (#18128)
 discard 54f3a77  Fix deprecation error message rather than silencing it 
(#18126)
 discard 203b455  Limit the number of queued dagruns created by the Scheduler 
(#18065)
 new bf276ca  Limit the number of queued dagruns created by the Scheduler 
(#18065)
 new 8b09602  Fix deprecation error message rather than silencing it 
(#18126)
 new a253b10  Update version added fields in 
airflow/config_templates/config.yml (#18128)
 new eaca043  Bump version to 2.1.4
 new e7fc43f  Add Changelog for 2.1.4

This update added new revisions after undoing existing revisions.
That is to say, some revisions that were in the old version of the
branch are not in the new version.  This situation occurs
when a user --force pushes a change and generates a repository
containing something like this:

 * -- * -- B -- O -- O -- O   (cd4d4fd)
\
 N -- N -- N   refs/heads/v2-1-test (e7fc43f)

You should already have received notification emails for all of the O
revisions, and so the following emails describe only the N revisions
from the common base, B.

Any revisions marked "omit" are not gone; other references still
refer to them.  Any revisions marked "discard" are gone forever.

The 5 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.


Summary of changes:
 docs/apache-airflow/migrations-ref.rst | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)


[airflow] branch main updated: Fix typo in StandardTaskRunning log message (#18149)

2021-09-10 Thread potiuk
This is an automated email from the ASF dual-hosted git repository.

potiuk pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/airflow.git


The following commit(s) were added to refs/heads/main by this push:
 new 42c835f  Fix typo in StandardTaskRunning log message (#18149)
42c835f is described below

commit 42c835f2b729c429d046b091c844a8f7ab0fd5f8
Author: Jed Cunningham <66968678+jedcunning...@users.noreply.github.com>
AuthorDate: Fri Sep 10 11:31:15 2021 -0600

Fix typo in StandardTaskRunning log message (#18149)
---
 airflow/task/task_runner/standard_task_runner.py | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/airflow/task/task_runner/standard_task_runner.py 
b/airflow/task/task_runner/standard_task_runner.py
index 15830d5..a72244e 100644
--- a/airflow/task/task_runner/standard_task_runner.py
+++ b/airflow/task/task_runner/standard_task_runner.py
@@ -86,7 +86,7 @@ class StandardTaskRunner(BaseTaskRunner):
 return_code = 0
 except Exception:
 self.log.exception(
-"Failed to execute job %s fo task %s",
+"Failed to execute job %s for task %s",
 self._task_instance.job_id,
 self._task_instance.task_id,
 )


[GitHub] [airflow] potiuk merged pull request #18149: Fix typo in StandardTaskRunning log message

2021-09-10 Thread GitBox


potiuk merged pull request #18149:
URL: https://github.com/apache/airflow/pull/18149


   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] potiuk commented on a change in pull request #18130: Add script that validate version fields in config.yaml

2021-09-10 Thread GitBox


potiuk commented on a change in pull request #18130:
URL: https://github.com/apache/airflow/pull/18130#discussion_r706354052



##
File path: scripts/tools/validate_version_added_fields_in_config.py
##
@@ -0,0 +1,112 @@
+#!/usr/bin/env python

Review comment:
   Still this one should be removed :)




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] potiuk commented on a change in pull request #18130: Add script that validate version fields in config.yaml

2021-09-10 Thread GitBox


potiuk commented on a change in pull request #18130:
URL: https://github.com/apache/airflow/pull/18130#discussion_r706353620



##
File path: dev/README_RELEASE_AIRFLOW.md
##
@@ -745,3 +746,15 @@ EOF
 ## Update Announcements page
 
 Update "Announcements" page at the [Official Airflow 
website](https://airflow.apache.org/announcements/)
+

Review comment:
   Ah, I see. It's fetching from PyPI. Fine for me, then.




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] github-actions[bot] commented on pull request #18149: Fix typo in StandardTaskRunning log message

2021-09-10 Thread GitBox


github-actions[bot] commented on pull request #18149:
URL: https://github.com/apache/airflow/pull/18149#issuecomment-917080206


   The PR most likely needs to run the full matrix of tests because it modifies 
parts of the core of Airflow. However, committers might decide to merge it 
quickly and take the risk. If they don't merge it quickly, please rebase it to 
the latest main at your convenience, or amend the last commit of the PR, and 
push it with --force-with-lease.


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] hpatel-higi commented on issue #18127: Fernet InvalidToken

2021-09-10 Thread GitBox


hpatel-higi commented on issue #18127:
URL: https://github.com/apache/airflow/issues/18127#issuecomment-917070868


   Here is a screenshot of the Connections page from the Airflow 1.10.8 
web UI. You can see the connections are marked as Encrypted.
   
   
![image](https://user-images.githubusercontent.com/67915750/132891908-e5a0fa4b-c6b7-4678-bdc4-9d21e7bdae34.png)
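   
   For context on the error itself, a minimal illustration using the 
`cryptography` library (not Airflow code): an encrypted value only decrypts if 
one of the configured Fernet keys is the key that originally encrypted it, 
which is typically why a changed or rotated key surfaces as `InvalidToken`:
   
   ```python
   from cryptography.fernet import Fernet, InvalidToken, MultiFernet
   
   old_key = Fernet.generate_key()   # key in use when the value was stored
   new_key = Fernet.generate_key()   # key currently configured
   
   token = Fernet(old_key).encrypt(b"my-connection-password")
   
   try:
       # Only the new key configured: decryption fails with InvalidToken.
       Fernet(new_key).decrypt(token)
   except InvalidToken:
       print("InvalidToken: the current key cannot decrypt the stored value")
   
   # Listing both keys (new first, old second) lets existing values decrypt
   # while new writes use the new key - the usual key-rotation setup.
   print(MultiFernet([Fernet(new_key), Fernet(old_key)]).decrypt(token))
   ```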
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] c-nuro opened a new pull request #18150: Remove check for at least one schema

2021-09-10 Thread GitBox


c-nuro opened a new pull request #18150:
URL: https://github.com/apache/airflow/pull/18150


   When updating an existing table or inserting data into a particular 
partition, no schema is needed.
   Autodetect doesn't always work, e.g. it cannot always pick the partition 
correctly, and the other options require duplicating the schema in Airflow.
   ---
   **^ Add meaningful description above**
   
   Read the **[Pull Request 
Guidelines](https://github.com/apache/airflow/blob/main/CONTRIBUTING.rst#pull-request-guidelines)**
 for more information.
   In case of fundamental code change, Airflow Improvement Proposal 
([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals))
 is needed.
   In case of a new dependency, check compliance with the [ASF 3rd Party 
License Policy](https://www.apache.org/legal/resolved.html#category-x).
   In case of backwards incompatible changes please leave a note in 
[UPDATING.md](https://github.com/apache/airflow/blob/main/UPDATING.md).
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] boring-cyborg[bot] commented on pull request #18150: Remove check for at least one schema

2021-09-10 Thread GitBox


boring-cyborg[bot] commented on pull request #18150:
URL: https://github.com/apache/airflow/pull/18150#issuecomment-917069079


   Congratulations on your first Pull Request and welcome to the Apache Airflow 
community! If you have any issues or are unsure about anything, please check 
our Contribution Guide 
(https://github.com/apache/airflow/blob/main/CONTRIBUTING.rst)
   Here are some useful points:
   - Pay attention to the quality of your code (flake8, mypy and type 
annotations). Our [pre-commits]( 
https://github.com/apache/airflow/blob/main/STATIC_CODE_CHECKS.rst#prerequisites-for-pre-commit-hooks)
 will help you with that.
   - In case of a new feature add useful documentation (in docstrings or in 
`docs/` directory). Adding a new operator? Check this short 
[guide](https://github.com/apache/airflow/blob/main/docs/apache-airflow/howto/custom-operator.rst)
 Consider adding an example DAG that shows how users should use it.
   - Consider using [Breeze 
environment](https://github.com/apache/airflow/blob/main/BREEZE.rst) for 
testing locally, it’s a heavy docker but it ships with a working Airflow and a 
lot of integrations.
   - Be patient and persistent. It might take some time to get a review or get 
the final approval from Committers.
   - Please follow [ASF Code of 
Conduct](https://www.apache.org/foundation/policies/conduct) for all 
communication including (but not limited to) comments on Pull Requests, Mailing 
list and Slack.
   - Be sure to read the [Airflow Coding style]( 
https://github.com/apache/airflow/blob/main/CONTRIBUTING.rst#coding-style-and-best-practices).
   Apache Airflow is a community-driven project and together we are making it 
better 🚀.
   In case of doubts contact the developers at:
   Mailing List: d...@airflow.apache.org
   Slack: https://s.apache.org/airflow-slack
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[airflow] branch main updated (975a4e0 -> d491afb)

2021-09-10 Thread kaxilnaik
This is an automated email from the ASF dual-hosted git repository.

kaxilnaik pushed a change to branch main
in repository https://gitbox.apache.org/repos/asf/airflow.git.


from 975a4e0  Fix quarentine tests affected by AIP-39 (#18141)
 add d491afb  Doc: Minor wording tweaks (#18148)

No new revisions were added by this update.

Summary of changes:
 docs/apache-airflow/start/docker.rst | 6 +++---
 1 file changed, 3 insertions(+), 3 deletions(-)

