[GitHub] [airflow] fuguixing edited a comment on issue #17429: Scan the DAGs directory for new files through RESTful API

2021-08-05 Thread GitBox


fuguixing edited a comment on issue #17429:
URL: https://github.com/apache/airflow/issues/17429#issuecomment-894013281


   Thanks for your reply!
   In our environment, I need to make a newly added DAG file available immediately, without waiting for Airflow to call **_refresh_dag_dir** (in dag_processing.py); setting **dag_dir_list_interval** very small instead would waste resources. I tried the following code, which loads the new DAG file into the database and makes it available immediately; it uses **DagFileProcessor** from scheduler_job.py. Can I encapsulate it as a RESTful API?
   
   In heartbeat() in dag_processing.py:
   `processor = self._processor_factory(file_path, self._zombies)`
   `processor.start()`
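
   For illustration only, a minimal sketch of what such an endpoint might look like; it uses the public `DagBag.sync_to_db()` API rather than the internal `DagFileProcessor`, and the blueprint wiring and route name are my own assumptions, not anything Airflow ships:
   ```python
   from flask import Blueprint, jsonify, request

   from airflow.models import DagBag

   bp = Blueprint("dag_refresh", __name__)

   @bp.route("/api/experimental/refresh_dag_file", methods=["POST"])  # hypothetical route
   def refresh_dag_file():
       file_path = request.json["file_path"]
       # Parse only the given file and persist the result to the metadata
       # database, mirroring what _refresh_dag_dir does for each file.
       dagbag = DagBag(dag_folder=file_path, include_examples=False)
       dagbag.sync_to_db()
       return jsonify(dags=list(dagbag.dags), import_errors=dagbag.import_errors)
   ```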


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] baolsen commented on issue #10874: SSHHook get_conn() does not re-use client

2021-08-05 Thread GitBox


baolsen commented on issue #10874:
URL: https://github.com/apache/airflow/issues/10874#issuecomment-894009918


   PR is up :) Observations and suggestions welcome. 
   Happy to make other improvements based on the feedback. 
   https://github.com/apache/airflow/pull/17378


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] goodhamgupta edited a comment on issue #16728: SparkKubernetesOperator to support arguments

2021-08-05 Thread GitBox


goodhamgupta edited a comment on issue #16728:
URL: https://github.com/apache/airflow/issues/16728#issuecomment-894005544


   Hi @potiuk,
   
   I don't think I understand the issue well enough yet. For the SparkKubernetesOperator, I see the following code segment:
   ```py
   class SparkKubernetesOperator(BaseOperator):
       """
       """

       template_fields = ['application_file', 'namespace']
       template_ext = ('.yaml', '.yml', '.json')
       ui_color = '#f4a460'
   ```
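
   For context, because `.json` is in `template_ext`, Airflow treats a templated field value ending in that extension as a path, reads the file, and renders its contents with Jinja before submitting. A hedged usage sketch (task id and file name invented):
   ```python
   from airflow.providers.cncf.kubernetes.operators.spark_kubernetes import SparkKubernetesOperator

   # 'application_file' is a template field, and '.json' is in template_ext,
   # so the file's contents are rendered as a Jinja template at runtime.
   submit = SparkKubernetesOperator(
       task_id="spark_pi_submit",
       namespace="default",
       application_file="example_spark_application.json",  # hypothetical file
   )
   ```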
   
   I think this should already allow a `.json` file to be uploaded from the UI? I would appreciate it if you could provide me with any pointers on how to get started on this issue.
   
   Thanks!
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] github-actions[bot] commented on pull request #17456: Adding TaskGroup support in chain()

2021-08-05 Thread GitBox


github-actions[bot] commented on pull request #17456:
URL: https://github.com/apache/airflow/pull/17456#issuecomment-894005427


   The PR most likely needs to run full matrix of tests because it modifies 
parts of the core of Airflow. However, committers might decide to merge it 
quickly and take the risk. If they don't merge it quickly - please rebase it to 
the latest main at your convenience, or amend the last commit of the PR, and 
push it with --force-with-lease.


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] uranusjr commented on a change in pull request #16866: Remove default_args pattern + added get_current_context() use for Core Airflow example DAGs

2021-08-05 Thread GitBox


uranusjr commented on a change in pull request #16866:
URL: https://github.com/apache/airflow/pull/16866#discussion_r683950115



##
File path: airflow/example_dags/tutorial_taskflow_api_etl_virtualenv.py
##
@@ -19,24 +19,14 @@
 
 # [START tutorial]
 # [START import_module]
-import json

Review comment:
   Why move this?




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] josh-fell opened a new pull request #17456: Adding TaskGroup support in chain()

2021-08-05 Thread GitBox


josh-fell opened a new pull request #17456:
URL: https://github.com/apache/airflow/pull/17456


   Related to: #17083, #16635
   
   This PR ensures that `TaskGroups` can be used to set dependencies while 
calling the `chain()` method.  Support for `XComArgs` and `EdgeModifiers` has 
been implemented in previous PRs: #16732, #17099
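
   For reference, a hedged sketch of the usage this enables (DAG id and task names invented):
   ```python
   from datetime import datetime

   from airflow import DAG
   from airflow.models.baseoperator import chain
   from airflow.operators.dummy import DummyOperator
   from airflow.utils.task_group import TaskGroup

   with DAG("chain_taskgroup_demo", start_date=datetime(2021, 8, 1), schedule_interval=None):
       start = DummyOperator(task_id="start")
       with TaskGroup(group_id="section_1") as section_1:
           DummyOperator(task_id="task_a")
       end = DummyOperator(task_id="end")
       # With this PR, a TaskGroup can appear directly as a chain() operand:
       chain(start, section_1, end)
   ```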
   
   ---
   **^ Add meaningful description above**
   
   Read the **[Pull Request 
Guidelines](https://github.com/apache/airflow/blob/main/CONTRIBUTING.rst#pull-request-guidelines)**
 for more information.
   In case of fundamental code change, Airflow Improvement Proposal 
([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals))
 is needed.
   In case of a new dependency, check compliance with the [ASF 3rd Party 
License Policy](https://www.apache.org/legal/resolved.html#category-x).
   In case of backwards incompatible changes please leave a note in 
[UPDATING.md](https://github.com/apache/airflow/blob/main/UPDATING.md).
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[jira] [Commented] (AIRFLOW-4922) If a task crashes, host name is not committed to the database so logs aren't able to be seen in the UI

2021-08-05 Thread ASF GitHub Bot (Jira)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-4922?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17394452#comment-17394452
 ] 

ASF GitHub Bot commented on AIRFLOW-4922:
-

gowdra01 commented on pull request #6722:
URL: https://github.com/apache/airflow/pull/6722#issuecomment-893961036


   I am stuck with this too. I am using version 2.1.0 and getting the error:
   *** Failed to fetch log file from worker. 503 Server Error: Service Unavailable for url:


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


> If a task crashes, host name is not committed to the database so logs aren't 
> able to be seen in the UI
> --
>
> Key: AIRFLOW-4922
> URL: https://issues.apache.org/jira/browse/AIRFLOW-4922
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: logging
>Affects Versions: 1.10.3
>Reporter: Andrew Harmon
>Assignee: wanghong-T
>Priority: Major
>
> Sometimes when a task fails, the log shows the following
> {code}
> *** Log file does not exist: /usr/local/airflow/logs/my_dag/my_task/2019-07-07T09:00:00+00:00/1.log
> *** Fetching from: http://:8793/log/my_dag/my_task/2019-07-07T09:00:00+00:00/1.log
> *** Failed to fetch log file from worker. Invalid URL 'http://:8793/log/my_dag/my_task/2019-07-07T09:00:00+00:00/1.log': No host supplied
> {code}
> I believe this is due to the fact that the row is not committed to the 
> database until after the task finishes. 
> https://github.com/apache/airflow/blob/a1f9d9a03faecbb4ab52def2735e374b2e88b2b9/airflow/models/taskinstance.py#L857



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[GitHub] [airflow] gowdra01 commented on pull request #6722: [AIRFLOW-4922]Fix task get log by Web UI

2021-08-05 Thread GitBox


gowdra01 commented on pull request #6722:
URL: https://github.com/apache/airflow/pull/6722#issuecomment-893961036


   I am stuck with this too. I am using version 2.1.0 and getting the error:
   *** Failed to fetch log file from worker. 503 Server Error: Service Unavailable for url:


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[jira] [Commented] (AIRFLOW-4922) If a task crashes, host name is not committed to the database so logs aren't able to be seen in the UI

2021-08-05 Thread ASF GitHub Bot (Jira)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-4922?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17394442#comment-17394442
 ] 

ASF GitHub Bot commented on AIRFLOW-4922:
-

xuemengran commented on pull request #6722:
URL: https://github.com/apache/airflow/pull/6722#issuecomment-893956847


   In version 2.1.1, I tried to modify the airflow/utils/log/file_task_handler.py file to obtain the hostname by reading the log table.
   I confirmed through debugging that I could get the host information this way, but a bigger problem appeared.
   The task is marked as successful without actually being scheduled, and the log is still not viewable, so I have concluded that to solve this problem, the host information must be written to the task_instance table before the task is executed.
   I think this bug is very important, because it directly affects the use of Airflow in distributed scenarios; please solve it as soon as possible!


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


> If a task crashes, host name is not committed to the database so logs aren't 
> able to be seen in the UI
> --
>
> Key: AIRFLOW-4922
> URL: https://issues.apache.org/jira/browse/AIRFLOW-4922
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: logging
>Affects Versions: 1.10.3
>Reporter: Andrew Harmon
>Assignee: wanghong-T
>Priority: Major
>
> Sometimes when a task fails, the log shows the following
> {code}
> *** Log file does not exist: /usr/local/airflow/logs/my_dag/my_task/2019-07-07T09:00:00+00:00/1.log
> *** Fetching from: http://:8793/log/my_dag/my_task/2019-07-07T09:00:00+00:00/1.log
> *** Failed to fetch log file from worker. Invalid URL 'http://:8793/log/my_dag/my_task/2019-07-07T09:00:00+00:00/1.log': No host supplied
> {code}
> I believe this is due to the fact that the row is not committed to the 
> database until after the task finishes. 
> https://github.com/apache/airflow/blob/a1f9d9a03faecbb4ab52def2735e374b2e88b2b9/airflow/models/taskinstance.py#L857



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[GitHub] [airflow] xuemengran commented on pull request #6722: [AIRFLOW-4922]Fix task get log by Web UI

2021-08-05 Thread GitBox


xuemengran commented on pull request #6722:
URL: https://github.com/apache/airflow/pull/6722#issuecomment-893956847


   In version 2.1.1, I tried to modify the airflow/utils/log/file_task_handler.py file to obtain the hostname by reading the log table.
   I confirmed through debugging that I could get the host information this way, but a bigger problem appeared.
   The task is marked as successful without actually being scheduled, and the log is still not viewable, so I have concluded that to solve this problem, the host information must be written to the task_instance table before the task is executed.
   I think this bug is very important, because it directly affects the use of Airflow in distributed scenarios; please solve it as soon as possible!


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] Sonins edited a comment on pull request #17428: Fix elasticsearch-secret template port default function

2021-08-05 Thread GitBox


Sonins edited a comment on pull request #17428:
URL: https://github.com/apache/airflow/pull/17428#issuecomment-893932430


   Thank you for your review @jedcunningham.
   I ran the unit tests with pytest in the Breeze environment, but I don't know how to attach the result to this PR in a fancier way.
   
   So, I'll just paste the pytest output.
   ```
   root@fa3d600de005:/opt/airflow/chart# pytest tests/

   ========================== test session starts ===========================
   platform linux -- Python 3.6.14, pytest-6.2.4, py-1.10.0, pluggy-0.13.1 -- /usr/local/bin/python
   cachedir: .pytest_cache
   rootdir: /opt/airflow, configfile: pytest.ini
   plugins: requests-mock-1.9.3, flaky-3.7.0, timeouts-1.2.1, anyio-3.3.0, forked-1.3.0, httpx-0.12.0, instafail-0.4.2, cov-2.12.1, xdist-2.3.0, rerunfailures-9.1.1, celery-4.4.7
   setup timeout: 0.0s, execution timeout: 0.0s, teardown timeout: 0.0s
   collected 400 items

   tests/test_airflow_common.py::AirflowCommon::test_annotations PASSED [ 0%]
   tests/test_airflow_common.py::AirflowCommon::test_dags_mount_0 PASSED [ 0%]
   tests/test_airflow_common.py::AirflowCommon::test_dags_mount_1 PASSED [ 0%]
   tests/test_airflow_common.py::AirflowCommon::test_dags_mount_2 PASSED [ 1%]
   tests/test_annotations.py::AnnotationsTest::test_service_account_annotations PASSED [ 1%]
   tests/test_basic_helm_chart.py::TestBaseChartTest::test_annotations_on_airflow_pods_in_deployment PASSED [ 1%]
   tests/test_basic_helm_chart.py::TestBaseChartTest::test_basic_deployment_without_default_users PASSED [ 1%]
   tests/test_basic_helm_chart.py::TestBaseChartTest::test_basic_deployments PASSED [ 2%]
   tests/test_basic_helm_chart.py::TestBaseChartTest::test_chart_is_consistent_with_official_airflow_image PASSED [ 2%]
   tests/test_basic_helm_chart.py::TestBaseChartTest::test_invalid_dags_access_mode PASSED [ 2%]
   tests/test_basic_helm_chart.py::TestBaseChartTest::test_invalid_pull_policy_0_airflow PASSED [ 2%]
   tests/test_basic_helm_chart.py::TestBaseChartTest::test_invalid_pull_policy_1_pod_template PASSED [ 3%]
   tests/test_basic_helm_chart.py::TestBaseChartTest::test_invalid_pull_policy_2_flower PASSED [ 3%]
   tests/test_basic_helm_chart.py::TestBaseChartTest::test_invalid_pull_policy_3_statsd PASSED [ 3%]
   tests/test_basic_helm_chart.py::TestBaseChartTest::test_invalid_pull_policy_4_redis PASSED [ 3%]
   tests/test_basic_helm_chart.py::TestBaseChartTest::test_invalid_pull_policy_5_pgbouncer PASSED [ 4%]
   tests/test_basic_helm_chart.py::TestBaseChartTest::test_invalid_pull_policy_6_pgbouncerExporter PASSED [ 4%]
   tests/test_basic_helm_chart.py::TestBaseChartTest::test_invalid_pull_policy_7_gitSync PASSED [ 4%]
   tests/test_basic_helm_chart.py::TestBaseChartTest::test_labels_are_valid PASSED [ 4%]
   tests/test_basic_helm_chart.py::TestBaseChartTest::test_network_policies_are_valid PASSED [ 5%]
   tests/test_basic_helm_chart.py::TestBaseChartTest::test_unsupported_executor PASSED [ 5%]
   tests/test_celery_kubernetes_executor.py::CeleryKubernetesExecutorTest::test_should_create_a_worker_deployment_with_the_celery_executor PASSED [ 5%]
   tests/test_celery_kubernetes_executor.py::CeleryKubernetesExecutorTest::test_should_create_a_worker_deployment_with_the_celery_kubernetes_executor PASSED [ 5%]
   tests/test_chart_quality.py::ChartQualityTest::test_values_validate_schema PASSED [ 6%]
   tests/test_cleanup_pods.py::CleanupPodsTest::test_should_change_image_when_set_airflow_image PASSED [ 6%]
   tests/test_cleanup_pods.py::CleanupPodsTest::test_should_create_cronjob_for_enabled_cleanup PASSED [ 6%]
   tests/test_cleanup_pods.py::CleanupPodsTest::test_should_create_valid_affinity_tolerations_and_node_selector PASSED [ 6%]
   tests/test_configmap.py::ConfigmapTest::test_airflow_local_settings PASSED [ 7%]
   tests/test_configmap.py::ConfigmapTest::test_kerberos_config_available_with_celery_executor PASSED [ 7%]
   tests/test_configmap.py::ConfigmapTest::test_multiple_annotations PASSED [ 7%]
   tests/test_configmap.py::ConfigmapTest::test_no_airflow_local_settings_by_default PASSED [ 7%]
   tests/test_configmap.py::ConfigmapTest::test_single_annotation PASSED [ 8%]
   tests/test_create_user_job.py::CreateUserJobTest::test_should_create_valid_affinity_tolerations_and_node_selector PASSED [ 8%]
   tests/test_create_user_job.py::CreateUserJobTest::test_should_run_by_default PASSED [ 8%]
   tests/test_create_user_job.py::CreateUserJobTest::test_should_support_annotations PASSED [ 8%]
   tests/test_dags_persistent_volume_claim.py::DagsPersistentVolumeClaimTest::test_should_generate_a_document_if_persistence_is_enabled_and_not_using_an_existing_claim PASSED [ 9%]
   tests/test_dags_persistent_volume_claim.py::DagsPersistentVolumeClaimTest::test_should_not_generate_a_document_if_persistence_is_disabled PASSED [ 9%]


[GitHub] [airflow] mik-laj commented on a change in pull request #16571: Implemented Basic EKS Integration

2021-08-05 Thread GitBox


mik-laj commented on a change in pull request #16571:
URL: https://github.com/apache/airflow/pull/16571#discussion_r683872000



##
File path: airflow/providers/amazon/aws/hooks/eks.py
##
@@ -0,0 +1,420 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+"""Interact with Amazon EKS, using the boto3 library."""
+import base64
+import json
+import re
+import tempfile
+from contextlib import contextmanager
+from functools import partial
+from typing import Callable, Dict, List, Optional
+
+import yaml
+from botocore.signers import RequestSigner
+
+from airflow.providers.amazon.aws.hooks.base_aws import AwsBaseHook
+from airflow.utils.json import AirflowJsonEncoder
+
+DEFAULT_CONTEXT_NAME = 'aws'
+DEFAULT_PAGINATION_TOKEN = ''
+DEFAULT_POD_USERNAME = 'aws'
+STS_TOKEN_EXPIRES_IN = 60
+
+
+class EKSHook(AwsBaseHook):
+    """
+    Interact with Amazon EKS, using the boto3 library.
+
+    Additional arguments (such as ``aws_conn_id``) may be specified and
+    are passed down to the underlying AwsBaseHook.
+
+    .. seealso::
+        :class:`~airflow.providers.amazon.aws.hooks.base_aws.AwsBaseHook`
+    """
+
+    client_type = 'eks'
+
+    def __init__(self, *args, **kwargs) -> None:
+        kwargs["client_type"] = self.client_type
+        super().__init__(*args, **kwargs)
+
+    def create_cluster(self, name: str, roleArn: str, resourcesVpcConfig: Dict, **kwargs) -> Dict:
+        """
+        Creates an Amazon EKS control plane.
+
+        :param name: The unique name to give to your Amazon EKS Cluster.
+        :type name: str
+        :param roleArn: The Amazon Resource Name (ARN) of the IAM role that provides permissions
+          for the Kubernetes control plane to make calls to AWS API operations on your behalf.
+        :type roleArn: str
+        :param resourcesVpcConfig: The VPC configuration used by the cluster control plane.
+        :type resourcesVpcConfig: Dict
+
+        :return: Returns descriptive information about the created EKS Cluster.
+        :rtype: Dict
+        """
+        eks_client = self.conn
+
+        response = eks_client.create_cluster(
+            name=name, roleArn=roleArn, resourcesVpcConfig=resourcesVpcConfig, **kwargs
+        )
+
+        self.log.info("Created cluster with the name %s.", response.get('cluster').get('name'))
+        return response
+
+    def create_nodegroup(
+        self, clusterName: str, nodegroupName: str, subnets: List[str], nodeRole: str, **kwargs
+    ) -> Dict:
+        """
+        Creates an Amazon EKS Managed Nodegroup for an EKS Cluster.
+
+        :param clusterName: The name of the cluster to create the EKS Managed Nodegroup in.
+        :type clusterName: str
+        :param nodegroupName: The unique name to give your managed nodegroup.
+        :type nodegroupName: str
+        :param subnets: The subnets to use for the Auto Scaling group that is created for your nodegroup.
+        :type subnets: List[str]
+        :param nodeRole: The Amazon Resource Name (ARN) of the IAM role to associate with your nodegroup.
+        :type nodeRole: str
+
+        :return: Returns descriptive information about the created EKS Managed Nodegroup.
+        :rtype: Dict
+        """
+        eks_client = self.conn
+        # The below tag is mandatory and must have a value of either 'owned' or 'shared'
+        # A value of 'owned' denotes that the subnets are exclusive to the nodegroup.
+        # The 'shared' value allows more than one resource to use the subnet.
+        tags = {'kubernetes.io/cluster/' + clusterName: 'owned'}
+        if "tags" in kwargs:
+            tags = {**tags, **kwargs["tags"]}
+            kwargs.pop("tags")
+
+        response = eks_client.create_nodegroup(
+            clusterName=clusterName,
+            nodegroupName=nodegroupName,
+            subnets=subnets,
+            nodeRole=nodeRole,
+            tags=tags,
+            **kwargs,
+        )
+
+        self.log.info(
+            "Created a managed nodegroup named %s in cluster %s",
+            response.get('nodegroup').get('nodegroupName'),
+            response.get('nodegroup').get('clusterName'),
+        )
+        return response
+
+    def delete_cluster(self, name: str) ->

[GitHub] [airflow] github-actions[bot] commented on pull request #16268: Change Regex For Inclusive Words

2021-08-05 Thread GitBox


github-actions[bot] commented on pull request #16268:
URL: https://github.com/apache/airflow/pull/16268#issuecomment-893904676


   This pull request has been automatically marked as stale because it has not 
had recent activity. It will be closed in 5 days if no further activity occurs. 
Thank you for your contributions.


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] github-actions[bot] commented on pull request #16012: Fixing Glue hooks/operators

2021-08-05 Thread GitBox


github-actions[bot] commented on pull request #16012:
URL: https://github.com/apache/airflow/pull/16012#issuecomment-893904709


   This pull request has been automatically marked as stale because it has not 
had recent activity. It will be closed in 5 days if no further activity occurs. 
Thank you for your contributions.


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] mik-laj edited a comment on issue #17429: Scan the DAGs directory for new files through RESTful API

2021-08-05 Thread GitBox


mik-laj edited a comment on issue #17429:
URL: https://github.com/apache/airflow/issues/17429#issuecomment-893892411


   Will the proposed improvements from https://github.com/apache/airflow/issues/17437 solve your problem?


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] jarfgit edited a comment on pull request #17145: Adds an s3 list prefixes operator

2021-08-05 Thread GitBox


jarfgit edited a comment on pull request #17145:
URL: https://github.com/apache/airflow/pull/17145#issuecomment-893890145


   @o-nikolas @iostreamdoth @potiuk Ok, at long last I've updated this pull 
request:
   
   * Refactored `S3ListOperator` to take a `recursive` parameter
   * Refactored the `s3_list_keys` hook to return both a list of keys and a list of common prefixes (I know this is out of scope for the issue, but making unnecessary API calls bothers me and it was easy enough to do)
   * Experimented with deleting the `s3_list_prefixes` hook altogether, but 
it's used elsewhere and constituted some serious scope creep  
   * If `recursive == True` then we add the prefixes to the list of keys 
returned by the operator
   * Ensured `recursive` defaults to `False` and _shouldn't_ present a breaking 
change
   * Added unit tests to both the `s3_list_keys` and `s3_list_prefixes` hooks 
to make sure I (and presumably future contributors) fully understood what the 
`delimiter` and `prefix` args were doing and how various combinations affected 
what was returned when `recursive == True`
   
   
   The above assumes the following:
   
   * The user wants to retrieve both keys and prefixes using the same optional 
params (i.e. `delimiter` and `prefix`). 
   * The user can distinguish keys from prefixes in the list returned by the operator. I would assume that keys have a file extension and prefixes include the delimiter... but it seems possible that keys may not _always_ have a file extension, and I'm basing my understanding of prefixes containing the delimiter on the unit tests (see the sketch after this list).
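
   To make the key/prefix distinction concrete, a hedged boto3 sketch (bucket and prefix names invented) of what S3 returns when a delimiter is supplied:
   ```python
   import boto3

   s3 = boto3.client("s3")
   resp = s3.list_objects_v2(Bucket="my-bucket", Prefix="data/", Delimiter="/")

   # Keys are full object names, e.g. 'data/report.csv'.
   keys = [obj["Key"] for obj in resp.get("Contents", [])]
   # Common prefixes end with the delimiter, e.g. 'data/2021/'.
   prefixes = [p["Prefix"] for p in resp.get("CommonPrefixes", [])]
   ```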
   
   
   So I have this question:
   
   If we want to refactor the operator to return both prefixes and keys but a 
user might want to use different optional params between keys and prefixes, I 
don't see an alternative other than requiring the user to use the operator 
twice with different params. With this in mind, is there a valid argument to 
have a dedicated `S3ListPrefixes` operator after all?
   
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] mik-laj commented on issue #17429: Scan the DAGs directory for new files through RESTful API

2021-08-05 Thread GitBox


mik-laj commented on issue #17429:
URL: https://github.com/apache/airflow/issues/17429#issuecomment-893892411


   Related PR: https://github.com/apache/airflow/issues/17437


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] mik-laj commented on issue #17429: Scan the DAGs directory for new files through RESTful API

2021-08-05 Thread GitBox


mik-laj commented on issue #17429:
URL: https://github.com/apache/airflow/issues/17429#issuecomment-893891678


   This can be problematic as the webserver is stateless and does not have 
access to the DAG files. 


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] mik-laj commented on issue #17443: No link for kibana when using frontend configuration

2021-08-05 Thread GitBox


mik-laj commented on issue #17443:
URL: https://github.com/apache/airflow/issues/17443#issuecomment-893891221


   To enable external links for logs, you need to use a task handler that 
supports external links. In this case, you should use 
https://github.com/apache/airflow/blob/866a601b76e219b3c043e1dbbc8fb22300866351/airflow/providers/elasticsearch/log/es_task_handler.py#L44
   
   It looks like you don't have remote logging turned on, so this task handler is not used.
   ```
   [logging]
   remote_logging = True
   ```
   To check the current task handler, you can use the ``airflow info`` command:
   ```
   $ airflow info  | grep 'task_logging_handler'
   task_logging_handler   | airflow.utils.log.file_task_handler.FileTaskHandler
   ```
   For more details, see:
   
http://airflow.apache.org/docs/apache-airflow-providers-elasticsearch/stable/logging.html
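
   Once remote logging goes through the Elasticsearch task handler, the external link itself is built from the ``frontend`` option; a hedged sketch (host and Kibana query string invented):
   ```
   [elasticsearch]
   frontend = https://kibana.example.com/app/kibana#/discover?_a=(query:(language:kuery,query:'log_id:"{log_id}"'))
   ```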


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] mik-laj commented on a change in pull request #17400: Example dag slackfile

2021-08-05 Thread GitBox


mik-laj commented on a change in pull request #17400:
URL: https://github.com/apache/airflow/pull/17400#discussion_r683846325



##
File path: airflow/providers/slack/operators/slack.py
##
@@ -195,49 +196,50 @@ class SlackAPIFileOperator(SlackAPIOperator):
     :type content: str
     """
 
-    template_fields = ('channel', 'initial_comment', 'filename', 'filetype', 'content')
+    template_fields = ('channel', 'initial_comment', 'filetype', 'content')
     ui_color = '#44BEDF'
 
     def __init__(
         self,
         channel: str = '#general',
         initial_comment: str = 'No message has been set!',
-        filename: str = None,
+        file: str = None,

Review comment:
   It looks like a breaking change. Is there any way we can keep backward compatibility? If not, can you add a note to the changelog? For an example of such a note, see the Google provider: https://github.com/apache/airflow/blob/main/airflow/providers/google/CHANGELOG.rst
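
   A hedged sketch of one possible backward-compatibility shim (the helper itself is invented; only the `file`/`filename` names come from the diff above):
   ```python
   import warnings

   def _resolve_file_arg(file=None, filename=None):
       # Accept the deprecated 'filename' kwarg but steer callers to 'file'.
       if filename is not None:
           warnings.warn(
               "'filename' is deprecated; use 'file' instead.",
               DeprecationWarning,
               stacklevel=2,
           )
           file = file if file is not None else filename
       return file
   ```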




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] mik-laj commented on a change in pull request #17426: Allow using default celery commands for custom Celery executors subclassed from existing

2021-08-05 Thread GitBox


mik-laj commented on a change in pull request #17426:
URL: https://github.com/apache/airflow/pull/17426#discussion_r683844854



##
File path: airflow/cli/cli_parser.py
##
@@ -60,10 +62,17 @@ def _check_value(self, action, value):
         if action.dest == 'subcommand' and value == 'celery':
             executor = conf.get('core', 'EXECUTOR')
             if executor not in (CELERY_EXECUTOR, CELERY_KUBERNETES_EXECUTOR):
-                message = (
-                    f'celery subcommand works only with CeleryExecutor, your current executor: {executor}'
-                )
-                raise ArgumentError(action, message)
+                executor_cls = import_string(ExecutorLoader.executors.get(executor, executor))

Review comment:
   I am not sure if this will allow the executor to load in all cases, and in particular whether it works with executors provided by plugins. `ExecutorLoader.load_executor` supports one more syntax:
   
https://github.com/apache/airflow/blob/8505d2f0a4524313e3eff7a4f16b9a9439c7a79f/airflow/executors/executor_loader.py#L73
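
   A hedged sketch of the alternative hinted at here: resolving through `ExecutorLoader.load_executor` (which also understands the plugin and module-path syntaxes) and checking the resulting class; the helper name is invented:
   ```python
   from airflow.executors.executor_loader import ExecutorLoader

   def uses_celery_executor(executor_name: str) -> bool:
       # load_executor resolves built-in names, plugin executors, and the
       # 'my.module.ExecutorClass' syntax linked above.
       from airflow.executors.celery_executor import CeleryExecutor

       executor = ExecutorLoader.load_executor(executor_name)
       return isinstance(executor, CeleryExecutor)
   ```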
   




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] mik-laj commented on a change in pull request #17448: Aws secrets manager backend

2021-08-05 Thread GitBox


mik-laj commented on a change in pull request #17448:
URL: https://github.com/apache/airflow/pull/17448#discussion_r683842856



##
File path: airflow/providers/amazon/aws/secrets/secrets_manager.py
##
@@ -96,28 +97,74 @@ def __init__(
 
     @cached_property
     def client(self):
-        """Create a Secrets Manager client"""
+        """
+        Create a Secrets Manager client
+        """
         session = boto3.session.Session(
-            profile_name=self.profile_name,
+            profile_name=self.profile_name
         )
         return session.client(service_name="secretsmanager", **self.kwargs)
 
-    def get_conn_uri(self, conn_id: str) -> Optional[str]:
+    def _get_extra(self, secret, conn_string):
+        if 'extra' in secret:
+            extra_dict = secret['extra']
+            kvs = "&".join([f"{key}={value}" for key, value in extra_dict.items()])

Review comment:
   What do you think about [urllib.parse.urlencode](https://docs.python.org/3/library/urllib.parse.html#urllib.parse.urlencode)? I'm afraid the current implementation may have problems with some special characters.
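
   A quick illustration of the difference (values invented):
   ```python
   from urllib.parse import urlencode

   extra_dict = {"role": "my role", "region": "eu-west-1&foo=1"}

   # The manual join leaves spaces, '&' and '=' unescaped:
   "&".join(f"{k}={v}" for k, v in extra_dict.items())
   # -> 'role=my role&region=eu-west-1&foo=1'

   # urlencode escapes them properly:
   urlencode(extra_dict)
   # -> 'role=my+role&region=eu-west-1%26foo%3D1'
   ```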




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] mik-laj commented on pull request #17448: Aws secrets manager backend

2021-08-05 Thread GitBox


mik-laj commented on pull request #17448:
URL: https://github.com/apache/airflow/pull/17448#issuecomment-893875377


   Can you also update the docs?
   
http://airflow.apache.org/docs/apache-airflow-providers-amazon/stable/secrets-backends/aws-secrets-manager.html


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] wolfier opened a new issue #17453: SQLAlchemy constraint for apache-airflow-snowflake-provider installation

2021-08-05 Thread GitBox


wolfier opened a new issue #17453:
URL: https://github.com/apache/airflow/issues/17453


   **Apache Airflow version**: 2.1.0 (or really any Airflow version)
   
   **What happened**:
   
   When I try to install the snowflake provider, the version of SQLAlchemy also gets upgraded. Due to the dependencies of the packages installed by the snowflake provider, more specifically the requirements of snowflake-sqlalchemy, SQLAlchemy is forced to be upgraded.
   
   This upgrade caused some issues with the webserver startup, which generated this unhelpful error log.
   
   ```
   [2021-08-04 19:37:45,566] {abstract.py:229} ERROR - Failed to add operation for GET /api/v1/connections
   Traceback (most recent call last):
     File "/usr/local/lib/python3.7/site-packages/airflow/_vendor/connexion/apis/abstract.py", line 209, in add_paths
       self.add_operation(path, method)
     File "/usr/local/lib/python3.7/site-packages/airflow/_vendor/connexion/apis/abstract.py", line 173, in add_operation
       pass_context_arg_name=self.pass_context_arg_name
     File "/usr/local/lib/python3.7/site-packages/airflow/_vendor/connexion/operations/__init__.py", line 8, in make_operation
       return spec.operation_cls.from_spec(spec, *args, **kwargs)
     File "/usr/local/lib/python3.7/site-packages/airflow/_vendor/connexion/operations/openapi.py", line 138, in from_spec
       **kwargs
     File "/usr/local/lib/python3.7/site-packages/airflow/_vendor/connexion/operations/openapi.py", line 89, in __init__
       pass_context_arg_name=pass_context_arg_name
     File "/usr/local/lib/python3.7/site-packages/airflow/_vendor/connexion/operations/abstract.py", line 96, in __init__
       self._resolution = resolver.resolve(self)
     File "/usr/local/lib/python3.7/site-packages/airflow/_vendor/connexion/resolver.py", line 40, in resolve
       return Resolution(self.resolve_function_from_operation_id(operation_id), operation_id)
     File "/usr/local/lib/python3.7/site-packages/airflow/_vendor/connexion/resolver.py", line 66, in resolve_function_from_operation_id
       raise ResolverError(str(e), sys.exc_info())
   airflow._vendor.connexion.exceptions.ResolverError: 
   
   During handling of the above exception, another exception occurred:
   
   Traceback (most recent call last):
     File "/usr/local/bin/airflow", line 8, in <module>
       sys.exit(main())
     File "/usr/local/lib/python3.7/site-packages/airflow/__main__.py", line 40, in main
       args.func(args)
     File "/usr/local/lib/python3.7/site-packages/airflow/cli/cli_parser.py", line 48, in command
       return func(*args, **kwargs)
     File "/usr/local/lib/python3.7/site-packages/airflow/utils/cli.py", line 91, in wrapper
       return f(*args, **kwargs)
     File "/usr/local/lib/python3.7/site-packages/airflow/cli/commands/sync_perm_command.py", line 26, in sync_perm
       appbuilder = cached_app().appbuilder  # pylint: disable=no-member
     File "/usr/local/lib/python3.7/site-packages/airflow/www/app.py", line 146, in cached_app
       app = create_app(config=config, testing=testing)
     File "/usr/local/lib/python3.7/site-packages/airflow/www/app.py", line 130, in create_app
       init_api_connexion(flask_app)
     File "/usr/local/lib/python3.7/site-packages/airflow/www/extensions/init_views.py", line 186, in init_api_connexion
       specification='v1.yaml', base_path=base_path, validate_responses=True, strict_validation=True
     File "/usr/local/lib/python3.7/site-packages/airflow/_vendor/connexion/apps/flask_app.py", line 57, in add_api
       api = super(FlaskApp, self).add_api(specification, **kwargs)
     File "/usr/local/lib/python3.7/site-packages/airflow/_vendor/connexion/apps/abstract.py", line 156, in add_api
       options=api_options.as_dict())
     File "/usr/local/lib/python3.7/site-packages/airflow/_vendor/connexion/apis/abstract.py", line 111, in __init__
       self.add_paths()
     File "/usr/local/lib/python3.7/site-packages/airflow/_vendor/connexion/apis/abstract.py", line 216, in add_paths
       self._handle_add_operation_error(path, method, err.exc_info)
     File "/usr/local/lib/python3.7/site-packages/airflow/_vendor/connexion/apis/abstract.py", line 231, in _handle_add_operation_error
       raise value.with_traceback(traceback)
     File "/usr/local/lib/python3.7/site-packages/airflow/_vendor/connexion/resolver.py", line 61, in resolve_function_from_operation_id
       return self.function_resolver(operation_id)
     File "/usr/local/lib/python3.7/site-packages/airflow/_vendor/connexion/utils.py", line 111, in get_function_from_name
       module = importlib.import_module(module_name)
     File "/usr/local/lib/python3.7/importlib/__init__.py", line 127, in import_module
       return _bootstrap._gcd_import(name[level:], package, level)
     File "<frozen importlib._bootstrap>", line 1006, in _gcd_import
     File "<frozen importlib._bootstrap>", line 983, in _find_and_load
     File "<frozen importlib._bootstrap>", line 967, in _find_and_load_unlocked
     File "<frozen importlib._bootstrap>", line 677, in 

[GitHub] [airflow] jedcunningham opened a new pull request #17452: Pin snowflake-sqlalchemy

2021-08-05 Thread GitBox


jedcunningham opened a new pull request #17452:
URL: https://github.com/apache/airflow/pull/17452


   snowflake-sqlalchemy 1.3 now depends on sqlalchemy 1.4+, so pin to a max of 
1.2.x.
   
   The webserver cannot start when we have sqlalchemy 1.4, throwing the 
following exceptions:
   
   ```airflow._vendor.connexion.exceptions.ResolverError: ```
   
   and
   
   ```AttributeError: columns```
   
   A workaround for any users trying to install `apache-airflow-providers-snowflake`: simply also pin `snowflake-sqlalchemy==1.2.5` (as of writing) to avoid pulling in the latest version of `snowflake-sqlalchemy` and, as a result, the latest `sqlalchemy`.
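
   For example, until this pin lands, an install along these lines should avoid the issue (hypothetical command line):
   ```
   pip install apache-airflow-providers-snowflake snowflake-sqlalchemy==1.2.5
   ```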


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] Wittline commented on issue #8177: provide_context=True not working with PythonVirtualenvOperator

2021-08-05 Thread GitBox


Wittline commented on issue #8177:
URL: https://github.com/apache/airflow/issues/8177#issuecomment-893833279


   Hi @jatejeda, please check my GitHub; I used the **PythonVirtualenvOperator** in a personal project using version 2.
   
   https://github.com/Wittline/uber-expenses-tracking


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] jatejeda commented on issue #8177: provide_context=True not working with PythonVirtualenvOperator

2021-08-05 Thread GitBox


jatejeda commented on issue #8177:
URL: https://github.com/apache/airflow/issues/8177#issuecomment-893831486


   no, it wasn't fixed ... I have the same bug in 2.0.2
   
   > Hi,
   > 
   > Was this bug fixed in Airflow version 2?
   
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] fatmumuhomer commented on a change in pull request #17236: [Airflow 16364] Add conn_timeout and cmd_timeout params to SSHOperator; add conn_timeout param to SSHHook

2021-08-05 Thread GitBox


fatmumuhomer commented on a change in pull request #17236:
URL: https://github.com/apache/airflow/pull/17236#discussion_r683804274



##
File path: docs/apache-airflow-providers-ssh/connections/ssh.rst
##
@@ -47,7 +47,8 @@ Extra (optional)
     * ``key_file`` - Full Path of the private SSH Key file that will be used to connect to the remote_host.
     * ``private_key`` - Content of the private key used to connect to the remote_host.
     * ``private_key_passphrase`` - Content of the private key passphrase used to decrypt the private key.
-    * ``timeout`` - An optional timeout (in seconds) for the TCP connect. Default is ``10``.
+    * ``conn_timeout`` - An optional timeout (in seconds) for the TCP connect. Default is ``10``. The ``conn_timeout`` parameter to :class:`~airflow.providers.ssh.hooks.ssh.SSHHook` takes precedence.

Review comment:
   Sorry - what do you mean by "on the other side"?
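
   For context, a hedged sketch of the precedence the added doc line describes (connection id invented):
   ```python
   from airflow.providers.ssh.hooks.ssh import SSHHook

   # Per this PR, conn_timeout passed to the hook takes precedence over the
   # conn_timeout stored in the connection's Extra field.
   hook = SSHHook(ssh_conn_id="my_ssh_conn", conn_timeout=30)
   ```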




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[airflow] 17/17: Switches to "/" convention in ghcr.io images (#17356)

2021-08-05 Thread potiuk
This is an automated email from the ASF dual-hosted git repository.

potiuk pushed a commit to branch v2-1-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 8fc76971c80d1b52a32bc244548c058aaffe27dd
Author: Jarek Potiuk 
AuthorDate: Thu Aug 5 18:39:43 2021 +0200

Switches to "/" convention in ghcr.io images (#17356)

We are using ghcr.io as image cache for our CI builds and Breeze
and it seems ghcr.io is being "rebuilt" while running.

We had been using "airflow-.." image convention before,
bacause multiple nesting levels of images were not supported,
however we experienced errors recently with pushing 2.1 images
(https://issues.apache.org/jira/browse/INFRA-22124) and during
investigation it turned out, that it is possible now to use "/"
in the name of the image, and while it still does not introduce
multiple nesting levels and folder structure, the UI of GitHub
treats it like that and if you have an image which starts with
"airflow/", the airflow prefix is stripped out and you can also
have even more "/" in the name to introduce further hierarchy.

Since we have to change image naming convention due to (still
unresolved) bug with no permission to push the v2-1-test image
we've decided to change naming convention for all our cache
images to follow this - now available - "/" convention to make
it better structured and easier to manage/understand.

Some more optimisations are implemented - Python, prod-build and
ci-manifest images are only pushed when "latest" image is prepared.
They are not needed for the COMMIT builds because we only need
final images for those builds. This simplified the code quite
a bit.

The push of cache image in CI is done in one job for both
CI and PROD images and the image is rebuilt again with
latest constraints, to account for the latest constraints
but to make sure that UPGRADE_TO_NEWER_DEPENDENCIES
is not set during the build (which invalidates the cache
for next non-upgrade builds)

Backwards-compatibility was implemented to allow PRs that have
not been upgraded to continue building after this one is merged,
also a workaround has been implemented to make this change
to work even if it is not merged yet to main.

This "legacy" mode will be removed in ~week when everybody rebase
on top of main.

Documentation is updated reflecting those changes.

(cherry picked from commit 1bd3a5c68c88cf3840073d6276460a108f864187)
---
 .github/workflows/build-images.yml |  18 +++
 .github/workflows/ci.yml   | 159 +++--
 CI.rst |  51 ---
 IMAGES.rst |  24 ++--
 README.md  | 127 +---
 breeze |  17 +--
 dev/retag_docker_images.py |   9 +-
 scripts/ci/images/ci_prepare_ci_image_on_ci.sh |  19 +--
 scripts/ci/images/ci_prepare_prod_image_on_ci.sh   |  29 +---
 ...ify_ci_image.sh => ci_push_legacy_ci_images.sh} |  35 +
 ...y_ci_image.sh => ci_push_legacy_prod_images.sh} |  35 +
 .../images/ci_wait_for_and_verify_all_ci_images.sh |   2 +
 .../ci_wait_for_and_verify_all_prod_images.sh  |   2 +
 .../ci/images/ci_wait_for_and_verify_ci_image.sh   |  27 ++--
 .../ci/images/ci_wait_for_and_verify_prod_image.sh |  32 +++--
 scripts/ci/libraries/_build_images.sh  | 109 --
 scripts/ci/libraries/_initialization.sh|  16 +--
 scripts/ci/libraries/_kind.sh  |  16 +--
 scripts/ci/libraries/_parallel.sh  |   7 +-
 scripts/ci/libraries/_push_pull_remove_images.sh   | 117 +--
 scripts/ci/libraries/_script_init.sh   |   2 +-
 scripts/ci/selective_ci_checks.sh  |  10 +-
 22 files changed, 423 insertions(+), 440 deletions(-)

diff --git a/.github/workflows/build-images.yml 
b/.github/workflows/build-images.yml
index ec8f435..c2a9054 100644
--- a/.github/workflows/build-images.yml
+++ b/.github/workflows/build-images.yml
@@ -203,6 +203,10 @@ jobs:
 run: ./scripts/ci/images/ci_prepare_ci_image_on_ci.sh
   - name: "Push CI images ${{ matrix.python-version }}:${{ 
env.TARGET_COMMIT_SHA }}"
 run: ./scripts/ci/images/ci_push_ci_images.sh
+  # Remove me on 15th of August 2021 after all users had chance to rebase
+  - name: "Push Legacy CI images ${{ matrix.python-version }}:${{ 
env.TARGET_COMMIT_SHA }}"
+run: ./scripts/ci/images/ci_push_legacy_ci_images.sh
+if: github.event_name == 'pull_request_target'
 
   build-prod-images:
 permissions:
@@ -229,8 +233,11 @@ jobs:
   VERSION_SUFFIX_FOR_PYPI: ".dev0"
 steps:
   - name: Set envs
+# Set pull image tag for CI image build, 

[airflow] 11/17: Increases timeout for helm chart builds (#17417)

2021-08-05 Thread potiuk
This is an automated email from the ASF dual-hosted git repository.

potiuk pushed a commit to branch v2-1-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 38c7115599884e1b927fc3c06d8035fb59aff3f2
Author: Jarek Potiuk 
AuthorDate: Wed Aug 4 22:02:46 2021 +0200

Increases timeout for helm chart builds (#17417)

(cherry picked from commit 4348239686b3e2d3df17e5e8ed6462dfc6b98164)
---
 .github/workflows/ci.yml | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/.github/workflows/ci.yml b/.github/workflows/ci.yml
index 8c31784..22634cf 100644
--- a/.github/workflows/ci.yml
+++ b/.github/workflows/ci.yml
@@ -561,7 +561,7 @@ ${{ hashFiles('.pre-commit-config.yaml') }}"
   PACKAGE_FORMAT: "sdist"
 
   tests-helm:
-timeout-minutes: 20
+timeout-minutes: 40
 name: "Python unit tests for helm chart"
 runs-on: ${{ fromJson(needs.build-info.outputs.runsOn) }}
 needs: [build-info, ci-images]


[airflow] 05/17: bump dnspython (#16698)

2021-08-05 Thread potiuk
This is an automated email from the ASF dual-hosted git repository.

potiuk pushed a commit to branch v2-1-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 867adda5e28b5d3c366e3cd3e1f8f60f5f412c4b
Author: kurtqq <47721902+kur...@users.noreply.github.com>
AuthorDate: Tue Jun 29 00:21:24 2021 +0300

bump dnspython (#16698)

(cherry picked from commit 57dcac22137bc958c1ed9f12fa54484e13411a6f)
---
 setup.py | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/setup.py b/setup.py
index e6c1d46..9e8c28d 100644
--- a/setup.py
+++ b/setup.py
@@ -370,7 +370,7 @@ ldap = [
 ]
 leveldb = ['plyvel']
 mongo = [
-'dnspython>=1.13.0,<2.0.0',
+'dnspython>=1.13.0,<3.0.0',
 'pymongo>=3.6.0',
 ]
 mssql = [


[airflow] 07/17: Update alias for field_mask in Google Memmcache (#16975)

2021-08-05 Thread potiuk
This is an automated email from the ASF dual-hosted git repository.

potiuk pushed a commit to branch v2-1-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 37c935d601ff6c3f501fcfab3bf1b57665fddbc1
Author: Jarek Potiuk 
AuthorDate: Tue Jul 13 20:54:38 2021 +0200

Update alias for field_mask in Google Memmcache (#16975)

The July 12 2021 release of the google-memcache library removed
the field_mask alias, which broke our type checking
and made the Google provider unimportable. This PR fixes the code
to use the actual import.

(cherry picked from commit a3f5c93806258b5ad396a638ba0169eca7f9d065)
---
 .../providers/google/cloud/hooks/cloud_memorystore.py| 16 
 setup.py |  4 +++-
 2 files changed, 11 insertions(+), 9 deletions(-)

diff --git a/airflow/providers/google/cloud/hooks/cloud_memorystore.py 
b/airflow/providers/google/cloud/hooks/cloud_memorystore.py
index caf1cd6..8f4165b 100644
--- a/airflow/providers/google/cloud/hooks/cloud_memorystore.py
+++ b/airflow/providers/google/cloud/hooks/cloud_memorystore.py
@@ -487,8 +487,8 @@ class CloudMemorystoreHook(GoogleBaseHook):
 -  ``redisConfig``
 
 If a dict is provided, it must be of the same form as the protobuf 
message
-:class:`~google.cloud.redis_v1.types.FieldMask`
-:type update_mask: Union[Dict, google.cloud.redis_v1.types.FieldMask]
+:class:`~google.protobuf.field_mask_pb2.FieldMask`
+:type update_mask: Union[Dict, 
google.protobuf.field_mask_pb2.FieldMask]
 :param instance: Required. Update description. Only fields specified 
in ``update_mask`` are updated.
 
 If a dict is provided, it must be of the same form as the protobuf 
message
@@ -871,7 +871,7 @@ class CloudMemorystoreMemcachedHook(GoogleBaseHook):
 @GoogleBaseHook.fallback_to_default_project_id
 def update_instance(
 self,
-update_mask: Union[Dict, cloud_memcache.field_mask.FieldMask],
+update_mask: Union[Dict, FieldMask],
 instance: Union[Dict, cloud_memcache.Instance],
 project_id: str,
 location: Optional[str] = None,
@@ -889,9 +889,9 @@ class CloudMemorystoreMemcachedHook(GoogleBaseHook):
 -  ``displayName``
 
 If a dict is provided, it must be of the same form as the protobuf 
message
-
:class:`~google.cloud.memcache_v1beta2.types.cloud_memcache.field_mask.FieldMask`
+:class:`~google.protobuf.field_mask_pb2.FieldMask`)
 :type update_mask:
-Union[Dict, 
google.cloud.memcache_v1beta2.types.cloud_memcache.field_mask.FieldMask]
+Union[Dict, google.protobuf.field_mask_pb2.FieldMask]
 :param instance: Required. Update description. Only fields specified 
in ``update_mask`` are updated.
 
 If a dict is provided, it must be of the same form as the protobuf 
message
@@ -935,7 +935,7 @@ class CloudMemorystoreMemcachedHook(GoogleBaseHook):
 @GoogleBaseHook.fallback_to_default_project_id
 def update_parameters(
 self,
-update_mask: Union[Dict, cloud_memcache.field_mask.FieldMask],
+update_mask: Union[Dict, FieldMask],
 parameters: Union[Dict, cloud_memcache.MemcacheParameters],
 project_id: str,
 location: str,
@@ -951,9 +951,9 @@ class CloudMemorystoreMemcachedHook(GoogleBaseHook):
 
 :param update_mask: Required. Mask of fields to update.
 If a dict is provided, it must be of the same form as the protobuf 
message
-
:class:`~google.cloud.memcache_v1beta2.types.cloud_memcache.field_mask.FieldMask`
+:class:`~google.protobuf.field_mask_pb2.FieldMask`
 :type update_mask:
-Union[Dict, 
google.cloud.memcache_v1beta2.types.cloud_memcache.field_mask.FieldMask]
+Union[Dict, google.protobuf.field_mask_pb2.FieldMask]
 :param parameters: The parameters to apply to the instance.
 If a dict is provided, it must be of the same form as the protobuf 
message
 
:class:`~google.cloud.memcache_v1beta2.types.cloud_memcache.MemcacheParameters`
diff --git a/setup.py b/setup.py
index 9dde824..46ff73d 100644
--- a/setup.py
+++ b/setup.py
@@ -292,7 +292,9 @@ google = [
 'google-cloud-kms>=2.0.0,<3.0.0',
 'google-cloud-language>=1.1.1,<2.0.0',
 'google-cloud-logging>=2.1.1,<3.0.0',
-'google-cloud-memcache>=0.2.0',
+# 1.1.0 removed field_mask and broke import for released providers
+# We can remove the <1.1.0 limitation after we release new Google Provider
+'google-cloud-memcache>=0.2.0,<1.1.0',
 'google-cloud-monitoring>=2.0.0,<3.0.0',
 'google-cloud-os-login>=2.0.0,<3.0.0',
 'google-cloud-pubsub>=2.0.0,<3.0.0',


[airflow] 15/17: Improve diagnostics message when users have secret_key misconfigured (#17410)

2021-08-05 Thread potiuk
This is an automated email from the ASF dual-hosted git repository.

potiuk pushed a commit to branch v2-1-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 2df7e6e41cf6968b48658c21e989868cb3960027
Author: Jarek Potiuk 
AuthorDate: Wed Aug 4 15:15:38 2021 +0200

Improve diagnostics message when users have secret_key misconfigured 
(#17410)

* Improve diagnostics message when users have secret_key misconfigured

The recently fixed log open-access vulnerability has caused
quite a lot of questions and issues from the affected users who
did not have webserver/secret_key configured for their workers
(effectively leading to a random value for those keys on workers).

This PR explicitly explains the possible reason for the problem and
encourages the user to configure their webserver's secret_key
in both workers and the webserver.

Related to: #17251 and a number of similar slack discussions.

(cherry picked from commit 2321020e29511f3741940440739e4cc01c0a7ba2)
---
 airflow/utils/log/file_task_handler.py | 5 +
 1 file changed, 5 insertions(+)

diff --git a/airflow/utils/log/file_task_handler.py 
b/airflow/utils/log/file_task_handler.py
index 2dc9beb..56b9d23 100644
--- a/airflow/utils/log/file_task_handler.py
+++ b/airflow/utils/log/file_task_handler.py
@@ -186,6 +186,11 @@ class FileTaskHandler(logging.Handler):
 )
 response.encoding = "utf-8"
 
+if response.status_code == 403:
+log += "***  Please make sure that all your webservers 
and workers have" \
+   " the same 'secret_key' configured in 'webserver' 
section !\n***"
+log += "*** See more at 
https://airflow.apache.org/docs/apache-airflow/" \
+   "stable/configurations-ref.html#secret-key\n***"
 # Check if the resource was properly fetched
 response.raise_for_status()
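
For illustration, a minimal sketch of the configuration the message above asks for, using the standard environment-variable override (the key value is a placeholder):

```python
import os

# Every scheduler, webserver and worker must see the same value,
# otherwise fetching task logs across components fails with 403.
os.environ["AIRFLOW__WEBSERVER__SECRET_KEY"] = "one-long-random-string-shared-everywhere"
```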
 


[airflow] 13/17: Optimizes structure of the Dockerfiles and use latest tools (#17418)

2021-08-05 Thread potiuk
This is an automated email from the ASF dual-hosted git repository.

potiuk pushed a commit to branch v2-1-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 106d9f09852090da21f3dfb744c69e4be5fd90e5
Author: Jarek Potiuk 
AuthorDate: Wed Aug 4 23:32:12 2021 +0200

Optimizes structure of the Dockerfiles and use latest tools (#17418)

* Remove CONTINUE_ON_PIP_CHECK_FAILURE parameter

This parameter was useful when upgrading new dependencies,
however it is going to be replaced with better approach in the
upcoming image convention change.

* Optimizes structure of the Dockerfiles and use latest tools

This PR optimizes the structure of the Dockerfile by moving some
expensive operations before the COPY of sources so that
rebuilding the image when only a few sources change is much faster.

At the same time, we upgrade PIP and the Helm chart to the latest
versions and clean up some parameter inconsistencies.

(cherry picked from commit 94b03f6f43277e7332c25fdc63aedfde605f9773)
---
 .github/workflows/build-images.yml |  1 -
 BREEZE.rst | 10 +
 Dockerfile | 27 +--
 Dockerfile.ci  | 52 +++---
 IMAGES.rst |  6 ---
 breeze |  9 
 breeze-complete|  2 +-
 docs/docker-stack/build-arg-ref.rst|  6 ---
 scripts/ci/libraries/_build_images.sh  | 20 -
 scripts/ci/libraries/_initialization.sh| 10 ++---
 scripts/docker/install_additional_dependencies.sh  |  5 +--
 scripts/docker/install_airflow.sh  |  4 +-
 ...nstall_airflow_dependencies_from_branch_tip.sh} | 11 ++---
 .../docker/install_from_docker_context_files.sh|  2 +-
 14 files changed, 55 insertions(+), 110 deletions(-)

diff --git a/.github/workflows/build-images.yml 
b/.github/workflows/build-images.yml
index f29e199..ec8f435 100644
--- a/.github/workflows/build-images.yml
+++ b/.github/workflows/build-images.yml
@@ -148,7 +148,6 @@ jobs:
   BACKEND: postgres
   PYTHON_MAJOR_MINOR_VERSION: ${{ matrix.python-version }}
   UPGRADE_TO_NEWER_DEPENDENCIES: ${{ 
needs.build-info.outputs.upgradeToNewerDependencies }}
-  CONTINUE_ON_PIP_CHECK_FAILURE: "true"
   DOCKER_CACHE: ${{ needs.build-info.outputs.cacheDirective }}
   CHECK_IF_BASE_PYTHON_IMAGE_UPDATED: >
 ${{ github.event_name == 'pull_request_target' && 'false' || 'true' }}
diff --git a/BREEZE.rst b/BREEZE.rst
index 90d3f0b..468709b 100644
--- a/BREEZE.rst
+++ b/BREEZE.rst
@@ -1279,9 +1279,6 @@ This is the current syntax for  `./breeze <./breeze>`_:
   --upgrade-to-newer-dependencies
   Upgrades PIP packages to latest versions available without looking 
at the constraints.
 
-  --continue-on-pip-check-failure
-  Continue even if 'pip check' fails.
-
   -I, --production-image
   Use production image for entering the environment and builds (not 
for tests).
 
@@ -2382,9 +2379,9 @@ This is the current syntax for  `./breeze <./breeze>`_:
   Helm version - only used in case one of kind-cluster commands is 
used.
   One of:
 
- v3.2.4
+ v3.6.3
 
-  Default: v3.2.4
+  Default: v3.6.3
 
   --executor EXECUTOR
   Executor to use in a kubernetes cluster.
@@ -2435,9 +2432,6 @@ This is the current syntax for  `./breeze <./breeze>`_:
   --upgrade-to-newer-dependencies
   Upgrades PIP packages to latest versions available without looking 
at the constraints.
 
-  --continue-on-pip-check-failure
-  Continue even if 'pip check' fails.
-
   

Use different Airflow version at runtime in CI image
 
diff --git a/Dockerfile b/Dockerfile
index 66c9649..210918c 100644
--- a/Dockerfile
+++ b/Dockerfile
@@ -44,7 +44,8 @@ ARG AIRFLOW_GID="5"
 
 ARG PYTHON_BASE_IMAGE="python:3.6-slim-buster"
 
-ARG AIRFLOW_PIP_VERSION=21.1.2
+ARG AIRFLOW_PIP_VERSION=21.2.2
+ARG AIRFLOW_IMAGE_REPOSITORY="https://github.com/apache/airflow"
 
 # By default PIP has progress bar but you can disable it.
 ARG PIP_PROGRESS_BAR="on"
@@ -108,12 +109,13 @@ ARG DEV_APT_COMMAND="\
 && curl https://dl.yarnpkg.com/debian/pubkey.gpg | apt-key add - > 
/dev/null \
 && echo 'deb https://dl.yarnpkg.com/debian/ stable main' > 
/etc/apt/sources.list.d/yarn.list"
 ARG ADDITIONAL_DEV_APT_COMMAND="echo"
+ARG ADDITIONAL_DEV_APT_ENV=""
 
 ENV DEV_APT_DEPS=${DEV_APT_DEPS} \
 ADDITIONAL_DEV_APT_DEPS=${ADDITIONAL_DEV_APT_DEPS} \
 DEV_APT_COMMAND=${DEV_APT_COMMAND} \
 ADDITIONAL_DEV_APT_COMMAND=${ADDITIONAL_DEV_APT_COMMAND} \
-ADDITIONAL_DEV_APT_ENV=""
+

[airflow] 12/17: Improve image building documentation for new users (#17409)

2021-08-05 Thread potiuk
This is an automated email from the ASF dual-hosted git repository.

potiuk pushed a commit to branch v2-1-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 8609c82c7cdd3047efee0c55bb214541e3ea293d
Author: Jarek Potiuk 
AuthorDate: Wed Aug 4 23:07:22 2021 +0200

Improve image building documentation for new users (#17409)

* Improve image building documentation for new users

This PR improves the documentation for building images of Airflow,
specifically targeting users who do not have much experience with
building the images. It shows examples of how custom image building
can be easily used to upgrade provider packages as well as how
image building can be easily integrated in quick-start using
docker-compose.

(cherry picked from commit 4ee4199f6f0107f39726fc3551d1bf20b8b1283c)
---
 docs/apache-airflow-providers/index.rst |  5 +
 docs/apache-airflow/start/docker-compose.yaml   |  6 +-
 docs/apache-airflow/start/docker.rst| 12 
 docs/docker-stack/build.rst | 13 +
 .../docker-examples/extending/add-apt-packages/Dockerfile   |  2 +-
 .../extending/add-build-essential-extend/Dockerfile |  2 +-
 .../extending/{embedding-dags => add-providers}/Dockerfile  |  7 +++
 .../docker-examples/extending/add-pypi-packages/Dockerfile  |  2 +-
 .../docker-examples/extending/embedding-dags/Dockerfile |  2 +-
 .../docker-examples/extending/writable-directory/Dockerfile |  2 +-
 10 files changed, 43 insertions(+), 10 deletions(-)

diff --git a/docs/apache-airflow-providers/index.rst 
b/docs/apache-airflow-providers/index.rst
index 7329b7f..71c5132 100644
--- a/docs/apache-airflow-providers/index.rst
+++ b/docs/apache-airflow-providers/index.rst
@@ -21,6 +21,8 @@ Provider packages
 
 .. contents:: :local:
 
+.. _providers:community-maintained-providers:
+
 Community maintained providers
 ''
 
@@ -31,6 +33,9 @@ Those provider packages are separated per-provider (for 
example ``amazon``, ``google``,
 etc.). Those packages are available as ``apache-airflow-providers`` packages - 
separately per each provider
 (for example there is an ``apache-airflow-providers-amazon`` or 
``apache-airflow-providers-google`` package).
 
+The full list of community managed providers is available at
+`Providers Index 
`_.
+
 You can install those provider packages separately in order to interface with 
a given service. For those
 providers that have corresponding extras, the provider packages (latest 
version from PyPI) are installed
 automatically when Airflow is installed with the extra.
diff --git a/docs/apache-airflow/start/docker-compose.yaml 
b/docs/apache-airflow/start/docker-compose.yaml
index 5a301cf..06991e7 100644
--- a/docs/apache-airflow/start/docker-compose.yaml
+++ b/docs/apache-airflow/start/docker-compose.yaml
@@ -44,7 +44,11 @@
 version: '3'
 x-airflow-common:
   
+  # In order to add custom dependencies or upgrade provider packages you can 
use your extended image.
+  # Comment the image line, place your Dockerfile in the directory where you 
placed the docker-compose.yaml
+  # and uncomment the "build" line below, Then run `docker-compose build` to 
build the images.
   image: ${AIRFLOW_IMAGE_NAME:-apache/airflow:|version|}
+  # build: .
   environment:
 
 AIRFLOW__CORE__EXECUTOR: CeleryExecutor
@@ -60,7 +64,7 @@ x-airflow-common:
 - ./dags:/opt/airflow/dags
 - ./logs:/opt/airflow/logs
 - ./plugins:/opt/airflow/plugins
-  user: "${AIRFLOW_UID:-5}:${AIRFLOW_GID:-5}"
+  user: "${AIRFLOW_UID:-5}:${AIRFLOW_GID:-0}"
   depends_on:
 redis:
   condition: service_healthy
diff --git a/docs/apache-airflow/start/docker.rst 
b/docs/apache-airflow/start/docker.rst
index 77c9333..747f87c 100644
--- a/docs/apache-airflow/start/docker.rst
+++ b/docs/apache-airflow/start/docker.rst
@@ -68,6 +68,18 @@ If you need install a new Python library or system library, 
you can :doc:`build
 .. _initializing_docker_compose_environment:
 
 
+Using custom images
+===
+
+When you want to run Airflow locally, you might want to use an extended image, 
containing some additional dependencies - for
+example you might add new python packages, or upgrade airflow providers to a 
later version. This can be done very easily
+by placing a custom Dockerfile alongside your `docker-compose.yaml`. Then you 
can use the `docker-compose build` command to build your image (you need to
+do it only once). You can also add the `--build` flag to your `docker-compose` 
commands to rebuild the images
+on-the-fly when you run other `docker-compose` commands.
+
+Examples of how you can extend the image with custom providers, python 
packages,
+apt packages and more can be found in :doc:`Building the image 
`.
+
 

[airflow] 04/17: Add type annotations to setup.py (#16658)

2021-08-05 Thread potiuk
This is an automated email from the ASF dual-hosted git repository.

potiuk pushed a commit to branch v2-1-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit dfeb73da4f1d3f7a18ad6ab0e320b780588abe13
Author: Ali Muhammad 
AuthorDate: Fri Jun 25 22:25:52 2021 +0500

Add type annotations to setup.py (#16658)

(cherry picked from commit 402274641168f412f44c545c34f3e7edf5cf1476)
---
 setup.py | 40 
 1 file changed, 24 insertions(+), 16 deletions(-)

diff --git a/setup.py b/setup.py
index 21097e6..e6c1d46 100644
--- a/setup.py
+++ b/setup.py
@@ -62,14 +62,14 @@ class CleanCommand(Command):
 description = "Tidy up the project root"
 user_options: List[str] = []
 
-def initialize_options(self):
+def initialize_options(self) -> None:
 """Set default values for options."""
 
-def finalize_options(self):
+def finalize_options(self) -> None:
 """Set final values for options."""
 
 @staticmethod
-def rm_all_files(files: List[str]):
+def rm_all_files(files: List[str]) -> None:
 """Remove all files from the list"""
 for file in files:
 try:
@@ -77,7 +77,7 @@ class CleanCommand(Command):
 except Exception as e:
 logger.warning("Error when removing %s: %s", file, e)
 
-def run(self):
+def run(self) -> None:
 """Remove temporary files and directories."""
 os.chdir(my_dir)
 self.rm_all_files(glob.glob('./build/*'))
@@ -98,10 +98,10 @@ class CompileAssets(Command):
 description = "Compile and build the frontend assets"
 user_options: List[str] = []
 
-def initialize_options(self):
+def initialize_options(self) -> None:
 """Set default values for options."""
 
-def finalize_options(self):
+def finalize_options(self) -> None:
 """Set final values for options."""
 
 def run(self) -> None:
@@ -118,10 +118,10 @@ class ListExtras(Command):
 description = "List available extras"
 user_options: List[str] = []
 
-def initialize_options(self):
+def initialize_options(self) -> None:
 """Set default values for options."""
 
-def finalize_options(self):
+def finalize_options(self) -> None:
 """Set final values for options."""
 
 def run(self) -> None:
@@ -165,7 +165,7 @@ def git_version(version_: str) -> str:
 return 'no_git_version'
 
 
-def write_version(filename: str = os.path.join(*[my_dir, "airflow", 
"git_version"])):
+def write_version(filename: str = os.path.join(*[my_dir, "airflow", 
"git_version"])) -> None:
 """
 Write the Semver version + git hash to file, e.g. 
".dev0+2f635dc265e78db6708f59f68e8009abb92c1e65".
 
@@ -766,7 +766,7 @@ PACKAGES_EXCLUDED_FOR_ALL.extend(
 )
 
 
-def is_package_excluded(package: str, exclusion_list: List[str]):
+def is_package_excluded(package: str, exclusion_list: List[str]) -> bool:
 """
 Checks if package should be excluded.
 
@@ -820,7 +820,7 @@ PREINSTALLED_PROVIDERS = [
 ]
 
 
-def get_provider_package_from_package_id(package_id: str):
+def get_provider_package_from_package_id(package_id: str) -> str:
 """
 Builds the name of provider package out of the package id provided/
 
@@ -831,16 +831,18 @@ def get_provider_package_from_package_id(package_id: str):
 return f"apache-airflow-providers-{package_suffix}"
 
 
-def get_excluded_providers():
+def get_excluded_providers() -> List[str]:
 """
 Returns packages excluded for the current python version.
+
 Currently the only excluded provider is apache hive for Python 3.9.
 Until https://github.com/dropbox/PyHive/issues/380 is fixed.
+
 """
 return ['apache.hive'] if PY39 else []
 
 
-def get_all_provider_packages():
+def get_all_provider_packages() -> str:
 """Returns all provider packages configured in setup.py"""
 excluded_providers = get_excluded_providers()
 return " ".join(
@@ -851,7 +853,13 @@ def get_all_provider_packages():
 
 
 class AirflowDistribution(Distribution):
-"""The setuptools.Distribution subclass with Airflow specific behaviour"""
+"""
+The setuptools.Distribution subclass with Airflow specific behaviour
+
+The reason for pylint: disable=signature-differs of parse_config_files is 
explained here:
+https://github.com/PyCQA/pylint/issues/3737
+
+"""
 
 def parse_config_files(self, *args, **kwargs) -> None:
 """
@@ -949,7 +957,7 @@ def add_all_provider_packages() -> None:
 class Develop(develop_orig):
 """Forces removal of providers in editable mode."""
 
-def run(self):
+def run(self) -> None:
 self.announce('Installing in editable mode. Uninstalling provider 
packages!', level=log.INFO)
 # We need to run "python3 -m pip" because it might be that older PIP 
binary is in the path
 # And it results with an error when running pip directly (cannot 
import pip module)
@@ -973,7 +981,7 @@ class 

[airflow] 16/17: Fix failing static checks in main (#17424)

2021-08-05 Thread potiuk
This is an automated email from the ASF dual-hosted git repository.

potiuk pushed a commit to branch v2-1-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit f1966908e0f9b7dee4415046c19a213781271fe4
Author: Jarek Potiuk 
AuthorDate: Wed Aug 4 23:33:29 2021 +0200

Fix failing static checks in main (#17424)

(cherry picked from commit 60848a6ea78e29f20004a5e9f177335e6b99c706)
---
 airflow/utils/log/file_task_handler.py | 12 
 1 file changed, 8 insertions(+), 4 deletions(-)

diff --git a/airflow/utils/log/file_task_handler.py 
b/airflow/utils/log/file_task_handler.py
index 56b9d23..2387633 100644
--- a/airflow/utils/log/file_task_handler.py
+++ b/airflow/utils/log/file_task_handler.py
@@ -187,10 +187,14 @@ class FileTaskHandler(logging.Handler):
 response.encoding = "utf-8"
 
 if response.status_code == 403:
-log += "***  Please make sure that all your webservers 
and workers have" \
-   " the same 'secret_key' configured in 'webserver' 
section !\n***"
-log += "*** See more at 
https://airflow.apache.org/docs/apache-airflow/" \
-   "stable/configurations-ref.html#secret-key\n***"
+log += (
+"***  Please make sure that all your Airflow 
components (e.g. schedulers, webservers and workers) have"
+" the same 'secret_key' configured in 'webserver' 
section !\n***"
+)
+log += (
+"*** See more at 
https://airflow.apache.org/docs/apache-airflow/"
+"stable/configurations-ref.html#secret-key\n***"
+)
 # Check if the resource was properly fetched
 response.raise_for_status()
 


[airflow] 14/17: Add timeout when asking whether to rebuild image (#17412)

2021-08-05 Thread potiuk
This is an automated email from the ASF dual-hosted git repository.

potiuk pushed a commit to branch v2-1-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 7371f986b0eedec677c76dc7b6db28a996b9c525
Author: Jarek Potiuk 
AuthorDate: Wed Aug 4 15:44:06 2021 +0200

Add timeout when asking whether to rebuild image (#17412)

This PR adds a timeout to the question of whether to rebuild the
image when `breeze` is invoked or when pre-commit is run.

This reflects the typical use cases where rebuild is mostly not
needed, only in case of some tests which require new dependencies
to be included.

The user still has 4 seconds to answer Y and have the images rebuilt,
and just the presence of the question will be enough to get the
user to trigger it from time to time.

(cherry picked from commit 2938acd817561c79674ca333b83ee1972248df98)
---
 confirm | 6 +++---
 1 file changed, 3 insertions(+), 3 deletions(-)

diff --git a/confirm b/confirm
index e796737..42316da 100755
--- a/confirm
+++ b/confirm
@@ -15,7 +15,7 @@
 # KIND, either express or implied.  See the License for the
 # specific language governing permissions and limitations
 # under the License.
-set -euo pipefail
+set -uo pipefail
 
 if [[ -n "${FORCE_ANSWER_TO_QUESTIONS=}" ]]; then
 RESPONSE=${FORCE_ANSWER_TO_QUESTIONS}
@@ -31,8 +31,8 @@ if [[ -n "${FORCE_ANSWER_TO_QUESTIONS=}" ]]; then
 esac
 else
 echo
-echo "Please confirm ${1}. Are you sure? [y/N/q]"
-read -r RESPONSE
+echo "Please confirm ${1} (or wait 4 seconds to skip it). Are you sure? 
[y/N/q]"
+read -t 4 -r RESPONSE
 fi
 
 case "${RESPONSE}" in


[airflow] 09/17: Enhancement to bash scripts (#17098)

2021-08-05 Thread potiuk
This is an automated email from the ASF dual-hosted git repository.

potiuk pushed a commit to branch v2-1-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 7a43960a5c3c5822c56149ece261ab99c6a92282
Author: Shraman Basyal <49772843+bshra...@users.noreply.github.com>
AuthorDate: Mon Aug 2 14:49:11 2021 -0500

Enhancement to bash scripts (#17098)

(cherry picked from commit e544ffc2241a067b7433a954f51d9fa91f345c29)
---
 scripts/in_container/prod/airflow_scheduler_autorestart.sh | 8 ++--
 1 file changed, 6 insertions(+), 2 deletions(-)

diff --git a/scripts/in_container/prod/airflow_scheduler_autorestart.sh 
b/scripts/in_container/prod/airflow_scheduler_autorestart.sh
index 09e0344..0fdc644 100755
--- a/scripts/in_container/prod/airflow_scheduler_autorestart.sh
+++ b/scripts/in_container/prod/airflow_scheduler_autorestart.sh
@@ -18,7 +18,11 @@
 
 while echo "Running"; do
 airflow scheduler -n 5
-echo "Scheduler crashed with exit code $?.  Respawning.." >&2
-date >> /tmp/airflow_scheduler_errors.txt
+return_code=$?
+if (( return_code != 0 )); then
+echo "Scheduler crashed with exit code $return_code. Respawning.." >&2
+date >> /tmp/airflow_scheduler_errors.txt
+fi
+
 sleep 1
 done


[airflow] 10/17: Do not pull CI image for ownership fixing on first, fresh breeze run (#17419)

2021-08-05 Thread potiuk
This is an automated email from the ASF dual-hosted git repository.

potiuk pushed a commit to branch v2-1-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 70794b00175e01c18d72fca3ce8eb58ef306e76b
Author: Jarek Potiuk 
AuthorDate: Wed Aug 4 22:00:33 2021 +0200

Do not pull CI image for ownership fixing on first, fresh breeze run 
(#17419)

When you run Breeze on a fresh machine, this script pulled the CI
image before any operation. It is not harmful in most cases but
it unnecessarily delays the first real image check and rebuild,
where fixing ownership is not really needed (as we've never run
Breeze before).

(cherry picked from commit 537c25417814d86bcee195ee03027840ce5837b6)
---
 scripts/ci/tools/fix_ownership.sh | 14 +-
 1 file changed, 9 insertions(+), 5 deletions(-)

diff --git a/scripts/ci/tools/fix_ownership.sh 
b/scripts/ci/tools/fix_ownership.sh
index 6ed1161..de15621 100755
--- a/scripts/ci/tools/fix_ownership.sh
+++ b/scripts/ci/tools/fix_ownership.sh
@@ -33,8 +33,12 @@ sanity_checks::sanitize_mounted_files
 
 read -r -a EXTRA_DOCKER_FLAGS 
<<<"$(local_mounts::convert_local_mounts_to_docker_params)"
 
-docker_v run --entrypoint /bin/bash "${EXTRA_DOCKER_FLAGS[@]}" \
---rm \
---env-file "${AIRFLOW_SOURCES}/scripts/ci/docker-compose/_docker.env" \
-"${AIRFLOW_CI_IMAGE}" \
--c /opt/airflow/scripts/in_container/run_fix_ownership.sh || true
+if docker image inspect "${AIRFLOW_CI_IMAGE}" >/dev/null 2>&1; then
+docker_v run --entrypoint /bin/bash "${EXTRA_DOCKER_FLAGS[@]}" \
+--rm \
+--env-file "${AIRFLOW_SOURCES}/scripts/ci/docker-compose/_docker.env" \
+"${AIRFLOW_CI_IMAGE}" \
+-c /opt/airflow/scripts/in_container/run_fix_ownership.sh || true
+else
+echo "Skip fixing ownership as seems that you do not have the 
${AIRFLOW_CI_IMAGE} image yet"
+fi


[airflow] 06/17: AIRFLOW-5529 Add Apache Drill provider. (#16884)

2021-08-05 Thread potiuk
This is an automated email from the ASF dual-hosted git repository.

potiuk pushed a commit to branch v2-1-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 42334b0f9145e70eb23938cad78589f6a97731dc
Author: dzamo <9107319+dz...@users.noreply.github.com>
AuthorDate: Mon Jul 12 19:59:35 2021 +0200

AIRFLOW-5529 Add Apache Drill provider. (#16884)

(cherry picked from commit 8808b641942e1b81c21db054fd6d36e2031cfab8)
---
 CONTRIBUTING.rst   |  22 ++---
 INSTALL|  24 +++---
 airflow/providers/apache/drill/CHANGELOG.rst   |  25 ++
 airflow/providers/apache/drill/__init__.py |  17 
 .../apache/drill/example_dags/example_drill_dag.py |  46 +++
 airflow/providers/apache/drill/hooks/__init__.py   |  17 
 airflow/providers/apache/drill/hooks/drill.py  |  89 +
 .../providers/apache/drill/operators/__init__.py   |  17 
 airflow/providers/apache/drill/operators/drill.py  |  71 
 airflow/providers/apache/drill/provider.yaml   |  49 
 airflow/ui/src/views/Docs.tsx  |   1 +
 airflow/utils/db.py|  10 +++
 .../commits.rst|  23 ++
 .../connections/drill.rst  |  44 ++
 .../index.rst  |  50 
 .../operators.rst  |  51 
 docs/apache-airflow/extra-packages-ref.rst |   2 +
 docs/conf.py   |   1 +
 docs/integration-logos/apache/drill.png| Bin 0 -> 40173 bytes
 docs/spelling_wordlist.txt |   1 +
 setup.py   |   3 +
 tests/providers/apache/drill/__init__.py   |  17 
 tests/providers/apache/drill/hooks/__init__.py |  17 
 tests/providers/apache/drill/hooks/test_drill.py   |  84 +++
 tests/providers/apache/drill/operators/__init__.py |  17 
 .../providers/apache/drill/operators/test_drill.py |  63 +++
 26 files changed, 738 insertions(+), 23 deletions(-)

diff --git a/CONTRIBUTING.rst b/CONTRIBUTING.rst
index 98cdf93..be807f4 100644
--- a/CONTRIBUTING.rst
+++ b/CONTRIBUTING.rst
@@ -573,17 +573,17 @@ This is the full list of those extras:
 
   .. START EXTRAS HERE
 
-airbyte, all, all_dbs, amazon, apache.atlas, apache.beam, apache.cassandra, 
apache.druid,
-apache.hdfs, apache.hive, apache.kylin, apache.livy, apache.pig, apache.pinot, 
apache.spark,
-apache.sqoop, apache.webhdfs, asana, async, atlas, aws, azure, cassandra, 
celery, cgroups, cloudant,
-cncf.kubernetes, crypto, dask, databricks, datadog, deprecated_api, devel, 
devel_all, devel_ci,
-devel_hadoop, dingding, discord, doc, docker, druid, elasticsearch, exasol, 
facebook, ftp, gcp,
-gcp_api, github_enterprise, google, google_auth, grpc, hashicorp, hdfs, hive, 
http, imap, jdbc,
-jenkins, jira, kerberos, kubernetes, ldap, leveldb, microsoft.azure, 
microsoft.mssql,
-microsoft.winrm, mongo, mssql, mysql, neo4j, odbc, openfaas, opsgenie, oracle, 
pagerduty, papermill,
-password, pinot, plexus, postgres, presto, qds, qubole, rabbitmq, redis, s3, 
salesforce, samba,
-segment, sendgrid, sentry, sftp, singularity, slack, snowflake, spark, sqlite, 
ssh, statsd, tableau,
-telegram, trino, vertica, virtualenv, webhdfs, winrm, yandex, zendesk
+airbyte, all, all_dbs, amazon, apache.atlas, apache.beam, apache.cassandra, 
apache.drill,
+apache.druid, apache.hdfs, apache.hive, apache.kylin, apache.livy, apache.pig, 
apache.pinot,
+apache.spark, apache.sqoop, apache.webhdfs, asana, async, atlas, aws, azure, 
cassandra, celery,
+cgroups, cloudant, cncf.kubernetes, crypto, dask, databricks, datadog, 
deprecated_api, devel,
+devel_all, devel_ci, devel_hadoop, dingding, discord, doc, docker, druid, 
elasticsearch, exasol,
+facebook, ftp, gcp, gcp_api, github_enterprise, google, google_auth, grpc, 
hashicorp, hdfs, hive,
+http, imap, jdbc, jenkins, jira, kerberos, kubernetes, ldap, leveldb, 
microsoft.azure,
+microsoft.mssql, microsoft.winrm, mongo, mssql, mysql, neo4j, odbc, openfaas, 
opsgenie, oracle,
+pagerduty, papermill, password, pinot, plexus, postgres, presto, qds, qubole, 
rabbitmq, redis, s3,
+salesforce, samba, segment, sendgrid, sentry, sftp, singularity, slack, 
snowflake, spark, sqlite,
+ssh, statsd, tableau, telegram, trino, vertica, virtualenv, webhdfs, winrm, 
yandex, zendesk
 
   .. END EXTRAS HERE
 
diff --git a/INSTALL b/INSTALL
index 111b51f..554af5c 100644
--- a/INSTALL
+++ b/INSTALL
@@ -1,6 +1,6 @@
 # INSTALL / BUILD instructions for Apache Airflow
 
-This ia a generic installation method that requires a number of dependencies 
to be installed.
+This is a generic installation method that requires a number of dependencies 
to be installed.
 
 Depending on your system you might need different 

[airflow] 08/17: Updates to FlaskAppBuilder 3.3.2+ (#17208)

2021-08-05 Thread potiuk
This is an automated email from the ASF dual-hosted git repository.

potiuk pushed a commit to branch v2-1-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 3c0f4d9908a81b662296e747230dba5ddb96f213
Author: Jarek Potiuk 
AuthorDate: Wed Jul 28 21:37:48 2021 +0200

Updates to FlaskAppBuilder 3.3.2+ (#17208)

There are some clarifications about using authentication
via FlaskAppBuilder - the change sets the minimum version of
FAB to 3.3.2 and clarifies that the dependencies used in the FAB 3 series
are only authlib rather than flask-oauth.

Fixes: #16944 (this is the second, proper fix this time).
(cherry picked from commit 6d7fa874ff201af7f602be9c58a827998814bdd1)
---
 setup.cfg |  2 +-
 setup.py  | 10 --
 2 files changed, 5 insertions(+), 7 deletions(-)

diff --git a/setup.cfg b/setup.cfg
index fbe58cb..d3c5f57 100644
--- a/setup.cfg
+++ b/setup.cfg
@@ -100,7 +100,7 @@ install_requires =
 #  https://github.com/readthedocs/sphinx_rtd_theme/issues/1115
 docutils<0.17
 flask>=1.1.0, <2.0
-flask-appbuilder~=3.3
+flask-appbuilder>=3.3.2, <4.0.0
 flask-caching>=1.5.0, <2.0.0
 flask-login>=0.3, <0.5
 flask-wtf>=0.14.3, <0.15
diff --git a/setup.py b/setup.py
index 46ff73d..c74808a 100644
--- a/setup.py
+++ b/setup.py
@@ -270,10 +270,8 @@ exasol = [
 facebook = [
 'facebook-business>=6.0.2',
 ]
-flask_oauth = [
-'Flask-OAuthlib>=0.9.1,<0.9.6',  # Flask OAuthLib 0.9.6 requires 
Flask-Login 0.5.0 - breaks FAB
-'oauthlib!=2.0.3,!=2.0.4,!=2.0.5,<3.0.0,>=1.1.2',
-'requests-oauthlib<1.2.0',
+flask_appbuilder_authlib = [
+'authlib',
 ]
 google = [
 'PyOpenSSL',
@@ -622,8 +620,8 @@ CORE_EXTRAS_REQUIREMENTS: Dict[str, List[str]] = {
 'cncf.kubernetes': kubernetes,  # also has provider, but it extends the 
core with the KubernetesExecutor
 'dask': dask,
 'deprecated_api': deprecated_api,
-'github_enterprise': flask_oauth,
-'google_auth': flask_oauth,
+'github_enterprise': flask_appbuilder_authlib,
+'google_auth': flask_appbuilder_authlib,
 'kerberos': kerberos,
 'ldap': ldap,
 'leveldb': leveldb,


[airflow] 03/17: Bump Jinja2 upper-bound from 2.12.0 to 4.0.0 (#16595)

2021-08-05 Thread potiuk
This is an automated email from the ASF dual-hosted git repository.

potiuk pushed a commit to branch v2-1-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit fe5920e5ed3e1158bffbc3a15d24d57c10d79843
Author: Ashwin Madavan 
AuthorDate: Thu Jun 24 15:07:23 2021 -0400

Bump Jinja2 upper-bound from 2.12.0 to 4.0.0 (#16595)

(cherry picked from commit 5d5268f5e553a7031ebfb08754c31fca5c13bda7)
---
 setup.cfg | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/setup.cfg b/setup.cfg
index 0e4868f..fbe58cb 100644
--- a/setup.cfg
+++ b/setup.cfg
@@ -114,7 +114,7 @@ install_requires =
 iso8601>=0.1.12
 # Logging is broken with itsdangerous > 2
 itsdangerous>=1.1.0, <2.0
-jinja2>=2.10.1, <2.12.0
+jinja2>=2.10.1,<4
 jsonschema~=3.0
 lazy-object-proxy
 lockfile>=0.12.2


[airflow] 01/17: Switch back http provider after requests removes LGPL dependency (#16974)

2021-08-05 Thread potiuk
This is an automated email from the ASF dual-hosted git repository.

potiuk pushed a commit to branch v2-1-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit ab3634a28cc5601c6bb78a81aaa7f349afa27377
Author: Jarek Potiuk 
AuthorDate: Tue Jul 13 22:13:30 2021 +0200

Switch back http provider after requests removes LGPL dependency (#16974)

Following the merge of https://github.com/psf/requests/pull/5797
and the requests 2.26.0 release without the LGPL chardet dependency,
we can now bring back http as a pre-installed provider, as it does
not pull in chardet automatically any more.

(cherry picked from commit c46e841519ef2df7dc40ff2596dd49c010514d87)
---
 docs/apache-airflow/extra-packages-ref.rst |  2 +-
 setup.py   | 13 ++---
 2 files changed, 7 insertions(+), 8 deletions(-)

diff --git a/docs/apache-airflow/extra-packages-ref.rst 
b/docs/apache-airflow/extra-packages-ref.rst
index b4b4bb4..b1dff07 100644
--- a/docs/apache-airflow/extra-packages-ref.rst
+++ b/docs/apache-airflow/extra-packages-ref.rst
@@ -258,7 +258,7 @@ Those are extras that provide support for integration with 
external systems via
 
+-+-+--+--+
 | grpc| ``pip install 'apache-airflow[grpc]'``  | 
Grpc hooks and operators |  |
 
+-+-+--+--+
-| http| ``pip install 'apache-airflow[http]'``  | 
HTTP hooks, operators and sensors|  |
+| http| ``pip install 'apache-airflow[http]'``  | 
HTTP hooks, operators and sensors|  *   |
 
+-+-+--+--+
 | imap| ``pip install 'apache-airflow[imap]'``  | 
IMAP hooks and sensors   |  *   |
 
+-+-+--+--+
diff --git a/setup.py b/setup.py
index 5d6f752..21097e6 100644
--- a/setup.py
+++ b/setup.py
@@ -231,13 +231,13 @@ dask = [
 'distributed>=2.11.1, <2.20',
 ]
 databricks = [
-'requests>=2.20.0, <3',
+'requests>=2.26.0, <3',
 ]
 datadog = [
 'datadog>=0.14.0',
 ]
 deprecated_api = [
-'requests>=2.20.0',
+'requests>=2.26.0',
 ]
 doc = [
 # Sphinx is limited to < 3.5.0 because of 
https://github.com/sphinx-doc/sphinx/issues/8880
@@ -330,7 +330,9 @@ hive = [
 'thrift>=0.9.2',
 ]
 http = [
-'requests>=2.20.0',
+# The 2.26.0 release of requests got rid of the chardet LGPL mandatory 
dependency, allowing us to
+# release it as a requirement for airflow
+'requests>=2.26.0',
 ]
 http_provider = [
 # NOTE ! The HTTP provider is NOT preinstalled by default when Airflow is 
installed - because it
@@ -810,12 +812,9 @@ EXTRAS_REQUIREMENTS = sort_extras_requirements()
 # Those providers are pre-installed always when airflow is installed.
 # Those providers do not have dependency on airflow2.0 because that would lead 
to circular dependencies.
 # This is not a problem for PIP but some tools (pipdeptree) show those as a 
warning.
-# NOTE ! The HTTP provider is NOT preinstalled by default when Airflow is 
installed - because it
-#depends on `requests` library and until `chardet` is mandatory 
dependency of `requests`
-#we cannot make it mandatory dependency. See 
https://github.com/psf/requests/pull/5797
 PREINSTALLED_PROVIDERS = [
 'ftp',
-# 'http',
+'http',
 'imap',
 'sqlite',
 ]


[airflow] 02/17: Remove SQLAlchemy <1.4 constraint (#16630)

2021-08-05 Thread potiuk
This is an automated email from the ASF dual-hosted git repository.

potiuk pushed a commit to branch v2-1-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 68977b9ed93e351f55ea93e1848be13ccef0
Author: Tzu-ping Chung 
AuthorDate: Thu Jun 24 21:10:15 2021 +0800

Remove SQLAlchemy <1.4 constraint (#16630)

This was added due to flask-sqlalchemy and sqlalchemy-utils not declaring
the upper bounds. They have since released sqlalchemy 1.4-compatible
versions, so we can remove that hack.

Note that this does *not* actually make us run on sqlalchemy 1.4 since
flask-appbuilder still has a <1.4 pin. But that's for flask-appbuilder
to worry about -- code in Airflow is compatible, so we can remove the
constraint now, and get sqlalchemy 1.4 as soon as flask-appbuilder
allows us to.

(cherry picked from commit d181604739c048c6969d8997dbaf8b159607904b)
---
 setup.cfg | 3 +--
 1 file changed, 1 insertion(+), 2 deletions(-)

diff --git a/setup.cfg b/setup.cfg
index 8b03296..0e4868f 100644
--- a/setup.cfg
+++ b/setup.cfg
@@ -150,8 +150,7 @@ install_requires =
 pyyaml>=5.1
 rich>=9.2.0
 setproctitle>=1.1.8, <2
-# SQLAlchemy 1.4 breaks sqlalchemy-utils 
https://github.com/kvesteri/sqlalchemy-utils/issues/505
-sqlalchemy>=1.3.18, <1.4
+sqlalchemy>=1.3.18
 sqlalchemy_jsonfield~=1.0
 # Required by vendored-in connexion
 swagger-ui-bundle>=0.0.2


[airflow] branch v2-1-test updated (4678cf54 -> 8fc7697)

2021-08-05 Thread potiuk
This is an automated email from the ASF dual-hosted git repository.

potiuk pushed a change to branch v2-1-test
in repository https://gitbox.apache.org/repos/asf/airflow.git.


 discard 4678cf54 Validate type of `priority_weight` during parsing (#16765)
 discard b558205  Fix calculating duration in tree view (#16695)
 discard 8e8b58a  Fix CLI 'kubernetes cleanup-pods' which fails on invalid 
label key (#17298)
 discard 012321b  Fail tasks in scheduler when executor reports they failed 
(#15929)
 discard 673d78a  fix(smart_sensor): Unbound variable errors (#14774)
 discard ac832a2  Updates to FlaskAppBuilder 3.3.2+ (#17208)
 discard f14860d  Update alias for field_mask in Google Memmcache (#16975)
 discard 4aef457  AIRFLOW-5529 Add Apache Drill provider. (#16884)
 discard ab46af9  bump dnspython (#16698)
 discard f3b74d8  Add type annotations to setup.py (#16658)
 discard c967d29  Bump Jinja2 upper-bound from 2.12.0 to 4.0.0 (#16595)
 discard e8058eb  Remove SQLAlchemy <1.4 constraint (#16630)
 discard 88e6305  Switch back http provider after requests removes LGPL 
dependency (#16974)
 discard 3d2a5cf  Switches to "/" convention in ghcr.io images with 
optimisations
 new ab3634a  Switch back http provider after requests removes LGPL 
dependency (#16974)
 new 68977b9  Remove SQLAlchemy <1.4 constraint (#16630)
 new fe5920e  Bump Jinja2 upper-bound from 2.12.0 to 4.0.0 (#16595)
 new dfeb73d  Add type annotations to setup.py (#16658)
 new 867adda  bump dnspython (#16698)
 new 42334b0  AIRFLOW-5529 Add Apache Drill provider. (#16884)
 new 37c935d  Update alias for field_mask in Google Memmcache (#16975)
 new 3c0f4d9  Updates to FlaskAppBuilder 3.3.2+ (#17208)
 new 7a43960  Enhancement to bash scripts (#17098)
 new 70794b0  Do not pull CI image for ownership fixing on first, fresh 
breeze run (#17419)
 new 38c7115  Increases timeout for helm chart builds (#17417)
 new 8609c82  Improve image building documentation for new users (#17409)
 new 106d9f0  Optimizes structure of the Dockerfiles and use latest tools 
(#17418)
 new 7371f98  Add timeout when asking whether to rebuild image (#17412)
 new 2df7e6e  Improve diagnostics message when users have secret_key 
misconfigured (#17410)
 new f196690  Fix failing static checks in main (#17424)
 new 8fc7697  Switches to "/" convention in ghcr.io images (#17356)

This update added new revisions after undoing existing revisions.
That is to say, some revisions that were in the old version of the
branch are not in the new version.  This situation occurs
when a user --force pushes a change and generates a repository
containing something like this:

 * -- * -- B -- O -- O -- O   (4678cf54)
\
 N -- N -- N   refs/heads/v2-1-test (8fc7697)

You should already have received notification emails for all of the O
revisions, and so the following emails describe only the N revisions
from the common base, B.

Any revisions marked "omit" are not gone; other references still
refer to them.  Any revisions marked "discard" are gone forever.

The 17 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.


Summary of changes:
 .dockerignore  |   4 +-
 .github/workflows/build-images.yml |   4 +-
 BREEZE.rst |   6 -
 Dockerfile.ci  |   4 +-
 README.md  | 123 -
 airflow/cli/commands/kubernetes_command.py |  12 +-
 airflow/jobs/scheduler_job.py  |   4 +-
 airflow/models/baseoperator.py |   7 +-
 airflow/sensors/smart_sensor.py|   1 -
 airflow/utils/log/file_task_handler.py |   9 ++
 airflow/www/static/js/tree.js  |   6 +-
 breeze |   3 -
 confirm|   6 +-
 docs/apache-airflow-providers/index.rst|   5 +
 docs/apache-airflow/start/docker-compose.yaml  |   6 +-
 docs/apache-airflow/start/docker.rst   |  12 ++
 docs/docker-stack/build.rst|  13 +++
 .../extending/add-apt-packages/Dockerfile  |   2 +-
 .../add-build-essential-extend/Dockerfile  |   2 +-
 .../{embedding-dags => add-providers}/Dockerfile   |   7 +-
 .../extending/add-pypi-packages/Dockerfile |   2 +-
 .../extending/embedding-dags/Dockerfile|   2 +-
 .../extending/writable-directory/Dockerfile|   2 +-
 scripts/ci/images/ci_prepare_ci_image_on_ci.sh |   2 +-
 scripts/ci/images/ci_prepare_prod_image_on_ci.sh   |   2 +-
 .../ci/images/ci_wait_for_and_verify_ci_image.sh   |   6 +-
 

[airflow] branch constraints-2-1 updated: Updated to latest constraints

2021-08-05 Thread potiuk
This is an automated email from the ASF dual-hosted git repository.

potiuk pushed a commit to branch constraints-2-1
in repository https://gitbox.apache.org/repos/asf/airflow.git


The following commit(s) were added to refs/heads/constraints-2-1 by this push:
 new ced5c9a  Updated to latest constraints
ced5c9a is described below

commit ced5c9a69fdc7609a8105ce633ac34e0179de16d
Author: Jarek Potiuk 
AuthorDate: Thu Aug 5 22:50:11 2021 +0200

Updated to latest constraints
---
 constraints-3.6.txt  | 26 +-
 constraints-3.7.txt  | 26 +-
 constraints-3.8.txt  | 26 +-
 constraints-3.9.txt  | 26 +-
 constraints-no-providers-3.6.txt |  4 ++--
 constraints-no-providers-3.7.txt |  4 ++--
 constraints-no-providers-3.8.txt |  4 ++--
 constraints-no-providers-3.9.txt |  4 ++--
 constraints-source-providers-3.6.txt | 26 +-
 constraints-source-providers-3.7.txt | 26 +-
 constraints-source-providers-3.8.txt | 26 +-
 constraints-source-providers-3.9.txt | 26 +-
 12 files changed, 112 insertions(+), 112 deletions(-)

diff --git a/constraints-3.6.txt b/constraints-3.6.txt
index a48e1e8..b7cfd10 100644
--- a/constraints-3.6.txt
+++ b/constraints-3.6.txt
@@ -119,7 +119,7 @@ argcomplete==1.12.3
 arrow==1.1.1
 asana==0.10.3
 asn1crypto==1.4.0
-astroid==2.6.5
+astroid==2.6.6
 async-generator==1.10
 async-timeout==3.0.1
 atlasclient==1.0.0
@@ -128,7 +128,7 @@ avro-python3==1.9.2.1
 aws-xray-sdk==2.8.0
 azure-batch==11.0.0
 azure-common==1.1.27
-azure-core==1.16.0
+azure-core==1.17.0
 azure-cosmos==3.2.0
 azure-datalake-store==0.0.52
 azure-identity==1.6.0
@@ -200,7 +200,7 @@ docutils==0.16
 ecdsa==0.17.0
 elasticsearch-dbapi==0.2.4
 elasticsearch-dsl==7.4.0
-elasticsearch==7.13.4
+elasticsearch==7.14.0
 email-validator==1.1.3
 entrypoints==0.3
 eventlet==0.31.1
@@ -218,7 +218,7 @@ fsspec==2021.7.0
 future==0.18.2
 gcsfs==2021.7.0
 geomet==0.2.1.post1
-gevent==21.1.2
+gevent==21.8.0
 gitdb==4.0.7
 github3.py==2.0.0
 google-ads==12.0.0
@@ -274,7 +274,7 @@ httplib2==0.19.1
 httpx==0.18.2
 humanize==3.11.0
 hvac==0.11.0
-identify==2.2.11
+identify==2.2.12
 idna-ssl==1.1.0
 idna==3.2
 ijson==3.1.4
@@ -317,7 +317,7 @@ mongomock==3.23.0
 monotonic==1.6
 more-itertools==8.8.0
 moreorless==0.4.0
-moto==2.2.0
+moto==2.2.1
 msal-extensions==0.3.0
 msal==1.13.0
 msgpack==1.0.2
@@ -419,10 +419,10 @@ python3-openid==3.2.0
 pytz==2021.1
 pytzdata==2020.1
 pywinrm==0.4.2
-pyzmq==22.1.0
+pyzmq==22.2.1
 qds-sdk==1.16.1
 redis==3.5.3
-regex==2021.7.6
+regex==2021.8.3
 requests-kerberos==0.12.0
 requests-mock==1.9.3
 requests-ntlm==1.1.0
@@ -431,7 +431,7 @@ requests-toolbelt==0.9.1
 requests==2.26.0
 responses==0.13.3
 rfc3986==1.5.0
-rich==10.6.0
+rich==10.7.0
 rsa==4.7.2
 s3transfer==0.4.2
 sasl==0.3.1
@@ -470,7 +470,7 @@ sphinxcontrib-redoc==1.6.0
 sphinxcontrib-serializinghtml==1.1.5
 sphinxcontrib-spelling==5.2.1
 spython==0.1.15
-sqlalchemy-drill==1.1.0
+sqlalchemy-drill==1.1.1
 sqlparse==0.4.1
 sshtunnel==0.1.5
 starkbank-ecdsa==1.1.1
@@ -486,7 +486,7 @@ textwrap3==0.9.2
 thrift-sasl==0.4.3
 thrift==0.13.0
 toml==0.10.2
-tomli==1.2.0
+tomli==1.2.1
 toolz==0.11.1
 tornado==6.1
 tqdm==4.62.0
@@ -509,8 +509,8 @@ wcwidth==0.2.5
 websocket-client==1.1.0
 wrapt==1.12.1
 xmltodict==0.12.0
-yamllint==1.26.1
-yandexcloud==0.99.0
+yamllint==1.26.2
+yandexcloud==0.100.0
 yarl==1.6.3
 zdesk==2.7.1
 zict==2.0.0
diff --git a/constraints-3.7.txt b/constraints-3.7.txt
index e6cda5a..29eaf21 100644
--- a/constraints-3.7.txt
+++ b/constraints-3.7.txt
@@ -119,7 +119,7 @@ argcomplete==1.12.3
 arrow==1.1.1
 asana==0.10.3
 asn1crypto==1.4.0
-astroid==2.6.5
+astroid==2.6.6
 async-generator==1.10
 async-timeout==3.0.1
 atlasclient==1.0.0
@@ -128,7 +128,7 @@ avro-python3==1.9.2.1
 aws-xray-sdk==2.8.0
 azure-batch==11.0.0
 azure-common==1.1.27
-azure-core==1.16.0
+azure-core==1.17.0
 azure-cosmos==3.2.0
 azure-datalake-store==0.0.52
 azure-identity==1.6.0
@@ -198,7 +198,7 @@ docutils==0.16
 ecdsa==0.17.0
 elasticsearch-dbapi==0.2.4
 elasticsearch-dsl==7.4.0
-elasticsearch==7.13.4
+elasticsearch==7.14.0
 email-validator==1.1.3
 entrypoints==0.3
 eventlet==0.31.1
@@ -216,7 +216,7 @@ fsspec==2021.7.0
 future==0.18.2
 gcsfs==2021.7.0
 geomet==0.2.1.post1
-gevent==21.1.2
+gevent==21.8.0
 gitdb==4.0.7
 github3.py==2.0.0
 google-ads==13.0.0
@@ -272,7 +272,7 @@ httplib2==0.19.1
 httpx==0.18.2
 humanize==3.11.0
 hvac==0.11.0
-identify==2.2.11
+identify==2.2.12
 idna==3.2
 ijson==3.1.4
 imagesize==1.2.0
@@ -315,7 +315,7 @@ mongomock==3.23.0
 monotonic==1.6
 more-itertools==8.8.0
 moreorless==0.4.0
-moto==2.2.0
+moto==2.2.1
 msal-extensions==0.3.0
 msal==1.13.0
 msgpack==1.0.2
@@ -417,10 +417,10 @@ python3-openid==3.2.0
 pytz==2021.1
 pytzdata==2020.1
 pywinrm==0.4.2
-pyzmq==22.1.0
+pyzmq==22.2.1
 

[GitHub] [airflow] fatmumuhomer commented on a change in pull request #17236: [Airflow 16364] Add conn_timeout and cmd_timeout params to SSHOperator; add conn_timeout param to SSHHook

2021-08-05 Thread GitBox


fatmumuhomer commented on a change in pull request #17236:
URL: https://github.com/apache/airflow/pull/17236#discussion_r683763029



##
File path: airflow/providers/ssh/operators/ssh.py
##
@@ -77,9 +87,24 @@ def __init__(
 self.remote_host = remote_host
 self.command = command
 self.timeout = timeout
+self.conn_timeout = conn_timeout
+self.cmd_timeout = cmd_timeout
+if self.conn_timeout is None:
+self.conn_timeout = self.timeout if self.timeout else 
TIMEOUT_DEFAULT

Review comment:
   Good catch! A test around this case would have caught this, too, so my 
fault for not creating one. I'll tackle both of those tonight.




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] dstandish commented on a change in pull request #17433: KEDA task count query should ignore k8s queue

2021-08-05 Thread GitBox


dstandish commented on a change in pull request #17433:
URL: https://github.com/apache/airflow/pull/17433#discussion_r683756891



##
File path: chart/templates/workers/worker-kedaautoscaler.yaml
##
@@ -49,5 +49,7 @@ spec:
 query: >-
   SELECT ceil(COUNT(*)::decimal / {{ 
.Values.config.celery.worker_concurrency }})
   FROM task_instance
-  WHERE state='running' OR state='queued'
+  WHERE (state='running' OR state='queued')
+{{ $k8s_queue := default (printf "kubernetes") 
.Values.config.celery_kubernetes_executor.kubernetes_queue -}}
+{{ eq .Values.executor "CeleryKubernetesExecutor" | ternary (printf "AND queue 
!= '%s'" $k8s_queue) (print "") | indent 14 }}

Review comment:
   looks good, thanks
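   
   For illustration, assuming the chart defaults of `worker_concurrency: 16` and the `kubernetes` queue name (both assumptions here), the template would render the scaler query roughly as:
   
   ```sql
   SELECT ceil(COUNT(*)::decimal / 16)
   FROM task_instance
   WHERE (state='running' OR state='queued')
     AND queue != 'kubernetes'
   ```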




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] ephraimbuddy commented on pull request #17347: Handle and log exceptions raised during task callback

2021-08-05 Thread GitBox


ephraimbuddy commented on pull request #17347:
URL: https://github.com/apache/airflow/pull/17347#issuecomment-893714524


   @SamWheating please rebase again


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] jedcunningham commented on pull request #17445: Add info log how to fix: More than one pod running with labels

2021-08-05 Thread GitBox


jedcunningham commented on pull request #17445:
URL: https://github.com/apache/airflow/pull/17445#issuecomment-893700450


   As I mentioned in the other PR, I think the right fix is to only attempt to 
reattach to running pods. If we were to offer advice in a log message, setting 
`reattach_on_restart` to false might be better advice?


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] ephraimbuddy closed pull request #17425: Use `dag_maker` fixture in models/test_taskinstance.py

2021-08-05 Thread GitBox


ephraimbuddy closed pull request #17425:
URL: https://github.com/apache/airflow/pull/17425


   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] github-actions[bot] commented on pull request #17414: Allow custom timetable as a DAG argument

2021-08-05 Thread GitBox


github-actions[bot] commented on pull request #17414:
URL: https://github.com/apache/airflow/pull/17414#issuecomment-893693795


   The PR most likely needs to run full matrix of tests because it modifies 
parts of the core of Airflow. However, committers might decide to merge it 
quickly and take the risk. If they don't merge it quickly - please rebase it to 
the latest main at your convenience, or amend the last commit of the PR, and 
push it with --force-with-lease.


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] jedcunningham commented on a change in pull request #17236: [Airflow 16364] Add conn_timeout and cmd_timeout params to SSHOperator; add conn_timeout param to SSHHook

2021-08-05 Thread GitBox


jedcunningham commented on a change in pull request #17236:
URL: https://github.com/apache/airflow/pull/17236#discussion_r683689193



##
File path: tests/providers/ssh/hooks/test_ssh.py
##
@@ -475,6 +504,122 @@ def 
test_ssh_connection_with_no_host_key_where_no_host_key_check_is_false(self,
 assert ssh_client.return_value.connect.called is True
 assert 
ssh_client.return_value.get_host_keys.return_value.add.called is False
 
+@mock.patch('airflow.providers.ssh.hooks.ssh.paramiko.SSHClient')
+def test_ssh_connection_with_conn_timeout(self, ssh_mock):
+hook = SSHHook(
+remote_host='remote_host',
+port='port',
+username='username',
+password='password',
+conn_timeout=20,
+key_file='fake.file',
+)
+
+with hook.get_conn():
+ssh_mock.return_value.connect.assert_called_once_with(
+hostname='remote_host',
+username='username',
+password='password',
+key_filename='fake.file',
+timeout=20,
+compress=True,
+port='port',
+sock=None,
+look_for_keys=True,
+)
+
+@mock.patch('airflow.providers.ssh.hooks.ssh.paramiko.SSHClient')
+def test_ssh_connection_with_conn_timeout_and_timeout(self, ssh_mock):
+hook = SSHHook(
+remote_host='remote_host',
+port='port',
+username='username',
+password='password',
+timeout=10,
+conn_timeout=20,
+key_file='fake.file',
+)
+
+with hook.get_conn():
+ssh_mock.return_value.connect.assert_called_once_with(
+hostname='remote_host',
+username='username',
+password='password',
+key_filename='fake.file',
+timeout=20,
+compress=True,
+port='port',
+sock=None,
+look_for_keys=True,
+)
+
+@mock.patch('airflow.providers.ssh.hooks.ssh.paramiko.SSHClient')
+def test_ssh_connection_with_timeout_extra(self, ssh_mock):
+hook = SSHHook(
+ssh_conn_id=self.CONN_SSH_WITH_TIMEOUT_EXTRA,
+remote_host='remote_host',
+port='port',
+username='username',
+timeout=10,
+)
+
+with hook.get_conn():
+ssh_mock.return_value.connect.assert_called_once_with(
+hostname='remote_host',
+username='username',
+timeout=20,
+compress=True,
+port='port',
+sock=None,
+look_for_keys=True,
+)
+
+@mock.patch('airflow.providers.ssh.hooks.ssh.paramiko.SSHClient')
+def test_ssh_connection_with_conn_timeout_extra(self, ssh_mock):
+hook = SSHHook(
+ssh_conn_id=self.CONN_SSH_WITH_CONN_TIMEOUT_EXTRA,
+remote_host='remote_host',
+port='port',
+username='username',
+timeout=10,
+conn_timeout=15,
+)
+
+# conn_timeout parameter wins over extra options
+with hook.get_conn():
+ssh_mock.return_value.connect.assert_called_once_with(
+hostname='remote_host',
+username='username',
+timeout=15,
+compress=True,
+port='port',
+sock=None,
+look_for_keys=True,
+)
+
+@mock.patch('airflow.providers.ssh.hooks.ssh.paramiko.SSHClient')
+def test_ssh_connection_with_timeout_extra_and_conn_timeout_extra(self, 
ssh_mock):
+hook = SSHHook(
+ssh_conn_id=self.CONN_SSH_WITH_TIMEOUT_AND_CONN_TIMEOUT_EXTRA,
+remote_host='remote_host',
+port='port',
+username='username',
+timeout=10,
+conn_timeout=15,
+)
+
+# conn_timeout parameter wins over extra options
+with hook.get_conn():
+ssh_mock.return_value.connect.assert_called_once_with(
+hostname='remote_host',
+username='username',
+timeout=15,
+compress=True,
+port='port',
+sock=None,
+look_for_keys=True,
+)

Review comment:
   Aren't we missing a test where neither `timeout` nor `conn_timeout` is 
passed and we use the value from the connection extras? Also missing a test 
where the default is used if the connection doesn't have one set?
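   
   For the first case, a sketch along these lines might do (reusing the mock
   pattern and connection ids from the diff; the expected value of 20 assumes
   the extras carry `timeout=20` as in the tests above):
   
   ```python
   @mock.patch('airflow.providers.ssh.hooks.ssh.paramiko.SSHClient')
   def test_ssh_connection_timeout_only_from_extra(self, ssh_mock):
       # neither `timeout` nor `conn_timeout` passed: the value should
       # come from the connection extras
       hook = SSHHook(
           ssh_conn_id=self.CONN_SSH_WITH_TIMEOUT_EXTRA,
           remote_host='remote_host',
           port='port',
           username='username',
       )
       with hook.get_conn():
           ssh_mock.return_value.connect.assert_called_once_with(
               hostname='remote_host',
               username='username',
               timeout=20,
               compress=True,
               port='port',
               sock=None,
               look_for_keys=True,
           )
   ```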

##
File path: airflow/providers/ssh/operators/ssh.py
##
@@ -43,7 +46,12 @@ class SSHOperator(BaseOperator):
 :type remote_host: str
 :param command: command to execute on remote host. (templated)
 :type command: str
-:param timeout: timeout (in seconds) 

[GitHub] [airflow] BasPH commented on a change in pull request #17451: Add date format filters to Jinja environment

2021-08-05 Thread GitBox


BasPH commented on a change in pull request #17451:
URL: https://github.com/apache/airflow/pull/17451#discussion_r683673558



##
File path: docs/apache-airflow/templates-ref.rst
##
@@ -105,14 +105,35 @@ For example, you could use expressions in your templates 
like ``{{ conn.my_conn_
 Just like with ``var`` it's possible to fetch a connection by string  (e.g. 
``{{ conn.get('my_conn_id_'+index).host }}``
 ) or provide defaults (e.g ``{{ conn.get('my_conn_id', {"host": "host1", 
"login": "user1"}).host }}``)
 
+Filters
+-------
+
+Airflow defines some Jinja filters that can be used to format values.
+
+For example, using ``{{ execution_date | ds }}`` will output the
+execution_date in the ``YYYY-MM-DD`` format.
+
+=====================  ===========  ===============================================================
+Filter                 Operates on  Description
+=====================  ===========  ===============================================================
+``ds``                 datetime     Format the datetime as ``YYYY-MM-DD``
+``ds_no_dash``         datetime     Format the datetime as ``YYYYMMDD``
+``ts``                 datetime     Same as ``.isoformat()``. Example: ``2018-01-01T00:00:00+00:00``
+``ts_no_dash``         datetime     Same as ``ts`` filter without ``-``, ``:`` or TimeZone info.
+                                    Example: ``20180101T000000``
+``ts_nodash_with_tz``  datetime     As ``ts`` filter without ``-`` or ``:``. Example:
+                                    ``20180101T000000+0000``
+=====================  ===========  ===============================================================

Review comment:
   Doesn't align with the filter name
   ```suggestion
   ``ts_no_dash_with_tz``  datetime  As ``ts`` filter without ``-`` or 
``:``. Example
   ```

##
File path: airflow/templates.py
##
@@ -30,3 +35,32 @@ def is_safe_attribute(self, obj, attr, value):
 ``_``) whilst still blocking internal or truely private attributes 
(``__`` prefixed ones).
 """
 return not jinja2.sandbox.is_internal_attribute(obj, attr)
+
+
+def ds_filter(value):
+return value.strftime('%Y-%m-%d')
+
+
+def ds_nodash_filter(value):
+return value.strftime('%Y%m%d')
+
+
+def ts_filter(value):
+return value.isoformat()
+
+
+def ts_nodash_filter(value):
+return value.strftime('%Y%m%dT%H%M%S')
+
+
+def ts_nodash_with_tz_filter(value):
+return value.isoformat().replace('-', '').replace(':', '')
+
+
+FILTERS = {
+'ds': ds_filter,
+'ds_no_dash': ds_nodash_filter,
+'ts': ts_filter,
+'ts_no_dash': ts_nodash_filter,
+'ts_no_dash_with_tz': ts_nodash_with_tz_filter,

Review comment:
   Bit nitpicky, but I think keeping the "no_dash" naming equal avoids any 
confusion.
   
   ```suggestion
   def ds_no_dash_filter(value):
   return value.strftime('%Y%m%d')
   
   
   def ts_filter(value):
   return value.isoformat()
   
   
   def ts_no_dash_filter(value):
   return value.strftime('%Y%m%dT%H%M%S')
   
   
   def ts_no_dash_with_tz_filter(value):
   return value.isoformat().replace('-', '').replace(':', '')
   
   
   FILTERS = {
   'ds': ds_filter,
   'ds_no_dash': ds_no_dash_filter,
   'ts': ts_filter,
   'ts_no_dash': ts_no_dash_filter,
   'ts_no_dash_with_tz': ts_no_dash_with_tz_filter,
   ```
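   
   As context, a minimal sketch of how such a FILTERS map is applied to a Jinja
   environment (standalone illustration, not Airflow's actual wiring):
   
   ```python
   import datetime
   import jinja2
   
   def ds_filter(value):
       return value.strftime('%Y-%m-%d')
   
   env = jinja2.Environment()
   env.filters['ds'] = ds_filter  # same shape as env.filters.update(FILTERS)
   
   tmpl = env.from_string('{{ execution_date | ds }}')
   print(tmpl.render(execution_date=datetime.datetime(2018, 1, 1)))  # 2018-01-01
   ```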

##
File path: docs/apache-airflow/index.rst
##
@@ -110,7 +110,7 @@ unit of work and continuity.
 
 Operators and hooks 
 CLI 
-Macros 
+Templates 

Review comment:
   Why not name it something like "context" or "task context"?




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] jedcunningham commented on a change in pull request #17433: KEDA task count query should ignore k8s queue

2021-08-05 Thread GitBox


jedcunningham commented on a change in pull request #17433:
URL: https://github.com/apache/airflow/pull/17433#discussion_r683648189



##
File path: chart/templates/workers/worker-kedaautoscaler.yaml
##
@@ -49,5 +49,7 @@ spec:
 query: >-
   SELECT ceil(COUNT(*)::decimal / {{ 
.Values.config.celery.worker_concurrency }})
   FROM task_instance
-  WHERE state='running' OR state='queued'
+  WHERE (state='running' OR state='queued')
+{{ $k8s_queue := default (printf "kubernetes") 
.Values.config.celery_kubernetes_executor.kubernetes_queue -}}
+{{ eq .Values.executor "CeleryKubernetesExecutor" | ternary (printf "AND queue 
!= '%s'" $k8s_queue) (print "") | indent 14 }}

Review comment:
   ```suggestion
 WHERE (state='running' OR state='queued')
 {{- if eq .Values.executor "CeleryKubernetesExecutor" }}
  AND queue != '{{ .Values.config.celery_kubernetes_executor.kubernetes_queue }}'
 {{- end }}
   ```
   
   Do we need the default if we've set it in values already? I think this is 
easier to read.
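   
   For reference, a small Python illustration of what the rendered query
   computes (toy data; state and queue semantics as in the chart):
   
   ```python
   import math
   
   # (state, queue) pairs standing in for task_instance rows
   tasks = [("running", "default"), ("queued", "default"), ("queued", "kubernetes")]
   worker_concurrency = 16
   
   # with CeleryKubernetesExecutor, tasks on the kubernetes queue are excluded
   eligible = sum(
       1 for state, queue in tasks
       if state in ("running", "queued") and queue != "kubernetes"
   )
   print(math.ceil(eligible / worker_concurrency))  # -> 1 worker wanted
   ```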




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[airflow] 04/05: Fix calculating duration in tree view (#16695)

2021-08-05 Thread jhtimmins
This is an automated email from the ASF dual-hosted git repository.

jhtimmins pushed a commit to branch v2-1-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit b558205e817f168df7e5c37c9f006a6af31d576e
Author: Brent Bovenzi 
AuthorDate: Mon Jun 28 11:23:19 2021 -0400

Fix calculating duration in tree view (#16695)

Make sure moment doesn't default the end_date to now and show the wrong 
duration

(cherry picked from commit f0b3345ddc489627d73d190a1401804e7b0d9c4e)
---
 airflow/www/static/js/tree.js | 6 +-
 1 file changed, 5 insertions(+), 1 deletion(-)

diff --git a/airflow/www/static/js/tree.js b/airflow/www/static/js/tree.js
index 07daf0e..4bf366a 100644
--- a/airflow/www/static/js/tree.js
+++ b/airflow/www/static/js/tree.js
@@ -305,7 +305,11 @@ document.addEventListener('DOMContentLoaded', () => {
   .style('stroke-opacity', (d) => (d.external_trigger ? '0' : '1'))
   .on('mouseover', function (d) {
 // Calculate duration if it doesn't exist
-const tt = tiTooltip({ ...d, duration: d.duration || 
moment(d.end_date).diff(d.start_date, 'seconds') });
+const tt = tiTooltip({
+  ...d,
+  // if end_date is undefined then moment will default to now instead 
of null
+  duration: d.duration || d.end_date ? 
moment(d.end_date).diff(d.start_date, 'seconds') : null,
+});
 taskTip.direction('n');
 taskTip.show(tt, this);
 d3.select(this).transition().duration(duration)


[airflow] 02/05: Fail tasks in scheduler when executor reports they failed (#15929)

2021-08-05 Thread jhtimmins
This is an automated email from the ASF dual-hosted git repository.

jhtimmins pushed a commit to branch v2-1-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 012321b1325c8d810ae60ad7006ab9f22dfaf95e
Author: Ephraim Anierobi 
AuthorDate: Thu May 20 11:22:01 2021 +0100

Fail tasks in scheduler when executor reports they failed (#15929)

When a task fails in executor while still queued in scheduler, the executor 
reports
this failure but scheduler doesn't change the task state resulting in the 
task
being queued until the scheduler is restarted. This commit fixes it by 
ensuring
that when a task is reported to have failed in the executor, the task is 
failed
in scheduler

(cherry picked from commit deececcabc080844ca89272a2e4ab1183cd51e3f)
---
 airflow/jobs/scheduler_job.py| 4 +++-
 tests/jobs/test_scheduler_job.py | 2 +-
 2 files changed, 4 insertions(+), 2 deletions(-)

diff --git a/airflow/jobs/scheduler_job.py b/airflow/jobs/scheduler_job.py
index b99f4b2..1758ae1 100644
--- a/airflow/jobs/scheduler_job.py
+++ b/airflow/jobs/scheduler_job.py
@@ -1252,12 +1252,14 @@ class SchedulerJob(BaseJob):
 "task says its %s. (Info: %s) Was the task killed 
externally?"
 )
 self.log.error(msg, ti, state, ti.state, info)
+
 request = TaskCallbackRequest(
 full_filepath=ti.dag_model.fileloc,
 simple_task_instance=SimpleTaskInstance(ti),
 msg=msg % (ti, state, ti.state, info),
 )
-
+self.log.info('Setting task instance %s state to %s as 
reported by executor', ti, state)
+ti.set_state(state)
 self.processor_agent.send_callback_to_execute(request)
 
 return len(event_buffer)
diff --git a/tests/jobs/test_scheduler_job.py b/tests/jobs/test_scheduler_job.py
index 0d1f530..37ae65b 100644
--- a/tests/jobs/test_scheduler_job.py
+++ b/tests/jobs/test_scheduler_job.py
@@ -907,7 +907,7 @@ class TestSchedulerJob(unittest.TestCase):
 
 self.scheduler_job._process_executor_events(session=session)
 ti1.refresh_from_db()
-assert ti1.state == State.QUEUED
+assert ti1.state == State.FAILED
 mock_task_callback.assert_called_once_with(
 full_filepath='/test_path1/',
 simple_task_instance=mock.ANY,
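
A self-contained toy model of the behaviour this commit describes (hypothetical
names; the real logic lives in SchedulerJob._process_executor_events):

```python
from enum import Enum

class State(str, Enum):
    QUEUED = "queued"
    FAILED = "failed"

def process_executor_events(event_buffer, task_states):
    """Apply executor-reported terminal states to still-queued tasks."""
    for key, reported in event_buffer.items():
        if task_states.get(key) == State.QUEUED and reported == State.FAILED:
            # before the fix, only a callback was sent and the task
            # instance stayed QUEUED until a scheduler restart
            task_states[key] = State.FAILED
    return task_states

print(process_executor_events({"t1": State.FAILED}, {"t1": State.QUEUED}))
# -> {'t1': <State.FAILED: 'failed'>}
```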


[airflow] 01/05: fix(smart_sensor): Unbound variable errors (#14774)

2021-08-05 Thread jhtimmins
This is an automated email from the ASF dual-hosted git repository.

jhtimmins pushed a commit to branch v2-1-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 673d78a6cc78038121d3f5e99caa6ded488d654a
Author: Shivansh Saini 
AuthorDate: Thu Jun 24 03:52:27 2021 +0530

fix(smart_sensor): Unbound variable errors (#14774)

Signed-off-by: Shivansh Saini 

Closes #14770

(cherry picked from commit 4aec25a80e3803238cf658c416c8e6d3975a30f6)
---
 airflow/sensors/smart_sensor.py | 1 +
 1 file changed, 1 insertion(+)

diff --git a/airflow/sensors/smart_sensor.py b/airflow/sensors/smart_sensor.py
index c8c5ba7..8755eb5 100644
--- a/airflow/sensors/smart_sensor.py
+++ b/airflow/sensors/smart_sensor.py
@@ -435,6 +435,7 @@ class SmartSensorOperator(BaseOperator, SkipMixin):
 TI = TaskInstance
 
 count_marked = 0
+query_result = []
 try:
 query_result = (
 session.query(TI, SI)
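
A self-contained toy of the failure mode being fixed (hypothetical names):

```python
def might_fail():
    raise RuntimeError("db error")

def mark_state():
    query_result = []  # the added initialisation
    try:
        # if this raises, query_result would otherwise be undefined below,
        # producing the UnboundLocalError the commit fixes
        query_result = might_fail()
    except RuntimeError:
        pass
    return len(query_result)

print(mark_state())  # -> 0
```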


[airflow] 05/05: Validate type of `priority_weight` during parsing (#16765)

2021-08-05 Thread jhtimmins
This is an automated email from the ASF dual-hosted git repository.

jhtimmins pushed a commit to branch v2-1-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 4678cf54b5cd651e0232a42746f9be80db43a609
Author: Kaxil Naik 
AuthorDate: Fri Jul 2 01:52:50 2021 +0100

Validate type of `priority_weight` during parsing (#16765)

closes https://github.com/apache/airflow/issues/16762

Without this the scheduler crashes as validation does not happen at DAG 
Parsing time.

(cherry picked from commit 9d170279a60d9d4ed513bae1c3526f042662)
---
 airflow/models/baseoperator.py| 7 ++-
 tests/models/test_baseoperator.py | 5 +
 2 files changed, 11 insertions(+), 1 deletion(-)

diff --git a/airflow/models/baseoperator.py b/airflow/models/baseoperator.py
index 10e8bfd..1fec8cf 100644
--- a/airflow/models/baseoperator.py
+++ b/airflow/models/baseoperator.py
@@ -586,10 +586,15 @@ class BaseOperator(Operator, LoggingMixin, TaskMixin, 
metaclass=BaseOperatorMeta
 if isinstance(max_retry_delay, timedelta):
 self.max_retry_delay = max_retry_delay
 else:
-self.log.debug("Max_retry_delay isn't timedelta object, 
assuming secs")
+self.log.debug("max_retry_delay isn't a timedelta object, 
assuming secs")
 self.max_retry_delay = timedelta(seconds=max_retry_delay)
 
 self.params = params or {}  # Available in templates!
+if priority_weight is not None and not isinstance(priority_weight, 
int):
+raise AirflowException(
+f"`priority_weight` for task '{self.task_id}' only accepts 
integers, "
+f"received '{type(priority_weight)}'."
+)
 self.priority_weight = priority_weight
 if not WeightRule.is_valid(weight_rule):
 raise AirflowException(
diff --git a/tests/models/test_baseoperator.py 
b/tests/models/test_baseoperator.py
index fa02b4e..04d3f54 100644
--- a/tests/models/test_baseoperator.py
+++ b/tests/models/test_baseoperator.py
@@ -109,6 +109,11 @@ class TestBaseOperator(unittest.TestCase):
 with pytest.raises(AirflowException, 
match='Argument.*test_param.*required'):
 DummyClass(default_args=default_args)
 
+def test_incorrect_priority_weight(self):
+error_msg = "`priority_weight` for task 'test_op' only accepts 
integers, received '<class 'str'>'."
+with pytest.raises(AirflowException, match=error_msg):
+DummyOperator(task_id="test_op", priority_weight="2")
+
 @parameterized.expand(
 [
 ("{{ foo }}", {"foo": "bar"}, "bar"),


[airflow] branch v2-1-test updated (ac832a2 -> 4678cf54)

2021-08-05 Thread jhtimmins
This is an automated email from the ASF dual-hosted git repository.

jhtimmins pushed a change to branch v2-1-test
in repository https://gitbox.apache.org/repos/asf/airflow.git.


from ac832a2  Updates to FlaskAppBuilder 3.3.2+ (#17208)
 new 673d78a  fix(smart_sensor): Unbound variable errors (#14774)
 new 012321b  Fail tasks in scheduler when executor reports they failed 
(#15929)
 new 8e8b58a  Fix CLI 'kubernetes cleanup-pods' which fails on invalid 
label key (#17298)
 new b558205  Fix calculating duration in tree view (#16695)
 new 4678cf54 Validate type of `priority_weight` during parsing (#16765)

The 5 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.


Summary of changes:
 airflow/cli/commands/kubernetes_command.py| 12 ++--
 airflow/jobs/scheduler_job.py |  4 +++-
 airflow/models/baseoperator.py|  7 ++-
 airflow/sensors/smart_sensor.py   |  1 +
 airflow/www/static/js/tree.js |  6 +-
 tests/cli/commands/test_kubernetes_command.py |  8 ++--
 tests/jobs/test_scheduler_job.py  |  2 +-
 tests/models/test_baseoperator.py |  5 +
 8 files changed, 25 insertions(+), 20 deletions(-)


[airflow] 03/05: Fix CLI 'kubernetes cleanup-pods' which fails on invalid label key (#17298)

2021-08-05 Thread jhtimmins
This is an automated email from the ASF dual-hosted git repository.

jhtimmins pushed a commit to branch v2-1-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 8e8b58aa0a96164c7142d795a9c919d50e2a9aa1
Author: Damir Lampa <60409723+dla...@users.noreply.github.com>
AuthorDate: Thu Jul 29 14:17:51 2021 -0600

Fix CLI 'kubernetes cleanup-pods' which fails on invalid label key (#17298)

Fix for #16013 - CLI 'kubernetes cleanup-pods' fails on invalid label key

(cherry picked from commit 36bdfe8d0ef7e5fc428434f8313cf390ee9acc8f)
---
 airflow/cli/commands/kubernetes_command.py| 12 ++--
 tests/cli/commands/test_kubernetes_command.py |  8 ++--
 2 files changed, 4 insertions(+), 16 deletions(-)

diff --git a/airflow/cli/commands/kubernetes_command.py 
b/airflow/cli/commands/kubernetes_command.py
index 3c3c8e6..2660dae 100644
--- a/airflow/cli/commands/kubernetes_command.py
+++ b/airflow/cli/commands/kubernetes_command.py
@@ -96,16 +96,8 @@ def cleanup_pods(args):
 'try_number',
 'airflow_version',
 ]
-list_kwargs = {
-"namespace": namespace,
-"limit": 500,
-"label_selector": client.V1LabelSelector(
-match_expressions=[
-client.V1LabelSelectorRequirement(key=label, operator="Exists")
-for label in airflow_pod_labels
-]
-),
-}
+list_kwargs = {"namespace": namespace, "limit": 500, "label_selector": 
','.join(airflow_pod_labels)}
+
 while True:
 pod_list = kube_client.list_namespaced_pod(**list_kwargs)
 for pod in pod_list.items:
diff --git a/tests/cli/commands/test_kubernetes_command.py 
b/tests/cli/commands/test_kubernetes_command.py
index f2a8605..490c7fa 100644
--- a/tests/cli/commands/test_kubernetes_command.py
+++ b/tests/cli/commands/test_kubernetes_command.py
@@ -55,12 +55,8 @@ class TestGenerateDagYamlCommand(unittest.TestCase):
 
 
 class TestCleanUpPodsCommand(unittest.TestCase):
-label_selector = kubernetes.client.V1LabelSelector(
-match_expressions=[
-kubernetes.client.V1LabelSelectorRequirement(key=label, 
operator="Exists")
-for label in ['dag_id', 'task_id', 'execution_date', 'try_number', 
'airflow_version']
-]
-)
+
+label_selector = ','.join(['dag_id', 'task_id', 'execution_date', 
'try_number', 'airflow_version'])
 
 @classmethod
 def setUpClass(cls):
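
For context, a minimal sketch of the call shape the fix relies on:
list_namespaced_pod expects a comma-separated selector *string* rather than a
V1LabelSelector object (namespace name assumed):

```python
from kubernetes import client, config

config.load_kube_config()  # or load_incluster_config() inside a cluster
kube = client.CoreV1Api()
pods = kube.list_namespaced_pod(
    namespace="airflow",  # assumed namespace
    limit=500,
    # comma-joined keys select pods that carry all of these labels
    label_selector="dag_id,task_id,execution_date,try_number,airflow_version",
)
print(len(pods.items))
```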


[airflow] branch main updated: Switches to "/" convention in ghcr.io images (#17356)

2021-08-05 Thread potiuk
This is an automated email from the ASF dual-hosted git repository.

potiuk pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/airflow.git


The following commit(s) were added to refs/heads/main by this push:
 new 1bd3a5c  Switches to "/" convention in ghcr.io images (#17356)
1bd3a5c is described below

commit 1bd3a5c68c88cf3840073d6276460a108f864187
Author: Jarek Potiuk 
AuthorDate: Thu Aug 5 18:39:43 2021 +0200

Switches to "/" convention in ghcr.io images (#17356)

We are using ghcr.io as image cache for our CI builds and Breeze
and it seems ghcr.io is being "rebuilt" while running.

We had been using "airflow-.." image convention before,
bacause multiple nesting levels of images were not supported,
however we experienced errors recently with pushing 2.1 images
(https://issues.apache.org/jira/browse/INFRA-22124) and during
investigation it turned out, that it is possible now to use "/"
in the name of the image, and while it still does not introduce
multiple nesting levels and folder structure, the UI of GitHub
treats it like that and if you have image which starts wiht
"airflow/", the airflow prefix is stripped out and you can also
have even more "/" in then name to introduce further hierarchy.

Since we have to change image naming convention due to (still
unresolved) bug with no permission to push the v2-1-test image
we've decided to change naming convention for all our cache
images to follow this - now available - "/" connvention to make
it better structured and easier to manage/understand.

Some more optimisations are implemented - Python, prod-build and
ci-manifest images are only pushed when "latest" image is prepared.
They are not needed for the COMMIT builds because we only need
final images for those builds. This simplified the code quite
a bit.

The push of cache image in CI is done in one job for both
CI and PROD images and the image is rebuilt again with
latest constraints, to account for the latest constraints
but to make sure that UPGRADE_TO_NEWER_DEPENDENCIES
is not set during the build (which invalidates the cache
for next non-upgrade builds)

Backwards-compatibility was implemented to allow PRs that have
not been upgraded to continue building after this one is merged,
also a workaround has been implemented to make this change
to work even if it is not merged yet to main.

This "legacy" mode will be removed in ~week when everybody rebase
on top of main.

Documentation is updated reflecting those changes.
---
 .github/workflows/build-images.yml |  18 +++
 .github/workflows/ci.yml   | 161 +++--
 CI.rst |  51 ---
 IMAGES.rst |  24 +--
 README.md  |   2 +-
 breeze |  17 +--
 dev/retag_docker_images.py |   9 +-
 scripts/ci/images/ci_prepare_ci_image_on_ci.sh |  19 +--
 scripts/ci/images/ci_prepare_prod_image_on_ci.sh   |  29 +---
 ...ify_ci_image.sh => ci_push_legacy_ci_images.sh} |  35 +
 ...y_ci_image.sh => ci_push_legacy_prod_images.sh} |  35 +
 .../images/ci_wait_for_and_verify_all_ci_images.sh |   2 +
 .../ci_wait_for_and_verify_all_prod_images.sh  |   2 +
 .../ci/images/ci_wait_for_and_verify_ci_image.sh   |  27 ++--
 .../ci/images/ci_wait_for_and_verify_prod_image.sh |  32 ++--
 scripts/ci/libraries/_build_images.sh  | 109 --
 scripts/ci/libraries/_initialization.sh|  16 +-
 scripts/ci/libraries/_kind.sh  |  16 +-
 scripts/ci/libraries/_parallel.sh  |   7 +-
 scripts/ci/libraries/_push_pull_remove_images.sh   | 117 +--
 scripts/ci/libraries/_script_init.sh   |   2 +-
 scripts/ci/selective_ci_checks.sh  |  10 +-
 22 files changed, 351 insertions(+), 389 deletions(-)

diff --git a/.github/workflows/build-images.yml 
b/.github/workflows/build-images.yml
index ec8f435..c2a9054 100644
--- a/.github/workflows/build-images.yml
+++ b/.github/workflows/build-images.yml
@@ -203,6 +203,10 @@ jobs:
 run: ./scripts/ci/images/ci_prepare_ci_image_on_ci.sh
   - name: "Push CI images ${{ matrix.python-version }}:${{ 
env.TARGET_COMMIT_SHA }}"
 run: ./scripts/ci/images/ci_push_ci_images.sh
+  # Remove me on 15th of August 2021 after all users had chance to rebase
+  - name: "Push Legacy CI images ${{ matrix.python-version }}:${{ 
env.TARGET_COMMIT_SHA }}"
+run: ./scripts/ci/images/ci_push_legacy_ci_images.sh
+if: github.event_name == 'pull_request_target'
 
   build-prod-images:
 permissions:
@@ -229,8 +233,11 @@ jobs:
   VERSION_SUFFIX_FOR_PYPI: ".dev0"
 steps:
   

[GitHub] [airflow] potiuk merged pull request #17356: Switches to "/" convention in ghcr.io images

2021-08-05 Thread GitBox


potiuk merged pull request #17356:
URL: https://github.com/apache/airflow/pull/17356


   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] potiuk commented on a change in pull request #17356: Switches to "/" convention in ghcr.io images

2021-08-05 Thread GitBox


potiuk commented on a change in pull request #17356:
URL: https://github.com/apache/airflow/pull/17356#discussion_r683616163



##
File path: IMAGES.rst
##
@@ -246,19 +246,21 @@ Images with a commit SHA (built for pull requests and 
pushes)
 
 .. code-block:: bash
 
-  ghcr.io/apache/airflow-<BRANCH>-pythonX.Y-ci-v2:<COMMIT_SHA>    - for CI images
-  ghcr.io/apache/airflow-<BRANCH>-pythonX.Y-v2:<COMMIT_SHA>       - for production images
-  ghcr.io/apache/airflow-<BRANCH>-pythonX.Y-build-v2:<COMMIT_SHA> - for production build stage
-  ghcr.io/apache/airflow-python-v2:X.Y-slim-buster-<COMMIT_SHA>   - for base Python images
+  ghcr.io/apache/airflow/<BRANCH>/ci/python<X.Y>:<COMMIT_SHA>     - for CI images
+  ghcr.io/apache/airflow/<BRANCH>/prod/python<X.Y>:<COMMIT_SHA>   - for production images

Review comment:
   The funny thing is that in the GitHub UI the first %2F in the name is 
treated as a separator, even if it is part of the name:
   
   
   ![Screenshot from 2021-08-05 
18-29-26](https://user-images.githubusercontent.com/595491/128387679-db18388f-b454-43ea-b149-5cdc851f1ec8.png)
   




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] ashb commented on a change in pull request #17356: Switches to "/" convention in ghcr.io images

2021-08-05 Thread GitBox


ashb commented on a change in pull request #17356:
URL: https://github.com/apache/airflow/pull/17356#discussion_r683612026



##
File path: scripts/ci/libraries/_parallel.sh
##
@@ -82,9 +85,9 @@ function parallel::monitor_loop() {
   continue
 fi
 
-echo "${COLOR_BLUE}### The last lines for ${parallel_process} 
process: ${directory}/stdout ###${COLOR_RESET}"
+echo "${COLOR_BLUE}### The last ${PARALLEL_TAIL_LENGTH} lines for 
${parallel_process} process: ${directory}/stdout ###${COLOR_RESET}"
 echo
-tail -2 "${directory}/stdout" || true
+tail "-${PARALLEL_TAIL_LENGTH}" "${directory}/stdout" || true

Review comment:
   Yeah, I use that all the time. I just can't read.




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] potiuk commented on a change in pull request #17356: Switches to "/" convention in ghcr.io images

2021-08-05 Thread GitBox


potiuk commented on a change in pull request #17356:
URL: https://github.com/apache/airflow/pull/17356#discussion_r683611390



##
File path: scripts/ci/libraries/_parallel.sh
##
@@ -82,9 +85,9 @@ function parallel::monitor_loop() {
   continue
 fi
 
-echo "${COLOR_BLUE}### The last lines for ${parallel_process} 
process: ${directory}/stdout ###${COLOR_RESET}"
+echo "${COLOR_BLUE}### The last ${PARALLEL_TAIL_LENGTH} lines for 
${parallel_process} process: ${directory}/stdout ###${COLOR_RESET}"
 echo
-tail -2 "${directory}/stdout" || true
+tail "-${PARALLEL_TAIL_LENGTH}" "${directory}/stdout" || true

Review comment:
   `tail -<N>` works the same as `tail -n <N>` 




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] github-actions[bot] commented on pull request #17356: Switches to "/" convention in ghcr.io images

2021-08-05 Thread GitBox


github-actions[bot] commented on pull request #17356:
URL: https://github.com/apache/airflow/pull/17356#issuecomment-893596614


   The PR most likely needs to run full matrix of tests because it modifies 
parts of the core of Airflow. However, committers might decide to merge it 
quickly and take the risk. If they don't merge it quickly - please rebase it to 
the latest main at your convenience, or amend the last commit of the PR, and 
push it with --force-with-lease.


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] ashb commented on a change in pull request #17356: Switches to "/" convention in ghcr.io images

2021-08-05 Thread GitBox


ashb commented on a change in pull request #17356:
URL: https://github.com/apache/airflow/pull/17356#discussion_r683610137



##
File path: scripts/ci/libraries/_parallel.sh
##
@@ -82,9 +85,9 @@ function parallel::monitor_loop() {
   continue
 fi
 
-echo "${COLOR_BLUE}### The last lines for ${parallel_process} 
process: ${directory}/stdout ###${COLOR_RESET}"
+echo "${COLOR_BLUE}### The last ${PARALLEL_TAIL_LENGTH} lines for 
${parallel_process} process: ${directory}/stdout ###${COLOR_RESET}"
 echo
-tail -2 "${directory}/stdout" || true
+tail "-${PARALLEL_TAIL_LENGTH}" "${directory}/stdout" || true

Review comment:
    I _swore_ I saw that as `-s${...`
   
    




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] potiuk commented on a change in pull request #17356: Switches to "/" convention in ghcr.io images

2021-08-05 Thread GitBox


potiuk commented on a change in pull request #17356:
URL: https://github.com/apache/airflow/pull/17356#discussion_r683609643



##
File path: scripts/ci/libraries/_parallel.sh
##
@@ -82,9 +85,9 @@ function parallel::monitor_loop() {
   continue
 fi
 
-echo "${COLOR_BLUE}### The last lines for ${parallel_process} 
process: ${directory}/stdout ###${COLOR_RESET}"
+echo "${COLOR_BLUE}### The last ${PARALLEL_TAIL_LENGTH} lines for 
${parallel_process} process: ${directory}/stdout ###${COLOR_RESET}"
 echo
-tail -2 "${directory}/stdout" || true
+tail "-${PARALLEL_TAIL_LENGTH}" "${directory}/stdout" || true

Review comment:
   No -s there :) 




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] potiuk commented on a change in pull request #17356: Switches to "/" convention in ghcr.io images

2021-08-05 Thread GitBox


potiuk commented on a change in pull request #17356:
URL: https://github.com/apache/airflow/pull/17356#discussion_r683608694



##
File path: IMAGES.rst
##
@@ -246,19 +246,21 @@ Images with a commit SHA (built for pull requests and 
pushes)
 
 .. code-block:: bash
 
-  ghcr.io/apache/airflow-<BRANCH>-pythonX.Y-ci-v2:<COMMIT_SHA>    - for CI images
-  ghcr.io/apache/airflow-<BRANCH>-pythonX.Y-v2:<COMMIT_SHA>       - for production images
-  ghcr.io/apache/airflow-<BRANCH>-pythonX.Y-build-v2:<COMMIT_SHA> - for production build stage
-  ghcr.io/apache/airflow-python-v2:X.Y-slim-buster-<COMMIT_SHA>   - for base Python images
+  ghcr.io/apache/airflow/<BRANCH>/ci/python<X.Y>:<COMMIT_SHA>     - for CI images
+  ghcr.io/apache/airflow/<BRANCH>/prod/python<X.Y>:<COMMIT_SHA>   - for production images

Review comment:
   indeed. Even if it's "sort of done" - only the first "/" is real 
separator - the others are just part of the name and translated to %2F in URL 
:D. The URLs how to get the packages look weird:
   
   
https://github.com/apache/airflow/pkgs/container/airflow%2Fv2-1-test%2Fprod-build%2Fpython3.8




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] ashb commented on a change in pull request #17356: Switches to "/" convention in ghcr.io images

2021-08-05 Thread GitBox


ashb commented on a change in pull request #17356:
URL: https://github.com/apache/airflow/pull/17356#discussion_r683608098



##
File path: scripts/ci/libraries/_parallel.sh
##
@@ -82,9 +85,9 @@ function parallel::monitor_loop() {
   continue
 fi
 
-echo "${COLOR_BLUE}### The last lines for ${parallel_process} 
process: ${directory}/stdout ###${COLOR_RESET}"
+echo "${COLOR_BLUE}### The last ${PARALLEL_TAIL_LENGTH} lines for 
${parallel_process} process: ${directory}/stdout ###${COLOR_RESET}"
 echo
-tail -2 "${directory}/stdout" || true
+tail "-${PARALLEL_TAIL_LENGTH}" "${directory}/stdout" || true

Review comment:
   `tail -s` is sleep. Didn't you mean `tail -n`?




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] ashb commented on a change in pull request #17356: Switches to "/" convention in ghcr.io images

2021-08-05 Thread GitBox


ashb commented on a change in pull request #17356:
URL: https://github.com/apache/airflow/pull/17356#discussion_r683606425



##
File path: IMAGES.rst
##
@@ -246,19 +246,21 @@ Images with a commit SHA (built for pull requests and 
pushes)
 
 .. code-block:: bash
 
-  ghcr.io/apache/airflow-<BRANCH>-pythonX.Y-ci-v2:<COMMIT_SHA>    - for CI images
-  ghcr.io/apache/airflow-<BRANCH>-pythonX.Y-v2:<COMMIT_SHA>       - for production images
-  ghcr.io/apache/airflow-<BRANCH>-pythonX.Y-build-v2:<COMMIT_SHA> - for production build stage
-  ghcr.io/apache/airflow-python-v2:X.Y-slim-buster-<COMMIT_SHA>   - for base Python images
+  ghcr.io/apache/airflow/<BRANCH>/ci/python<X.Y>:<COMMIT_SHA>     - for CI images
+  ghcr.io/apache/airflow/<BRANCH>/prod/python<X.Y>:<COMMIT_SHA>   - for production images

Review comment:
   I'm glad they enabled this, it's so much nicer




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] ashb commented on pull request #17356: Switches to "/" convention in ghcr.io images

2021-08-05 Thread GitBox


ashb commented on pull request #17356:
URL: https://github.com/apache/airflow/pull/17356#issuecomment-893591363


   Oh yes, this one is waiting for review


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] potiuk commented on pull request #17356: Switches to "/" convention in ghcr.io images

2021-08-05 Thread GitBox


potiuk commented on pull request #17356:
URL: https://github.com/apache/airflow/pull/17356#issuecomment-893590266


   It's all complete from my side :)


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] ashb commented on pull request #17356: Switches to "/" convention in ghcr.io images

2021-08-05 Thread GitBox


ashb commented on pull request #17356:
URL: https://github.com/apache/airflow/pull/17356#issuecomment-893583079


   Let me know where you get to, and I'll try and push this over the line if we 
don't get it merged before then.


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] ashb commented on a change in pull request #17451: Add date format filters to Jinja environment

2021-08-05 Thread GitBox


ashb commented on a change in pull request #17451:
URL: https://github.com/apache/airflow/pull/17451#discussion_r683589210



##
File path: docs/apache-airflow/index.rst
##
@@ -110,7 +110,7 @@ unit of work and continuity.
 
 Operators and hooks 
 CLI 
-Macros 
+Templates 

Review comment:
   I decided to rename the page, as it was already talking about more than 
just the macros, but by adding filters there too it is even more true.

##
File path: docs/apache-airflow/redirects.txt
##
@@ -44,6 +44,7 @@ start.rst start/index.rst
 cli-ref.rst cli-and-env-variables-ref.rst
 _api/index.rst python-api-ref.rst
 rest-api-ref.rst deprecated-rest-api-ref.rst
+macros-ref.rst templates-ref.rst

Review comment:
   Old url is redirected.




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] ashb commented on a change in pull request #17451: Add date format filters to Jinja environment

2021-08-05 Thread GitBox


ashb commented on a change in pull request #17451:
URL: https://github.com/apache/airflow/pull/17451#discussion_r683586341



##
File path: airflow/macros/__init__.py
##
@@ -15,7 +15,6 @@
 # KIND, either express or implied.  See the License for the
 # specific language governing permissions and limitations
 # under the License.
-"""Macros."""

Review comment:
   
![image](https://user-images.githubusercontent.com/34150/128382152-35023b5f-352f-430f-a065-1de39b51553c.png)
   
   This showed up as `Macros` in the docs output, and now that we don't have 
pylint this should be fine :)




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] ashb opened a new pull request #17451: Add date format filters to Jinja environment

2021-08-05 Thread GitBox


ashb opened a new pull request #17451:
URL: https://github.com/apache/airflow/pull/17451


   On its own this doesn't add all that much, but this is preparatory work
   to be combined with the new `data_interval_start` variables we are
   adding, without having to add the ds/ts/no-dash etc permutations of all
   of them.
   
   
   
   
   ---
   **^ Add meaningful description above**
   
   Read the **[Pull Request 
Guidelines](https://github.com/apache/airflow/blob/main/CONTRIBUTING.rst#pull-request-guidelines)**
 for more information.
   In case of fundamental code change, Airflow Improvement Proposal 
([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals))
 is needed.
   In case of a new dependency, check compliance with the [ASF 3rd Party 
License Policy](https://www.apache.org/legal/resolved.html#category-x).
   In case of backwards incompatible changes please leave a note in 
[UPDATING.md](https://github.com/apache/airflow/blob/main/UPDATING.md).


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] potiuk edited a comment on pull request #17356: Switches to "/" convention in ghcr.io images

2021-08-05 Thread GitBox


potiuk edited a comment on pull request #17356:
URL: https://github.com/apache/airflow/pull/17356#issuecomment-893561416


   I'd love to merge it in to get the latest tests of merged v2-1-test @ashb -> 
going for a week of vacations tomorrow, I hope GitHub will solve Packages 
problems as well till then


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] potiuk commented on pull request #17356: Switches to "/" convention in ghcr.io images

2021-08-05 Thread GitBox


potiuk commented on pull request #17356:
URL: https://github.com/apache/airflow/pull/17356#issuecomment-893561416


   I'd love to merge it in to get the latest tests of merged v2-1-tests @ashb 
-> going for a week of vacations tomorrow, I hope GitHub will solve Packages 
problems as well till then


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] andrewgodwin commented on pull request #17431: Add back missing permissions to UserModelView controls.

2021-08-05 Thread GitBox


andrewgodwin commented on pull request #17431:
URL: https://github.com/apache/airflow/pull/17431#issuecomment-893559902


   @uranusjr Given this is blocking a release, I'd suggest it gets landed and 
then new test refactoring can be done separately.


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] kosteev opened a new pull request #17450: Add missing menu access for dag dependencies and configurations pages

2021-08-05 Thread GitBox


kosteev opened a new pull request #17450:
URL: https://github.com/apache/airflow/pull/17450


   closes: #17449 
   
   Add missing menu access for following pages for "Op" role:
   - "Browse" -> "DAG Dependencies"
   - "Admin" -> "Configurations"
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] kosteev opened a new issue #17449: Missing menu items in navigation panel for "Op" role

2021-08-05 Thread GitBox


kosteev opened a new issue #17449:
URL: https://github.com/apache/airflow/issues/17449


   **Apache Airflow version**: 2.1.1
   
   **What happened**:
   
   For a user with the "Op" role, the following menu items are not visible in 
the navigation panel, even though the pages themselves are accessible (the 
role has access to them):
   - "Browse" -> "DAG Dependencies"
   - "Admin" -> "Configurations"
   
   **What you expected to happen**: the menu items to be available in the 
navigation panel


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[airflow-site] branch gh-pages updated: Deploying to gh-pages from @ c401267b05ac5c520c29882fafd4c3463451f268 

2021-08-05 Thread github-bot
This is an automated email from the ASF dual-hosted git repository.

github-bot pushed a commit to branch gh-pages
in repository https://gitbox.apache.org/repos/asf/airflow-site.git


The following commit(s) were added to refs/heads/gh-pages by this push:
 new 46aabfa  Deploying to gh-pages from  @ 
c401267b05ac5c520c29882fafd4c3463451f268 
46aabfa is described below

commit 46aabfa5f37c9c3dbd0d8c1963eae9ccf643ea95
Author: potiuk 
AuthorDate: Thu Aug 5 15:30:11 2021 +

Deploying to gh-pages from  @ c401267b05ac5c520c29882fafd4c3463451f268 
---
 blog/airflow-1.10.10/index.html|   4 +-
 blog/airflow-1.10.12/index.html|   4 +-
 blog/airflow-1.10.8-1.10.9/index.html  |   4 +-
 blog/airflow-survey-2020/index.html|   4 +-
 blog/airflow-survey/index.html |   4 +-
 blog/airflow-two-point-oh-is-here/index.html   |   4 +-
 blog/airflow_summit_2021/index.html|   4 +-
 blog/announcing-new-website/index.html |   4 +-
 blog/apache-airflow-for-newcomers/index.html   |   4 +-
 .../index.html |   4 +-
 .../index.html |   4 +-
 .../index.html |   4 +-
 .../index.html |   4 +-
 .../index.html |   4 +-
 .../index.html |   4 +-
 index.html |  32 
 search/index.html  |   4 +-
 sitemap.xml|  88 ++---
 use-cases/adobe/index.html |   4 +-
 use-cases/big-fish-games/index.html|   4 +-
 use-cases/dish/index.html  |   4 +-
 use-cases/experity/index.html  |   4 +-
 use-cases/onefootball/index.html   |   4 +-
 use-cases/plarium-krasnodar/index.html |   4 +-
 use-cases/seniorlink/index.html|   4 +-
 use-cases/sift/index.html  |   4 +-
 usecase-logos/seniorlink-logo.png  | Bin 6835 -> 0 bytes
 usecase-logos/seniorlink-logo.svg  |  24 ++
 28 files changed, 132 insertions(+), 108 deletions(-)

[per-file hunks omitted: the HTML tag content in this diff was stripped by the
mail archive; the changed files are listed in the summary above]

[airflow-site] branch main updated: Change Seniorlink logo to svg from png (#463)

2021-08-05 Thread potiuk
This is an automated email from the ASF dual-hosted git repository.

potiuk pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/airflow-site.git


The following commit(s) were added to refs/heads/main by this push:
 new c401267  Change Seniorlink logo to svg from png (#463)
c401267 is described below

commit c401267b05ac5c520c29882fafd4c3463451f268
Author: Christopher Petrino 
AuthorDate: Thu Aug 5 11:22:08 2021 -0400

Change Seniorlink logo to svg from png (#463)
---
 .../site/static/usecase-logos/seniorlink-logo.png  | Bin 6835 -> 0 bytes
 .../site/static/usecase-logos/seniorlink-logo.svg  |  24 +
 2 files changed, 24 insertions(+)

diff --git a/landing-pages/site/static/usecase-logos/seniorlink-logo.png 
b/landing-pages/site/static/usecase-logos/seniorlink-logo.png
deleted file mode 100644
index 89ee6b2..000
Binary files a/landing-pages/site/static/usecase-logos/seniorlink-logo.png and 
/dev/null differ
diff --git a/landing-pages/site/static/usecase-logos/seniorlink-logo.svg 
b/landing-pages/site/static/usecase-logos/seniorlink-logo.svg
new file mode 100644
index 000..cd5df9e
--- /dev/null
+++ b/landing-pages/site/static/usecase-logos/seniorlink-logo.svg
@@ -0,0 +1,24 @@
+[24 lines of SVG markup for the "sl-logo_color" logo omitted: element tags
+were stripped by the mail archive]

[GitHub] [airflow] potiuk commented on pull request #17356: Switches to "/" convention in ghcr.io images

2021-08-05 Thread GitBox


potiuk commented on pull request #17356:
URL: https://github.com/apache/airflow/pull/17356#issuecomment-893548359


   All looks good - some temporary errors only. 


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] iostreamdoth edited a comment on pull request #17446: Add support of `path` parameter for GCloud Storage Transfer Service operators

2021-08-05 Thread GitBox


iostreamdoth edited a comment on pull request #17446:
URL: https://github.com/apache/airflow/pull/17446#issuecomment-893547790


   I see now. 
   PATH was missing earlier from transfer spec for gcs data sink.


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] iostreamdoth commented on pull request #17446: Add support of `path` parameter for GCloud Storage Transfer Service operators

2021-08-05 Thread GitBox


iostreamdoth commented on pull request #17446:
URL: https://github.com/apache/airflow/pull/17446#issuecomment-893547790


   I see now. 
   PATH: self.gcs_path was missing earlier from transfer spec.


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow-site] potiuk merged pull request #463: Change Seniorlink logo to svg from png

2021-08-05 Thread GitBox


potiuk merged pull request #463:
URL: https://github.com/apache/airflow-site/pull/463


   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] iostreamdoth commented on pull request #17446: Add support of `path` parameter for GCloud Storage Transfer Service operators

2021-08-05 Thread GitBox


iostreamdoth commented on pull request #17446:
URL: https://github.com/apache/airflow/pull/17446#issuecomment-893530997


   Wondering if that should be part of transfer_options.


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[airflow] branch main updated (82bda99 -> e478999)

2021-08-05 Thread ephraimanierobi
This is an automated email from the ASF dual-hosted git repository.

ephraimanierobi pushed a change to branch main
in repository https://gitbox.apache.org/repos/asf/airflow.git.


from 82bda99  Fix Google Cloud Operators docs (#17440)
 add e478999  Quarantine test_process_sigterm_works_with_retries and 
test_task_sigkill_works_with_retries in TestLocalTaskJob (#17441)

No new revisions were added by this update.

Summary of changes:
 tests/jobs/test_local_task_job.py | 2 ++
 1 file changed, 2 insertions(+)


[GitHub] [airflow] ephraimbuddy merged pull request #17441: Quarantine test_process_sigterm_works_with_retries and test_task_sigkill_works_with_retries in TestLocalTaskJob

2021-08-05 Thread GitBox


ephraimbuddy merged pull request #17441:
URL: https://github.com/apache/airflow/pull/17441


   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] JavierLopezT opened a new pull request #17448: Aws secrets manager backend

2021-08-05 Thread GitBox


JavierLopezT opened a new pull request #17448:
URL: https://github.com/apache/airflow/pull/17448


   One of the main advantages of using AWS Secrets Manager is its ability to 
automatically create secrets for RDS and Redshift databases. Those secrets 
consist of several keys with their values, e.g. user, pass, etc. It is also 
common to store API keys, SFTP credentials, and so on as separate key/value 
pairs, as shown in the picture below:
   ![Captura de pantalla 2020-05-22 a las 10 41 
07](https://user-images.githubusercontent.com/11339132/82648933-c23ac100-9c18-11ea-9f7c-6a36d0333bbe.png)
   
   With the current code, all the keys and values obtained from a secret are 
stored in the schema attribute of the conn object, unless you have just one key 
with the conn_uri in the value. Thus, the current behaviour forces you to use 
Secrets Manager in a way it was not intended to be used.
   
   With this proposed modification, you can use AWS Secrets Manager with keys 
and values, with some freedom to choose different names for each key while 
still having get_conn work.
   
   Third attempt. Coming from here: https://github.com/apache/airflow/pull/15104
   @xinbinhuang I have tried to implement all your suggestions (thanks!) but 
regarding https://github.com/apache/airflow/pull/15104#discussion_r606090823 I 
received an error that Connection could not be imported. Maybe it is because I 
am using the secrets backend replacing the original file with a COPY in the 
Dockerfile, I dunno
   @ dstandish Regarding 
https://github.com/apache/airflow/pull/15104#issuecomment-812596462, I like 
your suggestion but I would rather keep the code as it is now. It's been a 
while since my first pull request for this (more than a year 
https://github.com/apache/airflow/pull/9008) and I want to have it merged ASAP
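   
   To make the key/value style concrete, a hypothetical secret payload of the
   kind described above (all names and values invented):
   
   ```python
   import json
   
   # e.g. stored under a secret named "airflow/connections/my_redshift"
   secret_value = {
       "conn_type": "redshift",
       "host": "examplecluster.abc123.us-east-1.redshift.amazonaws.com",
       "port": 5439,
       "user": "awsuser",
       "password": "not-a-real-password",
       "schema": "dev",
   }
   print(json.dumps(secret_value))
   ```
   
   The point of the change is that each of those keys maps onto the
   corresponding Connection field instead of the whole blob landing in `schema`.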


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



