[airflow] branch master updated: Fix S3ToSnowflakeOperator docstring (#12504)
This is an automated email from the ASF dual-hosted git repository.

xddeng pushed a commit to branch master in repository https://gitbox.apache.org/repos/asf/airflow.git

The following commit(s) were added to refs/heads/master by this push:
     new 234d689  Fix S3ToSnowflakeOperator docstring (#12504)
234d689 is described below

commit 234d689387ef89222bfdee481987c37d1e79b5af
Author: Kengo Seki
AuthorDate: Sat Nov 21 16:27:53 2020 +0900

    Fix S3ToSnowflakeOperator docstring (#12504)

    There's a parameter called s3_bucket in its docstring, but it doesn't actually exist; the stage parameter exists instead.
---
 airflow/providers/snowflake/transfers/s3_to_snowflake.py | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)

diff --git a/airflow/providers/snowflake/transfers/s3_to_snowflake.py b/airflow/providers/snowflake/transfers/s3_to_snowflake.py
index 8461ef7..8a3da4c 100644
--- a/airflow/providers/snowflake/transfers/s3_to_snowflake.py
+++ b/airflow/providers/snowflake/transfers/s3_to_snowflake.py
@@ -36,8 +36,8 @@ class S3ToSnowflakeOperator(BaseOperator):
     :type s3_keys: list
     :param table: reference to a specific table in snowflake database
     :type table: str
-    :param s3_bucket: reference to a specific S3 bucket
-    :type s3_bucket: str
+    :param stage: reference to a specific snowflake stage
+    :type stage: str
     :param file_format: reference to a specific file format
     :type file_format: str
     :param schema: reference to a specific schema in snowflake database
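The effect of the fix can be illustrated with a minimal stub (a sketch only — the real operator lives in `airflow/providers/snowflake/transfers/s3_to_snowflake.py` and takes more parameters than shown here):

```python
class S3ToSnowflakeOperator:
    """
    Executes a COPY command to load files from S3 to Snowflake.

    :param s3_keys: reference to a list of S3 keys
    :type s3_keys: list
    :param table: reference to a specific table in snowflake database
    :type table: str
    :param stage: reference to a specific snowflake stage
    :type stage: str
    :param file_format: reference to a specific file format
    :type file_format: str
    """

    def __init__(self, s3_keys, table, stage, file_format):
        # After the fix, the documented parameters match the real signature:
        # there is no s3_bucket argument, only a stage.
        self.s3_keys = s3_keys
        self.table = table
        self.stage = stage
        self.file_format = file_format
```

The point of the patch is simply that the docstring now documents `stage` (which exists) instead of `s3_bucket` (which never did).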
[GitHub] [airflow] github-actions[bot] commented on pull request #12504: Fix S3ToSnowflakeOperator docstring
github-actions[bot] commented on pull request #12504: URL: https://github.com/apache/airflow/pull/12504#issuecomment-731522486 The PR should be OK to be merged with just a subset of tests, as it does not modify the core of Airflow. The committers might merge it, or can add the label 'full tests needed' and re-run it to run all tests if they see it is needed! This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: us...@infra.apache.org
[GitHub] [airflow] XD-DENG merged pull request #12504: Fix S3ToSnowflakeOperator docstring
XD-DENG merged pull request #12504: URL: https://github.com/apache/airflow/pull/12504
[GitHub] [airflow] XD-DENG commented on a change in pull request #12516: Housekeeping for www/security.py
XD-DENG commented on a change in pull request #12516: URL: https://github.com/apache/airflow/pull/12516#discussion_r528113942

## File path: airflow/www/security.py

```
@@ -520,7 +520,6 @@ def update_admin_perm_view(self):
         :return: None.
         """
-        all_dag_view = self.find_view_menu(permissions.RESOURCE_DAG)
         dag_pvs = (
             self.get_session.query(sqla_models.ViewMenu)
             .filter(sqla_models.ViewMenu.name.like(f"{permissions.RESOURCE_DAG_PREFIX}%"))
```

Review comment: There are 3 types of ViewMenus in this context:
- type-1: non-DAG views.
- type-2: the single view corresponding to "All DAGs". This is marked as ["DAGs"](https://github.com/apache/airflow/blob/master/airflow/security/permissions.py#L23) (`permissions.RESOURCE_DAG`). Users who have permission to this ViewMenu have access to all DAGs.
- type-3: per-DAG views, whose names start with ["DAG:"](https://github.com/apache/airflow/blob/master/airflow/security/permissions.py#L24) (`permissions.RESOURCE_DAG_PREFIX`), for example "DAG:example_bash_operator", "DAG:example_python_operator", etc.

As indicated in the docstring ("_Admin should have all the permission-views, except the dag views, because Admin already has Dags permission_"), for `Admin` we only need to assign type-1 + type-2 (type-2 already covers all type-3 ViewMenus). Given the different string structures of `permissions.RESOURCE_DAG` and `permissions.RESOURCE_DAG_PREFIX` ("`DAGs`" and "`DAG:`"), if we get all entries of type-3 and then filter them out (`.filter(~...)`), we naturally get type-1 + type-2. That's why I find the whole method can be simplified.
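The filtering logic described in the review comment can be illustrated without Airflow itself (a plain-Python sketch; the real code filters `sqla_models.ViewMenu` rows with SQLAlchemy's `~` negation rather than a list comprehension):

```python
RESOURCE_DAG = "DAGs"         # type-2: the single "all DAGs" view menu
RESOURCE_DAG_PREFIX = "DAG:"  # type-3: per-DAG view menus start with this

view_menus = [
    "Connections",                  # type-1: a non-DAG view
    RESOURCE_DAG,                   # type-2
    "DAG:example_bash_operator",    # type-3
    "DAG:example_python_operator",  # type-3
]

# Excluding everything that starts with "DAG:" leaves type-1 + type-2,
# which is exactly what Admin needs ("DAGs" already covers every DAG).
admin_view_menus = [vm for vm in view_menus if not vm.startswith(RESOURCE_DAG_PREFIX)]
```

Note that `"DAGs"` does not start with `"DAG:"`, so the single negated prefix filter cleanly separates the two groups.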
[GitHub] [airflow] houqp opened a new pull request #12530: fix dag serialization crash caused by preset DagContext
houqp opened a new pull request #12530: URL: https://github.com/apache/airflow/pull/12530 DagContext should not cause any side effect for `BaseOperator.get_serialized_fields`. This prevents a serialization crash caused by the use of `DagContext.push_context_managed_dag` in a DAG definition.
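The intent can be sketched in plain Python (all names below are hypothetical stand-ins, not the real Airflow classes): the set of serialized fields must be computed from the class definition alone, so that a DAG preset on the global context stack cannot change the result.

```python
class DagContext:
    """Hypothetical stand-in for airflow.models.dag.DagContext."""

    _context_stack = []

    @classmethod
    def push_context_managed_dag(cls, dag):
        cls._context_stack.append(dag)

    @classmethod
    def pop_context_managed_dag(cls):
        return cls._context_stack.pop()


class BaseOperator:
    """Hypothetical stand-in: serialized fields derive from the class only."""

    _serialized_fields = None

    @classmethod
    def get_serialized_fields(cls):
        # Computed once from the class definition; never consults DagContext,
        # so a DAG pushed via push_context_managed_dag cannot alter (or crash)
        # the serialization result.
        if cls._serialized_fields is None:
            cls._serialized_fields = frozenset(
                name
                for name in ("task_id", "owner", "dag")
                if name != "dag"  # the back-reference to the DAG is excluded
            )
        return cls._serialized_fields
```

Pushing a context-managed DAG before calling `get_serialized_fields` then has no effect on the returned set.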
[GitHub] [airflow] github-actions[bot] commented on pull request #12526: Fix git archive command in Release Management guide
github-actions[bot] commented on pull request #12526: URL: https://github.com/apache/airflow/pull/12526#issuecomment-731513095 The PR is ready to be merged. No tests are needed!
[GitHub] [airflow] mik-laj opened a new pull request #12529: Fix build on RTD
mik-laj opened a new pull request #12529: URL: https://github.com/apache/airflow/pull/12529 Related: https://github.com/apache/airflow/pull/12444 RTD Build: https://readthedocs.org/projects/airflow/builds/12384530/ Unfortunately, debugging on RTD is limited and I couldn't easily test this change before. --- **^ Add meaningful description above** Read the **[Pull Request Guidelines](https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst#pull-request-guidelines)** for more information. In case of fundamental code change, Airflow Improvement Proposal ([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals)) is needed. In case of a new dependency, check compliance with the [ASF 3rd Party License Policy](https://www.apache.org/legal/resolved.html#category-x). In case of backwards incompatible changes please leave a note in [UPDATING.md](https://github.com/apache/airflow/blob/master/UPDATING.md).
[GitHub] [airflow] zhongjiajie edited a comment on pull request #8231: Dag bulk_sync_to_db dag_tag only remove not exists
zhongjiajie edited a comment on pull request #8231: URL: https://github.com/apache/airflow/pull/8231#issuecomment-731512094 > @zhongjiajie can you rebase please? It would be awesome to have it merged, so let us know if you need any help 👍 I rebased it; it has been quite a long time, so sorry about that.
[GitHub] [airflow] zhongjiajie commented on pull request #8231: Dag bulk_sync_to_db dag_tag only remove not exists
zhongjiajie commented on pull request #8231: URL: https://github.com/apache/airflow/pull/8231#issuecomment-731512094 I reopened it; it has been quite a long time, so sorry about that.
[GitHub] [airflow] mik-laj opened a new pull request #12528: Add example DAGs to provider docs
mik-laj opened a new pull request #12528: URL: https://github.com/apache/airflow/pull/12528 Sample DAGs are valuable learning materials, so it's worth promoting them more.
[GitHub] [airflow] gfeldman edited a comment on issue #12341: KubernetesExecutor single task run error: Only works with the Celery or Kubernetes executors, sorry
gfeldman edited a comment on issue #12341: URL: https://github.com/apache/airflow/issues/12341#issuecomment-731509781 Confirming @mpolatcan's diagnosis: changing the import from `airflow.contrib.executors.kubernetes_executor` to `airflow.executors.kubernetes_executor` in `airflow/www/views.py` and `airflow/www_rbac/views.py` resolves the issue.
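The fix described above is a one-line path rewrite from the deprecated `contrib` module to its new location; as a sketch:

```python
# Deprecated 1.x path vs. the canonical path used in airflow/www/views.py.
OLD_MODULE = "airflow.contrib.executors.kubernetes_executor"
NEW_MODULE = "airflow.executors.kubernetes_executor"

def fix_import(line: str) -> str:
    """Rewrite an import statement from the old contrib path to the new one."""
    return line.replace(OLD_MODULE, NEW_MODULE)

fixed = fix_import(f"from {OLD_MODULE} import KubernetesExecutor")
```

The class name is unchanged; only the module path moved when the `contrib` namespace was retired.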
[GitHub] [airflow] gfeldman commented on issue #12341: KubernetesExecutor single task run error: Only works with the Celery or Kubernetes executors, sorry
gfeldman commented on issue #12341: URL: https://github.com/apache/airflow/issues/12341#issuecomment-731509781 Confirming @mpolatcan's diagnosis: changing the import under `airflow/www/views.py` and `airflow/www_rbac/views.py` resolves the issue.
[jira] [Commented] (AIRFLOW-6981) Move Google Cloud Build from Discovery API to Python Library
[ https://issues.apache.org/jira/browse/AIRFLOW-6981?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17236586#comment-17236586 ]

ASF GitHub Bot commented on AIRFLOW-6981:
-----------------------------------------

ryanyuan commented on pull request #8575: URL: https://github.com/apache/airflow/pull/8575#issuecomment-731509034 @mik-laj Please let me know when you are going to merge this PR. I will do the rebase again.

> Move Google Cloud Build from Discovery API to Python Library
> ------------------------------------------------------------
>
>                 Key: AIRFLOW-6981
>                 URL: https://issues.apache.org/jira/browse/AIRFLOW-6981
>             Project: Apache Airflow
>          Issue Type: Improvement
>          Components: gcp
>    Affects Versions: 2.0.0
>            Reporter: Ryan Yuan
>            Assignee: Ryan Yuan
>            Priority: Major

--
This message was sent by Atlassian Jira
(v8.3.4#803005)
[GitHub] [airflow] ryanyuan commented on pull request #8575: [AIRFLOW-6981] Move Google Cloud Build from Discovery API to Python Library
ryanyuan commented on pull request #8575: URL: https://github.com/apache/airflow/pull/8575#issuecomment-731509034 @mik-laj Please let me know when you are going to merge this PR. I will do the rebase again.
[GitHub] [airflow] mik-laj opened a new pull request #12527: Move providers references to separate packages
mik-laj opened a new pull request #12527: URL: https://github.com/apache/airflow/pull/12527
[GitHub] [airflow] fritzb edited a comment on issue #10605: Use private docker repository with K8S operator and XCOM sidecar container
fritzb edited a comment on issue #10605: URL: https://github.com/apache/airflow/issues/10605#issuecomment-731501546 Docker Inc started image pull rate limiting around October 30th 2020 (https://www.docker.com/blog/checking-your-current-docker-pull-rate-limits-and-status/). As a result, all KubernetesPodOperator tasks with XCOM started to fail in large deployments. Can we have the fix in the Airflow 1.10 branch as well? Based on the severity of the failure, the bug priority should be raised higher.
```
  - image: alpine
    imageID: ""
    lastState: {}
    name: airflow-xcom-sidecar
    ready: false
    restartCount: 0
    started: false
    state:
      waiting:
        message: 'rpc error: code = Unknown desc = Error response from daemon: toomanyrequests:
          You have reached your pull rate limit. You may increase the limit by authenticating
          and upgrading: https://www.docker.com/increase-rate-limit'
        reason: ErrImagePull
```
[GitHub] [airflow] fritzb commented on issue #10605: Use private docker repository with K8S operator and XCOM sidecar container
fritzb commented on issue #10605: URL: https://github.com/apache/airflow/issues/10605#issuecomment-731501546 Docker Inc started image pull rate limiting around October 30th 2020. As a result, all KubernetesPodOperator tasks with XCOM started to fail in large deployments. Can we backport this fix to the Airflow 1.10 branch? Based on the severity of the failure, the bug priority should be raised higher.
```
  - image: alpine
    imageID: ""
    lastState: {}
    name: airflow-xcom-sidecar
    ready: false
    restartCount: 0
    started: false
    state:
      waiting:
        message: 'rpc error: code = Unknown desc = Error response from daemon: toomanyrequests:
          You have reached your pull rate limit. You may increase the limit by authenticating
          and upgrading: https://www.docker.com/increase-rate-limit'
        reason: ErrImagePull
```
[jira] [Comment Edited] (AIRFLOW-4878) Allow alternate docker image for kubernetes xcom sidecar
[ https://issues.apache.org/jira/browse/AIRFLOW-4878?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17236574#comment-17236574 ]

Fritz Budiyanto edited comment on AIRFLOW-4878 at 11/21/20, 3:33 AM:
---------------------------------------------------------------------

Docker Inc started image pull rate limiting around October 30th 2020. As a result, all KubernetesPodOperator tasks with XCOM started to fail in large deployments. Can we backport this fix to the Airflow 1.10 branch? Based on the severity of the failure, the bug priority should be raised higher.

  - image: alpine
    imageID: ""
    lastState: {}
    name: airflow-xcom-sidecar
    ready: false
    restartCount: 0
    started: false
    state:
      waiting:
        message: 'rpc error: code = Unknown desc = Error response from daemon: toomanyrequests:
          You have reached your pull rate limit. You may increase the limit by authenticating
          and upgrading: [https://www.docker.com/increase-rate-limit]'
        reason: ErrImagePull

was (Author: fritzb88):
Docker Inc started image pull rate limiting around October 30th 2020. As a result, all KubernetesPodOperator tasks with XCOM started to fail in large deployments. Can we backport this fix to the Airflow 1.10 branch? Based on the severity of the failure, the bug priority should be raised higher.

> Allow alternate docker image for kubernetes xcom sidecar
> --------------------------------------------------------
>
>                 Key: AIRFLOW-4878
>                 URL: https://issues.apache.org/jira/browse/AIRFLOW-4878
>             Project: Apache Airflow
>          Issue Type: Improvement
>          Components: operators, xcom
>    Affects Versions: 2.0.0
>            Reporter: Eric Carlson
>            Priority: Minor
>
> Currently the image used for xcom push is hardcoded to `python:3.5-alpine`:
> [https://github.com/apache/airflow/blob/master/airflow/kubernetes/kubernetes_request_factory/pod_request_factory.py#L98]
> This is a problem in environments that are locked down and do not have access to the public docker registry.
> I propose adding a similar set of config variables and method arguments that specify the image and pull policy of the base container. Specifically:
> * in the config file, adding \{xcom_container_repository, xcom_container_tag, xcom_container_image_pull_policy} to complement the existing \{worker_container_*} variables
> * In the pod operator, adding xcom_image_* arguments to complement the existing image_* arguments (image, image_pull_policy)
> If this proposal is acceptable I have some locally working code that I'll create a PR from
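The proposed configuration could look like the following airflow.cfg fragment (the `xcom_container_*` option names come from the proposal above and are hypothetical — they are not an existing Airflow config section; the registry host is an example):

```ini
[kubernetes]
# Existing options for the worker image:
worker_container_repository = registry.example.com/airflow
worker_container_tag = 1.10.12
worker_container_image_pull_policy = IfNotPresent

# Proposed counterparts for the xcom sidecar image, so locked-down
# clusters need not reach the public Docker registry:
xcom_container_repository = registry.example.com/python
xcom_container_tag = 3.5-alpine
xcom_container_image_pull_policy = IfNotPresent
```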
[jira] [Commented] (AIRFLOW-4878) Allow alternate docker image for kubernetes xcom sidecar
[ https://issues.apache.org/jira/browse/AIRFLOW-4878?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17236574#comment-17236574 ]

Fritz Budiyanto commented on AIRFLOW-4878:
------------------------------------------

Docker Inc started image pull rate limiting around October 30th 2020. As a result, all KubernetesPodOperator tasks with XCOM started to fail in large deployments. Can we backport this fix to the Airflow 1.10 branch? Based on the severity of the failure, the bug priority should be raised higher.

> Allow alternate docker image for kubernetes xcom sidecar
> --------------------------------------------------------
>
>                 Key: AIRFLOW-4878
>                 URL: https://issues.apache.org/jira/browse/AIRFLOW-4878
>             Project: Apache Airflow
>          Issue Type: Improvement
>          Components: operators, xcom
>    Affects Versions: 2.0.0
>            Reporter: Eric Carlson
>            Priority: Minor
>
> Currently the image used for xcom push is hardcoded to `python:3.5-alpine`:
> [https://github.com/apache/airflow/blob/master/airflow/kubernetes/kubernetes_request_factory/pod_request_factory.py#L98]
> This is a problem in environments that are locked down and do not have access to the public docker registry.
> I propose adding a similar set of config variables and method arguments that specify the image and pull policy of the base container. Specifically:
> * in the config file, adding \{xcom_container_repository, xcom_container_tag, xcom_container_image_pull_policy} to complement the existing \{worker_container_*} variables
> * In the pod operator, adding xcom_image_* arguments to complement the existing image_* arguments (image, image_pull_policy)
> If this proposal is acceptable I have some locally working code that I'll create a PR from
[airflow] tag nightly-master updated (e93b7e3 -> f0b9aae)
This is an automated email from the ASF dual-hosted git repository.

github-bot pushed a change to tag nightly-master in repository https://gitbox.apache.org/repos/asf/airflow.git.

*** WARNING: tag nightly-master was modified! ***

    from e93b7e3  (commit)
      to f0b9aae  (commit)

    from e93b7e3  Improvements for transfer operators references (#12482)
     add e9cfa39  Turn off foreign keys before altering table to prevent sqlite issue. (#12487)
     add 4428235  Fixes taskInstances API endpoint when start_date, end_date or state are None(null) (#12453)
     add 7d55d45  Reorder Migrations to make it 1.10.13 compatible (#12496)
     add c3cf695  Unquarantine test_cli_webserver_background (#12501)
     add c34ef85  Separate out documentation building per provider (#12444)
     add 20843ff  Add missing file_token field to get dag details API endpoint (#12463)
     add 36a9b0f  Fix the default value for VaultBackend's config_path (#12518)
     add 4495685  Temporarily allow force-push on v1-10-stable (#12524)
     add fd62f60  Add Energy Solutions to INTHEWILD.md (#12523)
     add f0b9aae  Enable v1-10-stable branch protection (#12525)

No new revisions were added by this update.

Summary of changes:
 .github/workflows/ci.yml                           |  16 +-
 .gitignore                                         |   2 +
 .pre-commit-config.yaml                            |   1 +
 CI.rst                                             |  15 +-
 CONTRIBUTING.rst                                   |  44 +-
 INTHEWILD.md                                       |   1 +
 README.md                                          |   3 +-
 airflow/api_connexion/openapi/v1.yaml              |   7 +-
 airflow/api_connexion/schemas/dag_schema.py        |   9 +
 .../849da589634d_prefix_dag_permissions.py         |   4 +-
 .../versions/92c57b58940d_add_fab_tables.py        |   4 +-
 ...606e2_add_scheduling_decision_to_dagrun_and_.py |  17 +
 .../e38be357a868_update_schema_for_smart_sensor.py |   4 +-
 airflow/provider.yaml.schema.json                  |   5 +
 airflow/providers/amazon/aws/hooks/base_aws.py     |   2 +-
 airflow/providers/amazon/aws/operators/datasync.py |   2 +-
 airflow/providers/amazon/aws/operators/ecs.py      |   2 +-
 airflow/providers/amazon/aws/operators/glacier.py  |   2 +-
 airflow/providers/amazon/aws/sensors/glacier.py    |   4 +
 .../amazon/aws/transfers/glacier_to_gcs.py         |   2 +-
 .../amazon/aws/transfers/imap_attachment_to_s3.py  |   2 +-
 .../amazon/aws/transfers/s3_to_redshift.py         |   2 +-
 airflow/providers/amazon/provider.yaml             |   1 +
 airflow/providers/apache/cassandra/provider.yaml   |   1 +
 .../providers/apache/cassandra/sensors/record.py   |   2 +-
 .../providers/apache/cassandra/sensors/table.py    |   2 +-
 airflow/providers/apache/druid/provider.yaml       |   1 +
 airflow/providers/apache/hdfs/provider.yaml        |   1 +
 airflow/providers/apache/hive/provider.yaml        |   1 +
 airflow/providers/apache/kylin/provider.yaml       |   1 +
 airflow/providers/apache/livy/provider.yaml        |   1 +
 airflow/providers/apache/pig/provider.yaml         |   1 +
 airflow/providers/apache/pinot/provider.yaml       |   1 +
 .../providers/apache/spark/operators/spark_jdbc.py |   2 +-
 .../providers/apache/spark/operators/spark_sql.py  |   2 +-
 .../apache/spark/operators/spark_submit.py         |   2 +-
 airflow/providers/apache/spark/provider.yaml       |   1 +
 airflow/providers/apache/sqoop/provider.yaml       |   1 +
 airflow/providers/celery/provider.yaml             |   1 +
 airflow/providers/cloudant/provider.yaml           |   1 +
 .../providers/cncf/kubernetes/hooks/kubernetes.py  |   2 +-
 .../cncf/kubernetes/operators/kubernetes_pod.py    |   2 +-
 airflow/providers/cncf/kubernetes/provider.yaml    |   1 +
 .../providers/databricks/operators/databricks.py   |   4 +
 airflow/providers/databricks/provider.yaml         |   1 +
 airflow/providers/datadog/provider.yaml            |   1 +
 airflow/providers/dingding/provider.yaml           |   1 +
 airflow/providers/discord/provider.yaml            |   1 +
 airflow/providers/docker/provider.yaml             |   1 +
 airflow/providers/elasticsearch/provider.yaml      |   1 +
 airflow/providers/exasol/provider.yaml             |   1 +
 airflow/providers/facebook/provider.yaml           |   1 +
 airflow/providers/ftp/provider.yaml                |   1 +
 airflow/providers/google/__init__.py               |   5 +-
 .../cloud/hooks/cloud_storage_transfer_service.py  |   2 +-
 .../providers/google/cloud/operators/cloud_sql.py  |   2 +-
 .../google/cloud/utils/field_validator.py          |   2 +-
 .../google/cloud/utils/mlengine_operator_utils.py  |  17 +-
 .../cloud/utils/mlengine_prediction_summary.py     |   4 +-
 airflow/providers/google/provider.yaml             | 109 +--
 airflow/providers/grpc/provider.yaml               |   1 +
 airflow/providers/hashicorp/provider.yaml          |   1 +
 airflow/providers/h
[GitHub] [airflow] kaxil commented on a change in pull request #12516: Housekeeping for www/security.py
kaxil commented on a change in pull request #12516: URL: https://github.com/apache/airflow/pull/12516#discussion_r528044405

## File path: airflow/www/security.py

```
@@ -520,7 +520,6 @@ def update_admin_perm_view(self):
         :return: None.
         """
-        all_dag_view = self.find_view_menu(permissions.RESOURCE_DAG)
         dag_pvs = (
             self.get_session.query(sqla_models.ViewMenu)
             .filter(sqla_models.ViewMenu.name.like(f"{permissions.RESOURCE_DAG_PREFIX}%"))
```

Review comment: Can you give an example over here? cc @jhtimmins
[GitHub] [airflow] kaxil commented on pull request #11850: Add Telegram hook and operator
kaxil commented on pull request #11850: URL: https://github.com/apache/airflow/pull/11850#issuecomment-731480205 tests are failing
[airflow] branch master updated (fd62f60 -> f0b9aae)
This is an automated email from the ASF dual-hosted git repository.

dimberman pushed a change to branch master in repository https://gitbox.apache.org/repos/asf/airflow.git.

    from fd62f60  Add Energy Solutions to INTHEWILD.md (#12523)
     add f0b9aae  Enable v1-10-stable branch protection (#12525)

No new revisions were added by this update.

Summary of changes:
 .asf.yaml | 6 +++---
 1 file changed, 3 insertions(+), 3 deletions(-)
[GitHub] [airflow] kaxil opened a new pull request #12526: Fix git archive command in Release Management guide
kaxil opened a new pull request #12526: URL: https://github.com/apache/airflow/pull/12526 There was a trailing back-tick which I found when cutting 1.10.13rc1
[airflow] branch master updated (4495685 -> fd62f60)
This is an automated email from the ASF dual-hosted git repository.

kaxilnaik pushed a change to branch master in repository https://gitbox.apache.org/repos/asf/airflow.git.

    from 4495685  Temporarily allow force-push on v1-10-stable (#12524)
     add fd62f60  Add Energy Solutions to INTHEWILD.md (#12523)

No new revisions were added by this update.

Summary of changes:
 INTHEWILD.md | 1 +
 1 file changed, 1 insertion(+)
[GitHub] [airflow] dimberman merged pull request #12525: Enable v1-10-stable branch protection
dimberman merged pull request #12525: URL: https://github.com/apache/airflow/pull/12525
[GitHub] [airflow] boring-cyborg[bot] commented on pull request #12523: add Energy Solutions to INTHEWILD.md
boring-cyborg[bot] commented on pull request #12523: URL: https://github.com/apache/airflow/pull/12523#issuecomment-731478429 Awesome work, congrats on your first merged pull request!
[GitHub] [airflow] kaxil merged pull request #12523: add Energy Solutions to INTHEWILD.md
kaxil merged pull request #12523: URL: https://github.com/apache/airflow/pull/12523
[GitHub] [airflow] kaxil opened a new pull request #12525: Enable v1-10-stable branch protection
kaxil opened a new pull request #12525: URL: https://github.com/apache/airflow/pull/12525
[GitHub] [airflow] himabindu07 commented on pull request #11195: 2.0 UI Overhaul/Refresh
himabindu07 commented on pull request #11195: URL: https://github.com/apache/airflow/pull/11195#issuecomment-731477874 I have verified with this image quay.io/astronomer/ap-airflow-dev:2.0.0-buster-onbuild-22919
[airflow] branch v1-10-stable updated (deb7fc0 -> 5b61c21)
This is an automated email from the ASF dual-hosted git repository. kaxilnaik pushed a change to branch v1-10-stable in repository https://gitbox.apache.org/repos/asf/airflow.git. omit deb7fc0 Add setup.cfg for apache-airflow-upgrade-check (#12517) omit d7ace02 Add upgrade check rule to ensure on "latest" versions (#12514) omit 6739b53 Add upgrade rule to check for mesos executor and flag to change it. (#11528) omit 8e5f722 Silence DagBag INFO logs during upgrade check (#12507) omit 47fc5c4 Add readme for upgrade-check "subpackage". (#12506) omit cb0a290 Fix connection upgrade rules so they run with SQLite backend: (#12502) omit 18100a0 Create UndefinedJinjaVariablesRule (Resolves #11144) (#11241) omit fb90c75 Make airflow upgrade_check a command from a separate dist (#12397) omit 0122893 Fixes bad mocking of UUID and removes upper-limitation on sentry omit e7704c5 Marked test_localtaskjob_maintain_heart_rate as xfail omit 74d7e2e Removes stable tests from quarantine (#10768) omit 69db6be Fixes pull error on building tagged image (#12378) omit dbbc072 Add extra info when starting extra actions in Breeze (#12377) omit f184202 Sentry >= 0.19.0 introduces a breaking change omit 88419e6 Limited cryptography to < 3.2 for python 2.7 omit 9424bff All kubernetes tests use the same host python version (#12374) omit bad94b2 Add TargetQueryValue to KEDA Autoscaler (#9748) omit d20392e Fix missing dash in flag for statsd container (#10691) omit 81df5ee Update scheduler deployment - dags volume mount (#10630) omit 8d8e2fd Wrong key in DAGs Persistent Volume Claim (#10627) omit aebfdb5 Add imagePullSecrets to the create user job (#9802) omit d633bac Run create-user-job as user with specified id (#10291) omit 30a3cb0 use the correct claim name in the webserver (#9688) omit c333fa7 Chart: Flower deployment should use Flower image (#10701) omit e255134 Proposal: remove -serviceaccount suffix from KSA names in helm chart (#10892) omit 9632d60 Fix helm unit test for pod_template_file 
(#12345) omit 7135196 Mount airflow.cfg to pod_template_file (#12311) omit 3b33645 Fix indentation for affinities in helm chart (#12288) omit 5bb095f Fix spelling (#12253) omit 2d1c1ba Move metrics configuration to new section - metrics (#12165) omit 1515d64 Fix default values for Helm Chart (#12153) omit 95e14c8 Enable Black - Python Auto Formmatter (#9550) omit ddea86f Use PyUpgrade to use Python 3.6 features (#11447) omit ebe0a9e Add Kubernetes cleanup-pods CLI command for Helm Chart (#11802) omit 99eba6f fix helm scheduler deployment / scheduler logs (#11685) omit a9396bd8 All k8s object must comply with JSON Schema (#12003) omit 6bed3fa fix helm chart worker deployment without kerberos (#11681) omit f057c06 Add Flower Authentication to Helm Chart (#11836) omit 960389f Validate airflow chart values.yaml & values.schema.json (#11990) omit 8db6881 Remove unused value in Helm Chart - podMutation (#11703) omit 3f2f454 Consistent use images in Helm Chart (#11701) omit 0175f7a fix pod launcher rolebinding in helm chart (#11675) omit dcdf5cb Pod template file uses custom custom env variable (#11480) omit 9700fa3 Improvements for pod template file with git sync container (#11511) omit d95bb96 Create job for airflow migrations (#11533) omit 4cd3e95 Add missing values entries to Parameters in chart/README.md (#11477) omit 46ef146 Allow multiple schedulers in helm chart (#11330) omit 1df2dbe Mount volumes and volumemounts into scheduler and workers (#11426) omit 6811b51 Adds missing schema for kerberos sidecar configuration (#11413) omit c2de339 Mutual SSL added in PGBouncer configuration in the Chart (#11384) omit 169a57a Add capability of adding service account annotations to Helm Chart (#11387) omit 8c728cf Add CeleryKubernetesExecutor to helm chart (#11288) omit 07c1cf2 Single/Multi-Namespace mode for helm chart (#11034) omit b359d70 Kubernetes executor can adopt tasks from other schedulers (#10996) omit 294d3bd Enables Kerberos sidecar support (#11130) omit 5f50a93 
Adds Kubernetes Service Account for the webserver (#11131) omit 956e933 Fix gitSync user in the helm Chart (#11127) omit 0bee131 Install cattr on Python 3.7 - Fix docs build on RTD (#12045) omit fa39d24 Removes the cidfile before generation (#12372) omit 59f2f35 Limit version of marshmallow-sqlalchemy omit 6541d49 Bump attrs and cattrs dependencies (#11969) omit 1475110 Bump attrs to > 20.0 (#11799) omit e15b0be Synchronize INTHEWILD.md with master omit c72954e Simplifies check whether the CI image should be rebuilt (#12181) omit ca7cc59 For v1-10-test PRs and pushes, use target branch scripts for images (#12339) omit e2005ae Deploy was not work
[airflow] branch master updated (36a9b0f -> 4495685)
This is an automated email from the ASF dual-hosted git repository. dimberman pushed a change to branch master in repository https://gitbox.apache.org/repos/asf/airflow.git. from 36a9b0f Fix the default value for VaultBackend's config_path (#12518) add 4495685 Temporarily allow force-push on v1-10-stable (#12524) No new revisions were added by this update. Summary of changes: .asf.yaml | 6 +++--- 1 file changed, 3 insertions(+), 3 deletions(-)
[GitHub] [airflow] dimberman merged pull request #12524: Temporarily allow force-push on v1-10-stable
dimberman merged pull request #12524: URL: https://github.com/apache/airflow/pull/12524
[GitHub] [airflow] github-actions[bot] commented on pull request #12523: add Energy Solutions to INTHEWILD.md
github-actions[bot] commented on pull request #12523: URL: https://github.com/apache/airflow/pull/12523#issuecomment-731476953 The PR is ready to be merged. No tests are needed!
[GitHub] [airflow] kaxil opened a new pull request #12524: Temporarily allow force-push on v1-10-stable
kaxil opened a new pull request #12524: URL: https://github.com/apache/airflow/pull/12524 Temporarily allow force-push on v1-10-stable to sync it with v1-10-test
[GitHub] [airflow] github-actions[bot] commented on pull request #12505: Fix S3ToSnowflakeOperator to support uploading all files in the specified stage
github-actions[bot] commented on pull request #12505: URL: https://github.com/apache/airflow/pull/12505#issuecomment-731474813 [The Workflow run](https://github.com/apache/airflow/actions/runs/375416447) is cancelling this PR. It has some failed jobs matching ^Pylint$,^Static checks,^Build docs$,^Spell check docs$,^Backport packages$,^Provider packages,^Checks: Helm tests$,^Test OpenAPI*.
[GitHub] [airflow] pdashk opened a new pull request #12523: add Energy Solutions to INTHEWILD.md
pdashk opened a new pull request #12523: URL: https://github.com/apache/airflow/pull/12523
[GitHub] [airflow] boring-cyborg[bot] commented on pull request #12523: add Energy Solutions to INTHEWILD.md
boring-cyborg[bot] commented on pull request #12523: URL: https://github.com/apache/airflow/pull/12523#issuecomment-731474302 Congratulations on your first Pull Request and welcome to the Apache Airflow community! If you have any issues or are unsure about anything, please check our Contribution Guide (https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst) Here are some useful points: - Pay attention to the quality of your code (flake8, pylint and type annotations). Our [pre-commits](https://github.com/apache/airflow/blob/master/STATIC_CODE_CHECKS.rst#prerequisites-for-pre-commit-hooks) will help you with that. - In case of a new feature add useful documentation (in docstrings or in the `docs/` directory). Adding a new operator? Check this short [guide](https://github.com/apache/airflow/blob/master/docs/howto/custom-operator.rst) Consider adding an example DAG that shows how users should use it. - Consider using the [Breeze environment](https://github.com/apache/airflow/blob/master/BREEZE.rst) for testing locally; it’s a heavy Docker setup, but it ships with a working Airflow and a lot of integrations. - Be patient and persistent. It might take some time to get a review or get the final approval from Committers. - Please follow the [ASF Code of Conduct](https://www.apache.org/foundation/policies/conduct) for all communication including (but not limited to) comments on Pull Requests, the Mailing list and Slack. - Be sure to read the [Airflow Coding style](https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst#coding-style-and-best-practices). Apache Airflow is a community-driven project and together we are making it better 🚀. In case of doubts contact the developers at: Mailing List: d...@airflow.apache.org Slack: https://s.apache.org/airflow-slack
[airflow] branch v1-10-test updated: Replace 1.10.12 to 1.10.13 in Readme instructions
This is an automated email from the ASF dual-hosted git repository. kaxilnaik pushed a commit to branch v1-10-test in repository https://gitbox.apache.org/repos/asf/airflow.git The following commit(s) were added to refs/heads/v1-10-test by this push: new 5b61c21 Replace 1.10.12 to 1.10.13 in Readme instructions 5b61c21 is described below commit 5b61c21ab5c2f16e94967ed60c7e9b8b38d401ef Author: Kaxil Naik AuthorDate: Sat Nov 21 00:16:15 2020 + Replace 1.10.12 to 1.10.13 in Readme instructions --- README.md | 12 ++-- 1 file changed, 6 insertions(+), 6 deletions(-) diff --git a/README.md b/README.md index ad83639..5a5edd6 100644 --- a/README.md +++ b/README.md @@ -76,7 +76,7 @@ Airflow is not a streaming solution, but it is often used to process real-time d Apache Airflow is tested with: -| | Master version (2.0.0dev) | Stable version (1.10.12) | +| | Master version (2.0.0dev) | Stable version (1.10.13) | | | - | | | Python | 3.6, 3.7, 3.8 | 2.7, 3.5, 3.6, 3.7, 3.8 | | PostgreSQL | 9.6, 10, 11, 12, 13 | 9.6, 10, 11, 12, 13 | @@ -109,7 +109,7 @@ if needed. This means that from time to time plain `pip install apache-airflow` produce unusable Airflow installation. In order to have repeatable installation, however, introduced in **Airflow 1.10.10** and updated in -**Airflow 1.10.12** we also keep a set of "known-to-be-working" constraint files in the +**Airflow 1.10.13** we also keep a set of "known-to-be-working" constraint files in the orphan `constraints-master` and `constraints-1-10` branches. We keep those "known-to-be-working" constraints files separately per major/minor python version. You can use them as constraint files when installing Airflow from PyPI. Note that you have to specify @@ -118,14 +118,14 @@ correct Airflow tag/version/branch and python versions in the URL. 1. 
Installing just Airflow: ```bash -pip install apache-airflow==1.10.12 \ - --constraint "https://raw.githubusercontent.com/apache/airflow/constraints-1.10.12/constraints-3.7.txt"; +pip install apache-airflow==1.10.13 \ + --constraint "https://raw.githubusercontent.com/apache/airflow/constraints-1.10.13/constraints-3.7.txt"; ``` 2. Installing with extras (for example postgres,gcp) ```bash -pip install apache-airflow[postgres,gcp]==1.10.11 \ - --constraint "https://raw.githubusercontent.com/apache/airflow/constraints-1.10.12/constraints-3.7.txt"; +pip install apache-airflow[postgres,gcp]==1.10.13 \ + --constraint "https://raw.githubusercontent.com/apache/airflow/constraints-1.10.13/constraints-3.7.txt"; ``` For information on installing backport providers check https://airflow.readthedocs.io/en/latest/backport-providers.html.
[airflow] annotated tag 1.10.13rc1 updated (fa4bf45 -> c1a6cd4)
This is an automated email from the ASF dual-hosted git repository. kaxilnaik pushed a change to annotated tag 1.10.13rc1 in repository https://gitbox.apache.org/repos/asf/airflow.git. *** WARNING: tag 1.10.13rc1 was modified! *** from fa4bf45 (commit) to c1a6cd4 (tag) tagging fa4bf45522eb4bb68b6d70de663cd7359e0b228a (commit) replaces 1.10.12 by Kaxil Naik on Sat Nov 21 00:04:38 2020 + - Log - Airflow 1.10.13rc1 -BEGIN PGP SIGNATURE- iQEzBAABCAAdFiEEEnF1VgQO7y7q8bnCdfzNCiX6DksFAl+4WaMACgkQdfzNCiX6 DkvCsggAkjXWMZt77NF+k/0bc7G+yXQBFezrUCPI5VSTN6yH9tbvn7VsXZvu83kF n+prRgMfvZI8x5KHs5IfP33xTVHq9QS/fkuOhel8Mlyv6nVa3jCz/dSyd8EhY+H7 kVrGc22Da3sVmSks1i/dUn3cS2pLOy1+jj3ktAOrS9TM8HpfcF9ZuXHV/8Oi/Tb2 01jfcHeChCzX+b6rKipbrTczaCMWLv8oLtZIMp/yPf8AK2TliD7I2sWSLMRcq0Mw ii+Bw+w6AXGeNriOhPsH6pVcnHEymWmX2s8a5HVtVTRhn0suclVv0Ire4j2/a86b 5STjuDC+zuXJ5eP2NBqhV3eRrlw/0A== =Jwz5 -END PGP SIGNATURE- --- No new revisions were added by this update. Summary of changes:
svn commit: r44607 - /dev/airflow/1.10.13rc1/
Author: kaxilnaik Date: Sat Nov 21 00:08:09 2020 New Revision: 44607 Log: Add artifacts for Airflow 1.10.13rc1 Added: dev/airflow/1.10.13rc1/ dev/airflow/1.10.13rc1/apache-airflow-1.10.13rc1-bin.tar.gz (with props) dev/airflow/1.10.13rc1/apache-airflow-1.10.13rc1-bin.tar.gz.asc dev/airflow/1.10.13rc1/apache-airflow-1.10.13rc1-bin.tar.gz.sha512 dev/airflow/1.10.13rc1/apache-airflow-1.10.13rc1-source.tar.gz (with props) dev/airflow/1.10.13rc1/apache-airflow-1.10.13rc1-source.tar.gz.asc dev/airflow/1.10.13rc1/apache-airflow-1.10.13rc1-source.tar.gz.sha512 dev/airflow/1.10.13rc1/apache_airflow-1.10.13rc1-py2.py3-none-any.whl (with props) dev/airflow/1.10.13rc1/apache_airflow-1.10.13rc1-py2.py3-none-any.whl.asc dev/airflow/1.10.13rc1/apache_airflow-1.10.13rc1-py2.py3-none-any.whl.sha512 Added: dev/airflow/1.10.13rc1/apache-airflow-1.10.13rc1-bin.tar.gz == Binary file - no diff available. Propchange: dev/airflow/1.10.13rc1/apache-airflow-1.10.13rc1-bin.tar.gz -- svn:mime-type = application/octet-stream Added: dev/airflow/1.10.13rc1/apache-airflow-1.10.13rc1-bin.tar.gz.asc == --- dev/airflow/1.10.13rc1/apache-airflow-1.10.13rc1-bin.tar.gz.asc (added) +++ dev/airflow/1.10.13rc1/apache-airflow-1.10.13rc1-bin.tar.gz.asc Sat Nov 21 00:08:09 2020 @@ -0,0 +1,11 @@ +-BEGIN PGP SIGNATURE- + +iQEzBAABCAAdFiEEEnF1VgQO7y7q8bnCdfzNCiX6DksFAl+4WhwACgkQdfzNCiX6 +DkvfwAgAj1Zl3RJlcmyiI483cZI+vCL88Gm93edEclgkdhSpzN8okcWqL8PYJSLn +3OF4rhLW7qsonaibSq9IzgeDVh2cis220h49ip025gyfPAfp5j8RMYncxr2Vbk12 +8sOl90YaigcxjjhlDrA6QTNjG60ivvcmVCxcw+miKmdpC0iUgXydQ4theKutpVtF +Qf1UIyj28HQwN4yd7EQFlk9edslgs2rZH9hFpkWUrzwqo454x3ymo66wmkbKVect ++nmTwGqUv67ym42e4tBlr2BtMbYqRoM+TkbJpS9XfwpNm7vAKcOM25pc3q35wugd +m8CDAsftxUT+Kkd8Ro/xS8+SuX//Og== +=wtHL +-END PGP SIGNATURE- Added: dev/airflow/1.10.13rc1/apache-airflow-1.10.13rc1-bin.tar.gz.sha512 == --- dev/airflow/1.10.13rc1/apache-airflow-1.10.13rc1-bin.tar.gz.sha512 (added) +++ dev/airflow/1.10.13rc1/apache-airflow-1.10.13rc1-bin.tar.gz.sha512 Sat Nov 21 
00:08:09 2020 @@ -0,0 +1,4 @@ +apache-airflow-1.10.13rc1-bin.tar.gz: 36D641C0 F2AAEC4E BCE91BD2 66CE2BC6 + AA2D995C 08C9B62A 0EA1CBEC 027E657B + 8AF4B54E 6C3AD117 9634198D F6EA53F8 + 163711BA 95586B5B 7BCF7F4B 098A19E2 Added: dev/airflow/1.10.13rc1/apache-airflow-1.10.13rc1-source.tar.gz == Binary file - no diff available. Propchange: dev/airflow/1.10.13rc1/apache-airflow-1.10.13rc1-source.tar.gz -- svn:mime-type = application/octet-stream Added: dev/airflow/1.10.13rc1/apache-airflow-1.10.13rc1-source.tar.gz.asc == --- dev/airflow/1.10.13rc1/apache-airflow-1.10.13rc1-source.tar.gz.asc (added) +++ dev/airflow/1.10.13rc1/apache-airflow-1.10.13rc1-source.tar.gz.asc Sat Nov 21 00:08:09 2020 @@ -0,0 +1,11 @@ +-BEGIN PGP SIGNATURE- + +iQEzBAABCAAdFiEEEnF1VgQO7y7q8bnCdfzNCiX6DksFAl+4WhgACgkQdfzNCiX6 +Dkts3wf+PtMZf9zdKtu7ao8a1lCVrre6cGrII3UlEAdAgWLqdlqlKjH/k8tTmz06 +lRsYPGHsD6rFWUQLNcRyPizXjNbG5bSHoOOLY31gKvke+xjMCdfByI6FyeTmAOOL +vO8YRg7vsIeYTcNUhJXhfFx+RsSiSia5b/yIaxeFESHjqEkQ2RYbTCvtlizNvCF7 +AILkBfOkHUwFB28kXBFHGE6V7ymaztCyq6oFPImVCl11P0s6D420hhwePXMG4eNh +Gl3zF5/2JsS3Lg4SJV8pOy8A5qKBHY9uF8NQ9pXHiFyUtK6lhIvj9KQWkwtZz2mS +y5PqJ9ms2c9XqdsVmtXmzriTQo0LWA== +=ERfE +-END PGP SIGNATURE- Added: dev/airflow/1.10.13rc1/apache-airflow-1.10.13rc1-source.tar.gz.sha512 == --- dev/airflow/1.10.13rc1/apache-airflow-1.10.13rc1-source.tar.gz.sha512 (added) +++ dev/airflow/1.10.13rc1/apache-airflow-1.10.13rc1-source.tar.gz.sha512 Sat Nov 21 00:08:09 2020 @@ -0,0 +1,3 @@ +apache-airflow-1.10.13rc1-source.tar.gz: +B676E05E 3AFEEC47 63DB584C 83D35E71 3573C956 8582E786 1BC43458 938554FD 7AF58C0E + 9EA38D82 EF502A47 C477FD51 0F162CB5 0707C183 A1D86D6B 2E114D7F Added: dev/airflow/1.10.13rc1/apache_airflow-1.10.13rc1-py2.py3-none-any.whl == Binary file - no diff available. Propchange: dev/airflow/1.10.13rc1/apache_airflow-1.10.13rc1-py2.py3-none-any.whl -- svn:mime-type = application/octet-stream Added: dev/airflow/1.10.13rc1/apache_airflow-1.10.13rc1-py2.py3-none-any.whl.asc
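The release artifacts above are published together with `.asc` PGP signatures and `.sha512` checksum files, and the checksum files group the uppercase hex digest into whitespace-separated chunks after a `filename:` prefix (as shown in the diff). A minimal sketch of verifying a downloaded artifact against such a checksum file — the normalization step and function name are illustrative, not part of any Airflow tooling:

```python
import hashlib
import re


def sha512_matches(artifact_path, checksum_path):
    """Return True if artifact_path's SHA-512 equals the digest published
    in an Apache-style .sha512 file (whitespace-grouped uppercase hex,
    optionally prefixed with "filename:")."""
    with open(checksum_path) as f:
        text = f.read()
    # Drop the "filename:" prefix if present, then keep only hex characters.
    _, _, digest_part = text.partition(":")
    published = re.sub(r"[^0-9a-fA-F]", "", digest_part or text).lower()

    # Hash the artifact in chunks so large tarballs don't need to fit in memory.
    h = hashlib.sha512()
    with open(artifact_path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest() == published
```

The `.asc` signatures would additionally be checked with GnuPG against the Airflow KEYS file; that step is outside this sketch.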
[GitHub] [airflow] himabindu07 commented on pull request #10944: Task Instance Modal UX Enhancements
himabindu07 commented on pull request #10944: URL: https://github.com/apache/airflow/pull/10944#issuecomment-731468535 I have verified with this image quay.io/astronomer/ap-airflow-dev:2.0.0-buster-onbuild-22919
[airflow] 09/09: Add setup.cfg for apache-airflow-upgrade-check (#12517)
This is an automated email from the ASF dual-hosted git repository. kaxilnaik pushed a commit to branch v1-10-test in repository https://gitbox.apache.org/repos/asf/airflow.git commit fa4bf45522eb4bb68b6d70de663cd7359e0b228a Author: Ash Berlin-Taylor AuthorDate: Fri Nov 20 21:55:22 2020 + Add setup.cfg for apache-airflow-upgrade-check (#12517) Nothing currently uses this setup.cfg from this folder -- automation for that will follow shortly. Now that there is a place list deps for upgrade-check I have moved `packaging` and `importlib_meta` to test_requires of the main dist. Build a py2+py3 wheel. (cherry picked from commit deb7fc0ffe3ddb9bf9aad6f5f9479d20598e2fb5) --- airflow/upgrade/setup.cfg | 64 +++ setup.py | 4 ++- 2 files changed, 67 insertions(+), 1 deletion(-) diff --git a/airflow/upgrade/setup.cfg b/airflow/upgrade/setup.cfg new file mode 100644 index 000..ddd708c --- /dev/null +++ b/airflow/upgrade/setup.cfg @@ -0,0 +1,64 @@ +# Licensed to the Apache Software Foundation (ASF) under one +# or more contributor license agreements. See the NOTICE file +# distributed with this work for additional information +# regarding copyright ownership. The ASF licenses this file +# to you under the Apache License, Version 2.0 (the +# "License"); you may not use this file except in compliance +# with the License. You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, +# software distributed under the License is distributed on an +# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY +# KIND, either express or implied. See the License for the +# specific language governing permissions and limitations +# under the License. 
+ +[metadata] +version=1.0.0 +name = apache-airflow-upgrade-check +description = Check for compatibility between Airflow versions +long_description = file: airflow/upgrade/README.md +long_description_content_type = text/markdown +url = https://airflow.apache.org +author = Apache Airflow PMC +author-email = d...@airflow.apache.org +license = Apache License 2.0 +license_files = + LICENSE + NOTICE +classifiers = +Development Status :: 5 - Production/Stable +Intended Audience :: Developers +License :: OSI Approved :: Apache Software License +Programming Language :: Python :: 2.7 +Programming Language :: Python :: 3 +Programming Language :: Python :: 3.6 +Programming Language :: Python :: 3.7 +Programming Language :: Python :: 3.8 +keywords = airflow, upgrade +project_urls = +Source Code=https://github.com/apache/airflow +Bug Tracker=https://github.com/apache/airflow/issues +Documentation=https://airflow.apache.org/docs/ + +[options] +packages = find: +install_requires = +apache-airflow>=1.10.13,<3 +importlib-metadata~=2.0; python_version<"3.8" +packaging +python_requires = >=2.7,!=3.0.*,!=3.1.*,!=3.2.*,!=3.3.*,!=3.4.* +setup_requires = +setuptools>=40.0 +wheel +zip_safe = no + +[options.packages.find] +include = + airflow.upgrade + airflow.upgrade.* + +[bdist_wheel] +universal=1 diff --git a/setup.py b/setup.py index 9617ac7..f5f2a53 100644 --- a/setup.py +++ b/setup.py @@ -426,12 +426,14 @@ devel = [ 'flaky', 'freezegun', 'gitpython', +'importlib-metadata~=2.0; python_version<"3.8"', 'ipdb', 'jira', 'mock;python_version<"3.3"', 'mongomock', 'moto==1.3.14', # TODO - fix Datasync issues to get higher version of moto: #See: https://github.com/apache/airflow/issues/10985 +'packaging', 'parameterized', 'paramiko', 'pre-commit', @@ -445,7 +447,7 @@ devel = [ 'pywinrm', 'qds-sdk>=1.9.6', 'requests_mock', -'yamllint' +'yamllint', ] # IMPORTANT NOTE!!!
[airflow] 07/09: Add upgrade rule to check for mesos executor and flag to change it. (#11528)
This is an automated email from the ASF dual-hosted git repository. kaxilnaik pushed a commit to branch v1-10-test in repository https://gitbox.apache.org/repos/asf/airflow.git commit 60b43ef0474c9e479715f7cf9e1592a5230440de Author: RaviTeja Pothana AuthorDate: Fri Nov 20 21:07:53 2020 +0530 Add upgrade rule to check for mesos executor and flag to change it. (#11528) * add upgrade rule to check for mesos config and flag to remove it. * change from checking the mesos config section to core/executor config * remove leading new line and indent in desc (cherry picked from commit 6739b537a016a81f5da495a894a0fe990c8ad25e) --- airflow/upgrade/rules/mesos_executor_removed.py| 36 tests/upgrade/rules/test_mesos_executor_removed.py | 48 ++ 2 files changed, 84 insertions(+) diff --git a/airflow/upgrade/rules/mesos_executor_removed.py b/airflow/upgrade/rules/mesos_executor_removed.py new file mode 100644 index 000..c0e6b52 --- /dev/null +++ b/airflow/upgrade/rules/mesos_executor_removed.py @@ -0,0 +1,36 @@ +# Licensed to the Apache Software Foundation (ASF) under one +# or more contributor license agreements. See the NOTICE file +# distributed with this work for additional information +# regarding copyright ownership. The ASF licenses this file +# to you under the Apache License, Version 2.0 (the +# "License"); you may not use this file except in compliance +# with the License. You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, +# software distributed under the License is distributed on an +# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY +# KIND, either express or implied. See the License for the +# specific language governing permissions and limitations +# under the License. 
+ +from airflow.upgrade.rules.base_rule import BaseRule +from airflow.configuration import conf + + +class MesosExecutorRemovedRule(BaseRule): +""" +MesosExecutorRemovedRule class to ease upgrade to Airflow 2.0 +""" +title = "Removal of Mesos Executor" +description = "The Mesos Executor has been deprecated as it was not widely used and not maintained." + +def check(self): +executor_key = conf.get(section="core", key="executor") +if executor_key == "MesosExecutor": +return ( +"The Mesos Executor has been deprecated as it was not widely used and not maintained." +"Please migrate to any of the supported executors." +"See https://airflow.apache.org/docs/stable/executor/index.html for more details." +) diff --git a/tests/upgrade/rules/test_mesos_executor_removed.py b/tests/upgrade/rules/test_mesos_executor_removed.py new file mode 100644 index 000..2b1e530 --- /dev/null +++ b/tests/upgrade/rules/test_mesos_executor_removed.py @@ -0,0 +1,48 @@ +# Licensed to the Apache Software Foundation (ASF) under one +# or more contributor license agreements. See the NOTICE file +# distributed with this work for additional information +# regarding copyright ownership. The ASF licenses this file +# to you under the Apache License, Version 2.0 (the +# "License"); you may not use this file except in compliance +# with the License. You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, +# software distributed under the License is distributed on an +# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY +# KIND, either express or implied. See the License for the +# specific language governing permissions and limitations +# under the License. 
+from unittest import TestCase + +from airflow.upgrade.rules.mesos_executor_removed import MesosExecutorRemovedRule +from tests.test_utils.config import conf_vars + + +class TestMesosExecutorRemovedRule(TestCase): +@conf_vars({("core", "executor"): "MesosExecutor"}) +def test_invalid_check(self): +rule = MesosExecutorRemovedRule() + +assert isinstance(rule.description, str) +assert isinstance(rule.title, str) + +msg = ( +"The Mesos Executor has been deprecated as it was not widely used and not maintained." +"Please migrate to any of the supported executors." +"See https://airflow.apache.org/docs/stable/executor/index.html for more details." +) + +response = rule.check() +assert response == msg + +@conf_vars({("core", "executor"): "SequentialExecutor"}) +def test_check(self): +rule = MesosExecutorRemovedRule() + +assert isinstance(rule.description, str) +assert isinstance(rule.title, str) + +response = rule.check() +assert response is None
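The rule in the commit above follows a simple contract: `check()` returns a warning message when the environment needs attention and `None` (or nothing) otherwise, and the upgrade-check command collects the messages from every discovered rule. A self-contained sketch of that pattern, with a plain dict standing in for Airflow's `conf` — the `BaseRule` name matches the commit, everything else is illustrative:

```python
class BaseRule:
    """Minimal stand-in for airflow.upgrade.rules.base_rule.BaseRule."""

    title = ""
    description = ""

    def check(self):
        raise NotImplementedError


class MesosExecutorRemovedRule(BaseRule):
    """Simplified version of the rule above: reads a config dict
    instead of Airflow's configuration object."""

    title = "Removal of Mesos Executor"
    description = "The Mesos Executor has been deprecated."

    def __init__(self, config):
        # e.g. {("core", "executor"): "MesosExecutor"}
        self.config = config

    def check(self):
        if self.config.get(("core", "executor")) == "MesosExecutor":
            return "Please migrate to a supported executor."
        return None


def run_rules(rules):
    """Collect (title, message) pairs for rules whose check() reports a problem."""
    return [(rule.title, msg) for rule in rules if (msg := rule.check())]
```

A rule that finds nothing wrong simply contributes no entry to the report, which is what the `test_check` case in the diff asserts via `response is None`.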
[airflow] 08/09: Add upgrade check rule to ensure on "latest" versions (#12514)
This is an automated email from the ASF dual-hosted git repository. kaxilnaik pushed a commit to branch v1-10-test in repository https://gitbox.apache.org/repos/asf/airflow.git commit 41bb8f28f1febbd80109435fdc1dcfc0dce44a76 Author: Ash Berlin-Taylor AuthorDate: Fri Nov 20 21:21:13 2020 + Add upgrade check rule to ensure on "latest" versions (#12514) This checks against PyPI to make sure this is run with the latest non-preview release of apache-airflow-upgrade-check, and the latest 1.10.x of apache-airflow (cherry picked from commit d7ace0267c34bd5520d321a495399710c1c49cd1) --- airflow/upgrade/rules/__init__.py | 2 +- airflow/upgrade/rules/aaa_airflow_version_check.py | 87 ++ setup.py | 2 + .../rules/test_aaa_airflow_version_check.py| 75 +++ 4 files changed, 165 insertions(+), 1 deletion(-) diff --git a/airflow/upgrade/rules/__init__.py b/airflow/upgrade/rules/__init__.py index 4735c7f..97d0160 100644 --- a/airflow/upgrade/rules/__init__.py +++ b/airflow/upgrade/rules/__init__.py @@ -21,7 +21,7 @@ def get_rules(): """Automatically discover all rules""" rule_classes = [] path = os.path.dirname(os.path.abspath(__file__)) -for file in os.listdir(path): +for file in sorted(os.listdir(path)): if not file.endswith(".py") or file in ("__init__.py", "base_rule.py"): continue py_file = file[:-3] diff --git a/airflow/upgrade/rules/aaa_airflow_version_check.py b/airflow/upgrade/rules/aaa_airflow_version_check.py new file mode 100644 index 000..ad84eb4 --- /dev/null +++ b/airflow/upgrade/rules/aaa_airflow_version_check.py @@ -0,0 +1,87 @@ +# Licensed to the Apache Software Foundation (ASF) under one +# or more contributor license agreements. See the NOTICE file +# distributed with this work for additional information +# regarding copyright ownership. The ASF licenses this file +# to you under the Apache License, Version 2.0 (the +# "License"); you may not use this file except in compliance +# with the License. 
You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, +# software distributed under the License is distributed on an +# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY +# KIND, either express or implied. See the License for the +# specific language governing permissions and limitations +# under the License. + +# This module starts with `aaa_` so that it is sorted first alphabetically, but is still a valid python module +# name (starting with digitis is not valid) + +from __future__ import absolute_import + +from packaging.version import Version +import requests + +from airflow.upgrade.rules.base_rule import BaseRule + +try: +import importlib.metadata as importlib_metadata +except ImportError: +import importlib_metadata + + +class VersionCheckRule(BaseRule): + +title = "Check for latest versions of apache-airflow and checker" + +description = """\ +Check that the latest version of apache-airflow-upgrade-check is installed, and +that you are on the latest 1.10.x release of apache-airflow.""" + +def pypi_releases(self, distname): +""" +Get all the non-dev releases of a dist from PyPI +""" + +resp = requests.get("https://pypi.org/pypi/{}/json".format(distname)) +resp.raise_for_status() + +for rel_string in resp.json()["releases"].keys(): +ver = Version(rel_string) +if ver.is_devrelease or ver.is_prerelease: +continue +yield ver + +def check(self): + +current_airflow_version = Version(__import__("airflow").__version__) +try: +upgrade_check_ver = Version( + importlib_metadata.distribution("apache-airflow-upgrade-check").version, +) +except importlib_metadata.PackageNotFoundError: +upgrade_check_ver = Version("0.0.0") + +try: +latest_airflow_v1_release = sorted( +filter(lambda v: v.major == 1, self.pypi_releases("apache-airflow")) +)[-1] + +if current_airflow_version < latest_airflow_v1_release: +yield ( +"There is a more recent version of apache-airflow. 
Please upgrade to {} and re-run this" +" script" +).format(latest_airflow_v1_release) + +latest_upgrade_check_release = sorted( +self.pypi_releases("apache-airflow-upgrade-check") +)[-1] + +if upgrade_check_ver < latest_upgrade_check_release: +yield ( +"There is a more recent version of apache-airflow-upgrade-check. Please upgrade to {}" +" and re-run this script" +).format(latest_upgrade_check_releas
[airflow] 06/09: Silence DagBag INFO logs during upgrade check (#12507)
This is an automated email from the ASF dual-hosted git repository. kaxilnaik pushed a commit to branch v1-10-test in repository https://gitbox.apache.org/repos/asf/airflow.git commit f47efe91a8b4ba5ba43fa8d55990d27516ab0b6c Author: Ash Berlin-Taylor AuthorDate: Fri Nov 20 15:21:15 2020 + Silence DagBag INFO logs during upgrade check (#12507) By default, the logs would appear in the middle of the status stream, which makes it slightly harder to parse the output. Before: ``` = STATUS = Legacy UI is deprecated by default..SUCCESS Users must set a kubernetes.pod_template_file value.FAIL Changes in import paths of hooks, operators, sensors and others.FAIL Remove airflow.AirflowMacroPlugin class.SUCCESS [2020-11-20 14:26:04,083] {__init__.py:50} INFO - Using executor SequentialExecutor [2020-11-20 14:26:04,083] {dagbag.py:417} INFO - Filling up the DagBag from /home/ash/airflow/dags Jinja Template Variables cannot be undefinedSUCCESS ``` After: ``` = STATUS = Legacy UI is deprecated by default..SUCCESS Users must set a kubernetes.pod_template_file value.FAIL Changes in import paths of hooks, operators, sensors and others.FAIL Remove airflow.AirflowMacroPlugin class.SUCCESS Jinja Template Variables cannot be undefinedSUCCESS ``` (cherry picked from commit 8e5f7227a4b00149c326637bb51409b8da6caa81) --- airflow/upgrade/rules/undefined_jinja_varaibles.py | 11 +-- 1 file changed, 9 insertions(+), 2 deletions(-) diff --git a/airflow/upgrade/rules/undefined_jinja_varaibles.py b/airflow/upgrade/rules/undefined_jinja_varaibles.py index b97cfbc..7e39be4 100644 --- a/airflow/upgrade/rules/undefined_jinja_varaibles.py +++ b/airflow/upgrade/rules/undefined_jinja_varaibles.py @@ -17,6 +17,7 @@ from __future__ import absolute_import +import logging import re import jinja2 @@ -131,8 +132,14 @@ The user should do either of the following to fix this - def check(self, dagbag=None): if not dagbag: -dag_folder = conf.get("core", "dags_folder") -dagbag = DagBag(dag_folder) +logger = 
logging.root +old_level = logger.level +try: +logger.setLevel(logging.ERROR) +dag_folder = conf.get("core", "dags_folder") +dagbag = DagBag(dag_folder) +finally: +logger.setLevel(old_level) dags = dagbag.dags messages = [] for dag_id, dag in dags.items():
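The patch above temporarily raises the root logger's level while the DagBag is filled, restoring the old level in a `finally` block so an exception during DAG parsing cannot leave logging silenced. The same pattern written as a reusable context manager (a sketch, not code from the patch):

```python
import logging
from contextlib import contextmanager

@contextmanager
def quiet_logs(level=logging.ERROR, logger=None):
    """Temporarily raise a logger's level; always restore it on exit."""
    logger = logger if logger is not None else logging.root
    old_level = logger.level
    logger.setLevel(level)
    try:
        yield logger
    finally:
        # Restored even if the body raises, mirroring the try/finally
        # around DagBag() in the patch.
        logger.setLevel(old_level)

logging.basicConfig(level=logging.INFO)
with quiet_logs():
    logging.info("suppressed")   # below ERROR while the block is active
logging.warning("visible again")  # original level is back afterwards
```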
[airflow] 05/09: Add readme for upgrade-check "subpackage". (#12506)
This is an automated email from the ASF dual-hosted git repository. kaxilnaik pushed a commit to branch v1-10-test in repository https://gitbox.apache.org/repos/asf/airflow.git commit 7ea8a1e8fa526b160cd82babe2bec318061aaae8 Author: Ash Berlin-Taylor AuthorDate: Fri Nov 20 15:21:05 2020 + Add readme for upgrade-check "subpackage". (#12506) The intent here is this content will be visible on the pypi page for apache-airflow-upgrade-check (once it is published.) This is far from perfect, but the _first_ step is the hardest :) (cherry picked from commit 47fc5c474684c9f59924837a7661e44af106943a) --- airflow/upgrade/README.md | 82 +++ 1 file changed, 82 insertions(+) diff --git a/airflow/upgrade/README.md b/airflow/upgrade/README.md new file mode 100644 index 000..e5e201d --- /dev/null +++ b/airflow/upgrade/README.md @@ -0,0 +1,82 @@ + + +# Apache Airflow Upgrade Check + +[![PyPI version](https://badge.fury.io/py/apache-airflow-upgrade-check.svg)](https://badge.fury.io/py/apache-airflow-upgrade-check) +[![License](http://img.shields.io/:license-Apache%202-blue.svg)](http://www.apache.org/licenses/LICENSE-2.0.txt) +[![PyPI - Python Version](https://img.shields.io/pypi/pyversions/apache-airflow-upgrade-check.svg)](https://pypi.org/project/apache-airflow-upgrade-check/) +[![PyPI - Downloads](https://img.shields.io/pypi/dm/apache-airflow-upgrade-check)](https://pypi.org/project/apache-airflow-upgrade-check/) +[![Twitter Follow](https://img.shields.io/twitter/follow/ApacheAirflow.svg?style=social&label=Follow)](https://twitter.com/ApacheAirflow) +[![Slack Status](https://img.shields.io/badge/slack-join_chat-white.svg?logo=slack&style=social)](https://s.apache.org/airflow-slack) + +This package aims to easy the upgrade journey from [Apache Airflow](https://airflow.apache.org/) 1.10 to 2.0. 
+ +While we have put a lot of effort in to making this upgrade as painless as possible, with many changes +providing upgrade path (where the old code continues to work and prints out a deprecation warning) there were +unfortunately some breaking changes where we couldn't provide a compatibility shim. + +The recommended upgrade path to get to Airflow 2.0.0 is to first upgrade to the latest release in the 1.10 +series (at the time of writing: 1.10.13) and to then run this script. + +```bash +pip install apache-airflow-upgrade-check +airflow upgrade_check +``` + +This will then print out a number of action items that you should follow before upgrading to 2.0.0 or above. + +The exit code of the command will be 0 (success) if no problems are reported, or 1 otherwise. + +For example: + +``` += STATUS = + +Legacy UI is deprecated by default..SUCCESS +Users must set a kubernetes.pod_template_file value.FAIL +Changes in import paths of hooks, operators, sensors and others.FAIL +Remove airflow.AirflowMacroPlugin class.SUCCESS +Jinja Template Variables cannot be undefinedSUCCESS +Fernet is enabled by defaultFAIL +Logging configuration has been moved to new section.SUCCESS +Connection.conn_id is not uniqueSUCCESS +GCP service account key deprecation.SUCCESS +Users must delete deprecated configs for KubernetesExecutor.FAIL +Changes in import path of remote task handlers..SUCCESS +Chain between DAG and operator not allowed..SUCCESS +SendGrid email uses old airflow.contrib module..SUCCESS +Connection.conn_type is not nullableSUCCESS +Found 16 problems. + + RECOMMENDATIONS = + +Users must set a kubernetes.pod_template_file value +--- +In Airflow 2.0, KubernetesExecutor Users need to set a pod_template_file as a base +value for all pods launched by the KubernetesExecutor + + +Problems: + + 1. Please create a pod_template_file by running `airflow generate_pod_template`. +This will generate a pod using your aiflow.cfg settings + +... +```
[airflow] 04/09: Fix connection upgrade rules so they run with SQLite backend: (#12502)
This is an automated email from the ASF dual-hosted git repository. kaxilnaik pushed a commit to branch v1-10-test in repository https://gitbox.apache.org/repos/asf/airflow.git commit 61079d8990e714e4a5bf4522236d455ef8ce2a1c Author: Ash Berlin-Taylor AuthorDate: Fri Nov 20 12:24:41 2020 + Fix connection upgrade rules so they run with SQLite backend: (#12502) When testing out the ugprade check command on 1.10 with the default SQLite backend I got the following error: ``` [2020-11-20 12:10:28,248] {base.py:1372} ERROR - Error closing cursor Traceback (most recent call last): File "/home/ash/.virtualenvs/airflow-1-10/lib/python3.7/site-packages/sqlalchemy/engine/result.py", line 1284, in fetchall l = self.process_rows(self._fetchall_impl()) File "/home/ash/.virtualenvs/airflow-1-10/lib/python3.7/site-packages/sqlalchemy/engine/result.py", line 1230, in _fetchall_impl return self.cursor.fetchall() sqlite3.ProgrammingError: Cannot operate on a closed database. ``` This was caused because the `@provide_session` decorator closed the connection when the function returned, and since we were using a generator expression, we hadn't yet fetched all the rows. This changes it so the rows are fetched before returning. (cherry picked from commit cb0a2902fc7de73e5defc9ce54f5fd1429ec4fc6) --- airflow/upgrade/rules/conn_id_is_unique.py | 2 +- airflow/upgrade/rules/conn_type_is_not_nullable.py | 2 +- 2 files changed, 2 insertions(+), 2 deletions(-) diff --git a/airflow/upgrade/rules/conn_id_is_unique.py b/airflow/upgrade/rules/conn_id_is_unique.py index 8e1e474..edb53c8 100644 --- a/airflow/upgrade/rules/conn_id_is_unique.py +++ b/airflow/upgrade/rules/conn_id_is_unique.py @@ -41,5 +41,5 @@ duplicate values in conn_id column. 
.having(func.count() > 1) return ( 'Connection.conn_id={} is not unique.'.format(conn_id) -for conn_id in invalid_connections +for conn_id in invalid_connections.all() ) diff --git a/airflow/upgrade/rules/conn_type_is_not_nullable.py b/airflow/upgrade/rules/conn_type_is_not_nullable.py index 8f574d9..a411eb0 100644 --- a/airflow/upgrade/rules/conn_type_is_not_nullable.py +++ b/airflow/upgrade/rules/conn_type_is_not_nullable.py @@ -42,5 +42,5 @@ If you made any modifications to the table directly, make sure you don't have nu 'Connection have empty conn_type field.'.format( conn.id, conn.conn_id ) -for conn in invalid_connections +for conn in invalid_connections.all() )
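The bug fixed above is a general hazard: lazy evaluation outliving the resource it reads from. The `@provide_session` decorator closed the session when the rule returned, but the generator expression had not yet touched the cursor; calling `.all()` forces the fetch while the session is still open. A minimal `sqlite3` sketch (with a made-up `connection` table) that reproduces the shape of the bug and of the fix:

```python
import sqlite3

def make_db():
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE connection (conn_id TEXT)")
    conn.executemany("INSERT INTO connection VALUES (?)",
                     [("postgres_default",), ("mysql_default",)])
    return conn

def rows_lazy(conn):
    cur = conn.execute("SELECT conn_id FROM connection")
    result = (row[0] for row in cur)  # nothing fetched yet
    conn.close()                      # "session" closed before iteration
    return result

def rows_eager(conn):
    cur = conn.execute("SELECT conn_id FROM connection")
    result = [row[0] for row in cur.fetchall()]  # fetched before close
    conn.close()
    return result

try:
    list(rows_lazy(make_db()))   # iteration happens after close -> error
except sqlite3.ProgrammingError as exc:
    print(exc)                   # "Cannot operate on a closed database."

print(rows_eager(make_db()))     # ['postgres_default', 'mysql_default']
```

The fix in the patch (`invalid_connections.all()`) is the SQLAlchemy equivalent of `rows_eager`: materialize the rows inside the session's lifetime, then iterate freely.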
[airflow] 01/09: Fix the default value for VaultBackend's config_path (#12518)
This is an automated email from the ASF dual-hosted git repository. kaxilnaik pushed a commit to branch v1-10-test in repository https://gitbox.apache.org/repos/asf/airflow.git commit e57798b4ddc3a1321cb1b6b41e4c9c1d61303760 Author: Kaxil Naik AuthorDate: Fri Nov 20 21:24:50 2020 + Fix the default value for VaultBackend's config_path (#12518) It is `config` not `configs` --- airflow/contrib/secrets/hashicorp_vault.py | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/airflow/contrib/secrets/hashicorp_vault.py b/airflow/contrib/secrets/hashicorp_vault.py index 536e7f9..edf48c3 100644 --- a/airflow/contrib/secrets/hashicorp_vault.py +++ b/airflow/contrib/secrets/hashicorp_vault.py @@ -56,7 +56,7 @@ class VaultBackend(BaseSecretsBackend, LoggingMixin): (default: 'variables') :type variables_path: str :param config_path: Specifies the path of the secret to read Airflow Configurations -(default: 'configs'). +(default: 'config'). :type config_path: str :param url: Base URL for the Vault instance being addressed. :type url: str
[airflow] 03/09: Create UndefinedJinjaVariablesRule (Resolves #11144) (#11241)
This is an automated email from the ASF dual-hosted git repository. kaxilnaik pushed a commit to branch v1-10-test in repository https://gitbox.apache.org/repos/asf/airflow.git commit 15b628e2d7bb51ddca9e62dc7d2299f7b6799e5a Author: Ashmeet Lamba AuthorDate: Thu Nov 19 16:33:06 2020 +0530 Create UndefinedJinjaVariablesRule (Resolves #11144) (#11241) Adding a rule to check for undefined jinja variables when upgrading to Airflow2.0 (cherry picked from commit 18100a0ec96692bb4d7c9e80f206b66a30c65e0d) --- airflow/models/dag.py | 4 +- airflow/upgrade/rules/undefined_jinja_varaibles.py | 153 .../rules/test_undefined_jinja_varaibles.py| 192 + 3 files changed, 347 insertions(+), 2 deletions(-) diff --git a/airflow/models/dag.py b/airflow/models/dag.py index 348e19d..a1908e3 100644 --- a/airflow/models/dag.py +++ b/airflow/models/dag.py @@ -224,7 +224,7 @@ class DAG(BaseDag, LoggingMixin): end_date=None, # type: Optional[datetime] full_filepath=None, # type: Optional[str] template_searchpath=None, # type: Optional[Union[str, Iterable[str]]] -template_undefined=jinja2.Undefined, # type: Type[jinja2.Undefined] +template_undefined=None, # type: Optional[Type[jinja2.Undefined]] user_defined_macros=None, # type: Optional[Dict] user_defined_filters=None, # type: Optional[Dict] default_args=None, # type: Optional[Dict] @@ -807,7 +807,7 @@ class DAG(BaseDag, LoggingMixin): # Default values (for backward compatibility) jinja_env_options = { 'loader': jinja2.FileSystemLoader(searchpath), -'undefined': self.template_undefined, +'undefined': self.template_undefined or jinja2.Undefined, 'extensions': ["jinja2.ext.do"], 'cache_size': 0 } diff --git a/airflow/upgrade/rules/undefined_jinja_varaibles.py b/airflow/upgrade/rules/undefined_jinja_varaibles.py new file mode 100644 index 000..b97cfbc --- /dev/null +++ b/airflow/upgrade/rules/undefined_jinja_varaibles.py @@ -0,0 +1,153 @@ +# Licensed to the Apache Software Foundation (ASF) under one +# or more contributor license agreements. 
See the NOTICE file +# distributed with this work for additional information +# regarding copyright ownership. The ASF licenses this file +# to you under the Apache License, Version 2.0 (the +# "License"); you may not use this file except in compliance +# with the License. You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, +# software distributed under the License is distributed on an +# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY +# KIND, either express or implied. See the License for the +# specific language governing permissions and limitations +# under the License. + +from __future__ import absolute_import + +import re + +import jinja2 +import six + +from airflow import conf +from airflow.models import DagBag, TaskInstance +from airflow.upgrade.rules.base_rule import BaseRule +from airflow.utils import timezone + + +class UndefinedJinjaVariablesRule(BaseRule): + +title = "Jinja Template Variables cannot be undefined" + +description = """\ +The default behavior for DAG's Jinja templates has changed. Now, more restrictive validation +of non-existent variables is applied - `jinja2.StrictUndefined`. + +The user should do either of the following to fix this - +1. Fix the Jinja Templates by defining every variable or providing default values +2. Explicitly declare `template_undefined=jinja2.Undefined` while defining the DAG +""" + +def _check_rendered_content(self, rendered_content, seen_oids=None): +"""Replicates the logic in BaseOperator.render_template() to +cover all the cases needed to be checked. 
+""" +if isinstance(rendered_content, six.string_types): +return set(re.findall(r"{{(.*?)}}", rendered_content)) + +elif isinstance(rendered_content, (int, float, bool)): +return set() + +elif isinstance(rendered_content, (tuple, list, set)): +debug_error_messages = set() +for element in rendered_content: + debug_error_messages.update(self._check_rendered_content(element)) +return debug_error_messages + +elif isinstance(rendered_content, dict): +debug_error_messages = set() +for key, value in rendered_content.items(): + debug_error_messages.update(self._check_rendered_content(value)) +return debug_error_messages + +else: +if seen_oids is None: +seen_oids = set() +return self._nested_check_rendered(rendered_content, seen_oids) + +def _nested_check_rendered(self, rendered_content, seen_oids): +debug_error_mess
[airflow] 02/09: Add 1.10.13 Changelog
This is an automated email from the ASF dual-hosted git repository. kaxilnaik pushed a commit to branch v1-10-test in repository https://gitbox.apache.org/repos/asf/airflow.git commit b2631fca810840b07c00450cd82082e1cef5d3d8 Author: Kaxil Naik AuthorDate: Thu Nov 19 21:28:11 2020 + Add 1.10.13 Changelog --- CHANGELOG.txt | 113 ++ 1 file changed, 113 insertions(+) diff --git a/CHANGELOG.txt b/CHANGELOG.txt index 4fb12de..b818fef 100644 --- a/CHANGELOG.txt +++ b/CHANGELOG.txt @@ -1,3 +1,115 @@ +Airflow 1.10.13, 2020-11-24 + + +New Features + + +- Add "already checked" to failed pods in K8sPodOperator (#11368) +- Pass SQLAlchemy engine options to FAB based UI (#11395) +- [AIRFLOW-4438] Add Gzip compression to S3_hook (#8571) +- Add permission "extra_links" for Viewer role and above (#10719) +- Add generate_yaml command to easily test KubernetesExecutor before deploying pods (#10677) +- Add Secrets backend for Microsoft Azure Key Vault (#10898) + +Bug Fixes +" + +- SkipMixin: Handle empty branches (#11120) +- [AIRFLOW-5274] dag loading duration metric name too long (#5890) +- Handle no Dagrun in DagrunIdDep (#8389) (#11343) +- Fix Kubernetes Executor logs for long dag names (#10942) +- Add on_kill support for the KubernetesPodOperator (#10666) +- KubernetesPodOperator template fix (#10963) +- Fix displaying of add serialized_dag table migration +- Fix Start Date tooltip on DAGs page (#10637) +- URL encode execution date in the Last Run link (#10595) +- Fixes issue with affinity backcompat in Airflow 1.10 +- Fix KubernetesExecutor import in views.py +- Fix issues with Gantt View (#12419) +- Fix Entrypoint and _CMD config variables (#12411) +- Fix operator field update for SerializedBaseOperator (#10924) +- Limited cryptography to < 3.2 for python 2.7 +- Install cattr on Python 3.7 - Fix docs build on RTD (#12045) +- Limit version of marshmallow-sqlalchemy +- Pin `kubernetes` to a max version of 11.0.0 (#11974) +- Use snakebite-py3 for HDFS dependency for Python3 
(#12340) +- Removes snakebite kerberos dependency (#10865) +- Fix failing dependencies for FAB and Celery (#10828) +- Fix pod_mutation_hook for 1.10.13 (#10850) +- Fix formatting of Host information +- Fix Logout Google Auth issue in Non-RBAC UI (#11890) +- Add missing imports to app.py (#10650) +- Show Generic Error for Charts & Query View in old UI (#12495) +- TimeSensor should respect the default_timezone config (#9699) +- TimeSensor should respect DAG timezone (#9882) +- Unify user session lifetime configuration (#11970) +- Handle outdated webserver session timeout gracefully. (#12332) + + +Improvements + + +- Add XCom.deserialize_value to Airflow 1.10.13 (#12328) +- Mount airflow.cfg to pod_template_file (#12311) +- All k8s object must comply with JSON Schema (#12003) +- Validate airflow chart values.yaml & values.schema.json (#11990) +- Pod template file uses custom custom env variable (#11480) +- Bump attrs and cattrs dependencies (#11969) +- Bump attrs to > 20.0 (#11799) +- [AIRFLOW-3607] Only query DB once per DAG run for TriggerRuleDep (#4751) +- Rename task with duplicate task_id +- Manage Flask AppBuilder Tables using Alembic Migrations (#12352) +- ``airflow test`` only works for tasks in 1.10, not whole dags (#11191) +- Improve warning messaging for duplicate task_ids in a DAG (#11126) +- Pins moto to 1.3.14 (#10986) +- DbApiHook: Support kwargs in get_pandas_df (#9730) +- Make grace_period_seconds option on K8sPodOperator (#10727) +- Fix syntax error in Dockerfile 'maintainer' Label (#10899) +- The entrypoints in Docker Image should be owned by Airflow (#10853) +- Make dockerfiles Google Shell Guide Compliant (#10734) +- clean-logs script for Dockerfile: trim logs before sleep (#10685) +- When sending tasks to celery from a sub-process, reset signal handlers (#11278) +- SkipMixin: Add missing session.commit() and test (#10421) +- Webserver: Further Sanitize values passed to origin param (#12459) +- Security upgrade lodash from 4.17.19 to 4.17.20 
(#11095) +- Log instead of raise an Error for unregistered OperatorLinks (#11959) +- Mask Password in Log table when using the CLI (#11468) +- [AIRFLOW-3607] Optimize dep checking when depends on past set and concurrency limit +- Execute job cancel HTTPRequest in Dataproc Hook (#10361) +- Use rst lexer to format airflow upgrade check output (#11259) +- Remove deprecation warning from contrib/kubernetes/pod.py +- adding body as templated field for CloudSqlImportOperator (#10510) +- Change log level for User's session to DEBUG (#12414) + +Deprecations + + +- Deprecate importing Hooks from plugin-created module (#12133) +- Deprecate adding Operators and Sensors via plugins (#12069) + +Doc only changes + + +- [Doc] Correct description for macro task_instance_key_str (#11062) +- Checks if all the libraries in setup.py are listed in installation
[airflow] branch v1-10-test updated (f421543 -> fa4bf45)
This is an automated email from the ASF dual-hosted git repository. kaxilnaik pushed a change to branch v1-10-test in repository https://gitbox.apache.org/repos/asf/airflow.git. discard f421543 Add setup.cfg for apache-airflow-upgrade-check (#12517) discard 9194893 Add upgrade check rule to ensure on "latest" versions (#12514) discard 8a3 Add upgrade rule to check for mesos executor and flag to change it. (#11528) discard ce7f0d5 Silence DagBag INFO logs during upgrade check (#12507) discard 27e7981 Add readme for upgrade-check "subpackage". (#12506) discard 0b75a71 Fix connection upgrade rules so they run with SQLite backend: (#12502) discard 415cda4 Create UndefinedJinjaVariablesRule (Resolves #11144) (#11241) discard 4b299e8 Fix the default value for VaultBackend's config_path (#12518) discard e348467 Add 1.10.13 Changelog new e57798b Fix the default value for VaultBackend's config_path (#12518) new b2631fc Add 1.10.13 Changelog new 15b628e Create UndefinedJinjaVariablesRule (Resolves #11144) (#11241) new 61079d8 Fix connection upgrade rules so they run with SQLite backend: (#12502) new 7ea8a1e Add readme for upgrade-check "subpackage". (#12506) new f47efe9 Silence DagBag INFO logs during upgrade check (#12507) new 60b43ef Add upgrade rule to check for mesos executor and flag to change it. (#11528) new 41bb8f2 Add upgrade check rule to ensure on "latest" versions (#12514) new fa4bf45 Add setup.cfg for apache-airflow-upgrade-check (#12517) This update added new revisions after undoing existing revisions. That is to say, some revisions that were in the old version of the branch are not in the new version. This situation occurs when a user --force pushes a change and generates a repository containing something like this: * -- * -- B -- O -- O -- O (f421543) \ N -- N -- N refs/heads/v1-10-test (fa4bf45) You should already have received notification emails for all of the O revisions, and so the following emails describe only the N revisions from the common base, B. 
Any revisions marked "omit" are not gone; other references still refer to them. Any revisions marked "discard" are gone forever. The 9 revisions listed above as "new" are entirely new to this repository and will be described in separate emails. The revisions listed as "add" were already present in the repository and have only been added to this reference. Summary of changes: CHANGELOG.txt | 1 + 1 file changed, 1 insertion(+)
[airflow] 07/07: Add setup.cfg for apache-airflow-upgrade-check (#12517)
This is an automated email from the ASF dual-hosted git repository. kaxilnaik pushed a commit to branch v1-10-test in repository https://gitbox.apache.org/repos/asf/airflow.git commit f4215436fedd6da64df4b1007fe8c9a46ef527fb Author: Ash Berlin-Taylor AuthorDate: Fri Nov 20 21:55:22 2020 + Add setup.cfg for apache-airflow-upgrade-check (#12517) Nothing currently uses this setup.cfg from this folder -- automation for that will follow shortly. Now that there is a place list deps for upgrade-check I have moved `packaging` and `importlib_meta` to test_requires of the main dist. Build a py2+py3 wheel. (cherry picked from commit deb7fc0ffe3ddb9bf9aad6f5f9479d20598e2fb5) --- airflow/upgrade/setup.cfg | 64 +++ setup.py | 4 ++- 2 files changed, 67 insertions(+), 1 deletion(-) diff --git a/airflow/upgrade/setup.cfg b/airflow/upgrade/setup.cfg new file mode 100644 index 000..ddd708c --- /dev/null +++ b/airflow/upgrade/setup.cfg @@ -0,0 +1,64 @@ +# Licensed to the Apache Software Foundation (ASF) under one +# or more contributor license agreements. See the NOTICE file +# distributed with this work for additional information +# regarding copyright ownership. The ASF licenses this file +# to you under the Apache License, Version 2.0 (the +# "License"); you may not use this file except in compliance +# with the License. You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, +# software distributed under the License is distributed on an +# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY +# KIND, either express or implied. See the License for the +# specific language governing permissions and limitations +# under the License. 
+ +[metadata] +version=1.0.0 +name = apache-airflow-upgrade-check +description = Check for compatibility between Airflow versions +long_description = file: airflow/upgrade/README.md +long_description_content_type = text/markdown +url = https://airflow.apache.org +author = Apache Airflow PMC +author-email = d...@airflow.apache.org +license = Apache License 2.0 +license_files = + LICENSE + NOTICE +classifiers = +Development Status :: 5 - Production/Stable +Intended Audience :: Developers +License :: OSI Approved :: Apache Software License +Programming Language :: Python :: 2.7 +Programming Language :: Python :: 3 +Programming Language :: Python :: 3.6 +Programming Language :: Python :: 3.7 +Programming Language :: Python :: 3.8 +keywords = airflow, upgrade +project_urls = +Source Code=https://github.com/apache/airflow +Bug Tracker=https://github.com/apache/airflow/issues +Documentation=https://airflow.apache.org/docs/ + +[options] +packages = find: +install_requires = +apache-airflow>=1.10.13,<3 +importlib-metadata~=2.0; python_version<"3.8" +packaging +python_requires = >=2.7,!=3.0.*,!=3.1.*,!=3.2.*,!=3.3.*,!=3.4.* +setup_requires = +setuptools>=40.0 +wheel +zip_safe = no + +[options.packages.find] +include = + airflow.upgrade + airflow.upgrade.* + +[bdist_wheel] +universal=1 diff --git a/setup.py b/setup.py index 9617ac7..f5f2a53 100644 --- a/setup.py +++ b/setup.py @@ -426,12 +426,14 @@ devel = [ 'flaky', 'freezegun', 'gitpython', +'importlib-metadata~=2.0; python_version<"3.8"', 'ipdb', 'jira', 'mock;python_version<"3.3"', 'mongomock', 'moto==1.3.14', # TODO - fix Datasync issues to get higher version of moto: #See: https://github.com/apache/airflow/issues/10985 +'packaging', 'parameterized', 'paramiko', 'pre-commit', @@ -445,7 +447,7 @@ devel = [ 'pywinrm', 'qds-sdk>=1.9.6', 'requests_mock', -'yamllint' +'yamllint', ] # IMPORTANT NOTE!!!
[airflow] 06/07: Add upgrade check rule to ensure on "latest" versions (#12514)
This is an automated email from the ASF dual-hosted git repository. kaxilnaik pushed a commit to branch v1-10-test in repository https://gitbox.apache.org/repos/asf/airflow.git commit 91948939ebbcc3e302b4d15eac8a41b1dd913cb6 Author: Ash Berlin-Taylor AuthorDate: Fri Nov 20 21:21:13 2020 + Add upgrade check rule to ensure on "latest" versions (#12514) This checks against PyPI to make sure this is run with the latest non-preview release of apache-airflow-upgrade-check, and the latest 1.10.x of apache-airflow (cherry picked from commit d7ace0267c34bd5520d321a495399710c1c49cd1) --- airflow/upgrade/rules/__init__.py | 2 +- airflow/upgrade/rules/aaa_airflow_version_check.py | 87 ++ setup.py | 2 + .../rules/test_aaa_airflow_version_check.py| 75 +++ 4 files changed, 165 insertions(+), 1 deletion(-) diff --git a/airflow/upgrade/rules/__init__.py b/airflow/upgrade/rules/__init__.py index 4735c7f..97d0160 100644 --- a/airflow/upgrade/rules/__init__.py +++ b/airflow/upgrade/rules/__init__.py @@ -21,7 +21,7 @@ def get_rules(): """Automatically discover all rules""" rule_classes = [] path = os.path.dirname(os.path.abspath(__file__)) -for file in os.listdir(path): +for file in sorted(os.listdir(path)): if not file.endswith(".py") or file in ("__init__.py", "base_rule.py"): continue py_file = file[:-3] diff --git a/airflow/upgrade/rules/aaa_airflow_version_check.py b/airflow/upgrade/rules/aaa_airflow_version_check.py new file mode 100644 index 000..ad84eb4 --- /dev/null +++ b/airflow/upgrade/rules/aaa_airflow_version_check.py @@ -0,0 +1,87 @@ +# Licensed to the Apache Software Foundation (ASF) under one +# or more contributor license agreements. See the NOTICE file +# distributed with this work for additional information +# regarding copyright ownership. The ASF licenses this file +# to you under the Apache License, Version 2.0 (the +# "License"); you may not use this file except in compliance +# with the License. 
You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, +# software distributed under the License is distributed on an +# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY +# KIND, either express or implied. See the License for the +# specific language governing permissions and limitations +# under the License. + +# This module starts with `aaa_` so that it is sorted first alphabetically, but is still a valid python module +# name (starting with digitis is not valid) + +from __future__ import absolute_import + +from packaging.version import Version +import requests + +from airflow.upgrade.rules.base_rule import BaseRule + +try: +import importlib.metadata as importlib_metadata +except ImportError: +import importlib_metadata + + +class VersionCheckRule(BaseRule): + +title = "Check for latest versions of apache-airflow and checker" + +description = """\ +Check that the latest version of apache-airflow-upgrade-check is installed, and +that you are on the latest 1.10.x release of apache-airflow.""" + +def pypi_releases(self, distname): +""" +Get all the non-dev releases of a dist from PyPI +""" + +resp = requests.get("https://pypi.org/pypi/{}/json".format(distname)) +resp.raise_for_status() + +for rel_string in resp.json()["releases"].keys(): +ver = Version(rel_string) +if ver.is_devrelease or ver.is_prerelease: +continue +yield ver + +def check(self): + +current_airflow_version = Version(__import__("airflow").__version__) +try: +upgrade_check_ver = Version( + importlib_metadata.distribution("apache-airflow-upgrade-check").version, +) +except importlib_metadata.PackageNotFoundError: +upgrade_check_ver = Version("0.0.0") + +try: +latest_airflow_v1_release = sorted( +filter(lambda v: v.major == 1, self.pypi_releases("apache-airflow")) +)[-1] + +if current_airflow_version < latest_airflow_v1_release: +yield ( +"There is a more recent version of apache-airflow. 
Please upgrade to {} and re-run this" +" script" +).format(latest_airflow_v1_release) + +latest_upgrade_check_release = sorted( +self.pypi_releases("apache-airflow-upgrade-check") +)[-1] + +if upgrade_check_ver < latest_upgrade_check_release: +yield ( +"There is a more recent version of apache-airflow-upgrade-check. Please upgrade to {}" +" and re-run this script" +).format(latest_upgrade_check_releas
[airflow] 04/07: Silence DagBag INFO logs during upgrade check (#12507)
This is an automated email from the ASF dual-hosted git repository. kaxilnaik pushed a commit to branch v1-10-test in repository https://gitbox.apache.org/repos/asf/airflow.git

commit ce7f0d508a3c28858848f66ab3b33a67366d8904
Author: Ash Berlin-Taylor
AuthorDate: Fri Nov 20 15:21:15 2020 +

    Silence DagBag INFO logs during upgrade check (#12507)

    By default, the logs would appear in the middle of the status stream,
    which makes it slightly harder to parse the output.

    Before:

    ```
    = STATUS =
    Legacy UI is deprecated by default..SUCCESS
    Users must set a kubernetes.pod_template_file value.FAIL
    Changes in import paths of hooks, operators, sensors and others.FAIL
    Remove airflow.AirflowMacroPlugin class.SUCCESS
    [2020-11-20 14:26:04,083] {__init__.py:50} INFO - Using executor SequentialExecutor
    [2020-11-20 14:26:04,083] {dagbag.py:417} INFO - Filling up the DagBag from /home/ash/airflow/dags
    Jinja Template Variables cannot be undefinedSUCCESS
    ```

    After:

    ```
    = STATUS =
    Legacy UI is deprecated by default..SUCCESS
    Users must set a kubernetes.pod_template_file value.FAIL
    Changes in import paths of hooks, operators, sensors and others.FAIL
    Remove airflow.AirflowMacroPlugin class.SUCCESS
    Jinja Template Variables cannot be undefinedSUCCESS
    ```

    (cherry picked from commit 8e5f7227a4b00149c326637bb51409b8da6caa81)
---
 airflow/upgrade/rules/undefined_jinja_varaibles.py | 11 +--
 1 file changed, 9 insertions(+), 2 deletions(-)

diff --git a/airflow/upgrade/rules/undefined_jinja_varaibles.py b/airflow/upgrade/rules/undefined_jinja_varaibles.py
index b97cfbc..7e39be4 100644
--- a/airflow/upgrade/rules/undefined_jinja_varaibles.py
+++ b/airflow/upgrade/rules/undefined_jinja_varaibles.py
@@ -17,6 +17,7 @@
 from __future__ import absolute_import

+import logging
 import re

 import jinja2
@@ -131,8 +132,14 @@ The user should do either of the following to fix this -
     def check(self, dagbag=None):
         if not dagbag:
-            dag_folder = conf.get("core", "dags_folder")
-            dagbag = DagBag(dag_folder)
+            logger = logging.root
+            old_level = logger.level
+            try:
+                logger.setLevel(logging.ERROR)
+                dag_folder = conf.get("core", "dags_folder")
+                dagbag = DagBag(dag_folder)
+            finally:
+                logger.setLevel(old_level)
         dags = dagbag.dags
         messages = []
         for dag_id, dag in dags.items():
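The pattern the patch uses — temporarily raising the root logger's level inside a try/finally — is general enough to lift into a helper. A rough sketch (not part of the commit; `with_quiet_logging` is a made-up name):

```python
import logging


def with_quiet_logging(fn):
    """Run fn() with root-logger output below ERROR suppressed,
    restoring the previous level afterwards (even if fn raises)."""
    logger = logging.root
    old_level = logger.level
    try:
        logger.setLevel(logging.ERROR)
        return fn()
    finally:
        logger.setLevel(old_level)


# Demonstrate: inside the call the level is ERROR, afterwards it is restored.
logging.root.setLevel(logging.INFO)
level_inside = with_quiet_logging(lambda: logging.root.level)
level_after = logging.root.level
```

The try/finally is the important part: without it, an exception while filling the DagBag would leave the root logger silenced for the rest of the process.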
[airflow] 02/07: Fix connection upgrade rules so they run with SQLite backend: (#12502)
This is an automated email from the ASF dual-hosted git repository. kaxilnaik pushed a commit to branch v1-10-test in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 0b75a713e812aac2f1483d5120a1a4d9d0e48554
Author: Ash Berlin-Taylor
AuthorDate: Fri Nov 20 12:24:41 2020 +

    Fix connection upgrade rules so they run with SQLite backend: (#12502)

    When testing out the upgrade check command on 1.10 with the default
    SQLite backend I got the following error:

    ```
    [2020-11-20 12:10:28,248] {base.py:1372} ERROR - Error closing cursor
    Traceback (most recent call last):
      File "/home/ash/.virtualenvs/airflow-1-10/lib/python3.7/site-packages/sqlalchemy/engine/result.py", line 1284, in fetchall
        l = self.process_rows(self._fetchall_impl())
      File "/home/ash/.virtualenvs/airflow-1-10/lib/python3.7/site-packages/sqlalchemy/engine/result.py", line 1230, in _fetchall_impl
        return self.cursor.fetchall()
    sqlite3.ProgrammingError: Cannot operate on a closed database.
    ```

    This was caused because the `@provide_session` decorator closed the
    connection when the function returned, and since we were using a
    generator expression, we hadn't yet fetched all the rows. This changes
    it so the rows are fetched before returning.

    (cherry picked from commit cb0a2902fc7de73e5defc9ce54f5fd1429ec4fc6)
---
 airflow/upgrade/rules/conn_id_is_unique.py         | 2 +-
 airflow/upgrade/rules/conn_type_is_not_nullable.py | 2 +-
 2 files changed, 2 insertions(+), 2 deletions(-)

diff --git a/airflow/upgrade/rules/conn_id_is_unique.py b/airflow/upgrade/rules/conn_id_is_unique.py
index 8e1e474..edb53c8 100644
--- a/airflow/upgrade/rules/conn_id_is_unique.py
+++ b/airflow/upgrade/rules/conn_id_is_unique.py
@@ -41,5 +41,5 @@ duplicate values in conn_id column.
         .having(func.count() > 1)
     return (
         'Connection.conn_id={} is not unique.'.format(conn_id)
-        for conn_id in invalid_connections
+        for conn_id in invalid_connections.all()
     )

diff --git a/airflow/upgrade/rules/conn_type_is_not_nullable.py b/airflow/upgrade/rules/conn_type_is_not_nullable.py
index 8f574d9..a411eb0 100644
--- a/airflow/upgrade/rules/conn_type_is_not_nullable.py
+++ b/airflow/upgrade/rules/conn_type_is_not_nullable.py
@@ -42,5 +42,5 @@ If you made any modifications to the table directly, make sure you don't have nu
         'Connection have empty conn_type field.'.format(
             conn.id, conn.conn_id
         )
-        for conn in invalid_connections
+        for conn in invalid_connections.all()
     )
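The failure mode is easy to reproduce with plain `sqlite3` — no SQLAlchemy or Airflow required. A sketch, with a toy `connection` table standing in for Airflow's:

```python
import sqlite3

# A tiny stand-in for the Connection table queried by the upgrade rules.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE connection (conn_id TEXT)")
conn.executemany("INSERT INTO connection VALUES (?)", [("a",), ("a",), ("b",)])

cursor = conn.execute(
    "SELECT conn_id FROM connection GROUP BY conn_id HAVING count(*) > 1"
)
# A generator expression is lazy: no rows have been fetched yet.
lazy_messages = (
    "Connection.conn_id={} is not unique.".format(row[0]) for row in cursor
)
conn.close()  # analogous to @provide_session closing the session on return

try:
    list(lazy_messages)  # only now does the generator touch the closed cursor
    closed_cursor_failed = False
except sqlite3.ProgrammingError:  # "Cannot operate on a closed database."
    closed_cursor_failed = True

# The fix: materialize the rows (like invalid_connections.all()) while the
# connection is still open.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE connection (conn_id TEXT)")
conn.executemany("INSERT INTO connection VALUES (?)", [("a",), ("a",), ("b",)])
rows = conn.execute(
    "SELECT conn_id FROM connection GROUP BY conn_id HAVING count(*) > 1"
).fetchall()
conn.close()
eager_messages = ["Connection.conn_id={} is not unique.".format(r[0]) for r in rows]
```

The same trap exists with any lazy iterable whose backing resource is closed by a decorator or context manager before the caller consumes it.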
[airflow] branch v1-10-test updated (4b299e8 -> f421543)
This is an automated email from the ASF dual-hosted git repository. kaxilnaik pushed a change to branch v1-10-test in repository https://gitbox.apache.org/repos/asf/airflow.git. from 4b299e8 Fix the default value for VaultBackend's config_path (#12518) new 415cda4 Create UndefinedJinjaVariablesRule (Resolves #11144) (#11241) new 0b75a71 Fix connection upgrade rules so they run with SQLite backend: (#12502) new 27e7981 Add readme for upgrade-check "subpackage". (#12506) new ce7f0d5 Silence DagBag INFO logs during upgrade check (#12507) new 8a3 Add upgrade rule to check for mesos executor and flag to change it. (#11528) new 9194893 Add upgrade check rule to ensure on "latest" versions (#12514) new f421543 Add setup.cfg for apache-airflow-upgrade-check (#12517) The 7 revisions listed above as "new" are entirely new to this repository and will be described in separate emails. The revisions listed as "add" were already present in the repository and have only been added to this reference. Summary of changes: airflow/models/dag.py | 4 +- airflow/upgrade/README.md | 82 + airflow/upgrade/rules/__init__.py | 2 +- airflow/upgrade/rules/aaa_airflow_version_check.py | 87 ++ airflow/upgrade/rules/conn_id_is_unique.py | 2 +- airflow/upgrade/rules/conn_type_is_not_nullable.py | 2 +- ...fernet_enabled.py => mesos_executor_removed.py} | 27 ++- airflow/upgrade/rules/undefined_jinja_varaibles.py | 160 + airflow/upgrade/setup.cfg | 64 +++ setup.py | 6 +- .../rules/test_aaa_airflow_version_check.py| 75 ...eprecated.py => test_mesos_executor_removed.py} | 20 ++- .../rules/test_undefined_jinja_varaibles.py| 192 + 13 files changed, 693 insertions(+), 30 deletions(-) create mode 100644 airflow/upgrade/README.md create mode 100644 airflow/upgrade/rules/aaa_airflow_version_check.py copy airflow/upgrade/rules/{fernet_enabled.py => mesos_executor_removed.py} (59%) create mode 100644 airflow/upgrade/rules/undefined_jinja_varaibles.py create mode 100644 airflow/upgrade/setup.cfg create mode 
100644 tests/upgrade/rules/test_aaa_airflow_version_check.py copy tests/upgrade/rules/{test_legacy_ui_deprecated.py => test_mesos_executor_removed.py} (66%) create mode 100644 tests/upgrade/rules/test_undefined_jinja_varaibles.py
[airflow] 05/07: Add upgrade rule to check for mesos executor and flag to change it. (#11528)
This is an automated email from the ASF dual-hosted git repository. kaxilnaik pushed a commit to branch v1-10-test in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 8a3ccdf0049aaeea4cee7dcb6b96c2db4578
Author: RaviTeja Pothana
AuthorDate: Fri Nov 20 21:07:53 2020 +0530

    Add upgrade rule to check for mesos executor and flag to change it. (#11528)

    * add upgrade rule to check for mesos config and flag to remove it.
    * change from checking the mesos config section to core/executor config
    * remove leading new line and indent in desc

    (cherry picked from commit 6739b537a016a81f5da495a894a0fe990c8ad25e)
---
 airflow/upgrade/rules/mesos_executor_removed.py    | 36 ++++++++++
 tests/upgrade/rules/test_mesos_executor_removed.py | 48 ++++++++++++++
 2 files changed, 84 insertions(+)

diff --git a/airflow/upgrade/rules/mesos_executor_removed.py b/airflow/upgrade/rules/mesos_executor_removed.py
new file mode 100644
index 000..c0e6b52
--- /dev/null
+++ b/airflow/upgrade/rules/mesos_executor_removed.py
@@ -0,0 +1,36 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+from airflow.upgrade.rules.base_rule import BaseRule
+from airflow.configuration import conf
+
+
+class MesosExecutorRemovedRule(BaseRule):
+    """
+    MesosExecutorRemovedRule class to ease upgrade to Airflow 2.0
+    """
+    title = "Removal of Mesos Executor"
+    description = "The Mesos Executor has been deprecated as it was not widely used and not maintained."
+
+    def check(self):
+        executor_key = conf.get(section="core", key="executor")
+        if executor_key == "MesosExecutor":
+            return (
+                "The Mesos Executor has been deprecated as it was not widely used and not maintained."
+                "Please migrate to any of the supported executors."
+                "See https://airflow.apache.org/docs/stable/executor/index.html for more details."
+            )

diff --git a/tests/upgrade/rules/test_mesos_executor_removed.py b/tests/upgrade/rules/test_mesos_executor_removed.py
new file mode 100644
index 000..2b1e530
--- /dev/null
+++ b/tests/upgrade/rules/test_mesos_executor_removed.py
@@ -0,0 +1,48 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+from unittest import TestCase
+
+from airflow.upgrade.rules.mesos_executor_removed import MesosExecutorRemovedRule
+from tests.test_utils.config import conf_vars
+
+
+class TestMesosExecutorRemovedRule(TestCase):
+    @conf_vars({("core", "executor"): "MesosExecutor"})
+    def test_invalid_check(self):
+        rule = MesosExecutorRemovedRule()
+
+        assert isinstance(rule.description, str)
+        assert isinstance(rule.title, str)
+
+        msg = (
+            "The Mesos Executor has been deprecated as it was not widely used and not maintained."
+            "Please migrate to any of the supported executors."
+            "See https://airflow.apache.org/docs/stable/executor/index.html for more details."
+        )
+
+        response = rule.check()
+        assert response == msg
+
+    @conf_vars({("core", "executor"): "SequentialExecutor"})
+    def test_check(self):
+        rule = MesosExecutorRemovedRule()
+
+        assert isinstance(rule.description, str)
+        assert isinstance(rule.title, str)
+
+        response = rule.check()
+        assert response is None
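For readers unfamiliar with the upgrade-check framework, the rule contract shown above boils down to "return a message on failure, `None` on success". A self-contained sketch of that contract (the class names and the dict-backed config are illustrative, not Airflow's actual `BaseRule` or `conf`):

```python
class BaseRule:
    """Contract used by the upgrade checker: check() returns a message
    (or list of messages) when the rule fails, and None when it passes."""
    title = ""
    description = ""

    def check(self):
        raise NotImplementedError


class ExecutorRemovedRule(BaseRule):
    title = "Removal of a deprecated executor"
    description = "Flags configurations that still use a removed executor."

    def __init__(self, config):
        # `config` is a plain dict keyed by (section, key) tuples in this
        # sketch, standing in for airflow.configuration.conf.
        self.config = config

    def check(self):
        if self.config.get(("core", "executor")) == "MesosExecutor":
            return "MesosExecutor has been removed; migrate to a supported executor."
        return None


mesos_msg = ExecutorRemovedRule({("core", "executor"): "MesosExecutor"}).check()
ok_msg = ExecutorRemovedRule({("core", "executor"): "SequentialExecutor"}).check()
```

Returning `None` rather than raising keeps the checker able to run every rule and print one consolidated status report.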
[airflow] 03/07: Add readme for upgrade-check "subpackage". (#12506)
This is an automated email from the ASF dual-hosted git repository. kaxilnaik pushed a commit to branch v1-10-test in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 27e79811a7cd62e495711d6d42fc2533f22857b0
Author: Ash Berlin-Taylor
AuthorDate: Fri Nov 20 15:21:05 2020 +

    Add readme for upgrade-check "subpackage". (#12506)

    The intent here is this content will be visible on the PyPI page for
    apache-airflow-upgrade-check (once it is published).

    This is far from perfect, but the _first_ step is the hardest :)

    (cherry picked from commit 47fc5c474684c9f59924837a7661e44af106943a)
---
 airflow/upgrade/README.md | 82 +++++++++++++++++++
 1 file changed, 82 insertions(+)

diff --git a/airflow/upgrade/README.md b/airflow/upgrade/README.md
new file mode 100644
index 000..e5e201d
--- /dev/null
+++ b/airflow/upgrade/README.md
@@ -0,0 +1,82 @@
+# Apache Airflow Upgrade Check
+
+[![PyPI version](https://badge.fury.io/py/apache-airflow-upgrade-check.svg)](https://badge.fury.io/py/apache-airflow-upgrade-check)
+[![License](http://img.shields.io/:license-Apache%202-blue.svg)](http://www.apache.org/licenses/LICENSE-2.0.txt)
+[![PyPI - Python Version](https://img.shields.io/pypi/pyversions/apache-airflow-upgrade-check.svg)](https://pypi.org/project/apache-airflow-upgrade-check/)
+[![PyPI - Downloads](https://img.shields.io/pypi/dm/apache-airflow-upgrade-check)](https://pypi.org/project/apache-airflow-upgrade-check/)
+[![Twitter Follow](https://img.shields.io/twitter/follow/ApacheAirflow.svg?style=social&label=Follow)](https://twitter.com/ApacheAirflow)
+[![Slack Status](https://img.shields.io/badge/slack-join_chat-white.svg?logo=slack&style=social)](https://s.apache.org/airflow-slack)
+
+This package aims to ease the upgrade journey from [Apache Airflow](https://airflow.apache.org/) 1.10 to 2.0.
+
+While we have put a lot of effort into making this upgrade as painless as possible, with many changes
+providing an upgrade path (where the old code continues to work and prints out a deprecation warning), there were
+unfortunately some breaking changes where we couldn't provide a compatibility shim.
+
+The recommended upgrade path to get to Airflow 2.0.0 is to first upgrade to the latest release in the 1.10
+series (at the time of writing: 1.10.13) and to then run this script.
+
+```bash
+pip install apache-airflow-upgrade-check
+airflow upgrade_check
+```
+
+This will then print out a number of action items that you should follow before upgrading to 2.0.0 or above.
+
+The exit code of the command will be 0 (success) if no problems are reported, or 1 otherwise.
+
+For example:
+
+```
+= STATUS =
+
+Legacy UI is deprecated by default..SUCCESS
+Users must set a kubernetes.pod_template_file value.FAIL
+Changes in import paths of hooks, operators, sensors and others.FAIL
+Remove airflow.AirflowMacroPlugin class.SUCCESS
+Jinja Template Variables cannot be undefinedSUCCESS
+Fernet is enabled by defaultFAIL
+Logging configuration has been moved to new section.SUCCESS
+Connection.conn_id is not uniqueSUCCESS
+GCP service account key deprecation.SUCCESS
+Users must delete deprecated configs for KubernetesExecutor.FAIL
+Changes in import path of remote task handlers..SUCCESS
+Chain between DAG and operator not allowed..SUCCESS
+SendGrid email uses old airflow.contrib module..SUCCESS
+Connection.conn_type is not nullableSUCCESS
+Found 16 problems.
+
+ RECOMMENDATIONS =
+
+Users must set a kubernetes.pod_template_file value
+---
+In Airflow 2.0, KubernetesExecutor Users need to set a pod_template_file as a base
+value for all pods launched by the KubernetesExecutor
+
+
+Problems:
+
+  1. Please create a pod_template_file by running `airflow generate_pod_template`.
+     This will generate a pod using your airflow.cfg settings
+
+...
+```
[airflow] 01/07: Create UndefinedJinjaVariablesRule (Resolves #11144) (#11241)
This is an automated email from the ASF dual-hosted git repository. kaxilnaik pushed a commit to branch v1-10-test in repository https://gitbox.apache.org/repos/asf/airflow.git commit 415cda438e1d21fb9ca3a64864218f711913b014 Author: Ashmeet Lamba AuthorDate: Thu Nov 19 16:33:06 2020 +0530 Create UndefinedJinjaVariablesRule (Resolves #11144) (#11241) Adding a rule to check for undefined jinja variables when upgrading to Airflow2.0 (cherry picked from commit 18100a0ec96692bb4d7c9e80f206b66a30c65e0d) --- airflow/models/dag.py | 4 +- airflow/upgrade/rules/undefined_jinja_varaibles.py | 153 .../rules/test_undefined_jinja_varaibles.py| 192 + 3 files changed, 347 insertions(+), 2 deletions(-) diff --git a/airflow/models/dag.py b/airflow/models/dag.py index 348e19d..a1908e3 100644 --- a/airflow/models/dag.py +++ b/airflow/models/dag.py @@ -224,7 +224,7 @@ class DAG(BaseDag, LoggingMixin): end_date=None, # type: Optional[datetime] full_filepath=None, # type: Optional[str] template_searchpath=None, # type: Optional[Union[str, Iterable[str]]] -template_undefined=jinja2.Undefined, # type: Type[jinja2.Undefined] +template_undefined=None, # type: Optional[Type[jinja2.Undefined]] user_defined_macros=None, # type: Optional[Dict] user_defined_filters=None, # type: Optional[Dict] default_args=None, # type: Optional[Dict] @@ -807,7 +807,7 @@ class DAG(BaseDag, LoggingMixin): # Default values (for backward compatibility) jinja_env_options = { 'loader': jinja2.FileSystemLoader(searchpath), -'undefined': self.template_undefined, +'undefined': self.template_undefined or jinja2.Undefined, 'extensions': ["jinja2.ext.do"], 'cache_size': 0 } diff --git a/airflow/upgrade/rules/undefined_jinja_varaibles.py b/airflow/upgrade/rules/undefined_jinja_varaibles.py new file mode 100644 index 000..b97cfbc --- /dev/null +++ b/airflow/upgrade/rules/undefined_jinja_varaibles.py @@ -0,0 +1,153 @@ +# Licensed to the Apache Software Foundation (ASF) under one +# or more contributor license agreements. 
See the NOTICE file +# distributed with this work for additional information +# regarding copyright ownership. The ASF licenses this file +# to you under the Apache License, Version 2.0 (the +# "License"); you may not use this file except in compliance +# with the License. You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, +# software distributed under the License is distributed on an +# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY +# KIND, either express or implied. See the License for the +# specific language governing permissions and limitations +# under the License. + +from __future__ import absolute_import + +import re + +import jinja2 +import six + +from airflow import conf +from airflow.models import DagBag, TaskInstance +from airflow.upgrade.rules.base_rule import BaseRule +from airflow.utils import timezone + + +class UndefinedJinjaVariablesRule(BaseRule): + +title = "Jinja Template Variables cannot be undefined" + +description = """\ +The default behavior for DAG's Jinja templates has changed. Now, more restrictive validation +of non-existent variables is applied - `jinja2.StrictUndefined`. + +The user should do either of the following to fix this - +1. Fix the Jinja Templates by defining every variable or providing default values +2. Explicitly declare `template_undefined=jinja2.Undefined` while defining the DAG +""" + +def _check_rendered_content(self, rendered_content, seen_oids=None): +"""Replicates the logic in BaseOperator.render_template() to +cover all the cases needed to be checked. 
+""" +if isinstance(rendered_content, six.string_types): +return set(re.findall(r"{{(.*?)}}", rendered_content)) + +elif isinstance(rendered_content, (int, float, bool)): +return set() + +elif isinstance(rendered_content, (tuple, list, set)): +debug_error_messages = set() +for element in rendered_content: + debug_error_messages.update(self._check_rendered_content(element)) +return debug_error_messages + +elif isinstance(rendered_content, dict): +debug_error_messages = set() +for key, value in rendered_content.items(): + debug_error_messages.update(self._check_rendered_content(value)) +return debug_error_messages + +else: +if seen_oids is None: +seen_oids = set() +return self._nested_check_rendered(rendered_content, seen_oids) + +def _nested_check_rendered(self, rendered_content, seen_oids): +debug_error_mess
[GitHub] [airflow] pbotros commented on issue #10435: Negsignal.SIGKILL error on macOS
pbotros commented on issue #10435: URL: https://github.com/apache/airflow/issues/10435#issuecomment-731461394 That's likely - indeed, switching to an EC2 instance with more RAM and running fewer concurrent tasks seems to have stopped it so far. Unfortunately I can't confirm since the logs are now gone, but I bet you're right. TIL about overcommit! https://www.etalabs.net/overcommit.html This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: us...@infra.apache.org
[GitHub] [airflow] github-actions[bot] commented on pull request #12522: Fix wait-for-migrations command in helm chart
github-actions[bot] commented on pull request #12522: URL: https://github.com/apache/airflow/pull/12522#issuecomment-731459401 The PR should be OK to be merged with just subset of tests as it does not modify Core of Airflow. The committers might merge it or can add a label 'full tests needed' and re-run it to run all tests if they see it is needed!
[GitHub] [airflow] kaxil closed issue #11354: "trying to overwrite a task will raise an exception"
kaxil closed issue #11354: URL: https://github.com/apache/airflow/issues/11354
[GitHub] [airflow] ashb commented on issue #10435: Negsignal.SIGKILL error on macOS
ashb commented on issue #10435: URL: https://github.com/apache/airflow/issues/10435#issuecomment-731454166 Are you also using a lot of memory? On Linux check `dmesg` or syslog and see if the OOMKiller is killing processes.
[GitHub] [airflow] ashb opened a new pull request #12522: Fix wait-for-migrations command in helm chart
ashb opened a new pull request #12522: URL: https://github.com/apache/airflow/pull/12522 If the migrations weren't yet applied this would fail with `NameError: name 'log' is not defined`. (I guess no one really noticed as the container would restart, and try again.) --- **^ Add meaningful description above** Read the **[Pull Request Guidelines](https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst#pull-request-guidelines)** for more information. In case of fundamental code change, Airflow Improvement Proposal ([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals)) is needed. In case of a new dependency, check compliance with the [ASF 3rd Party License Policy](https://www.apache.org/legal/resolved.html#category-x). In case of backwards incompatible changes please leave a note in [UPDATING.md](https://github.com/apache/airflow/blob/master/UPDATING.md).
[GitHub] [airflow] maroshmka commented on issue #12520: gcs_to_bq: Support schema in different bucket
maroshmka commented on issue #12520: URL: https://github.com/apache/airflow/issues/12520#issuecomment-731446573 Implemented draft, if its ok to have this feature, I can finish the system tests :)
[GitHub] [airflow] maroshmka opened a new pull request #12521: feat(gcs_to_bq): support schema in different bucket
maroshmka opened a new pull request #12521: URL: https://github.com/apache/airflow/pull/12521 Add possibility to specify different bucket when downloading schema file. closes: https://github.com/apache/airflow/issues/12520 related: https://github.com/apache/airflow/issues/8280
[GitHub] [airflow] maroshmka opened a new issue #12520: gcs_to_bq: Support schema in different bucket
maroshmka opened a new issue #12520: URL: https://github.com/apache/airflow/issues/12520 **Description** Schema path (schema_object) for table in `GCSToBigQueryOperator` is downloading schema from source bucket. New feature should also support pulling schemas from a different bucket. **Use case / motivation** I guess there are a lot of use-cases. To mention one concrete: - we want to have schemas at 1 place, for us its terraform, where we define all resources and ACLs - those schemas are exported to GCS by CI - we want to use them in loads to BQ tables In case there is a schema change, we make 1 change (in terraform) and it works. Otherwise we would need to do terraform change, apply, then either update `schema_fields` in operator and release it or to allow adding fields on table load (which we don't necessarily want all the time). **Related Issues** Not directly, but at least - https://github.com/apache/airflow/issues/8280 - polish system tests to operator while at it
[GitHub] [airflow] ashb commented on issue #11112: Release airflow upgrade_check as separate package
ashb commented on issue #11112: URL: https://github.com/apache/airflow/issues/11112#issuecomment-731445394 Release created, but still some tidy up of docs/process left to do so it is repeatable.
[GitHub] [airflow] eladkal closed issue #9025: default Timezone setting for Web UI (non-RBAC)
eladkal closed issue #9025: URL: https://github.com/apache/airflow/issues/9025
[GitHub] [airflow] eladkal commented on issue #9025: default Timezone setting for Web UI (non-RBAC)
eladkal commented on issue #9025: URL: https://github.com/apache/airflow/issues/9025#issuecomment-731441645 The non-RBAC UI is deprecated in Airflow 1.10 and removed from Airflow 2.0 No new features will be added to the non-RBAC UI.
[GitHub] [airflow] tomasfarias commented on pull request #11964: Add new datetime branch operator
tomasfarias commented on pull request #11964: URL: https://github.com/apache/airflow/pull/11964#issuecomment-731434999 Not sure why MySQL build is failing, all tests appear to be passing according to logs.
[airflow] annotated tag upgrade-check/1.0.0rc1 updated (deb7fc0 -> 1eed191)
This is an automated email from the ASF dual-hosted git repository. ash pushed a change to annotated tag upgrade-check/1.0.0rc1 in repository https://gitbox.apache.org/repos/asf/airflow.git. *** WARNING: tag upgrade-check/1.0.0rc1 was modified! *** from deb7fc0 (commit) to 1eed191 (tag) tagging deb7fc0ffe3ddb9bf9aad6f5f9479d20598e2fb5 (commit) replaces 1.10.12 by Ash Berlin-Taylor on Fri Nov 20 22:01:04 2020 + - Log - Release 1.0.0rc1 of apache-airflow-upgrade-check sub package -BEGIN PGP SIGNATURE- iQJDBAABCAAtFiEEXMrqx1jtZMoyPwU7gHxzGoyCoJUFAl+4PKIPHGFzaEBhcGFj aGUub3JnAAoJEIB8cxqMgqCVw6IP/RksrU+DuyrzRBrBmb+Xv5SwO/vqhlzSmR7W dg6+7qlX0Q6VfHECpdyUPSa+1quvIzd8Mohm/QUPa6H6ZRQ+ZIeSM6ygPFJ4OCDV IyPHOfHo/d3ck3q5odg8t4f+H43oj5VWdVxAAnb9vegdHQWB67+5pYnUX7YYTTjF d0Hnu7rFRD+FJsfX4mWso4JJ1tuTv05SrtfZS5Fb38aYx/XIHJn5vo6IU3tkplyf 3w7HJ9Lhfi9gWHWQq9KtWLVgWxDDSSxXW9MY1NipZ5j33hw6MZ6CcYDSTaypvWrR B2vTcGAaC0RjNjhWOi/ulK5+cKaATcDIPMgQdBOE0KCrI2Jy4f90qVwX0CqOzqd+ SBOU3lpbucCc3Xj42u5Am4AyGn0IPu6zpymfbcjyf9NkPySZYpjwzIlhiVeweD/y i3FfF+W158BhKQUihVLls5Xqib0D2SDhKpR/oRehUKdBIxp9BTeR31klyrdDl8JR ip49aBxTGq6GupejJsBUYrwCLu28bx/7jZY7ad+ExOyshJEvu9RWiWQE8GJYENuc +JzI7wHbAnotkcw2c47z28BeP2k/PD2TSTAN4SiPk8Xuky9Y7bioAO7V83ra6VUH zXons3c1wHKQAxjqht/yXjLccZiH5Aqd7RN6Nlh77DPQq8ZX+f6EqOlTBUr5VP/U VzbyxK3g =Jlzi -END PGP SIGNATURE- --- No new revisions were added by this update. Summary of changes:
svn commit: r44604 - in /dev/airflow/upgrade-check: ./ 1.0.0rc1/
Author: ash Date: Fri Nov 20 22:06:53 2020 New Revision: 44604 Log: Artifacts for apache-airflow-upgrade-check-1.0.0rc1 Added: dev/airflow/upgrade-check/ dev/airflow/upgrade-check/1.0.0rc1/ dev/airflow/upgrade-check/1.0.0rc1/apache-airflow-upgrade-check-1.0.0rc1-bin.tar.gz (with props) dev/airflow/upgrade-check/1.0.0rc1/apache-airflow-upgrade-check-1.0.0rc1-bin.tar.gz.asc (with props) dev/airflow/upgrade-check/1.0.0rc1/apache-airflow-upgrade-check-1.0.0rc1-bin.tar.gz.sha512 dev/airflow/upgrade-check/1.0.0rc1/apache-airflow-upgrade-check-1.0.0rc1-source.tar.gz (with props) dev/airflow/upgrade-check/1.0.0rc1/apache-airflow-upgrade-check-1.0.0rc1-source.tar.gz.asc (with props) dev/airflow/upgrade-check/1.0.0rc1/apache-airflow-upgrade-check-1.0.0rc1-source.tar.gz.sha512 dev/airflow/upgrade-check/1.0.0rc1/apache_airflow_upgrade_check-1.0.0rc1-py2.py3-none-any.whl (with props) dev/airflow/upgrade-check/1.0.0rc1/apache_airflow_upgrade_check-1.0.0rc1-py2.py3-none-any.whl.asc (with props) dev/airflow/upgrade-check/1.0.0rc1/apache_airflow_upgrade_check-1.0.0rc1-py2.py3-none-any.whl.sha512 Added: dev/airflow/upgrade-check/1.0.0rc1/apache-airflow-upgrade-check-1.0.0rc1-bin.tar.gz == Binary file - no diff available. Propchange: dev/airflow/upgrade-check/1.0.0rc1/apache-airflow-upgrade-check-1.0.0rc1-bin.tar.gz -- svn:mime-type = application/gzip Added: dev/airflow/upgrade-check/1.0.0rc1/apache-airflow-upgrade-check-1.0.0rc1-bin.tar.gz.asc == Binary file - no diff available. 
Propchange: dev/airflow/upgrade-check/1.0.0rc1/apache-airflow-upgrade-check-1.0.0rc1-bin.tar.gz.asc -- svn:mime-type = application/pgp-signature Added: dev/airflow/upgrade-check/1.0.0rc1/apache-airflow-upgrade-check-1.0.0rc1-bin.tar.gz.sha512 == --- dev/airflow/upgrade-check/1.0.0rc1/apache-airflow-upgrade-check-1.0.0rc1-bin.tar.gz.sha512 (added) +++ dev/airflow/upgrade-check/1.0.0rc1/apache-airflow-upgrade-check-1.0.0rc1-bin.tar.gz.sha512 Fri Nov 20 22:06:53 2020 @@ -0,0 +1,3 @@ +apache-airflow-upgrade-check-1.0.0rc1-bin.tar.gz: +72C50C41 D9BF79E4 0DB98AA6 135FA619 920239F7 16C7DFE0 580B40A0 8FBF151A D2F08691 + 4BAA8DA5 0270123C 48581D6D C002DE2A 0C4AC732 AEF33DD3 B12D8719 Added: dev/airflow/upgrade-check/1.0.0rc1/apache-airflow-upgrade-check-1.0.0rc1-source.tar.gz == Binary file - no diff available. Propchange: dev/airflow/upgrade-check/1.0.0rc1/apache-airflow-upgrade-check-1.0.0rc1-source.tar.gz -- svn:mime-type = application/gzip Added: dev/airflow/upgrade-check/1.0.0rc1/apache-airflow-upgrade-check-1.0.0rc1-source.tar.gz.asc == Binary file - no diff available. Propchange: dev/airflow/upgrade-check/1.0.0rc1/apache-airflow-upgrade-check-1.0.0rc1-source.tar.gz.asc -- svn:mime-type = application/pgp-signature Added: dev/airflow/upgrade-check/1.0.0rc1/apache-airflow-upgrade-check-1.0.0rc1-source.tar.gz.sha512 == --- dev/airflow/upgrade-check/1.0.0rc1/apache-airflow-upgrade-check-1.0.0rc1-source.tar.gz.sha512 (added) +++ dev/airflow/upgrade-check/1.0.0rc1/apache-airflow-upgrade-check-1.0.0rc1-source.tar.gz.sha512 Fri Nov 20 22:06:53 2020 @@ -0,0 +1,3 @@ +apache-airflow-upgrade-check-1.0.0rc1-source.tar.gz: +4F67A04C 70C5B962 B82E4703 90D6EC99 B9031268 15C4A7CD 53A32BB2 D6A71DF0 6A175673 + A2291944 0248681F 7D06C30C A418F470 DC4A60F3 70014B55 7A13FE79 Added: dev/airflow/upgrade-check/1.0.0rc1/apache_airflow_upgrade_check-1.0.0rc1-py2.py3-none-any.whl == Binary file - no diff available. 
Propchange: dev/airflow/upgrade-check/1.0.0rc1/apache_airflow_upgrade_check-1.0.0rc1-py2.py3-none-any.whl -- svn:mime-type = application/zip Added: dev/airflow/upgrade-check/1.0.0rc1/apache_airflow_upgrade_check-1.0.0rc1-py2.py3-none-any.whl.asc == Binary file - no diff available. Propchange: dev/airflow/upgrade-check/1.0.0rc1/apache_airflow_upgrade_check-1.0.0rc1-py2.py3-none-any.whl.asc --
[airflow] branch v1-10-stable updated: Add setup.cfg for apache-airflow-upgrade-check (#12517)
This is an automated email from the ASF dual-hosted git repository. ash pushed a commit to branch v1-10-stable in repository https://gitbox.apache.org/repos/asf/airflow.git The following commit(s) were added to refs/heads/v1-10-stable by this push: new deb7fc0 Add setup.cfg for apache-airflow-upgrade-check (#12517) deb7fc0 is described below commit deb7fc0ffe3ddb9bf9aad6f5f9479d20598e2fb5 Author: Ash Berlin-Taylor AuthorDate: Fri Nov 20 21:55:22 2020 + Add setup.cfg for apache-airflow-upgrade-check (#12517) Nothing currently uses this setup.cfg from this folder -- automation for that will follow shortly. Now that there is a place list deps for upgrade-check I have moved `packaging` and `importlib_meta` to test_requires of the main dist. Build a py2+py3 wheel. --- airflow/upgrade/setup.cfg | 64 +++ setup.py | 4 ++- 2 files changed, 67 insertions(+), 1 deletion(-) diff --git a/airflow/upgrade/setup.cfg b/airflow/upgrade/setup.cfg new file mode 100644 index 000..ddd708c --- /dev/null +++ b/airflow/upgrade/setup.cfg @@ -0,0 +1,64 @@ +# Licensed to the Apache Software Foundation (ASF) under one +# or more contributor license agreements. See the NOTICE file +# distributed with this work for additional information +# regarding copyright ownership. The ASF licenses this file +# to you under the Apache License, Version 2.0 (the +# "License"); you may not use this file except in compliance +# with the License. You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, +# software distributed under the License is distributed on an +# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY +# KIND, either express or implied. See the License for the +# specific language governing permissions and limitations +# under the License. 
+
+[metadata]
+version=1.0.0
+name = apache-airflow-upgrade-check
+description = Check for compatibility between Airflow versions
+long_description = file: airflow/upgrade/README.md
+long_description_content_type = text/markdown
+url = https://airflow.apache.org
+author = Apache Airflow PMC
+author-email = d...@airflow.apache.org
+license = Apache License 2.0
+license_files =
+    LICENSE
+    NOTICE
+classifiers =
+    Development Status :: 5 - Production/Stable
+    Intended Audience :: Developers
+    License :: OSI Approved :: Apache Software License
+    Programming Language :: Python :: 2.7
+    Programming Language :: Python :: 3
+    Programming Language :: Python :: 3.6
+    Programming Language :: Python :: 3.7
+    Programming Language :: Python :: 3.8
+keywords = airflow, upgrade
+project_urls =
+    Source Code=https://github.com/apache/airflow
+    Bug Tracker=https://github.com/apache/airflow/issues
+    Documentation=https://airflow.apache.org/docs/
+
+[options]
+packages = find:
+install_requires =
+    apache-airflow>=1.10.13,<3
+    importlib-metadata~=2.0; python_version<"3.8"
+    packaging
+python_requires = >=2.7,!=3.0.*,!=3.1.*,!=3.2.*,!=3.3.*,!=3.4.*
+setup_requires =
+    setuptools>=40.0
+    wheel
+zip_safe = no
+
+[options.packages.find]
+include =
+    airflow.upgrade
+    airflow.upgrade.*
+
+[bdist_wheel]
+universal=1
diff --git a/setup.py b/setup.py
index 12a633e..370a92f 100644
--- a/setup.py
+++ b/setup.py
@@ -426,12 +426,14 @@ devel = [
     'flaky',
     'freezegun',
     'gitpython',
+    'importlib-metadata~=2.0; python_version<"3.8"',
     'ipdb',
     'jira',
     'mock;python_version<"3.3"',
     'mongomock',
     'moto==1.3.14',  # TODO - fix Datasync issues to get higher version of moto:
     # See: https://github.com/apache/airflow/issues/10985
+    'packaging',
     'parameterized',
     'paramiko',
     'pre-commit',
@@ -445,7 +447,7 @@ devel = [
     'pywinrm',
     'qds-sdk>=1.9.6',
     'requests_mock',
-    'yamllint'
+    'yamllint',
 ]

# IMPORTANT NOTE!!!
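Since the new setup.cfg is plain INI, its effect can be sanity-checked without setuptools at all. Below is a minimal sketch, assuming only the Python standard library, that parses a trimmed copy of the file from the commit above (the `SETUP_CFG` string and `deps` variable are ours, not part of the commit):

```python
import configparser

# A trimmed copy of the new airflow/upgrade/setup.cfg from the commit above,
# parsed with the standard library to show how setuptools will read it.
SETUP_CFG = """
[metadata]
version = 1.0.0
name = apache-airflow-upgrade-check

[options]
install_requires =
    apache-airflow>=1.10.13,<3
    importlib-metadata~=2.0; python_version<"3.8"
    packaging
zip_safe = no

[bdist_wheel]
universal = 1
"""

parser = configparser.ConfigParser()
parser.read_string(SETUP_CFG)

print(parser["metadata"]["name"])  # apache-airflow-upgrade-check

# Multi-line values come back newline-joined; drop the empty leading entry.
deps = [d for d in parser["options"]["install_requires"].splitlines() if d]
print(deps[0])  # apache-airflow>=1.10.13,<3

# universal = 1 is what makes `bdist_wheel` emit a py2.py3-none-any wheel.
print(parser["bdist_wheel"]["universal"])  # 1
```

The environment marker on `importlib-metadata` means the backport is only installed on Pythons older than 3.8, where `importlib.metadata` is not in the standard library.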
[GitHub] [airflow] ashb merged pull request #12517: Add setup.cfg for apache-airflow-upgrade-check
ashb merged pull request #12517: URL: https://github.com/apache/airflow/pull/12517 This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: us...@infra.apache.org
[airflow] branch master updated: Fix the default value for VaultBackend's config_path (#12518)
This is an automated email from the ASF dual-hosted git repository. kaxilnaik pushed a commit to branch master in repository https://gitbox.apache.org/repos/asf/airflow.git The following commit(s) were added to refs/heads/master by this push: new 36a9b0f Fix the default value for VaultBackend's config_path (#12518) 36a9b0f is described below commit 36a9b0f48baf4a8ef8fc02a450a279948a8c0f02 Author: Kaxil Naik AuthorDate: Fri Nov 20 21:52:28 2020 + Fix the default value for VaultBackend's config_path (#12518) It is `config` not `configs` --- airflow/providers/hashicorp/secrets/vault.py | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/airflow/providers/hashicorp/secrets/vault.py b/airflow/providers/hashicorp/secrets/vault.py index 776d8b9..f745c83 100644 --- a/airflow/providers/hashicorp/secrets/vault.py +++ b/airflow/providers/hashicorp/secrets/vault.py @@ -51,7 +51,7 @@ class VaultBackend(BaseSecretsBackend, LoggingMixin): (default: 'variables'). If set to None (null), requests for variables will not be sent to Vault. :type variables_path: str :param config_path: Specifies the path of the secret to read Airflow Configurations -(default: 'configs'). If set to None (null), requests for configurations will not be sent to Vault. +(default: 'config'). If set to None (null), requests for configurations will not be sent to Vault. :type config_path: str :param url: Base URL for the Vault instance being addressed. :type url: str
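The docstring fix matters because users copy defaults from it into their Vault layout. Illustrative only: the sketch below is a simplified stand-in for how a secrets backend joins its configured path with a key — `build_vault_path` is a hypothetical helper, not the VaultBackend API; the point is the corrected default, `'config'` rather than `'configs'`:

```python
# Hypothetical helper, NOT the VaultBackend API: shows how a base path such as
# config_path (default 'config') combines with a key, and how None disables lookups.
def build_vault_path(base_path, key):
    if base_path is None:  # None disables Vault lookups for this category
        return None
    return "{}/{}".format(base_path, key)

# With the documented default ('config', not 'configs'):
print(build_vault_path("config", "sql_alchemy_conn"))  # config/sql_alchemy_conn

# Setting config_path to None skips Vault entirely for configuration keys:
print(build_vault_path(None, "sql_alchemy_conn"))      # None
```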
[GitHub] [airflow] kaxil merged pull request #12518: Fix the default value for VaultBackend's config_path
kaxil merged pull request #12518: URL: https://github.com/apache/airflow/pull/12518 This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: us...@infra.apache.org
[GitHub] [airflow] kaxil commented on pull request #12518: Fix the default value for VaultBackend's config_path
kaxil commented on pull request #12518: URL: https://github.com/apache/airflow/pull/12518#issuecomment-731426061 Looks like it was a transient failure, passed this time This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: us...@infra.apache.org
[GitHub] [airflow] kaxil commented on issue #11901: DAGs remain in the UI after renaming the dag_id in the same python file
kaxil commented on issue #11901: URL: https://github.com/apache/airflow/issues/11901#issuecomment-731425615 This needs fixing in Master and is not that critical, so will fix it in 2.0.0 This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: us...@infra.apache.org
[GitHub] [airflow] mik-laj commented on pull request #12498: Impala hook implention
mik-laj commented on pull request #12498: URL: https://github.com/apache/airflow/pull/12498#issuecomment-731424790 Building documentation for new providers is broken, but that's what I'm working on. This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: us...@infra.apache.org
[GitHub] [airflow] github-actions[bot] removed a comment on pull request #12516: Housekeeping for www/security.py
github-actions[bot] removed a comment on pull request #12516: URL: https://github.com/apache/airflow/pull/12516#issuecomment-731415020 This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: us...@infra.apache.org
[airflow] branch v1-10-test updated: Fix the default value for VaultBackend's config_path (#12518)
This is an automated email from the ASF dual-hosted git repository. kaxilnaik pushed a commit to branch v1-10-test in repository https://gitbox.apache.org/repos/asf/airflow.git The following commit(s) were added to refs/heads/v1-10-test by this push: new 4b299e8 Fix the default value for VaultBackend's config_path (#12518) 4b299e8 is described below commit 4b299e8d75172b148809f017367ecfb57445ca66 Author: Kaxil Naik AuthorDate: Fri Nov 20 21:24:50 2020 + Fix the default value for VaultBackend's config_path (#12518) It is `config` not `configs` --- airflow/contrib/secrets/hashicorp_vault.py | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/airflow/contrib/secrets/hashicorp_vault.py b/airflow/contrib/secrets/hashicorp_vault.py index 536e7f9..edf48c3 100644 --- a/airflow/contrib/secrets/hashicorp_vault.py +++ b/airflow/contrib/secrets/hashicorp_vault.py @@ -56,7 +56,7 @@ class VaultBackend(BaseSecretsBackend, LoggingMixin): (default: 'variables') :type variables_path: str :param config_path: Specifies the path of the secret to read Airflow Configurations -(default: 'configs'). +(default: 'config'). :type config_path: str :param url: Base URL for the Vault instance being addressed. :type url: str
[GitHub] [airflow] github-actions[bot] commented on pull request #12516: Housekeeping for www/security.py
github-actions[bot] commented on pull request #12516: URL: https://github.com/apache/airflow/pull/12516#issuecomment-731415020 [The Workflow run](https://github.com/apache/airflow/actions/runs/375182012) is cancelling this PR. It has some failed jobs matching ^Pylint$,^Static checks,^Build docs$,^Spell check docs$,^Backport packages$,^Provider packages,^Checks: Helm tests$,^Test OpenAPI*. This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: us...@infra.apache.org
[GitHub] [airflow] kaxil closed issue #11149: SQL Alchemy Conn Secret Fails to parse
kaxil closed issue #11149: URL: https://github.com/apache/airflow/issues/11149 This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: us...@infra.apache.org
[GitHub] [airflow] github-actions[bot] commented on pull request #12516: Housekeeping for www/security.py
github-actions[bot] commented on pull request #12516: URL: https://github.com/apache/airflow/pull/12516#issuecomment-731414636 [The Workflow run](https://github.com/apache/airflow/actions/runs/375180530) is cancelling this PR. It has some failed jobs matching ^Pylint$,^Static checks,^Build docs$,^Spell check docs$,^Backport packages$,^Provider packages,^Checks: Helm tests$,^Test OpenAPI*. This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: us...@infra.apache.org
[GitHub] [airflow] github-actions[bot] commented on pull request #12518: Fix the default value for VaultBackend's config_path
github-actions[bot] commented on pull request #12518: URL: https://github.com/apache/airflow/pull/12518#issuecomment-731414641 [The Workflow run](https://github.com/apache/airflow/actions/runs/375180530) is cancelling this PR. It has some failed jobs matching ^Pylint$,^Static checks,^Build docs$,^Spell check docs$,^Backport packages$,^Provider packages,^Checks: Helm tests$,^Test OpenAPI*. This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: us...@infra.apache.org
[airflow] branch v1-10-stable updated: Add upgrade check rule to ensure on "latest" versions (#12514)
This is an automated email from the ASF dual-hosted git repository. ash pushed a commit to branch v1-10-stable in repository https://gitbox.apache.org/repos/asf/airflow.git The following commit(s) were added to refs/heads/v1-10-stable by this push: new d7ace02 Add upgrade check rule to ensure on "latest" versions (#12514) d7ace02 is described below commit d7ace0267c34bd5520d321a495399710c1c49cd1 Author: Ash Berlin-Taylor AuthorDate: Fri Nov 20 21:21:13 2020 + Add upgrade check rule to ensure on "latest" versions (#12514) This checks against PyPI to make sure this is run with the latest non-preview release of apache-airflow-upgrade-check, and the latest 1.10.x of apache-airflow
---
 airflow/upgrade/rules/__init__.py                  |  2 +-
 airflow/upgrade/rules/aaa_airflow_version_check.py | 87 ++
 setup.py                                           |  2 +
 .../rules/test_aaa_airflow_version_check.py        | 75 +++
 4 files changed, 165 insertions(+), 1 deletion(-)

diff --git a/airflow/upgrade/rules/__init__.py b/airflow/upgrade/rules/__init__.py
index 4735c7f..97d0160 100644
--- a/airflow/upgrade/rules/__init__.py
+++ b/airflow/upgrade/rules/__init__.py
@@ -21,7 +21,7 @@ def get_rules():
     """Automatically discover all rules"""
     rule_classes = []
     path = os.path.dirname(os.path.abspath(__file__))
-    for file in os.listdir(path):
+    for file in sorted(os.listdir(path)):
         if not file.endswith(".py") or file in ("__init__.py", "base_rule.py"):
             continue
         py_file = file[:-3]
diff --git a/airflow/upgrade/rules/aaa_airflow_version_check.py b/airflow/upgrade/rules/aaa_airflow_version_check.py
new file mode 100644
index 000..ad84eb4
--- /dev/null
+++ b/airflow/upgrade/rules/aaa_airflow_version_check.py
@@ -0,0 +1,87 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements. See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership. The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License. You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied. See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+# This module starts with `aaa_` so that it is sorted first alphabetically, but is still a valid python module
+# name (starting with digits is not valid)
+
+from __future__ import absolute_import
+
+from packaging.version import Version
+import requests
+
+from airflow.upgrade.rules.base_rule import BaseRule
+
+try:
+    import importlib.metadata as importlib_metadata
+except ImportError:
+    import importlib_metadata
+
+
+class VersionCheckRule(BaseRule):
+
+    title = "Check for latest versions of apache-airflow and checker"
+
+    description = """\
+Check that the latest version of apache-airflow-upgrade-check is installed, and
+that you are on the latest 1.10.x release of apache-airflow."""
+
+    def pypi_releases(self, distname):
+        """
+        Get all the non-dev releases of a dist from PyPI
+        """
+
+        resp = requests.get("https://pypi.org/pypi/{}/json".format(distname))
+        resp.raise_for_status()
+
+        for rel_string in resp.json()["releases"].keys():
+            ver = Version(rel_string)
+            if ver.is_devrelease or ver.is_prerelease:
+                continue
+            yield ver
+
+    def check(self):
+
+        current_airflow_version = Version(__import__("airflow").__version__)
+        try:
+            upgrade_check_ver = Version(
+                importlib_metadata.distribution("apache-airflow-upgrade-check").version,
+            )
+        except importlib_metadata.PackageNotFoundError:
+            upgrade_check_ver = Version("0.0.0")
+
+        try:
+            latest_airflow_v1_release = sorted(
+                filter(lambda v: v.major == 1, self.pypi_releases("apache-airflow"))
+            )[-1]
+
+            if current_airflow_version < latest_airflow_v1_release:
+                yield (
+                    "There is a more recent version of apache-airflow. Please upgrade to {} and re-run this"
+                    " script"
+                ).format(latest_airflow_v1_release)
+
+            latest_upgrade_check_release = sorted(
+                self.pypi_releases("apache-airflow-upgrade-check")
+            )[-1]
+
+            if upgrade_check_ver < latest_upgrade_check_release:
+                yield (
+                    "There is a more recent version of apache-airflow-upgrade-check. Please upgrade to
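The selection logic in `check()` boils down to: drop dev/pre-releases, keep only the 1.x line, and take the newest. A stdlib-only sketch of that logic follows — it uses simplified string-based version parsing rather than `packaging.Version`, and the helper names (`parse`, `latest_stable_v1`) are ours, not the rule's:

```python
# Simplified stand-in for the version selection in VersionCheckRule.check():
# filter out dev/pre-releases, keep 1.x only, pick the newest. Assumes plain
# dotted release strings with optional rc/b/dev suffixes.
def parse(ver):
    """Turn '1.10.13' into (1, 10, 13) for ordering."""
    return tuple(int(p) for p in ver.split("."))

def latest_stable_v1(releases):
    # Crude pre-release detection, standing in for Version.is_prerelease/is_devrelease
    stable = [v for v in releases if "rc" not in v and "b" not in v and "dev" not in v]
    v1 = [v for v in stable if parse(v)[0] == 1]  # keep the 1.x line only
    return max(v1, key=parse) if v1 else None

releases = ["1.10.12", "1.10.13", "2.0.0b3", "1.10.13rc1", "2.0.0.dev0"]
print(latest_stable_v1(releases))  # 1.10.13
```

The real rule does the same thing with `packaging.Version` (which handles PEP 440 correctly) against the live PyPI JSON API.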
[GitHub] [airflow] github-actions[bot] commented on pull request #12516: Housekeeping for www/security.py
github-actions[bot] commented on pull request #12516: URL: https://github.com/apache/airflow/pull/12516#issuecomment-731413344 [The Workflow run](https://github.com/apache/airflow/actions/runs/375175218) is cancelling this PR. It has some failed jobs matching ^Pylint$,^Static checks,^Build docs$,^Spell check docs$,^Backport packages$,^Provider packages,^Checks: Helm tests$,^Test OpenAPI*. This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: us...@infra.apache.org
[GitHub] [airflow] ashb merged pull request #12514: Add upgrade check rule to ensure on "latest" versions
ashb merged pull request #12514: URL: https://github.com/apache/airflow/pull/12514 This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: us...@infra.apache.org
[GitHub] [airflow] github-actions[bot] commented on pull request #12518: Fix the default value for VaultBackend's config_path
github-actions[bot] commented on pull request #12518: URL: https://github.com/apache/airflow/pull/12518#issuecomment-731413355 [The Workflow run](https://github.com/apache/airflow/actions/runs/375175218) is cancelling this PR. It has some failed jobs matching ^Pylint$,^Static checks,^Build docs$,^Spell check docs$,^Backport packages$,^Provider packages,^Checks: Helm tests$,^Test OpenAPI*. This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: us...@infra.apache.org
[GitHub] [airflow] ashb commented on pull request #12518: Fix the default value for VaultBackend's config_path
ashb commented on pull request #12518: URL: https://github.com/apache/airflow/pull/12518#issuecomment-731412625 Not quite sure why spell check build failed -- I can't find the error amongst all the spew about intersphinx This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: us...@infra.apache.org
[GitHub] [airflow] XD-DENG commented on a change in pull request #12516: Housekeeping for www/security.py
XD-DENG commented on a change in pull request #12516: URL: https://github.com/apache/airflow/pull/12516#discussion_r527969425 ## File path: airflow/www/security.py ## @@ -520,7 +520,6 @@ def update_admin_perm_view(self): :return: None. """ -all_dag_view = self.find_view_menu(permissions.RESOURCE_DAG) dag_pvs = ( self.get_session.query(sqla_models.ViewMenu) .filter(sqla_models.ViewMenu.name.like(f"{permissions.RESOURCE_DAG_PREFIX}%")) Review comment: A bit more clarification for this change: If the ViewMenu name is like `RESOURCE_DAG_PREFIX` ("DAG:"), we can already be very sure that the resulting `pv_ids` doesn't include the id of `RESOURCE_DAG ` ("DAGs"). Hence in the following SQL query, we don't need to have the 2nd condition in the `and_()` Let me know if this doesn't make sense to you. Thanks. This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: us...@infra.apache.org
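The reasoning in the review comment above can be sanity-checked in a few lines: any name matching the `"DAG:"` prefix filter can never equal the `"DAGs"` resource, so excluding it again in the query was redundant. The `view_menus` names below are illustrative, not taken from a real Airflow database:

```python
# Mirrors the argument from the review comment: a LIKE "DAG:%" filter
# already excludes the "DAGs" resource, so the second and_() condition
# in the SQL query added no information.
RESOURCE_DAG = "DAGs"
RESOURCE_DAG_PREFIX = "DAG:"

view_menus = ["DAGs", "DAG:example_dag", "DAG:tutorial", "Connections"]

# Plain-Python equivalent of ViewMenu.name.like("DAG:%")
dag_level_views = [v for v in view_menus if v.startswith(RESOURCE_DAG_PREFIX)]

print(dag_level_views)                  # ['DAG:example_dag', 'DAG:tutorial']
print(RESOURCE_DAG in dag_level_views)  # False
```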
[GitHub] [airflow] github-actions[bot] commented on pull request #12518: Fix the default value for VaultBackend's config_path
github-actions[bot] commented on pull request #12518: URL: https://github.com/apache/airflow/pull/12518#issuecomment-731406710 The PR should be OK to be merged with just subset of tests as it does not modify Core of Airflow. The committers might merge it or can add a label 'full tests needed' and re-run it to run all tests if they see it is needed! This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: us...@infra.apache.org
[GitHub] [airflow] pcandoalmeida commented on issue #11053: Create CustomExecutorsRequireFullPathRule to ease upgrade to Airflow 2.0
pcandoalmeida commented on issue #11053: URL: https://github.com/apache/airflow/issues/11053#issuecomment-731403875 Hi @dimberman yes by all means! This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: us...@infra.apache.org
[GitHub] [airflow] github-actions[bot] commented on pull request #12517: Add setup.cfg for apache-airflow-upgrade-check
github-actions[bot] commented on pull request #12517: URL: https://github.com/apache/airflow/pull/12517#issuecomment-731403113 The PR needs to run all tests because it modifies core of Airflow! Please rebase it to latest master or ask committer to re-run it! This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: us...@infra.apache.org