[GitHub] [airflow] DreamyWen edited a comment on issue #7935: scheduler gets stuck without a trace
DreamyWen edited a comment on issue #7935: URL: https://github.com/apache/airflow/issues/7935#issuecomment-786473112 Seeing this on 1.10.14 + CeleryExecutor + Python 3.8. Will this be fixed on 1.10.x? For some reason our company has to use MySQL 5.6.
```
ps -ef | grep airflow
```
```
root  9522      1  1 15:24 ?  00:00:13 /data/anaconda3/envs/airflow/bin/python /data/anaconda3/envs/airflow/bin/airflow webserver -D
root  9528      1  0 15:24 ?  00:00:00 gunicorn: master [airflow-webserver]
root 21238      1  0 15:31 ?  00:00:04 /data/anaconda3/envs/airflow/bin/python /data/anaconda3/envs/airflow/bin/airflow scheduler -D
root 21239  21238  1 15:31 ?  00:00:09 airflow scheduler -- DagFileProcessorManager
root 38695   9528  1 15:42 ?  00:00:01 [ready] gunicorn: worker [airflow-webserver]
root 39492   9528  2 15:43 ?  00:00:01 [ready] gunicorn: worker [airflow-webserver]
root 39644   9528  4 15:43 ?  00:00:01 [ready] gunicorn: worker [airflow-webserver]
root 40455   9528 51 15:44 ?  00:00:01 [ready] gunicorn: worker [airflow-webserver]
root 40503  21239  0 15:44 ?  00:00:00 [airflow schedul]
root 40504  21239  0 15:44 ?  00:00:00 [airflow schedul]
```
The defunct [airflow schedul] process keeps restarting all the time. This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: us...@infra.apache.org
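The [airflow schedul] entries above are defunct (zombie) children of the DagFileProcessorManager. One way to confirm a zombie is to check the STAT column of `ps`; the sketch below parses that output in Python (the helper name and command-line flags are illustrative, not part of Airflow):

```python
import subprocess


def find_zombies(ps_output: str):
    """Return (pid, command) pairs whose process state starts with 'Z'
    (defunct/zombie) from `ps -eo pid,stat,comm` style output."""
    zombies = []
    for line in ps_output.strip().splitlines()[1:]:  # skip the header row
        pid, stat, comm = line.split(None, 2)
        if stat.startswith("Z"):
            zombies.append((int(pid), comm))
    return zombies


if __name__ == "__main__":
    # List live zombie processes on this host (Linux/procps).
    out = subprocess.run(
        ["ps", "-eo", "pid,stat,comm"], capture_output=True, text=True
    ).stdout
    for pid, comm in find_zombies(out):
        print(f"zombie: {pid} {comm}")
```

A steady stream of new zombies under the same parent PID, as in the listing above, suggests the manager is spawning children that die before being reaped.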
[GitHub] [airflow] ephraimbuddy commented on issue #14421: NULL values in the operator column of task_instance table cause API validation failures
ephraimbuddy commented on issue #14421: URL: https://github.com/apache/airflow/issues/14421#issuecomment-786438152 Hi @zachliu, my guess is that you didn't run `airflow db upgrade` after upgrading, because I was not able to reproduce this.
[GitHub] [airflow] yohei1126 commented on pull request #14388: Pass region name to superclass of AWSGlueJobHook
yohei1126 commented on pull request #14388: URL: https://github.com/apache/airflow/pull/14388#issuecomment-786433021 There are some errors. https://github.com/apache/airflow/runs/1970983265 https://github.com/apache/airflow/runs/1970983080
```
Requirement already satisfied: zipp>=0.5 in /home/runner/.local/lib/python3.6/site-packages (from importlib-metadata->pre-commit) (3.4.0)
./scripts/ci/static_checks/run_static_checks.sh: /home/runner/.local/bin/pre-commit: /opt/hostedtoolcache/Python/3.6.12/x64/bin/python: bad interpreter: No such file or directory
```
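The "bad interpreter" error above means the shebang line of the installed `pre-commit` script points at a Python interpreter that no longer exists in the runner's toolcache. A small diagnostic sketch (the helper name is mine, not part of any tool):

```python
import os


def missing_interpreter(script_path):
    """Return the shebang interpreter path if it does not exist on disk,
    else None (also None if the file has no shebang at all)."""
    with open(script_path) as f:
        first_line = f.readline().strip()
    if not first_line.startswith("#!"):
        return None
    interpreter = first_line[2:].split()[0]
    return interpreter if not os.path.exists(interpreter) else None
```

Running this against `/home/runner/.local/bin/pre-commit` would report the stale `/opt/hostedtoolcache/...` path; reinstalling pre-commit against the currently provisioned Python fixes the shebang.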
[airflow] branch v1-10-stable updated: Fixed deprecation message for "variables" command (#14457)
This is an automated email from the ASF dual-hosted git repository. xddeng pushed a commit to branch v1-10-stable in repository https://gitbox.apache.org/repos/asf/airflow.git The following commit(s) were added to refs/heads/v1-10-stable by this push: new 75b2fa2 Fixed deprecation message for "variables" command (#14457) 75b2fa2 is described below

commit 75b2fa22b75b9879bafe607b15f900b46c4828bc
Author: Bruno Guimarães
AuthorDate: Fri Feb 26 02:10:57 2021 -0300

    Fixed deprecation message for "variables" command (#14457)
---
 airflow/bin/cli.py | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/airflow/bin/cli.py b/airflow/bin/cli.py
index f829a0e..70a1107 100644
--- a/airflow/bin/cli.py
+++ b/airflow/bin/cli.py
@@ -418,7 +418,7 @@ def variables_export(args):
     _vars_wrapper(args, export=args.file)
 
 
-@cli_utils.deprecated_action(new_name='variables')
+@cli_utils.deprecated_action(new_name='variables list')
 @cli_utils.action_logging
 def variables(args):
     if args.get:
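The commit only changes the `new_name` passed to `cli_utils.deprecated_action`, so the deprecation warning now points users at the correct `variables list` subcommand. For readers unfamiliar with the pattern, here is a minimal sketch of how such a decorator can work; this is illustrative only, not Airflow's actual `cli_utils` implementation:

```python
import functools
import warnings


def deprecated_action(new_name):
    """Decorator factory: wrap a CLI action so calling it emits a
    DeprecationWarning naming its replacement (illustrative sketch)."""
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            warnings.warn(
                f"`{func.__name__}` is deprecated; use `{new_name}` instead",
                DeprecationWarning,
                stacklevel=2,
            )
            return func(*args, **kwargs)
        return wrapper
    return decorator


@deprecated_action(new_name="variables list")
def variables(args):
    # Stand-in for the real CLI handler.
    return f"listing variables for {args!r}"
```

The fix matters because the warning text is the only migration hint users see; pointing it at a subcommand that does not exist (plain `variables`) would send them in circles.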
[GitHub] [airflow] boring-cyborg[bot] commented on pull request #14457: Fixed deprecation message for "variables" command
boring-cyborg[bot] commented on pull request #14457: URL: https://github.com/apache/airflow/pull/14457#issuecomment-786415291 Awesome work, congrats on your first merged pull request!
[GitHub] [airflow] XD-DENG merged pull request #14457: Fixed deprecation message for "variables" command
XD-DENG merged pull request #14457: URL: https://github.com/apache/airflow/pull/14457
[airflow] branch v1-10-stable updated (4944160 -> 7a5f0e1)
This is an automated email from the ASF dual-hosted git repository. xddeng pushed a change to branch v1-10-stable in repository https://gitbox.apache.org/repos/asf/airflow.git. from 4944160 Fix comparing airflow version to work with older versions of packaging library (#14435) add 7a5f0e1 Add 'airflow variables list' command for 1.10.x transition version (#14462) No new revisions were added by this update. Summary of changes: airflow/bin/cli.py | 10 ++ 1 file changed, 10 insertions(+)
[GitHub] [airflow] XD-DENG merged pull request #14462: Add 'airflow variables list' command for 1.10.x transition version
XD-DENG merged pull request #14462: URL: https://github.com/apache/airflow/pull/14462
[GitHub] [airflow] XD-DENG commented on pull request #14462: Add 'airflow variables list' command for 1.10.x transition version
XD-DENG commented on pull request #14462: URL: https://github.com/apache/airflow/pull/14462#issuecomment-786414589 I tested locally and it works fine for me. Gonna merge. https://user-images.githubusercontent.com/11539188/109257704-033fb180-77f9-11eb-91a7-2a4777858c42.png
[GitHub] [airflow] jhtimmins commented on pull request #14329: Speed up tests by moving app instantiation to class method
jhtimmins commented on pull request #14329: URL: https://github.com/apache/airflow/pull/14329#issuecomment-786408684 @ashb please
[GitHub] [airflow] jhtimmins commented on a change in pull request #14470: Log all breeze output to a file automatically
jhtimmins commented on a change in pull request #14470: URL: https://github.com/apache/airflow/pull/14470#discussion_r583372766 ## File path: BREEZE.rst ## @@ -36,6 +36,8 @@ We called it *Airflow Breeze* as **It's a Breeze to contribute to Airflow**. The advantages and disadvantages of using the Breeze environment vs. other ways of testing Airflow are described in `CONTRIBUTING.rst `_. +All the output from last ./breeze command is automatically logged to< ``logs/breeze.out`` file. Review comment: ```suggestion All the output from the last ./breeze command is automatically logged to the ``logs/breeze.out`` file. ```
[GitHub] [airflow] jhtimmins commented on a change in pull request #14219: Provide login endpoint for the REST API with JWT authentication method
jhtimmins commented on a change in pull request #14219: URL: https://github.com/apache/airflow/pull/14219#discussion_r583371087 ## File path: airflow/api_connexion/openapi/v1.yaml ## @@ -1381,11 +1381,198 @@ paths: schema: $ref: '#/components/schemas/VersionInfo' + /login: +post: + summary: User login + description: | +Verify user and return a user object and JWT token as well + x-openapi-router-controller: airflow.api_connexion.endpoints.user_endpoint + operationId: login + tags: [User] + requestBody: +required: true +content: + application/json: +schema: + $ref: '#/components/schemas/Login' + responses: +'200': + description: Success. + content: +application/json: + schema: +$ref: '#/components/schemas/UserLogin' +'400': + $ref: '#/components/responses/BadRequest' +'401': + $ref: '#/components/responses/Unauthenticated' components: # Reusable schemas (data models) schemas: # Database entities +UserCollectionItem: + description: > +User collection item + type: object + properties: +id: + type: string + description: The user id + readOnly: true +first_name: + type: string + description: The user firstname +last_name: + type: string + description: The user lastname +username: + type: string + description: The username +email: + type: string + description: The user's email +active: + type: boolean + description: Whether the user is active +last_login: + type: string + format: datetime + description: The last user login + readOnly: true +login_count: + type: integer + description: The login count + readOnly: true +failed_login_count: + type: integer + description: The number of times the login failed + readOnly: true +roles: + type: array + description: User roles + items: +$ref: '#/components/schemas/RoleCollectionItem' + readOnly: true + nullable: true +created_on: + type: string + format: datetime + description: The date user was created + readOnly: true +changed_on: + type: string + format: datetime + description: The date user was changed + readOnly: true + 
+UserCollection: + description: User collection + type: object + properties: +users: + type: array + items: + $ref: '#/components/schemas/UserCollectionItem' + +UserLogin: + description: Login item + allOf: +- $ref: '#/components/schemas/UserCollectionItem' +- type: object + properties: +token: + type: string + nullable: false + description: JWT token + +RoleCollectionItem: + description: Role collection item + type: object + properties: +id: + type: string + description: The role ID +name: + type: string + description: The name of the role +permissions: + type: array + items: +$ref: '#/components/schemas/PermissionView' + +RoleCollection: + description: Role Collections + type: object + properties: +roles: + type: array + items: +$ref: '#/components/schemas/RoleCollectionItem' + +PermissionCollectionItem: + description: Permission Collection Item + type: object + properties: +id: + type: string + description: The permission ID +name: + type: string + description: The name of the permission + nullable: false + +PermissionCollection: + description: Permission Collection + type: object + properties: +permissions: + type: array + items: +$ref: '#/components/schemas/PermissionCollectionItem' + +PermissionView: + description: Permission view item + type: object + properties: +id: + type: string + description: The PermissionView ID +permission_id: + type: string + description: The permission ID +permission: + type: string + description: The name of the permission +view_menu_id: + type: string + description: The view menu id +view_menu_name: + type: string + description: The view menu name + +ViewMenuCollectionItem: + description: ViewMenu Collection
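The `/login` schema above returns a signed JWT alongside the user object. A JWT is just three base64url-encoded segments (`header.payload.signature`); this stdlib-only sketch mints and verifies an HS256 token to illustrate what the endpoint would hand back. The secret and claim names are placeholders, not Airflow's actual configuration, and a real deployment would use a maintained library such as PyJWT:

```python
import base64
import hashlib
import hmac
import json


def _b64url(data: bytes) -> str:
    # JWT uses unpadded base64url encoding for every segment.
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()


def mint_jwt(claims: dict, secret: bytes) -> str:
    """Build an HS256 JWT: header.payload.signature."""
    header = _b64url(json.dumps({"alg": "HS256", "typ": "JWT"}).encode())
    payload = _b64url(json.dumps(claims).encode())
    signing_input = f"{header}.{payload}".encode()
    signature = _b64url(hmac.new(secret, signing_input, hashlib.sha256).digest())
    return f"{header}.{payload}.{signature}"


def verify_jwt(token: str, secret: bytes) -> dict:
    """Check the HMAC signature and return the decoded claims."""
    header, payload, signature = token.split(".")
    signing_input = f"{header}.{payload}".encode()
    expected = _b64url(hmac.new(secret, signing_input, hashlib.sha256).digest())
    if not hmac.compare_digest(signature, expected):
        raise ValueError("bad signature")
    padded = payload + "=" * (-len(payload) % 4)
    return json.loads(base64.urlsafe_b64decode(padded))
```

Note the use of `hmac.compare_digest` rather than `==` when checking the signature, which avoids leaking timing information to an attacker probing the login endpoint.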
[GitHub] [airflow] jhtimmins commented on pull request #14439: Updated the latest version of the "Access Control" doc with new screenshots/ content
jhtimmins commented on pull request #14439: URL: https://github.com/apache/airflow/pull/14439#issuecomment-786397341 This looks great @jwitz, just some clarification about how DAG-level permissions work.
[GitHub] [airflow] jhtimmins commented on a change in pull request #14439: Updated the latest version of the "Access Control" doc with new screenshots/ content
jhtimmins commented on a change in pull request #14439: URL: https://github.com/apache/airflow/pull/14439#discussion_r583360827 ## File path: docs/apache-airflow/security/access-control.rst ## @@ -37,101 +37,113 @@ regarding its security model. Default Roles ' -Airflow ships with a set of roles by default: Admin, User, Op, Viewer, and Public. -Only ``Admin`` users could configure/alter the permissions for other roles. But it is not recommended -that ``Admin`` users alter these default roles in any way by removing -or adding permissions to these roles. + +Airflow uses roles to manage all permissions across your environment. Each role has permissions which provide varying levels of access to Airflow's key resources (DAGs, Connections, etc). + +An environment's roles can be found under the **Security** tab: + +.. image:: /img/list-roles.png + +Airflow ships with a set of roles by default: Admin ^ ``Admin`` users have all possible permissions, including granting or revoking permissions from other users. +Only ``Admin`` users can configure/alter the permissions for other roles. While they can reconfigure Airflow's other default rules, it's not recommended. +The best practice is to instead create a new role with the desired permissions. + Public ^^ ``Public`` users (anonymous) don't have any permissions. Viewer ^^ -``Viewer`` users have limited viewer permissions +``Viewer`` users have limited viewer permissions on limited web views. The following table lists all permissions granted to ``Viewers``: .. exampleinclude:: /../../airflow/www/security.py :language: python :start-after: [START security_viewer_perms] :end-before: [END security_viewer_perms] -on limited web views. User -``User`` users have ``Viewer`` permissions plus additional user permissions +``User`` users have all of the ``Viewer`` permissions listed above, plus additional user permissions for modifying DAGs, task instances, and DAG runs: .. 
exampleinclude:: /../../airflow/www/security.py :language: python :start-after: [START security_user_perms] :end-before: [END security_user_perms] -on User web views which is the same as Viewer web views. - Op -^^ -``Op`` users have ``User`` permissions plus additional op permissions +^^^ +``Op`` users have all of the permissions that ``Viewers`` and ``Users`` have, plus additional permissions for modifying resources like connections and pools: .. exampleinclude:: /../../airflow/www/security.py :language: python :start-after: [START security_op_perms] :end-before: [END security_op_perms] -on ``User`` web views. - Custom Roles ' -DAG Level Role -^^ -``Admin`` can create a set of roles which are only allowed to view a certain set of dags. This is called DAG level access. Each dag defined in the dag model table -is treated as a ``View`` which has two permissions associated with it (``can_read`` and ``can_edit``. ``can_dag_read`` and ``can_dag_edit`` are deprecated since 2.0.0). -There is a special view called ``DAGs`` (it was called ``all_dags`` in versions 1.10.*) which -allows the role to access all the dags. The default ``Admin``, ``Viewer``, ``User``, ``Op`` roles can all access ``DAGs`` view. +Creating custom roles is the recommended way to customize your environment access. To do so: -.. image:: /img/add-role.png -.. image:: /img/new-role.png +1. In the **List Roles** menu, click the blue button to create a new role. Alternatively, select an existing role and click **Actions** > **Copy Role** to create a role based on your selection. -The image shows the creation of a role which can only write to -``example_python_operator``. You can also create roles via the CLI -using the ``airflow roles create`` command, e.g.: + .. image:: /img/list-roles.png -.. code-block:: bash +2. Specify a name and add permissions for the role. Note that the names for permissions will vary slightly between the UI and the source code. - airflow roles create Role1 Role2 + .. 
image:: /img/add-permissions.png -And we could assign the given role to a new user using the ``airflow -users add-role`` CLI command. +3. Click Save. +You can also create roles via the CLI using the following command: -Permissions -''' +.. code-block:: bash + + airflow roles create + +You could then assign the given role to a new user using the ``airflow Review comment: ```suggestion You can then assign the given role to a new user using the ``airflow ``` Just for sake of consistency ## File path: docs/apache-airflow/security/access-control.rst ## @@ -37,101 +37,113 @@ regarding its security model. Default Roles ' -Airflow ships with a set of roles by default: Admin, User, Op, Viewer, and Public. -Only ``Admin`` users could configure/alter the permissions for other roles. But it is not recommended -that ``Admi
[GitHub] [airflow-on-k8s-operator] jbampton opened a new pull request #36: Fix pre-commit-golang version
jbampton opened a new pull request #36: URL: https://github.com/apache/airflow-on-k8s-operator/pull/36 https://user-images.githubusercontent.com/418747/109251681-9ef01280-7837-11eb-9b4f-82bbe71ea30e.png
[GitHub] [airflow-on-k8s-operator] jbampton opened a new pull request #35: Fix spelling
jbampton opened a new pull request #35: URL: https://github.com/apache/airflow-on-k8s-operator/pull/35
[GitHub] [airflow] dstandish opened a new issue #14473: DagRun duration only visible in tree view tooltip if currently running
dstandish opened a new issue #14473: URL: https://github.com/apache/airflow/issues/14473 On airflow 2.0.1, in tree view, if the dag run is running, duration shows as expected: ![image](https://user-images.githubusercontent.com/15932138/109248646-086e1380-779b-11eb-9d00-8cb785d88299.png) But if the dag run is complete, duration is null: ![image](https://user-images.githubusercontent.com/15932138/109248752-3ce1cf80-779b-11eb-8784-9a4aaed2209b.png)
[airflow] branch constraints-master updated: Updating constraints. Build id:601285812
This is an automated email from the ASF dual-hosted git repository. github-bot pushed a commit to branch constraints-master in repository https://gitbox.apache.org/repos/asf/airflow.git The following commit(s) were added to refs/heads/constraints-master by this push: new 7c443ff Updating constraints. Build id:601285812 7c443ff is described below commit 7c443ffec823be8c4efe7e448fcff44e8dc65053 Author: Automated GitHub Actions commit AuthorDate: Fri Feb 26 02:46:03 2021 + Updating constraints. Build id:601285812 This update in constraints is automatically committed by the CI 'constraints-push' step based on HEAD of 'refs/heads/master' in 'apache/airflow' with commit sha 40a3e339f3ad11b3a1bb1c71848b964ddca69383. All tests passed in this build so we determined we can push the updated constraints. See https://github.com/apache/airflow/blob/master/README.md#installing-from-pypi for details. --- constraints-3.6.txt | 10 +- constraints-3.7.txt | 10 +- constraints-3.8.txt | 10 +- constraints-no-providers-3.6.txt | 2 +- constraints-no-providers-3.7.txt | 2 +- constraints-no-providers-3.8.txt | 2 +- 6 files changed, 18 insertions(+), 18 deletions(-) diff --git a/constraints-3.6.txt b/constraints-3.6.txt index 7ecd425..077d31e 100644 --- a/constraints-3.6.txt +++ b/constraints-3.6.txt @@ -35,7 +35,7 @@ Unidecode==1.2.0 WTForms==2.3.3 Werkzeug==1.0.1 adal==1.2.6 -aiohttp==3.7.3 +aiohttp==3.7.4 alabaster==0.7.12 alembic==1.5.5 amqp==2.6.1 @@ -252,7 +252,7 @@ google-cloud-vision==1.0.0 google-cloud-workflows==0.2.0 google-crc32c==1.1.2 google-resumable-media==1.2.0 -googleapis-common-protos==1.52.0 +googleapis-common-protos==1.53.0 graphviz==0.16 greenlet==1.0.0 grpc-google-iam-v1==0.12.3 @@ -334,7 +334,7 @@ ntlm-auth==1.5.0 numpy==1.19.5 oauth2client==4.1.3 oauthlib==3.1.0 -openapi-schema-validator==0.1.2 +openapi-schema-validator==0.1.3 openapi-spec-validator==0.3.0 oscrypto==1.2.1 packaging==20.9 @@ -362,7 +362,7 @@ prison==0.1.3 prometheus-client==0.8.0 prompt-toolkit==3.0.16 
proto-plus==1.13.0 -protobuf==3.15.2 +protobuf==3.15.3 psutil==5.8.0 psycopg2-binary==2.8.6 ptyprocess==0.7.0 @@ -478,7 +478,7 @@ thrift==0.13.0 toml==0.10.2 toolz==0.11.1 tornado==6.1 -tqdm==4.57.0 +tqdm==4.58.0 traitlets==4.3.3 typed-ast==1.4.2 typing-extensions==3.7.4.3 diff --git a/constraints-3.7.txt b/constraints-3.7.txt index c4bec67..7cdc5f0 100644 --- a/constraints-3.7.txt +++ b/constraints-3.7.txt @@ -35,7 +35,7 @@ Unidecode==1.2.0 WTForms==2.3.3 Werkzeug==1.0.1 adal==1.2.6 -aiohttp==3.7.3 +aiohttp==3.7.4 alabaster==0.7.12 alembic==1.5.5 amqp==2.6.1 @@ -250,7 +250,7 @@ google-cloud-vision==1.0.0 google-cloud-workflows==0.2.0 google-crc32c==1.1.2 google-resumable-media==1.2.0 -googleapis-common-protos==1.52.0 +googleapis-common-protos==1.53.0 graphviz==0.16 greenlet==1.0.0 grpc-google-iam-v1==0.12.3 @@ -330,7 +330,7 @@ ntlm-auth==1.5.0 numpy==1.20.1 oauth2client==4.1.3 oauthlib==3.1.0 -openapi-schema-validator==0.1.2 +openapi-schema-validator==0.1.3 openapi-spec-validator==0.3.0 oscrypto==1.2.1 packaging==20.9 @@ -357,7 +357,7 @@ prison==0.1.3 prometheus-client==0.8.0 prompt-toolkit==3.0.16 proto-plus==1.13.0 -protobuf==3.15.2 +protobuf==3.15.3 psutil==5.8.0 psycopg2-binary==2.8.6 ptyprocess==0.7.0 @@ -473,7 +473,7 @@ thrift==0.13.0 toml==0.10.2 toolz==0.11.1 tornado==6.1 -tqdm==4.57.0 +tqdm==4.58.0 traitlets==5.0.5 typed-ast==1.4.2 typing-extensions==3.7.4.3 diff --git a/constraints-3.8.txt b/constraints-3.8.txt index c4bec67..7cdc5f0 100644 --- a/constraints-3.8.txt +++ b/constraints-3.8.txt @@ -35,7 +35,7 @@ Unidecode==1.2.0 WTForms==2.3.3 Werkzeug==1.0.1 adal==1.2.6 -aiohttp==3.7.3 +aiohttp==3.7.4 alabaster==0.7.12 alembic==1.5.5 amqp==2.6.1 @@ -250,7 +250,7 @@ google-cloud-vision==1.0.0 google-cloud-workflows==0.2.0 google-crc32c==1.1.2 google-resumable-media==1.2.0 -googleapis-common-protos==1.52.0 +googleapis-common-protos==1.53.0 graphviz==0.16 greenlet==1.0.0 grpc-google-iam-v1==0.12.3 @@ -330,7 +330,7 @@ ntlm-auth==1.5.0 numpy==1.20.1 
oauth2client==4.1.3 oauthlib==3.1.0 -openapi-schema-validator==0.1.2 +openapi-schema-validator==0.1.3 openapi-spec-validator==0.3.0 oscrypto==1.2.1 packaging==20.9 @@ -357,7 +357,7 @@ prison==0.1.3 prometheus-client==0.8.0 prompt-toolkit==3.0.16 proto-plus==1.13.0 -protobuf==3.15.2 +protobuf==3.15.3 psutil==5.8.0 psycopg2-binary==2.8.6 ptyprocess==0.7.0 @@ -473,7 +473,7 @@ thrift==0.13.0 toml==0.10.2 toolz==0.11.1 tornado==6.1 -tqdm==4.57.0 +tqdm==4.58.0 traitlets==5.0.5 typed-ast==1.4.2 typing-extensions==3.7.4.3 diff --git a/constraints-no-providers-3.6.txt b/constraints-no-providers-3.6.txt index 0e5e5f2..cc026d7 100644 --- a/constraints-no-providers-3.6.txt +++ b/constraints-no-providers-3.6.txt @@ -91,7 +91,7 @@ msgpack==1.0.2 natsort==7.1.1 numpy==1.19.5 oauthlib==2.1
[GitHub] [airflow] jbampton opened a new pull request #14472: Fix spelling
jbampton opened a new pull request #14472: URL: https://github.com/apache/airflow/pull/14472 --- **^ Add meaningful description above** Read the **[Pull Request Guidelines](https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst#pull-request-guidelines)** for more information. In case of fundamental code change, Airflow Improvement Proposal ([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals)) is needed. In case of a new dependency, check compliance with the [ASF 3rd Party License Policy](https://www.apache.org/legal/resolved.html#category-x). In case of backwards incompatible changes please leave a note in [UPDATING.md](https://github.com/apache/airflow/blob/master/UPDATING.md).
[GitHub] [airflow] thesuperzapper commented on issue #8179: Airflow LDAP authentication with RBAC features
thesuperzapper commented on issue #8179: URL: https://github.com/apache/airflow/issues/8179#issuecomment-786352937 For those watching, Flask-AppBuilder `3.2.0` now supports role binding, so if we update Airflow to this version, we will get group binding. See issue https://github.com/apache/airflow/issues/14469 for upgrading Airflow's Flask-AppBuilder version.
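The group binding the comment refers to is Flask-AppBuilder 3.2.0's `AUTH_ROLES_MAPPING`, which maps LDAP group DNs onto FAB roles. A hypothetical `webserver_config.py` fragment could look like the following; server addresses, DNs, and group names are placeholders for your own directory:

```python
# Hypothetical webserver_config.py fragment (Flask-AppBuilder >= 3.2.0).
from flask_appbuilder.security.manager import AUTH_LDAP

AUTH_TYPE = AUTH_LDAP
AUTH_LDAP_SERVER = "ldap://ldap.example.com"
AUTH_LDAP_SEARCH = "ou=users,dc=example,dc=com"
# Attribute on the user entry that lists group memberships.
AUTH_LDAP_GROUP_FIELD = "memberOf"

# Map LDAP group DNs onto FAB roles (new in Flask-AppBuilder 3.2.0).
AUTH_ROLES_MAPPING = {
    "cn=airflow-admins,ou=groups,dc=example,dc=com": ["Admin"],
    "cn=airflow-users,ou=groups,dc=example,dc=com": ["User"],
}
# Re-evaluate the mapping on every login so group changes take effect.
AUTH_ROLES_SYNC_AT_LOGIN = True
```

This is a configuration sketch under the assumption that Airflow has been upgraded to a Flask-AppBuilder release with these keys; it will not work on the FAB version pinned by older Airflow releases.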
[GitHub] [airflow] kaxil closed issue #13768: I set "start_date" as datetime but anyway I got this error
kaxil closed issue #13768: URL: https://github.com/apache/airflow/issues/13768
[GitHub] [airflow] kaxil commented on issue #13768: I set "start_date" as datetime but anyway I got this error
kaxil commented on issue #13768: URL: https://github.com/apache/airflow/issues/13768#issuecomment-786311969 This is fixed by https://github.com/apache/airflow/pull/14416 and will be released in 2.0.2
[airflow] branch master updated (2b5d4e3 -> 40a3e33)
This is an automated email from the ASF dual-hosted git repository. potiuk pushed a change to branch master in repository https://gitbox.apache.org/repos/asf/airflow.git. from 2b5d4e3 Unable to trigger backfill or manual jobs with Kubernetes executor. (#14160) add 40a3e33 Rendering of IMAGES.rst was broken due to wrong header (#14471) No new revisions were added by this update. Summary of changes: IMAGES.rst | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-)
[GitHub] [airflow] potiuk merged pull request #14471: Rendering of IMAGES.rst was broken due to wrong header
potiuk merged pull request #14471: URL: https://github.com/apache/airflow/pull/14471
[GitHub] [airflow] potiuk commented on pull request #14471: Rendering of IMAGES.rst was broken due to wrong header
potiuk commented on pull request #14471: URL: https://github.com/apache/airflow/pull/14471#issuecomment-786303400 cc: @ecerulm
[GitHub] [airflow] potiuk opened a new pull request #14471: Rendering of IMAGES.rst was broken due to wrong header
potiuk opened a new pull request #14471: URL: https://github.com/apache/airflow/pull/14471 This PR fixes it.
[GitHub] [airflow] omarish commented on issue #14266: Azure Provider related WARNING in webserver (version 2.0.1)
omarish commented on issue #14266: URL: https://github.com/apache/airflow/issues/14266#issuecomment-786298240 I worked around this by adding another Dockerfile with the pip install: ```dockerfile FROM apache/airflow:2.0.1 RUN pip uninstall --yes azure-storage && pip install -U azure-storage-blob apache-airflow-providers-microsoft-azure==1.1.0 ```
[GitHub] [airflow] potiuk commented on pull request #14470: Log all breeze output to a file automatically
potiuk commented on pull request #14470: URL: https://github.com/apache/airflow/pull/14470#issuecomment-786296780 cc: @ecerulm
[GitHub] [airflow] potiuk opened a new pull request #14470: Log all breeze output to a file automatically
potiuk opened a new pull request #14470: URL: https://github.com/apache/airflow/pull/14470 Fixes: #14390
[GitHub] [airflow] ephraimbuddy commented on issue #14312: Remote log on azure blob storage display issue on microsoft provider airflow/providers/microsoft/azure/log/wasb_task_handler.py
ephraimbuddy commented on issue #14312: URL: https://github.com/apache/airflow/issues/14312#issuecomment-786292924 Yes, you'll have to wait for the new provider release.
[GitHub] [airflow] ldacey commented on issue #14312: Remote log on azure blob storage display issue on microsoft provider airflow/providers/microsoft/azure/log/wasb_task_handler.py
ldacey commented on issue #14312: URL: https://github.com/apache/airflow/issues/14312#issuecomment-786289707 Nice, thanks. For future reference, for changes that happen in provider packages, should I wait for the change to be reflected on PyPI here? https://pypi.org/project/apache-airflow-providers-microsoft-azure/
[GitHub] [airflow] github-actions[bot] commented on pull request #14464: Updates docs to include docker resource requirements for quickstart
github-actions[bot] commented on pull request #14464: URL: https://github.com/apache/airflow/pull/14464#issuecomment-786286024 [The Workflow run](https://github.com/apache/airflow/actions/runs/601005315) is cancelling this PR. It has some failed jobs matching ^Pylint$,^Static checks,^Build docs$,^Spell check docs$,^Backport packages$,^Provider packages,^Checks: Helm tests$,^Test OpenAPI*.
[GitHub] [airflow] thesuperzapper opened a new issue #14469: Upgrade Flask-AppBuilder to 3.2.0 for improved OAUTH/LDAP
thesuperzapper opened a new issue #14469: URL: https://github.com/apache/airflow/issues/14469 Version `3.2.0` of Flask-AppBuilder added support for LDAP group binding (see PR: https://github.com/dpgaspar/Flask-AppBuilder/pull/1374). We should update mainly for the `AUTH_ROLES_MAPPING` feature, which lets users bind to RBAC roles based on their LDAP/OAUTH group membership. Here are the docs about Flask-AppBuilder security: https://flask-appbuilder.readthedocs.io/en/latest/security.html#authentication-ldap This will resolve https://github.com/apache/airflow/issues/8179
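For illustration, the feature is configured via `webserver_config.py`. A minimal sketch of the relevant settings, assuming hypothetical LDAP group DNs (in a real config you would also set `AUTH_TYPE = AUTH_LDAP` and the `AUTH_LDAP_*` options):

```python
# Sketch of the AUTH_ROLES_MAPPING setting from Flask-AppBuilder 3.2.0.
# The group DNs below are hypothetical examples, not real values.

# Map LDAP/OAUTH group names to lists of FAB/RBAC role names.
AUTH_ROLES_MAPPING = {
    "cn=airflow-admins,ou=groups,dc=example,dc=com": ["Admin"],
    "cn=airflow-users,ou=groups,dc=example,dc=com": ["User", "Viewer"],
}

# Re-sync a user's roles from their groups on every login, so group
# membership changes in LDAP take effect without manual role edits.
AUTH_ROLES_SYNC_AT_LOGIN = True
```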
[GitHub] [airflow] potiuk commented on pull request #14468: Adds --dry-run-docker flag to just print the docker commands
potiuk commented on pull request #14468: URL: https://github.com/apache/airflow/pull/14468#issuecomment-786284430 cc: @ecerulm
[GitHub] [airflow] potiuk opened a new pull request #14468: Adds --dry-run-docker flag to just print the docker commands
potiuk opened a new pull request #14468: URL: https://github.com/apache/airflow/pull/14468 Whenever docker commands would be used, the --dry-run-docker flag makes Breeze print the command rather than execute it. Closes: #14460
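As a rough illustration of the dry-run pattern described above (a sketch only, not the actual Breeze implementation), a wrapper around docker can check an environment flag before executing:

```shell
#!/usr/bin/env bash
# Minimal sketch of a dry-run wrapper: when DRY_RUN_DOCKER is "true",
# print the docker command line instead of running it.
DRY_RUN_DOCKER="${DRY_RUN_DOCKER:-false}"

docker_cmd() {
    if [[ "${DRY_RUN_DOCKER}" == "true" ]]; then
        echo "docker $*"
    else
        docker "$@"
    fi
}

DRY_RUN_DOCKER=true
docker_cmd run --rm hello-world   # prints: docker run --rm hello-world
```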
[GitHub] [airflow] potiuk closed issue #12116: Update google-cloud deps to v2
potiuk closed issue #12116: URL: https://github.com/apache/airflow/issues/12116
[GitHub] [airflow] potiuk closed issue #13282: Generate per-provider sources during release.
potiuk closed issue #13282: URL: https://github.com/apache/airflow/issues/13282
[GitHub] [airflow] potiuk closed issue #13926: The apache-airflow-backport-providers-ssh version 2020.10.29 is broken
potiuk closed issue #13926: URL: https://github.com/apache/airflow/issues/13926
[GitHub] [airflow] potiuk commented on issue #13926: The apache-airflow-backport-providers-ssh version 2020.10.29 is broken
potiuk commented on issue #13926: URL: https://github.com/apache/airflow/issues/13926#issuecomment-786281965 Fixed with the 2021.2.5 release.
[GitHub] [airflow] ephraimbuddy closed issue #14312: Remote log on azure blob storage display issue on microsoft provider airflow/providers/microsoft/azure/log/wasb_task_handler.py
ephraimbuddy closed issue #14312: URL: https://github.com/apache/airflow/issues/14312
[GitHub] [airflow] ephraimbuddy commented on issue #14312: Remote log on azure blob storage display issue on microsoft provider airflow/providers/microsoft/azure/log/wasb_task_handler.py
ephraimbuddy commented on issue #14312: URL: https://github.com/apache/airflow/issues/14312#issuecomment-786272265 Closed via https://github.com/apache/airflow/pull/14313
[GitHub] [airflow] ephraimbuddy commented on a change in pull request #14254: Features/sftp to wasb: Transfer to scan files from sftp source path and upload them to Azure Blob Storage
ephraimbuddy commented on a change in pull request #14254: URL: https://github.com/apache/airflow/pull/14254#discussion_r583233428

## File path: tests/providers/microsoft/azure/transfers/test_sftp_to_wasb_system.py

```diff
@@ -0,0 +1,66 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+
+import os
+
+import pytest
+
+from airflow.providers.microsoft.azure.example_dags.example_sftp_to_wasb import (
+    AZURE_CONTAINER_NAME,
+    BLOB_PREFIX,
+    FILE_COMPLETE_PATH,
+    LOCAL_FILE_PATH,
+    SAMPLE_FILE_NAME,
+    SFTP_FILE_COMPLETE_PATH,
+)
+from airflow.providers.microsoft.azure.hooks.wasb import WasbHook
+from airflow.providers.sftp.hooks.sftp import SFTPHook
+from tests.test_utils.azure_system_helpers import (
+    AZURE_DAG_FOLDER,
+    AzureSystemTest,
+    provide_wasb_default_connection,
+)
+from tests.test_utils.sftp_system_helpers import provide_sftp_default_connection
+
+CREDENTIALS_DIR = os.environ.get('CREDENTIALS_DIR', '/files/airflow-breeze-config/keys')
+SFTP_DEFAULT_KEY = 'sftp_key.json'
+WASB_DEFAULT_KEY = 'wasb_key.json'
+CREDENTIALS_SFTP_PATH = os.path.join(CREDENTIALS_DIR, SFTP_DEFAULT_KEY)
+CREDENTIALS_WASB_PATH = os.path.join(CREDENTIALS_DIR, WASB_DEFAULT_KEY)
+
+
+@pytest.mark.backend('postgres', 'mysql')
+@pytest.mark.credential_file(WASB_DEFAULT_KEY)
+@pytest.mark.credential_file(SFTP_DEFAULT_KEY)
+class TestSFTPToWasbSystem(AzureSystemTest):
+    def setUp(self):
+        super().setUp()
+        self.create_dummy_file(SAMPLE_FILE_NAME, LOCAL_FILE_PATH)
+
+    def tearDown(self):
+        os.remove(FILE_COMPLETE_PATH)
+        super().tearDown()
+
+    @provide_wasb_default_connection(CREDENTIALS_WASB_PATH)
+    @provide_sftp_default_connection(CREDENTIALS_SFTP_PATH)
+    def test_run_example_file_to_wasb(self):
+        self.run_dag('example_sftp_to_wasb', AZURE_DAG_FOLDER)
+        WasbHook(wasb_conn_id="wasb_default").delete_file(
+            AZURE_CONTAINER_NAME, BLOB_PREFIX + SAMPLE_FILE_NAME
+        )
+        SFTPHook(ssh_conn_id="sftp_default").delete_file(SFTP_FILE_COMPLETE_PATH)
```

Review comment:
```suggestion
```
Can you move this to the example dag? We have a Wasb delete operator. Ideally we should have an SFTP hook delete operator too, but for now you can use the python operator to run it.
https://github.com/apache/airflow/blob/2b5d4e3ff3c61ea6074caa300bbb8d16027408a6/airflow/providers/microsoft/azure/example_dags/example_fileshare.py#L37-L40
[GitHub] [airflow] SamWheating commented on issue #13768: I set "start_date" as datetime but anyway I got this error
SamWheating commented on issue #13768: URL: https://github.com/apache/airflow/issues/13768#issuecomment-786250451 > The traceback suggests that you are using get_previous_start_date in your DAG. We've seen this issue on one of our DAGs as well (when viewing task details from a run with many preceding runs) and I can confirm we are not using `get_previous_start_date` in the DAG directly. In our case (as well as the above stack trace) it looks like this is called by the getattr when rendering the task instance details page. I'll see if I can gather more information on the cause of this issue and provide a simple example DAG.
[GitHub] [airflow] eladkal commented on issue #9373: path to file as sql parameter in PostgresOperator not working anymore
eladkal commented on issue #9373: URL: https://github.com/apache/airflow/issues/9373#issuecomment-786202782 I'm unable to reproduce.
```
with DAG(
    dag_id="testing_postgres",
    default_args=default_args,
    schedule_interval=None
) as dag:
    t3 = PostgresOperator(
        task_id='get_tables_size',
        postgres_conn_id='postgres_new',
        sql='maintenance/queries/my.sql'
    )
```
![Screen Shot 2021-02-25 at 23 05 05](https://user-images.githubusercontent.com/45845474/109216999-ecc83480-77bd-11eb-84a4-0a0cc6fe5413.png)
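For reference, a relative `sql` path like the one above is resolved against the DAG file's own folder and any `template_searchpath` entries. A pure-Python sketch of that lookup order (`resolve_sql_path` is a hypothetical helper for illustration, not Airflow's actual implementation):

```python
import os

def resolve_sql_path(sql, dag_folder, template_searchpath=()):
    """Try the DAG's own folder first, then each template_searchpath
    entry, returning the first existing candidate (hypothetical helper,
    not Airflow's code)."""
    for base in (dag_folder, *template_searchpath):
        candidate = os.path.join(base, sql)
        if os.path.exists(candidate):
            return candidate
    raise FileNotFoundError(sql)
```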
[GitHub] [airflow] r-richmond commented on issue #14396: Make context less nebulous
r-richmond commented on issue #14396: URL: https://github.com/apache/airflow/issues/14396#issuecomment-786194069

> Not really. We are not planning to add anything to the context any time soon, and even if we do it's the same for dict/field. If someone extends the data class with a new field the problem is the same.

> `A safer way for maintainers to add new fields to context`
>
> Not really. It's the same kind of problems you get.

I don't follow here. In the V2 dataclass design users would add stuff to the `user_defined` field, which has no chance of collision with anything Airflow adds later to the base dataclass. The same can't be said for dicts. What am I missing?

> `A clean way to implement deprecation warnings with detailed warning messages about potential silent bugs`
>
> We do not need deprecation warnings in case we do not change from Dict

True regarding the deprecation warning, but isn't there value in warning users about doing things that could cause bugs (i.e. overwriting the wrong keys in context)?

> `More flexibility down the road (dataclasses are more flexible than dictionaries)`
>
> This sentence is meaningless. I argue that dicts are more flexible and probably we would both be right.

I like the meet-in-the-middle vibe, but I kind of cheated with the `user_defined` field being a dict, which I think pushes dataclass to the winners' column.

> `A solution that is easier to maintain in the future`
>
> Again - meaningless - maintenance is also to go through the hassle of changing and informing users.

I think we are just stuck on 2 different points here: you rightly point to the burden of migrating users, and I'm stuck on what I perceive are the long-term benefits of switching (safer modifications, better IDE integration, usability improvements, warning users about things that could cause bugs).

> `Or said another way we shouldn't optimize for airflow 2.x maintainability we should optimize for airflow maintainability.`
>
> I do not agree.

> I carefully weighed the pros/cons and as maintainer I agree with @kaxil: TypedDict is a much better solution and we have no plans to change to Dataclass. You have not convinced us.

I'm just a user, so apologies if I came across as forceful. It is clear we have a difference of opinion, in which case the maintainer usually and rightly decides the path forward for the project. Lastly, at this point I'm going to slow down on this topic for a while to allow other users & maintainers a chance to give their opinions/thoughts.
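To make the collision argument above concrete, here is a toy sketch of the proposed dataclass shape (illustrative only; `Context`, `ds`, `run_id`, and `user_defined` are hypothetical names from the discussion, not Airflow's API):

```python
from dataclasses import dataclass, field
from typing import Any, Dict

@dataclass
class Context:
    """Toy sketch of a V2-style context: core fields are declared on the
    class, and all user additions live under `user_defined`, so they can
    never shadow a core field added later."""
    ds: str = ""
    run_id: str = ""
    user_defined: Dict[str, Any] = field(default_factory=dict)

ctx = Context(ds="2021-02-25", run_id="manual__1")
# A user key named like a core field cannot collide with it:
ctx.user_defined["run_id"] = "my own value"
assert ctx.run_id == "manual__1"
assert ctx.user_defined["run_id"] == "my own value"
```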
[GitHub] [airflow] eladkal commented on issue #10122: SMTP connection in email utils has no default timeout which causes the connection to hang indefinitely if it can't reach the SMTP server.
eladkal commented on issue #10122: URL: https://github.com/apache/airflow/issues/10122#issuecomment-786177390 A default timeout was added in https://github.com/apache/airflow/pull/12801
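For context, the standard library already supports this: `smtplib.SMTP` accepts a `timeout` argument, and passing one is the essence of the fix. A sketch of the idea (`send_with_timeout` is an illustrative helper, not the actual code from the PR):

```python
import smtplib

def send_with_timeout(host, port, message, timeout=30):
    """Open the SMTP connection with an explicit timeout so that an
    unreachable server raises instead of hanging indefinitely
    (illustrative helper, not Airflow's email utils)."""
    with smtplib.SMTP(host, port, timeout=timeout) as conn:
        conn.send_message(message)
```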
[GitHub] [airflow] eladkal closed issue #10122: SMTP connection in email utils has no default timeout which causes the connection to hang indefinitely if it can't reach the SMTP server.
eladkal closed issue #10122: URL: https://github.com/apache/airflow/issues/10122
[GitHub] [airflow] mik-laj commented on issue #9855: Import connections/variables from a file
mik-laj commented on issue #9855: URL: https://github.com/apache/airflow/issues/9855#issuecomment-786162914 Here is the PR: https://github.com/apache/airflow/pull/9907 Feel free to continue it.
[GitHub] [airflow] ferruzzi commented on a change in pull request #14402: Implemented S3 Bucket Tagging
ferruzzi commented on a change in pull request #14402: URL: https://github.com/apache/airflow/pull/14402#discussion_r583121536

## File path: tests/providers/amazon/aws/operators/test_s3_bucket_tagging.py

```diff
@@ -0,0 +1,120 @@
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+import os
+import unittest
+from unittest import mock
+
+from moto import mock_s3
+
+from airflow.providers.amazon.aws.hooks.s3 import S3Hook
+from airflow.providers.amazon.aws.operators.s3_bucket_tagging import (
+    S3DeleteBucketTaggingOperator,
+    S3GetBucketTaggingOperator,
+    S3PutBucketTaggingOperator,
+)
+
+BUCKET_NAME = os.environ.get("BUCKET_NAME", "test-airflow-bucket")
+TAG_SET = [{'Key': 'Color', 'Value': 'Green'}]
+TASK_ID = os.environ.get("TASK_ID", "test-s3-operator")
+
+
+class TestS3GetBucketTaggingOperator(unittest.TestCase):
+    def setUp(self):
+        self.get_bucket_tagging_operator = S3GetBucketTaggingOperator(
+            task_id=TASK_ID,
+            bucket_name=BUCKET_NAME,
+        )
+
+    @mock_s3
+    @mock.patch.object(S3Hook, "get_bucket_tagging")
+    @mock.patch.object(S3Hook, "check_for_bucket")
+    def test_execute_if_bucket_exist(self, mock_check_for_bucket, get_bucket_tagging):
+        mock_check_for_bucket.return_value = True
+        # execute s3 get bucket tagging operator
+        self.get_bucket_tagging_operator.execute({})
+        mock_check_for_bucket.assert_called_once_with(BUCKET_NAME)
+        get_bucket_tagging.assert_called_once_with(BUCKET_NAME)
+
+    @mock_s3
+    @mock.patch.object(S3Hook, "get_bucket_tagging")
+    @mock.patch.object(S3Hook, "check_for_bucket")
+    def test_execute_if_not_bucket_exist(self, mock_check_for_bucket, get_bucket_tagging):
+        mock_check_for_bucket.return_value = False
+        # execute s3 get bucket tagging operator
+        self.get_bucket_tagging_operator.execute({})
+        mock_check_for_bucket.assert_called_once_with(BUCKET_NAME)
+        get_bucket_tagging.assert_not_called()
+
+
+class TestS3PutBucketTaggingOperator(unittest.TestCase):
+    def setUp(self):
+        self.put_bucket_tagging_operator = S3PutBucketTaggingOperator(
+            task_id=TASK_ID,
+            tag_set=TAG_SET,
+            bucket_name=BUCKET_NAME,
+        )
+
+    @mock_s3
+    @mock.patch.object(S3Hook, "put_bucket_tagging")
+    @mock.patch.object(S3Hook, "check_for_bucket")
+    def test_execute_if_bucket_exist(self, mock_check_for_bucket, put_bucket_tagging):
+        mock_check_for_bucket.return_value = True
+        # execute s3 put bucket tagging operator
+        self.put_bucket_tagging_operator.execute({})
+        mock_check_for_bucket.assert_called_once_with(BUCKET_NAME)
+        put_bucket_tagging.assert_called_once_with(
+            key=None, value=None, tag_set=TAG_SET, bucket_name=BUCKET_NAME
+        )
+
+    @mock_s3
+    @mock.patch.object(S3Hook, "put_bucket_tagging")
+    @mock.patch.object(S3Hook, "check_for_bucket")
+    def test_execute_if_not_bucket_exist(self, mock_check_for_bucket, put_bucket_tagging):
+        mock_check_for_bucket.return_value = False
+        # execute s3 put bucket tagging operator
+        self.put_bucket_tagging_operator.execute({})
+        mock_check_for_bucket.assert_called_once_with(BUCKET_NAME)
+        put_bucket_tagging.assert_not_called()
+
+
+class TestS3DeleteBucketTaggingOperator(unittest.TestCase):
+    def setUp(self):
+        self.delete_bucket_tagging_operator = S3DeleteBucketTaggingOperator(
+            task_id=TASK_ID,
+            bucket_name=BUCKET_NAME,
+        )
+
+    @mock_s3
+    @mock.patch.object(S3Hook, "delete_bucket_tagging")
+    @mock.patch.object(S3Hook, "check_for_bucket")
+    def test_execute_if_bucket_exist(self, mock_check_for_bucket, delete_bucket_tagging):
+        mock_check_for_bucket.return_value = True
+        # execute s3 get bucket tagging operator
```

Review comment: Indeed. Corrected, thanks.
[GitHub] [airflow] github-actions[bot] commented on pull request #14454: lazy load stats logger instance
github-actions[bot] commented on pull request #14454: URL: https://github.com/apache/airflow/pull/14454#issuecomment-786159220 [The Workflow run](https://github.com/apache/airflow/actions/runs/600571017) is cancelling this PR. It has some failed jobs matching ^Pylint$,^Static checks,^Build docs$,^Spell check docs$,^Backport packages$,^Provider packages,^Checks: Helm tests$,^Test OpenAPI*.
[GitHub] [airflow] lewismc commented on issue #14261: Airflow Scheduler liveness probe crashing (version 2.0)
lewismc commented on issue #14261: URL: https://github.com/apache/airflow/issues/14261#issuecomment-786154745 We are experiencing this when deploying the chart into a local installation of k3d:
```
% kubectl get pods -n airflow-test-local
NAME                                 READY   STATUS             RESTARTS   AGE
airflow-statsd-5556dc96bc-w28cz      1/1     Running            0          7m29s
airflow-postgresql-0                 1/1     Running            0          7m29s
airflow-webserver-7d5fbc5675-x6dc7   1/1     Running            0          7m29s
airflow-scheduler-7f59d9c69c-5v9pl   2/3     CrashLoopBackOff   7          7m29s
airflow-cleanup-1614276000-xbcmz     0/1     Completed          0          39s
airflow-scheduler-7f59d9c69c-cvzvx   2/3     CrashLoopBackOff   7          7m29s
```
We also found some interesting `WARNING`s when looking into the `wait-for-airflow-migrations` container...
```
% kubectl logs airflow-webserver-7d5fbc5675-x6dc7 -c wait-for-airflow-migrations -n airflow-test-local
BACKEND=postgresql
DB_HOST=airflow-postgresql.airflow-test-local.svc.cluster.local
DB_PORT=5432
[2021-02-25 17:53:43,435] {migration.py:163} INFO - Context impl PostgresqlImpl.
[2021-02-25 17:53:43,436] {migration.py:170} INFO - Will assume transactional DDL.
[2021-02-25 17:53:49,416] {providers_manager.py:299} WARNING - Exception when importing 'airflow.providers.microsoft.azure.hooks.wasb.WasbHook' from 'apache-airflow-providers-microsoft-azure' package: No module named 'azure.storage.blob'
[2021-02-25 17:53:50,300] {providers_manager.py:299} WARNING - Exception when importing 'airflow.providers.microsoft.azure.hooks.wasb.WasbHook' from 'apache-airflow-providers-microsoft-azure' package: No module named 'azure.storage.blob'
[2021-02-25 17:53:51,345] {:35} INFO - Waiting for migrations... 1 second(s)
[2021-02-25 17:53:52,349] {:35} INFO - Waiting for migrations... 2 second(s)
[2021-02-25 17:53:53,352] {:35} INFO - Waiting for migrations... 3 second(s)
[2021-02-25 17:53:54,355] {:35} INFO - Waiting for migrations... 4 second(s)
[2021-02-25 17:53:55,358] {:35} INFO - Waiting for migrations... 5 second(s)
echiu@MT-308022 chart %
```
I don't think the Azure hooks should be imported by default...
[GitHub] [airflow] potiuk commented on issue #14463: Release Kubernetes Providers 1.0.2
potiuk commented on issue #14463: URL: https://github.com/apache/airflow/issues/14463#issuecomment-786143337 If you run 'prepare readme' without specifying the providers, you should see which of the providers require a release, @ashb, and it gives you a chance to decide which version each released provider should be released at, with the possibility of looking at the commits to make sure.
[GitHub] [airflow] potiuk commented on issue #14463: Release Kubernetes Providers 1.0.2
potiuk commented on issue #14463: URL: https://github.com/apache/airflow/issues/14463#issuecomment-786141776 BTW, shouldn't we release the next wave of the providers? It is about the time we planned to do it, and then we could vote on all of them at once.
[airflow] branch master updated (4455f14 -> 2b5d4e3)
This is an automated email from the ASF dual-hosted git repository. kaxilnaik pushed a change to branch master in repository https://gitbox.apache.org/repos/asf/airflow.git.

from 4455f14 Fix failing docs build on Master (#14465)
add  2b5d4e3 Unable to trigger backfill or manual jobs with Kubernetes executor. (#14160)

No new revisions were added by this update. Summary of changes:
```
 airflow/jobs/backfill_job.py    |  1 +
 airflow/www/views.py            |  1 +
 tests/jobs/test_backfill_job.py | 17 +
 3 files changed, 19 insertions(+)
```
[GitHub] [airflow] kaxil closed issue #13805: Could not get scheduler_job_id
kaxil closed issue #13805: URL: https://github.com/apache/airflow/issues/13805
[GitHub] [airflow] boring-cyborg[bot] commented on pull request #14160: Unable to trigger backfill or manual jobs with Kubernetes executor.
boring-cyborg[bot] commented on pull request #14160: URL: https://github.com/apache/airflow/pull/14160#issuecomment-786139422 Awesome work, congrats on your first merged pull request!
[GitHub] [airflow] kaxil merged pull request #14160: Unable to trigger backfill or manual jobs with Kubernetes executor.
kaxil merged pull request #14160: URL: https://github.com/apache/airflow/pull/14160
[GitHub] [airflow] potiuk edited a comment on issue #14396: Make context less nebulous
potiuk edited a comment on issue #14396: URL: https://github.com/apache/airflow/issues/14396#issuecomment-786135586

> Better IDE support (find usages / refactoring / name highlighting) for users and maintainers (see above for image example)

There won't be any renames for those. That would further break compatibility, so making it "easier" is a false sense of security when you rename.

> A safer interface for users who want to add custom fields to context

Not really. We are not planning to add anything to the context any time soon, and even if we do, it's the same for dict/field. If someone extends a data class with a new field, the problem is the same.

> A safer way for maintainers to add new fields to context

Not really. It's the same kind of problem you get.

> A clean way to implement deprecation warnings with detailed warning messages about potential silent bugs

We do not need deprecation warnings if we do not change from Dict.

> More flexibility down the road (dataclasses are more flexible than dictionaries)

This sentence is meaningless. I'd argue that dicts are more flexible, and probably we would both be right.

> A solution that is easier to maintain in the future

Again meaningless: maintenance also means going through the hassle of changing and informing users.

> Or said another way we shouldn't optimize for airflow 2.x maintainability we should optimize for airflow maintainability.

I do not agree. I carefully weighed the pros and cons, and as a maintainer I agree with @kaxil: TypedDict is a much better solution and we have no plans to change to a dataclass. You have not convinced us.

> p.s. sorry for another wall of text. I guess it turns out that I'm a little passionate on this one...

No problem with being passionate (I am very passionate myself, as some other committers might attest), but I think it's good to realise that passion can easily turn into obsession, and to be able to say 'meh'.
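For readers following the TypedDict-vs-dataclass debate, here is a minimal, hypothetical sketch (not Airflow's actual `Context` definition) of why `TypedDict` preserves backward compatibility: it gives type checkers and IDEs the key names of a plain dict without introducing a new runtime class, so existing `context['key']`-style code keeps working unchanged.

```python
from typing import TypedDict


# Hypothetical, simplified stand-in for a task context -- not Airflow's real schema.
class Context(TypedDict):
    ds: str       # logical date as YYYY-MM-DD
    run_id: str


def describe(context: Context) -> str:
    # Static checkers verify these keys exist; at runtime this is still a plain dict.
    return f"run {context['run_id']} on {context['ds']}"


ctx: Context = {"ds": "2021-02-25", "run_id": "manual__1"}
print(describe(ctx))          # run manual__1 on 2021-02-25
print(isinstance(ctx, dict))  # True -- no new runtime type, unlike a dataclass
```

A dataclass, by contrast, would require attribute access (`ctx.ds`) and would break every existing subscript-based usage, which is the compatibility concern raised above.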
[GitHub] [airflow] RosterIn commented on pull request #11223: Create stat name handler not supported rule
RosterIn commented on pull request #11223: URL: https://github.com/apache/airflow/pull/11223#issuecomment-786131839 @FHoffmannCode the test is failing:
```
=================================== FAILURES ===================================
____________ TestAirflowMacroPluginRemovedRule.test_bad_file_failure ___________

    def test_bad_file_failure(self, mock_list_files):
        # Write a binary file
        with NamedTemporaryFile("wb+", suffix=".py") as temp_file:
            mock_list_files.return_value = [temp_file.name]
            temp_file.write(b"{\x03\xff\x00d")
            temp_file.flush()
            rule = AirflowMacroPluginRemovedRule()
            msgs = rule.check()
>           assert 1 == len(msgs)
E           AssertionError: assert 1 == 0
E            +  where 0 = len([])

tests/upgrade/rules/test_airflow_macro_plugin_removed.py:92: AssertionError
```
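For context on what the failing assertion expects: the rule should report exactly one message when it hits a file that is not valid UTF-8, instead of silently returning nothing. A hedged sketch of that guard, using a hypothetical `check_file` helper rather than the actual `AirflowMacroPluginRemovedRule` code:

```python
import tempfile


# Hypothetical helper illustrating the guard the failing test exercises:
# a file that cannot be decoded should yield exactly one diagnostic message.
def check_file(path):
    try:
        with open(path, encoding="utf-8") as f:
            content = f.read()
    except UnicodeDecodeError:
        return [f"Unable to parse python file {path}"]
    msgs = []
    if "airflow.AirflowMacroPlugin" in content:
        msgs.append(f"Deprecated AirflowMacroPlugin used in {path}")
    return msgs


# Same binary payload as the failing test above.
with tempfile.NamedTemporaryFile("wb", suffix=".py", delete=False) as tmp:
    tmp.write(b"{\x03\xff\x00d")
    path = tmp.name

print(len(check_file(path)))  # 1
```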
[GitHub] [airflow] ephraimbuddy commented on a change in pull request #14219: Provide login endpoint for the REST API with JWT authentication method
ephraimbuddy commented on a change in pull request #14219: URL: https://github.com/apache/airflow/pull/14219#discussion_r583090567

## File path: airflow/api_connexion/openapi/v1.yaml

```diff
@@ -1381,11 +1381,198 @@ paths:
           schema:
             $ref: '#/components/schemas/VersionInfo'
 
+  /login:
+    post:
+      summary: User login
+      description: |
+        Verify user and return a user object and JWT token as well
+      x-openapi-router-controller: airflow.api_connexion.endpoints.user_endpoint
+      operationId: login
+      tags: [User]
+      requestBody:
+        required: true
+        content:
+          application/json:
+            schema:
+              $ref: '#/components/schemas/Login'
+      responses:
+        '200':
+          description: Success.
+          content:
+            application/json:
+              schema:
+                $ref: '#/components/schemas/UserLogin'
+        '400':
+          $ref: '#/components/responses/BadRequest'
+        '401':
+          $ref: '#/components/responses/Unauthenticated'
 
 components:
   # Reusable schemas (data models)
   schemas:
     # Database entities
+    UserCollectionItem:
+      description: >
+        User collection item
+      type: object
+      properties:
+        id:
+          type: string
+          description: The user id
+          readOnly: true
+        first_name:
+          type: string
+          description: The user firstname
+        last_name:
+          type: string
+          description: The user lastname
+        username:
+          type: string
+          description: The username
+        email:
+          type: string
+          description: The user's email
+        active:
+          type: boolean
+          description: Whether the user is active
+        last_login:
+          type: string
+          format: datetime
+          description: The last user login
+          readOnly: true
+        login_count:
+          type: integer
+          description: The login count
+          readOnly: true
+        failed_login_count:
+          type: integer
+          description: The number of times the login failed
+          readOnly: true
+        roles:
+          type: array
+          description: User roles
+          items:
+            $ref: '#/components/schemas/RoleCollectionItem'
+          readOnly: true
+          nullable: true
+        created_on:
+          type: string
+          format: datetime
+          description: The date user was created
+          readOnly: true
+        changed_on:
+          type: string
+          format: datetime
+          description: The date user was changed
+          readOnly: true
+
+    UserCollection:
+      description: User collection
+      type: object
+      properties:
+        users:
+          type: array
+          items:
+            $ref: '#/components/schemas/UserCollectionItem'
+
+    UserLogin:
+      description: Login item
+      allOf:
+        - $ref: '#/components/schemas/UserCollectionItem'
+        - type: object
+          properties:
+            token:
+              type: string
+              nullable: false
+              description: JWT token
+
+    RoleCollectionItem:
+      description: Role collection item
+      type: object
+      properties:
+        id:
+          type: string
+          description: The role ID
+        name:
+          type: string
+          description: The name of the role
+        permissions:
+          type: array
+          items:
+            $ref: '#/components/schemas/PermissionView'
+
+    RoleCollection:
+      description: Role Collections
+      type: object
+      properties:
+        roles:
+          type: array
+          items:
+            $ref: '#/components/schemas/RoleCollectionItem'
+
+    PermissionCollectionItem:
+      description: Permission Collection Item
+      type: object
+      properties:
+        id:
+          type: string
+          description: The permission ID
+        name:
+          type: string
+          description: The name of the permission
+          nullable: false
+
+    PermissionCollection:
+      description: Permission Collection
+      type: object
+      properties:
+        permissions:
+          type: array
+          items:
+            $ref: '#/components/schemas/PermissionCollectionItem'
+
+    PermissionView:
+      description: Permission view item
+      type: object
+      properties:
+        id:
+          type: string
+          description: The PermissionView ID
+        permission_id:
+          type: string
+          description: The permission ID
+        permission:
+          type: string
+          description: The name of the permission
+        view_menu_id:
+          type: string
+          description: The view menu id
+        view_menu_name:
+          type: string
+          description: The view menu name
+
+    ViewMenuCollectionItem:
+      description: ViewMenu Collecti
```
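Read together, the `UserLogin` schema (a `UserCollectionItem` plus a `token` field via `allOf`) implies a login response along these lines. Illustrative values only; the field names follow the schemas in the diff, but the exact payload shape is an assumption since the `Login` request schema itself falls outside this hunk:

```json
{
  "id": "1",
  "first_name": "Ada",
  "last_name": "Lovelace",
  "username": "ada",
  "email": "ada@example.com",
  "active": true,
  "roles": [{"id": "1", "name": "Admin"}],
  "token": "eyJhbGciOiJIUzI1NiIs..."
}
```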
[GitHub] [airflow] kaxil commented on pull request #11015: Add Azure Data Factory hook
kaxil commented on pull request #11015: URL: https://github.com/apache/airflow/pull/11015#issuecomment-786128643 The PR was some 200 commits behind master, so I rebased and pushed. Let's wait for the CI results again.
[airflow] branch master updated (c71f707 -> 4455f14)
This is an automated email from the ASF dual-hosted git repository. kaxilnaik pushed a change to branch master in repository https://gitbox.apache.org/repos/asf/airflow.git.

from c71f707 Make airflow dags show command display TaskGroup (#14269)
add  4455f14 Fix failing docs build on Master (#14465)

No new revisions were added by this update. Summary of changes:
```
 docs/apache-airflow-providers-tableau/index.rst | 6 --
 1 file changed, 6 deletions(-)
```
[GitHub] [airflow] potiuk commented on issue #14463: Release Kubernetes Providers 1.0.2
potiuk commented on issue #14463: URL: https://github.com/apache/airflow/issues/14463#issuecomment-786127534 Ah. I totally missed it @kaxil! sorry.
[GitHub] [airflow] kaxil merged pull request #14465: Fix failing docs build on Master
kaxil merged pull request #14465: URL: https://github.com/apache/airflow/pull/14465
[GitHub] [airflow] kaxil commented on pull request #14465: Fix failing docs build on Master
kaxil commented on pull request #14465: URL: https://github.com/apache/airflow/pull/14465#issuecomment-786126958 Tested it locally:
```
[1/1] apache-airflow-providers-tableau
Building docs: apache-airflow-providers-tableau
Executing cmd: sphinx-build -T --color -b html -d /opt/airflow/docs/_doctrees/docs/apache-airflow-providers-tableau -c /opt/airflow/docs -w /tmp/tmp1pee13_f /opt/airflow/docs/apache-airflow-providers-tableau /opt/airflow/docs/_build/docs/apache-airflow-providers-tableau/latest
The output is hidden until an error occurs.
Check spelling: apache-airflow-providers-tableau
Executing cmd: sphinx-build -W --color -T -b spelling -c /opt/airflow/docs -d /opt/airflow/docs/_doctrees/docs/apache-airflow-providers-tableau /opt/airflow/docs/apache-airflow-providers-tableau /tmp/tmpxyhbegbt
The output is hidden until an error occurs.
```
[GitHub] [airflow] caioalmeida97 commented on issue #9855: Import connections/variables from a file
caioalmeida97 commented on issue #9855: URL: https://github.com/apache/airflow/issues/9855#issuecomment-786125920 @mik-laj are there any updates on this feature? I tried to run `airflow connections import $JSON_FILE` but it didn't work :(
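For anyone landing here with the same problem: the `airflow connections import` subcommand does not exist on 1.10.x, which may be why the command above "didn't work". Where it is available, the importer expects a file keyed by connection id; a sketch of that shape, with illustrative values (run `airflow connections export -` on your own install to see the authoritative format):

```json
{
  "my_postgres": {
    "conn_type": "postgres",
    "host": "db.example.com",
    "login": "airflow",
    "password": "s3cret",
    "schema": "airflow",
    "port": 5432,
    "extra": "{\"sslmode\": \"prefer\"}"
  }
}
```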
[GitHub] [airflow] github-actions[bot] commented on pull request #14461: BugFix: Set correct Pod State in queue after processing status.
github-actions[bot] commented on pull request #14461: URL: https://github.com/apache/airflow/pull/14461#issuecomment-786125852 [The Workflow run](https://github.com/apache/airflow/actions/runs/600437415) is cancelling this PR. It has some failed jobs matching ^Pylint$,^Static checks,^Build docs$,^Spell check docs$,^Backport packages$,^Provider packages,^Checks: Helm tests$,^Test OpenAPI*.
[GitHub] [airflow] kaxil opened a new pull request #14465: Fix failing docs build on Master
kaxil opened a new pull request #14465: URL: https://github.com/apache/airflow/pull/14465 https://github.com/apache/airflow/pull/14030 caused this issue --- **^ Add meaningful description above** Read the **[Pull Request Guidelines](https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst#pull-request-guidelines)** for more information. In case of fundamental code change, Airflow Improvement Proposal ([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals)) is needed. In case of a new dependency, check compliance with the [ASF 3rd Party License Policy](https://www.apache.org/legal/resolved.html#category-x). In case of backwards incompatible changes please leave a note in [UPDATING.md](https://github.com/apache/airflow/blob/master/UPDATING.md).
[GitHub] [airflow] ShawnMcGough commented on pull request #11015: Add Azure Data Factory hook
ShawnMcGough commented on pull request #11015: URL: https://github.com/apache/airflow/pull/11015#issuecomment-786124835 @kaxil - does this need to be re-queued or is it just waiting for the reviewers? Thanks!
[GitHub] [airflow] boring-cyborg[bot] commented on pull request #14464: Updates docs to include docker resource requirements for quickstart
boring-cyborg[bot] commented on pull request #14464: URL: https://github.com/apache/airflow/pull/14464#issuecomment-786123119 Congratulations on your first Pull Request and welcome to the Apache Airflow community! If you have any issues or are unsure about anything, please check our Contribution Guide (https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst). Here are some useful points:
- Pay attention to the quality of your code (flake8, pylint and type annotations). Our [pre-commits](https://github.com/apache/airflow/blob/master/STATIC_CODE_CHECKS.rst#prerequisites-for-pre-commit-hooks) will help you with that.
- In case of a new feature add useful documentation (in docstrings or in the `docs/` directory). Adding a new operator? Check this short [guide](https://github.com/apache/airflow/blob/master/docs/apache-airflow/howto/custom-operator.rst). Consider adding an example DAG that shows how users should use it.
- Consider using [Breeze environment](https://github.com/apache/airflow/blob/master/BREEZE.rst) for testing locally; it’s a heavy docker setup, but it ships with a working Airflow and a lot of integrations.
- Be patient and persistent. It might take some time to get a review or get the final approval from Committers.
- Please follow the [ASF Code of Conduct](https://www.apache.org/foundation/policies/conduct) for all communication including (but not limited to) comments on Pull Requests, Mailing list and Slack.
- Be sure to read the [Airflow Coding style](https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst#coding-style-and-best-practices).

Apache Airflow is a community-driven project and together we are making it better 🚀. In case of doubts contact the developers at: Mailing List: d...@airflow.apache.org Slack: https://s.apache.org/airflow-slack
[GitHub] [airflow] cmarteepants opened a new pull request #14464: Updates docs to include docker resource requirements for quickstart
cmarteepants opened a new pull request #14464: URL: https://github.com/apache/airflow/pull/14464 --- **^ Add meaningful description above** Read the **[Pull Request Guidelines](https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst#pull-request-guidelines)** for more information. In case of fundamental code change, Airflow Improvement Proposal ([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals)) is needed. In case of a new dependency, check compliance with the [ASF 3rd Party License Policy](https://www.apache.org/legal/resolved.html#category-x). In case of backwards incompatible changes please leave a note in [UPDATING.md](https://github.com/apache/airflow/blob/master/UPDATING.md).
[GitHub] [airflow] ashb commented on issue #14463: Release Kubernetes Providers 1.0.2
ashb commented on issue #14463: URL: https://github.com/apache/airflow/issues/14463#issuecomment-786114329 On it
[GitHub] [airflow] github-actions[bot] commented on pull request #14436: BugFix: Serialize max_retry_delay as a timedelta
github-actions[bot] commented on pull request #14436: URL: https://github.com/apache/airflow/pull/14436#issuecomment-786107364 [The Workflow run](https://github.com/apache/airflow/actions/runs/600360473) is cancelling this PR. It has some failed jobs matching ^Pylint$,^Static checks,^Build docs$,^Spell check docs$,^Backport packages$,^Provider packages,^Checks: Helm tests$,^Test OpenAPI*.
[GitHub] [airflow] kaxil closed issue #14287: Migration of env_vars from dict to a List[V1EnvVar] for KubernetesPodOperator breaks Jinja template processing of env_vars with templates like {{ dags.con
kaxil closed issue #14287: URL: https://github.com/apache/airflow/issues/14287
[GitHub] [airflow] kaxil commented on issue #14287: Migration of env_vars from dict to a List[V1EnvVar] for KubernetesPodOperator breaks Jinja template processing of env_vars with templates like {{ da
kaxil commented on issue #14287: URL: https://github.com/apache/airflow/issues/14287#issuecomment-786103943 Created a separate ticket for it: https://github.com/apache/airflow/issues/14463
[GitHub] [airflow] kaxil opened a new issue #14463: Release Kubernetes Providers 1.0.2
kaxil opened a new issue #14463: URL: https://github.com/apache/airflow/issues/14463 This should be fixed by https://github.com/apache/airflow/pull/14123 @ashb / @dimberman / @potiuk Can one of you release a new version of the cncf.kubernetes provider please? _Originally posted by @kaxil in https://github.com/apache/airflow/issues/14287#issuecomment-781675876_
[GitHub] [airflow] xinbinhuang edited a comment on pull request #14418: Fix #14417 : Failing to submit Druid ingestion task
xinbinhuang edited a comment on pull request #14418: URL: https://github.com/apache/airflow/pull/14418#issuecomment-786093693 This is caused by #7127. It would be nice if you could also add a test case in [test_druid.py](https://github.com/apache/airflow/blob/master/tests/providers/apache/druid/operators/test_druid.py) that reads the `index_spec` from a JSON file. You can do it roughly like this:
```python
def test_render_template_from_file(self):
    with NamedTemporaryFile("w") as f:
        index_json_str = '''
            {
                "type": "{{ params.index_type }}",
                "datasource": "{{ params.datasource }}",
                "spec": {
                    "dataSchema": {
                        "granularitySpec": {
                            "intervals": ["{{ ds }}/{{ macros.ds_add(ds, 1) }}"]
                        }
                    }
                }
            }
        '''
        f.write(index_json_str)
        f.flush()

        operator = DruidOperator(
            task_id='spark_submit_job',
            json_index_file=f.name,
            params={
                'index_type': 'index_hadoop',
                'datasource': 'datasource_prd'
            },
            dag=self.dag
        )
        ti = TaskInstance(operator, DEFAULT_DATE)
        ti.render_templates()
        expected = ...
```
[GitHub] [airflow] ashb commented on pull request #14420: Don't create unittest.cfg when not running in unit test mode
ashb commented on pull request #14420: URL: https://github.com/apache/airflow/pull/14420#issuecomment-786093971 Oh wat. Using Pep562 in the module somehow makes AirflowConfigParser unpicklable!
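For readers unfamiliar with the reference: PEP 562 lets a module define `__getattr__` (and `__dir__`) at module level, typically to serve deprecated attributes lazily; the `Pep562` mentioned above is a wrapper providing that behaviour on older Pythons. A minimal sketch of the native Python 3.7+ mechanism, written to a throwaway temp module for self-containment (the module name and attribute are made up):

```python
import importlib
import os
import sys
import tempfile
import textwrap

# A throwaway module using PEP 562's module-level __getattr__ to lazily
# serve a (hypothetical) legacy attribute.
MODULE_SRC = textwrap.dedent(
    """
    def __getattr__(name):  # PEP 562 hook, Python 3.7+
        if name == "LEGACY_SETTING":
            return "fallback-value"
        raise AttributeError(f"module has no attribute {name!r}")
    """
)

tmp_dir = tempfile.mkdtemp()
with open(os.path.join(tmp_dir, "pep562_demo.py"), "w") as f:
    f.write(MODULE_SRC)

sys.path.insert(0, tmp_dir)
demo = importlib.import_module("pep562_demo")
print(demo.LEGACY_SETTING)  # fallback-value
```

Backport wrappers achieve this by replacing the module object in `sys.modules`, which changes what the module's classes see at pickling time and is plausibly related to the unpicklability reported above.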
[GitHub] [airflow] github-actions[bot] commented on pull request #14462: Add 'airflow variables list' command for 1.10.x transition version
github-actions[bot] commented on pull request #14462: URL: https://github.com/apache/airflow/pull/14462#issuecomment-786093326 [The Workflow run](https://github.com/apache/airflow/actions/runs/600054803) is cancelling this PR. Building images for the PR has failed. Follow the workflow link to check the reason.
[GitHub] [airflow] o-nikolas commented on a change in pull request #14402: Implemented S3 Bucket Tagging
o-nikolas commented on a change in pull request #14402: URL: https://github.com/apache/airflow/pull/14402#discussion_r583046506

## File path: tests/providers/amazon/aws/operators/test_s3_bucket_tagging.py

@@ -0,0 +1,120 @@
```python
#
# Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements.  See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership.  The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License.  You may obtain a copy of the License at
#
#   http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing,
# software distributed under the License is distributed on an
# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
# KIND, either express or implied.  See the License for the
# specific language governing permissions and limitations
# under the License.
import os
import unittest
from unittest import mock

from moto import mock_s3

from airflow.providers.amazon.aws.hooks.s3 import S3Hook
from airflow.providers.amazon.aws.operators.s3_bucket_tagging import (
    S3DeleteBucketTaggingOperator,
    S3GetBucketTaggingOperator,
    S3PutBucketTaggingOperator,
)

BUCKET_NAME = os.environ.get("BUCKET_NAME", "test-airflow-bucket")
TAG_SET = [{'Key': 'Color', 'Value': 'Green'}]
TASK_ID = os.environ.get("TASK_ID", "test-s3-operator")


class TestS3GetBucketTaggingOperator(unittest.TestCase):
    def setUp(self):
        self.get_bucket_tagging_operator = S3GetBucketTaggingOperator(
            task_id=TASK_ID,
            bucket_name=BUCKET_NAME,
        )

    @mock_s3
    @mock.patch.object(S3Hook, "get_bucket_tagging")
    @mock.patch.object(S3Hook, "check_for_bucket")
    def test_execute_if_bucket_exist(self, mock_check_for_bucket, get_bucket_tagging):
        mock_check_for_bucket.return_value = True
        # execute s3 get bucket tagging operator
        self.get_bucket_tagging_operator.execute({})
        mock_check_for_bucket.assert_called_once_with(BUCKET_NAME)
        get_bucket_tagging.assert_called_once_with(BUCKET_NAME)

    @mock_s3
    @mock.patch.object(S3Hook, "get_bucket_tagging")
    @mock.patch.object(S3Hook, "check_for_bucket")
    def test_execute_if_not_bucket_exist(self, mock_check_for_bucket, get_bucket_tagging):
        mock_check_for_bucket.return_value = False
        # execute s3 get bucket tagging operator
        self.get_bucket_tagging_operator.execute({})
        mock_check_for_bucket.assert_called_once_with(BUCKET_NAME)
        get_bucket_tagging.assert_not_called()


class TestS3PutBucketTaggingOperator(unittest.TestCase):
    def setUp(self):
        self.put_bucket_tagging_operator = S3PutBucketTaggingOperator(
            task_id=TASK_ID,
            tag_set=TAG_SET,
            bucket_name=BUCKET_NAME,
        )

    @mock_s3
    @mock.patch.object(S3Hook, "put_bucket_tagging")
    @mock.patch.object(S3Hook, "check_for_bucket")
    def test_execute_if_bucket_exist(self, mock_check_for_bucket, put_bucket_tagging):
        mock_check_for_bucket.return_value = True
        # execute s3 put bucket tagging operator
        self.put_bucket_tagging_operator.execute({})
        mock_check_for_bucket.assert_called_once_with(BUCKET_NAME)
        put_bucket_tagging.assert_called_once_with(
            key=None, value=None, tag_set=TAG_SET, bucket_name=BUCKET_NAME
        )

    @mock_s3
    @mock.patch.object(S3Hook, "put_bucket_tagging")
    @mock.patch.object(S3Hook, "check_for_bucket")
    def test_execute_if_not_bucket_exist(self, mock_check_for_bucket, put_bucket_tagging):
        mock_check_for_bucket.return_value = False
        # execute s3 put bucket tagging operator
        self.put_bucket_tagging_operator.execute({})
        mock_check_for_bucket.assert_called_once_with(BUCKET_NAME)
        put_bucket_tagging.assert_not_called()


class TestS3DeleteBucketTaggingOperator(unittest.TestCase):
    def setUp(self):
        self.delete_bucket_tagging_operator = S3DeleteBucketTaggingOperator(
            task_id=TASK_ID,
            bucket_name=BUCKET_NAME,
        )

    @mock_s3
    @mock.patch.object(S3Hook, "delete_bucket_tagging")
    @mock.patch.object(S3Hook, "check_for_bucket")
    def test_execute_if_bucket_exist(self, mock_check_for_bucket, delete_bucket_tagging):
        mock_check_for_bucket.return_value = True
        # execute s3 get bucket tagging operator
```

Review comment: nit: Looks like this comment was copy/pasted from the Get test class, s/get/delete/g. Same in the method below.
[GitHub] [airflow] github-actions[bot] commented on pull request #14462: Add 'airflow variables list' command for 1.10.x transition version
github-actions[bot] commented on pull request #14462: URL: https://github.com/apache/airflow/pull/14462#issuecomment-786090909 The PR most likely needs to run full matrix of tests because it modifies parts of the core of Airflow. However, committers might decide to merge it quickly and take the risk. If they don't merge it quickly - please rebase it to the latest master at your convenience, or amend the last commit of the PR, and push it with --force-with-lease. This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: us...@infra.apache.org
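For contributors unfamiliar with the workflow the bot describes, the rebase-and-push sequence can look like the following. The remote and branch names (`upstream`, `origin`, `my-pr-branch`) are placeholders, not anything the bot specifies:

```shell
# Bring the PR branch up to date with the latest Airflow master,
# then push safely. Adjust remote/branch names to your setup.
git fetch upstream
git rebase upstream/master
# --force-with-lease refuses to overwrite the remote branch if someone
# else pushed to it since your last fetch, unlike plain --force.
git push --force-with-lease origin my-pr-branch
```

The `--force-with-lease` flag matters here because a rebase rewrites history, so a plain push is rejected, while a bare `--force` could silently clobber commits pushed by a co-author or a committer.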
[GitHub] [airflow] HusseinSamao commented on issue #14438: Airflow + CockroachDB throws DDL issues.
HusseinSamao commented on issue #14438: URL: https://github.com/apache/airflow/issues/14438#issuecomment-786085605 Hi @xinbinhuang, I would like to deploy Airflow to multiple regions, backed by a database that supports that topology. CockroachDB provides the capability to deploy my database in multiple regions, with all regions serving read and write operations. This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: us...@infra.apache.org
[GitHub] [airflow] ashb commented on pull request #14420: Don't create unittest.cfg when not running in unit test mode
ashb commented on pull request #14420: URL: https://github.com/apache/airflow/pull/14420#issuecomment-786079326 Dang, consistent failure in `tests/operators/test_python.py::TestPythonVirtualenvOperator::test_airflow_context` 🤔 This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: us...@infra.apache.org
[GitHub] [airflow] SpicySyntax commented on issue #13262: Dataflow Flex Template Operator
SpicySyntax commented on issue #13262: URL: https://github.com/apache/airflow/issues/13262#issuecomment-786053243 @TobKed @matthieucham Taking airflow/providers/google/cloud/hooks/dataflow.py from @terekete's [fix](https://github.com/apache/airflow/pull/13478) and putting it next to my DAG has been working for me as a short-term work-around. (I had to make some small tweaks to keep it from throwing exceptions; see [this gist](https://gist.github.com/SpicySyntax/5e58efa3f1c1e8a3cffa4ee211abde98).) This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: us...@infra.apache.org
[GitHub] [airflow] kaxil commented on a change in pull request #14457: Fixed deprecation message for "variables" command
kaxil commented on a change in pull request #14457: URL: https://github.com/apache/airflow/pull/14457#discussion_r582994664 ## File path: airflow/bin/cli.py ## @@ -414,7 +414,7 @@ def variables_export(args): _vars_wrapper(args, export=args.file) -@cli_utils.deprecated_action(new_name='variables') +@cli_utils.deprecated_action(new_name='variables list') Review comment: Cool, approved #14462 This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: us...@infra.apache.org
[GitHub] [airflow] r-richmond commented on issue #14396: Make context less nebulous
r-richmond commented on issue #14396: URL: https://github.com/apache/airflow/issues/14396#issuecomment-786041115

> Yeah. I was expecting exactly this answer :). So summarizing - what you've done now, you created a hybrid dataclass and dictionary put together. Now, my goal is to show you what further consequences you have to deal with.

[0_O](https://www.youtube.com/watch?v=4F4qzPbcFiA)

> There are hundreds of thousands of custom operators out there that are using this "public API". This means that any change we introduce now is going to be around at least a year from now. And until then 1.10 is still there as well (and will be there for quite some time). So if someone develops custom operators for their 1.10 Airflow, they will still use the dictionary - so we have probably tens of thousands of custom operators created still using the 'Dictionary' context for another year or two.

Yes, this is true. That's why I mentioned that now is the "easiest" time for this change, as it only gets harder in the future (in a perfect world, this would have been raised before 2.0 :s).

> Most likely many of the currently released operators are using the context to pass data (as custom dictionary values) between those methods - one can set a custom value in pre_execute() and retrieve it in execute(), or post_execute() reads whatever execute sets in the context. It was easy to use, we have not forbidden it, it is part of the API (this is the basic "property" of a dictionary - unlike a dataclass - that you can set any value with any key there). By introducing a dataclass we are breaking this property. You will not be able to set an arbitrary key in the context in pre_execute so that it is available in execute. If we implement the interim (lasting at least a year or more) hybrid dataclass <-> dictionary proposed above, this will continue to work but with deprecation warnings.
Again, a dataclass does not break this property, and in fact, after thinking about it, I think dataclasses provide the following significant advantages in this area which dict and TypedDict do not:

1. A safer way for users to set custom fields in context
2. A safer way for airflow maintainers to add new fields to context

Take the following example, Context Dataclass MVP V2:

```python
@dataclass
class Demo:  # context replacement
    id: str
    value_dc: int
    user_defined: Dict[str, Any] = field(default_factory=dict)

    def __getitem__(self, item):
        if item in self.__dict__.keys():
            logging.warning(
                msg=f"dictionary interface getitem on context is deprecated; "
                f"update to use the dataclass interface for standard fields like `{item}`"
            )
            return self.__dict__[item]
        elif item in self.user_defined:
            logging.warning(
                msg=f"dictionary interface getitem on context is deprecated; "
                f"update to use context.user_defined for custom fields like `{item}`"
            )
            return self.user_defined[item]
        else:
            raise KeyError

    def __setitem__(self, key: str, value):
        if key in self.__dict__.keys():
            msg = f"""dictionary interface setitem for standard fields is deprecated; update to use the dataclass interface for standard fields like `{key}`
            note: changing standard context fields is not supported and may have undefined behavior.
            If this is meant to be a custom field use context.user_defined instead"""
            logging.warning(msg=msg)
            self.__dict__[key] = value
        else:
            logging.warning(
                msg=f"dictionary interface setitem on context is deprecated; "
                f"update to use context.user_defined for custom fields like `{key}`"
            )
            self.user_defined[key] = value

    def keys(self):
        # added as an example to show how far we could go to have a non-breaking change for 2.1
        logging.warning(msg="dictionary interface keys is deprecated; update this to use the dataclass interface")
        temp = self.__dict__
        temp.update(self.user_defined)
        return temp


d = Demo(id="long_id", value_dc=1337)
print(d["id"])
d["new"] = 3
print(d["new"])
print(d.keys())
d["id"] = "warn"
```

returns

```
WARNING:root:dictionary interface getitem on context is deprecated; update to use the dataclass interface for standard fields like `id`
WARNING:root:dictionary interface setitem on context is deprecated; update to use context.user_defined for custom fields like `new`
WARNING:root:dictionary interface getitem on context is deprecated; update to use context.user_defined for custom fields like `new`
WARNING:root:dictionary interface keys is deprecated; update this to use the dataclass interface
WARNING:root:dictionary interface setitem for standard fields is deprecated; update to use the dataclass interface for standard fields l
```
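One consequence worth spelling out for the hybrid approach discussed here: Python's `**` unpacking only requires an object to implement `keys()` and `__getitem__`, so a hybrid context object can keep existing `python_callable(**context)` call sites working. A minimal standalone sketch, where the class and field names (`HybridContext`, `ds`, `user_defined`, `pull`) are illustrative and not proposed Airflow API:

```python
from dataclasses import dataclass, field
from typing import Any, Dict


@dataclass
class HybridContext:
    # "ds" stands in for any standard context field; "user_defined" is the
    # catch-all bucket for custom keys, as in the proposal above.
    ds: str
    user_defined: Dict[str, Any] = field(default_factory=dict)

    def keys(self):
        # Merge standard fields and custom fields so `**context` sees both.
        merged = {k: v for k, v in self.__dict__.items() if k != "user_defined"}
        merged.update(self.user_defined)
        return merged.keys()

    def __getitem__(self, item):
        if item != "user_defined" and item in self.__dict__:
            return self.__dict__[item]
        return self.user_defined[item]


def pull(ds=None, **kwargs):
    # A typical python_callable signature that unpacks the context.
    return ds, kwargs


ctx = HybridContext(ds="2021-02-25")
ctx.user_defined["custom"] = 1
# Works because ** unpacking needs only keys() and __getitem__:
print(pull(**ctx))  # ('2021-02-25', {'custom': 1})
```

This is the reason the `keys()` method appears in the MVP above even though it looks redundant at first glance: without it, every `**context` call site would break immediately instead of degrading gracefully with warnings.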
[GitHub] [airflow] XD-DENG commented on a change in pull request #14457: Fixed deprecation message for "variables" command
XD-DENG commented on a change in pull request #14457: URL: https://github.com/apache/airflow/pull/14457#discussion_r582979227 ## File path: airflow/bin/cli.py ## @@ -414,7 +414,7 @@ def variables_export(args): _vars_wrapper(args, export=args.file) -@cli_utils.deprecated_action(new_name='variables') +@cli_utils.deprecated_action(new_name='variables list') Review comment: Prepared PR https://github.com/apache/airflow/pull/14462 to supplement this, as mentioned above. This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: us...@infra.apache.org
[GitHub] [airflow] XD-DENG opened a new pull request #14462: Add 'airflow variables list' command for 1.10.x transition version
XD-DENG opened a new pull request #14462: URL: https://github.com/apache/airflow/pull/14462

This PR supplements PR https://github.com/apache/airflow/pull/14457, as discussed in https://github.com/apache/airflow/pull/14457#discussion_r582968411. Let me know if I missed anything or if there is a better way to handle this. Thanks.

---

**^ Add meaningful description above**

Read the **[Pull Request Guidelines](https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst#pull-request-guidelines)** for more information. In case of fundamental code change, an Airflow Improvement Proposal ([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals)) is needed. In case of a new dependency, check compliance with the [ASF 3rd Party License Policy](https://www.apache.org/legal/resolved.html#category-x). In case of backwards incompatible changes please leave a note in [UPDATING.md](https://github.com/apache/airflow/blob/master/UPDATING.md).

This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: us...@infra.apache.org
[GitHub] [airflow] ashb commented on a change in pull request #14439: Updated the latest version of the "Access Control" doc with new screenshots/ content
ashb commented on a change in pull request #14439: URL: https://github.com/apache/airflow/pull/14439#discussion_r582976024

## File path: docs/apache-airflow/security/access-control.rst

## @@ -37,101 +37,113 @@ regarding its security model.
 Default Roles
 '''''''''''''
-Airflow ships with a set of roles by default: Admin, User, Op, Viewer, and Public.
-Only ``Admin`` users could configure/alter the permissions for other roles. But it is not recommended
-that ``Admin`` users alter these default roles in any way by removing
-or adding permissions to these roles.
+
+Airflow uses roles to manage all permissions across your environment. Each role has permissions which provide varying levels of access to Airflow's key resources (DAGs, Connections, etc).
+
+An environment's roles can be found under the **Security** tab:
+
+.. image:: /img/list-roles.png
+
+Airflow ships with a set of roles by default:
 
 Admin
 ^^^^^
 
 ``Admin`` users have all possible permissions, including granting or revoking permissions from other users.
+Only ``Admin`` users can configure/alter the permissions for other roles. While they can reconfigure Airflow's other default roles, it's not recommended.
+The best practice is to instead create a new role with the desired permissions.
+
 Public
 ^^^^^^
 
 ``Public`` users (anonymous) don't have any permissions.

Review comment: Public is the role you get when you aren't logged in -- if you gave some permissions to this you could let all people with access to your Airflow webserver view things.

This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: us...@infra.apache.org
[GitHub] [airflow] XD-DENG commented on a change in pull request #14457: Fixed deprecation message for "variables" command
XD-DENG commented on a change in pull request #14457: URL: https://github.com/apache/airflow/pull/14457#discussion_r582968411 ## File path: airflow/bin/cli.py ## @@ -414,7 +414,7 @@ def variables_export(args): _vars_wrapper(args, export=args.file) -@cli_utils.deprecated_action(new_name='variables') +@cli_utils.deprecated_action(new_name='variables list') Review comment: I will shortly prepare a PR against branch v1-10-stable to add `airflow variables list` for `1.10.15`. Meanwhile, I think this PR is ok to be merged. This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: us...@infra.apache.org
[GitHub] [airflow] ashb commented on issue #14438: Airflow + CockroachDB throws DDL issues.
ashb commented on issue #14438: URL: https://github.com/apache/airflow/issues/14438#issuecomment-786019186 And while Cockroach is not officially supported, we will accept PRs to improve support for it. This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: us...@infra.apache.org
[GitHub] [airflow] ashb commented on a change in pull request #14457: Fixed deprecation message for "variables" command
ashb commented on a change in pull request #14457: URL: https://github.com/apache/airflow/pull/14457#discussion_r582963752 ## File path: airflow/bin/cli.py ## @@ -414,7 +414,7 @@ def variables_export(args): _vars_wrapper(args, export=args.file) -@cli_utils.deprecated_action(new_name='variables') +@cli_utils.deprecated_action(new_name='variables list') Review comment: Old style: `airflow variables -l` New `airflow variables list` etc. This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: us...@infra.apache.org
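For context, `cli_utils.deprecated_action` is, roughly, a decorator that warns the user about the old-style command and names its replacement before running the handler. The following is a simplified sketch of that idea, not Airflow's actual implementation; the handler name and return value are made up for illustration:

```python
import functools
import warnings


def deprecated_action(new_name):
    """Sketch: warn that an old-style CLI command is deprecated, naming its replacement."""
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            warnings.warn(
                f"This command is deprecated. Please use `airflow {new_name}` instead.",
                DeprecationWarning,
                stacklevel=2,
            )
            return func(*args, **kwargs)
        return wrapper
    return decorator


# Pointing new_name at "variables list" rather than just "variables" is the
# substance of this PR: the 2.0 equivalent of `airflow variables -l` is the
# `airflow variables list` subcommand, so that is what the message should say.
@deprecated_action(new_name='variables list')
def variables_list(args=None):
    return "ok"
```

The fix matters because a user who follows the old message literally and runs plain `airflow variables` in 2.0 gets the subcommand group's usage error, not the listing they expected.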
[GitHub] [airflow] XD-DENG commented on a change in pull request #14457: Fixed deprecation message for "variables" command
XD-DENG commented on a change in pull request #14457: URL: https://github.com/apache/airflow/pull/14457#discussion_r582957758 ## File path: airflow/bin/cli.py ## @@ -414,7 +414,7 @@ def variables_export(args): _vars_wrapper(args, export=args.file) -@cli_utils.deprecated_action(new_name='variables') +@cli_utils.deprecated_action(new_name='variables list') Review comment: In such a case, we will also need to make further changes to ensure `variables list` is present in 1.10.x (say 1.10.15) as well. This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: us...@infra.apache.org