[GitHub] [airflow] sudohainguyen closed issue #25456: Rendered SQL duplicated in log when using VerticaOperator
sudohainguyen closed issue #25456: Rendered SQL duplicated in log when using VerticaOperator URL: https://github.com/apache/airflow/issues/25456 -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org For queries about this service, please contact Infrastructure at: us...@infra.apache.org
[GitHub] [airflow] github-actions[bot] commented on pull request #23658: Fix k8s pod.execute randomly stuck indefinitely by logs consumption (#23497)
github-actions[bot] commented on PR #23658: URL: https://github.com/apache/airflow/pull/23658#issuecomment-1207302337 This pull request has been automatically marked as stale because it has not had recent activity. It will be closed in 5 days if no further activity occurs. Thank you for your contributions.
[GitHub] [airflow] github-actions[bot] commented on pull request #24452: Update dag-run.rst to ref dag_run.conf
github-actions[bot] commented on PR #24452: URL: https://github.com/apache/airflow/pull/24452#issuecomment-1207302323 This pull request has been automatically marked as stale because it has not had recent activity. It will be closed in 5 days if no further activity occurs. Thank you for your contributions.
[GitHub] [airflow] github-actions[bot] commented on pull request #24601: doc: update concepts overview diagram
github-actions[bot] commented on PR #24601: URL: https://github.com/apache/airflow/pull/24601#issuecomment-1207302318 This pull request has been automatically marked as stale because it has not had recent activity. It will be closed in 5 days if no further activity occurs. Thank you for your contributions.
[GitHub] [airflow] potiuk commented on pull request #25572: Better behaviour for self-update of Breeze
potiuk commented on PR #25572: URL: https://github.com/apache/airflow/pull/25572#issuecomment-1207282733 Yeah. I think that in the case of Breeze reinstallation, just reinstalling without asking makes much more sense.
[GitHub] [airflow] Taragolis commented on pull request #25494: Deprecate usage of `extra[host]` in AWS's connection
Taragolis commented on PR #25494: URL: https://github.com/apache/airflow/pull/25494#issuecomment-1207280912 @gmcrocetti Since https://github.com/apache/airflow/pull/25416 was merged, it would be nice to also update the placeholder for the UI: https://github.com/apache/airflow/blob/6657684ae0b558906e0da0693e6644511a419e0d/airflow/providers/amazon/aws/hooks/base_aws.py#L626-L628 > but after discussions it occurred me host is not semantically correct for the proposed use case. I just thought that we could inform users if they somehow use `Connection.host` (which might be defined in the current version of the provider) that this option won't work, the same way we inform them when they use `profile`, which relates to a deprecated value and not to the actual `profile_name`. But it's just an idea: https://github.com/apache/airflow/blob/2e2e86d9e43989ed039afc07fa8efe29bf5d170c/airflow/providers/amazon/aws/utils/connection_wrapper.py#L138-L147
[airflow] branch main updated (5a68213bc7 -> 6657684ae0)
This is an automated email from the ASF dual-hosted git repository.

potiuk pushed a change to branch main
in repository https://gitbox.apache.org/repos/asf/airflow.git

    from 5a68213bc7 Add map index to task logs api (#25568)
     add 6657684ae0 Update AIP confluence link (#25571)

No new revisions were added by this update.

Summary of changes:
 .github/PULL_REQUEST_TEMPLATE.md | 2 +-
 README.md                        | 2 +-
 docs/apache-airflow/project.rst  | 2 +-
 3 files changed, 3 insertions(+), 3 deletions(-)
[GitHub] [airflow] potiuk merged pull request #25571: Update AIP confluence link
potiuk merged PR #25571: URL: https://github.com/apache/airflow/pull/25571
[airflow] branch main updated: Add map index to task logs api (#25568)
This is an automated email from the ASF dual-hosted git repository.

potiuk pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/airflow.git

The following commit(s) were added to refs/heads/main by this push:
     new 5a68213bc7 Add map index to task logs api (#25568)
5a68213bc7 is described below

commit 5a68213bc7cd833ce027d7445a0eb8bb8d22635c
Author: pierrejeambrun
AuthorDate: Sat Aug 6 22:11:33 2022 +0200

    Add map index to task logs api (#25568)
---
 airflow/api_connexion/endpoints/log_endpoint.py    |   3 +-
 airflow/api_connexion/openapi/v1.yaml              |  12 +-
 airflow/www/static/js/types/api-generated.ts       |   6 +
 tests/api_connexion/endpoints/test_log_endpoint.py | 153 -
 4 files changed, 135 insertions(+), 39 deletions(-)

diff --git a/airflow/api_connexion/endpoints/log_endpoint.py b/airflow/api_connexion/endpoints/log_endpoint.py
index 171cacb076..0ae7b2540f 100644
--- a/airflow/api_connexion/endpoints/log_endpoint.py
+++ b/airflow/api_connexion/endpoints/log_endpoint.py
@@ -48,6 +48,7 @@ def get_log(
     task_id: str,
     task_try_number: int,
     full_content: bool = False,
+    map_index: int = -1,
     token: Optional[str] = None,
     session: Session = NEW_SESSION,
 ) -> APIResponse:
@@ -72,13 +73,13 @@ def get_log(
     task_log_reader = TaskLogReader()
     if not task_log_reader.supports_read:
         raise BadRequest("Task log handler does not support read logs.")
-
     ti = (
         session.query(TaskInstance)
         .filter(
             TaskInstance.task_id == task_id,
             TaskInstance.dag_id == dag_id,
             TaskInstance.run_id == dag_run_id,
+            TaskInstance.map_index == map_index,
         )
         .join(TaskInstance.dag_run)
         .one_or_none()
diff --git a/airflow/api_connexion/openapi/v1.yaml b/airflow/api_connexion/openapi/v1.yaml
index 1e8d6dd73b..74baeaa122 100644
--- a/airflow/api_connexion/openapi/v1.yaml
+++ b/airflow/api_connexion/openapi/v1.yaml
@@ -1461,6 +1461,7 @@ paths:
         - $ref: '#/components/parameters/TaskID'
         - $ref: '#/components/parameters/TaskTryNumber'
         - $ref: '#/components/parameters/FullContent'
+        - $ref: '#/components/parameters/FilterMapIndex'
         - $ref: '#/components/parameters/ContinuationToken'
       get:
@@ -4364,7 +4365,8 @@ components:
         type: string
       required: true
       description: The XCom key.
-# Filter
+
+# Filters
     FilterExecutionDateGTE:
       in: query
       name: execution_date_gte
@@ -4528,6 +4530,13 @@ components:
         type: integer
       description: The map index that updated the dataset.
+    FilterMapIndex:
+      in: query
+      name: map_index
+      schema:
+        type: integer
+      description: Filter on map index for mapped task.
+
     OrderBy:
       in: query
       name: order_by
@@ -4553,7 +4562,6 @@ components:
         *New in version 2.1.1*
 # Other parameters
-
     FileToken:
       in: path
       name: file_token
diff --git a/airflow/www/static/js/types/api-generated.ts b/airflow/www/static/js/types/api-generated.ts
index a4296ec137..0b016f430f 100644
--- a/airflow/www/static/js/types/api-generated.ts
+++ b/airflow/www/static/js/types/api-generated.ts
@@ -422,6 +422,8 @@ export interface paths {
          * By default, only the first fragment will be returned.
          */
         full_content?: components["parameters"]["FullContent"];
+        /** Filter on map index for mapped task. */
+        map_index?: components["parameters"]["FilterMapIndex"];
         /**
          * A token that allows you to continue fetching logs.
          * If passed, it will specify the location from which the download should be continued.
         */
@@ -2113,6 +2115,8 @@ export interface components {
     FilterSourceRunID: string;
     /** @description The map index that updated the dataset. */
     FilterSourceMapIndex: number;
+    /** @description Filter on map index for mapped task. */
+    FilterMapIndex: number;
     /**
      * @description The name of the field to order the results by.
      * Prefix a field name with `-` to reverse the sort order.
      */
@@ -3432,6 +3436,8 @@ export interface operations {
         * By default, only the first fragment will be returned.
         */
         full_content?: components["parameters"]["FullContent"];
+        /** Filter on map index for mapped task. */
+        map_index?: components["parameters"]["FilterMapIndex"];
         /**
         * A token that allows you to continue fetching logs.
         * If passed, it will specify the location from which the download should be continued.
diff --git a/tests/api_connexion/endpoints/test_log_endpoint.py b/tests/api_connexion/endpoints/test_log_endpoint.py
index 1b226be96f..d9a5276306 100644
--- a/tests/api_connexion/endpoints/test_log_endpoint.py
+++ b/tests/api_connexion/endpoints/test_log_endpoint.py
@@ -26,6 +26,7 @@ from itsdangerous.url_safe import URLSafeSeri
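For context, a small sketch of how a client might call the logs endpoint once `map_index` is accepted. The base URL and the DAG/run/task IDs below are made-up values; only the route shape and the `map_index`/`full_content` query parameters follow the diff above:

```python
from urllib.parse import urlencode


def build_log_url(base, dag_id, dag_run_id, task_id, try_number, map_index=-1):
    # map_index defaults to -1, which is the value used for unmapped task
    # instances (matching the new default in get_log above).
    path = (
        f"{base}/api/v1/dags/{dag_id}/dagRuns/{dag_run_id}"
        f"/taskInstances/{task_id}/logs/{try_number}"
    )
    return f"{path}?{urlencode({'map_index': map_index, 'full_content': 'true'})}"


# Request the logs of the third mapped instance (map index 2) of task "extract":
print(build_log_url("http://localhost:8080", "my_dag", "manual__2022-08-06",
                    "extract", 1, map_index=2))
```

Omitting `map_index` keeps the previous behaviour of addressing the unmapped task instance.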
[GitHub] [airflow] potiuk merged pull request #25568: Add support for mapped task on logs API
potiuk merged PR #25568: URL: https://github.com/apache/airflow/pull/25568
[airflow] branch main updated (a9b492b28e -> db20423eec)
This is an automated email from the ASF dual-hosted git repository.

potiuk pushed a change to branch main
in repository https://gitbox.apache.org/repos/asf/airflow.git

    from a9b492b28e Fix yamllint check with lines too long (#25573)
     add db20423eec Optimize log when using VerticaOperator (#25566)

No new revisions were added by this update.

Summary of changes:
 airflow/providers/vertica/operators/vertica.py | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)
[GitHub] [airflow] potiuk commented on a diff in pull request #25566: Optimize log when using VerticaOperator
potiuk commented on code in PR #25566:
URL: https://github.com/apache/airflow/pull/25566#discussion_r939570957

## airflow/providers/vertica/operators/vertica.py:

@@ -48,5 +48,5 @@ def __init__(
     def execute(self, context: 'Context') -> None:
         self.log.info('Executing: %s', self.sql)
-        hook = VerticaHook(vertica_conn_id=self.vertica_conn_id)
+        hook = VerticaHook(vertica_conn_id=self.vertica_conn_id, log_sql=False)

Review Comment:
   But it really doesn't matter :)
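The `log_sql=False` change discussed above prevents the rendered SQL from appearing twice in the task log: the operator already logs it, so the hook is told not to. A toy sketch of that pattern follows; the class and attribute names are illustrative stand-ins, not the real Airflow `VerticaHook`/`VerticaOperator` API:

```python
class HookSketch:
    """Toy stand-in for a SQL hook that can optionally log each statement."""

    def __init__(self, log_sql=True):
        self.log_sql = log_sql
        self.logged = []  # recorded instead of using the logging module

    def run(self, sql):
        if self.log_sql:
            self.logged.append(f"Running statement: {sql}")
        # ... a real hook would execute against the database here ...
        return f"executed {sql}"


class VerticaOperatorSketch:
    """Toy operator: it logs the SQL itself, so it creates the hook with
    log_sql=False to avoid the statement appearing a second time."""

    def __init__(self, sql):
        self.sql = sql
        self.logged = []

    def execute(self):
        self.logged.append(f"Executing: {self.sql}")
        hook = HookSketch(log_sql=False)  # hook stays silent about the SQL
        result = hook.run(self.sql)
        self.logged.extend(hook.logged)  # nothing extra was logged by the hook
        return result
```

With `log_sql=True` the same statement would show up twice, which is exactly the duplication the PR removes.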
[GitHub] [airflow] potiuk merged pull request #25566: Optimize log when using VerticaOperator
potiuk merged PR #25566: URL: https://github.com/apache/airflow/pull/25566
[GitHub] [airflow] boring-cyborg[bot] commented on pull request #25566: Optimize log when using VerticaOperator
boring-cyborg[bot] commented on PR #25566: URL: https://github.com/apache/airflow/pull/25566#issuecomment-1207276066 Awesome work, congrats on your first merged pull request!
[airflow] branch main updated (a796d9377d -> a9b492b28e)
This is an automated email from the ASF dual-hosted git repository.

potiuk pushed a change to branch main
in repository https://gitbox.apache.org/repos/asf/airflow.git

    from a796d9377d Turn Airflow versions into a free-form field for Helm/Providers (#25564)
     add a9b492b28e Fix yamllint check with lines too long (#25573)

No new revisions were added by this update.

Summary of changes:
 .github/ISSUE_TEMPLATE/airflow_helmchart_bug_report.yml | 4 +++-
 .github/ISSUE_TEMPLATE/airflow_providers_bug_report.yml | 4 +++-
 2 files changed, 6 insertions(+), 2 deletions(-)
[GitHub] [airflow] potiuk merged pull request #25573: Fix yamllint check with lines too long
potiuk merged PR #25573: URL: https://github.com/apache/airflow/pull/25573
[GitHub] [airflow] potiuk opened a new pull request, #25573: Fix yamllint check with lines too long
potiuk opened a new pull request, #25573: URL: https://github.com/apache/airflow/pull/25573 --- **^ Add meaningful description above** Read the **[Pull Request Guidelines](https://github.com/apache/airflow/blob/main/CONTRIBUTING.rst#pull-request-guidelines)** for more information. In case of fundamental code changes, an Airflow Improvement Proposal ([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals)) is needed. In case of a new dependency, check compliance with the [ASF 3rd Party License Policy](https://www.apache.org/legal/resolved.html#category-x). In case of backwards incompatible changes please leave a note in a newsfragment file, named `{pr_number}.significant.rst` or `{issue_number}.significant.rst`, in [newsfragments](https://github.com/apache/airflow/tree/main/newsfragments).
[GitHub] [airflow] dwreeves commented on pull request #25432: Avoid requirement that AWS Secret Manager JSON values be urlencoded.
dwreeves commented on PR #25432: URL: https://github.com/apache/airflow/pull/25432#issuecomment-1207272769 Thank you for the good feedback @vincbeck, and thank you for the approval @potiuk!
[GitHub] [airflow] potiuk commented on pull request #25572: Better behaviour for self-update of Breeze
potiuk commented on PR #25572: URL: https://github.com/apache/airflow/pull/25572#issuecomment-1207272030 CC: @blag - you might be interested; I think this one will improve the Breeze experience when there are multiple workspaces. It still upgrades, but it will automatically re-install and re-run Breeze if it is installed from a different folder or with older dependencies. I think that maybe (following some of your earlier comments) we should simply force-reinstall Breeze in all those cases? I see no big problem with that, especially since this should be a) a rather rare case, b) rather fast (reinstalling Breeze usually takes just a couple of seconds). This will avoid asking the question and the timeout (which you had complained about before). I'd love to hear what you think.
[GitHub] [airflow] potiuk opened a new pull request, #25572: Better behaviour for self-update of Breeze
potiuk opened a new pull request, #25572: URL: https://github.com/apache/airflow/pull/25572 Breeze self-upgrade behaved somewhat erratically: 1) when executed manually as `self-upgrade`, it printed "repeat the command", which made no sense; 2) when triggered as part of an environment check (i.e. when the Breeze version was different or it was installed from another workspace), it upgraded and also printed "repeat the command", which made more sense but still forced the user to repeat the command. This change fixes both cases: 1) there is no "repeat the command" message when you run the `self-upgrade` command; 2) when self-upgrade is triggered as part of another command, the original command is automatically re-run (with execvl, so replacing the previous process) after the self-upgrade completes.
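A minimal sketch of the re-run-after-upgrade idea described above, assuming the standard `os.execv` process-replacement mechanism (the helper name `build_rerun_argv` is hypothetical, not part of Breeze):

```python
import os
import sys


def build_rerun_argv(executable, original_argv):
    """Argument vector for os.execv: re-run the original command in a fresh
    process (replacing the current one) once the self-upgrade has finished."""
    return [executable] + list(original_argv)


# After the upgrade completes, the wrapper would hand control back with:
#     os.execv(sys.executable, build_rerun_argv(sys.executable, sys.argv))
# os.execv never returns; the new process picks up the freshly installed code,
# so the user never has to "repeat the command" manually.
```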
[GitHub] [airflow] gmcrocetti commented on pull request #25494: `AwsBaseHook` not using Connection's host
gmcrocetti commented on PR #25494: URL: https://github.com/apache/airflow/pull/25494#issuecomment-1207265249 The original goal of this PR was to allow using `Connection.host`, as described in #17833, but after discussions it occurred to me that `host` is not semantically correct for the proposed use case. As we all know, `host` is just one part of a [URL](https://www.rfc-editor.org/rfc/rfc1738#section-3.3) in the HTTP scheme. Since boto expects a complete HTTP URL, `host` is not the most accurate description for this parameter, which is why I'm proposing renaming it to `endpoint_url`. Please let me know if that doesn't make sense.
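To illustrate the point that the host is only one component of the URL boto expects: `endpoint_url` is a real boto3 client parameter, but the URL value below is made up for the example:

```python
from urllib.parse import urlsplit

# A full endpoint URL such as boto3 expects; the value itself is invented.
endpoint_url = "https://s3.custom-endpoint.example.com:9000"

parts = urlsplit(endpoint_url)
print(parts.scheme)    # 'https' - lost if only a host were stored
print(parts.hostname)  # 's3.custom-endpoint.example.com' - the `host` part alone
print(parts.port)      # 9000 - also lost if only a host were stored

# boto3 takes the whole URL, not just the host; this call is illustrative:
#     boto3.client("s3", endpoint_url=endpoint_url)
```

Storing only `parts.hostname` would drop the scheme and port, which is why `endpoint_url` is the more accurate name.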
[GitHub] [airflow] dstandish opened a new pull request, #25571: Update AIP confluence link
dstandish opened a new pull request, #25571: URL: https://github.com/apache/airflow/pull/25571 Was renamed from "Improvements" to "Improvement"
[airflow] branch main updated: Turn Airflow versions into a free-form field for Helm/Providers (#25564)
This is an automated email from the ASF dual-hosted git repository.

potiuk pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/airflow.git

The following commit(s) were added to refs/heads/main by this push:
     new a796d9377d Turn Airflow versions into a free-form field for Helm/Providers (#25564)
a796d9377d is described below

commit a796d9377dca1aaf30d2b6dbad65bf66c6fb18fb
Author: Jarek Potiuk
AuthorDate: Sat Aug 6 20:52:48 2022 +0200

    Turn Airflow versions into a free-form field for Helm/Providers (#25564)

    * Turn Airflow versions in free-form field for Helm/Providers
---
 .../airflow_helmchart_bug_report.yml | 27 ++
 .../airflow_providers_bug_report.yml | 27 ++
 dev/README_RELEASE_AIRFLOW.md        |  2 +-
 3 files changed, 5 insertions(+), 51 deletions(-)

diff --git a/.github/ISSUE_TEMPLATE/airflow_helmchart_bug_report.yml b/.github/ISSUE_TEMPLATE/airflow_helmchart_bug_report.yml
index 27ce1cad56..2a700d8991 100644
--- a/.github/ISSUE_TEMPLATE/airflow_helmchart_bug_report.yml
+++ b/.github/ISSUE_TEMPLATE/airflow_helmchart_bug_report.yml
@@ -38,33 +38,10 @@ body:
         - "main (development)"
     validations:
       required: true
-  - type: dropdown
+  - type: input
     attributes:
       label: Apache Airflow version
-      description: >
-        What Apache Airflow version are you using? Only Airflow 2 is supported for bugs. If you wish to
-        discuss Airflow 1.10, open a [discussion](https://github.com/apache/airflow/discussions) instead!
-      multiple: false
-      options:
-        - "2.3.3 (latest released)"
-        - "2.3.2"
-        - "2.3.1"
-        - "2.3.0"
-        - "2.2.5"
-        - "2.2.4"
-        - "2.2.3"
-        - "2.2.2"
-        - "2.2.1"
-        - "2.2.0"
-        - "2.1.4"
-        - "2.1.3"
-        - "2.1.2"
-        - "2.1.1"
-        - "2.1.0"
-        - "2.0.2"
-        - "2.0.1"
-        - "2.0.0"
-        - "main (development)"
+      description: What Apache Airflow version are you using? [Only Airflow 2 is supported](https://github.com/apache/airflow#version-life-cycle) for bugs.
     validations:
       required: true
   - type: input
diff --git a/.github/ISSUE_TEMPLATE/airflow_providers_bug_report.yml b/.github/ISSUE_TEMPLATE/airflow_providers_bug_report.yml
index 20845ea0c7..65939ba81a 100644
--- a/.github/ISSUE_TEMPLATE/airflow_providers_bug_report.yml
+++ b/.github/ISSUE_TEMPLATE/airflow_providers_bug_report.yml
@@ -106,33 +106,10 @@ body:
       label: Versions of Apache Airflow Providers
       description: What Apache Airflow Providers versions are you using?
       placeholder: You can use `pip freeze | grep apache-airflow-providers` (you can leave only relevant ones)
-  - type: dropdown
+  - type: input
     attributes:
       label: Apache Airflow version
-      description: >
-        What Apache Airflow version are you using? Only Airflow 2 is supported for bugs. If you wish to
-        discuss Airflow 1.10, open a [discussion](https://github.com/apache/airflow/discussions) instead!
-      multiple: false
-      options:
-        - "2.3.3 (latest released)"
-        - "2.3.2"
-        - "2.3.1"
-        - "2.3.0"
-        - "2.2.5"
-        - "2.2.4"
-        - "2.2.3"
-        - "2.2.2"
-        - "2.2.1"
-        - "2.2.0"
-        - "2.1.4"
-        - "2.1.3"
-        - "2.1.2"
-        - "2.1.1"
-        - "2.1.0"
-        - "2.0.2"
-        - "2.0.1"
-        - "2.0.0"
-        - "main (development)"
+      description: What Apache Airflow version are you using? [Only Airflow 2 is supported](https://github.com/apache/airflow#version-life-cycle) for bugs.
     validations:
       required: true
   - type: input
diff --git a/dev/README_RELEASE_AIRFLOW.md b/dev/README_RELEASE_AIRFLOW.md
index 98aedbcfb3..c74cc3a8d8 100644
--- a/dev/README_RELEASE_AIRFLOW.md
+++ b/dev/README_RELEASE_AIRFLOW.md
@@ -1172,7 +1172,7 @@ This includes:
 - For major/minor release, Update version in `setup.py` and `docs/docker-stack/` to the next likely minor version release.
 - Update the `REVISION_HEADS_MAP` at airflow/utils/db.py to include the revision head of the release even if there are no migrations.
 - Sync `RELEASE_NOTES.rst` (including deleting relevant `newsfragments`) and `README.md` changes
-- Updating issue templates in `.github/ISSUE_TEMPLATE/` with the new version
+- Updating `airflow_bug_report.yml` issue template in `.github/ISSUE_TEMPLATE/` with the new version
 - Updating `Dockerfile` with the new version

 ## Update default Airflow version in the helm chart
[GitHub] [airflow] potiuk merged pull request #25564: Turn Airflow versions into a free-form field for Helm/Providers
potiuk merged PR #25564: URL: https://github.com/apache/airflow/pull/25564
[GitHub] [airflow] potiuk commented on a diff in pull request #25564: Turn Airflow versions into a free-form field for Helm/Providers
potiuk commented on code in PR #25564:
URL: https://github.com/apache/airflow/pull/25564#discussion_r939561309

## .github/ISSUE_TEMPLATE/airflow_providers_bug_report.yml:

@@ -106,33 +106,10 @@ body:
       label: Versions of Apache Airflow Providers
       description: What Apache Airflow Providers versions are you using?
       placeholder: You can use `pip freeze | grep apache-airflow-providers` (you can leave only relevant ones)
-  - type: dropdown
+  - type: input
     attributes:
       label: Apache Airflow version
-      description: >
-        What Apache Airflow version are you using? Only Airflow 2 is supported for bugs. If you wish to
-        discuss Airflow 1.10, open a [discussion](https://github.com/apache/airflow/discussions) instead!
-      multiple: false
-      options:
-        - "2.3.3 (latest released)"
-        - "2.3.2"
-        - "2.3.1"
-        - "2.3.0"
-        - "2.2.5"
-        - "2.2.4"
-        - "2.2.3"
-        - "2.2.2"
-        - "2.2.1"
-        - "2.2.0"
-        - "2.1.4"
-        - "2.1.3"
-        - "2.1.2"
-        - "2.1.1"
-        - "2.1.0"
-        - "2.0.2"
-        - "2.0.1"
-        - "2.0.0"
-        - "main (development)"
+      description: What Apache Airflow version are you using? Only Airflow 2 is supported for bugs.

Review Comment:
   ```suggestion
         description: What Apache Airflow version are you using? [Only Airflow 2 is supported](https://github.com/apache/airflow#version-life-cycle) for bugs.
   ```
[airflow] branch main updated: Correct compile assets command in tmux welcome message (#25570)
This is an automated email from the ASF dual-hosted git repository.

potiuk pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/airflow.git

The following commit(s) were added to refs/heads/main by this push:
     new 06f35ea3d4 Correct compile assets command in tmux welcome message (#25570)
06f35ea3d4 is described below

commit 06f35ea3d4104ba5e15d5b6b25ed62ced8847a8c
Author: Josh Fell <48934154+josh-f...@users.noreply.github.com>
AuthorDate: Sat Aug 6 14:50:36 2022 -0400

    Correct compile assets command in tmux welcome message (#25570)
---
 scripts/in_container/run_tmux_welcome.sh | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/scripts/in_container/run_tmux_welcome.sh b/scripts/in_container/run_tmux_welcome.sh
index 41e18ef12d..1fef909296 100755
--- a/scripts/in_container/run_tmux_welcome.sh
+++ b/scripts/in_container/run_tmux_welcome.sh
@@ -25,7 +25,7 @@ echo " NOTE! If you want to rebuild webserver assets dynamically:"
 echo
 echo "* Restart airflow webserver with '-d' flag."
 echo "  AND (in a separate terminal in your host):"
-echo "* Run 'breeze www-compile-assets --dev'."
+echo "* Run 'breeze compile-www-assets --dev'."
 echo "  OR"
 echo "* Run 'yarn dev' in the 'airflow/www' if you have yarn installed and want to watch recompiling happens."
 echo
[GitHub] [airflow] potiuk merged pull request #25570: Correct compile assets command in tmux welcome message
potiuk merged PR #25570: URL: https://github.com/apache/airflow/pull/25570
[GitHub] [airflow] josh-fell opened a new pull request, #25570: Correct compile assets command in tmux welcome message
josh-fell opened a new pull request, #25570: URL: https://github.com/apache/airflow/pull/25570 The current message has a couple of words reversed in the compile-assets command.
[GitHub] [airflow] josh-fell commented on a diff in pull request #25564: Turn Airflow versions into a free-form field for Helm/Providers
josh-fell commented on code in PR #25564:
URL: https://github.com/apache/airflow/pull/25564#discussion_r939557490

## .github/ISSUE_TEMPLATE/airflow_helmchart_bug_report.yml:

@@ -38,33 +38,10 @@ body:
         - "main (development)"
     validations:
       required: true
-  - type: dropdown
+  - type: input
     attributes:
       label: Apache Airflow version
-      description: >
-        What Apache Airflow version are you using? Only Airflow 2 is supported for bugs. If you wish to
-        discuss Airflow 1.10, open a [discussion](https://github.com/apache/airflow/discussions) instead!
-      multiple: false
-      options:
-        - "2.3.3 (latest released)"
-        - "2.3.2"
-        - "2.3.1"
-        - "2.3.0"
-        - "2.2.5"
-        - "2.2.4"
-        - "2.2.3"
-        - "2.2.2"
-        - "2.2.1"
-        - "2.2.0"
-        - "2.1.4"
-        - "2.1.3"
-        - "2.1.2"
-        - "2.1.1"
-        - "2.1.0"
-        - "2.0.2"
-        - "2.0.1"
-        - "2.0.0"
-        - "main (development)"
+      description: What Apache Airflow version are you using? Only Airflow 2 is supported for bugs.

Review Comment:
   ```suggestion
         description: What Apache Airflow version are you using? [Only Airflow 2 is supported](https://github.com/apache/airflow#version-life-cycle) for bugs.
   ```
   WDYT about linking to the version lifecycle section of the README in the description for both templates?
[GitHub] [airflow] Taragolis opened a new pull request, #25569: Get boto3.session.Session by appropriate method
Taragolis opened a new pull request, #25569: URL: https://github.com/apache/airflow/pull/25569 Right now, AWS hooks usually obtain a `boto3.session.Session` by calling `_get_credentials`. Despite its name, this method returns `Tuple[boto3.session.Session, Optional[str]]` and does not actually return credentials, only the session and `endpoint_url`. Since #25336, all hook/connection configs are stored in the `conn_config` cached property, so we no longer need to call `_get_credentials` to obtain `endpoint_url` or any other connection attribute. This PR also changes all AWS hooks that use `_get_credentials`. Right now (main branch) no other providers use this method directly; however, a previous version of the PostgreSQL provider did (#25424), so we can't remove `_get_credentials`. One minor change: remove the claim that an S3 bucket has no region (#20463). In fact, AWS S3 buckets do have a region; on the other hand, you can access any bucket from any region if you have permission (the only limitation may be that the endpoint must be in the same AWS region partition). cc: @vincbeck @ferruzzi @o-nikolas @potiuk
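The shape of the change can be sketched with a heavily simplified stand-in for the hook: instead of unpacking a `(session, endpoint_url)` tuple from `_get_credentials`, the hook reads `endpoint_url` (and any other connection attribute) directly from a cached `conn_config` property. All class and attribute names below are illustrative stand-ins so the sketch needs no boto3 or Airflow, not the provider's actual API:

```python
from dataclasses import dataclass
from functools import cached_property
from typing import Optional


@dataclass(frozen=True)
class ConnConfig:
    """Stand-in for the parsed Airflow connection (fields are hypothetical)."""
    region_name: Optional[str] = None
    endpoint_url: Optional[str] = None


class FakeSession:
    """Stand-in for boto3.session.Session so the sketch needs no boto3."""
    def __init__(self, region_name: Optional[str] = None) -> None:
        self.region_name = region_name


class AwsHookSketch:
    def __init__(self, region_name=None, endpoint_url=None) -> None:
        self._region_name = region_name
        self._endpoint_url = endpoint_url

    @cached_property
    def conn_config(self) -> ConnConfig:
        # Parsed once and cached; the real hook builds this from the
        # Airflow connection and its extras.
        return ConnConfig(
            region_name=self._region_name, endpoint_url=self._endpoint_url
        )

    def get_session(self) -> FakeSession:
        # No tuple unpacking: the session is built directly from the
        # cached connection config, and endpoint_url is read separately.
        return FakeSession(region_name=self.conn_config.region_name)
```

With this layout, `hook.conn_config.endpoint_url` replaces the second element of the old `_get_credentials` tuple.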
[airflow] branch constraints-main updated: Updating constraints. Build id:2809560098
This is an automated email from the ASF dual-hosted git repository. github-bot pushed a commit to branch constraints-main in repository https://gitbox.apache.org/repos/asf/airflow.git The following commit(s) were added to refs/heads/constraints-main by this push: new 4a5bf7b501 Updating constraints. Build id:2809560098 4a5bf7b501 is described below commit 4a5bf7b501b0c8cf3fcd8c0ce73fbbf8a146ab36 Author: Automated GitHub Actions commit AuthorDate: Sat Aug 6 17:25:19 2022 + Updating constraints. Build id:2809560098 This update in constraints is automatically committed by the CI 'constraints-push' step based on HEAD of 'refs/heads/main' in 'apache/airflow' with commit sha f1d1914d73dfa9832ff69ed413f4ec39814e5837. All tests passed in this build so we determined we can push the updated constraints. See https://github.com/apache/airflow/blob/main/README.md#installing-from-pypi for details. --- constraints-3.10.txt | 16 constraints-3.7.txt | 12 ++-- constraints-3.8.txt | 16 constraints-3.9.txt | 16 constraints-no-providers-3.10.txt | 12 ++-- constraints-no-providers-3.7.txt | 8 constraints-no-providers-3.8.txt | 12 ++-- constraints-no-providers-3.9.txt | 12 ++-- constraints-source-providers-3.10.txt | 16 constraints-source-providers-3.7.txt | 12 ++-- constraints-source-providers-3.8.txt | 16 constraints-source-providers-3.9.txt | 16 12 files changed, 82 insertions(+), 82 deletions(-) diff --git a/constraints-3.10.txt b/constraints-3.10.txt index 5ede29b3eb..6f91e5a287 100644 --- a/constraints-3.10.txt +++ b/constraints-3.10.txt @@ -1,5 +1,5 @@ # -# This constraints file was automatically generated on 2022-08-05T18:45:02Z +# This constraints file was automatically generated on 2022-08-06T17:24:58Z # via "eager-upgrade" mechanism of PIP. For the "main" branch of Airflow. # This variant of constraints install uses the HEAD of the branch version for 'apache-airflow' but installs # the providers from PIP-released packages at the moment of the constraint generation. 
@@ -211,7 +211,7 @@ cron-descriptor==1.2.31 croniter==1.3.5 cryptography==36.0.2 curlify==2.2.1 -dask==2022.7.1 +dask==2022.8.0 databricks-sql-connector==2.0.2 datadog==0.44.0 db-dtypes==1.0.2 @@ -219,7 +219,7 @@ decorator==5.1.1 defusedxml==0.7.1 dill==0.3.1.1 distlib==0.3.5 -distributed==2022.7.1 +distributed==2022.8.0 dnspython==2.2.1 docker==5.0.3 docopt==0.6.2 @@ -256,7 +256,7 @@ google-api-core==2.8.2 google-api-python-client==1.12.11 google-auth-httplib2==0.1.0 google-auth-oauthlib==0.5.2 -google-auth==2.9.1 +google-auth==2.10.0 google-cloud-aiplatform==1.16.1 google-cloud-appengine-logging==1.1.3 google-cloud-audit-log==0.2.3 @@ -311,7 +311,7 @@ hmsclient==0.1.1 httpcore==0.15.0 httplib2==0.20.4 httpx==0.23.0 -humanize==4.2.3 +humanize==4.3.0 hvac==0.11.2 identify==2.5.3 idna==3.3 @@ -483,7 +483,7 @@ pytzdata==2020.1 pywinrm==0.4.3 pyzmq==23.2.0 qds-sdk==1.16.1 -readme-renderer==35.0 +readme-renderer==36.0 redis==3.5.3 redshift-connector==2.0.908 requests-file==1.5.1 @@ -506,7 +506,7 @@ scrapbook==0.5.0 semver==2.13.0 sendgrid==6.9.7 sentinels==1.0.0 -sentry-sdk==1.9.1 +sentry-sdk==1.9.2 setproctitle==1.3.1 simple-salesforce==1.12.1 six==1.16.0 @@ -607,7 +607,7 @@ websocket-client==1.3.3 wrapt==1.14.1 xmltodict==0.13.0 yamllint==1.27.1 -yandexcloud==0.176.0 +yandexcloud==0.177.0 yarl==1.8.1 zeep==4.1.0 zenpy==2.0.24 diff --git a/constraints-3.7.txt b/constraints-3.7.txt index 534afa9212..c254e583b2 100644 --- a/constraints-3.7.txt +++ b/constraints-3.7.txt @@ -1,5 +1,5 @@ # -# This constraints file was automatically generated on 2022-08-05T18:45:22Z +# This constraints file was automatically generated on 2022-08-06T17:25:16Z # via "eager-upgrade" mechanism of PIP. For the "main" branch of Airflow. # This variant of constraints install uses the HEAD of the branch version for 'apache-airflow' but installs # the providers from PIP-released packages at the moment of the constraint generation. 
@@ -255,7 +255,7 @@ google-api-core==2.8.2 google-api-python-client==1.12.11 google-auth-httplib2==0.1.0 google-auth-oauthlib==0.5.2 -google-auth==2.9.1 +google-auth==2.10.0 google-cloud-aiplatform==1.16.1 google-cloud-appengine-logging==1.1.3 google-cloud-audit-log==0.2.3 @@ -310,7 +310,7 @@ hmsclient==0.1.1 httpcore==0.15.0 httplib2==0.20.4 httpx==0.23.0 -humanize==4.2.3 +humanize==4.3.0 hvac==0.11.2 identify==2.5.3 idna==3.3 @@ -483,7 +483,7 @@ pytzdata==2020.1 pywinrm==0.4.3 pyzmq==23.2.0 qds-sdk==1.16.1 -readme-renderer==35.0 +readme-renderer==36.0 redis==3.5.3 redshift-connector==2.0.908 requests-file==1.5.1 @@ -506,7 +506,7 @@ scrapbook==0.5.0 semver==2.13.0 sendgrid==6.9.7
[GitHub] [airflow] potiuk commented on a diff in pull request #25566: Optimize log when using VerticaOperator
potiuk commented on code in PR #25566: URL: https://github.com/apache/airflow/pull/25566#discussion_r939552045 ## airflow/providers/vertica/operators/vertica.py: ## @@ -48,5 +48,5 @@ def __init__( def execute(self, context: 'Context') -> None: self.log.info('Executing: %s', self.sql) -hook = VerticaHook(vertica_conn_id=self.vertica_conn_id) +hook = VerticaHook(vertica_conn_id=self.vertica_conn_id, log_sql=False) Review Comment: Ah, but it is the hook, so yeah, maybe better.
[GitHub] [airflow] potiuk commented on a diff in pull request #25566: Optimize log when using VerticaOperator
potiuk commented on code in PR #25566: URL: https://github.com/apache/airflow/pull/25566#discussion_r939551982 ## airflow/providers/vertica/operators/vertica.py: ## @@ -48,5 +48,5 @@ def __init__( def execute(self, context: 'Context') -> None: self.log.info('Executing: %s', self.sql) -hook = VerticaHook(vertica_conn_id=self.vertica_conn_id) +hook = VerticaHook(vertica_conn_id=self.vertica_conn_id, log_sql=False) Review Comment: I think keeping it in the operator is better - it will be logged by our logger and will get to the "airflow.task" output.
[GitHub] [airflow] eladkal commented on a diff in pull request #25566: Optimize log when using VerticaOperator
eladkal commented on code in PR #25566: URL: https://github.com/apache/airflow/pull/25566#discussion_r939550473 ## airflow/providers/vertica/operators/vertica.py: ## @@ -48,5 +48,5 @@ def __init__( def execute(self, context: 'Context') -> None: self.log.info('Executing: %s', self.sql) -hook = VerticaHook(vertica_conn_id=self.vertica_conn_id) +hook = VerticaHook(vertica_conn_id=self.vertica_conn_id, log_sql=False) Review Comment: Wouldn't removing the `self.log.info('Executing: %s', self.sql)` line be easier?
[GitHub] [airflow] pierrejeambrun opened a new pull request, #25568: Add support for mapped task on logs API
pierrejeambrun opened a new pull request, #25568: URL: https://github.com/apache/airflow/pull/25568 This PR adds support for retrieving task logs for mapped tasks. You can now provide an additional query param, `map_index`, that defaults to -1 (for unmapped tasks). Added a few tests as well. @bbovenzi this is the first step towards bringing the logs tab to mapped tasks in the Grid UI.
[airflow] branch main updated (3fc895b9df -> f1d1914d73)
This is an automated email from the ASF dual-hosted git repository. potiuk pushed a change to branch main in repository https://gitbox.apache.org/repos/asf/airflow.git from 3fc895b9df Do not declare a volume for sshKeySecret if dag persistence is enabled (#22913) add f1d1914d73 Move "additional" build args from required to optional in Breeze (#25567) No new revisions were added by this update. Summary of changes: .../src/airflow_breeze/params/build_ci_params.py | 18 +- .../src/airflow_breeze/params/build_prod_params.py | 18 +- 2 files changed, 18 insertions(+), 18 deletions(-)
[GitHub] [airflow] potiuk merged pull request #25567: Move "additional" build args from required to optional in Breeze
potiuk merged PR #25567: URL: https://github.com/apache/airflow/pull/25567
[GitHub] [airflow] potiuk commented on pull request #25565: Separate instruction to install OS dependencies in images
potiuk commented on PR #25565: URL: https://github.com/apache/airflow/pull/25565#issuecomment-1207237719 I need to get https://github.com/apache/airflow/pull/25567 into main and rebase for the CI to succeed (we are overriding default parameters in the CI image in the main version of Breeze).
[GitHub] [airflow] potiuk opened a new pull request, #25567: Move "additional" build args from required to optional in Breeze
potiuk opened a new pull request, #25567: URL: https://github.com/apache/airflow/pull/25567 The "required" build args in Breeze are replaced with empty `--build-arg arg=`. This is problematic if those parameters have default values set. We move them from required to optional so that the build args are skipped entirely when the "build" command is run.
[GitHub] [airflow] potiuk commented on pull request #24825: Dockerfile centos
potiuk commented on PR #24825: URL: https://github.com/apache/airflow/pull/24825#issuecomment-1207236417 Hey @mik-laj @sfc-gh-mkmak - I looked a bit closer at this one, and I have a concern. It looks like the base container image you used is rather old. In Airflow, we strive to release our images based on the latest and greatest (i.e. with all known security issues fixed) released by the Python Software Foundation: https://hub.docker.com/_/python?tab=tags For example, the latest 3.7-3.10 Debian images were pushed 2 days ago (and our CI system will automatically refresh the base images we publish to use the latest version within ~1 day). The CentOS base Python image you used, `centos/python-38-centos7:20210726-fad62e9`, is rather old in comparison, and unfortunately: 1) it was last updated more than a year ago: https://hub.docker.com/r/centos/python-38-centos7 2) there are no Python 3.9/3.10 images released by the centos organisation at all. I am a little concerned about using those (and a little concerned that you are not concerned :)). Not only does it miss the latest security fixes, but the Python 3.8 version there is also quite old; there have likely been 6 or 8 patchlevel releases bringing bugfixes to the 3.8 line since. Do you have any thoughts/ideas/concerns about an up-to-date base for such a CentOS image? I even looked at the "official" CentOS image, and even that seems to be very outdated (6-12 months), which in the world of security/IT, and especially supply-chain attacks, is an eternity. I am afraid we would not be able to put our "trust" in such rarely released images, especially since our users are deeply concerned about security and we have had many requests and questions about up-to-dateness and the handling of known, published CVEs in the images. Can you think of a good/reliable/up-to-date source for a CentOS-based image we could use as a base? Aren't you concerned about this at Snowflake, BTW?
[GitHub] [airflow] boring-cyborg[bot] commented on pull request #25566: Optimize log when using VerticaOperator
boring-cyborg[bot] commented on PR #25566: URL: https://github.com/apache/airflow/pull/25566#issuecomment-1207233820 Congratulations on your first Pull Request and welcome to the Apache Airflow community! If you have any issues or are unsure about anything, please check our Contribution Guide (https://github.com/apache/airflow/blob/main/CONTRIBUTING.rst) Here are some useful points: - Pay attention to the quality of your code (flake8, mypy and type annotations). Our [pre-commits]( https://github.com/apache/airflow/blob/main/STATIC_CODE_CHECKS.rst#prerequisites-for-pre-commit-hooks) will help you with that. - In case of a new feature, add useful documentation (in docstrings or in the `docs/` directory). Adding a new operator? Check this short [guide](https://github.com/apache/airflow/blob/main/docs/apache-airflow/howto/custom-operator.rst) Consider adding an example DAG that shows how users should use it. - Consider using the [Breeze environment](https://github.com/apache/airflow/blob/main/BREEZE.rst) for testing locally; it's a heavy Docker setup, but it ships with a working Airflow and a lot of integrations. - Be patient and persistent. It might take some time to get a review or get the final approval from Committers. - Please follow the [ASF Code of Conduct](https://www.apache.org/foundation/policies/conduct) for all communication including (but not limited to) comments on Pull Requests, the Mailing list and Slack. - Be sure to read the [Airflow Coding style]( https://github.com/apache/airflow/blob/main/CONTRIBUTING.rst#coding-style-and-best-practices). Apache Airflow is a community-driven project and together we are making it better 🚀. In case of doubts contact the developers at: Mailing List: d...@airflow.apache.org Slack: https://s.apache.org/airflow-slack
[GitHub] [airflow] sudohainguyen opened a new pull request, #25566: Optimize log when using VerticaOperator
sudohainguyen opened a new pull request, #25566: URL: https://github.com/apache/airflow/pull/25566 This PR passes `False` to the `log_sql` argument when initializing the VerticaHook used to execute VerticaOperator. Related to issue #25456: SQL queries are duplicated in the task log, and this disables one of the two log lines.
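The mechanics of the fix can be illustrated with heavily simplified stand-ins for the operator and hook. Only the `log_sql` flag mirrors the real hook parameter; the class names and everything else below are a sketch, not Airflow's actual implementation:

```python
import logging

log = logging.getLogger("sketch.task")


class SketchHook:
    """Simplified hook; `log_sql` mirrors the hook parameter in the fix."""
    def __init__(self, log_sql: bool = True) -> None:
        self.log_sql = log_sql

    def run(self, sql: str) -> None:
        if self.log_sql:
            log.info("Executing: %s", sql)  # the duplicate line, suppressed by the fix


class SketchOperator:
    """Simplified operator: logs the SQL once, then tells the hook not to."""
    def __init__(self, sql: str) -> None:
        self.sql = sql

    def execute(self) -> None:
        log.info("Executing: %s", self.sql)  # the single, operator-side log line
        hook = SketchHook(log_sql=False)     # the fix: the hook stays silent
        hook.run(self.sql)
```

With `log_sql=False`, each executed statement appears exactly once in the task log instead of twice.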
[airflow] branch main updated: Do not declare a volume for sshKeySecret if dag persistence is enabled (#22913)
This is an automated email from the ASF dual-hosted git repository. potiuk pushed a commit to branch main in repository https://gitbox.apache.org/repos/asf/airflow.git The following commit(s) were added to refs/heads/main by this push: new 3fc895b9df Do not declare a volume for sshKeySecret if dag persistence is enabled (#22913) 3fc895b9df is described below commit 3fc895b9dfe8e7b77538bd80754fb17ccf92db49 Author: Ephraim Anierobi AuthorDate: Sat Aug 6 16:26:25 2022 +0100 Do not declare a volume for sshKeySecret if dag persistence is enabled (#22913) * Do not declare a volume for sshKeySecret if dag persistence is enabled In scheduler and triggerer components, git-sync-ssh-key volume was created even when persistence is enabled. This PR fixes that and added tests in other components to avoid regression --- .../templates/scheduler/scheduler-deployment.yaml | 4 +-- .../templates/triggerer/triggerer-deployment.yaml | 4 +-- tests/charts/test_git_sync_scheduler.py| 18 ++ tests/charts/test_git_sync_triggerer.py| 42 ++ tests/charts/test_git_sync_webserver.py| 18 ++ tests/charts/test_git_sync_worker.py | 19 ++ 6 files changed, 101 insertions(+), 4 deletions(-) diff --git a/chart/templates/scheduler/scheduler-deployment.yaml b/chart/templates/scheduler/scheduler-deployment.yaml index 8aaa24e203..3e9b94e7d8 100644 --- a/chart/templates/scheduler/scheduler-deployment.yaml +++ b/chart/templates/scheduler/scheduler-deployment.yaml @@ -253,10 +253,10 @@ spec: {{- else if .Values.dags.gitSync.enabled }} - name: dags emptyDir: {} -{{- end }} -{{- if and .Values.dags.gitSync.enabled .Values.dags.gitSync.sshKeySecret }} +{{- if .Values.dags.gitSync.sshKeySecret }} {{- include "git_sync_ssh_key_volume" . 
| indent 8 }} {{- end }} +{{- end}} {{- end }} {{- if .Values.scheduler.extraVolumes }} {{ toYaml .Values.scheduler.extraVolumes | indent 8 }} diff --git a/chart/templates/triggerer/triggerer-deployment.yaml b/chart/templates/triggerer/triggerer-deployment.yaml index 7842e76de8..b0464dbc13 100644 --- a/chart/templates/triggerer/triggerer-deployment.yaml +++ b/chart/templates/triggerer/triggerer-deployment.yaml @@ -199,10 +199,10 @@ spec: {{- else if .Values.dags.gitSync.enabled }} - name: dags emptyDir: {} -{{- end }} -{{- if and .Values.dags.gitSync.enabled .Values.dags.gitSync.sshKeySecret }} +{{- if .Values.dags.gitSync.sshKeySecret }} {{- include "git_sync_ssh_key_volume" . | indent 8 }} {{- end }} +{{- end }} {{- if .Values.triggerer.extraVolumes }} {{- toYaml .Values.triggerer.extraVolumes | nindent 8 }} {{- end }} diff --git a/tests/charts/test_git_sync_scheduler.py b/tests/charts/test_git_sync_scheduler.py index ba4ca833b7..a7dd5ad5c9 100644 --- a/tests/charts/test_git_sync_scheduler.py +++ b/tests/charts/test_git_sync_scheduler.py @@ -131,6 +131,24 @@ class GitSyncSchedulerTest(unittest.TestCase): "secret": {"secretName": "ssh-secret", "defaultMode": 288}, } in jmespath.search("spec.template.spec.volumes", docs[0]) +def test_validate_sshkeysecret_not_added_when_persistence_is_enabled(self): +docs = render_chart( +values={ +"dags": { +"gitSync": { +"enabled": True, +"containerName": "git-sync-test", +"sshKeySecret": "ssh-secret", +"knownHosts": None, +"branch": "test-branch", +}, +"persistence": {"enabled": True}, +} +}, +show_only=["templates/scheduler/scheduler-deployment.yaml"], +) +assert "git-sync-ssh-key" not in jmespath.search("spec.template.spec.volumes[].name", docs[0]) + def test_should_set_username_and_pass_env_variables(self): docs = render_chart( values={ diff --git a/tests/charts/test_git_sync_triggerer.py b/tests/charts/test_git_sync_triggerer.py new file mode 100644 index 00..23f89b350f --- /dev/null +++ 
b/tests/charts/test_git_sync_triggerer.py @@ -0,0 +1,42 @@ +# Licensed to the Apache Software Foundation (ASF) under one +# or more contributor license agreements. See the NOTICE file +# distributed with this work for additional information +# regarding copyright ownership. The ASF licenses this file +# to you under the Apache License, Version 2.0 (the +# "License"); you may not use this file except in compliance +# with the License. You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, +# software distributed under the License is
[GitHub] [airflow] potiuk merged pull request #22913: Do not declare a volume for sshKeySecret if dag persistence is enabled
potiuk merged PR #22913: URL: https://github.com/apache/airflow/pull/22913
[GitHub] [airflow] potiuk commented on pull request #25565: Separate instruction to install OS dependencies in images
potiuk commented on PR #25565: URL: https://github.com/apache/airflow/pull/25565#issuecomment-1207233034 This is a refactoring of the installation of `apt` dependencies in our Dockerfile, and it will be really useful for exploring the possibility of building a CentOS image, which has been proposed by the Snowflake team in https://github.com/apache/airflow/pull/24825 cc: @mik-laj @sfc-gh-mkmak - it does not yet add CentOS, but it should make it easy for us to choose Debian/CentOS when building the image. Once this one is merged, I will attempt to incorporate what you've done at Snowflake, and if I succeed I will make a proposal on the devlist to introduce experimental CentOS support (once I know whether we can easily have 1-1 parity for our PROD images).
[GitHub] [airflow] potiuk opened a new pull request, #25565: Separate instruction to install OS dependencies in images
potiuk opened a new pull request, #25565: URL: https://github.com/apache/airflow/pull/25565 This change will allow us to experiment with other base images (for example CentOS, which our users highly demand), and it also brings a few nice simplifications and improvements along the way:
* no more runtime parameters for the CI image (they only make sense for the PROD image)
* no more support for the Buster image (it reaches end-of-life in August)
* the Dockerfile now has fewer embedded default values (most of them moved to an inlined bash script)
* the configuration for yarn sources is removed (we no longer need yarn in our images)
* additional pure-dev dependencies in the CI image are passed through ADDITIONAL_DEV_DEPS
* the dev installation does not remove the installation cache, making the CI image slightly bigger but easier to use for development - new dependencies can be installed without running `apt-get update` first
* the latest patchlevels of various tools we use for CI were bumped
[GitHub] [airflow] potiuk opened a new pull request, #25564: Turn Airflow versions into a free-form field for Helm/Providers
potiuk opened a new pull request, #25564: URL: https://github.com/apache/airflow/pull/25564
[GitHub] [airflow] potiuk commented on pull request #25432: Avoid requirement that AWS Secret Manager JSON values be urlencoded.
potiuk commented on PR #25432: URL: https://github.com/apache/airflow/pull/25432#issuecomment-1207230493 Let me know @vincbeck if you have more comments; otherwise I'll merge it before releasing providers.
[GitHub] [airflow] pdebelak commented on pull request #25556: Cache the custom secrets backend so the same instance gets re-used
pdebelak commented on PR #25556: URL: https://github.com/apache/airflow/pull/25556#issuecomment-1207225740 👍 I appreciate you digging into this with me. Thank you!
[airflow] branch main updated: Cache the custom secrets backend so the same instance gets re-used (#25556)
This is an automated email from the ASF dual-hosted git repository.

potiuk pushed a commit to branch main in repository https://gitbox.apache.org/repos/asf/airflow.git

The following commit(s) were added to refs/heads/main by this push:
     new 5863c42962  Cache the custom secrets backend so the same instance gets re-used (#25556)

5863c42962 is described below

commit 5863c42962404607013422a40118d8b9f4603f0b
Author: Peter Debelak
AuthorDate: Sat Aug 6 09:21:22 2022 -0500

    Cache the custom secrets backend so the same instance gets re-used (#25556)

    * Cache the custom secrets backend so the same instance gets re-used

      Fixes #2

      This uses `functools.lru_cache` to re-use the same secrets backend
      instance between the `conf` global (when it loads configuration from
      secrets) and uses outside the `configuration` module, like variables
      and connections. Previously, each fetch of a configuration value from
      secrets would use its own secrets backend instance.

    * Also add a unit test to confirm that only one secrets backend
      instance gets created.
---
 airflow/configuration.py         |  9 -
 tests/core/test_configuration.py | 81
 2 files changed, 88 insertions(+), 2 deletions(-)

diff --git a/airflow/configuration.py b/airflow/configuration.py
index 139172d0f3..5e84de6ea4 100644
--- a/airflow/configuration.py
+++ b/airflow/configuration.py
@@ -1534,7 +1534,6 @@ def ensure_secrets_loaded() -> List[BaseSecretsBackend]:
 def get_custom_secret_backend() -> Optional[BaseSecretsBackend]:
     """Get Secret Backend if defined in airflow.cfg"""
     secrets_backend_cls = conf.getimport(section='secrets', key='backend')
-
     if secrets_backend_cls:
         try:
             backends: Any = conf.get(section='secrets', key='backend_kwargs', fallback='{}')
@@ -1542,10 +1541,16 @@ def get_custom_secret_backend() -> Optional[BaseSecretsBackend]:
         except JSONDecodeError:
             alternative_secrets_config_dict = {}
-        return secrets_backend_cls(**alternative_secrets_config_dict)
+        return _custom_secrets_backend(secrets_backend_cls, **alternative_secrets_config_dict)
     return None


+@functools.lru_cache(maxsize=2)
+def _custom_secrets_backend(secrets_backend_cls, **alternative_secrets_config_dict):
+    """Separate function to create secrets backend instance to allow caching"""
+    return secrets_backend_cls(**alternative_secrets_config_dict)
+
+
 def initialize_secrets_backends() -> List[BaseSecretsBackend]:
     """
     * import secrets backend classes

diff --git a/tests/core/test_configuration.py b/tests/core/test_configuration.py
index 15c8bbffc1..1759788654 100644
--- a/tests/core/test_configuration.py
+++ b/tests/core/test_configuration.py
@@ -33,6 +33,7 @@ from airflow import configuration
 from airflow.configuration import (
     AirflowConfigException,
     AirflowConfigParser,
+    _custom_secrets_backend,
     conf,
     expand_env_var,
     get_airflow_config,
@@ -296,6 +297,7 @@ sql_alchemy_conn = airflow
     def test_config_raise_exception_from_secret_backend_connection_error(self, mock_hvac):
         """Get Config Value from a Secret Backend"""
+        _custom_secrets_backend.cache_clear()
         mock_client = mock.MagicMock()
         mock_client.side_effect = AirflowConfigException
         mock_hvac.Client.return_value = mock_client
@@ -322,6 +324,7 @@ sql_alchemy_conn = airflow
             ),
         ):
             test_conf.get('test', 'sql_alchemy_conn')
+        _custom_secrets_backend.cache_clear()

     def test_getboolean(self):
         """Test AirflowConfigParser.getboolean"""
@@ -1297,3 +1300,81 @@ sql_alchemy_conn=sqlite://test
         conf.read_dict(dictionary=cfg_dict)
         os.environ.clear()
         assert conf.get('database', 'sql_alchemy_conn') == f'sqlite:///{HOME_DIR}/airflow/airflow.db'
+
+    @mock.patch("airflow.providers.hashicorp._internal_client.vault_client.hvac")
+    @conf_vars(
+        {
+            ("secrets", "backend"): "airflow.providers.hashicorp.secrets.vault.VaultBackend",
+            ("secrets", "backend_kwargs"): '{"url": "http://127.0.0.1:8200", "token": "token"}',
+        }
+    )
+    def test_config_from_secret_backend_caches_instance(self, mock_hvac):
+        """Get Config Value from a Secret Backend"""
+        _custom_secrets_backend.cache_clear()
+
+        test_config = '''[test]
+sql_alchemy_conn_secret = sql_alchemy_conn
+secret_key_secret = secret_key
+'''
+        test_config_default = '''[test]
+sql_alchemy_conn = airflow
+secret_key = airflow
+'''
+
+        mock_client = mock.MagicMock()
+        mock_hvac.Client.return_value = mock_client
+
+        def fake_read_secret(path, mount_point, version):
+            if path.endswith('sql_alchemy_conn'):
+                return {
+                    'request_id': '2d48a2ad-6bcb-e5b6-429d-da35fdf31f56',
+                    '
[GitHub] [airflow] potiuk merged pull request #25556: Cache the custom secrets backend so the same instance gets re-used
potiuk merged PR #25556: URL: https://github.com/apache/airflow/pull/25556 -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org For queries about this service, please contact Infrastructure at: us...@infra.apache.org
[GitHub] [airflow] potiuk commented on pull request #25556: Cache the custom secrets backend so the same instance gets re-used
potiuk commented on PR #25556: URL: https://github.com/apache/airflow/pull/25556#issuecomment-1207223254

Nice. Thanks for being persistent!
[GitHub] [airflow] potiuk commented on pull request #25426: Update azure-storage-blob version
potiuk commented on PR #25426: URL: https://github.com/apache/airflow/pull/25426#issuecomment-1207198979

> Maybe if the assertion was under `TYPE_CHECKING` scope the unit test wouldn't fail? Just an idea since the assert statement is to hopefully make mypy happy.

I think not really - the idea is to make it consistent, I think :). It's going to change when the fixed version of the library is out, and someone might otherwise rely on it being bytes.
[GitHub] [airflow] potiuk commented on pull request #25556: Cache the custom secrets backend so the same instance gets re-used
potiuk commented on PR #25556: URL: https://github.com/apache/airflow/pull/25556#issuecomment-1207198649

OK. Let's see. You convinced me - but you have to fix the failing tests (there were some), and it would be great to add a unit test covering it, showing that there is one instance after the change (and likely failing before it).
[airflow] branch main updated: System test for EMR Serverless (#25559)
This is an automated email from the ASF dual-hosted git repository.

potiuk pushed a commit to branch main in repository https://gitbox.apache.org/repos/asf/airflow.git

The following commit(s) were added to refs/heads/main by this push:
     new 33fbe75dd5  System test for EMR Serverless (#25559)

33fbe75dd5 is described below

commit 33fbe75dd5100539c697d705552b088e568d52e4
Author: syedahsn <103602455+syeda...@users.noreply.github.com>
AuthorDate: Sat Aug 6 05:28:49 2022 -0600

    System test for EMR Serverless (#25559)

    * System test for EMR Serverless, following the template in #24643 (AIP-47)
    * Remove example_emr_serverless.py from example_dags
---
 .../operators/emr_serverless.rst          | 10 +--
 .../amazon/aws}/example_emr_serverless.py | 80 --
 2 files changed, 65 insertions(+), 25 deletions(-)

diff --git a/docs/apache-airflow-providers-amazon/operators/emr_serverless.rst b/docs/apache-airflow-providers-amazon/operators/emr_serverless.rst
index 2496af2c40..15b0c6de0f 100644
--- a/docs/apache-airflow-providers-amazon/operators/emr_serverless.rst
+++ b/docs/apache-airflow-providers-amazon/operators/emr_serverless.rst
@@ -41,7 +41,7 @@ Create an EMR Serverless Application

 You can use :class:`~airflow.providers.amazon.aws.operators.emr.EmrServerlessCreateApplicationOperator` to create a new EMR Serverless Application.

-.. exampleinclude:: /../../airflow/providers/amazon/aws/example_dags/example_emr_serverless.py
+.. exampleinclude:: /../../tests/system/providers/amazon/aws/example_emr_serverless.py
     :language: python
     :dedent: 4
     :start-after: [START howto_operator_emr_serverless_create_application]
@@ -55,7 +55,7 @@ Start an EMR Serverless Job

 You can use :class:`~airflow.providers.amazon.aws.operators.emr.EmrServerlessStartJobOperator` to start an EMR Serverless Job.

-.. exampleinclude:: /../../airflow/providers/amazon/aws/example_dags/example_emr_serverless.py
+.. exampleinclude:: /../../tests/system/providers/amazon/aws/example_emr_serverless.py
     :language: python
     :dedent: 4
     :start-after: [START howto_operator_emr_serverless_start_job]
@@ -69,7 +69,7 @@ Delete an EMR Serverless Application

 You can use :class:`~airflow.providers.amazon.aws.operators.emr.EmrServerlessDeleteApplicationOperator` to delete an EMR Serverless Application.

-.. exampleinclude:: /../../airflow/providers/amazon/aws/example_dags/example_emr_serverless.py
+.. exampleinclude:: /../../tests/system/providers/amazon/aws/example_emr_serverless.py
     :language: python
     :dedent: 4
     :start-after: [START howto_operator_emr_serverless_delete_application]
@@ -86,7 +86,7 @@ Wait on an EMR Serverless Job state

 To monitor the state of an EMR Serverless Job you can use :class:`~airflow.providers.amazon.aws.sensors.emr.EmrServerlessJobSensor`.

-.. exampleinclude:: /../../airflow/providers/amazon/aws/example_dags/example_emr_serverless.py
+.. exampleinclude:: /../../tests/system/providers/amazon/aws/example_emr_serverless.py
     :language: python
     :dedent: 4
     :start-after: [START howto_sensor_emr_serverless_job]
@@ -100,7 +100,7 @@ Wait on an EMR Serverless Application state

 To monitor the state of an EMR Serverless Application you can use :class:`~airflow.providers.amazon.aws.sensors.emr.EmrServerlessApplicationSensor`.

-.. exampleinclude:: /../../airflow/providers/amazon/aws/example_dags/example_emr_serverless.py
+.. exampleinclude:: /../../tests/system/providers/amazon/aws/example_emr_serverless.py
     :language: python
     :dedent: 4
     :start-after: [START howto_sensor_emr_serverless_application]

diff --git a/airflow/providers/amazon/aws/example_dags/example_emr_serverless.py b/tests/system/providers/amazon/aws/example_emr_serverless.py
similarity index 54%
rename from airflow/providers/amazon/aws/example_dags/example_emr_serverless.py
rename to tests/system/providers/amazon/aws/example_emr_serverless.py
index b8c0618014..0d931a752c 100644
--- a/airflow/providers/amazon/aws/example_dags/example_emr_serverless.py
+++ b/tests/system/providers/amazon/aws/example_emr_serverless.py
@@ -15,40 +15,55 @@
 # specific language governing permissions and limitations
 # under the License.
+
 from datetime import datetime
-from os import getenv

-from airflow import DAG
 from airflow.models.baseoperator import chain
+from airflow.models.dag import DAG
 from airflow.providers.amazon.aws.operators.emr import (
     EmrServerlessCreateApplicationOperator,
     EmrServerlessDeleteApplicationOperator,
     EmrServerlessStartJobOperator,
 )
+from airflow.providers.amazon.aws.operators.s3 import S3CreateBucketOperator, S3DeleteBucketOperator
 from airflow.providers.amazon.aws.sensors.emr import EmrServerlessApplicationSensor, EmrServerlessJobSensor
+from airflow.utils.trigger_rule import TriggerRule
+from tests.system.providers.amazon.aws.utils import ENV_ID_KEY, SystemTestContextBuilder
+
+DAG_ID = 'example_emr_serverless
[GitHub] [airflow] potiuk merged pull request #25559: System test for EMR Serverless
potiuk merged PR #25559: URL: https://github.com/apache/airflow/pull/25559
[GitHub] [airflow] potiuk commented on issue #25560: Failed to manage per-DAG permission in clearing DAG run
potiuk commented on issue #25560: URL: https://github.com/apache/airflow/issues/25560#issuecomment-1207198192

Feel free to take a look and see if you can provide a PR.
[GitHub] [airflow] josh-fell commented on pull request #25426: Update azure-storage-blob version
josh-fell commented on PR #25426: URL: https://github.com/apache/airflow/pull/25426#issuecomment-1207196743

> So it solved the static issue but created problems for the unit tests.
>
> I think the best option here is just to wait for upstream to release 12.14.0 should be in a month+-

Maybe if the assertion was under `TYPE_CHECKING` scope the unit test wouldn't fail? Just an idea since the assert statement is to hopefully make mypy happy.
[GitHub] [airflow] potiuk merged pull request #25563: feat: breeze - support compose v2
potiuk merged PR #25563: URL: https://github.com/apache/airflow/pull/25563
[airflow] branch main updated: feat: breeze - support compose v2 (#25563)
This is an automated email from the ASF dual-hosted git repository.

potiuk pushed a commit to branch main in repository https://gitbox.apache.org/repos/asf/airflow.git

The following commit(s) were added to refs/heads/main by this push:
     new 439d9ba1bc  feat: breeze - support compose v2 (#25563)

439d9ba1bc is described below

commit 439d9ba1bc867c69cc5d10a86fec10ee1461d3d0
Author: raphaelauv
AuthorDate: Sat Aug 6 13:09:06 2022 +0200

    feat: breeze - support compose v2 (#25563)
---
 .../airflow_breeze/utils/docker_command_utils.py | 25 --
 1 file changed, 18 insertions(+), 7 deletions(-)

diff --git a/dev/breeze/src/airflow_breeze/utils/docker_command_utils.py b/dev/breeze/src/airflow_breeze/utils/docker_command_utils.py
index 164caf546c..dca6139291 100644
--- a/dev/breeze/src/airflow_breeze/utils/docker_command_utils.py
+++ b/dev/breeze/src/airflow_breeze/utils/docker_command_utils.py
@@ -271,13 +271,24 @@ def check_docker_compose_version(verbose: bool):
     """
     version_pattern = re.compile(r'(\d+)\.(\d+)\.(\d+)')
     docker_compose_version_command = ["docker-compose", "--version"]
-    docker_compose_version_result = run_command(
-        docker_compose_version_command,
-        verbose=verbose,
-        no_output_dump_on_exception=True,
-        capture_output=True,
-        text=True,
-    )
+    try:
+        docker_compose_version_result = run_command(
+            docker_compose_version_command,
+            verbose=verbose,
+            no_output_dump_on_exception=True,
+            capture_output=True,
+            text=True,
+        )
+    except FileNotFoundError:
+        docker_compose_version_command = ["docker", "compose", "version"]
+        docker_compose_version_result = run_command(
+            docker_compose_version_command,
+            verbose=verbose,
+            no_output_dump_on_exception=True,
+            capture_output=True,
+            text=True,
+        )
     if docker_compose_version_result.returncode == 0:
         docker_compose_version = docker_compose_version_result.stdout
         version_extracted = version_pattern.search(docker_compose_version)
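The fallback above can be sketched in plain Python: try the v1 `docker-compose` binary first, and on `FileNotFoundError` retry with the v2 `docker compose` plugin; the same `(\d+)\.(\d+)\.(\d+)` pattern extracts the version from either tool's banner. The `runner` parameter is an assumption added here so the sketch can be exercised without Docker installed:

```python
import re
import subprocess

VERSION_PATTERN = re.compile(r'(\d+)\.(\d+)\.(\d+)')


def compose_version_output(runner=subprocess.run):
    """Return compose version output, preferring v1 and falling back to v2."""
    try:
        # docker-compose v1 standalone binary
        result = runner(["docker-compose", "--version"], capture_output=True, text=True)
    except FileNotFoundError:
        # docker compose v2 CLI plugin
        result = runner(["docker", "compose", "version"], capture_output=True, text=True)
    return result.stdout


def extract_version(output):
    """Pull 'MAJOR.MINOR.PATCH' out of either tool's banner, or None."""
    match = VERSION_PATTERN.search(output)
    return match.group(0) if match else None


# The regex tolerates both output formats, including the leading "v" in v2:
assert extract_version("docker-compose version 1.29.2, build 5becea4c") == "1.29.2"
assert extract_version("Docker Compose version v2.6.1") == "2.6.1"
```

Searching (rather than matching) the pattern is what makes one code path work for both banners.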
[GitHub] [airflow] rrcrrcrrc commented on pull request #25484: HostAliases support for scheduler and webserver
rrcrrcrrc commented on PR #25484: URL: https://github.com/apache/airflow/pull/25484#issuecomment-1207193948

I've already made the edits you commented on and added hostAliases to the triggerer component too. This makes hostAliases available for the Scheduler, Worker, Triggerer and Webserver.
[GitHub] [airflow-client-go] DrFaust92 commented on pull request #22: Generated api client from the latest airflow main branch.
DrFaust92 commented on PR #22: URL: https://github.com/apache/airflow-client-go/pull/22#issuecomment-1207191466

https://github.com/apache/airflow-client-go/pull/23 closes this, as it has a slightly newer spec.
[airflow] branch main updated: Update API & Python Client versions (#25562)
This is an automated email from the ASF dual-hosted git repository.

msumit pushed a commit to branch main in repository https://gitbox.apache.org/repos/asf/airflow.git

The following commit(s) were added to refs/heads/main by this push:
     new 3bdc4577c9  Update API & Python Client versions (#25562)

3bdc4577c9 is described below

commit 3bdc4577c97342f163070c331fd928f28f9e23a6
Author: Sumit Maheshwari
AuthorDate: Sat Aug 6 15:39:47 2022 +0530

    Update API & Python Client versions (#25562)
---
 airflow/api_connexion/openapi/v1.yaml | 2 +-
 clients/gen/python.sh                 | 2 +-
 2 files changed, 2 insertions(+), 2 deletions(-)

diff --git a/airflow/api_connexion/openapi/v1.yaml b/airflow/api_connexion/openapi/v1.yaml
index 4b9f349a60..1e8d6dd73b 100644
--- a/airflow/api_connexion/openapi/v1.yaml
+++ b/airflow/api_connexion/openapi/v1.yaml
@@ -229,7 +229,7 @@ info:
     This means that the server encountered an unexpected condition that prevented it from fulfilling the request.

-  version: '1.0.0'
+  version: '2.3.0'
   license:
     name: Apache 2.0
     url: http://www.apache.org/licenses/LICENSE-2.0.html

diff --git a/clients/gen/python.sh b/clients/gen/python.sh
index 4a4b2d5d6b..04b95abfc6 100755
--- a/clients/gen/python.sh
+++ b/clients/gen/python.sh
@@ -26,7 +26,7 @@ readonly CLEANUP_DIRS
 # shellcheck source=./clients/gen/common.sh
 source "${CLIENTS_GEN_DIR}/common.sh"

-VERSION=2.2.0
+VERSION=2.3.0
 readonly VERSION

 python_config=(
[GitHub] [airflow] msumit merged pull request #25562: Update API & Python Client versions
msumit merged PR #25562: URL: https://github.com/apache/airflow/pull/25562
[airflow] branch constraints-2-3 updated: Updating constraints. Build id:2808098829
This is an automated email from the ASF dual-hosted git repository.

github-bot pushed a commit to branch constraints-2-3 in repository https://gitbox.apache.org/repos/asf/airflow.git

The following commit(s) were added to refs/heads/constraints-2-3 by this push:
     new 5b971f8989  Updating constraints. Build id:2808098829

5b971f8989 is described below

commit 5b971f89894d8b73b4b85f24e7d5cdc40dad8dbb
Author: Automated GitHub Actions commit
AuthorDate: Sat Aug 6 09:37:46 2022 +0000

    Updating constraints. Build id:2808098829

    This update in constraints is automatically committed by the CI 'constraints-push' step based on
    HEAD of 'refs/heads/v2-3-test' in 'apache/airflow' with commit sha
    99a464e23989e39c35001bfbe8838c04de284af0.

    All tests passed in this build so we determined we can push the updated constraints.

    See https://github.com/apache/airflow/blob/main/README.md#installing-from-pypi for details.
---
 constraints-3.10.txt                  | 16
 constraints-3.7.txt                   | 12 ++--
 constraints-3.8.txt                   | 16
 constraints-3.9.txt                   | 16
 constraints-no-providers-3.10.txt     | 14 +++---
 constraints-no-providers-3.7.txt      | 10 +-
 constraints-no-providers-3.8.txt      | 14 +++---
 constraints-no-providers-3.9.txt      | 14 +++---
 constraints-source-providers-3.10.txt | 16
 constraints-source-providers-3.7.txt  | 12 ++--
 constraints-source-providers-3.8.txt  | 16
 constraints-source-providers-3.9.txt  | 16
 12 files changed, 86 insertions(+), 86 deletions(-)

diff --git a/constraints-3.10.txt b/constraints-3.10.txt
index 9b46d3b861..67a38bdaf1 100644
--- a/constraints-3.10.txt
+++ b/constraints-3.10.txt
@@ -1,5 +1,5 @@
 #
-# This constraints file was automatically generated on 2022-08-05T14:15:03Z
+# This constraints file was automatically generated on 2022-08-06T09:37:24Z
 # via "eager-upgrade" mechanism of PIP. For the "v2-3-test" branch of Airflow.
 # This variant of constraints install uses the HEAD of the branch version for 'apache-airflow' but installs
 # the providers from PIP-released packages at the moment of the constraint generation.
@@ -13,7 +13,7 @@ APScheduler==3.6.3
 Authlib==0.15.5
 Babel==2.10.3
 Deprecated==1.2.13
-Flask-AppBuilder==4.1.2
+Flask-AppBuilder==4.1.3
 Flask-Babel==2.0.0
 Flask-Bcrypt==1.0.1
 Flask-Caching==2.0.1
@@ -210,7 +210,7 @@ croniter==1.3.5
 cryptography==36.0.2
 curlify==2.2.1
 cx-Oracle==8.3.0
-dask==2022.7.1
+dask==2022.8.0
 databricks-sql-connector==2.0.2
 datadog==0.44.0
 db-dtypes==1.0.2
@@ -218,7 +218,7 @@ decorator==5.1.1
 defusedxml==0.7.1
 dill==0.3.1.1
 distlib==0.3.5
-distributed==2022.7.1
+distributed==2022.8.0
 dnspython==2.2.1
 docker==5.0.3
 docopt==0.6.2
@@ -255,7 +255,7 @@ google-api-core==2.8.2
 google-api-python-client==1.12.11
 google-auth-httplib2==0.1.0
 google-auth-oauthlib==0.5.2
-google-auth==2.9.1
+google-auth==2.10.0
 google-cloud-aiplatform==1.16.1
 google-cloud-appengine-logging==1.1.3
 google-cloud-audit-log==0.2.3
@@ -310,7 +310,7 @@ hmsclient==0.1.1
 httpcore==0.15.0
 httplib2==0.20.4
 httpx==0.23.0
-humanize==4.2.3
+humanize==4.3.0
 hvac==0.11.2
 identify==2.5.3
 idna==3.3
@@ -506,7 +506,7 @@ scrapbook==0.5.0
 semver==2.13.0
 sendgrid==6.9.7
 sentinels==1.0.0
-sentry-sdk==1.9.1
+sentry-sdk==1.9.2
 setproctitle==1.3.1
 simple-salesforce==1.12.1
 six==1.16.0
@@ -607,7 +607,7 @@ websocket-client==1.3.3
 wrapt==1.14.1
 xmltodict==0.13.0
 yamllint==1.27.1
-yandexcloud==0.176.0
+yandexcloud==0.177.0
 yarl==1.8.1
 zeep==4.1.0
 zenpy==2.0.24

diff --git a/constraints-3.7.txt b/constraints-3.7.txt
index b6f93bbc9c..b449bbab03 100644
--- a/constraints-3.7.txt
+++ b/constraints-3.7.txt
@@ -1,5 +1,5 @@
 #
-# This constraints file was automatically generated on 2022-08-05T14:15:36Z
+# This constraints file was automatically generated on 2022-08-06T09:37:44Z
 # via "eager-upgrade" mechanism of PIP. For the "v2-3-test" branch of Airflow.
 # This variant of constraints install uses the HEAD of the branch version for 'apache-airflow' but installs
 # the providers from PIP-released packages at the moment of the constraint generation.
@@ -13,7 +13,7 @@ APScheduler==3.6.3
 Authlib==0.15.5
 Babel==2.10.3
 Deprecated==1.2.13
-Flask-AppBuilder==4.1.2
+Flask-AppBuilder==4.1.3
 Flask-Babel==2.0.0
 Flask-Bcrypt==1.0.1
 Flask-Caching==2.0.1
@@ -254,7 +254,7 @@ google-api-core==2.8.2
 google-api-python-client==1.12.11
 google-auth-httplib2==0.1.0
 google-auth-oauthlib==0.5.2
-google-auth==2.9.1
+google-auth==2.10.0
 google-cloud-aiplatform==1.16.1
 google-cloud-appengine-logging==1.1.3
 google-cloud-audit-log==0.2.3
@@ -309,7 +309,7 @@ hmsclient==0.1.1
 httpcore==0.15.0
 httplib2==0.20.4
 httpx==0.23.0
-humanize==4.2.3
+humanize==4.3.0
 hvac==0.11.2
 identify==2.5.3
 idna==3.3
@@ -506,7 +506,7 @@ scrapbook==0.5.0
 semver
[GitHub] [airflow] msumit opened a new pull request, #25562: Update API & Python Client versions
msumit opened a new pull request, #25562: URL: https://github.com/apache/airflow/pull/25562

---
**^ Add meaningful description above**

Read the **[Pull Request Guidelines](https://github.com/apache/airflow/blob/main/CONTRIBUTING.rst#pull-request-guidelines)** for more information. In case of fundamental code changes, an Airflow Improvement Proposal ([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals)) is needed. In case of a new dependency, check compliance with the [ASF 3rd Party License Policy](https://www.apache.org/legal/resolved.html#category-x). In case of backwards incompatible changes please leave a note in a newsfragment file, named `{pr_number}.significant.rst` or `{issue_number}.significant.rst`, in [newsfragments](https://github.com/apache/airflow/tree/main/newsfragments).
[GitHub] [airflow] eladkal commented on pull request #25426: Update azure-storage-blob version
eladkal commented on PR #25426: URL: https://github.com/apache/airflow/pull/25426#issuecomment-1207175736

So it solved the static-check issue but created problems for the unit tests. I think the best option here is just to wait for upstream to release 12.14.0, which should be out in a month or so.
[GitHub] [airflow] eladkal commented on issue #21225: Tasks stuck in queued state
eladkal commented on issue #21225: URL: https://github.com/apache/airflow/issues/21225#issuecomment-1207170050

Has anyone experienced this issue after 2.3.1? I would imagine that https://github.com/apache/airflow/pull/23690 should make things better... even if it doesn't fix the root cause, it makes sure that tasks don't get stuck in the queued state.
[GitHub] [airflow] eladkal commented on issue #19556: Checking roles and users list in Web UI, give me an error
eladkal commented on issue #19556: URL: https://github.com/apache/airflow/issues/19556#issuecomment-1207166978

Does this still happen on the main version? We upgraded Flask versions. @nediGit can you please check?
[GitHub] [airflow] potiuk closed issue #25557: SQL error when pulling a list of Task Instances sorted by Logical Date (execution_date)
potiuk closed issue #25557: SQL error when pulling a list of Task Instances sorted by Logical Date (execution_date) URL: https://github.com/apache/airflow/issues/25557
[GitHub] [airflow] potiuk commented on issue #25557: SQL error when pulling a list of Task Instances sorted by Logical Date (execution_date)
potiuk commented on issue #25557: URL: https://github.com/apache/airflow/issues/25557#issuecomment-1207166563

Ah. Right. This is indeed not an Airflow API; this must be something you have in your own installation.
[airflow] branch v2-3-test updated: Add common.sql to list of expected providers
This is an automated email from the ASF dual-hosted git repository.

potiuk pushed a commit to branch v2-3-test in repository https://gitbox.apache.org/repos/asf/airflow.git

The following commit(s) were added to refs/heads/v2-3-test by this push:
     new 99a464e239  Add common.sql to list of expected providers

99a464e239 is described below

commit 99a464e23989e39c35001bfbe8838c04de284af0
Author: Jarek Potiuk
AuthorDate: Sat Aug 6 09:23:23 2022 +0200

    Add common.sql to list of expected providers
---
 scripts/ci/installed_providers.txt | 1 +
 1 file changed, 1 insertion(+)

diff --git a/scripts/ci/installed_providers.txt b/scripts/ci/installed_providers.txt
index c6b02bfae1..013fb587e3 100644
--- a/scripts/ci/installed_providers.txt
+++ b/scripts/ci/installed_providers.txt
@@ -1,6 +1,7 @@
 amazon
 celery
 cncf.kubernetes
+common.sql
 docker
 elasticsearch
 ftp
[GitHub] [airflow] eladkal commented on pull request #25183: Add auth_type to LivyHook
eladkal commented on PR #25183: URL: https://github.com/apache/airflow/pull/25183#issuecomment-1207165517

@bdsoha are you still working on this PR? It's missing tests to cover the new functionality.
[airflow] branch main updated: Adding support for owner links in the Dags view UI (#25280)
This is an automated email from the ASF dual-hosted git repository.

eladkal pushed a commit to branch main in repository https://gitbox.apache.org/repos/asf/airflow.git

The following commit(s) were added to refs/heads/main by this push:
     new 6630357fd1  Adding support for owner links in the Dags view UI (#25280)

6630357fd1 is described below

commit 6630357fd1b798945dc538552dd03e5031870fec
Author: Alex Kruchkov <36231027+alexk...@users.noreply.github.com>
AuthorDate: Sat Aug 6 10:10:14 2022 +0300

    Adding support for owner links in the Dags view UI (#25280)

    * Adding support for owner links in the Dags view UI

    Co-authored-by: Tzu-ping Chung
---
 .../0116_2_4_0_add_dag_owner_attributes_table.py |  52
 airflow/models/__init__.py                       |   3 +-
 airflow/models/abstractoperator.py               |   1 -
 airflow/models/dag.py                            |  69 -
 airflow/serialization/schema.json                |   1 +
 airflow/www/templates/airflow/dag_details.html   |  10 +++
 airflow/www/templates/airflow/dags.html          |   4 ++
 airflow/www/views.py                             |  21 ++-
 docs/apache-airflow/howto/add-owner-links.rst    |  49 +++
 docs/apache-airflow/howto/index.rst              |   1 +
 docs/apache-airflow/img/howto-owner-links.gif    | Bin 0 -> 829619 bytes
 docs/apache-airflow/migrations-ref.rst           |   4 +-
 docs/spelling_wordlist.txt                       |   1 +
 tests/models/test_dag.py                         |  41 ++--
 tests/test_utils/db.py                           |   2 +
 tests/utils/test_db_cleanup.py                   |   1 +
 tests/www/views/test_views_base.py               |   2 +-
 17 files changed, 250 insertions(+), 12 deletions(-)

diff --git a/airflow/migrations/versions/0116_2_4_0_add_dag_owner_attributes_table.py b/airflow/migrations/versions/0116_2_4_0_add_dag_owner_attributes_table.py
new file mode 100644
index 00..85020350f2
--- /dev/null
+++ b/airflow/migrations/versions/0116_2_4_0_add_dag_owner_attributes_table.py
@@ -0,0 +1,52 @@
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+"""add dag_owner_attributes table
+
+Revision ID: 1486deb605b4
+Revises: f4ff391becb5
+Create Date: 2022-08-04 16:59:45.406589
+
+"""
+
+import sqlalchemy as sa
+from alembic import op
+
+# revision identifiers, used by Alembic.
+revision = '1486deb605b4'
+down_revision = 'f4ff391becb5'
+branch_labels = None
+depends_on = None
+airflow_version = '2.4.0'
+
+
+def upgrade():
+    """Apply Add ``DagOwnerAttributes`` table"""
+    op.create_table(
+        'dag_owner_attributes',
+        sa.Column('dag_id', sa.String(length=250), nullable=False),
+        sa.Column('owner', sa.String(length=500), nullable=False),
+        sa.Column('link', sa.String(length=500), nullable=False),
+        sa.ForeignKeyConstraint(['dag_id'], ['dag.dag_id'], ondelete='CASCADE'),
+        sa.PrimaryKeyConstraint('dag_id', 'owner'),
+    )
+
+
+def downgrade():
+    """Unapply Add Dataset model"""
+    op.drop_table('dag_owner_attributes')

diff --git a/airflow/models/__init__.py b/airflow/models/__init__.py
index 2a12cbba35..84b3399334 100644
--- a/airflow/models/__init__.py
+++ b/airflow/models/__init__.py
@@ -21,7 +21,7 @@ from typing import Union
 from airflow.models.base import ID_LEN, Base
 from airflow.models.baseoperator import BaseOperator, BaseOperatorLink
 from airflow.models.connection import Connection
-from airflow.models.dag import DAG, DagModel, DagTag
+from airflow.models.dag import DAG, DagModel, DagOwnerAttributes, DagTag
 from airflow.models.dagbag import DagBag
 from airflow.models.dagpickle import DagPickle
 from airflow.models.dagrun import DagRun
@@ -58,6 +58,7 @@ __all__ = [
     "DagPickle",
     "DagRun",
     "DagTag",
+    "DagOwnerAttributes",
     "Dataset",
     "DbCallbackRequest",
     "ImportError",

diff --git a/airflow/models/abstractoperator.py b/airflow/models/abstractoperator.py
index 50e234def7..bae9322ef7 100644
--- a/airflow/models/abstractoperator.py
+++ b/airflow/models/abstractoperator.py
@@ -15,7 +15,6 @@
 # KIND, either express or implied.  See the License for the
 # s
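The migration above maps each (dag_id, owner) pair to a link, with a cascade delete back to the parent `dag` row. Its shape can be sketched with stdlib sqlite3 for illustration only; the real migration runs through Alembic, and the minimal `dag` table here is an assumption standing in for Airflow's full table:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")  # SQLite needs this for cascades

# Minimal parent table standing in for Airflow's real `dag` table.
conn.execute("CREATE TABLE dag (dag_id VARCHAR(250) PRIMARY KEY)")

# Schema mirroring the migration: composite primary key on (dag_id, owner),
# cascade delete when the parent DAG row is removed.
conn.execute(
    """
    CREATE TABLE dag_owner_attributes (
        dag_id VARCHAR(250) NOT NULL,
        owner VARCHAR(500) NOT NULL,
        link VARCHAR(500) NOT NULL,
        PRIMARY KEY (dag_id, owner),
        FOREIGN KEY (dag_id) REFERENCES dag (dag_id) ON DELETE CASCADE
    )
    """
)

conn.execute("INSERT INTO dag VALUES ('example_dag')")
conn.execute(
    "INSERT INTO dag_owner_attributes "
    "VALUES ('example_dag', 'airflow', 'https://airflow.apache.org')"
)

# Deleting the DAG row cascades to its owner links.
conn.execute("DELETE FROM dag WHERE dag_id = 'example_dag'")
remaining = conn.execute("SELECT COUNT(*) FROM dag_owner_attributes").fetchone()[0]
assert remaining == 0
```

The composite primary key is what lets one DAG carry a distinct link per owner.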
[GitHub] [airflow] eladkal merged pull request #25280: Adding support for owner links in the Dags view UI
eladkal merged PR #25280: URL: https://github.com/apache/airflow/pull/25280
[GitHub] [airflow] eladkal closed issue #24728: DAGs View: make Owner column hyperlink for a predefined URL
eladkal closed issue #24728: DAGs View: make Owner column hyperlink for a predefined URL URL: https://github.com/apache/airflow/issues/24728
[GitHub] [airflow] eladkal commented on pull request #25280: Adding support for owner links in the Dags view UI
eladkal commented on PR #25280: URL: https://github.com/apache/airflow/pull/25280#issuecomment-1207164143

I'm not sure why the Up-to-date Checker is failing; the PR is rebased. It seems like an issue with the checker code. Merging.