[GitHub] [airflow] harishkrao commented on a diff in pull request #28950: Sensor for Databricks partition and table changes

2023-02-20 Thread via GitHub


harishkrao commented on code in PR #28950:
URL: https://github.com/apache/airflow/pull/28950#discussion_r603884


##
airflow/providers/databricks/sensors/databricks_sql.py:
##
@@ -0,0 +1,136 @@
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+#
+"""This module contains Databricks sensors."""
+
+from __future__ import annotations
+
+from typing import Any, Callable, Sequence
+
+from airflow.exceptions import AirflowException
+from airflow.providers.common.sql.hooks.sql import fetch_all_handler
+from airflow.providers.databricks.hooks.databricks_sql import DatabricksSqlHook
+from airflow.sensors.base import BaseSensorOperator
+from airflow.utils.context import Context
+
+
+class DatabricksSqlSensor(BaseSensorOperator):
+"""Generic SQL sensor for Databricks
+
+:param databricks_conn_id: connection id from Airflow to databricks,
+defaults to DatabricksSqlHook.default_conn_name
+:param http_path: Optional string specifying HTTP path of
+Databricks SQL Endpoint or cluster.If not specified, it should be 
either specified
+in the Databricks connection's extra parameters, or 
``sql_endpoint_name`` must be specified.
+:param sql_endpoint_name: Optional name of Databricks SQL Endpoint.
+If not specified, ``http_path`` must be provided as described above.
+:param session_configuration: An optional dictionary of Spark session 
parameters.
+Defaults to None. If not specified, it could be specified in the
+Databricks connection's extra parameters.
+:param http_headers: An optional list of (k, v) pairs that will be set as 
HTTP headers on every request.
+:param client_parameters: Additional parameters internal to Databricks SQL 
Connector parameters.
+:param sql: SQL query to be executed.
+:param catalog: Database name/catalog name, defaults to "hive_metastore"
+:param schema: Schema name, defaults to "default"
+:param handler: The result handler which is called with the result of each 
statement.
+"""
+
+template_fields: Sequence[str] = (
+"databricks_conn_id",
+"sql",
+"catalog",
+"schema",
+"http_headers",
+)
+
+template_ext: Sequence[str] = (".sql",)
+template_fields_renderers = {"sql": "sql"}
+
+def __init__(
+self,
+*,
+databricks_conn_id: str = DatabricksSqlHook.default_conn_name,
+http_path: str | None = None,
+sql_endpoint_name: str | None = None,
+session_configuration=None,
+http_headers: list[tuple[str, str]] | None = None,
+catalog: str = "hive_metastore",
+schema: str = "default",
+sql: str | None = None,
+handler: Callable[[Any], Any] = fetch_all_handler,
+client_parameters: dict[str, Any] | None = None,
+**kwargs,
+) -> None:
+"""_summary_
+
+:param databricks_conn_id: _description_, defaults to 
DatabricksSqlHook.default_conn_name
+:param http_path: _description_, defaults to None
+:param sql_endpoint_name: _description_, defaults to None
+:param session_configuration: _description_, defaults to None
+:param http_headers: _description_, defaults to None
+:param catalog: _description_, defaults to "hive_metastore"
+:param schema: _description_, defaults to "default"
+:param sql: _description_, defaults to None
+:param handler: _description_, defaults to fetch_all_handler
+:param client_parameters: _description_, defaults to None
+"""
+super().__init__(**kwargs)
+self.databricks_conn_id = databricks_conn_id
+self._http_path = http_path
+self._sql_endpoint_name = sql_endpoint_name
+self.session_config = session_configuration
+self.http_headers = http_headers
+self.catalog = catalog
+self.schema = schema
+self.sql = sql
+self.caller = "DatabricksSqlSensor"
+self.client_parameters = client_parameters or {}
+self.hook_params = kwargs.pop("hook_params", {})
+self.handler = handler
+
+def _get_hook(self) -> DatabricksSqlHook:
+return DatabricksSqlHook(
+self.databricks_co

[GitHub] [airflow] harishkrao commented on a diff in pull request #28950: Sensor for Databricks partition and table changes

2023-02-20 Thread via GitHub


harishkrao commented on code in PR #28950:
URL: https://github.com/apache/airflow/pull/28950#discussion_r605205


##
airflow/providers/databricks/sensors/databricks_sql.py:
##
@@ -0,0 +1,136 @@
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+#
+"""This module contains Databricks sensors."""
+
+from __future__ import annotations
+
+from typing import Any, Callable, Sequence
+
+from airflow.exceptions import AirflowException
+from airflow.providers.common.sql.hooks.sql import fetch_all_handler
+from airflow.providers.databricks.hooks.databricks_sql import DatabricksSqlHook
+from airflow.sensors.base import BaseSensorOperator
+from airflow.utils.context import Context
+
+
+class DatabricksSqlSensor(BaseSensorOperator):
+"""Generic SQL sensor for Databricks
+
+:param databricks_conn_id: connection id from Airflow to databricks,
+defaults to DatabricksSqlHook.default_conn_name
+:param http_path: Optional string specifying HTTP path of
+Databricks SQL Endpoint or cluster.If not specified, it should be 
either specified
+in the Databricks connection's extra parameters, or 
``sql_endpoint_name`` must be specified.
+:param sql_endpoint_name: Optional name of Databricks SQL Endpoint.
+If not specified, ``http_path`` must be provided as described above.
+:param session_configuration: An optional dictionary of Spark session 
parameters.
+Defaults to None. If not specified, it could be specified in the
+Databricks connection's extra parameters.
+:param http_headers: An optional list of (k, v) pairs that will be set as 
HTTP headers on every request.
+:param client_parameters: Additional parameters internal to Databricks SQL 
Connector parameters.
+:param sql: SQL query to be executed.
+:param catalog: Database name/catalog name, defaults to "hive_metastore"
+:param schema: Schema name, defaults to "default"
+:param handler: The result handler which is called with the result of each 
statement.
+"""
+
+template_fields: Sequence[str] = (
+"databricks_conn_id",
+"sql",
+"catalog",
+"schema",
+"http_headers",
+)
+
+template_ext: Sequence[str] = (".sql",)
+template_fields_renderers = {"sql": "sql"}
+
+def __init__(
+self,
+*,
+databricks_conn_id: str = DatabricksSqlHook.default_conn_name,
+http_path: str | None = None,
+sql_endpoint_name: str | None = None,
+session_configuration=None,
+http_headers: list[tuple[str, str]] | None = None,
+catalog: str = "hive_metastore",
+schema: str = "default",
+sql: str | None = None,
+handler: Callable[[Any], Any] = fetch_all_handler,
+client_parameters: dict[str, Any] | None = None,
+**kwargs,
+) -> None:
+"""_summary_
+
+:param databricks_conn_id: _description_, defaults to 
DatabricksSqlHook.default_conn_name
+:param http_path: _description_, defaults to None
+:param sql_endpoint_name: _description_, defaults to None
+:param session_configuration: _description_, defaults to None
+:param http_headers: _description_, defaults to None
+:param catalog: _description_, defaults to "hive_metastore"
+:param schema: _description_, defaults to "default"
+:param sql: _description_, defaults to None
+:param handler: _description_, defaults to fetch_all_handler
+:param client_parameters: _description_, defaults to None
+"""
+super().__init__(**kwargs)
+self.databricks_conn_id = databricks_conn_id
+self._http_path = http_path
+self._sql_endpoint_name = sql_endpoint_name
+self.session_config = session_configuration
+self.http_headers = http_headers
+self.catalog = catalog
+self.schema = schema
+self.sql = sql
+self.caller = "DatabricksSqlSensor"
+self.client_parameters = client_parameters or {}
+self.hook_params = kwargs.pop("hook_params", {})
+self.handler = handler
+
+def _get_hook(self) -> DatabricksSqlHook:
+return DatabricksSqlHook(
+self.databricks_co

[GitHub] [airflow] ephraimbuddy commented on a diff in pull request #28256: Include full path to Python files under zip path while clearing import errors.

2023-02-20 Thread via GitHub


ephraimbuddy commented on code in PR #28256:
URL: https://github.com/apache/airflow/pull/28256#discussion_r619848


##
airflow/dag_processing/manager.py:
##
@@ -782,7 +782,11 @@ def clear_nonexistent_import_errors(file_paths: list[str] 
| None, session=NEW_SE
 """
 query = session.query(errors.ImportError)
 if file_paths:
-query = query.filter(~errors.ImportError.filename.in_(file_paths))
+for file_path in file_paths:
+if file_path.endswith(".zip"):
+query = 
query.filter(~(errors.ImportError.filename.startswith(file_path)))
+else:
+query = query.filter(errors.ImportError.filename != 
file_path)

Review Comment:
   Should we make it so that if there's no zip file in file_paths then we don't 
need to iterate the file_paths but run the removed query: `query = 
query.filter(~errors.ImportError.filename.in_(file_paths))`, because now we run 
a query for every file. I think we still have performance issue here 



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] yjijn opened a new issue, #29626: Using ProcessPoolExecutor in my task get an Exception: A process in the process pool was terminated abruptly while the future was running or pendin

2023-02-20 Thread via GitHub


yjijn opened a new issue, #29626:
URL: https://github.com/apache/airflow/issues/29626

   ### Apache Airflow version
   
   Other Airflow 2 version (please specify below)
   
   ### What happened
   
   airflow version:2.1.3
   python version:3.8.0
   
   I use this function in my task and get the an exception
   
![image](https://user-images.githubusercontent.com/35212739/220051884-272cdcad-a6e0-4458-a2c7-cf5205ae4730.png)
   
   
![image](https://user-images.githubusercontent.com/35212739/220052346-4f602548-f3b9-4b58-969a-6304866be175.png)
   
   
   ### What you think should happen instead
   
   _No response_
   
   ### How to reproduce
   
   run a task which using ProcessPoolExecutor
   
   ### Operating System
   
   linux
   
   ### Versions of Apache Airflow Providers
   
   _No response_
   
   ### Deployment
   
   Other
   
   ### Deployment details
   
   _No response_
   
   ### Anything else
   
   _No response_
   
   ### Are you willing to submit PR?
   
   - [ ] Yes I am willing to submit a PR!
   
   ### Code of Conduct
   
   - [X] I agree to follow this project's [Code of 
Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] boring-cyborg[bot] commented on issue #29626: Using ProcessPoolExecutor in my task get an Exception: A process in the process pool was terminated abruptly while the future was runni

2023-02-20 Thread via GitHub


boring-cyborg[bot] commented on issue #29626:
URL: https://github.com/apache/airflow/issues/29626#issuecomment-1436542943

   Thanks for opening your first issue here! Be sure to follow the issue 
template!
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] ephraimbuddy commented on a diff in pull request #29409: Fix Rest API update user output

2023-02-20 Thread via GitHub


ephraimbuddy commented on code in PR #29409:
URL: https://github.com/apache/airflow/pull/29409#discussion_r622051


##
airflow/api_connexion/openapi/v1.yaml:
##
@@ -2288,7 +2288,7 @@ paths:
   content:
 application/json:
   schema:
-$ref: '#/components/schemas/Role'
+$ref: '#/components/schemas/User'

Review Comment:
   It's strange to me that the response is not enforced in the spec. I had 
thought that the spec provides input and output validation but this proves 
otherwise



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] Taragolis opened a new pull request, #29627: Clarify `service_config` in AWS Connection

2023-02-20 Thread via GitHub


Taragolis opened a new pull request, #29627:
URL: https://github.com/apache/airflow/pull/29627

   We add information about potential future implementation in `service_config` 
Extra AWS Connection config.
   The sample include information about S3, STS and EMR however we only 
implement bucket_name for S3.
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] potiuk closed issue #29561: CeleryKubernetesExecutor @task(queue="kubernetes")  not work

2023-02-20 Thread via GitHub


potiuk closed issue #29561: CeleryKubernetesExecutor @task(queue="kubernetes")  
not work
URL: https://github.com/apache/airflow/issues/29561


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] potiuk merged pull request #29441: datasets, next_run_datasets, remove unnecessary timestamp filter

2023-02-20 Thread via GitHub


potiuk merged PR #29441:
URL: https://github.com/apache/airflow/pull/29441


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] potiuk closed issue #26892: Dataset Next Trigger Modal Not Populating Latest Update

2023-02-20 Thread via GitHub


potiuk closed issue #26892: Dataset Next Trigger Modal Not Populating Latest 
Update
URL: https://github.com/apache/airflow/issues/26892


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[airflow] branch main updated: datasets, next_run_datasets, remove unnecessary timestamp filter (#29441)

2023-02-20 Thread potiuk
This is an automated email from the ASF dual-hosted git repository.

potiuk pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/airflow.git


The following commit(s) were added to refs/heads/main by this push:
 new 6f9efbd053 datasets, next_run_datasets, remove unnecessary timestamp 
filter (#29441)
6f9efbd053 is described below

commit 6f9efbd0537944102cd4a1cfef06e11fe0a3d03d
Author: Michael Petro <40223998+michaelmich...@users.noreply.github.com>
AuthorDate: Mon Feb 20 03:42:42 2023 -0500

datasets, next_run_datasets, remove unnecessary timestamp filter (#29441)

* datasets, next_run_datasets, remove unnecessary timestamp filter

* remove redundant `_and`
---
 airflow/www/views.py | 5 +
 1 file changed, 1 insertion(+), 4 deletions(-)

diff --git a/airflow/www/views.py b/airflow/www/views.py
index c4fa14ebc5..5335d7bd37 100644
--- a/airflow/www/views.py
+++ b/airflow/www/views.py
@@ -3706,10 +3706,7 @@ class Airflow(AirflowBaseView):
 )
 .join(
 DatasetEvent,
-and_(
-DatasetEvent.dataset_id == DatasetModel.id,
-DatasetEvent.timestamp > DatasetDagRunQueue.created_at,
-),
+DatasetEvent.dataset_id == DatasetModel.id,
 isouter=True,
 )
 .filter(DagScheduleDatasetReference.dag_id == dag_id, 
~DatasetModel.is_orphaned)



[GitHub] [airflow] potiuk merged pull request #29409: Fix Rest API update user output

2023-02-20 Thread via GitHub


potiuk merged PR #29409:
URL: https://github.com/apache/airflow/pull/29409


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[airflow] branch main updated: Fix Rest API update user output (#29409)

2023-02-20 Thread potiuk
This is an automated email from the ASF dual-hosted git repository.

potiuk pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/airflow.git


The following commit(s) were added to refs/heads/main by this push:
 new 20206e9a93 Fix Rest API update user output (#29409)
20206e9a93 is described below

commit 20206e9a934c623b4936a66c1c784083f0f16f71
Author: Vincent <97131062+vincb...@users.noreply.github.com>
AuthorDate: Mon Feb 20 03:42:55 2023 -0500

Fix Rest API update user output (#29409)

* Fix patch user API

* Use UserCollectionItem
---
 airflow/api_connexion/openapi/v1.yaml| 2 +-
 airflow/www/static/js/types/api-generated.ts | 2 +-
 2 files changed, 2 insertions(+), 2 deletions(-)

diff --git a/airflow/api_connexion/openapi/v1.yaml 
b/airflow/api_connexion/openapi/v1.yaml
index f8f7e98621..f64cd7c8c8 100644
--- a/airflow/api_connexion/openapi/v1.yaml
+++ b/airflow/api_connexion/openapi/v1.yaml
@@ -2288,7 +2288,7 @@ paths:
   content:
 application/json:
   schema:
-$ref: '#/components/schemas/Role'
+$ref: '#/components/schemas/UserCollectionItem'
 '400':
   $ref: '#/components/responses/BadRequest'
 '401':
diff --git a/airflow/www/static/js/types/api-generated.ts 
b/airflow/www/static/js/types/api-generated.ts
index 2380e7fef5..7228d5fdf9 100644
--- a/airflow/www/static/js/types/api-generated.ts
+++ b/airflow/www/static/js/types/api-generated.ts
@@ -4479,7 +4479,7 @@ export interface operations {
   /** Success. */
   200: {
 content: {
-  "application/json": components["schemas"]["Role"];
+  "application/json": components["schemas"]["UserCollectionItem"];
 };
   };
   400: components["responses"]["BadRequest"];



[GitHub] [airflow] potiuk closed issue #29626: Using ProcessPoolExecutor in my task get an Exception: A process in the process pool was terminated abruptly while the future was running or pending.

2023-02-20 Thread via GitHub


potiuk closed issue #29626: Using ProcessPoolExecutor in my task get an 
Exception: A process in the process pool was terminated abruptly while the 
future was running or pending.
URL: https://github.com/apache/airflow/issues/29626


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] harishkrao commented on a diff in pull request #28950: Sensor for Databricks partition and table changes

2023-02-20 Thread via GitHub


harishkrao commented on code in PR #28950:
URL: https://github.com/apache/airflow/pull/28950#discussion_r636261


##
airflow/providers/databricks/sensors/databricks_sql.py:
##
@@ -0,0 +1,136 @@
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+#
+"""This module contains Databricks sensors."""
+
+from __future__ import annotations
+
+from typing import Any, Callable, Sequence
+
+from airflow.exceptions import AirflowException
+from airflow.providers.common.sql.hooks.sql import fetch_all_handler
+from airflow.providers.databricks.hooks.databricks_sql import DatabricksSqlHook
+from airflow.sensors.base import BaseSensorOperator
+from airflow.utils.context import Context
+
+
+class DatabricksSqlSensor(BaseSensorOperator):
+"""Generic SQL sensor for Databricks
+
+:param databricks_conn_id: connection id from Airflow to databricks,
+defaults to DatabricksSqlHook.default_conn_name
+:param http_path: Optional string specifying HTTP path of
+Databricks SQL Endpoint or cluster.If not specified, it should be 
either specified
+in the Databricks connection's extra parameters, or 
``sql_endpoint_name`` must be specified.
+:param sql_endpoint_name: Optional name of Databricks SQL Endpoint.
+If not specified, ``http_path`` must be provided as described above.
+:param session_configuration: An optional dictionary of Spark session 
parameters.
+Defaults to None. If not specified, it could be specified in the
+Databricks connection's extra parameters.
+:param http_headers: An optional list of (k, v) pairs that will be set as 
HTTP headers on every request.
+:param client_parameters: Additional parameters internal to Databricks SQL 
Connector parameters.
+:param sql: SQL query to be executed.
+:param catalog: Database name/catalog name, defaults to "hive_metastore"
+:param schema: Schema name, defaults to "default"
+:param handler: The result handler which is called with the result of each 
statement.
+"""
+
+template_fields: Sequence[str] = (
+"databricks_conn_id",
+"sql",
+"catalog",
+"schema",
+"http_headers",
+)
+
+template_ext: Sequence[str] = (".sql",)
+template_fields_renderers = {"sql": "sql"}
+
+def __init__(
+self,
+*,
+databricks_conn_id: str = DatabricksSqlHook.default_conn_name,
+http_path: str | None = None,
+sql_endpoint_name: str | None = None,
+session_configuration=None,
+http_headers: list[tuple[str, str]] | None = None,
+catalog: str = "hive_metastore",
+schema: str = "default",
+sql: str | None = None,
+handler: Callable[[Any], Any] = fetch_all_handler,
+client_parameters: dict[str, Any] | None = None,
+**kwargs,
+) -> None:
+"""_summary_
+
+:param databricks_conn_id: _description_, defaults to 
DatabricksSqlHook.default_conn_name
+:param http_path: _description_, defaults to None
+:param sql_endpoint_name: _description_, defaults to None
+:param session_configuration: _description_, defaults to None
+:param http_headers: _description_, defaults to None
+:param catalog: _description_, defaults to "hive_metastore"
+:param schema: _description_, defaults to "default"
+:param sql: _description_, defaults to None
+:param handler: _description_, defaults to fetch_all_handler
+:param client_parameters: _description_, defaults to None
+"""
+super().__init__(**kwargs)
+self.databricks_conn_id = databricks_conn_id
+self._http_path = http_path
+self._sql_endpoint_name = sql_endpoint_name
+self.session_config = session_configuration
+self.http_headers = http_headers
+self.catalog = catalog
+self.schema = schema
+self.sql = sql
+self.caller = "DatabricksSqlSensor"
+self.client_parameters = client_parameters or {}
+self.hook_params = kwargs.pop("hook_params", {})
+self.handler = handler
+
+def _get_hook(self) -> DatabricksSqlHook:
+return DatabricksSqlHook(
+self.databricks_co

[GitHub] [airflow] ephraimbuddy commented on a diff in pull request #29446: Scheduler, make stale DAG deactivation threshold configurable instead of using dag processing timeout

2023-02-20 Thread via GitHub


ephraimbuddy commented on code in PR #29446:
URL: https://github.com/apache/airflow/pull/29446#discussion_r634767


##
airflow/dag_processing/manager.py:
##
@@ -523,13 +525,17 @@ def deactivate_stale_dags(
 query = query.filter(DagModel.processor_subdir == dag_directory)
 dags_parsed = query.all()
 
+processor_timeout_seconds: int = conf.getint("core", 
"dag_file_processor_timeout")
+
+

Review Comment:
   ```suggestion
   ```



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] harishkrao commented on pull request #28950: Sensor for Databricks partition and table changes

2023-02-20 Thread via GitHub


harishkrao commented on PR #28950:
URL: https://github.com/apache/airflow/pull/28950#issuecomment-1436560053

   > See my comment about returning false vs. throwing an exception when there 
is no results.
   > 
   > But primary request for changes is for adding missing pieces:
   > 
   > * We need documentation be added as well
   > 
   > * Documentation should include examples - add a sensor example to 
`tests/system/providers/databricks` - it will be used for integration tests
   
   @alexott just pushed an example DAG file, similar to the ones for Operators.


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] tirkarthi commented on a diff in pull request #28256: Include full path to Python files under zip path while clearing import errors.

2023-02-20 Thread via GitHub


tirkarthi commented on code in PR #28256:
URL: https://github.com/apache/airflow/pull/28256#discussion_r644805


##
airflow/dag_processing/manager.py:
##
@@ -782,7 +782,11 @@ def clear_nonexistent_import_errors(file_paths: list[str] 
| None, session=NEW_SE
 """
 query = session.query(errors.ImportError)
 if file_paths:
-query = query.filter(~errors.ImportError.filename.in_(file_paths))
+for file_path in file_paths:
+if file_path.endswith(".zip"):
+query = 
query.filter(~(errors.ImportError.filename.startswith(file_path)))
+else:
+query = query.filter(errors.ImportError.filename != 
file_path)

Review Comment:
   As per my understanding the query is not executed for every file. The query 
is only constructed and it's executed only once in the delete statement below. 
There is a difference in construction that this will use multiple "not equal 
to" joined by "and" while the current one uses in.



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] tirkarthi commented on issue #29561: CeleryKubernetesExecutor @task(queue="kubernetes")  not work

2023-02-20 Thread via GitHub


tirkarthi commented on issue #29561:
URL: https://github.com/apache/airflow/issues/29561#issuecomment-1436569555

   You're welcome @takersk 


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] ephraimbuddy commented on a diff in pull request #28256: Include full path to Python files under zip path while clearing import errors.

2023-02-20 Thread via GitHub


ephraimbuddy commented on code in PR #28256:
URL: https://github.com/apache/airflow/pull/28256#discussion_r674660


##
airflow/dag_processing/manager.py:
##
@@ -782,7 +782,11 @@ def clear_nonexistent_import_errors(file_paths: list[str] 
| None, session=NEW_SE
 """
 query = session.query(errors.ImportError)
 if file_paths:
-query = query.filter(~errors.ImportError.filename.in_(file_paths))
+for file_path in file_paths:
+if file_path.endswith(".zip"):
+query = 
query.filter(~(errors.ImportError.filename.startswith(file_path)))
+else:
+query = query.filter(errors.ImportError.filename != 
file_path)

Review Comment:
   Then if I understand correctly, each time through the loop, new query is 
formed and old one thrown away?



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] bolkedebruin commented on a diff in pull request #29433: Add dataset update endpoint

2023-02-20 Thread via GitHub


bolkedebruin commented on code in PR #29433:
URL: https://github.com/apache/airflow/pull/29433#discussion_r675033


##
airflow/datasets/manager.py:
##
@@ -55,23 +61,33 @@ def register_dataset_change(
 dataset_model = session.query(DatasetModel).filter(DatasetModel.uri == 
dataset.uri).one_or_none()
 if not dataset_model:
 self.log.warning("DatasetModel %s not found", dataset)
-return
-session.add(
-DatasetEvent(
+return None
+
+if task_instance:
+dataset_event = DatasetEvent(
 dataset_id=dataset_model.id,
 source_task_id=task_instance.task_id,
 source_dag_id=task_instance.dag_id,
 source_run_id=task_instance.run_id,
 source_map_index=task_instance.map_index,
 extra=extra,
 )
-)
+else:
+# When an external dataset change is made through the API, it 
isn't triggered by a task instance,
+# so we create a DatasetEvent without the task and dag data.
+dataset_event = DatasetEvent(

Review Comment:
   It would be great to have extra information available when the dataset has 
externally changed such as:
   
   * by whom - `external_auth_id` or `external_service_id` -> required
   * from where (api, client_ip / remote_addr) - `external_source` -> required
   * the timestamp of the actual event - so it can be reconciled if required -> 
Nullable as it might not be available
   
   This ensures lineage isn't broken across systems



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] potiuk commented on issue #29524: Possible deadlock when max_active_runs maxed + depends_on_past = True

2023-02-20 Thread via GitHub


potiuk commented on issue #29524:
URL: https://github.com/apache/airflow/issues/29524#issuecomment-1436621796

   Turning that into discussion then./


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] potiuk closed issue #29524: Possible deadlock when max_active_runs maxed + depends_on_past = True

2023-02-20 Thread via GitHub


potiuk closed issue #29524: Possible deadlock when max_active_runs maxed + 
depends_on_past = True
URL: https://github.com/apache/airflow/issues/29524


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] potiuk commented on pull request #29498: add missing read for K8S config file from conn in deferred `KubernetesPodOperator`

2023-02-20 Thread via GitHub


potiuk commented on PR #29498:
URL: https://github.com/apache/airflow/pull/29498#issuecomment-1436627597

   @hussein-awala  I guess you will be still changing the config access pattern 
on that one ? Do I understand correctly?


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] potiuk merged pull request #29554: Update issue triage policy

2023-02-20 Thread via GitHub


potiuk merged PR #29554:
URL: https://github.com/apache/airflow/pull/29554


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[airflow] branch main updated: Update issue triage policy (#29554)

2023-02-20 Thread potiuk
This is an automated email from the ASF dual-hosted git repository.

potiuk pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/airflow.git


The following commit(s) were added to refs/heads/main by this push:
 new dae7bf0800 Update issue triage policy (#29554)
dae7bf0800 is described below

commit dae7bf080025a8e535a86fc588b84d0b3e294ffd
Author: eladkal <45845474+elad...@users.noreply.github.com>
AuthorDate: Mon Feb 20 11:37:50 2023 +0200

Update issue triage policy (#29554)
---
 .github/ISSUE_TEMPLATE/airflow_bug_report.yml  |  2 +-
 .../ISSUE_TEMPLATE/airflow_doc_issue_report.yml|  2 +-
 .../airflow_helmchart_bug_report.yml   |  2 +-
 .../airflow_providers_bug_report.yml   |  2 +-
 .github/ISSUE_TEMPLATE/feature_request.yml |  2 +-
 .github/workflows/stale.yml| 19 +++
 ISSUE_TRIAGE_PROCESS.rst   | 22 +++---
 7 files changed, 43 insertions(+), 8 deletions(-)

diff --git a/.github/ISSUE_TEMPLATE/airflow_bug_report.yml 
b/.github/ISSUE_TEMPLATE/airflow_bug_report.yml
index 6280a37f34..aa3fa29c18 100644
--- a/.github/ISSUE_TEMPLATE/airflow_bug_report.yml
+++ b/.github/ISSUE_TEMPLATE/airflow_bug_report.yml
@@ -1,7 +1,7 @@
 ---
 name: Airflow Bug report
 description: Problems and issues with code in Apache Airflow core
-labels: ["kind:bug", "area:core"]
+labels: ["kind:bug", "area:core", "needs-triage"]
 body:
   - type: markdown
 attributes:
diff --git a/.github/ISSUE_TEMPLATE/airflow_doc_issue_report.yml 
b/.github/ISSUE_TEMPLATE/airflow_doc_issue_report.yml
index 0977e98221..5b97840b5d 100644
--- a/.github/ISSUE_TEMPLATE/airflow_doc_issue_report.yml
+++ b/.github/ISSUE_TEMPLATE/airflow_doc_issue_report.yml
@@ -1,7 +1,7 @@
 ---
 name: Airflow Doc issue report
 description: Problems and issues with Apache Airflow documentation
-labels: ["kind:bug", "kind:documentation"]
+labels: ["kind:bug", "kind:documentation", "needs-triage"]
 body:
   - type: markdown
 attributes:
diff --git a/.github/ISSUE_TEMPLATE/airflow_helmchart_bug_report.yml 
b/.github/ISSUE_TEMPLATE/airflow_helmchart_bug_report.yml
index 19e4cf5917..78d0e9f63f 100644
--- a/.github/ISSUE_TEMPLATE/airflow_helmchart_bug_report.yml
+++ b/.github/ISSUE_TEMPLATE/airflow_helmchart_bug_report.yml
@@ -1,7 +1,7 @@
 ---
 name: Airflow Helm Chart Bug report
 description: Problems and issues with the Apache Airflow Official Helm Chart
-labels: ["kind:bug", "area:helm-chart"]
+labels: ["kind:bug", "area:helm-chart", "needs-triage"]
 body:
   - type: markdown
 attributes:
diff --git a/.github/ISSUE_TEMPLATE/airflow_providers_bug_report.yml 
b/.github/ISSUE_TEMPLATE/airflow_providers_bug_report.yml
index 99ca0b5236..756fd1cffe 100644
--- a/.github/ISSUE_TEMPLATE/airflow_providers_bug_report.yml
+++ b/.github/ISSUE_TEMPLATE/airflow_providers_bug_report.yml
@@ -1,7 +1,7 @@
 ---
 name: Airflow Providers Bug report
 description: Problems and issues with code in Apache Airflow Providers
-labels: ["kind:bug", "area:providers"]
+labels: ["kind:bug", "area:providers", "needs-triage"]
 body:
   - type: markdown
 attributes:
diff --git a/.github/ISSUE_TEMPLATE/feature_request.yml 
b/.github/ISSUE_TEMPLATE/feature_request.yml
index 837a88f698..21ed060802 100644
--- a/.github/ISSUE_TEMPLATE/feature_request.yml
+++ b/.github/ISSUE_TEMPLATE/feature_request.yml
@@ -1,7 +1,7 @@
 ---
 name: Airflow feature request
 description: Suggest an idea for this project
-labels: ["kind:feature"]
+labels: ["kind:feature", "needs-triage"]
 body:
   - type: markdown
 attributes:
diff --git a/.github/workflows/stale.yml b/.github/workflows/stale.yml
index 1dff613a5e..08c40b978d 100644
--- a/.github/workflows/stale.yml
+++ b/.github/workflows/stale.yml
@@ -47,3 +47,22 @@ jobs:
 activity occurs from the issue author.
   close-issue-message: >
 This issue has been closed because it has not received response 
from the issue author.
+  recheck-old-bug-report:
+runs-on: ubuntu-20.04
+steps:
+  - uses: actions/stale@v7
+with:
+  only-issue-labels: 'kind:bug'
+  stale-issue-label: 'Stale Bug Report'
+  days-before-issue-stale: 365
+  days-before-issue-close: 30
+  remove-stale-when-updated: true
+  labels-to-add-when-unstale: 'needs-triage'
+  stale-issue-message: >
+This issue has been automatically marked as stale because it has 
been open for 365 days
+without any activity. There has been several Airflow releases 
since last activity on this issue.
+Kindly asking to recheck the report against latest Airflow version 
and let us know
+if the issue is reproducible. The issue will be closed in next 30 
days if no further activity
+occurs from the issue author.
+  close-issue-message: >
+This issue has been closed because it has not received re

[GitHub] [airflow] potiuk commented on pull request #26639: Multi-threads support for processing diff queues in Kubernetes Executor

2023-02-20 Thread via GitHub


potiuk commented on PR #26639:
URL: https://github.com/apache/airflow/pull/26639#issuecomment-1436643390

   @Dinghang :) ? 


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] jose-lpa commented on issue #29432: Jinja templating doesn't work with container_resources when using dymanic task mapping with Kubernetes Pod Operator

2023-02-20 Thread via GitHub


jose-lpa commented on issue #29432:
URL: https://github.com/apache/airflow/issues/29432#issuecomment-1436658884

   @hussein-awala Can I ask what Airflow version is going to include this fix? 
I'm in the same situation as @vasu2809: using Cloud Composer and not able to 
just upgrade to the latest Airflow version. Looking at the code, it doesn't 
seem that users can fix this issue by just creating a custom operator or 
something like that, right?


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] hussein-awala commented on pull request #29498: add missing read for K8S config file from conn in deferred `KubernetesPodOperator`

2023-02-20 Thread via GitHub


hussein-awala commented on PR #29498:
URL: https://github.com/apache/airflow/pull/29498#issuecomment-1436659985

   > I guess you will be still changing the config access pattern on that one ? 
Do I understand correctly?
   
   Yes, I'm testing loading the config file in the triggerer instead of loading 
it in the worker and pass it as a dict.
   
   I convert the PR to draft until I finish testing


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] uranusjr opened a new pull request, #29631: Avoid unneeded Connexion constructs

2023-02-20 Thread via GitHub


uranusjr opened a new pull request, #29631:
URL: https://github.com/apache/airflow/pull/29631

   We only use Connexion's App class to create a Flask Blueprint, so really we 
can go one level lower and use the FlaskApi interface directly instead.
   
   Connexion's spec-loading also supports dynamic spec templating via Jinja2, 
but we don't and CAN'T use it (won't work for any of the OpenAPI clients), so 
it can be entirely skipped by passing in a pre-loaded dict (instead of a path 
to the YAML file). Jinja2 templating is not cheap!
   
   The above provides a ~5% save in webserver startup on my machine.


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] Taragolis commented on pull request #29616: Refactor docker-compose quick start test

2023-02-20 Thread via GitHub


Taragolis commented on PR #29616:
URL: https://github.com/apache/airflow/pull/29616#issuecomment-1436664937

   The more you look on error the less chance that this error will reproduce 🙄 
🤣 


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] V0lantis opened a new issue, #29632: Postgres secret Bug when upgrading helm chart version (1.7.0 -> 1.8.0)

2023-02-20 Thread via GitHub


V0lantis opened a new issue, #29632:
URL: https://github.com/apache/airflow/issues/29632

   ### Official Helm Chart version
   
   1.8.0 (latest released)
   
   ### Apache Airflow version
   
   2.4.3
   
   ### Kubernetes Version
   
   v1.21.14-eks
   
   ### Helm Chart configuration
   
   _No response_
   
   ### Docker Image customizations
   
   _No response_
   
   ### What happened
   
   When trying to applying our deployment configuration with the new helm-chart 
release, we faced the following issue:
   ```
   COMBINED OUTPUT:
 Error: UPGRADE FAILED: execution error at 
(airflow/charts/postgresql/templates/secrets.yaml:17:24): 
 PASSWORDS ERROR: The secret "airflow-postgresql" does not contain the key 
"postgres-password"
   ```
   
   Although, when I go look for the secret, I find it but with the key 
`postgresQL-password` (Trailing **ql** at the end)
   
   ### What you think should happen instead
   
   It should work fine. Although I think the issue come from the postgresql 
bitnani version
   
   ### How to reproduce
   
   Create a Airflow deployment with helm-chart 1.7.0 and with a 
`postgres.enabled: true`. Then upgrade your helm chart to 1.8.0 and try to 
deploy again. 
   
   ### Anything else
   
   _No response_
   
   ### Are you willing to submit PR?
   
   - [X] Yes I am willing to submit a PR!
   
   ### Code of Conduct
   
   - [X] I agree to follow this project's [Code of 
Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] eladkal opened a new pull request, #29633: `RedshiftDataOperator` replace `await_result` with `wait_for_completion`

2023-02-20 Thread via GitHub


eladkal opened a new pull request, #29633:
URL: https://github.com/apache/airflow/pull/29633

   Followup of 
https://github.com/apache/airflow/pull/27947#discussion_r1096353052
   
   
   
   ---
   **^ Add meaningful description above**
   
   Read the **[Pull Request 
Guidelines](https://github.com/apache/airflow/blob/main/CONTRIBUTING.rst#pull-request-guidelines)**
 for more information.
   In case of fundamental code changes, an Airflow Improvement Proposal 
([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvement+Proposals))
 is needed.
   In case of a new dependency, check compliance with the [ASF 3rd Party 
License Policy](https://www.apache.org/legal/resolved.html#category-x).
   In case of backwards incompatible changes please leave a note in a 
newsfragment file, named `{pr_number}.significant.rst` or 
`{issue_number}.significant.rst`, in 
[newsfragments](https://github.com/apache/airflow/tree/main/newsfragments).
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] Taragolis commented on a diff in pull request #29633: `RedshiftDataOperator` replace `await_result` with `wait_for_completion`

2023-02-20 Thread via GitHub


Taragolis commented on code in PR #29633:
URL: https://github.com/apache/airflow/pull/29633#discussion_r744729


##
airflow/providers/amazon/aws/operators/redshift_data.py:
##
@@ -73,10 +73,11 @@ def __init__(
 secret_arn: str | None = None,
 statement_name: str | None = None,
 with_event: bool = False,
-await_result: bool = True,
+await_result: bool | None = None,
 poll_interval: int = 10,
 aws_conn_id: str = "aws_default",
 region: str | None = None,
+wait_for_completion: bool = True,

Review Comment:
   Small nit, could we move in `await_result` in the end of operator arguments? 
And in this place we could store `wait_for_completion`
   
   This not affect to users because it is mandatory requirement for all 
operators to send arguments as key-value arguments and I think it would be nice 
to store all deprecated values in the bottom operator constructor definition.



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] VladaZakharova commented on pull request #29498: add missing read for K8S config file from conn in deferred `KubernetesPodOperator`

2023-02-20 Thread via GitHub


VladaZakharova commented on PR #29498:
URL: https://github.com/apache/airflow/pull/29498#issuecomment-1436702119

   Hi!
   May i ask in which format you will pass the config file to trigger?


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] V0lantis commented on issue #29632: Postgres secret Bug when upgrading helm chart version (1.7.0 -> 1.8.0)

2023-02-20 Thread via GitHub


V0lantis commented on issue #29632:
URL: https://github.com/apache/airflow/issues/29632#issuecomment-1436712605

   Find the previous version I probably used for my deployment 
[here](https://github.com/bitnami/charts/blob/e2404c63cc5e72d5dee91e50b4597fd9e0b1ab0c/bitnami/postgresql/templates/secrets.yaml#L20)
 and the PR is bitnami/charts#8827. I guess a similar issue is 
bitnami/charts#4416.
   A potential workaround is to update the secret in place and add a second 
line with the same password. (Tested and worked in my environment). Feel free 
to close my issue if you think it is fixed.


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] vilozio commented on issue #26773: Tasks stuck in queued state forever after upgrade from 2.2.0 to 2.3.4

2023-02-20 Thread via GitHub


vilozio commented on issue #26773:
URL: https://github.com/apache/airflow/issues/26773#issuecomment-1436722361

   Sorry for a late reply. This issue was resolved for me after migrating to 
2.4.1.


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] Sangameshmsopen commented on issue #25060: Airflow scheduler crashes due to 'duplicate key value violates unique constraint "task_instance_pkey"'

2023-02-20 Thread via GitHub


Sangameshmsopen commented on issue #25060:
URL: https://github.com/apache/airflow/issues/25060#issuecomment-1436735346

   @potiuk 
   
   Sure i'm providing the details. Same issue has occurred even in one of our 
testing environment.
   
   **Apache airflow version:**
   2.4.3
   **Deployed in Google cloud composer:**
   2.1.4
   
   
   **What happened:**
   
   We have one scheduling DAG which will trigger at 12:00 AM. In that DAG we 
have main task and sub tasks as well.
   Sub tasks will be created dynamically based on few arguments.
   
   When Sub task (which has created dynamically) starts in DAG, I can see 
instance details as null (means no instance has created for that task. Please 
refer screenshot 1). So i don't get any logs for that task.
   
   Screenshot 1:
   https://user-images.githubusercontent.com/107921145/220083694-b090e7b8-1836-46b2-be18-443a3bf763a9.png";>
   
   
   But when i checked the logs in composer service. I can see the error log 
which has occurred under scheduler and time is almost near to stopping of 
scheduler heart beat. (Please refer screenshot 2)
   
   Screenshot 2:
   ![Screenshot 2023-02-20 at 3 48 14 
PM](https://user-images.githubusercontent.com/107921145/220083878-5141a146-fefd-4168-864d-8cde15638020.png)
   ![Screenshot 2023-02-20 at 3 39 48 
PM](https://user-images.githubusercontent.com/107921145/220083919-629ff887-e828-4cb5-9be6-e62f1075be08.png)
   
   Please let me know if any other details are required.


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] s0neq opened a new pull request, #29635: support setting endpoint in yandex airflow provider

2023-02-20 Thread via GitHub


s0neq opened a new pull request, #29635:
URL: https://github.com/apache/airflow/pull/29635

   Sometimes people use API endpoints different from "api.cloud.yandex.net". 
Yandex.Cloud Python SDK already supports setting custom API endpoint, this PR 
makes it possible to set endpoint in Yandex Airflow Provider's Yandex.Cloud 
Connection. 
   
   Functionality is tested on local AirFlow installation.
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] boring-cyborg[bot] commented on pull request #29635: support setting endpoint in yandex airflow provider

2023-02-20 Thread via GitHub


boring-cyborg[bot] commented on PR #29635:
URL: https://github.com/apache/airflow/pull/29635#issuecomment-1436775002

   Congratulations on your first Pull Request and welcome to the Apache Airflow 
community! If you have any issues or are unsure about any anything please check 
our Contribution Guide 
(https://github.com/apache/airflow/blob/main/CONTRIBUTING.rst)
   Here are some useful points:
   - Pay attention to the quality of your code (ruff, mypy and type 
annotations). Our [pre-commits]( 
https://github.com/apache/airflow/blob/main/STATIC_CODE_CHECKS.rst#prerequisites-for-pre-commit-hooks)
 will help you with that.
   - In case of a new feature add useful documentation (in docstrings or in 
`docs/` directory). Adding a new operator? Check this short 
[guide](https://github.com/apache/airflow/blob/main/docs/apache-airflow/howto/custom-operator.rst)
 Consider adding an example DAG that shows how users should use it.
   - Consider using [Breeze 
environment](https://github.com/apache/airflow/blob/main/BREEZE.rst) for 
testing locally, it's a heavy docker but it ships with a working Airflow and a 
lot of integrations.
   - Be patient and persistent. It might take some time to get a review or get 
the final approval from Committers.
   - Please follow [ASF Code of 
Conduct](https://www.apache.org/foundation/policies/conduct) for all 
communication including (but not limited to) comments on Pull Requests, Mailing 
list and Slack.
   - Be sure to read the [Airflow Coding style]( 
https://github.com/apache/airflow/blob/main/CONTRIBUTING.rst#coding-style-and-best-practices).
   Apache Airflow is a community-driven project and together we are making it 
better 🚀.
   In case of doubts contact the developers at:
   Mailing List: d...@airflow.apache.org
   Slack: https://s.apache.org/airflow-slack
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] etgcrog commented on issue #14365: docker-compose up airflow-init produces airflow command error

2023-02-20 Thread via GitHub


etgcrog commented on issue #14365:
URL: https://github.com/apache/airflow/issues/14365#issuecomment-1436782316

   I belive that you run the command : docker compose up airflow-init
   but, left the follow command: docker compose up
   if don't work yet, try:
   
   ```
   echo -e "AIRFLOW_UID=$(id -u)" > .env
   
   docker rm (docker ps -a -q)
   docker-compose down --volumes --rm all
   
   docker compose up airflow-init
   
   docker compose up
   ```
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] ashb commented on a diff in pull request #29625: Aggressively cache entry points in process

2023-02-20 Thread via GitHub


ashb commented on code in PR #29625:
URL: https://github.com/apache/airflow/pull/29625#discussion_r824731


##
airflow/utils/entry_points.py:
##
@@ -28,26 +30,33 @@
 
 log = logging.getLogger(__name__)
 
+EPnD = Tuple[metadata.EntryPoint, metadata.Distribution]
 
-def entry_points_with_dist(group: str) -> Iterator[tuple[metadata.EntryPoint, 
metadata.Distribution]]:
-"""Retrieve entry points of the given group.
-
-This is like the ``entry_points()`` function from importlib.metadata,
-except it also returns the distribution the entry_point was loaded from.
 
-:param group: Filter results to only this entrypoint group
-:return: Generator of (EntryPoint, Distribution) objects for the specified 
groups
-"""
+@functools.lru_cache(maxsize=None)
+def _get_grouped_entry_points() -> dict[str, list[EPnD]]:
 loaded: set[str] = set()
+mapping: dict[str, list[EPnD]] = collections.defaultdict(list)
 for dist in metadata.distributions():
 try:
 key = canonicalize_name(dist.metadata["Name"])

Review Comment:
   I was profling memory usage (not speed), and this seemed to be a cause of a 
lot of bloat -- and it's suprisingly expensive to get the dist name (involves 
parsing a lot of files for _every_ dist in airflow).
   
   I think there are three options here:
   
   1. Only compute this key if the entrypoint group matches (this limits the 
expensive operation to just dists we actually care about, instead of _all_
   2. Use the path component to be the cache key (see below)
   3. Cache based on the entrypoint classpath
   4. Don't cache at all. This only catches a case when you have multiple 
copies of the same dist in multiple cases (which is _rare_ to hit outside of 
being an Airflow developer anyway).
   
   On point 2, it doesn't seem possible to do this using public methods, but 
this:
   
   ```python console
   In [14]: d._path
   Out[14]: 
PosixPath('/home/ash/airflow/.venv/lib/python3.11/site-packages/greenlet-2.0.2.dist-info')
   
   In [15]: d._path.stem
   Out[15]: 'greenlet-2.0.2'
   ```
   
   Doing this might break for less common ways of shipping dists though, so 
it's likely not a good option, not without a fallback anyway.



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] potiuk commented on issue #29632: Postgres secret Bug when upgrading helm chart version (1.7.0 -> 1.8.0)

2023-02-20 Thread via GitHub


potiuk commented on issue #29632:
URL: https://github.com/apache/airflow/issues/29632#issuecomment-1436805177

   Yeah. This is expected - see Release Notes: 
https://airflow.apache.org/docs/helm-chart/stable/release_notes.html#airflow-helm-chart-1-8-0-2023-02-06
  where we explain the migration steps. The bult in postgres is really a 
"development" feature and if you go production, you should switch to external 
one, so we do not consider that as a breaking change - that's why we left 1.* 
version but we also provided migration steps in release notes in case somoene 
would like to upgrade their development setup.


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] potiuk closed issue #29632: Postgres secret Bug when upgrading helm chart version (1.7.0 -> 1.8.0)

2023-02-20 Thread via GitHub


potiuk closed issue #29632: Postgres secret Bug when upgrading helm chart 
version (1.7.0 -> 1.8.0)
URL: https://github.com/apache/airflow/issues/29632


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] maxnathaniel commented on pull request #28953: Updated Telegram Provider to ensure compatbility with >=20.0.0

2023-02-20 Thread via GitHub


maxnathaniel commented on PR #28953:
URL: https://github.com/apache/airflow/pull/28953#issuecomment-1436858511

   > @maxnathaniel are you OK with this?
   
   Sorry for my late reply. Yes, I will update the README and do a final check 
of the code.


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] hussein-awala commented on a diff in pull request #29270: Adding possibility for annotations in logs pvc

2023-02-20 Thread via GitHub


hussein-awala commented on code in PR #29270:
URL: https://github.com/apache/airflow/pull/29270#discussion_r889625


##
chart/templates/logs-persistent-volume-claim.yaml:
##
@@ -29,6 +29,10 @@ metadata:
 {{- with .Values.labels }}
 {{- toYaml . | nindent 4 }}
 {{- end }}
+{{- with .Values.logs.persistence.annotations }}
+  annotations:
+{{- toYaml . | nindent 4 }}
+  {{- end }}

Review Comment:
   ```suggestion
 {{- with .Values.logs.persistence.annotations }}
 annotations:
   {{- toYaml . | nindent 4 }}
 {{- end }}
   ```



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] hussein-awala commented on a diff in pull request #29622: Fix adding annotations for dag persistence PVC

2023-02-20 Thread via GitHub


hussein-awala commented on code in PR #29622:
URL: https://github.com/apache/airflow/pull/29622#discussion_r892276


##
chart/templates/dags-persistent-volume-claim.yaml:
##
@@ -29,10 +29,10 @@ metadata:
 {{- with .Values.labels }}
 {{- toYaml . | nindent 4 }}
 {{- end }}
+{{- with .Values.dags.persistence.annotations }}

Review Comment:
   ```suggestion
 {{- with .Values.dags.persistence.annotations }}
   ```



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] maxnathaniel commented on pull request #28953: Updated Telegram Provider to ensure compatbility with >=20.0.0

2023-02-20 Thread via GitHub


maxnathaniel commented on PR #28953:
URL: https://github.com/apache/airflow/pull/28953#issuecomment-1436957932

   @eladkal I have updated the CHANGELOG. Hope it makes sense and provides 
enough information about the changes


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] hussein-awala commented on pull request #29624: Can't configure Kubernetes and Celery workers in Helm Chart

2023-02-20 Thread via GitHub


hussein-awala commented on PR #29624:
URL: https://github.com/apache/airflow/pull/29624#issuecomment-1436973598

   I think this is not necessary, because the Kubernetes workers are not 
created by the helm chart, it just creates and configures the celery workers.
   
   For the K8S workers, they are created by the schedulers (a worker for each 
task), and you can configure them by providing a pod template in [the 
configuration](https://airflow.apache.org/docs/apache-airflow/stable/configurations-ref.html#pod-template-file)
 or override the 
[pod_template](https://airflow.apache.org/docs/apache-airflow/stable/core-concepts/executor/kubernetes.html#pod-override)
 for a single task.


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] janrito opened a new issue, #29637: EcsRegisterTaskDefinitionOperator params are incorrectly invalidated

2023-02-20 Thread via GitHub


janrito opened a new issue, #29637:
URL: https://github.com/apache/airflow/issues/29637

   ### Apache Airflow Provider(s)
   
   amazon
   
   ### Versions of Apache Airflow Providers
   
   
https://raw.githubusercontent.com/apache/airflow/constraints-2.4.3/constraints-3.10.txt
   
   ### Apache Airflow version
   
   2.4.3
   
   ### Operating System
   
   https://github.com/aws/aws-mwaa-local-runner / on macos 12.6
   
   ### Deployment
   
   Docker-Compose
   
   ### Deployment details
   
   [local ](https://github.com/aws/aws-mwaa-local-runner)
   
   ### What happened
   
   ECS task definition cpu and memory params are incorrectly invalidated
   
   ### What you think should happen instead
   
   parameters are correct, it should register a task. I can run the same bit of 
code using boto3.client.ecs.register_task_definition`
   
   ### How to reproduce
   
   Try and define a task using:
   
   ```python 
   EcsRegisterTaskDefinitionOperator(
   task_id="register_task",
   family=TASK_FAMILY_NAME,
   container_definitions=[
   {
   "name": CONTAINER_NAME,
   "image": "ubuntu",
   "workingDirectory": "/usr/bin",
   "entryPoint": ["sh", "-c"],
   "command": ["ls"],
   }
   ],
   register_task_kwargs={
   "requiresCompatibilities": ["FARGATE"],
   "cpu": "256",
   "memory": "512",
   "networkMode": "awsvpc",
   },
   )
   ```
   
   ```logs
   botocore.exceptions.ParamValidationError: Parameter validation failed:
   Invalid type for parameter cpu, value: 256, type: , valid 
types: 
   Invalid type for parameter memory, value: 512, type: , valid 
types: 
   ```
   
   Could this error have something to do with JSON serialising/deserialising 
where the parameters are autodetected as `int`s, rather than deserialised into 
strings
   
   ### Anything else
   
   _No response_
   
   ### Are you willing to submit PR?
   
   - [ ] Yes I am willing to submit a PR!
   
   ### Code of Conduct
   
   - [X] I agree to follow this project's [Code of 
Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] boring-cyborg[bot] commented on issue #29637: EcsRegisterTaskDefinitionOperator params are incorrectly invalidated

2023-02-20 Thread via GitHub


boring-cyborg[bot] commented on issue #29637:
URL: https://github.com/apache/airflow/issues/29637#issuecomment-1436994044

   Thanks for opening your first issue here! Be sure to follow the issue 
template!
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] eladkal merged pull request #29633: `RedshiftDataOperator` replace `await_result` with `wait_for_completion`

2023-02-20 Thread via GitHub


eladkal merged PR #29633:
URL: https://github.com/apache/airflow/pull/29633


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[airflow] branch main updated (dae7bf0800 -> 45419e23a9)

2023-02-20 Thread eladkal
This is an automated email from the ASF dual-hosted git repository.

eladkal pushed a change to branch main
in repository https://gitbox.apache.org/repos/asf/airflow.git


from dae7bf0800 Update issue triage policy (#29554)
 add 45419e23a9 `RedshiftDataOperator` replace `await_result` with 
`wait_for_completion` (#29633)

No new revisions were added by this update.

Summary of changes:
 .../amazon/aws/operators/redshift_data.py  | 20 
 .../amazon/aws/operators/test_redshift_data.py | 27 ++
 .../providers/amazon/aws/example_redshift.py   |  4 ++--
 3 files changed, 39 insertions(+), 12 deletions(-)



[GitHub] [airflow] eladkal commented on a diff in pull request #28953: Updated Telegram Provider to ensure compatbility with >=20.0.0

2023-02-20 Thread via GitHub


eladkal commented on code in PR #28953:
URL: https://github.com/apache/airflow/pull/28953#discussion_r964786


##
airflow/providers/telegram/CHANGELOG.rst:
##
@@ -24,6 +24,23 @@
 Changelog
 -
 
+3.2.0
+.

Review Comment:
   ```suggestion
   4.0.0
   .
   
   Breaking changes
   
   ```



##
airflow/providers/telegram/CHANGELOG.rst:
##
@@ -24,6 +24,23 @@
 Changelog
 -
 
+3.2.0
+.
+
+In this version, we upgraded the ``python-telegram-bot`` to ``20.0.0`` and 
above.
+All remains the same except that now the ``get_conn()`` method in 
``TelegramHook`` is a coroutine function.
+Refer to `python-telegram-bot transition guide 
`_
 for more details.
+
+Misc
+
+
+* ``Upgraded python-telegram-bot to 20.0.0 and above``
+* ``Updated unit tests to handle coroutine functions``
+
+.. Below changes are excluded from the changelog. Move them to
+   appropriate section above if needed. Do not delete the lines(!):

Review Comment:
   ```suggestion
   ```



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] pgagnon commented on a diff in pull request #29623: Implement file credentials provider for AWS hook AssumeRoleWithWebIdentity

2023-02-20 Thread via GitHub


pgagnon commented on code in PR #29623:
URL: https://github.com/apache/airflow/pull/29623#discussion_r979656


##
airflow/providers/amazon/aws/hooks/base_aws.py:
##
@@ -312,19 +312,35 @@ def _get_web_identity_credential_fetcher(
 base_session = self.basic_session._session or 
botocore.session.get_session()
 client_creator = base_session.create_client
 federation = 
self.extra_config.get("assume_role_with_web_identity_federation")
-if federation == "google":
-web_identity_token_loader = 
self._get_google_identity_token_loader()
-else:
-raise AirflowException(
-f'Unsupported federation: {federation}. Currently "google" 
only are supported.'
-)
+
+web_identity_token_loader = (
+{
+"file": self._get_file_token_loader,
+"google": self._get_google_identity_token_loader,
+}.get(federation)()
+if type(federation) == str
+else None
+)

Review Comment:
   Yes, that's also an option and can be preferred in many cases, but this 
explicitly allows configuration through the Airflow connections subsystem only.
   
   The same can be said about almost all connection types, but we enable 
specific configurations through extras to allow flexibility.



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] kaxil commented on pull request #29047: Add Livy Operator with deferrable mode

2023-02-20 Thread via GitHub


kaxil commented on PR #29047:
URL: https://github.com/apache/airflow/pull/29047#issuecomment-1437068944

   Failing tests


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] amoghrajesh commented on a diff in pull request #29270: Adding possibility for annotations in logs pvc

2023-02-20 Thread via GitHub


amoghrajesh commented on code in PR #29270:
URL: https://github.com/apache/airflow/pull/29270#discussion_r998546


##
chart/templates/logs-persistent-volume-claim.yaml:
##
@@ -29,6 +29,10 @@ metadata:
 {{- with .Values.labels }}
 {{- toYaml . | nindent 4 }}
 {{- end }}
+{{- with .Values.logs.persistence.annotations }}
+  annotations:
+{{- toYaml . | nindent 4 }}
+  {{- end }}

Review Comment:
   Oh, missed that. Made the changes



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] amoghrajesh commented on a diff in pull request #29622: Fix adding annotations for dag persistence PVC

2023-02-20 Thread via GitHub


amoghrajesh commented on code in PR #29622:
URL: https://github.com/apache/airflow/pull/29622#discussion_r1112000255


##
chart/templates/dags-persistent-volume-claim.yaml:
##
@@ -29,10 +29,10 @@ metadata:
 {{- with .Values.labels }}
 {{- toYaml . | nindent 4 }}
 {{- end }}
+{{- with .Values.dags.persistence.annotations }}

Review Comment:
   Made the change in the next commit



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] Taragolis closed issue #29637: EcsRegisterTaskDefinitionOperator params are incorrectly invalidated

2023-02-20 Thread via GitHub


Taragolis closed issue #29637: EcsRegisterTaskDefinitionOperator params are 
incorrectly invalidated
URL: https://github.com/apache/airflow/issues/29637


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] Taragolis commented on issue #29637: EcsRegisterTaskDefinitionOperator params are incorrectly invalidated

2023-02-20 Thread via GitHub


Taragolis commented on issue #29637:
URL: https://github.com/apache/airflow/issues/29637#issuecomment-1437078616

   Not a bug Airflow provider, the validation happen inside 
[`botocore`](https://github.com/boto/botocore) / 
[`boto3`](https://github.com/boto/boto3), see available options and types in 
[ECS.Client.register_task_definition](https://boto3.amazonaws.com/v1/documentation/api/latest/reference/services/ecs.html#ECS.Client.register_task_definition)
 


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] amoghrajesh commented on pull request #29624: Can't configure Kubernetes and Celery workers in Helm Chart

2023-02-20 Thread via GitHub


amoghrajesh commented on PR #29624:
URL: https://github.com/apache/airflow/pull/29624#issuecomment-1437079339

   Thanks for explaining that @hussein-awala
   Made the changes in sync with what was asked in the issue description: 
#28880. Maybe I understood the issue wrong. Could you go through that issue and 
explain to me where I went wrong?


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] pgagnon commented on a diff in pull request #29623: Implement file credentials provider for AWS hook AssumeRoleWithWebIdentity

2023-02-20 Thread via GitHub


pgagnon commented on code in PR #29623:
URL: https://github.com/apache/airflow/pull/29623#discussion_r1112002541


##
airflow/providers/amazon/aws/hooks/base_aws.py:
##
@@ -312,19 +312,35 @@ def _get_web_identity_credential_fetcher(
 base_session = self.basic_session._session or 
botocore.session.get_session()
 client_creator = base_session.create_client
 federation = 
self.extra_config.get("assume_role_with_web_identity_federation")
-if federation == "google":
-web_identity_token_loader = 
self._get_google_identity_token_loader()
-else:
-raise AirflowException(
-f'Unsupported federation: {federation}. Currently "google" 
only are supported.'
-)
+
+web_identity_token_loader = (
+{
+"file": self._get_file_token_loader,
+"google": self._get_google_identity_token_loader,
+}.get(federation)()
+if type(federation) == str
+else None
+)

Review Comment:
   Just to expand, if boto allowed passing `web_identity_token_file`, 
`role_arn`, and `role_session_name` as kwargs, then I would agree with you that 
nothing needs to be implemented, but AFAIK, that is not the case; the only 
options available are (1) environment variables and (2) aws 
configuration/shared credentials file, both of which are external and more or 
less static configuration mechanisms.



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] Taragolis commented on issue #29637: EcsRegisterTaskDefinitionOperator params are incorrectly invalidated

2023-02-20 Thread via GitHub


Taragolis commented on issue #29637:
URL: https://github.com/apache/airflow/issues/29637#issuecomment-1437093004

   Nevermind, I have a look on wrong parameters initially.
   Did you use specific values such as `render_template_as_native_obj=True` in 
DAG?


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] Taragolis commented on a diff in pull request #29635: YandexCloud provider: support Yandex SDK feature "endpoint"

2023-02-20 Thread via GitHub


Taragolis commented on code in PR #29635:
URL: https://github.com/apache/airflow/pull/29635#discussion_r1112016187


##
airflow/providers/yandex/hooks/yandex.py:
##
@@ -122,7 +127,8 @@ def __init__(
 self.connection = self.get_connection(self.connection_id)
 self.extras = self.connection.extra_dejson
 credentials = self._get_credentials()
-self.sdk = yandexcloud.SDK(user_agent=self.provider_user_agent(), 
**credentials)
+endpoint = self._get_field("endpoint", False)
+self.sdk = yandexcloud.SDK(user_agent=self.provider_user_agent(), 
endpoint=endpoint, **credentials)

Review Comment:
   We need to tests this.
   I guess in case of missing endpoint `self._get_field("endpoint", False)` 
will return `False`



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] ryw opened a new pull request, #29638: Add OSSRank badge

2023-02-20 Thread via GitHub


ryw opened a new pull request, #29638:
URL: https://github.com/apache/airflow/pull/29638

   Airflow is a top 2% project on OSSRank, and this badge shows Airflow's 
current rank in the world of open source. I hope you accept the PR :)
   
   ---
   **^ Add meaningful description above**
   
   Read the **[Pull Request 
Guidelines](https://github.com/apache/airflow/blob/main/CONTRIBUTING.rst#pull-request-guidelines)**
 for more information.
   In case of fundamental code changes, an Airflow Improvement Proposal 
([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvement+Proposals))
 is needed.
   In case of a new dependency, check compliance with the [ASF 3rd Party 
License Policy](https://www.apache.org/legal/resolved.html#category-x).
   In case of backwards incompatible changes please leave a note in a 
newsfragment file, named `{pr_number}.significant.rst` or 
`{issue_number}.significant.rst`, in 
[newsfragments](https://github.com/apache/airflow/tree/main/newsfragments).
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] francescomucio commented on pull request #29171: Added Snowflake provider to the Docker image

2023-02-20 Thread via GitHub


francescomucio commented on PR #29171:
URL: https://github.com/apache/airflow/pull/29171#issuecomment-1437125244

   Yes, I will try to do it today or tomorrow


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] tirkarthi commented on a diff in pull request #28256: Include full path to Python files under zip path while clearing import errors.

2023-02-20 Thread via GitHub


tirkarthi commented on code in PR #28256:
URL: https://github.com/apache/airflow/pull/28256#discussion_r1112035762


##
airflow/dag_processing/manager.py:
##
@@ -782,7 +782,11 @@ def clear_nonexistent_import_errors(file_paths: list[str] 
| None, session=NEW_SE
 """
 query = session.query(errors.ImportError)
 if file_paths:
-query = query.filter(~errors.ImportError.filename.in_(file_paths))
+for file_path in file_paths:
+if file_path.endswith(".zip"):
+query = 
query.filter(~(errors.ImportError.filename.startswith(file_path)))
+else:
+query = query.filter(errors.ImportError.filename != 
file_path)

Review Comment:
   The old query is updated with new condition since query variable is reused. 
It will just chain the queries and finally all conditions are joined by "and". 
Something like below answer.
   
   https://stackoverflow.com/a/3792292



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] janrito commented on issue #29637: EcsRegisterTaskDefinitionOperator params are incorrectly invalidated

2023-02-20 Thread via GitHub


janrito commented on issue #29637:
URL: https://github.com/apache/airflow/issues/29637#issuecomment-1437152518

   Ah! yes. I did. Why is that clashing?


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] vedantlodha opened a new pull request, #29639: Migrate breeze unit tests to pytest.

2023-02-20 Thread via GitHub


vedantlodha opened a new pull request, #29639:
URL: https://github.com/apache/airflow/pull/29639

   As a part of #29305, this PR removes a unittest.TestCase dependency from 
breeze package. The only remaining unittest dependency of this unittest.mock 
which is anyways used by pytest-mock and is not in plans for removal according 
to https://github.com/apache/airflow/issues/29305#issuecomment-1435711841
   
   
   
   
   
   ---
   **^ Add meaningful description above**
   
   Read the **[Pull Request 
Guidelines](https://github.com/apache/airflow/blob/main/CONTRIBUTING.rst#pull-request-guidelines)**
 for more information.
   In case of fundamental code changes, an Airflow Improvement Proposal 
([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvement+Proposals))
 is needed.
   In case of a new dependency, check compliance with the [ASF 3rd Party 
License Policy](https://www.apache.org/legal/resolved.html#category-x).
   In case of backwards incompatible changes please leave a note in a 
newsfragment file, named `{pr_number}.significant.rst` or 
`{issue_number}.significant.rst`, in 
[newsfragments](https://github.com/apache/airflow/tree/main/newsfragments).
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] vedantlodha commented on pull request #29639: Migrate breeze unit tests to pytest.

2023-02-20 Thread via GitHub


vedantlodha commented on PR #29639:
URL: https://github.com/apache/airflow/pull/29639#issuecomment-1437157482

   @Taragolis Since youre the author of the issue, can you take a look into the 
pr (which is barey 2 lines of change :)) whenever you have some time? Thanks


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] jose-lpa commented on issue #29432: Jinja templating doesn't work with container_resources when using dymanic task mapping with Kubernetes Pod Operator

2023-02-20 Thread via GitHub


jose-lpa commented on issue #29432:
URL: https://github.com/apache/airflow/issues/29432#issuecomment-1437182956

   I actually fixed my situation by simply using the [`Variable` 
model](https://airflow.apache.org/docs/apache-airflow/stable/core-concepts/variables.html)
 instead of trying to go with templated stuff.
   
   Example of my exact situation:
   
   **Failing:**
   ```python
   with DAG(...) as dag:
   
   # ...other tasks...
   
   calculate_statistics = KubernetesPodOperator.partial(
   config_file="/home/airflow/composer_kube_config",
   kubernetes_conn_id="kubernetes_default",
   namespace="default",
   task_id="calculate_statistics",
   name="calculate-statistics",
   image=(
   "eu.gcr.io/hummingbird-technologies/tasks/imagery-stats:{{ 
var.value.ENVIRONMENT }}"
   ),
   image_pull_policy="Always",
   env_vars=[
   V1EnvVar(name="INTERNAL_API_URL", value="{{ 
var.value.INTERNAL_API_URL }}",
   V1EnvVar(name="COLLECTION_NAME", value="{{ 
var.value.COLLECTION_NAME }}"),
   ],
   container_resources=V1ResourceRequirements(requests={"cpu": 1, 
"memory": "10Gi"}),
   startup_timeout_seconds=5 * 60,
   retries=0,
   )
   
   statistics = 
calculate_statistics.expand(arguments=XComArg(argument_builder))
   
   chain(acquire_data, statistics)
   ```
   
   **Working:**
   ```python
   from airflow.models import Variable
   
   
   with DAG(...) as dag:
   
   # ...other tasks...
   
   calculate_statistics = KubernetesPodOperator.partial(
   config_file="/home/airflow/composer_kube_config",
   kubernetes_conn_id="kubernetes_default",
   namespace="default",
   task_id="calculate_statistics",
   name="calculate-statistics",
   image=(
   "eu.gcr.io/hummingbird-technologies/tasks/imagery-stats:{{ 
var.value.ENVIRONMENT }}"
   ),
   image_pull_policy="Always",
   env_vars=[
   V1EnvVar(name="INTERNAL_API_URL", 
value=Variable.get("INTERNAL_API_URL")),
   V1EnvVar(name="COLLECTION_NAME", 
value=Variable.get("COLLECTION_NAME")),
   ],
   container_resources=V1ResourceRequirements(requests={"cpu": 1, 
"memory": "10Gi"}),
   startup_timeout_seconds=5 * 60,
   retries=0,
   )
   
   statistics = 
calculate_statistics.expand(arguments=XComArg(argument_builder))
   
   chain(acquire_data, statistics)
   ```
   
   @vasu2809 maybe this can help you too...


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] antonio-antuan opened a new issue, #29640: NoBoundaryInMultipartDefect raised using S3Hook

2023-02-20 Thread via GitHub


antonio-antuan opened a new issue, #29640:
URL: https://github.com/apache/airflow/issues/29640

   ### Apache Airflow Provider(s)
   
   amazon
   
   ### Versions of Apache Airflow Providers
   
   apache-airflow-providers-amazon==7.2.0
   
   ### Apache Airflow version
   
   2.4.3
   
   ### Operating System
   
   Arch Linux
   
   ### Deployment
   
   Docker-Compose
   
   ### Deployment details
   
   the same for MWAA (aws-managed airflow)
   
   ### What happened
   
   exception is raised:
   ```
   [2023-02-20, 14:32:02 UTC] {subprocess.py:92} INFO - [2023-02-20, 
14:32:02 UTC] {connectionpool.py:475} WARNING - Failed to 
parse headers 
url=[https://BUCKET.s3.us-west-2.amazonaws.com:443/object-key.json:[NoBoundaryInMultipartDefect()],
 unparsed data: ''
   [2023-02-20, 14:32:02 UTC] {subprocess.py:92} INFO - Traceback (most recent 
call last):
   [2023-02-20, 14:32:02 UTC] {subprocess.py:92} INFO -   File 
"/home/***/.local/lib/python3.7/site-packages/urllib3/connectionpool.py", line 
469, in _make_request
   [2023-02-20, 14:32:02 UTC] {subprocess.py:92} INFO - 
assert_header_parsing(httplib_response.msg)
   [2023-02-20, 14:32:02 UTC] {subprocess.py:92} INFO -   File 
"/home/***/.local/lib/python3.7/site-packages/urllib3/util/response.py", line 
91, in assert_header_parsing
   [2023-02-20, 14:32:02 UTC] {subprocess.py:92} INFO - raise 
HeaderParsingError(defects=defects, unparsed_data=unparsed_data)
   [2023-02-20, 14:32:02 UTC] {subprocess.py:92} INFO - 
urllib3.exceptions.HeaderParsingError: [NoBoundaryInMultipartDefect()], 
unparsed data: ''
   ```
   
   ### What you think should happen instead
   
   shouldn't be such an exception :)
   
   ### How to reproduce
   
   the code that downloads data is simple:
   ```
   
   def download_from_s3(key: str, bucket_name: str, local_path: str) -> str:
   boto3.set_stream_logger('boto3.resources', logging.DEBUG)
   hook = S3Hook(aws_conn_id='s3_conn')
   file_name = hook.download_file(key=key, bucket_name=bucket_name, 
preserve_file_name=True)
   return file_name
   
   ```
   
   ### Anything else
   
   anyway, file is downldaed and looks valid.
   
   some logs:
   ```
   [2023-02-20, 15:18:38 UTC] {connection_wrapper.py:337} 
INFO - AWS Connection (conn_id='s3_conn', conn_type='aws') credentials 
retrieved from login and password.
   2023-02-20, 15:18:38 UTC boto3.resources.factory [DEBUG] Loading s3:s3
   [2023-02-20, 15:18:38 UTC] {factory.py:66} DEBUG - 
Loading s3:s3
   2023-02-20, 15:18:38 UTC boto3.resources.factory [DEBUG] Loading s3:Object
   [2023-02-20, 15:18:38 UTC] {factory.py:66} DEBUG - 
Loading s3:Object
   2023-02-20, 15:18:38 UTC boto3.resources.action [DEBUG] Calling 
s3:head_object with {'Bucket': 'BUCKET', 'Key': 'object_key.json'}
   [2023-02-20, 15:18:38 UTC] {action.py:85} DEBUG - 
Calling s3:head_object with {'Bucket': 'BUCKET', 'Key': 'object_key.json'}
   [2023-02-20, 15:18:40 UTC] {connectionpool.py:475} 
WARNING - Failed to parse headers 
(url=https://BUCKET.s3.us-west-2.amazonaws.com:443/object_key.json): 
[NoBoundaryInMultipartDefect()], unparsed data: ''
   Traceback (most recent call last):
 File 
"/home/***/.local/lib/python3.7/site-packages/urllib3/connectionpool.py", line 
469, in _make_request
   assert_header_parsing(httplib_response.msg)
 File 
"/home/***/.local/lib/python3.7/site-packages/urllib3/util/response.py", line 
91, in assert_header_parsing
   raise HeaderParsingError(defects=defects, unparsed_data=unparsed_data)
   urllib3.exceptions.HeaderParsingError: [NoBoundaryInMultipartDefect()], 
unparsed data: ''
   2023-02-20, 15:18:40 UTC boto3.resources.action [DEBUG] Response: 
{'ResponseMetadata': {'RequestId': 'W3J4VRW3WQVV8AV7', 'HostId': 
'uRLn/mC6mUAPtgAZRcPbdIlkzWNQ8/AKuPn5HuHjJK1CLNAxfES3DXQsnF7HYSia4guuylFLItY=', 
'HTTPStatusCode': 200, 'HTTPHeaders': {'x-amz-id-2': 
'uRLn/mC6mUAPtgAZRcPbdIlkzWNQ8/AKuPn5HuHjJK1CLNAxfES3DXQsnF7HYSia4guuylFLItY=', 
'x-amz-request-id': 'W3J4VRW3WQVV8AV7', 'date': 'Mon, 20 Feb 2023 15:18:40 
GMT', 'last-modified': 'Thu, 09 Feb 2023 10:34:28 GMT', 'etag': 
'"e7d2a315e24716624b1085cfa7f31ad8"', 'x-amz-server-side-encryption': 'AES256', 
'accept-ranges': 'bytes', 'content-type': 'multipart/form-data', 'server': 
'AmazonS3', 'content-length': '7004'}, 'RetryAttempts': 0}, 'AcceptRanges': 
'bytes', 'LastModified': datetime.datetime(2023, 2, 9, 10, 34, 28, 
tzinfo=tzutc()), 'ContentLength': 7004, 'ETag': 
'"e7d2a315e24716624b1085cfa7f31ad8"', 'ContentType': 'multipart/form-data', 
'ServerSideEncryption': 'AES256', 'Metadata': {}}
   [2023-02-20, 15:18:40 UTC] {action.py:90} DEBUG - 
Response: {'ResponseMetadata': {'RequestId': 'W3J4VRW3WQVV8AV7', 'HostId': 
'uRLn/mC6mUAPtgAZRcPbdIlkzWNQ8/AKuPn5HuHjJK1CLNAxfES3DXQsnF7HYSia4guuylFLItY=', 
'HTTPStatusCode': 

[GitHub] [airflow] boring-cyborg[bot] commented on issue #29640: NoBoundaryInMultipartDefect raised using S3Hook

2023-02-20 Thread via GitHub


boring-cyborg[bot] commented on issue #29640:
URL: https://github.com/apache/airflow/issues/29640#issuecomment-1437187553

   Thanks for opening your first issue here! Be sure to follow the issue 
template!
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] s0neq commented on a diff in pull request #29635: YandexCloud provider: support Yandex SDK feature "endpoint"

2023-02-20 Thread via GitHub


s0neq commented on code in PR #29635:
URL: https://github.com/apache/airflow/pull/29635#discussion_r1112105457


##
airflow/providers/yandex/hooks/yandex.py:
##
@@ -122,7 +127,8 @@ def __init__(
 self.connection = self.get_connection(self.connection_id)
 self.extras = self.connection.extra_dejson
 credentials = self._get_credentials()
-self.sdk = yandexcloud.SDK(user_agent=self.provider_user_agent(), 
**credentials)
+endpoint = self._get_field("endpoint", False)
+self.sdk = yandexcloud.SDK(user_agent=self.provider_user_agent(), 
endpoint=endpoint, **credentials)

Review Comment:
   thanks for the comment! 
   False is OK because here we initialize yandexcloud.SDK that sets default API 
endpoint in case it's missing:
   
https://github.com/yandex-cloud/python-sdk/blob/master/yandexcloud/_channels.py#L24
   
   we didn't set endpoint until now, so in these tests no endpoint is passed 
into hooks  
https://github.com/apache/airflow/blob/main/tests/system/providers/yandex/example_yandexcloud.py
   



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] v-hunt commented on issue #13311: pendulum.tz.zoneinfo.exceptions.InvalidTimezone

2023-02-20 Thread via GitHub


v-hunt commented on issue #13311:
URL: https://github.com/apache/airflow/issues/13311#issuecomment-1437224379

   @EKami hi, did you resolve this issue?


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] eladkal closed issue #29557: Execution Timeout is not working properly on airflow 2.5.0

2023-02-20 Thread via GitHub


eladkal closed issue #29557: Execution Timeout is not working properly on 
airflow 2.5.0
URL: https://github.com/apache/airflow/issues/29557


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] ashb commented on a diff in pull request #29623: Implement file credentials provider for AWS hook AssumeRoleWithWebIdentity

2023-02-20 Thread via GitHub


ashb commented on code in PR #29623:
URL: https://github.com/apache/airflow/pull/29623#discussion_r1112131738


##
airflow/providers/amazon/aws/hooks/base_aws.py:
##
@@ -312,19 +312,35 @@ def _get_web_identity_credential_fetcher(
 base_session = self.basic_session._session or 
botocore.session.get_session()
 client_creator = base_session.create_client
 federation = 
self.extra_config.get("assume_role_with_web_identity_federation")
-if federation == "google":
-web_identity_token_loader = 
self._get_google_identity_token_loader()
-else:
-raise AirflowException(
-f'Unsupported federation: {federation}. Currently "google" 
only are supported.'
-)
+
+web_identity_token_loader = (
+{
+"file": self._get_file_token_loader,
+"google": self._get_google_identity_token_loader,
+}.get(federation)()
+if type(federation) == str
+else None
+)

Review Comment:
   Makes sense to me. WDYT @Taragolis ?



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] ashb commented on a diff in pull request #29623: Implement file credentials provider for AWS hook AssumeRoleWithWebIdentity

2023-02-20 Thread via GitHub


ashb commented on code in PR #29623:
URL: https://github.com/apache/airflow/pull/29623#discussion_r1112132611


##
airflow/providers/amazon/aws/hooks/base_aws.py:
##
@@ -311,20 +311,32 @@ def _get_web_identity_credential_fetcher(
 ) -> botocore.credentials.AssumeRoleWithWebIdentityCredentialFetcher:
 base_session = self.basic_session._session or 
botocore.session.get_session()
 client_creator = base_session.create_client
-federation = 
self.extra_config.get("assume_role_with_web_identity_federation")
-if federation == "google":
-web_identity_token_loader = 
self._get_google_identity_token_loader()
-else:
-raise AirflowException(
-f'Unsupported federation: {federation}. Currently "google" 
only are supported.'
-)
+federation = 
str(self.extra_config.get("assume_role_with_web_identity_federation"))
+
+web_identity_token_loader = {
+"file": self._get_file_token_loader,
+"google": self._get_google_identity_token_loader,
+}.get(federation, lambda: None)()
+
+if not web_identity_token_loader:

Review Comment:
   Since you have a default of `lambda: None` this is never going to be hit.



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] potiuk commented on pull request #28953: Updated Telegram Provider to ensure compatbility with >=20.0.0

2023-02-20 Thread via GitHub


potiuk commented on PR #28953:
URL: https://github.com/apache/airflow/pull/28953#issuecomment-1437252268

   Static checks/docs need fixing 


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] pgagnon commented on a diff in pull request #29623: Implement file credentials provider for AWS hook AssumeRoleWithWebIdentity

2023-02-20 Thread via GitHub


pgagnon commented on code in PR #29623:
URL: https://github.com/apache/airflow/pull/29623#discussion_r1112140747


##
airflow/providers/amazon/aws/hooks/base_aws.py:
##
@@ -311,20 +311,32 @@ def _get_web_identity_credential_fetcher(
 ) -> botocore.credentials.AssumeRoleWithWebIdentityCredentialFetcher:
 base_session = self.basic_session._session or 
botocore.session.get_session()
 client_creator = base_session.create_client
-federation = 
self.extra_config.get("assume_role_with_web_identity_federation")
-if federation == "google":
-web_identity_token_loader = 
self._get_google_identity_token_loader()
-else:
-raise AirflowException(
-f'Unsupported federation: {federation}. Currently "google" 
only are supported.'
-)
+federation = 
str(self.extra_config.get("assume_role_with_web_identity_federation"))
+
+web_identity_token_loader = {
+"file": self._get_file_token_loader,
+"google": self._get_google_identity_token_loader,
+}.get(federation, lambda: None)()
+
+if not web_identity_token_loader:

Review Comment:
   It will because the callable is called right before assignment:
   
   ```
   $ python
   Python 3.9.7 (default, Dec  5 2021, 13:21:59) 
   [Clang 12.0.5 (clang-1205.0.22.9)] on darwin
   Type "help", "copyright", "credits" or "license" for more information.
   >>> def _get_file_token_loader():
   ... return "file"
   ... 
   >>> def _get_google_identity_token_loader():
   ... return "google"
   ... 
   >>> federation = "test"
   >>> 
   >>> web_identity_token_loader = {
   ... "file": _get_file_token_loader,
   ... "google": _get_google_identity_token_loader,
   ... }.get(federation, lambda: None)()
   >>> 
   >>> print("hit" if not web_identity_token_loader else "not hit")
   hit
   ```



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] potiuk commented on pull request #29624: Can't configure Kubernetes and Celery workers in Helm Chart

2023-02-20 Thread via GitHub


potiuk commented on PR #29624:
URL: https://github.com/apache/airflow/pull/29624#issuecomment-1437263567

   I think the idea in the #28800 was to be able to specify some parameters in 
the Helm Chart Values in a similar way for K8S as it si for celery. What it 
would likely mean - is merging the parameters specified in the values.yaml into 
the pod_template_file (providing defaults) and making it possible to run K8S 
executor even without a pod template file.
   
   The pod-template file is nice and super flexible, but as I imagine this one 
- it would be a nice way to configure most of the POD parameters in a very 
similar way for Celery and K8S pods, which seems to me like a good idea. It is 
kind of strange (byt justified in most complex cases) to have the parameters 
specified in pod_template_file. but having a static values (defaults?) 
specified in  Helm chart would make it much easier to configure the 
installation for simple cases where pod template file flexibility is not really 
needed. 
   
   @jedcunningham @dimberman @dstandish - WDYT about such a feature?
   
   This was - I think the original idea behind this feature (at least this is 
how I understood it). A


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] potiuk merged pull request #29638: Add OSSRank badge

2023-02-20 Thread via GitHub


potiuk merged PR #29638:
URL: https://github.com/apache/airflow/pull/29638


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[airflow] branch main updated: Add OSSRank badge (#29638)

2023-02-20 Thread potiuk
This is an automated email from the ASF dual-hosted git repository.

potiuk pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/airflow.git


The following commit(s) were added to refs/heads/main by this push:
 new 497f6feb06 Add OSSRank badge (#29638)
497f6feb06 is described below

commit 497f6feb06996bbe90870f9f62968ae501f7f81a
Author: Ry Walker <4283+...@users.noreply.github.com>
AuthorDate: Mon Feb 20 11:18:30 2023 -0500

Add OSSRank badge (#29638)
---
 README.md | 1 +
 1 file changed, 1 insertion(+)

diff --git a/README.md b/README.md
index f50eb89185..3e65ba5f3f 100644
--- a/README.md
+++ b/README.md
@@ -32,6 +32,7 @@
 [![Twitter 
Follow](https://img.shields.io/twitter/follow/ApacheAirflow.svg?style=social&label=Follow)](https://twitter.com/ApacheAirflow)
 [![Slack 
Status](https://img.shields.io/badge/slack-join_chat-white.svg?logo=slack&style=social)](https://s.apache.org/airflow-slack)
 
[![Contributors](https://img.shields.io/github/contributors/apache/airflow)](https://github.com/apache/airflow/graphs/contributors)
+[![OSSRank](https://shields.io/endpoint?url=https://ossrank.com/shield/6)](https://ossrank.com/p/6)
 
 [Apache Airflow](https://airflow.apache.org/docs/apache-airflow/stable/) (or 
simply Airflow) is a platform to programmatically author, schedule, and monitor 
workflows.
 



[GitHub] [airflow] potiuk merged pull request #29639: Migrate breeze unit tests to pytest.

2023-02-20 Thread via GitHub


potiuk merged PR #29639:
URL: https://github.com/apache/airflow/pull/29639


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[airflow] branch main updated (497f6feb06 -> d721701e14)

2023-02-20 Thread potiuk
This is an automated email from the ASF dual-hosted git repository.

potiuk pushed a change to branch main
in repository https://gitbox.apache.org/repos/asf/airflow.git


from 497f6feb06 Add OSSRank badge (#29638)
 add d721701e14 Migrate breeze unit tests to pytest. (#29639)

No new revisions were added by this update.

Summary of changes:
 dev/breeze/tests/test_run_utils.py | 3 +--
 1 file changed, 1 insertion(+), 2 deletions(-)



[GitHub] [airflow] potiuk closed issue #29305: Migrate remaining tests to `pytest`

2023-02-20 Thread via GitHub


potiuk closed issue #29305: Migrate remaining tests to `pytest`
URL: https://github.com/apache/airflow/issues/29305


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] potiuk commented on pull request #29639: Migrate breeze unit tests to pytest.

2023-02-20 Thread via GitHub


potiuk commented on PR #29639:
URL: https://github.com/apache/airflow/pull/29639#issuecomment-1437270397

   That's it? Are we done :)? 


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] Taragolis commented on a diff in pull request #29635: YandexCloud provider: support Yandex SDK feature "endpoint"

2023-02-20 Thread via GitHub


Taragolis commented on code in PR #29635:
URL: https://github.com/apache/airflow/pull/29635#discussion_r1112150631


##
airflow/providers/yandex/hooks/yandex.py:
##
@@ -122,7 +127,8 @@ def __init__(
 self.connection = self.get_connection(self.connection_id)
 self.extras = self.connection.extra_dejson
 credentials = self._get_credentials()
-self.sdk = yandexcloud.SDK(user_agent=self.provider_user_agent(), 
**credentials)
+endpoint = self._get_field("endpoint", False)
+self.sdk = yandexcloud.SDK(user_agent=self.provider_user_agent(), 
endpoint=endpoint, **credentials)

Review Comment:
   But it not missing in our case, we would provide `False` in this case and it 
would evaluate to this wrong value
   
   ```python
   kwargs_no_endpoint = {"foo": "bar"}
   print(kwargs_no_endpoint.get("endpoint", "api.cloud.yandex.net"))
   
   kwargs_none_endpoint = {"foo": "bar", "endpoint": None}
   print(kwargs_none_endpoint.get("endpoint", "api.cloud.yandex.net"))
   
   kwargs_false_endpoint = {"foo": "bar", "endpoint": False}
   print(kwargs_false_endpoint.get("endpoint", "api.cloud.yandex.net"))
   ```
   



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] vedantlodha commented on pull request #29639: Migrate breeze unit tests to pytest.

2023-02-20 Thread via GitHub


vedantlodha commented on PR #29639:
URL: https://github.com/apache/airflow/pull/29639#issuecomment-1437272462

   @potiuk  Sorry! I see the issue was closed, but there are some tests in 
google providers that still use unittest. Working on a pr for them. 


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] Taragolis commented on a diff in pull request #29635: YandexCloud provider: support Yandex SDK feature "endpoint"

2023-02-20 Thread via GitHub


Taragolis commented on code in PR #29635:
URL: https://github.com/apache/airflow/pull/29635#discussion_r1112151762


##
airflow/providers/yandex/hooks/yandex.py:
##
@@ -122,7 +127,8 @@ def __init__(
 self.connection = self.get_connection(self.connection_id)
 self.extras = self.connection.extra_dejson
 credentials = self._get_credentials()
-self.sdk = yandexcloud.SDK(user_agent=self.provider_user_agent(), 
**credentials)
+endpoint = self._get_field("endpoint", False)
+self.sdk = yandexcloud.SDK(user_agent=self.provider_user_agent(), 
endpoint=endpoint, **credentials)

Review Comment:
   I guess potentially in other places `self._get_field("foo-bar", 
False)`produced incorrect behaviour of Yandex Cloud SDK



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] potiuk commented on pull request #29616: Refactor docker-compose quick start test

2023-02-20 Thread via GitHub


potiuk commented on PR #29616:
URL: https://github.com/apache/airflow/pull/29616#issuecomment-1437277562

   >  The more you look on error the less chance that this error will reproduce 
roll_eyes rofl
   
   Heisentest
   
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] uranusjr commented on pull request #29608: Enable passing --xcom-args to tasks test CLI command

2023-02-20 Thread via GitHub


uranusjr commented on PR #29608:
URL: https://github.com/apache/airflow/pull/29608#issuecomment-1437292418

   Using JSON for this feels cumbersome to me. I don’t have a much better idea 
though. Also we need to find a way to support custom backends that may not 
store values as JSON.


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] potiuk commented on pull request #29616: Refactor docker-compose quick start test

2023-02-20 Thread via GitHub


potiuk commented on PR #29616:
URL: https://github.com/apache/airflow/pull/29616#issuecomment-1437294393

   BTW. This one failed. And the problem is that `wait_for_terminal_state" 
simply exits after 80 tries when the task is "queued" . But why ? Hard to say. 
I have a feeling we have a race condition somewhere in scheduler (and likely 
it's a "real" one affecting tasks converted to "queued" right in the middle of 
some scheduler processing.


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] potiuk commented on pull request #29639: Migrate breeze unit tests to pytest.

2023-02-20 Thread via GitHub


potiuk commented on PR #29639:
URL: https://github.com/apache/airflow/pull/29639#issuecomment-1437295426

   Reopened :)


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] hussein-awala commented on pull request #29624: Can't configure Kubernetes and Celery workers in Helm Chart

2023-02-20 Thread via GitHub


hussein-awala commented on PR #29624:
URL: https://github.com/apache/airflow/pull/29624#issuecomment-1437300580

   @amoghrajesh or @potiuk can you please update the description and link the 
PR to the issue?
   
   I think we can add a new value `k8s_pod_template` and use it when the 
executor is `CeleryKubernetesExecutor`:
   - if the executor is `CeleryExecutor`: load conf from `.Values.workers`
   - if the executor is `CeleryKubernetesExecutor`: load celery worker conf 
from `.Values.workers` and pod template conf from `.Values.k8s_pod_template` if 
it exists, and if not we load them from `.Values.workers` or `.Values` as we do 
for now
   - if the executor is `KubernetesExecutor`: we load pod template conf from 
`.Values.k8s_pod_template` if exists, and if not we load them from 
`.Values.workers`  or `.Values`
   
   I believe some of the `CeleryKubernetesExecutor` users use the same 
configuration for celery and k8s, if we separate the configurations in two 
sections, they will need to duplicate them. With my suggestion, they can keep 
them identical, or override them/some of them in the section k8s_pod_template


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] ephraimbuddy commented on a diff in pull request #28256: Include full path to Python files under zip path while clearing import errors.

2023-02-20 Thread via GitHub


ephraimbuddy commented on code in PR #28256:
URL: https://github.com/apache/airflow/pull/28256#discussion_r1112172817


##
airflow/dag_processing/manager.py:
##
@@ -782,7 +782,11 @@ def clear_nonexistent_import_errors(file_paths: list[str] 
| None, session=NEW_SE
 """
 query = session.query(errors.ImportError)
 if file_paths:
-query = query.filter(~errors.ImportError.filename.in_(file_paths))
+for file_path in file_paths:
+if file_path.endswith(".zip"):
+query = 
query.filter(~(errors.ImportError.filename.startswith(file_path)))
+else:
+query = query.filter(errors.ImportError.filename != 
file_path)

Review Comment:
   It's not same with the one you linked. What we have here is equivalent to:
   ```python
   for i in range(3):
   x = i
   ```
   at the end, the value of x is 2



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] ryw commented on pull request #29638: Add OSSRank badge

2023-02-20 Thread via GitHub


ryw commented on PR #29638:
URL: https://github.com/apache/airflow/pull/29638#issuecomment-1437303211

   Thanks :)


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] ephraimbuddy commented on a diff in pull request #28256: Include full path to Python files under zip path while clearing import errors.

2023-02-20 Thread via GitHub


ephraimbuddy commented on code in PR #28256:
URL: https://github.com/apache/airflow/pull/28256#discussion_r1112176062


##
airflow/dag_processing/manager.py:
##
@@ -782,7 +782,11 @@ def clear_nonexistent_import_errors(file_paths: list[str] 
| None, session=NEW_SE
 """
 query = session.query(errors.ImportError)
 if file_paths:
-query = query.filter(~errors.ImportError.filename.in_(file_paths))
+for file_path in file_paths:
+if file_path.endswith(".zip"):
+query = 
query.filter(~(errors.ImportError.filename.startswith(file_path)))
+else:
+query = query.filter(errors.ImportError.filename != 
file_path)

Review Comment:
   Oops Got it



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] potiuk commented on pull request #29624: Can't configure Kubernetes and Celery workers in Helm Chart

2023-02-20 Thread via GitHub


potiuk commented on PR #29624:
URL: https://github.com/apache/airflow/pull/29624#issuecomment-1437306653

   
   > I believe some of the `CeleryKubernetesExecutor` users use the same 
configuration for celery and k8s, if we separate the configurations in two 
sections, they will need to duplicate them. With my suggestion, they can keep 
them identical, or override them/some of them in the section k8s_pod_template
   
   Hmm. I think the whole reason to have CeleryK8S executor was to make them 
"different" - I think characteristics (memory, CPU etc. ) of "always running" 
celery worker Pod - which usually has N Python interpreter started (celery 
parallelism) is pretty much always different than K8S Pod (which always runs 1 
task). I think it would be rather counter-productive to commonalize those 
settings 
   
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] potiuk commented on pull request #29624: Can't configure Kubernetes and Celery workers in Helm Chart

2023-02-20 Thread via GitHub


potiuk commented on PR #29624:
URL: https://github.com/apache/airflow/pull/29624#issuecomment-1437309061

   (BTW. @amoghrajesh - adding `Closes #PR` inside comment does not work :)


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



  1   2   3   >