[GitHub] [airflow] potiuk commented on a diff in pull request #28502: Migrate DagFileProcessor.manage_slas to Internal API
potiuk commented on code in PR #28502: URL: https://github.com/apache/airflow/pull/28502#discussion_r1082897117 ## airflow/dag_processing/processor.py: ## @@ -365,8 +365,10 @@ def __init__(self, dag_ids: list[str] | None, dag_directory: str, log: logging.L self._dag_directory = dag_directory self.dag_warnings: set[tuple[str, str]] = set() +@staticmethod +@internal_api_call @provide_session -def manage_slas(self, dag: DAG, session: Session = None) -> None: +def manage_slas(dag_folder, dag_id: str, log: logging.Logger, session: Session = NEW_SESSION) -> None: Review Comment: Not nice bot.. Closed my PR :D -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org For queries about this service, please contact Infrastructure at: us...@infra.apache.org
[GitHub] [airflow] mattppal commented on pull request #28592: Guard not-yet-expanded ti in trigger rule dep
mattppal commented on PR #28592: URL: https://github.com/apache/airflow/pull/28592#issuecomment-1398744685 Hi all, I'm seeing the same issue on `rc2` when passing an XComArg object to the mapped task. Lists appear to work as expected.
[GitHub] [airflow] vincbeck commented on a diff in pull request #28502: Migrate DagFileProcessor.manage_slas to Internal API
vincbeck commented on code in PR #28502: URL: https://github.com/apache/airflow/pull/28502#discussion_r1082893036 ## airflow/dag_processing/processor.py: ## @@ -365,8 +365,10 @@ def __init__(self, dag_ids: list[str] | None, dag_directory: str, log: logging.L self._dag_directory = dag_directory self.dag_warnings: set[tuple[str, str]] = set() +@staticmethod +@internal_api_call @provide_session -def manage_slas(self, dag: DAG, session: Session = None) -> None: +def manage_slas(dag_folder, dag_id: str, log: logging.Logger, session: Session = NEW_SESSION) -> None: Review Comment: Thanks! I'll update this PR accordingly
[GitHub] [airflow] potiuk commented on pull request #29064: Capitalize dag to DAG
potiuk commented on PR #29064: URL: https://github.com/apache/airflow/pull/29064#issuecomment-1398738922 I cannot figure out why 'dag' is not flagged by our spellchecker. It looks like it should, and I tried to experiment a bit but I really cannot find why it has been allowed in the first place (capitalized acronyms are excluded from spellcheck). Weird.
[GitHub] [airflow] Taragolis merged pull request #29063: Migrate Helm tests to `pytest`
Taragolis merged PR #29063: URL: https://github.com/apache/airflow/pull/29063
[GitHub] [airflow] Taragolis commented on pull request #29035: Renaming nose compatible methods in flavour of regular pytest naming
Taragolis commented on PR #29035: URL: https://github.com/apache/airflow/pull/29035#issuecomment-1398705272 > > @potiuk, is it the right place? > > Yep. But you can also add "full tests needed" label on a PR to run a complete set of tests. And it's very easy to forget to add this label even if the author of the PR has access to do that. `¯\_(ツ)_/¯` This file does not change frequently. Or we could move the content from pytest.ini to [pyproject.toml](https://github.com/apache/airflow/blob/main/pyproject.toml)
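The suggested pytest.ini to pyproject.toml move would look roughly like this (the option values below are illustrative only, not Airflow's actual settings; pytest reads the `[tool.pytest.ini_options]` table from pyproject.toml):

```toml
# Hypothetical translation of pytest.ini settings into pyproject.toml.
[tool.pytest.ini_options]
addopts = "-rasl --verbosity=2"
norecursedirs = [".eggs"]
testpaths = ["tests"]
```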
[GitHub] [airflow] jedcunningham commented on pull request #28808: Allow setting the name for the base container within K8s Pod Operator
jedcunningham commented on PR #28808: URL: https://github.com/apache/airflow/pull/28808#issuecomment-1398700496 Looking at this with fresh eyes, this works? ``` kpo_task = KubernetesPodOperator(...) kpo_task.BASE_CONTAINER_NAME = "whatever" ``` Is it worth adding this to init for an edge case?
[GitHub] [airflow] potiuk commented on pull request #29063: Migrate Helm tests to `pytest`
potiuk commented on PR #29063: URL: https://github.com/apache/airflow/pull/29063#issuecomment-1398695049 (pending tests succeed ofc)
[GitHub] [airflow] bbovenzi opened a new pull request, #29066: Check for run_id url param when linking to graph/gantt views
bbovenzi opened a new pull request, #29066: URL: https://github.com/apache/airflow/pull/29066 Some parts of the app linked over to the graph and gantt views with a `run_id=` param, but we weren't actually using that param to decide which run to show for a user. Now, we will first check for `run_id`, and only use `execution_date` if `run_id` isn't specified. Fixes: https://github.com/apache/airflow/issues/28155 --- **^ Add meaningful description above** Read the **[Pull Request Guidelines](https://github.com/apache/airflow/blob/main/CONTRIBUTING.rst#pull-request-guidelines)** for more information. In case of fundamental code changes, an Airflow Improvement Proposal ([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvement+Proposals)) is needed. In case of a new dependency, check compliance with the [ASF 3rd Party License Policy](https://www.apache.org/legal/resolved.html#category-x). In case of backwards incompatible changes please leave a note in a newsfragment file, named `{pr_number}.significant.rst` or `{issue_number}.significant.rst`, in [newsfragments](https://github.com/apache/airflow/tree/main/newsfragments).
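The run-selection logic described above can be sketched as follows (hypothetical helper name and data shape for illustration; Airflow's actual view code differs):

```python
from __future__ import annotations


def pick_dag_run(
    runs: list[dict],
    run_id: str | None,
    execution_date: str | None,
) -> dict | None:
    """Prefer an explicit run_id; fall back to execution_date only when
    run_id is absent, mirroring the behavior this PR describes."""
    if run_id:
        for run in runs:
            if run["run_id"] == run_id:
                return run
    if execution_date:
        for run in runs:
            if run["execution_date"] == execution_date:
                return run
    return None
```

Note why `run_id` must win: a scheduled run and a manual run can share the same `execution_date`, so selecting by date alone is ambiguous.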
[GitHub] [airflow] potiuk commented on a diff in pull request #28846: Updated app to support configuring the caching hash method for FIPS
potiuk commented on code in PR #28846: URL: https://github.com/apache/airflow/pull/28846#discussion_r1082853296 ## airflow/models/serialized_dag.py: ## @@ -102,7 +102,7 @@ def __init__(self, dag: DAG, processor_subdir: str | None = None): dag_data = SerializedDAG.to_dict(dag) dag_data_json = json.dumps(dag_data, sort_keys=True).encode("utf-8") -self.dag_hash = hashlib.md5(dag_data_json).hexdigest() +self.dag_hash = hashlib.new("md5", data=dag_data_json, usedforsecurity=False).hexdigest() Review Comment: Look for PY38 for example
[GitHub] [airflow] boring-cyborg[bot] commented on pull request #29065: When clearing task instances try to get associated DAGs from database
boring-cyborg[bot] commented on PR #29065: URL: https://github.com/apache/airflow/pull/29065#issuecomment-1398692028 Congratulations on your first Pull Request and welcome to the Apache Airflow community! If you have any issues or are unsure about anything please check our Contribution Guide (https://github.com/apache/airflow/blob/main/CONTRIBUTING.rst) Here are some useful points:
- Pay attention to the quality of your code (ruff, mypy and type annotations). Our [pre-commits](https://github.com/apache/airflow/blob/main/STATIC_CODE_CHECKS.rst#prerequisites-for-pre-commit-hooks) will help you with that.
- In case of a new feature add useful documentation (in docstrings or in the `docs/` directory). Adding a new operator? Check this short [guide](https://github.com/apache/airflow/blob/main/docs/apache-airflow/howto/custom-operator.rst). Consider adding an example DAG that shows how users should use it.
- Consider using the [Breeze environment](https://github.com/apache/airflow/blob/main/BREEZE.rst) for testing locally; it's a heavy Docker setup but it ships with a working Airflow and a lot of integrations.
- Be patient and persistent. It might take some time to get a review or get the final approval from Committers.
- Please follow the [ASF Code of Conduct](https://www.apache.org/foundation/policies/conduct) for all communication including (but not limited to) comments on Pull Requests, the Mailing list and Slack.
- Be sure to read the [Airflow Coding style](https://github.com/apache/airflow/blob/main/CONTRIBUTING.rst#coding-style-and-best-practices).

Apache Airflow is a community-driven project and together we are making it better 🚀. In case of doubts contact the developers at: Mailing List: d...@airflow.apache.org Slack: https://s.apache.org/airflow-slack
[GitHub] [airflow] sean-rose opened a new pull request, #29065: When clearing task instances try to get associated DAGs from database
sean-rose opened a new pull request, #29065: URL: https://github.com/apache/airflow/pull/29065 This fixes problems when recursively clearing task instances across multiple DAGs:
* Task instances in downstream DAGs weren't having their `max_tries` property incremented, which could cause downstream external task sensors in reschedule mode to instantly time out.
* Task instances in downstream DAGs could have some of their properties overridden by an unrelated task in the upstream DAG if they had the same task ID.

closes: #29049
[GitHub] [airflow] Taragolis opened a new pull request, #29063: Migrate Helm tests to `pytest`
Taragolis opened a new pull request, #29063: URL: https://github.com/apache/airflow/pull/29063 Migrate `kubernetes_tests` to pytest. Also resolve an issue found during this PR: the tests use a combination of `unittests`/`nose`/`pytest`. - https://github.com/apache/airflow/pull/29035#issuecomment-1398119466
[GitHub] [airflow] BasPH opened a new pull request, #29064: Capitalize dag to DAG
BasPH opened a new pull request, #29064: URL: https://github.com/apache/airflow/pull/29064 DAG is an abbreviation (Directed Acyclic Graph) and should therefore be written capitalized. This PR changes "dags" to "DAGs" wherever it's written for humans (i.e. not code, hyperlinks, etc.).
[GitHub] [airflow] shyft-mike commented on a diff in pull request #28846: Updated app to support configuring the caching hash method for FIPS
shyft-mike commented on code in PR #28846: URL: https://github.com/apache/airflow/pull/28846#discussion_r1082843374 ## airflow/models/serialized_dag.py: ## @@ -102,7 +102,7 @@ def __init__(self, dag: DAG, processor_subdir: str | None = None): dag_data = SerializedDAG.to_dict(dag) dag_data_json = json.dumps(dag_data, sort_keys=True).encode("utf-8") -self.dag_hash = hashlib.md5(dag_data_json).hexdigest() +self.dag_hash = hashlib.new("md5", data=dag_data_json, usedforsecurity=False).hexdigest() Review Comment: Ah okay. I went with "new" since it used kwargs, so passing usedforsecurity would either work or just get ignored. But I see the precommit linting fails on that being unexpected. Are there any examples currently of where it makes decisions based on the python version?
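A version-gated variant of the hash call might look like this (a sketch, assuming Python 3.8 support is the constraint; the `usedforsecurity` keyword was only added to the hashlib constructors in Python 3.9):

```python
import hashlib
import sys


def dag_hash(dag_data_json: bytes) -> str:
    """MD5 digest used only as a change-detection key, not for security."""
    if sys.version_info >= (3, 9):
        # usedforsecurity=False keeps the call usable on FIPS-enabled builds
        return hashlib.md5(dag_data_json, usedforsecurity=False).hexdigest()
    # Python 3.8 and earlier: the keyword does not exist yet
    return hashlib.md5(dag_data_json).hexdigest()
```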
[GitHub] [airflow] potiuk commented on pull request #28476: Migrate DagFileProcessorManager._deactivate_stale_dags to Internal API
potiuk commented on PR #28476: URL: https://github.com/apache/airflow/pull/28476#issuecomment-1398678859 > > I guess that one should be updated after discussion in #28502 with `@classmethod` ? > > Correct! I rather wait for #28502 to be merged and then I'll update this PR +1
[GitHub] [airflow] potiuk commented on a diff in pull request #28502: Migrate DagFileProcessor.manage_slas to Internal API
potiuk commented on code in PR #28502: URL: https://github.com/apache/airflow/pull/28502#discussion_r1082833504 ## airflow/dag_processing/processor.py: ## @@ -365,8 +365,10 @@ def __init__(self, dag_ids: list[str] | None, dag_directory: str, log: logging.L self._dag_directory = dag_directory self.dag_warnings: set[tuple[str, str]] = set() +@staticmethod +@internal_api_call @provide_session -def manage_slas(self, dag: DAG, session: Session = None) -> None: +def manage_slas(dag_folder, dag_id: str, log: logging.Logger, session: Session = NEW_SESSION) -> None: Review Comment: PR here: https://github.com/aws-mwaa/upstream-to-airflow/pull/3/files
[GitHub] [airflow] shyft-mike commented on a diff in pull request #28846: Updated app to support configuring the caching hash method for FIPS
shyft-mike commented on code in PR #28846: URL: https://github.com/apache/airflow/pull/28846#discussion_r1082832047 ## airflow/www/app.py: ## @@ -131,7 +132,29 @@ def create_app(config=None, testing=False): init_robots(flask_app) +# Configure caching +webserver_caching_hash_method = conf.get(section="webserver", key="CACHING_HASH_METHOD", fallback=None) cache_config = {"CACHE_TYPE": "flask_caching.backends.filesystem", "CACHE_DIR": gettempdir()} +if ( +webserver_caching_hash_method is not None +and webserver_caching_hash_method.casefold() != "md5".casefold() +): +if webserver_caching_hash_method.casefold() == "sha512".casefold(): +cache_config["CACHE_OPTIONS"] = {"hash_method": hashlib.sha512} +elif webserver_caching_hash_method.casefold() == "sha384".casefold(): +cache_config["CACHE_OPTIONS"] = {"hash_method": hashlib.sha384} +elif webserver_caching_hash_method.casefold() == "sha256".casefold(): +cache_config["CACHE_OPTIONS"] = {"hash_method": hashlib.sha256} +elif webserver_caching_hash_method.casefold() == "sha224".casefold(): Review Comment: This should be cleaned up quite a bit now :+1:
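One way to collapse the if/elif chain above is a getattr-based lookup over an allow-list (a sketch; the helper and constant names are hypothetical, not the PR's final code):

```python
from __future__ import annotations

import hashlib

ALLOWED_CACHING_HASH_METHODS = ("md5", "sha224", "sha256", "sha384", "sha512")


def caching_hash_method(name: str | None):
    """Map a configured method name to its hashlib constructor, defaulting to md5."""
    if name is None:
        return hashlib.md5
    normalized = name.casefold()
    if normalized not in ALLOWED_CACHING_HASH_METHODS:
        raise ValueError(f"Unsupported caching hash method: {name!r}")
    # All allowed names are attributes of hashlib, so getattr is safe here
    return getattr(hashlib, normalized)
```

Restricting the lookup to an explicit allow-list avoids turning arbitrary config strings into attribute access on the hashlib module.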
[GitHub] [airflow] potiuk commented on a diff in pull request #28502: Migrate DagFileProcessor.manage_slas to Internal API
potiuk commented on code in PR #28502: URL: https://github.com/apache/airflow/pull/28502#discussion_r1082811184 ## airflow/dag_processing/processor.py: ## @@ -365,8 +365,10 @@ def __init__(self, dag_ids: list[str] | None, dag_directory: str, log: logging.L self._dag_directory = dag_directory self.dag_warnings: set[tuple[str, str]] = set() +@staticmethod +@internal_api_call @provide_session -def manage_slas(self, dag: DAG, session: Session = None) -> None: +def manage_slas(dag_folder, dag_id: str, log: logging.Logger, session: Session = NEW_SESSION) -> None: Review Comment: I see. challenge accepted
[GitHub] [airflow] BasPH opened a new pull request, #29062: Several improvements to the Params doc
BasPH opened a new pull request, #29062: URL: https://github.com/apache/airflow/pull/29062 The PR makes several improvements to https://airflow.apache.org/docs/apache-airflow/stable/concepts/params.html:
- Moved task-level params paragraph up to create a more logical structure
- Emphasized several code examples
- Fixed a hyperlink
- Spelling fixes
- Simplified first paragraph for clarity
- Auto-register DAGs in code examples
[GitHub] [airflow] vincbeck commented on pull request #28476: Migrate DagFileProcessorManager._deactivate_stale_dags to Internal API
vincbeck commented on PR #28476: URL: https://github.com/apache/airflow/pull/28476#issuecomment-1398653005 > I guess that one should be updated after discussion in #28502 with `@classmethod` ? Correct! I'd rather wait for #28502 to be merged and then I'll update this PR
[GitHub] [airflow] potiuk commented on pull request #28476: Migrate DagFileProcessorManager._deactivate_stale_dags to Internal API
potiuk commented on PR #28476: URL: https://github.com/apache/airflow/pull/28476#issuecomment-1398649440 I guess that should be updated after discussion in https://github.com/apache/airflow/pull/28502 with `@classmethod` ?
[GitHub] [airflow] potiuk commented on pull request #28979: Fix rendering parameters in PapermillOperator
potiuk commented on PR #28979: URL: https://github.com/apache/airflow/pull/28979#issuecomment-1398644414 Rebased it as for some reason there was only 1 mergeable check.
[GitHub] [airflow] potiuk commented on a diff in pull request #28981: Implemented log_containers option to read from all containers in KubernetesPodOperator
potiuk commented on code in PR #28981: URL: https://github.com/apache/airflow/pull/28981#discussion_r1082797194 ## airflow/providers/cncf/kubernetes/operators/kubernetes_pod.py: ## @@ -168,6 +168,9 @@ class KubernetesPodOperator(BaseOperator): :param labels: labels to apply to the Pod. (templated) :param startup_timeout_seconds: timeout in seconds to startup the pod. :param get_logs: get the stdout of the container as logs of the tasks. +:param log_containers: list of container names or bool value to collect logs. +If bool value is True, all container logs are collected, +if False, only 'base' container logs are collected. Review Comment: > but can I assume that this will be the final design? No, you can't. There is always a possibility that others will come with more comments or find something else. So you have no guarantees. It happens that I have to (even as maintainer) rewrite and fix and split and merge the same PR 10-20 times. This is possible. I am not saying this will happen in this case. Just that you have no guarantees that there will be no more comments. You have to deal with this. And treat it as a learning opportunity. I, for one, am super happy when someone suggests a change that allows me to remove half of the code I wrote and iterated over in a PR of mine, even if it's done after weeks of iterating. The end result is better; often the comment would not even come if I did not iterate multiple times in the process. I've learned something. Only pluses.
[GitHub] [airflow] vincbeck commented on a diff in pull request #28502: Migrate DagFileProcessor.manage_slas to Internal API
vincbeck commented on code in PR #28502: URL: https://github.com/apache/airflow/pull/28502#discussion_r1082796111 ## airflow/dag_processing/processor.py: ## @@ -365,8 +365,10 @@ def __init__(self, dag_ids: list[str] | None, dag_directory: str, log: logging.L self._dag_directory = dag_directory self.dag_warnings: set[tuple[str, str]] = set() +@staticmethod +@internal_api_call @provide_session -def manage_slas(self, dag: DAG, session: Session = None) -> None: +def manage_slas(dag_folder, dag_id: str, log: logging.Logger, session: Session = NEW_SESSION) -> None: Review Comment: See the method `get_log` I added [here](https://github.com/apache/airflow/pull/28502/files#diff-af25dd96233a97ace811c614fa7d5bd059cbbc1571e421f6675e16f6290814c5R71). It is literally a duplicate of the property log defined just below
[GitHub] [airflow] potiuk commented on pull request #28529: Add deferrable mode to DataprocCreateClusterOperator and
potiuk commented on PR #28529: URL: https://github.com/apache/airflow/pull/28529#issuecomment-1398635338 Conflicts :(
[GitHub] [airflow] potiuk commented on pull request #28808: Allow setting the name for the base container within K8s Pod Operator
potiuk commented on PR #28808: URL: https://github.com/apache/airflow/pull/28808#issuecomment-1398633574 @jedcunningham @eladkal - any comments or shall we merge it?
[GitHub] [airflow] potiuk commented on a diff in pull request #28502: Migrate DagFileProcessor.manage_slas to Internal API
potiuk commented on code in PR #28502: URL: https://github.com/apache/airflow/pull/28502#discussion_r1082777909 ## airflow/dag_processing/processor.py: ## @@ -365,8 +365,10 @@ def __init__(self, dag_ids: list[str] | None, dag_directory: str, log: logging.L self._dag_directory = dag_directory self.dag_warnings: set[tuple[str, str]] = set() +@staticmethod +@internal_api_call @provide_session -def manage_slas(self, dag: DAG, session: Session = None) -> None: +def manage_slas(dag_folder, dag_id: str, log: logging.Logger, session: Session = NEW_SESSION) -> None: Review Comment: Yes. Allowing classmethods should be fine as well. > I could not find a way to remove the duplicate methods of log and get_log in logging_mixin.py. If you have a solution, I'd be very happy to use it :) What do you mean by that @vincbeck ?
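The decorator stacking under discussion follows this shape (a simplified stand-in: the real `@internal_api_call` can route the call over the Internal API when direct DB access is disabled, which is elided here):

```python
import logging
from functools import wraps


def internal_api_call(func):
    """Toy stand-in for Airflow's decorator; here it just calls through."""
    @wraps(func)
    def wrapper(*args, **kwargs):
        # Real implementation: serialize the arguments and invoke the
        # Internal API instead of func when configured without DB access.
        return func(*args, **kwargs)
    return wrapper


class DagFileProcessor:
    @staticmethod
    @internal_api_call
    def manage_slas(dag_folder: str, dag_id: str, log: logging.Logger) -> None:
        # DB-touching SLA logic lives behind this boundary; the method takes
        # plain arguments (no self) so the call can be shipped across processes.
        log.info("Checking SLAs for dag %s in %s", dag_id, dag_folder)
```

This is why the PR drops `self` in favor of explicit `dag_folder`/`dag_id`/`log` arguments: everything the call needs must be serializable or reconstructable on the other side of the API boundary.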
[GitHub] [airflow] potiuk closed issue #28919: Airflow API kerberos authentication error
potiuk closed issue #28919: Airflow API kerberos authentication error URL: https://github.com/apache/airflow/issues/28919
[GitHub] [airflow] potiuk merged pull request #29054: Fix kerberos authentication for the REST API.
potiuk merged PR #29054: URL: https://github.com/apache/airflow/pull/29054
[GitHub] [airflow] potiuk commented on a diff in pull request #29044: Resolve RDS cname
potiuk commented on code in PR #29044: URL: https://github.com/apache/airflow/pull/29044#discussion_r1082740254 ## airflow/providers/common/sql/hooks/sql.py: ## @@ -169,6 +169,33 @@ def get_uri(self) -> str: conn.schema = self.__schema or conn.schema return conn.get_uri() +def resolve_rds_cname(self, hostname): Review Comment: Why: because when/if we split providers, having AWS-specific code in common code is a no-go. All the AWS code should be in the provider. If we leave the code in common code, that means the AWS service is privileged and can do more than other services that have their providers implemented in the community or outside. It would mean that you cannot implement something similar for GCP, Azure or Digital Ocean without modifying Airflow core or common.sql. And this means that anyone writing their own provider is impaired and limited. This is not what we promised when we tell people "it does not matter whether you implement your own provider, or you are part of the core". We are not there yet, of course, but I hope we will be one day. And adding more things to untangle when we do is not an option for me.
[GitHub] [airflow] potiuk commented on a diff in pull request #29044: Resolve RDS cname
potiuk commented on code in PR #29044: URL: https://github.com/apache/airflow/pull/29044#discussion_r1082731538 ## airflow/providers/common/sql/hooks/sql.py: ## @@ -169,6 +169,33 @@ def get_uri(self) -> str: conn.schema = self.__schema or conn.schema return conn.get_uri() +def resolve_rds_cname(self, hostname): Review Comment: Nope. That's not enough. I do not want AWS code to be in a non-AWS provider (even though we have it now in places). We have an AWS provider - all code specific to AWS should be there; if/when we split providers out of Airflow core, I do not want any external service code in the "apache-airflow" package. The requirement is simple: none of the AWS-specific code should be in `apache/airflow/providers/common/sql` or `apache/airflow`. All the AWS-specific code should be in `apache/airflow/providers/aws/`. That's a very simple and straightforward expectation. Exactly how this should be done and how the code is to be injected from one package to the other (via callbacks, hooks, etc.) - I am open to any proposal.
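The injection mentioned above (via callbacks, hooks, etc.) could take many shapes; a minimal, purely illustrative sketch - none of these names are Airflow API - is a resolver registry in common code that a provider package populates at import time:

```python
# Illustrative sketch (not Airflow API) of the injection idea discussed above:
# common code only knows a registry of hostname resolvers; the AWS provider
# plugs in its RDS-specific resolver. All names here are hypothetical.
from typing import Callable, List, Optional

_hostname_resolvers: List[Callable[[str], Optional[str]]] = []

def register_hostname_resolver(resolver: Callable[[str], Optional[str]]) -> None:
    """Called from a provider package (e.g. the AWS provider) at import time."""
    _hostname_resolvers.append(resolver)

def resolve_hostname(hostname: str) -> str:
    """Common-code entry point: first resolver that answers wins, else no-op."""
    for resolver in _hostname_resolvers:
        resolved = resolver(hostname)
        if resolved is not None:
            return resolved
    return hostname

# This part would live under apache/airflow/providers/aws/, not in common.sql:
def _rds_cname_resolver(hostname: str) -> Optional[str]:
    if hostname.endswith(".rds.amazonaws.com"):
        # Pretend we resolved the CNAME; real code would do a DNS lookup.
        return "instance-1.abc123.eu-west-1.rds.amazonaws.com"
    return None

register_hostname_resolver(_rds_cname_resolver)
```

This keeps the common hook free of any AWS knowledge: it calls `resolve_hostname` unconditionally, and behavior only changes when a provider has registered something.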
[GitHub] [airflow] VladaZakharova closed pull request #29017: Add deferrable mode to KubernetesPodOperator
VladaZakharova closed pull request #29017: Add deferrable mode to KubernetesPodOperator URL: https://github.com/apache/airflow/pull/29017
[GitHub] [airflow] BasPH merged pull request #29027: Annotate and simplify code samples in DAGs doc
BasPH merged PR #29027: URL: https://github.com/apache/airflow/pull/29027
[GitHub] [airflow] potiuk commented on pull request #29035: Renaming nose compatible methods in flavour of regular pytest naming
potiuk commented on PR #29035: URL: https://github.com/apache/airflow/pull/29035#issuecomment-1398552691 > @potiuk, is it the right place? Yep. But you can also add the "full tests needed" label on a PR to run a complete set of tests.
[GitHub] [airflow] snjypl commented on pull request #29012: Fix airflow-webserver cannot create resource pods for k8s exceutor deployed using helm chart
snjypl commented on PR #29012: URL: https://github.com/apache/airflow/pull/29012#issuecomment-1398507760 @arjunanan6 also, you should check `kubectl auth can-i create pods --as system:serviceaccount:airflow:airflowlocal-webserver`
[GitHub] [airflow] potiuk commented on pull request #29056: Handling error on cluster policy itself
potiuk commented on PR #29056: URL: https://github.com/apache/airflow/pull/29056#issuecomment-1398494098 (it will be obvious from the stack trace that the error was in the policy, not in the DAG).
[GitHub] [airflow] potiuk commented on pull request #29056: Handling error on cluster policy itself
potiuk commented on PR #29056: URL: https://github.com/apache/airflow/pull/29056#issuecomment-1398492336 I think it's a good idea, but do we really need to add a new exception for that? Wouldn't directly handling ANY Exception (instead of the specific Exceptions we list there) do the same job? Just changing:
```
except (
    AirflowClusterPolicyViolation,
    AirflowDagCycleException,
    AirflowDagDuplicatedIdException,
    AirflowDagInconsistent,
    ParamValidationError,
) as exception:
```
into:
```
except Exception as exception:
```
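A minimal sketch of the suggestion above: catch any exception raised while processing a DAG file - including one raised inside a buggy cluster policy - and record it as an import error. `apply_policy` and `process_dag` here are hypothetical stand-ins, not Airflow's actual functions:

```python
# Broad-catch sketch: any error during DAG processing (even a bug inside the
# cluster policy itself) becomes an import error with a full stack trace.
import traceback

def apply_policy(dag: dict) -> None:
    # A deliberately buggy cluster policy, to show the failure path.
    raise TypeError("bug inside the cluster policy itself")

def process_dag(dag: dict, import_errors: dict) -> None:
    try:
        apply_policy(dag)
    except Exception:
        # The stored stack trace shows the error came from the policy,
        # not from the DAG definition.
        import_errors[dag["fileloc"]] = traceback.format_exc()

errors: dict = {}
process_dag({"fileloc": "/dags/example.py"}, errors)
```

The design point is that the except clause no longer needs updating whenever a new failure mode appears; the trade-off is that programming errors in the processing code itself also end up as import errors.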
[GitHub] [airflow] snjypl commented on pull request #29012: Fix airflow-webserver cannot create resource pods for k8s exceutor deployed using helm chart
snjypl commented on PR #29012: URL: https://github.com/apache/airflow/pull/29012#issuecomment-1398479900
> apiVersion: rbac.authorization.k8s.io/v1
> kind: RoleBinding
> metadata:
>   annotations:
>     meta.helm.sh/release-name: airflowlocal
>     meta.helm.sh/release-namespace: airflow
>   creationTimestamp: "2023-01-18T09:52:13Z"
>   labels:
>     app.kubernetes.io/managed-by: Helm
>     chart: airflow-1.7.0
>     heritage: Helm
>     release: airflowlocal
>     tier: airflow
>   name: airflowlocal-pod-launcher-rolebinding
>   namespace: airflow
>   resourceVersion: "13279"
>   uid: 05617ca4-ebf3-424a-990b-8b59274702f0
> roleRef:
>   apiGroup: rbac.authorization.k8s.io
>   kind: Role
>   name: airflowlocal-pod-launcher-role
> subjects:
> - kind: ServiceAccount
>   name: airflowlocal-scheduler
>   namespace: airflow
> - kind: ServiceAccount
>   name: airflowlocal-worker
>   namespace: airflow

@arjunanan6 it seems like you have not applied the patch in this PR correctly; as you can see, the `airflowlocal-webserver` service account does not have the pod-launcher role. You can try to manually apply the below gist: https://gist.github.com/snjypl/aeebe582be0190e483163224f9c966e7 The reason the scheduled tasks are not having issues is that scheduled task pods are launched by the `scheduler`, and the scheduler service account has the pod-launcher role. In the case of a manual trigger, the k8s pod is launched by the `webserver`. How are you deploying the helm chart from this PR? If you can share the steps I might be able to help you with it. Anyway, manually doing a `kubectl apply -f ` should fix it for you. Later you can try testing the helm chart.
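The fix described above amounts to adding one more subject to the RoleBinding. A sketch of that merge as a pure function (the release and namespace names are taken from the example output above; the function itself is illustrative, not part of the chart):

```python
# Ensure the webserver ServiceAccount is among the pod-launcher
# RoleBinding's subjects; idempotent, so re-applying is safe.
def add_subject(rolebinding: dict, name: str, namespace: str) -> dict:
    """Append a ServiceAccount subject if it is not already present."""
    subjects = rolebinding.setdefault("subjects", [])
    wanted = {"kind": "ServiceAccount", "name": name, "namespace": namespace}
    if wanted not in subjects:
        subjects.append(wanted)
    return rolebinding

rb = {
    "kind": "RoleBinding",
    "metadata": {"name": "airflowlocal-pod-launcher-rolebinding"},
    "subjects": [
        {"kind": "ServiceAccount", "name": "airflowlocal-scheduler", "namespace": "airflow"},
        {"kind": "ServiceAccount", "name": "airflowlocal-worker", "namespace": "airflow"},
    ],
}
add_subject(rb, "airflowlocal-webserver", "airflow")
```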
[GitHub] [airflow] Taragolis commented on pull request #28953: Updated Telegram Provider to ensure compatbility with >=20.0.0
Taragolis commented on PR #28953: URL: https://github.com/apache/airflow/pull/28953#issuecomment-1398463533 Well... my personal thought is that there is not a lot of difference for end users between migrating to `python-telegram-bot>=20.0.0` and replacing it with `pyTelegramBotAPI`. I think we could keep the Hooks/Operators working as they do now in both cases. The compatibility break is the new object returned by `TelegramHook.connection` and `TelegramHook.get_conn()`, which is incompatible with the current version in both cases. The main difference is how end users who build their own Operators will resolve the incompatibility:
1. `python-telegram-bot>=20.0.0`: in most cases it is required to wrap all `telegram.Bot` async methods in a sync implementation.
2. `pyTelegramBotAPI`: need to migrate to `telebot.TeleBot` methods.

Unfortunately I'm not an end user of the Telegram Provider, so I'm not the best person to say which approach would be less harmful for end users. @eladkal, @uranusjr Maybe you have thoughts about this?
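For option 1 above, the usual pattern is a small blocking wrapper around each coroutine. A hedged sketch - `send_message` here is a dummy coroutine standing in for a `telegram.Bot` method, and `make_sync` is an illustrative helper, not provider code:

```python
# Wrap an async client method so existing synchronous hook/operator
# code keeps working without becoming async itself.
import asyncio
from functools import wraps

def make_sync(async_fn):
    """Return a blocking wrapper around a coroutine function."""
    @wraps(async_fn)
    def wrapper(*args, **kwargs):
        # Runs the coroutine to completion on a fresh event loop.
        return asyncio.run(async_fn(*args, **kwargs))
    return wrapper

async def send_message(chat_id: int, text: str) -> dict:
    # Stand-in for an async API call such as telegram.Bot.send_message.
    return {"chat_id": chat_id, "text": text}

send_message_sync = make_sync(send_message)
result = send_message_sync(42, "hello")
```

One caveat with this pattern: `asyncio.run` cannot be called from code that is already inside a running event loop, which matters if the surrounding framework is itself async.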
[GitHub] [airflow] arjunanan6 commented on pull request #29012: Fix airflow-webserver cannot create resource pods for k8s exceutor deployed using helm chart
arjunanan6 commented on PR #29012: URL: https://github.com/apache/airflow/pull/29012#issuecomment-1398443598 @snjypl Sure, here you go:
```
apiVersion: rbac.authorization.k8s.io/v1
kind: RoleBinding
metadata:
  annotations:
    meta.helm.sh/release-name: airflowlocal
    meta.helm.sh/release-namespace: airflow
  creationTimestamp: "2023-01-18T09:52:13Z"
  labels:
    app.kubernetes.io/managed-by: Helm
    chart: airflow-1.7.0
    heritage: Helm
    release: airflowlocal
    tier: airflow
  name: airflowlocal-pod-launcher-rolebinding
  namespace: airflow
  resourceVersion: "13279"
  uid: 05617ca4-ebf3-424a-990b-8b59274702f0
roleRef:
  apiGroup: rbac.authorization.k8s.io
  kind: Role
  name: airflowlocal-pod-launcher-role
subjects:
- kind: ServiceAccount
  name: airflowlocal-scheduler
  namespace: airflow
- kind: ServiceAccount
  name: airflowlocal-worker
  namespace: airflow
```
[GitHub] [airflow] utkarsharma2 commented on a diff in pull request #28934: Remove hard-coded executor-database coupling
utkarsharma2 commented on code in PR #28934: URL: https://github.com/apache/airflow/pull/28934#discussion_r1082578543 ## airflow/cli/commands/standalone_command.py: ## @@ -159,14 +159,28 @@ def calculate_env(self): # Make sure we're using a local executor flavour executor_class, _ = ExecutorLoader.import_default_executor_cls() if not executor_class.is_local: -if "sqlite" in conf.get("database", "sql_alchemy_conn"): -self.print_output("standalone", "Forcing executor to SequentialExecutor") -env["AIRFLOW__CORE__EXECUTOR"] = executor_constants.SEQUENTIAL_EXECUTOR -else: -self.print_output("standalone", "Forcing executor to LocalExecutor") -env["AIRFLOW__CORE__EXECUTOR"] = executor_constants.LOCAL_EXECUTOR +env["AIRFLOW__CORE__EXECUTOR"] = self.get_local_executor( +database=conf.get("database", "sql_alchemy_conn") +) +self.print_output("standalone", f"Forcing executor to {env['AIRFLOW__CORE__EXECUTOR']}") Review Comment: @o-nikolas The only benefit I see with this code is that we still remove the need to change this part of the code if someone implements a new local executor, right?
[GitHub] [airflow] snjypl commented on pull request #29012: Fix airflow-webserver cannot create resource pods for k8s exceutor deployed using helm chart
snjypl commented on PR #29012: URL: https://github.com/apache/airflow/pull/29012#issuecomment-1398439995 @arjunanan6 can you please share the output of `kubectl get rolebinding airflow-pod-launcher-rolebinding -o yaml`
[GitHub] [airflow] Anton-Shutik commented on a diff in pull request #29044: Resolve RDS cname
Anton-Shutik commented on code in PR #29044: URL: https://github.com/apache/airflow/pull/29044#discussion_r1082574233 ## airflow/providers/common/sql/hooks/sql.py: ## @@ -169,6 +169,33 @@ def get_uri(self) -> str: conn.schema = self.__schema or conn.schema return conn.get_uri() +def resolve_rds_cname(self, hostname): Review Comment: @potiuk Can it be used like this:
```python
pg_hook = PostgresHook(..., behaviors=["rds"])
```
and this will enable rds specific logic in the PostgresHook ?
[GitHub] [airflow] utkarsharma2 commented on a diff in pull request #28934: Remove hard-coded executor-database coupling
utkarsharma2 commented on code in PR #28934: URL: https://github.com/apache/airflow/pull/28934#discussion_r1082565064 ## airflow/cli/commands/standalone_command.py: ## @@ -159,14 +159,28 @@ def calculate_env(self): # Make sure we're using a local executor flavour executor_class, _ = ExecutorLoader.import_default_executor_cls() if not executor_class.is_local: -if "sqlite" in conf.get("database", "sql_alchemy_conn"): -self.print_output("standalone", "Forcing executor to SequentialExecutor") -env["AIRFLOW__CORE__EXECUTOR"] = executor_constants.SEQUENTIAL_EXECUTOR -else: -self.print_output("standalone", "Forcing executor to LocalExecutor") -env["AIRFLOW__CORE__EXECUTOR"] = executor_constants.LOCAL_EXECUTOR +env["AIRFLOW__CORE__EXECUTOR"] = self.get_local_executor( +database=conf.get("database", "sql_alchemy_conn") +) +self.print_output("standalone", f"Forcing executor to {env['AIRFLOW__CORE__EXECUTOR']}") return env +@classmethod +def get_local_executor(cls, database: str) -> str: +""" +Get local executor for standalone command +for sqlite we need SEQUENTIAL_EXECUTOR otherwise LOCAL_EXECUTOR. +""" +try: +return [ +executor_name +for executor_name in ExecutorLoader.executors.keys() +if executor_name != "DaskExecutor" Review Comment: @o-nikolas My bad, I was testing it locally and forgot to remove it. It's not needed here.
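The selection logic being refactored in the diffs above boils down to one check: SQLite cannot handle the concurrent writes of `LocalExecutor`, so standalone mode falls back to `SequentialExecutor`. A self-contained sketch of that decision (constant names mirror `airflow.executors.executor_constants`; the function signature is illustrative):

```python
# Pick a local executor flavour based on the metadata DB connection URI:
# SQLite forces the single-process SequentialExecutor.
SEQUENTIAL_EXECUTOR = "SequentialExecutor"
LOCAL_EXECUTOR = "LocalExecutor"

def get_local_executor(sql_alchemy_conn: str) -> str:
    """Return the executor name standalone mode should force."""
    if "sqlite" in sql_alchemy_conn:
        return SEQUENTIAL_EXECUTOR
    return LOCAL_EXECUTOR
```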
[GitHub] [airflow] utkarsharma2 commented on a diff in pull request #28934: Remove hard-coded executor-database coupling
utkarsharma2 commented on code in PR #28934: URL: https://github.com/apache/airflow/pull/28934#discussion_r1082563334 ## airflow/sensors/base.py: ## @@ -257,11 +258,14 @@ def _get_next_poke_interval( def prepare_for_execution(self) -> BaseOperator: task = super().prepare_for_execution() + # Sensors in `poke` mode can block execution of DAGs when running # with single process executor, thus we change the mode to`reschedule` # to allow parallel task being scheduled and executed -if conf.get("core", "executor") == "DebugExecutor": -self.log.warning("DebugExecutor changes sensor mode to 'reschedule'.") +executor_name = conf.get("core", "executor") +executor = ExecutorLoader.load_executor(executor_name) Review Comment: @o-nikolas Thanks, updated.
[GitHub] [airflow] VladaZakharova commented on a diff in pull request #29017: Add deferrable mode to KubernetesPodOperator
VladaZakharova commented on code in PR #29017: URL: https://github.com/apache/airflow/pull/29017#discussion_r1082553841 ## airflow/providers/cncf/kubernetes/hooks/kubernetes.py: ## @@ -16,19 +16,28 @@ # under the License. from __future__ import annotations +import contextlib import tempfile import warnings from typing import TYPE_CHECKING, Any, Generator +from asgiref.sync import sync_to_async from kubernetes import client, config, watch +from kubernetes.client.models import V1Pod from kubernetes.config import ConfigException +from kubernetes_asyncio import client as async_client, config as async_config +from kubernetes_asyncio.client import ApiException +from kubernetes_asyncio.config import load_kube_config_from_dict Review Comment: Thank you for all your suggestions, I have added the necessary changes
[GitHub] [airflow] VladaZakharova commented on a diff in pull request #29017: Add deferrable mode to KubernetesPodOperator
VladaZakharova commented on code in PR #29017: URL: https://github.com/apache/airflow/pull/29017#discussion_r1082552556 ## generated/provider_dependencies.json: ## @@ -209,8 +209,10 @@ "cncf.kubernetes": { "deps": [ "apache-airflow>=2.3.0", + "asgiref>=3.5.2", "cryptography>=2.0.0", - "kubernetes>=21.7.0,<24" + "kubernetes>=21.7.0,<24", + "kubernetes_asyncio>=24.2.2,<25" Review Comment: I have changed the version range and added the comment. Please check it
[GitHub] [airflow] VladaZakharova commented on a diff in pull request #29017: Add deferrable mode to KubernetesPodOperator
VladaZakharova commented on code in PR #29017: URL: https://github.com/apache/airflow/pull/29017#discussion_r1082551668 ## docs/apache-airflow-providers-cncf-kubernetes/operators.rst: ## @@ -1,19 +1,19 @@ .. Licensed to the Apache Software Foundation (ASF) under one -or more contributor license agreements. See the NOTICE file -distributed with this work for additional information -regarding copyright ownership. The ASF licenses this file -to you under the Apache License, Version 2.0 (the -"License"); you may not use this file except in compliance -with the License. You may obtain a copy of the License at +or more contributor license agreements. See the NOTICE file +distributed with this work for additional information +regarding copyright ownership. The ASF licenses this file +to you under the Apache License, Version 2.0 (the +"License"); you may not use this file except in compliance +with the License. You may obtain a copy of the License at .. http://www.apache.org/licenses/LICENSE-2.0 .. Unless required by applicable law or agreed to in writing, -software distributed under the License is distributed on an -"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY -KIND, either express or implied. See the License for the -specific language governing permissions and limitations -under the License. +software distributed under the License is distributed on an +"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY +KIND, either express or implied. See the License for the +specific language governing permissions and limitations +under the License. Review Comment: Yep, thanks, I have fixed it :)
[GitHub] [airflow] boring-cyborg[bot] commented on pull request #29061: Increase length of user identifier columns in `ab_user` and `ab_register_user` tables
boring-cyborg[bot] commented on PR #29061: URL: https://github.com/apache/airflow/pull/29061#issuecomment-1398412860 Congratulations on your first Pull Request and welcome to the Apache Airflow community! If you have any issues or are unsure about anything, please check our Contribution Guide (https://github.com/apache/airflow/blob/main/CONTRIBUTING.rst) Here are some useful points:
- Pay attention to the quality of your code (ruff, mypy and type annotations). Our [pre-commits](https://github.com/apache/airflow/blob/main/STATIC_CODE_CHECKS.rst#prerequisites-for-pre-commit-hooks) will help you with that.
- In case of a new feature add useful documentation (in docstrings or in the `docs/` directory). Adding a new operator? Check this short [guide](https://github.com/apache/airflow/blob/main/docs/apache-airflow/howto/custom-operator.rst) Consider adding an example DAG that shows how users should use it.
- Consider using the [Breeze environment](https://github.com/apache/airflow/blob/main/BREEZE.rst) for testing locally; it's a heavy Docker image but it ships with a working Airflow and a lot of integrations.
- Be patient and persistent. It might take some time to get a review or get the final approval from Committers.
- Please follow the [ASF Code of Conduct](https://www.apache.org/foundation/policies/conduct) for all communication including (but not limited to) comments on Pull Requests, Mailing list and Slack.
- Be sure to read the [Airflow Coding style](https://github.com/apache/airflow/blob/main/CONTRIBUTING.rst#coding-style-and-best-practices).

Apache Airflow is a community-driven project and together we are making it better 🚀. In case of doubts contact the developers at: Mailing List: d...@airflow.apache.org Slack: https://s.apache.org/airflow-slack
[GitHub] [airflow] swalkowski opened a new pull request, #29061: Increase length of user identifier columns in `ab_user` and `ab_register_user` tables
swalkowski opened a new pull request, #29061: URL: https://github.com/apache/airflow/pull/29061 To better open Airflow RBAC to external integrations (see below), this change increases the maximum lengths of the user identifier columns in the `ab_user` and `ab_register_user` tables:
- `first_name` and `last_name` from 64 to 256 characters.
- `email` and `username` from 256 to 512 characters.

While the Airflow RBAC model is represented in the fixed set of Flask App Builder `ab_*` tables, it can be integrated with a variety of authentication mechanisms. They may operate on various forms of user identifiers, which can be of various lengths. For example:
- Display names like first and last name may not always be available, and some integrations may opt for defaulting those columns to the other identifiers, like username or email, which can be longer.
- Identity federation is a scenario in which user identifiers can be particularly long, like this one from Google Cloud workload identity federation: `principal://iam.googleapis.com/projects/PROJECT_NUMBER/locations/global/workloadIdentityPools/POOL_ID/subject/SUBJECT_ATTRIBUTE_VALUE`.
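The change described above boils down to four column widenings per table. A sketch of the DDL a migration script might emit for `ab_user` (PostgreSQL `ALTER COLUMN ... TYPE` syntax is an assumption; the actual PR would ship an Alembic migration, and the same widening applies to `ab_register_user`):

```python
# Column widenings from the PR description, and the DDL they imply.
WIDENED_COLUMNS = {
    ("ab_user", "first_name"): 256,  # was 64
    ("ab_user", "last_name"): 256,   # was 64
    ("ab_user", "username"): 512,    # was 256
    ("ab_user", "email"): 512,       # was 256
}

def ddl_statements(widened: dict) -> list:
    """Render one ALTER TABLE statement per widened column (PostgreSQL syntax)."""
    return [
        f"ALTER TABLE {table} ALTER COLUMN {column} TYPE VARCHAR({length})"
        for (table, column), length in widened.items()
    ]

statements = ddl_statements(WIDENED_COLUMNS)
```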
[GitHub] [airflow] atrbgithub commented on issue #29026: Status of testing of Apache Airflow 2.5.1rc2
atrbgithub commented on issue #29026: URL: https://github.com/apache/airflow/issues/29026#issuecomment-1398341172 I've spun this up locally and can confirm: [Return list of tasks that will be queued (#28066)](https://github.com/apache/airflow/pull/28066) is fixed in this release.
[GitHub] [airflow] Taragolis commented on pull request #29035: Renaming nose compatible methods in flavour of regular pytest naming
Taragolis commented on PR #29035: URL: https://github.com/apache/airflow/pull/29035#issuecomment-1398319160 > Life is like a box of chocolates, you never know which one you get.

You got the Helm tests this time :) I postponed migrating these tests to pytest as much as possible, so this looks like the right time to do it. Just need to check locally that everything works as expected and I will create a PR for that. Also, we need to use changes in `pytest.ini` as a trigger for all tests, because changes in it might affect all tests in the selected directories:
- test
- docker_tests
- kubernetes_tests
- dev/breeze/tests

@potiuk, is it the right place? https://github.com/apache/airflow/blob/d03b9a76914babef23b84acdb6061b0b93bdc2e3/dev/breeze/src/airflow_breeze/utils/selective_checks.py#L89-L98
[GitHub] [airflow] ephraimbuddy opened a new pull request, #29058: Optionally export `airflow db clean` data to CSV files
ephraimbuddy opened a new pull request, #29058: URL: https://github.com/apache/airflow/pull/29058 This adds `--export-to-csv` and `--output-path` to the `db clean` command to enable archiving to files on disk. When this option is used, the archival table is automatically dropped, since the data now goes to files.
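The export step described above can be sketched with the standard library `csv` module. This is an illustrative sketch of the idea (dumping rows that `db clean` would otherwise keep in an archive table to a CSV file), not the PR's actual implementation; `export_rows_to_csv` and the demo data are made up:

```python
# Dump rows slated for archival to a CSV file on disk.
import csv
import tempfile
from pathlib import Path

def export_rows_to_csv(rows, output_path) -> int:
    """Write a list of row dicts to CSV; return the number of exported rows."""
    if not rows:
        return 0
    path = Path(output_path)
    with path.open("w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=list(rows[0].keys()))
        writer.writeheader()
        writer.writerows(rows)
    return len(rows)

# Illustrative usage with fake "archived" task-instance rows:
demo_rows = [{"id": 1, "state": "success"}, {"id": 2, "state": "failed"}]
out_file = Path(tempfile.mkdtemp()) / "task_instance.csv"
exported = export_rows_to_csv(demo_rows, out_file)
```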
[GitHub] [airflow] potiuk opened a new pull request, #29057: Fix pre-commit warning for exclude in inclusive-language check
potiuk opened a new pull request, #29057: URL: https://github.com/apache/airflow/pull/29057 The exclude regexp contained /* which was not really what we meant (no harm done but latest pre-commit started to warn about it). --- **^ Add meaningful description above** Read the **[Pull Request Guidelines](https://github.com/apache/airflow/blob/main/CONTRIBUTING.rst#pull-request-guidelines)** for more information. In case of fundamental code changes, an Airflow Improvement Proposal ([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvement+Proposals)) is needed. In case of a new dependency, check compliance with the [ASF 3rd Party License Policy](https://www.apache.org/legal/resolved.html#category-x). In case of backwards incompatible changes please leave a note in a newsfragment file, named `{pr_number}.significant.rst` or `{issue_number}.significant.rst`, in [newsfragments](https://github.com/apache/airflow/tree/main/newsfragments).
[GitHub] [airflow] potiuk commented on issue #28919: Airflow API kerberos authentication error
potiuk commented on issue #28919: URL: https://github.com/apache/airflow/issues/28919#issuecomment-1398308134 I made you co-author of the fix @BMFH - see if what I got is good please :)
[GitHub] [airflow] arjunanan6 commented on pull request #29012: Fix airflow-webserver cannot create resource pods for k8s exceutor deployed using helm chart
arjunanan6 commented on PR #29012: URL: https://github.com/apache/airflow/pull/29012#issuecomment-1398288734 Alright, so I attempted this locally where I have no restrictions. As discussed on #28394, scheduled tasks run just fine, but there is an exception when running a task manually:

```
[2023-01-20T11:57:44.263+] {kubernetes_executor.py:527} INFO - Start Kubernetes executor
[2023-01-20T11:57:44.306+] {kubernetes_executor.py:476} INFO - Found 0 queued task instances
[2023-01-20T11:57:44.309+] {base_executor.py:95} INFO - Adding to queue: ['airflow', 'tasks', 'run', 'HELLO_WORLD', 'hello', 'scheduled__2023-01-20T11:40:00+00:00', '--ignore-all-dependencies', '--ignore-dependencies', '--force', '--local', '--pool', 'default_pool', '--subdir', 'DAGS_FOLDER/hello.py']
[2023-01-20T11:57:44.310+] {kubernetes_executor.py:559} INFO - Add task TaskInstanceKey(dag_id='HELLO_WORLD', task_id='hello', run_id='scheduled__2023-01-20T11:40:00+00:00', try_number=4, map_index=-1) with command ['airflow', 'tasks', 'run', 'HELLO_WORLD', 'hello', 'scheduled__2023-01-20T11:40:00+00:00', '--ignore-all-dependencies', '--ignore-dependencies', '--force', '--local', '--pool', 'default_pool', '--subdir', 'DAGS_FOLDER/hello.py']
[2023-01-20T11:57:44.310+] {kubernetes_executor.py:130} INFO - Event: and now my watch begins starting at resource_version: 0
[2023-01-20T11:57:44.383+] {kubernetes_executor.py:339} INFO - Creating kubernetes pod for job is TaskInstanceKey(dag_id='HELLO_WORLD', task_id='hello', run_id='scheduled__2023-01-20T11:40:00+00:00', try_number=2, map_index=-1), with pod name hello-world-hello-be2bad2bd8dc4568bd1ba73082ecef4a
[2023-01-20T11:57:44.392+] {kubernetes_executor.py:274} ERROR - Exception when attempting to create Namespaced Pod:
{
  "apiVersion": "v1",
  "kind": "Pod",
  "metadata": {
    "annotations": {
      "dag_id": "HELLO_WORLD",
      "task_id": "hello",
      "try_number": "2",
      "run_id": "scheduled__2023-01-20T11:40:00+00:00"
    },
    "labels": {
      "tier": "airflow",
      "component": "worker",
      "release": "airflowlocal",
      "airflow-worker": "None",
      "dag_id": "HELLO_WORLD",
      "task_id": "hello",
      "try_number": "2",
      "airflow_version": "2.5.0",
      "kubernetes_executor": "True",
      "run_id": "scheduled__2023-01-20T114000-c15690dab"
    },
    "name": "hello-world-hello-be2bad2bd8dc4568bd1ba73082ecef4a",
    "namespace": "airflow"
  },
  "spec": {
    "affinity": {},
    "containers": [
      {
        "args": ["airflow", "tasks", "run", "HELLO_WORLD", "hello", "scheduled__2023-01-20T11:40:00+00:00", "--ignore-all-dependencies", "--ignore-dependencies", "--force", "--local", "--pool", "default_pool", "--subdir", "DAGS_FOLDER/hello.py"],
        "env": [
          { "name": "AIRFLOW__CORE__EXECUTOR", "value": "LocalExecutor" },
          { "name": "AIRFLOW__CORE__FERNET_KEY", "valueFrom": { "secretKeyRef": { "key": "fernet-key", "name": "airflowlocal-fernet-key" } } },
          { "name": "AIRFLOW__CORE__SQL_ALCHEMY_CONN", "valueFrom": { "secretKeyRef": { "key": "connection", "name": "airflowlocal-airflow-metadata" } } },
          { "name": "AIRFLOW__DATABASE__SQL_ALCHEMY_CONN", "valueFrom": { "secretKeyRef": { "key": "connection", "name": "airflowlocal-airflow-metadata" } } },
          { "name": "AIRFLOW_CONN_AIRFLOW_DB", "valueFrom": { "secretKeyRef": { "key": "connection", "name": "airflowlocal-airflow-metadata" } } },
          { "name": "AIRFLOW__WEBSERVER__SECRET_KEY", "valueFrom": { "secretKeyRef": { "key": "webserver-secret-key", "name": "airflowlocal-webserver-secret-key" } } },
          { "name": "AIRFLOW_IS_K8S_EXECUTOR_POD", "value": "True" }
        ],
        "image": "my-dags:0.0.1",
        "imagePullPolicy": "IfNotPresent",
        "name": "base",
        "resources": {},
        "volumeMounts": [
          {
```
[GitHub] [airflow] potiuk commented on issue #28919: Airflow API kerberos authentication error
potiuk commented on issue #28919: URL: https://github.com/apache/airflow/issues/28919#issuecomment-1398283300 Ok. Cool. Thanks for confirming :). This sounds cool, provided that the usernames match :). I will adapt the fix and make some documentation updates to explain it :)
[GitHub] [airflow] potiuk commented on pull request #29035: Renaming nose compatible methods in flavour of regular pytest naming
potiuk commented on PR #29035: URL: https://github.com/apache/airflow/pull/29035#issuecomment-1398280798

> Well... I thought that the k8s test failed due to issues with GitHub Actions; however, it also uses `setup` methods, and we use `pytest.ini` across all tests.
>
> Let's migrate them first.

Life is like a box of chocolates, you never know which one you get. You got the Helm test this time :)
[GitHub] [airflow] arjunanan6 commented on pull request #28394: Fix manual task trigger failing for k8s.
arjunanan6 commented on PR #28394: URL: https://github.com/apache/airflow/pull/28394#issuecomment-1398278773 @snjypl A little update, I've tried out this fix in combination with #29012 locally, and still get the same forbidden error. I'll add more details on that PR since it's more relevant there.
[GitHub] [airflow] okayhooni opened a new pull request, #29056: Handling error on cluster policy itself
okayhooni opened a new pull request, #29056: URL: https://github.com/apache/airflow/pull/29056 We use an Airflow cluster policy to ensure that some dynamic DAGs and tasks follow certain constraints. But we realized that when there is an error in the cluster policy itself (i.e., any error other than an intentional AirflowClusterPolicyViolation), ALL THE DAGs cannot be loaded, and there is no warning message in the web UI and no error record in the metadb import_error table. I think that if we silently pass over those exceptions, there is a risk of letting bad DAGs/tasks through (if the cluster policy were healthy, they would not pass its validation).
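To illustrate the failure mode being discussed, here is a hedged, self-contained sketch (not the PR's actual code) of a cluster policy that distinguishes intentional violations from bugs in the policy itself; the `AirflowClusterPolicyViolation` stand-in class and the owner check are assumptions made so the example runs without Airflow installed:

```python
# Stand-in for airflow.exceptions.AirflowClusterPolicyViolation so this
# sketch is self-contained (assumption for illustration only).
class AirflowClusterPolicyViolation(Exception):
    pass


def task_policy(task):
    """Example cluster policy: reject tasks without an owner, and surface
    bugs in the policy itself instead of silently blocking every DAG."""
    try:
        # Intentional constraint check (example: every task must set an owner).
        if not getattr(task, "owner", None):
            raise AirflowClusterPolicyViolation(f"task {task!r} has no owner")
    except AirflowClusterPolicyViolation:
        raise  # intentional violation: recorded as an import error for that DAG
    except Exception as exc:
        # A bug in the policy itself: wrap it so it still surfaces as an
        # import error rather than failing all DAG loading with no record.
        raise AirflowClusterPolicyViolation(f"cluster policy crashed: {exc}") from exc
```

The point of the wrapper is that an unexpected exception in the policy becomes a visible per-DAG import error rather than an invisible global failure.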
[GitHub] [airflow] BMFH commented on a diff in pull request #29054: [Experimental] Treat kerberos identity as email in authentication
BMFH commented on code in PR #29054: URL: https://github.com/apache/airflow/pull/29054#discussion_r1082280492

## airflow/api/auth/backend/kerberos_auth.py: ##

```diff
@@ -141,7 +143,7 @@ def decorated(*args, **kwargs):
         token = "".join(header.split()[1:])
         return_code = _gssapi_authenticate(token)
         if return_code == kerberos.AUTH_GSS_COMPLETE:
-            g.user = ctx.kerberos_user
+            g.user = get_airflow_app().appbuilder.sm.find_user(email=ctx.kerberos_user)
```

Review Comment: It works. In the case where the email domain does not equal the Kerberos realm, I suggest looking the user up by the username field instead. That works too.
[GitHub] [airflow] BMFH commented on issue #28919: Airflow API kerberos authentication error
BMFH commented on issue #28919: URL: https://github.com/apache/airflow/issues/28919#issuecomment-1398123889

@potiuk Thank you for the answer! I guessed that if we did everything exactly as described in the documentation it would work, but it seems I was wrong. I'm not a developer, just a DevOps engineer, and I have no idea how it should work exactly, but I am trying to understand the logic.

1. What we had before: I added some logging in security.py and I can see that the user variable is "domain_user_name@KERBEROS-REALM". In my case it is `янв 20 11:27:29 nginx-test airflow[677767]: [2023-01-20 11:27:29,292] {security.py:418} INFO - !user is Dmitriy.Kondratyev@CORP.mycompany.DIGITAL`. I tried to add an Airflow user with this username, but it doesn't work.
2. I applied your fix and created a user with email = domain_user_name@KERBEROS-REALM. It works: the user was authenticated. But here we have a little problem: the email domain does not always have the same name as the Kerberos realm.
3. I changed your code to look up the username parameter: `g.user = get_airflow_app().appbuilder.sm.find_user(username=ctx.kerberos_user)`, created a user with username "domain_user_name@KERBEROS-REALM", and it works!

So, to make Kerberos auth work with the API interface, we need to apply this fix (with searching by username) and create an Airflow user with username=domain_user@KERBEROS-REALM.
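The email-then-username lookup order described above can be sketched as a small helper; this is a hedged illustration of the idea (the helper name is hypothetical, and the security manager is stubbed), not the code in PR #29054:

```python
def resolve_kerberos_user(security_manager, principal: str):
    """Map a Kerberos principal like 'user@REALM' to an Airflow user:
    try the email field first, then fall back to the username field,
    since email domains don't always match the Kerberos realm."""
    user = security_manager.find_user(email=principal)
    if user is None:
        user = security_manager.find_user(username=principal)
    return user
```

With this fallback, either convention for provisioning the Airflow user (email or username set to the principal) would authenticate.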
[GitHub] [airflow] Taragolis commented on pull request #29035: Renaming nose compatible methods in flavour of regular pytest naming
Taragolis commented on PR #29035: URL: https://github.com/apache/airflow/pull/29035#issuecomment-1398119466 Well... I thought that the k8s test failed due to issues with GitHub Actions; however, it also uses `setup` methods, and we use `pytest.ini` across all tests. Let's migrate them first.
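The migration in question replaces nose-compatible `setup`/`teardown` method names with pytest's native `setup_method`/`teardown_method` spelling; a minimal sketch of the target style:

```python
class TestQueue:
    # nose-style (deprecated in pytest) would be: def setup(self): ...
    # pytest-native xunit-style spelling:
    def setup_method(self, method):
        # runs before every test method in this class
        self.items = []

    def teardown_method(self, method):
        # runs after every test method in this class
        self.items.clear()

    def test_append(self):
        self.items.append(1)
        assert self.items == [1]
```

pytest discovers and calls these hooks automatically around each test method, so the rename is mechanical but must be done consistently.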
[GitHub] [airflow] csp33 commented on issue #28880: Can't configure Kubernetes and Celery workers in Helm Chart
csp33 commented on issue #28880: URL: https://github.com/apache/airflow/issues/28880#issuecomment-1398093799

I could take this one. How about this proposal? Values defined at the "workers" level will be taken by both celery & k8s.

```yaml
workers:
  safeToEvict: false
  celery:
    resources:
      limits:
        cpu: 1
        memory: 1Gi
      requests:
        cpu: 1
        memory: 1Gi
  kubernetes:
    resources:
      limits:
        cpu: 240m
        memory: 875Mi
      requests:
        cpu: 240m
        memory: 875Mi
```

@potiuk
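The precedence being proposed (keys set directly under `workers` act as shared defaults, overridden per executor) could be sketched as a plain dict merge; the helper name and shape are assumptions for illustration, not Helm chart code:

```python
def effective_worker_values(workers: dict) -> dict:
    """Hypothetical merge: top-level `workers` keys are inherited by both
    `workers.celery` and `workers.kubernetes`, with sub-keys winning."""
    shared = {k: v for k, v in workers.items() if k not in ("celery", "kubernetes")}
    return {
        "celery": {**shared, **workers.get("celery", {})},
        "kubernetes": {**shared, **workers.get("kubernetes", {})},
    }
```

Under this scheme, `safeToEvict: false` from the example above would apply to both worker types, while each keeps its own `resources`.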
[GitHub] [airflow] maxnathaniel commented on pull request #28953: Updated Telegram Provider to ensure compatbility with >=20.0.0
maxnathaniel commented on PR #28953: URL: https://github.com/apache/airflow/pull/28953#issuecomment-1398028299 @Taragolis Sorry, I have been busy the past couple of days. Do you think I should go ahead and replace `python-telegram-bot` with [pyTelegramBotAPI](https://github.com/eternnoir/pyTelegramBotAPI) and use the sync methods?
[GitHub] [airflow] ecerulm commented on a diff in pull request #29016: Fix leak sensitive field via V1EnvVar on exception
ecerulm commented on code in PR #29016: URL: https://github.com/apache/airflow/pull/29016#discussion_r1082180416

## airflow/utils/log/secrets_masker.py: ##

```diff
@@ -200,10 +222,18 @@ def _redact(self, item: Redactable, name: str | None, depth: int) -> Redacted:
         if name and should_hide_value_for_key(name):
             return self._redact_all(item, depth)
         if isinstance(item, dict):
-            return {
+            to_return = {
                 dict_key: self._redact(subval, name=dict_key, depth=(depth + 1))
                 for dict_key, subval in item.items()
             }
+            return to_return
+        elif isinstance(item, ConvertableToDict):  # things like V1EnvVar
```

Review Comment: The other alternative, to avoid introducing a dependency on `V1EnvVar`, would be to mask everything that is not explicitly `Redactable`. Which one shall I go for: the `ConvertableToDict` protocol, introducing a dependency on `V1EnvVar`, or masking everything that is not redactable?
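A grossly simplified, hedged sketch of the `ConvertableToDict` idea: duck-type on a `to_dict()` method (which kubernetes client models such as `V1EnvVar` expose) instead of importing `V1EnvVar` directly. This is not the actual `secrets_masker` logic; real redaction only masks values under sensitive keys, while this toy masks every string leaf:

```python
def redact(item, mask="***"):
    # Recurse into dicts; convert to_dict()-capable objects (e.g. k8s
    # client models) to plain dicts first; mask string leaves.
    if isinstance(item, dict):
        return {k: redact(v, mask) for k, v in item.items()}
    if hasattr(item, "to_dict"):
        return redact(item.to_dict(), mask)
    if isinstance(item, str):
        return mask
    return item
```

The design point: by normalizing objects to dicts before recursing, the masker never needs a hard dependency on any particular client library's classes.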
[GitHub] [airflow] phanikumv commented on a diff in pull request #29014: Add deferrable mode to `DbtCloudRunJobOperator`
phanikumv commented on code in PR #29014: URL: https://github.com/apache/airflow/pull/29014#discussion_r1082167660

## docs/apache-airflow-providers-dbt-cloud/operators.rst: ##

```diff
@@ -35,10 +35,10 @@ Trigger a dbt Cloud Job
 Use the :class:`~airflow.providers.dbt.cloud.operators.dbt.DbtCloudRunJobOperator` to trigger a run of a dbt
 Cloud job. By default, the operator will periodically check on the status of the executed job to terminate
 with a successful status every ``check_interval`` seconds or until the job reaches a ``timeout`` length of
-execution time. This functionality is controlled by the ``wait_for_termination`` parameter. Alternatively,
-``wait_for_termination`` can be set to False to perform an asynchronous wait (typically paired with the
-:class:`~airflow.providers.dbt.cloud.sensors.dbt.DbtCloudJobRunSensor`). Setting ``wait_for_termination`` to
-False is a good approach for long-running dbt Cloud jobs.
+execution time. This functionality is controlled by the ``deferrable`` parameter. Alternatively,
+``deferrable`` can be set to True to perform an asynchronous execution on the airflow Triggerer,
+causing the airflow worker slot to free up and save resources. Setting ``deferrable`` to
+True is a good approach for long-running dbt Cloud jobs.
```

Review Comment: updated the docs
[GitHub] [airflow] potiuk commented on issue #29031: Scheduler pod restarts
potiuk commented on issue #29031: URL: https://github.com/apache/airflow/issues/29031#issuecomment-1397996468

Hard to say. Maybe you have a REALLY SLOW network or filesystem mounted in your system? Or maybe your Docker engine has limited memory (as happens on macOS, where we recommend increasing it to 4GB). Or maybe your database is extremely slow over a remote network or firewall. Or maybe your DNS responds to queries with multi-second delays. There might be around a million reasons why your system is slow. We really have no idea what kind of deployment you have. You are the only person in the world who knows it, and the only person in the world able to look at all the components and see which one is causing the slowness.

Look at the logs produced, increase the debugging level, and see if you can spot any warnings. In particular, look at the scheduler logs for signs that things take multiple seconds where they should be fast (use your best judgment - try to make intelligent guesses there). Post information about your investigation here, including details of what you found and the suspicious logs. This is what I would do while debugging any application I installed, so that my request for help would let others investigate what's wrong with my system, and I recommend you do the same.

We can help here. Note that people here are usually helping in their free time; if you are mindful of that, you should understand that the more time you spend on analysing and providing helpful hints, the more likely someone will be willing to spend their free time helping you. As long as you show that you made an effort to analyse it first and give us something to look at, we might be able to help. If you don't, we can at most provide some generic guesses (which @hussein-awala nicely did). Hope to be able to help (provided we have more information).
[GitHub] [airflow] Subhashini2610 commented on issue #29031: Scheduler pod restarts
Subhashini2610 commented on issue #29031: URL: https://github.com/apache/airflow/issues/29031#issuecomment-1397974322 @hussein-awala Thanks for the insight. I have only one DAG (hello_world.py) synced, which is also not scheduled for periodic runs. Also, I have checked the pod metrics. The max CPU utilisation is 6% and the max memory utilisation is 8% since the time the pod came into existence.
[GitHub] [airflow] tirkarthi commented on pull request #28256: Include full path to Python files under zip path while clearing import errors.
tirkarthi commented on PR #28256: URL: https://github.com/apache/airflow/pull/28256#issuecomment-1397945755 @potiuk @uranusjr Sorry, I am a little confused. Currently the `ImportError` model has a filename column that stores the full path to the Python file along with the zip file, like `/home/karthikeyan/airflow/dags/error_dag.zip/error_dag.py`. Do you want to add a migration where the `file_path` column stores the `/home/karthikeyan/airflow/dags/error_dag.zip` value and use it in the deletion query?
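The idea under discussion, keying import errors by the zip archive's path rather than the inner file path, hinges on recovering the archive prefix from a stored filename. A hedged helper sketch (the name and approach are illustrative only, not the PR's implementation):

```python
def archive_root(file_path: str) -> str:
    """Return the enclosing .zip path for files stored inside a zipped
    DAG bundle, or the path unchanged for plain .py files."""
    idx = file_path.find(".zip/")
    return file_path[: idx + len(".zip")] if idx != -1 else file_path
```

A deletion query grouped on this value would clear all import errors belonging to one zip bundle at once.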
[GitHub] [airflow] o-nikolas commented on a diff in pull request #29055: [AIP-51] Executors vending CLI commands
o-nikolas commented on code in PR #29055: URL: https://github.com/apache/airflow/pull/29055#discussion_r1082091328

## airflow/cli/cli_parser.py: ##

```diff
@@ -17,2165 +17,35 @@
 # specific language governing permissions and limitations
 # under the License.
 """Command-line interface."""
+
+
 from __future__ import annotations

 import argparse
-import json
-import os
-import textwrap
-from argparse import Action, ArgumentError, RawTextHelpFormatter
+from argparse import Action, RawTextHelpFormatter
 from functools import lru_cache
-from typing import Callable, Iterable, NamedTuple, Union
-
-import lazy_object_proxy
+from typing import Iterable

-from airflow import settings
-from airflow.cli.commands.legacy_commands import check_legacy_command
-from airflow.configuration import conf
+from airflow.cli.cli_config import (
+    DAG_CLI_DICT,
+    ActionCommand,
+    Arg,
+    CLICommand,
+    DefaultHelpParser,
+    GroupCommand,
+    core_commands,
+)
 from airflow.exceptions import AirflowException
-from airflow.executors.executor_constants import CELERY_EXECUTOR, CELERY_KUBERNETES_EXECUTOR
 from airflow.executors.executor_loader import ExecutorLoader
-from airflow.utils.cli import ColorMode
 from airflow.utils.helpers import partition
-from airflow.utils.module_loading import import_string
-from airflow.utils.timezone import parse as parsedate
-
-BUILD_DOCS = "BUILDING_AIRFLOW_DOCS" in os.environ
-
-
-def lazy_load_command(import_path: str) -> Callable:
-    """Create a lazy loader for command."""
-    _, _, name = import_path.rpartition(".")
-
-    def command(*args, **kwargs):
-        func = import_string(import_path)
-        return func(*args, **kwargs)
-
-    command.__name__ = name
-
-    return command
-
-
-class DefaultHelpParser(argparse.ArgumentParser):
-    """CustomParser to display help message."""
-
-    def _check_value(self, action, value):
-        """Override _check_value and check conditionally added command."""
-        if action.dest == "subcommand" and value == "celery":
-            executor = conf.get("core", "EXECUTOR")
-            if executor not in (CELERY_EXECUTOR, CELERY_KUBERNETES_EXECUTOR):
-                executor_cls, _ = ExecutorLoader.import_executor_cls(executor)
-                classes = ()
-                try:
-                    from airflow.executors.celery_executor import CeleryExecutor
-
-                    classes += (CeleryExecutor,)
-                except ImportError:
-                    message = (
-                        "The celery subcommand requires that you pip install the celery module. "
-                        "To do it, run: pip install 'apache-airflow[celery]'"
-                    )
-                    raise ArgumentError(action, message)
-                try:
-                    from airflow.executors.celery_kubernetes_executor import CeleryKubernetesExecutor
-
-                    classes += (CeleryKubernetesExecutor,)
-                except ImportError:
-                    pass
-                if not issubclass(executor_cls, classes):
-                    message = (
-                        f"celery subcommand works only with CeleryExecutor, CeleryKubernetesExecutor and "
-                        f"executors derived from them, your current executor: {executor}, subclassed from: "
-                        f'{", ".join([base_cls.__qualname__ for base_cls in executor_cls.__bases__])}'
-                    )
-                    raise ArgumentError(action, message)
-        if action.dest == "subcommand" and value == "kubernetes":
-            try:
-                import kubernetes.client  # noqa: F401
-            except ImportError:
-                message = (
-                    "The kubernetes subcommand requires that you pip install the kubernetes python client. "
-                    "To do it, run: pip install 'apache-airflow[cncf.kubernetes]'"
-                )
-                raise ArgumentError(action, message)
-
-        if action.choices is not None and value not in action.choices:
-            check_legacy_command(action, value)
-
-        super()._check_value(action, value)
-
-    def error(self, message):
-        """Override error and use print_instead of print_usage."""
-        self.print_help()
-        self.exit(2, f"\n{self.prog} command error: {message}, see help above.\n")
-
-
-# Used in Arg to enable `None' as a distinct value from "not passed"
-_UNSET = object()
-
-
-class Arg:
-    """Class to keep information about command line argument."""
-
-    def __init__(
-        self,
-        flags=_UNSET,
-        help=_UNSET,
-        action=_UNSET,
-        default=_UNSET,
-        nargs=_UNSET,
-        type=_UNSET,
-        choices=_UNSET,
-        required=_UNSET,
-        metavar=_UNSET,
-        dest=_UNSET,
-    ):
-        self.flags = flags
-        self.kwargs = {}
-        for k, v in locals().items():
-            if v is _UNSET:
-                continue
-            if
```
[GitHub] [airflow] schwartzpub commented on issue #29037: Cron schedule and Time Zones leads to incorrect intervals
schwartzpub commented on issue #29037: URL: https://github.com/apache/airflow/issues/29037#issuecomment-1397824977

Last run/next run aside -- what other details are needed that aren't provided above? For reference, the documentation for 2.5 is where the recommendation for `default_timezone = UTC` came from. The DAG definition is as follows:

```python
from datetime import datetime, timedelta
from airflow import DAG
from airflow.models import Variable
from airflow.operators.bash import BashOperator
import pendulum

with DAG(
    "test_dataflow",
    default_args={
        "depends_on_past": False,
        "email": ["t...@test.com"],
        "email_on_failure": False,
    },
    description="Test Dataflow",
    schedule="*/15 6-17 * * 1-5",
    start_date=pendulum.datetime(2023, 1, 19, 6, tz='America/Chicago'),
    catchup=False,
) as dag:
    ssis_p = Variable.get("ssis_password")
    bash_comm = "/opt/ssis/bin/dtexec /f /home/airflow/airflow/ssis/Package.dtsx /de {0} /l 'DTS.LogProviderTextFile;ssis.txt'".format(ssis_p)

    t1 = BashOperator(
        task_id="ssis_dataflow",
        bash_command=bash_comm
    )

    t1
```

The local server is set to CST:

```
user@host:~$ sudo timedatectl
               Local time: Thu 2023-01-19 06:44:37 CST
           Universal time: Thu 2023-01-19 12:44:37 UTC
                 RTC time: Thu 2023-01-19 12:44:36
                Time zone: America/Chicago (CST, -0600)
System clock synchronized: yes
              NTP service: active
          RTC in local TZ: no
```

The airflow.cfg is currently set to UTC, but changing it to `America/Chicago` and restarting the scheduler and webserver services doesn't change the behavior:

```
default_timezone = utc
```

I cannot provide a screenshot, since the run times in the UI are not a good indicator (if there is a screenshot that can show this, I can certainly provide one), but given the above configuration I would expect the first run each weekday to happen at 6:15am CST. Instead, the first run of the day happens at 12:15pm CST. When I check the DAG in the morning(s) and through to the afternoon, there are no new runs until 12:15pm CST.

If there is any other information I can provide that might be missing here, please let me know so I can provide it.
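The six-hour shift reported above (6:15am expected, 12:15pm observed) exactly matches the CST-to-UTC offset, which suggests the cron fields are being evaluated against UTC rather than the DAG's timezone. A quick stdlib check of that offset:

```python
from datetime import datetime
from zoneinfo import ZoneInfo

# 6:15 on a January weekday in America/Chicago (CST, UTC-6)...
local = datetime(2023, 1, 20, 6, 15, tzinfo=ZoneInfo("America/Chicago"))
as_utc = local.astimezone(ZoneInfo("UTC"))
# ...is 12:15 UTC; conversely, a "6:15" slot evaluated in UTC lands at
# 12:15pm local CST, matching the observed first run of the day.
print(as_utc.strftime("%H:%M"))  # 12:15
```

This doesn't prove where the bug lies, but it is consistent with the cron expression being interpreted in UTC despite the timezone-aware `start_date`.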
[GitHub] [airflow] potiuk commented on pull request #28693: AIP-44 Migrate DagModel.get_paused_dag_ids to Internal API
potiuk commented on PR #28693: URL: https://github.com/apache/airflow/pull/28693#issuecomment-1397793451 > Is there any action item/comment here I should act on or we can forward with this PR? Rebasing/conflict resolving after we merged #28841
[GitHub] [airflow] potiuk commented on a diff in pull request #28618: Add deferrable mode to DataprocInstantiateWorkflowTemplateOperator
potiuk commented on code in PR #28618: URL: https://github.com/apache/airflow/pull/28618#discussion_r1082013189

## airflow/providers/google/cloud/triggers/dataproc.py: ##

```diff
@@ -84,3 +84,73 @@ async def run(self):
             raise AirflowException(f"Dataproc job execution failed {self.job_id}")
         await asyncio.sleep(self.polling_interval_seconds)
         yield TriggerEvent({"job_id": self.job_id, "job_state": state})
+
+
+class DataprocWorkflowTrigger(BaseTrigger):
```

Review Comment: Yeah. Would it?
[GitHub] [airflow] potiuk commented on pull request #28956: Move project and license docs down to start with developer-focused docs
potiuk commented on PR #28956: URL: https://github.com/apache/airflow/pull/28956#issuecomment-1397788465 Oh Absolutely.
[GitHub] [airflow] potiuk merged pull request #28956: Move project and license docs down to start with developer-focused docs
potiuk merged PR #28956: URL: https://github.com/apache/airflow/pull/28956
[GitHub] [airflow] hussein-awala commented on issue #29031: Scheduler pod restarts
hussein-awala commented on issue #29031: URL: https://github.com/apache/airflow/issues/29031#issuecomment-1397785578

Since the liveness probe request returns `503`, your scheduler could be overloaded, and there are several different causes for this problem:
- the request/limit resources for the pod
- the number of DAGs and tasks in your server
- the complexity of your DAG files
- the performance of the file system you are using (volumes)
- ...

I recommend reading [this doc](https://airflow.apache.org/docs/apache-airflow/stable/concepts/scheduler.html#fine-tuning-your-scheduler-performance); it can help you debug and improve scheduler performance.
[GitHub] [airflow] potiuk commented on issue #19781: A way to test email configuration
potiuk commented on issue #19781: URL: https://github.com/apache/airflow/issues/19781#issuecomment-1397782458 > btw should i create a separate ticket on that or will just a PR suffice No need for issues in Airflow. A PR is enough in the vast majority of cases.
[GitHub] [airflow] potiuk merged pull request #29051: Add a summary table at the end of system test execs
potiuk merged PR #29051: URL: https://github.com/apache/airflow/pull/29051
[GitHub] [airflow] o-nikolas opened a new pull request, #29055: Onikolas/aip 51/executor cli vending
o-nikolas opened a new pull request, #29055: URL: https://github.com/apache/airflow/pull/29055 This PR addresses executor coupling **5a** and **5b** in Airflow CLI. See [this issue](https://github.com/apache/airflow/issues/27932) (#27932) for more context. ## Please read before reviewing: The changes can be organized into three main groups, which are described below. In fact, it may be very helpful to actually review the three commits of this PR in isolation. 1. In order to resolve circular imports as well as to cleanup the ~2k line `cli_parser.py` module, all cli config (declaration of Args, Commands, etc) moved to a new `cli_config.py` module. This config module is much larger than the remaining `cli_parser.py` module and contains more of the original code, so it is perhaps more helpful to think of it as the `cli_parser.py` module has been _renamed_ to `cli_config.py` where most of the code still lives, and then just the methods related to constructing the parser itself have moved to a new `cli_parser.py` module :) **NOTE: you will need to unhide the diffs for these files since they are quite large** 1. Updates to the executor interface (i.e. add stub methods to the base executor and the two hybrid executors which don't inherit directly from base) to vend cli commands and make small changes in the cli_parser.py module to read in commands from this new method. Add tests for the interface. 1. Move executor related cli commands (for Celery and Kubernetes) to their respective executor modules utilizing the new interface from step 2. All original tests for these CLI commands still apply. --- **^ Add meaningful description above** Read the **[Pull Request Guidelines](https://github.com/apache/airflow/blob/main/CONTRIBUTING.rst#pull-request-guidelines)** for more information. In case of fundamental code changes, an Airflow Improvement Proposal ([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvement+Proposals)) is needed. 
In case of a new dependency, check compliance with the [ASF 3rd Party License Policy](https://www.apache.org/legal/resolved.html#category-x). In case of backwards incompatible changes please leave a note in a newsfragment file, named `{pr_number}.significant.rst` or `{issue_number}.significant.rst`, in [newsfragments](https://github.com/apache/airflow/tree/main/newsfragments). -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org For queries about this service, please contact Infrastructure at: us...@infra.apache.org
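For context on step 2, here is a minimal sketch of the "executors vend their own CLI commands" pattern. The class and function names below are illustrative only; the actual Airflow command types and parser wiring in the PR differ in detail.

```python
from dataclasses import dataclass, field

@dataclass
class CLICommand:
    """Illustrative stand-in for an Airflow CLI command definition."""
    name: str
    help: str = ""
    subcommands: list = field(default_factory=list)

class BaseExecutor:
    @staticmethod
    def get_cli_commands() -> list[CLICommand]:
        # Stub on the base interface: most executors contribute no commands.
        return []

class CeleryLikeExecutor(BaseExecutor):
    @staticmethod
    def get_cli_commands() -> list[CLICommand]:
        # An executor that does need CLI commands overrides the stub.
        return [CLICommand(name="celery", help="Start celery components")]

def collect_commands(executors) -> list[CLICommand]:
    """What the parser side does: gather commands from each configured executor."""
    commands: list[CLICommand] = []
    for executor in executors:
        commands.extend(executor.get_cli_commands())
    return commands
```

With this shape, the parser module no longer imports executor-specific command definitions directly, which is what breaks the circular-import problem described above.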
[GitHub] [airflow] potiuk commented on pull request #29051: Add a summary table at the end of system test execs
potiuk commented on PR #29051: URL: https://github.com/apache/airflow/pull/29051#issuecomment-1397779289
> > (and we should wait for it).
>
> You mean we should not merge this until AIP-52 gives us a way to mark teardown tasks?

Nope. Just hold off on separating the tables. I think what you did is cool.
[GitHub] [airflow] potiuk commented on issue #28919: Airflow API kerberos authentication error
potiuk commented on issue #28919: URL: https://github.com/apache/airflow/issues/28919#issuecomment-139007 How do you expect your kerberos user to map to an Airflow user, @BMFH? I believe the current way the kerberos user is retrieved is indeed wrong. From what I see, the user that is set by kerberos_auth is the kerberos_id string, but what we expect there is the user object from FAB. If, for example, you have an email address there, then https://github.com/apache/airflow/pull/29054 should likely fix it. Maybe you can apply the same change and see if that fixes the problem (or explore my hypothesis a bit more?)
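The hypothesis above (kerberos id string vs. FAB user object) can be sketched roughly as follows. This is purely illustrative: `resolve_kerberos_user` and `find_user_by_email` are hypothetical stand-ins for the FAB security-manager lookup, and treating the principal as an email address is the experimental idea in PR #29054, not established Airflow behavior.

```python
def resolve_kerberos_user(principal: str, find_user_by_email):
    """Map a kerberos principal (e.g. 'jdoe@EXAMPLE.COM') to a user record.

    Instead of handing the raw principal string to the auth layer (which
    expects a user object), look the user up by email.
    """
    # Kerberos realms are conventionally upper-case; email domains are not.
    local, _, realm = principal.partition("@")
    email = f"{local}@{realm.lower()}" if realm else principal
    return find_user_by_email(email)
```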
[GitHub] [airflow] potiuk opened a new pull request, #29054: [Experimental] Treat kerberos identity as email in authentication
potiuk opened a new pull request, #29054: URL: https://github.com/apache/airflow/pull/29054
[GitHub] [airflow] vandonr-amz commented on pull request #28816: introduce a method to convert dictionaries to boto-style key-value lists
vandonr-amz commented on PR #28816: URL: https://github.com/apache/airflow/pull/28816#issuecomment-1397776181 @eladkal @uranusjr anything else you'd like changed in this PR? Otherwise we could merge it 😇
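The conversion the PR title describes can be sketched in a few lines. The function name `format_tags` and its parameters are assumptions for illustration; the PR itself defines the actual helper.

```python
def format_tags(source, key_label: str = "Key", value_label: str = "Value"):
    """Convert a plain dict into the list-of-dicts format many boto3 APIs expect.

    {"env": "prod"} -> [{"Key": "env", "Value": "prod"}]
    Passing None yields an empty list, so callers can forward optional params.
    """
    if source is None:
        return []
    return [{key_label: k, value_label: v} for k, v in source.items()]
```

The configurable labels matter because different AWS APIs spell the keys differently (e.g. `Key`/`Value` for EC2 tags vs. `Name`/`Values` elsewhere).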
[GitHub] [airflow] vandonr-amz opened a new pull request, #29053: introduce base class for EKS sensors
vandonr-amz opened a new pull request, #29053: URL: https://github.com/apache/airflow/pull/29053 Most of the EKS sensors do the same thing; this can be grouped into one base class, a bit like what was done here: https://github.com/apache/airflow/blob/main/airflow/providers/amazon/aws/sensors/sagemaker.py#L33
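The shared pattern the PR factors out looks roughly like the sketch below. Class and method names are illustrative, not the actual Airflow EKS sensor API: the base class polls a resource state, fails fast on terminal failure states, and subclasses only supply how to fetch the state.

```python
class EksBaseSensor:
    """Poll an EKS resource until it reaches a target state (illustrative sketch)."""

    def __init__(self, *, target_state: str, failure_states=frozenset()):
        self.target_state = target_state
        self.failure_states = failure_states

    def get_state(self) -> str:
        # Subclasses return the current state of the cluster / nodegroup /
        # fargate profile, typically via an AWS hook.
        raise NotImplementedError

    def poke(self, context=None) -> bool:
        state = self.get_state()
        if state in self.failure_states:
            raise RuntimeError(f"Resource entered failure state: {state}")
        return state == self.target_state
```

Each concrete sensor then shrinks to a `get_state` override plus its default target/failure states.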
[GitHub] [airflow] hussein-awala commented on issue #29002: KubernetesPodOperator xcom push failure
hussein-awala commented on issue #29002: URL: https://github.com/apache/airflow/issues/29002#issuecomment-1397773389 I have the same problem with some of my pod tasks. I just created this [PR](https://github.com/apache/airflow/pull/29052); can you check it, please?
[GitHub] [airflow] hussein-awala opened a new pull request, #29052: Fix KubernetesPodOperator xcom push when `get_logs=False`
hussein-awala opened a new pull request, #29052: URL: https://github.com/apache/airflow/pull/29052 closes: #29002

When `get_logs=False`, we should wait for the `base` container to complete before reading the xcom file. To do that, we currently check whether the container is running, but the container may also be in a `waiting` state, which the current `await_container_completion` method treats as completed. Instead, I check whether the container has reached the `terminated` state and, if not, wait 1s.
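The fix described above can be sketched as a polling loop that only accepts `terminated` as completion. This is a simplified stand-in, not the actual provider code: `get_container_status` below substitutes for reading the pod's `status.container_statuses` through the Kubernetes client.

```python
import time

def await_container_completion(get_container_status, container_name: str,
                               poll_interval: float = 1.0) -> None:
    """Block until the named container reports a `terminated` state.

    A container that is merely `waiting` (or `running`) is NOT treated as
    completed, which is the bug in the pre-PR behavior.
    """
    while True:
        # One of "waiting" | "running" | "terminated" in this sketch.
        state = get_container_status(container_name)
        if state == "terminated":
            return
        time.sleep(poll_interval)
```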
[GitHub] [airflow] o-nikolas commented on pull request #28934: Remove hard-coded executor-database coupling
o-nikolas commented on PR #28934: URL: https://github.com/apache/airflow/pull/28934#issuecomment-1397769679 Hey @utkarsharma2! Have you had any time to revisit this one?
[GitHub] [airflow] o-nikolas merged pull request #28706: Update provide_bucket_name() decorator to handle new conn_type
o-nikolas merged PR #28706: URL: https://github.com/apache/airflow/pull/28706
[GitHub] [airflow] github-actions[bot] commented on pull request #27156: Add documentation for BigQuery transfer operators
github-actions[bot] commented on PR #27156: URL: https://github.com/apache/airflow/pull/27156#issuecomment-1397768482 This pull request has been automatically marked as stale because it has not had recent activity. It will be closed in 5 days if no further activity occurs. Thank you for your contributions.
[GitHub] [airflow] github-actions[bot] commented on pull request #28056: Move GCP Sheets system tests
github-actions[bot] commented on PR #28056: URL: https://github.com/apache/airflow/pull/28056#issuecomment-1397768424 This pull request has been automatically marked as stale because it has not had recent activity. It will be closed in 5 days if no further activity occurs. Thank you for your contributions.
[GitHub] [airflow] vandonr-amz commented on pull request #29051: Add a summary table at the end of system test execs
vandonr-amz commented on PR #29051: URL: https://github.com/apache/airflow/pull/29051#issuecomment-1397764394
> (and we should wait for it).

You mean we should not merge this until AIP-52 gives us a way to mark teardown tasks?
[GitHub] [airflow] o-nikolas merged pull request #29001: uniformize getting hook through cached property in aws sensors
o-nikolas merged PR #29001: URL: https://github.com/apache/airflow/pull/29001
[GitHub] [airflow] potiuk commented on pull request #28902: Chart: Update default git-sync version to 3.6.2
potiuk commented on PR #28902: URL: https://github.com/apache/airflow/pull/28902#issuecomment-1397758005 @aleveille - still working on it?
[GitHub] [airflow] potiuk commented on issue #18010: Airflow scheduler with statsd enabled crashes when dag_id contains unexpected characters
potiuk commented on issue #18010: URL: https://github.com/apache/airflow/issues/18010#issuecomment-1397756161 This should be part of the discussion here: https://lists.apache.org/thread/96tco8dfs4mh12kqm1pwjhm5mqr54qbm . I think we should limit IDs to ASCII and add an extra description.
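An ASCII-only limit of the kind proposed above could look like the sketch below. The exact allowed character set is what the mailing-list thread is meant to decide; this regex is one plausible statsd-safe subset, not a settled rule.

```python
import re

# Letters, digits, underscore, dot and dash: characters that pass through
# statsd metric names without being mangled. Illustrative charset only.
ALLOWED_ID = re.compile(r"^[A-Za-z0-9_.-]+$")

def is_statsd_safe(dag_id: str) -> bool:
    """Return True if the id uses only characters safe for statsd metric names."""
    return ALLOWED_ID.match(dag_id) is not None
```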
[GitHub] [airflow] potiuk commented on pull request #29051: Add a summary table at the end of system test execs
potiuk commented on PR #29051: URL: https://github.com/apache/airflow/pull/29051#issuecomment-1397754758
> Is the watcher supposed to be the "separator"? Because in that case, disable_job_queue is part of the teardown and apparently executed before the watcher (I sort by end_date, maybe that's wrong?)

Ah... you are right. The watcher is just for marking success/failure. Bummer, so AIP-52 it is.

> Yes, I'd love to, but very personal reason not to: the log collector we use keeps the escape codes uninterpreted, so it actually makes the output less readable, but maybe it's something we should fix on our end, idk.

I think we have a formatter that is conditional and can detect whether you are in a terminal or not. That would be enough.