[GitHub] [airflow] Bowrna commented on a change in pull request #21145: enter the shell breeze2 environment
Bowrna commented on a change in pull request #21145: URL: https://github.com/apache/airflow/pull/21145#discussion_r805295828

## File path: dev/breeze/src/airflow_breeze/shell/enter_shell.py

@@ -0,0 +1,176 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements. See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership. The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License. You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied. See the License for the
+# specific language governing permissions and limitations
+# under the License.
+from pathlib import Path
+from typing import Dict, List
+
+from airflow_breeze import global_constants
+from airflow_breeze.cache import (
+    check_cache_and_write_if_not_cached,
+    read_from_cache_file,
+    write_to_cache_file,
+)
+from airflow_breeze.console import console
+from airflow_breeze.global_constants import (
+    FLOWER_HOST_PORT,
+    MSSQL_HOST_PORT,
+    MSSQL_VERSION,
+    MYSQL_HOST_PORT,
+    MYSQL_VERSION,
+    POSTGRES_HOST_PORT,
+    POSTGRES_VERSION,
+    REDIS_HOST_PORT,
+    SSH_PORT,
+    WEBSERVER_HOST_PORT,
+)
+from airflow_breeze.shell.shell_builder import ShellBuilder
+from airflow_breeze.utils.docker_command_utils import (
+    check_docker_compose_version,
+    check_docker_resources,
+    check_docker_version,
+)
+from airflow_breeze.utils.path_utils import BUILD_CACHE_DIR
+from airflow_breeze.utils.run_utils import (
+    filter_out_none,
+    instruct_for_setup,
+    md5sum_check_if_build_is_needed,
+    run_command,
+)
+from airflow_breeze.visuals import ASCIIART, ASCIIART_STYLE, CHEATSHEET, CHEATSHEET_STYLE
+
+PARAMS_TO_ENTER_SHELL = {
+    "HOST_USER_ID": "host_user_id",
+    "HOST_GROUP_ID": "host_group_id",
+    "COMPOSE_FILE": "compose_files",
+    "PYTHON_MAJOR_MINOR_VERSION": "python_version",
+    "BACKEND": "backend",
+    "AIRFLOW_VERSION": "airflow_version",
+    "INSTALL_AIRFLOW_VERSION": "install_airflow_version",
+    "AIRFLOW_SOURCES": "airflow_sources",
+    "AIRFLOW_CI_IMAGE": "airflow_ci_image_name",
+    "AIRFLOW_CI_IMAGE_WITH_TAG": "airflow_ci_image_name_with_tag",
+    "AIRFLOW_PROD_IMAGE": "airflow_prod_image_name",
+    "AIRFLOW_IMAGE_KUBERNETES": "airflow_image_kubernetes",
+    "SQLITE_URL": "sqlite_url",
+    "USE_AIRFLOW_VERSION": "use_airflow_version",
+    "SKIP_TWINE_CHECK": "skip_twine_check",
+    "USE_PACKAGES_FROM_DIST": "use_packages_from_dist",
+    "EXECUTOR": "executor",
+    "START_AIRFLOW": "start_airflow",
+    "ENABLED_INTEGRATIONS": "enabled_integrations",
+    "GITHUB_ACTIONS": "github_actions",
+    "ISSUE_ID": "issue_id",
+    "NUM_RUNS": "num_runs",
+    "VERSION_SUFFIX_FOR_SVN": "version_suffix_for_svn",
+    "VERSION_SUFFIX_FOR_PYPI": "version_suffix_for_pypi",
+}
+
+PARAMS_FOR_SHELL_CONSTANTS = {
+    "SSH_PORT": SSH_PORT,
+    "WEBSERVER_HOST_PORT": WEBSERVER_HOST_PORT,
+    "FLOWER_HOST_PORT": FLOWER_HOST_PORT,
+    "REDIS_HOST_PORT": REDIS_HOST_PORT,
+    "MYSQL_HOST_PORT": MYSQL_HOST_PORT,
+    "MYSQL_VERSION": MYSQL_VERSION,
+    "MSSQL_HOST_PORT": MSSQL_HOST_PORT,
+    "MSSQL_VERSION": MSSQL_VERSION,
+    "POSTGRES_HOST_PORT": POSTGRES_HOST_PORT,
+    "POSTGRES_VERSION": POSTGRES_VERSION,
+}
+
+PARAMS_IN_CACHE = {
+    'python_version': 'PYTHON_MAJOR_MINOR_VERSION',
+    'backend': 'BACKEND',
+    'executor': 'EXECUTOR',
+    'postgres_version': 'POSTGRES_VERSION',
+    'mysql_version': 'MYSQL_VERSION',
+    'mssql_version': 'MSSQL_VERSION',
+}
+
+DEFAULT_VALUES_FOR_PARAM = {
+    'python_version': 'DEFAULT_PYTHON_MAJOR_MINOR_VERSION',
+    'backend': 'DEFAULT_BACKEND',
+    'executor': 'DEFAULT_EXECUTOR',
+    'postgres_version': 'POSTGRES_VERSION',
+    'mysql_version': 'MYSQL_VERSION',
+    'mssql_version': 'MSSQL_VERSION',
+}
+
+
+def construct_arguments_docker_compose_command(shell_params: ShellBuilder) -> List[str]:
+    args_command = []
+    for param_name in PARAMS_TO_ENTER_SHELL:
+        param_value = PARAMS_TO_ENTER_SHELL[param_name]
+        args_command.append("-e")
+        args_command.append(param_name + '=' + str(getattr(shell_params, param_value)))
+    for constant_param_name in PARAMS_FOR_SHELL_CONSTANTS:
+        constant_param_value = PARAMS_FOR_SHELL_CONSTANTS[constant_param_name]
+        args_command.append("-e")
+        args_command.append(constant_param_name + '=' + str(constant_param_value))
+    return args_command

Review comment: @potiuk Do you think it's better to log all the env variables used during this docker-compose command?
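The reviewed helper builds the repeated `-e NAME=value` pairs by indexing the dicts; the same loop can be written with `dict.items()`, and the question raised in the review (logging the variables) could be handled with an optional flag. A minimal, self-contained sketch; the `verbose` parameter and the simplified `params` dict are illustrative, not the PR's actual API:

```python
from typing import Dict, List


def construct_env_args(params: Dict[str, str], verbose: bool = False) -> List[str]:
    """Build the repeated ``-e NAME=value`` argument list for docker-compose."""
    args: List[str] = []
    for name, value in params.items():  # .items() avoids the double dict lookup
        if verbose:
            print(f"docker-compose env: {name}={value}")
        args.extend(["-e", f"{name}={value}"])
    return args


print(construct_env_args({"BACKEND": "postgres", "SSH_PORT": "12322"}))
# → ['-e', 'BACKEND=postgres', '-e', 'SSH_PORT=12322']
```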
[GitHub] [airflow] sungpeo commented on pull request #21538: py files doesn't have to be checked is_zipfiles in process_files
sungpeo commented on pull request #21538: URL: https://github.com/apache/airflow/pull/21538#issuecomment-1037900877 Without extra changes, I rebased my PR branch. > The PR most likely needs to run full matrix of tests because it modifies parts of the core of Airflow. However, committers might decide to merge it quickly and take the risk. If they don't merge it quickly - please rebase it to the latest main at your convenience, or amend the last commit of the PR, and push it with --force-with-lease. -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org For queries about this service, please contact Infrastructure at: us...@infra.apache.org
[GitHub] [airflow] Bowrna commented on a change in pull request #21145: enter the shell breeze2 environment
Bowrna commented on a change in pull request #21145: URL: https://github.com/apache/airflow/pull/21145#discussion_r805291826

## File path: dev/breeze/src/airflow_breeze/shell/enter_shell.py (same `@@ -0,0 +1,176 @@` hunk as quoted in the review comment above)

Review comment: sure @potiuk I will first solve the issue by fixing it in os.environ and then handle the verbose case too. thanks
[GitHub] [airflow] besenthil commented on issue #21377: Databricks: add SQL endpoint operators
besenthil commented on issue #21377: URL: https://github.com/apache/airflow/issues/21377#issuecomment-1037626693 @dinowernli Is there a test stub that I can use for testing the API call?
[GitHub] [airflow] github-actions[bot] commented on pull request #18938: Jinja templates should be rendered in dict keys
github-actions[bot] commented on pull request #18938: URL: https://github.com/apache/airflow/pull/18938#issuecomment-1037598181 This pull request has been automatically marked as stale because it has not had recent activity. It will be closed in 5 days if no further activity occurs. Thank you for your contributions.
[GitHub] [airflow] ac1997 commented on pull request #21351: Add and use supports_celery attribute for cli celery command validation
ac1997 commented on pull request #21351: URL: https://github.com/apache/airflow/pull/21351#issuecomment-1037536124 Just want to resurface this PR... @potiuk Do you have any thoughts on this?
[GitHub] [airflow] ac1997 opened a new pull request #21541: Add str to xcom_pull task_ids typing
ac1997 opened a new pull request #21541: URL: https://github.com/apache/airflow/pull/21541 `TaskInstance.xcom_pull` supports both `str` and `list` but the typing has only `List` in it.
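A hedged sketch of what the widened annotation amounts to; this is a toy stand-in, not the actual `TaskInstance.xcom_pull` signature (which takes more parameters): accept `Union[str, Iterable[str]]` and normalize internally, since a bare string must be treated as one task id rather than an iterable of characters.

```python
from typing import Any, Iterable, List, Optional, Union


def xcom_pull(task_ids: Optional[Union[str, Iterable[str]]] = None,
              key: str = "return_value") -> List[Any]:
    """Toy stand-in: normalize a single task id or an iterable of ids."""
    if task_ids is None:
        return []
    if isinstance(task_ids, str):  # one task id, not an iterable of chars
        task_ids = [task_ids]
    return [f"{t}:{key}" for t in task_ids]


print(xcom_pull("push_task"))
# → ['push_task:return_value']
```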
[airflow] branch main updated (0a2d0d1 -> cca2f94)
This is an automated email from the ASF dual-hosted git repository. potiuk pushed a change to branch main in repository https://gitbox.apache.org/repos/asf/airflow.git.

from 0a2d0d1 Added template_ext = ('.json') to databricks operators #18925 (#21530)
add  cca2f94 Add mssql-cli to devel extra in Airflow (#21520)

No new revisions were added by this update.

Summary of changes:
 Dockerfile.ci                                      |  9 +
 ...nstall_pip_version.sh => install_pipx_tools.sh} | 22 --
 2 files changed, 21 insertions(+), 10 deletions(-)
 copy scripts/docker/{install_pip_version.sh => install_pipx_tools.sh} (57%)
[GitHub] [airflow] potiuk merged pull request #21520: Add mssql-cli to devel extra in Airflow
potiuk merged pull request #21520: URL: https://github.com/apache/airflow/pull/21520
[GitHub] [airflow] potiuk commented on issue #21443: Status of testing Providers that were prepared on February 09, 2022
potiuk commented on issue #21443: URL: https://github.com/apache/airflow/issues/21443#issuecomment-1037453412 Cool!
[GitHub] [airflow] thcidale0808 commented on pull request #21223: Added Feature: Create Pools from Env Vars.
thcidale0808 commented on pull request #21223: URL: https://github.com/apache/airflow/pull/21223#issuecomment-1037443067 @potiuk , please find below a proposed solution and answers to your questions:

## Solution:

Create a state file and a pool-config sync CLI and API to sync to the database and config. The state file will be the source of truth. When the user calls the config sync using the CLI or API, the core flow will be: [state file → DB (models), state file → conf (hot reload)]. The state file will be generated when Airflow is initiated, based on the config and env variables. The state file will be updated when the user creates or changes a pool using the current UI. For changes to the environment variables, the user will also need to call the sync process using the new API or CLI and pass the boolean argument --env-var. This will update the state file based on the env vars and then execute the core flow. Example: when the user creates or updates a pool using the UI or the importer, it will update the state file and then call the sync process.

## Questions:

Based on the solution above, please find below answers to your questions:

- I updated the environment value and the pool has not been updated? Why? Is it explained somewhere? How can I update it? R: Call the pool-config sync CLI or API passing the argument --env-var. This will update the state file and execute the core flow.
- I removed the pool from the environment but it has not been removed? Why? How can I remove the pool without using the CLI, UI, or API, since I could create one? R: Similar to the previous question. Call the pool-config sync CLI or API passing the argument --env-var.
- How can I rename a pool by changing the environment/config? R: For env variable changes, call the pool-config sync CLI or API passing the argument --env-var. This will update the state file and execute the core flow. Similarly, for a config change, call the sync using --config.
- How can I update a pool value without using the CLI, UI, or API? R: Using the current UI. This will also update the state file, which will be the source of truth for the sync process.
- If I already updated my pool value via DB, what happens if I change the value in the environment or conf? R: This will cause a sync issue. So upon executing the DAG that uses the pool, log a warning letting the user know that there's a mismatch between the DB and the state file, and asking them to change the values using the UI or env variables and execute the sync process.
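The core flow proposed above (state file as source of truth, refreshed from the environment only when the proposed --env-var flag is passed) can be sketched as a pure function. Every name here is hypothetical; this only illustrates the shape of the proposal:

```python
from typing import Dict


def sync_pools(state: Dict[str, int], env_pools: Dict[str, int],
               from_env: bool = False) -> Dict[str, int]:
    """Sketch of the proposed sync: the state file is the source of truth.

    With ``from_env`` (the proposed --env-var flag) the state is first
    refreshed from the environment, then pushed onward; the actual
    "state -> DB (models)" and "state -> conf" steps are stubbed out.
    """
    merged = dict(state)
    if from_env:
        merged.update(env_pools)  # env vars win only when explicitly requested
    # a real implementation would write ``merged`` to the DB and the conf here
    return merged


print(sync_pools({"default_pool": 128}, {"etl_pool": 10}, from_env=True))
# → {'default_pool': 128, 'etl_pool': 10}
```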
[GitHub] [airflow] pateash edited a comment on issue #21443: Status of testing Providers that were prepared on February 09, 2022
pateash edited a comment on issue #21443: URL: https://github.com/apache/airflow/issues/21443#issuecomment-1037411412 Hi @potiuk @eladkal , **apache-airflow-provider-github v1.0.0rc2** working as expected, I have validated against **airflow v2.2.3** https://user-images.githubusercontent.com/16856802/153725314-99d9525d-3676-4140-8442-bca9e9418958.png
[GitHub] [airflow] pateash removed a comment on issue #21443: Status of testing Providers that were prepared on February 09, 2022
pateash removed a comment on issue #21443: URL: https://github.com/apache/airflow/issues/21443#issuecomment-1037409252 > k int @potiuk, **apache-airflow-providers-GitHub** tested and working as expected. https://user-images.githubusercontent.com/16856802/153725246-74e5bcb2-92ac-40dc-82f1-f97fd36bb320.png
[GitHub] [airflow] pateash commented on issue #21443: Status of testing Providers that were prepared on February 09, 2022
pateash commented on issue #21443: URL: https://github.com/apache/airflow/issues/21443#issuecomment-1037411412 @potiuk @eladkal , **apache-airflow-provider-github v1.0.0rc2** working as expected, I have validated against **airflow v2.2.3** https://user-images.githubusercontent.com/16856802/153725314-99d9525d-3676-4140-8442-bca9e9418958.png
[GitHub] [airflow] pateash commented on issue #21443: Status of testing Providers that were prepared on February 09, 2022
pateash commented on issue #21443: URL: https://github.com/apache/airflow/issues/21443#issuecomment-1037409252 > k int @potiuk, **apache-airflow-providers-GitHub** tested and working as expected. https://user-images.githubusercontent.com/16856802/153725246-74e5bcb2-92ac-40dc-82f1-f97fd36bb320.png
[GitHub] [airflow] Acehaidrey commented on pull request #20733: Add Audit Log View to Dag View
Acehaidrey commented on pull request #20733: URL: https://github.com/apache/airflow/pull/20733#issuecomment-1037384199 Thank you so much! Glad I can help, and I had a lot of help from Sam + Brent here. I'll start the discussion in the Slack about the improvements discussed.
[GitHub] [airflow] hubert-pietron opened a new pull request #21540: Fix logging JDBC SQL error when task fails
hubert-pietron opened a new pull request #21540: URL: https://github.com/apache/airflow/pull/21540

closes: #16295
closes: #18482

The JDBC operator could not log errors when a task failed. For specific types of errors (jpype.java.sql.SQLException, jaydebeapi.DatabaseError, jaydebeapi.InterfaceError) this line https://github.com/apache/airflow/blob/0a2d0d1ecbb7a72677f96bc17117799ab40853e0/airflow/models/taskinstance.py#L1733 would produce an AttributeError:

```
[2022-02-12, 09:27:16 UTC] {logging_mixin.py:115} WARNING - Traceback (most recent call last):
[2022-02-12, 09:27:16 UTC] {logging_mixin.py:115} WARNING - File "/opt/***/***/models/taskinstance.py", line 1726, in handle_failure self.log.error("Task failed with exception", exc_info=error.__traceback__)
[2022-02-12, 09:27:16 UTC] {logging_mixin.py:115} WARNING - File "/usr/local/lib/python3.8/logging/__init__.py", line 1475, in error self._log(ERROR, msg, args, **kwargs)
[2022-02-12, 09:27:16 UTC] {logging_mixin.py:115} WARNING - File "/usr/local/lib/python3.8/logging/__init__.py", line 1589, in _log self.handle(record)
[2022-02-12, 09:27:16 UTC] {logging_mixin.py:115} WARNING - File "/usr/local/lib/python3.8/logging/__init__.py", line 1599, in handle self.callHandlers(record)
[2022-02-12, 09:27:16 UTC] {logging_mixin.py:115} WARNING - File "/usr/local/lib/python3.8/logging/__init__.py", line 1661, in callHandlers hdlr.handle(record)
[2022-02-12, 09:27:16 UTC] {logging_mixin.py:115} WARNING - File "/usr/local/lib/python3.8/logging/__init__.py", line 950, in handle rv = self.filter(record)
[2022-02-12, 09:27:16 UTC] {logging_mixin.py:115} WARNING - File "/usr/local/lib/python3.8/logging/__init__.py", line 811, in filter result = f.filter(record)
[2022-02-12, 09:27:16 UTC] {logging_mixin.py:115} WARNING - File "/opt/***/***/utils/log/secrets_masker.py", line 169, in filter self._redact_exception_with_context(exc)
[2022-02-12, 09:27:16 UTC] {logging_mixin.py:115} WARNING - File "/opt/***/***/utils/log/secrets_masker.py", line 150, in _redact_exception_with_context exception.args = (self.redact(v) for v in exception.args)
[2022-02-12, 09:27:16 UTC] {logging_mixin.py:115} WARNING - AttributeError: can't set attribute
```

Catching that error allows us to see a meaningful log when a JDBC task fails.

example DAG

```
from datetime import datetime, timedelta

from airflow import DAG
from airflow.providers.jdbc.operators.jdbc import JdbcOperator

with DAG(
    dag_id='jdbc_with_error',
    schedule_interval='0 0 * * *',
    start_date=datetime(2021, 1, 1),
    dagrun_timeout=timedelta(minutes=60),
    tags=['example'],
    catchup=False,
) as dag:
    start = JdbcOperator(
        task_id='test_task',
        sql='select 1 x y z',
        jdbc_conn_id='my_jdbc_connection',
        autocommit=True,
    )
```

log after change

```
*** Reading local file: /root/airflow/logs/jdbc_with_error/test_task/2022-02-12T15:38:46.534291+00:00/1.log
[2022-02-12, 15:38:47 UTC] {taskinstance.py:1052} INFO - Dependencies all met for
[2022-02-12, 15:38:47 UTC] {taskinstance.py:1052} INFO - Dependencies all met for
[2022-02-12, 15:38:47 UTC] {taskinstance.py:1249} INFO -
[2022-02-12, 15:38:47 UTC] {taskinstance.py:1250} INFO - Starting attempt 1 of 1
[2022-02-12, 15:38:47 UTC] {taskinstance.py:1251} INFO -
[2022-02-12, 15:38:47 UTC] {taskinstance.py:1270} INFO - Executing on 2022-02-12 15:38:46.534291+00:00
[2022-02-12, 15:38:47 UTC] {standard_task_runner.py:52} INFO - Started process 7755 to run task
[2022-02-12, 15:38:47 UTC] {standard_task_runner.py:76} INFO - Running: ['***', 'tasks', 'run', 'jdbc_with_error', 'test_task', 'manual__2022-02-12T15:38:46.534291+00:00', '--job-id', '245', '--raw', '--subdir', 'DAGS_FOLDER/example_dag.py', '--cfg-path', '/tmp/tmp7ryzrpgx', '--error-file', '/tmp/tmp0u3p6q8m']
[2022-02-12, 15:38:47 UTC] {standard_task_runner.py:77} INFO - Job 245: Subtask test_task
[2022-02-12, 15:38:47 UTC] {logging_mixin.py:115} INFO - Running on host 464ba29d61dc
[2022-02-12, 15:38:47 UTC] {logging_mixin.py:115} WARNING - /opt/***/***/models/taskinstance.py:830 DeprecationWarning: Passing 'execution_date' to 'XCom.clear()' is deprecated. Use 'run_id' instead.
[2022-02-12, 15:38:47 UTC] {logging_mixin.py:115} WARNING - /usr/local/lib/python3.8/site-packages/sqlalchemy/sql/coercions.py:518 SAWarning: Coercing Subquery object into a select() for use in IN(); please pass a select() construct explicitly
[2022-02-12, 15:38:48 UTC] {taskinstance.py:1442} INFO - Exporting the following env vars: AIRFLOW_CTX_DAG_OWNER=*** AIRF
```
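The traceback above shows `exception.args = (self.redact(v) for v in exception.args)` failing because some Java-bridged exception classes expose `args` as a read-only property. A minimal sketch of the defensive handling this PR describes; this is not the exact Airflow code, and the helper name is invented:

```python
def redact_exception_args(exception: BaseException, redact=lambda v: "***") -> None:
    """Redact exception args in place, skipping exceptions whose ``args``
    attribute cannot be reassigned (e.g. some jpype/jaydebeapi types)."""
    try:
        # assign a tuple rather than a generator so later formatting still works
        exception.args = tuple(redact(v) for v in exception.args)
    except AttributeError:
        # ``args`` is a read-only property on this class; leave it unredacted
        pass


exc = ValueError("password=hunter2")
redact_exception_args(exc)
print(exc.args)
# → ('***',)
```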
[GitHub] [airflow] xyu opened a new pull request #21539: Filter out default configs when overrides exist.
xyu opened a new pull request #21539: URL: https://github.com/apache/airflow/pull/21539

When sending configs to Airflow workers we materialize a temp config file. In #18772 a feature was added so that `_cmd`-generated secrets are not written to the files in some cases, instead favoring keeping the raw `_cmd` settings. Unfortunately, during materialization of the configs via `as_dict()`, Airflow defaults are generated and materialized as well, including defaults for the non-`_cmd` versions of some settings. And because Airflow setting precedence states that bare versions of settings win over `_cmd` versions, this results in `_cmd` settings being discarded: https://airflow.apache.org/docs/apache-airflow/stable/howto/set-config.html

This change checks `_cmd`, env, and secrets when materializing configs via `as_dict()` so that if the bare version of a value is exactly the same as the Airflow default, and we have "hidden" / special versions of these configs that are trying to be set, we remove the bare version so that the correct version can be used.

closes: #20092 related: #18772 #4050

---

**^ Add meaningful description above** Read the **[Pull Request Guidelines](https://github.com/apache/airflow/blob/main/CONTRIBUTING.rst#pull-request-guidelines)** for more information. In case of fundamental code change, an Airflow Improvement Proposal ([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals)) is needed. In case of a new dependency, check compliance with the [ASF 3rd Party License Policy](https://www.apache.org/legal/resolved.html#category-x). In case of backwards-incompatible changes please leave a note in [UPDATING.md](https://github.com/apache/airflow/blob/main/UPDATING.md).
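The fix described in #21539, dropping a bare option that merely echoes the built-in default when a `_cmd` variant of the same option exists, can be sketched with plain dicts. This is a hypothetical helper, not the real `AirflowConfigParser.as_dict()` change:

```python
from typing import Dict


def filter_default_shadows(config: Dict[str, Dict[str, str]],
                           defaults: Dict[str, Dict[str, str]]) -> Dict[str, Dict[str, str]]:
    """Remove bare options equal to the built-in default when a _cmd twin
    exists, so the _cmd setting keeps precedence on re-materialization."""
    cleaned = {}
    for section, options in config.items():
        kept = dict(options)
        for key in list(kept):
            if key.endswith("_cmd"):
                continue
            is_default = kept[key] == defaults.get(section, {}).get(key)
            if is_default and f"{key}_cmd" in kept:
                del kept[key]  # let the _cmd version win
        cleaned[section] = kept
    return cleaned


cfg = {"core": {"sql_alchemy_conn": "sqlite:///default.db",
                "sql_alchemy_conn_cmd": "cat /run/secrets/conn"}}
dfl = {"core": {"sql_alchemy_conn": "sqlite:///default.db"}}
print(filter_default_shadows(cfg, dfl))
# → {'core': {'sql_alchemy_conn_cmd': 'cat /run/secrets/conn'}}
```

A bare value that was explicitly overridden (i.e. differs from the default) is deliberately kept, preserving the documented precedence for user-set bare options.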
[GitHub] [airflow] potiuk closed pull request #21520: Add mssql-cli to devel extra in Airflow
potiuk closed pull request #21520: URL: https://github.com/apache/airflow/pull/21520
[GitHub] [airflow] potiuk commented on issue #21443: Status of testing Providers that were prepared on February 09, 2022
potiuk commented on issue #21443: URL: https://github.com/apache/airflow/issues/21443#issuecomment-1037340722 Still some chances for testing as we are waiting for votes. So feel free :)
[GitHub] [airflow] potiuk commented on pull request #21356: Standardize approach to dependencies
potiuk commented on pull request #21356: URL: https://github.com/apache/airflow/pull/21356#issuecomment-1037338257 I think that one is pretty much ready, if we are OK with introducing the policies as I proposed.
[GitHub] [airflow] potiuk edited a comment on pull request #21520: Add mssql-cli to devel extra in Airflow
potiuk edited a comment on pull request #21520: URL: https://github.com/apache/airflow/pull/21520#issuecomment-1037334254

Hey @ephraimbuddy - I found a way of adding the mssql-cli without impacting our dependencies (I am using pipx to install it). @uranusjr - re pipx - is it often the case that package scripts need some adjustments when installed by `pipx` to make them work (like in this case)? I guess it's because they use their own bash scripts rather than relying on an entrypoint? I had to do this to make it work:

```
# Unfortunately mssql-cli installed by `pipx` does not work out of the box because it uses
# its own execution bash script which is not compliant with the auto-activation of
# pipx venvs - we need to manually patch Python executable in the script to fix it: ¯\_(ツ)_/¯
sed "s/python /\/root\/\.local\/pipx\/venvs\/mssql-cli\/bin\/python /" -i /root/.local/bin/mssql-cli
```

The original script was this (which had no chance to run when it was installed by pipx):

```
#!/usr/bin/env bash
SOURCE="${BASH_SOURCE[0]}"
while [ -h "$SOURCE" ]; do # resolve $SOURCE until the file is no longer a symlink
  DIR="$( cd -P "$( dirname "$SOURCE" )" && pwd )"
  SOURCE="$(readlink "$SOURCE")"
  [[ $SOURCE != /* ]] && SOURCE="$DIR/$SOURCE" # if $SOURCE was a relative symlink, we need to resolve it relative to the path where the symlink file was located
done
DIR="$( cd -P "$( dirname "$SOURCE" )" && pwd )"

# Set the /root/.local/pipx/venvs/mssql-cli/bin/python io encoding to UTF-8 by default if not set.
if [ -z ${PYTHONIOENCODING+x} ]; then export PYTHONIOENCODING=UTF-8; fi
export PYTHONPATH="${DIR}:${PYTHONPATH}"
python -m mssqlcli.main "$@"
```
[GitHub] [airflow] potiuk commented on pull request #21520: Add mssql-cli to devel extra in Airflow
potiuk commented on pull request #21520: URL: https://github.com/apache/airflow/pull/21520#issuecomment-1037334254 Hey @ephraimbuddy - I found a way of adding the mssql-cli without impacting our dependencies (I am using pipx to install it). @uranusjr - re pipx - is it often the case that package scripts need some adjustments when installed by `pipx` to make them work (like in this case)? I guess it is because they use their own bash scripts rather than relying on an entrypoint? I had to do this to make it work:
```
# Unfortunately mssql-cli installed by `pipx` does not work out of the box because it uses
# its own execution bash script which is not compliant with the auto-activation of
# pipx venvs - we need to manually patch Python executable in the script to fix it: ¯\_(ツ)_/¯
sed "s/python /\/root\/\.local\/pipx\/venvs\/mssql-cli\/bin\/python /" -i /root/.local/bin/mssql-cli
```
The original script was this (which had no chance to run when it was installed by pipx):
```
#!/usr/bin/env bash
SOURCE="${BASH_SOURCE[0]}"
while [ -h "$SOURCE" ]; do # resolve $SOURCE until the file is no longer a symlink
  DIR="$( cd -P "$( dirname "$SOURCE" )" && pwd )"
  SOURCE="$(readlink "$SOURCE")"
  [[ $SOURCE != /* ]] && SOURCE="$DIR/$SOURCE" # if $SOURCE was a relative symlink, we need to resolve it relative to the path where the symlink file was located
done
DIR="$( cd -P "$( dirname "$SOURCE" )" && pwd )"
# Set the /root/.local/pipx/venvs/mssql-cli/bin/python io encoding to UTF-8 by default if not set.
if [ -z ${PYTHONIOENCODING+x} ]; then export PYTHONIOENCODING=UTF-8; fi
export PYTHONPATH="${DIR}:${PYTHONPATH}"
/root/.local/pipx/venvs/mssql-cli/bin/python -m mssqlcli.main "$@"
```
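The sed patch above can also be sketched in Python (the paths and the `patch_launcher` helper are assumptions taken from the comment, not Breeze code): the idea is simply to rewrite the pipx launcher so the bare `python` invocation resolves to the venv's own interpreter rather than whatever is first on PATH.

```python
# Sketch under assumed pipx layout: pipx installs mssql-cli's launcher into
# ~/.local/bin, but the launcher calls bare `python`, which resolves outside
# the venv. Rewriting the call to use the venv interpreter fixes the lookup.
from pathlib import Path

# Assumed pipx venv location for mssql-cli (hypothetical, from the comment above):
VENV_PYTHON = "/root/.local/pipx/venvs/mssql-cli/bin/python"


def patch_launcher(script: Path, venv_python: str = VENV_PYTHON) -> None:
    """Replace a line-leading bare `python ` call with the venv's interpreter."""
    text = script.read_text()
    # Anchoring on the newline avoids touching `python` mentions inside comments.
    script.write_text(text.replace("\npython ", "\n" + venv_python + " "))
```

Unlike the global sed substitution, anchoring on the start of the line leaves mentions of `python` inside comments untouched.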
[GitHub] [airflow] potiuk commented on a change in pull request #21145: enter the shell breeze2 environment
potiuk commented on a change in pull request #21145: URL: https://github.com/apache/airflow/pull/21145#discussion_r805176319

## File path: dev/breeze/src/airflow_breeze/shell/enter_shell.py

## @@ -0,0 +1,176 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+from pathlib import Path
+from typing import Dict, List
+
+from airflow_breeze import global_constants
+from airflow_breeze.cache import (
+    check_cache_and_write_if_not_cached,
+    read_from_cache_file,
+    write_to_cache_file,
+)
+from airflow_breeze.console import console
+from airflow_breeze.global_constants import (
+    FLOWER_HOST_PORT,
+    MSSQL_HOST_PORT,
+    MSSQL_VERSION,
+    MYSQL_HOST_PORT,
+    MYSQL_VERSION,
+    POSTGRES_HOST_PORT,
+    POSTGRES_VERSION,
+    REDIS_HOST_PORT,
+    SSH_PORT,
+    WEBSERVER_HOST_PORT,
+)
+from airflow_breeze.shell.shell_builder import ShellBuilder
+from airflow_breeze.utils.docker_command_utils import (
+    check_docker_compose_version,
+    check_docker_resources,
+    check_docker_version,
+)
+from airflow_breeze.utils.path_utils import BUILD_CACHE_DIR
+from airflow_breeze.utils.run_utils import (
+    filter_out_none,
+    instruct_for_setup,
+    md5sum_check_if_build_is_needed,
+    run_command,
+)
+from airflow_breeze.visuals import ASCIIART, ASCIIART_STYLE, CHEATSHEET, CHEATSHEET_STYLE
+
+PARAMS_TO_ENTER_SHELL = {
+    "HOST_USER_ID": "host_user_id",
+    "HOST_GROUP_ID": "host_group_id",
+    "COMPOSE_FILE": "compose_files",
+    "PYTHON_MAJOR_MINOR_VERSION": "python_version",
+    "BACKEND": "backend",
+    "AIRFLOW_VERSION": "airflow_version",
+    "INSTALL_AIRFLOW_VERSION": "install_airflow_version",
+    "AIRFLOW_SOURCES": "airflow_sources",
+    "AIRFLOW_CI_IMAGE": "airflow_ci_image_name",
+    "AIRFLOW_CI_IMAGE_WITH_TAG": "airflow_ci_image_name_with_tag",
+    "AIRFLOW_PROD_IMAGE": "airflow_prod_image_name",
+    "AIRFLOW_IMAGE_KUBERNETES": "airflow_image_kubernetes",
+    "SQLITE_URL": "sqlite_url",
+    "USE_AIRFLOW_VERSION": "use_airflow_version",
+    "SKIP_TWINE_CHECK": "skip_twine_check",
+    "USE_PACKAGES_FROM_DIST": "use_packages_from_dist",
+    "EXECUTOR": "executor",
+    "START_AIRFLOW": "start_airflow",
+    "ENABLED_INTEGRATIONS": "enabled_integrations",
+    "GITHUB_ACTIONS": "github_actions",
+    "ISSUE_ID": "issue_id",
+    "NUM_RUNS": "num_runs",
+    "VERSION_SUFFIX_FOR_SVN": "version_suffix_for_svn",
+    "VERSION_SUFFIX_FOR_PYPI": "version_suffix_for_pypi",
+}
+
+PARAMS_FOR_SHELL_CONSTANTS = {
+    "SSH_PORT": SSH_PORT,
+    "WEBSERVER_HOST_PORT": WEBSERVER_HOST_PORT,
+    "FLOWER_HOST_PORT": FLOWER_HOST_PORT,
+    "REDIS_HOST_PORT": REDIS_HOST_PORT,
+    "MYSQL_HOST_PORT": MYSQL_HOST_PORT,
+    "MYSQL_VERSION": MYSQL_VERSION,
+    "MSSQL_HOST_PORT": MSSQL_HOST_PORT,
+    "MSSQL_VERSION": MSSQL_VERSION,
+    "POSTGRES_HOST_PORT": POSTGRES_HOST_PORT,
+    "POSTGRES_VERSION": POSTGRES_VERSION,
+}
+
+PARAMS_IN_CACHE = {
+    'python_version': 'PYTHON_MAJOR_MINOR_VERSION',
+    'backend': 'BACKEND',
+    'executor': 'EXECUTOR',
+    'postgres_version': 'POSTGRES_VERSION',
+    'mysql_version': 'MYSQL_VERSION',
+    'mssql_version': 'MSSQL_VERSION',
+}
+
+DEFAULT_VALUES_FOR_PARAM = {
+    'python_version': 'DEFAULT_PYTHON_MAJOR_MINOR_VERSION',
+    'backend': 'DEFAULT_BACKEND',
+    'executor': 'DEFAULT_EXECUTOR',
+    'postgres_version': 'POSTGRES_VERSION',
+    'mysql_version': 'MYSQL_VERSION',
+    'mssql_version': 'MSSQL_VERSION',
+}
+
+
+def construct_arguments_docker_compose_command(shell_params: ShellBuilder) -> List[str]:
+    args_command = []
+    for param_name in PARAMS_TO_ENTER_SHELL:
+        param_value = PARAMS_TO_ENTER_SHELL[param_name]
+        args_command.append("-e")
+        args_command.append(param_name + '=' + str(getattr(shell_params, param_value)))
+    for constant_param_name in PARAMS_FOR_SHELL_CONSTANTS:
+        constant_param_value = PARAMS_FOR_SHELL_CONSTANTS[constant_param_name]
+        args_command.append("-e")
+        args_command.append(constant_param_name + '=' + str(constant_param_value))
+    return args_command
+

Review comment: The problem is that `docker-compose up` does not handle `-e` flag :( See here: https://stackoverflow.com/q
[GitHub] [airflow] github-actions[bot] commented on pull request #21538: py files doesn't have to be checked is_zipfiles in process_files
github-actions[bot] commented on pull request #21538: URL: https://github.com/apache/airflow/pull/21538#issuecomment-1037266688 The PR most likely needs to run full matrix of tests because it modifies parts of the core of Airflow. However, committers might decide to merge it quickly and take the risk. If they don't merge it quickly - please rebase it to the latest main at your convenience, or amend the last commit of the PR, and push it with --force-with-lease.
[GitHub] [airflow] Bowrna commented on a change in pull request #21145: enter the shell breeze2 environment
Bowrna commented on a change in pull request #21145: URL: https://github.com/apache/airflow/pull/21145#discussion_r805174798

## File path: dev/breeze/src/airflow_breeze/shell/enter_shell.py

## @@ -0,0 +1,176 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+from pathlib import Path
+from typing import Dict, List
+
+from airflow_breeze import global_constants
+from airflow_breeze.cache import (
+    check_cache_and_write_if_not_cached,
+    read_from_cache_file,
+    write_to_cache_file,
+)
+from airflow_breeze.console import console
+from airflow_breeze.global_constants import (
+    FLOWER_HOST_PORT,
+    MSSQL_HOST_PORT,
+    MSSQL_VERSION,
+    MYSQL_HOST_PORT,
+    MYSQL_VERSION,
+    POSTGRES_HOST_PORT,
+    POSTGRES_VERSION,
+    REDIS_HOST_PORT,
+    SSH_PORT,
+    WEBSERVER_HOST_PORT,
+)
+from airflow_breeze.shell.shell_builder import ShellBuilder
+from airflow_breeze.utils.docker_command_utils import (
+    check_docker_compose_version,
+    check_docker_resources,
+    check_docker_version,
+)
+from airflow_breeze.utils.path_utils import BUILD_CACHE_DIR
+from airflow_breeze.utils.run_utils import (
+    filter_out_none,
+    instruct_for_setup,
+    md5sum_check_if_build_is_needed,
+    run_command,
+)
+from airflow_breeze.visuals import ASCIIART, ASCIIART_STYLE, CHEATSHEET, CHEATSHEET_STYLE
+
+PARAMS_TO_ENTER_SHELL = {
+    "HOST_USER_ID": "host_user_id",
+    "HOST_GROUP_ID": "host_group_id",
+    "COMPOSE_FILE": "compose_files",
+    "PYTHON_MAJOR_MINOR_VERSION": "python_version",
+    "BACKEND": "backend",
+    "AIRFLOW_VERSION": "airflow_version",
+    "INSTALL_AIRFLOW_VERSION": "install_airflow_version",
+    "AIRFLOW_SOURCES": "airflow_sources",
+    "AIRFLOW_CI_IMAGE": "airflow_ci_image_name",
+    "AIRFLOW_CI_IMAGE_WITH_TAG": "airflow_ci_image_name_with_tag",
+    "AIRFLOW_PROD_IMAGE": "airflow_prod_image_name",
+    "AIRFLOW_IMAGE_KUBERNETES": "airflow_image_kubernetes",
+    "SQLITE_URL": "sqlite_url",
+    "USE_AIRFLOW_VERSION": "use_airflow_version",
+    "SKIP_TWINE_CHECK": "skip_twine_check",
+    "USE_PACKAGES_FROM_DIST": "use_packages_from_dist",
+    "EXECUTOR": "executor",
+    "START_AIRFLOW": "start_airflow",
+    "ENABLED_INTEGRATIONS": "enabled_integrations",
+    "GITHUB_ACTIONS": "github_actions",
+    "ISSUE_ID": "issue_id",
+    "NUM_RUNS": "num_runs",
+    "VERSION_SUFFIX_FOR_SVN": "version_suffix_for_svn",
+    "VERSION_SUFFIX_FOR_PYPI": "version_suffix_for_pypi",
+}
+
+PARAMS_FOR_SHELL_CONSTANTS = {
+    "SSH_PORT": SSH_PORT,
+    "WEBSERVER_HOST_PORT": WEBSERVER_HOST_PORT,
+    "FLOWER_HOST_PORT": FLOWER_HOST_PORT,
+    "REDIS_HOST_PORT": REDIS_HOST_PORT,
+    "MYSQL_HOST_PORT": MYSQL_HOST_PORT,
+    "MYSQL_VERSION": MYSQL_VERSION,
+    "MSSQL_HOST_PORT": MSSQL_HOST_PORT,
+    "MSSQL_VERSION": MSSQL_VERSION,
+    "POSTGRES_HOST_PORT": POSTGRES_HOST_PORT,
+    "POSTGRES_VERSION": POSTGRES_VERSION,
+}
+
+PARAMS_IN_CACHE = {
+    'python_version': 'PYTHON_MAJOR_MINOR_VERSION',
+    'backend': 'BACKEND',
+    'executor': 'EXECUTOR',
+    'postgres_version': 'POSTGRES_VERSION',
+    'mysql_version': 'MYSQL_VERSION',
+    'mssql_version': 'MSSQL_VERSION',
+}
+
+DEFAULT_VALUES_FOR_PARAM = {
+    'python_version': 'DEFAULT_PYTHON_MAJOR_MINOR_VERSION',
+    'backend': 'DEFAULT_BACKEND',
+    'executor': 'DEFAULT_EXECUTOR',
+    'postgres_version': 'POSTGRES_VERSION',
+    'mysql_version': 'MYSQL_VERSION',
+    'mssql_version': 'MSSQL_VERSION',
+}
+
+
+def construct_arguments_docker_compose_command(shell_params: ShellBuilder) -> List[str]:
+    args_command = []
+    for param_name in PARAMS_TO_ENTER_SHELL:
+        param_value = PARAMS_TO_ENTER_SHELL[param_name]
+        args_command.append("-e")
+        args_command.append(param_name + '=' + str(getattr(shell_params, param_value)))
+    for constant_param_name in PARAMS_FOR_SHELL_CONSTANTS:
+        constant_param_value = PARAMS_FOR_SHELL_CONSTANTS[constant_param_name]
+        args_command.append("-e")
+        args_command.append(constant_param_name + '=' + str(constant_param_value))
+    return args_command
+

Review comment: @potiuk this is how I tried to implement running docker-compose command with `-e` flags.
[GitHub] [airflow] potiuk commented on pull request #21145: enter the shell breeze2 environment
potiuk commented on pull request #21145: URL: https://github.com/apache/airflow/pull/21145#issuecomment-1037260947 > @potiuk I am trying to run docker-compose command via subprocess and I face error when I pass env variables as -e. It works properly when set as os.environ variable and executing the docker-compose command. But it didn't work that way when I tried via `-e` flag. I will check if I could fix it. How do you pass them?
[GitHub] [airflow] Bowrna commented on pull request #21145: enter the shell breeze2 environment
Bowrna commented on pull request #21145: URL: https://github.com/apache/airflow/pull/21145#issuecomment-1037259920 @potiuk I am trying to run docker-compose command via subprocess and I face error when I pass env variables as -e. It works properly when set as os.environ variable and executing the docker-compose command. But it didn't work that way when I tried via `-e` flag. I will check if I could fix it.
[GitHub] [airflow] boring-cyborg[bot] commented on pull request #21538: py files doesn't have to be checked is_zipfiles in process_files
boring-cyborg[bot] commented on pull request #21538: URL: https://github.com/apache/airflow/pull/21538#issuecomment-1037251194 Congratulations on your first Pull Request and welcome to the Apache Airflow community! If you have any issues or are unsure about anything, please check our Contribution Guide (https://github.com/apache/airflow/blob/main/CONTRIBUTING.rst). Here are some useful points:
- Pay attention to the quality of your code (flake8, mypy and type annotations). Our [pre-commits](https://github.com/apache/airflow/blob/main/STATIC_CODE_CHECKS.rst#prerequisites-for-pre-commit-hooks) will help you with that.
- In case of a new feature add useful documentation (in docstrings or in `docs/` directory). Adding a new operator? Check this short [guide](https://github.com/apache/airflow/blob/main/docs/apache-airflow/howto/custom-operator.rst). Consider adding an example DAG that shows how users should use it.
- Consider using [Breeze environment](https://github.com/apache/airflow/blob/main/BREEZE.rst) for testing locally; it's a heavy docker image but it ships with a working Airflow and a lot of integrations.
- Be patient and persistent. It might take some time to get a review or get the final approval from Committers.
- Please follow [ASF Code of Conduct](https://www.apache.org/foundation/policies/conduct) for all communication including (but not limited to) comments on Pull Requests, Mailing list and Slack.
- Be sure to read the [Airflow Coding style](https://github.com/apache/airflow/blob/main/CONTRIBUTING.rst#coding-style-and-best-practices).
Apache Airflow is a community-driven project and together we are making it better 🚀. In case of doubts contact the developers at: Mailing List: d...@airflow.apache.org Slack: https://s.apache.org/airflow-slack
[GitHub] [airflow] sungpeo opened a new pull request #21538: py files doesn't have to be checked is_zipfiles in process_files
sungpeo opened a new pull request #21538: URL: https://github.com/apache/airflow/pull/21538 .py files don't have to be checked with is_zipfile in process_file, just like in find_dag_file_paths in the same file: https://github.com/apache/airflow/blob/0a2d0d1ecbb7a72677f96bc17117799ab40853e0/airflow/utils/file.py#L192-L201 zipfile.is_zipfile could take longer than anticipated in case of a remote file mount (DAG_DIR). So I want .py files (generally almost all DAG files are .py) to skip the is_zipfile check. (It is a simple change; existing tests cover it.) --- **^ Add meaningful description above** Read the **[Pull Request Guidelines](https://github.com/apache/airflow/blob/main/CONTRIBUTING.rst#pull-request-guidelines)** for more information. In case of fundamental code change, Airflow Improvement Proposal ([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals)) is needed. In case of a new dependency, check compliance with the [ASF 3rd Party License Policy](https://www.apache.org/legal/resolved.html#category-x). In case of backwards incompatible changes please leave a note in [UPDATING.md](https://github.com/apache/airflow/blob/main/UPDATING.md).
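The short-circuit proposed above can be sketched as follows (the function name is hypothetical; the real change is in `airflow/utils/file.py`): a plain `.py` path is answered from the file name alone, so `zipfile.is_zipfile()` never opens the, possibly remote-mounted, file.

```python
# Sketch: skip the zipfile magic-number probe for plain .py files, since
# is_zipfile() must open and read the file, which can be slow on a remote
# mount. A .py DAG file is decided purely from its extension.
import zipfile


def might_be_zipped_dag_file(file_path: str) -> bool:
    """True if the path needs zip handling; .py files skip the probe entirely."""
    if file_path.endswith(".py"):
        return False  # no filesystem access at all for the common case
    return zipfile.is_zipfile(file_path)
```

Note the function returns without touching the filesystem for `.py` paths, which is exactly where the latency saving on a remote DAG_DIR comes from.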
[airflow] branch main updated (5590e98 -> 0a2d0d1)
This is an automated email from the ASF dual-hosted git repository. potiuk pushed a change to branch main in repository https://gitbox.apache.org/repos/asf/airflow.git. from 5590e98 Add Audit Log View to Dag View (#20733) add 0a2d0d1 Added template_ext = ('.json') to databricks operators #18925 (#21530) No new revisions were added by this update. Summary of changes: airflow/providers/databricks/operators/databricks.py | 2 ++ 1 file changed, 2 insertions(+)
[GitHub] [airflow] potiuk closed issue #18925: Add template_ext = ('.json') to databricks operators
potiuk closed issue #18925: URL: https://github.com/apache/airflow/issues/18925
[GitHub] [airflow] boring-cyborg[bot] commented on pull request #21530: Added template_ext = ('.json') to databricks operators #18925
boring-cyborg[bot] commented on pull request #21530: URL: https://github.com/apache/airflow/pull/21530#issuecomment-1037217272 Awesome work, congrats on your first merged pull request!
[GitHub] [airflow] potiuk merged pull request #21530: Added template_ext = ('.json') to databricks operators #18925
potiuk merged pull request #21530: URL: https://github.com/apache/airflow/pull/21530
[airflow] branch main updated (a08140e -> 5590e98)
This is an automated email from the ASF dual-hosted git repository. potiuk pushed a change to branch main in repository https://gitbox.apache.org/repos/asf/airflow.git. from a08140e fixup! Prepare to switch to debian bullseye (#21522) (#21536) add 5590e98 Add Audit Log View to Dag View (#20733) No new revisions were added by this update. Summary of changes: airflow/config_templates/config.yml | 18 + airflow/config_templates/default_airflow.cfg | 11 +++ airflow/www/templates/airflow/dag.html | 3 + airflow/www/templates/airflow/dag_audit_log.html | 89 airflow/www/views.py | 35 ++ tests/www/views/test_views_home.py | 6 ++ 6 files changed, 162 insertions(+) create mode 100644 airflow/www/templates/airflow/dag_audit_log.html
[GitHub] [airflow] potiuk commented on pull request #20733: Add Audit Log View to Dag View
potiuk commented on pull request #20733: URL: https://github.com/apache/airflow/pull/20733#issuecomment-1037216950 > Finally thank you @potiuk it is good :) Do I need to do anything else for this to be merged? Nope. Just merged! Great job. Audit log view has been a highly requested feature!
[GitHub] [airflow] potiuk merged pull request #20733: Add Audit Log View to Dag View
potiuk merged pull request #20733: URL: https://github.com/apache/airflow/pull/20733
[GitHub] [airflow] boring-cyborg[bot] commented on issue #21537: airflow/providers/google/cloud/transfers/sql_to_gcs.py partition parquet files
boring-cyborg[bot] commented on issue #21537: URL: https://github.com/apache/airflow/issues/21537#issuecomment-1037135852 Thanks for opening your first issue here! Be sure to follow the issue template!
[GitHub] [airflow] manlouk opened a new issue #21537: airflow/providers/google/cloud/transfers/sql_to_gcs.py partition parquet files
manlouk opened a new issue #21537: URL: https://github.com/apache/airflow/issues/21537
### Description
Add the ability to partition parquet files by columns. Right now you can partition files only by size.
### Use case/motivation
_No response_
### Related issues
_No response_
### Are you willing to submit a PR?
- [X] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
[GitHub] [airflow] pierrejeambrun edited a comment on pull request #20386: Add support for BeamGoPipelineOperator
pierrejeambrun edited a comment on pull request #20386: URL: https://github.com/apache/airflow/pull/20386#issuecomment-1037129646
[GitHub] [airflow] pierrejeambrun commented on pull request #20386: Add support for BeamGoPipelineOperator
pierrejeambrun commented on pull request #20386: URL: https://github.com/apache/airflow/pull/20386#issuecomment-1037129646 > Unfortunately - static checks you will have to fix by pushing new code I am afraid :smile: @potiuk that new docstring-param hook got me I guess :joy: I just pushed a fix for that
[GitHub] [airflow] potiuk commented on pull request #21378: Switch to Debian 11 (bullseye) as base for our dockerfiles
potiuk commented on pull request #21378: URL: https://github.com/apache/airflow/pull/21378#issuecomment-1037097959 Unfortunately we cannot move to bullseye (yet). The problem is that the MSSQL ODBC driver (which is the only "reliable" and stable driver for MSSQL for SQLAlchemy) does not support Debian Bullseye yet. This has been documented and commented on by the MSSQL driver maintainers:
- https://github.com/MicrosoftDocs/sql-docs/issues/6494
- https://github.com/MicrosoftDocs/sql-docs/issues/6789
- https://github.com/MicrosoftDocs/sql-docs/issues/6804
- https://github.com/MicrosoftDocs/sql-docs/issues/7255
The answer given in all those issues on 28 Jan 2022:
> Debian 11 installation instructions will be added when msodbcsql17 supports Debian 11.
> Regards,
> David
Let's hope it will be soon. This is the last blocker for migrating to Bullseye. All the rest seems to work. I asked the maintainers for the expected timeline: https://github.com/MicrosoftDocs/sql-docs/issues/7255#issuecomment-1037097131
[GitHub] [airflow] potiuk edited a comment on issue #18190: Docker image - Migrate to 3.x-slim-bullseye from 3.x-slim-buster
potiuk edited a comment on issue #18190: URL: https://github.com/apache/airflow/issues/18190#issuecomment-1037097619 I asked the maintainers for the expected timeline: https://github.com/MicrosoftDocs/sql-docs/issues/7255#issuecomment-1037097131
[GitHub] [airflow] potiuk commented on issue #18190: Docker image - Migrate to 3.x-slim-bullseye from 3.x-slim-buster
potiuk commented on issue #18190: URL: https://github.com/apache/airflow/issues/18190#issuecomment-1037097619 I asked the maintainer for the expected timeline: https://github.com/MicrosoftDocs/sql-docs/issues/7255#issuecomment-1037097131
[GitHub] [airflow] uranusjr edited a comment on issue #20991: Elasticsearch remote log will not fetch task logs from manual dagruns before 2.2 upgrade
uranusjr edited a comment on issue #20991: URL: https://github.com/apache/airflow/issues/20991#issuecomment-1037092707 The core “fix” is actually overridden when the ElasticSearch provider’s implementation is used, so just releasing the provider would be enough to resolve the title issue in Airflow 2.2.3. No-one has complained about the core implementation yet, but this can still be in 2.2.4 to fix the issue pre-emptively.
[GitHub] [airflow] uranusjr commented on issue #20991: Elasticsearch remote log will not fetch task logs from manual dagruns before 2.2 upgrade
uranusjr commented on issue #20991: URL: https://github.com/apache/airflow/issues/20991#issuecomment-1037092707 The core “fix” is actually overridden when the ElasticSearch provider’s implementation is used, so just releasing the provider would be enough to resolve the issue in Airflow 2.2.3. No-one has complained about the core implementation yet, but this can still be in 2.2.4 to fix the issue pre-emptively.
[GitHub] [airflow] potiuk commented on issue #18190: Docker image - Migrate to 3.x-slim-bullseye from 3.x-slim-buster
potiuk commented on issue #18190: URL: https://github.com/apache/airflow/issues/18190#issuecomment-1037085507 Unfortunately we cannot move to bullseye (yet). The problem is that the MSSQL ODBC driver (which is the only "reliable" and stable driver for MSSQL for SQLAlchemy) does not support Debian Bullseye yet. This has been documented and commented on by the MSSQL driver maintainers:
- https://github.com/MicrosoftDocs/sql-docs/issues/6494
- https://github.com/MicrosoftDocs/sql-docs/issues/6789
- https://github.com/MicrosoftDocs/sql-docs/issues/6804
- https://github.com/MicrosoftDocs/sql-docs/issues/7255
The answer given in all those issues on 28 Jan 2022:
> Debian 11 installation instructions will be added when msodbcsql17 supports Debian 11.
> Regards,
> David
Let's hope it will be soon. This is the last blocker for migrating to Bullseye. All the rest seems to work.
[airflow] branch main updated (44bd211 -> a08140e)
This is an automated email from the ASF dual-hosted git repository. potiuk pushed a change to branch main in repository https://gitbox.apache.org/repos/asf/airflow.git. from 44bd211 Use compat data interval shim in log handlers (#21289) add a08140e fixup! Prepare to switch to debian bullseye (#21522) (#21536) No new revisions were added by this update. Summary of changes: scripts/docker/install_mysql.sh | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-)
[GitHub] [airflow] potiuk merged pull request #21536: fixup! Prepare to switch to debian bullseye (#21522)
potiuk merged pull request #21536: URL: https://github.com/apache/airflow/pull/21536
[GitHub] [airflow] Acehaidrey commented on pull request #20733: Add Audit Log View to Dag View
Acehaidrey commented on pull request #20733: URL: https://github.com/apache/airflow/pull/20733#issuecomment-1037056388 Finally thank you @potiuk it is good :) Do I need to do anything else for this to be merged?