Re: [PR] Allow users to write dag_id and task_id in their national characters, added display name for dag / task (v2) [airflow]
uranusjr commented on code in PR #38446: URL: https://github.com/apache/airflow/pull/38446#discussion_r1538674841

## airflow/api_connexion/schemas/dag_schema.py: ##

@@ -50,6 +50,7 @@ class Meta:
         model = DagModel
     dag_id = auto_field(dump_only=True)
+    dag_display_name = fields.Method("get_dag_display_name", dump_only=True)

Review Comment: Can this not just be an `auto_field` or `fields.String`? There’s no special logic needed here. Same for `task_display_name` below.

-- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org For queries about this service, please contact Infrastructure at: us...@infra.apache.org
Re: [PR] Support multiple Docker hosts in `docker_url` attribute. [airflow]
Taragolis commented on code in PR #38466: URL: https://github.com/apache/airflow/pull/38466#discussion_r1538673076

## airflow/providers/docker/operators/docker.py: ##

@@ -343,13 +345,20 @@ def hook(self) -> DockerHook:
             assert_hostname=self.tls_hostname,
             ssl_version=self.tls_ssl_version,
         )
-        return DockerHook(
-            docker_conn_id=self.docker_conn_id,
-            base_url=self.docker_url,
-            version=self.api_version,
-            tls=tls_config,
-            timeout=self.timeout,
-        )
+        for url in self.docker_url:
+            hook = DockerHook(
+                docker_conn_id=self.docker_conn_id,
+                base_url=url,
+                version=self.api_version,
+                tls=tls_config,
+                timeout=self.timeout,
+            )
+            try:
+                hook.get_conn()
+                return hook
+            except Exception as e:
+                self.log.error("Failed to establish connection to Docker host %s: %s", url, e)
+        raise Exception("Failed to establish connection to Docker host.")

Review Comment: Do not raise a bare `Exception`; it might become a problem in the future, because some tests would then have to catch `Exception`, which is too broad. Better to use `AirflowException` here, which is also broad, but at least it is an Airflow-specific error.
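The pattern under discussion, trying each host in turn and raising an Airflow-specific error only after every host has failed, can be sketched standalone. This is a minimal sketch of the suggestion, not the PR's actual code: `AirflowException` is stubbed here (in real code it comes from `airflow.exceptions`), and `fake_connect` stands in for `DockerHook.get_conn()`.

```python
class AirflowException(Exception):
    """Stand-in for airflow.exceptions.AirflowException (assumption for this sketch)."""


def first_reachable(urls, connect):
    """Return the first URL for which `connect(url)` succeeds."""
    for url in urls:
        try:
            connect(url)
            return url
        except Exception as exc:
            # Log the failure and fall through to the next candidate host.
            print(f"Failed to establish connection to Docker host {url}: {exc}")
    # Raised only after the loop has exhausted every host.
    raise AirflowException("Failed to establish connection to any Docker host.")


def fake_connect(url):
    """Hypothetical connect function: only one socket path is 'reachable'."""
    if url != "unix://var/run/docker.sock":
        raise ConnectionError("unreachable")


print(first_reachable(["tcp://broken:2375", "unix://var/run/docker.sock"], fake_connect))
```

Because `AirflowException` subclasses `Exception`, callers that must catch the failure can do so without a blanket `except Exception`, which is the point of the review comment.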
Re: [PR] Support multiple Docker hosts in `docker_url` attribute. [airflow]
Taragolis commented on code in PR #38466: URL: https://github.com/apache/airflow/pull/38466#discussion_r1538671107

## airflow/providers/docker/operators/docker.py: ##

@@ -343,13 +345,21 @@ def hook(self) -> DockerHook:
             assert_hostname=self.tls_hostname,
             ssl_version=self.tls_ssl_version,
         )
-        return DockerHook(
-            docker_conn_id=self.docker_conn_id,
-            base_url=self.docker_url,
-            version=self.api_version,
-            tls=tls_config,
-            timeout=self.timeout,
-        )
+        hook = None
+        for url in self.docker_url:
+            hook = DockerHook(
+                docker_conn_id=self.docker_conn_id,
+                base_url=url,
+                version=self.api_version,
+                tls=tls_config,
+                timeout=self.timeout
+            )
+            try:
+                hook.get_conn()

Review Comment: Yeah, I guess `hook.api_client.ping()` would be enough:

```python
from docker import APIClient

client = APIClient("unix://var/run/docker.sock", version="1.30")
print(client.ping())

try:
    client = APIClient("unix://foo/bar/spam.egg", version="1.30")
    if not client.ping():
        msg = "Unable to retrieve success response on ping docker daemon"
        raise ConnectionError(msg)
except Exception as ex:
    print("NOPE")
```

Maybe it is even better to add it into the hook, right after the client is created: https://github.com/apache/airflow/blob/c92b8db0d40672e6dd7e8cf064193e4371fcb2e8/airflow/providers/docker/hooks/docker.py#L149-L152
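The hook-side variant suggested at the end, pinging the daemon right after the client is created so a dead host fails fast, might look roughly like this. A sketch under stated assumptions: `FakeAPIClient` is a stand-in for `docker.APIClient` so the example runs without a Docker daemon, and `create_client` stands in for the hook's client-creation step.

```python
class FakeAPIClient:
    """Hypothetical stand-in for docker.APIClient; `healthy` simulates daemon state."""

    def __init__(self, base_url, healthy=True):
        self.base_url = base_url
        self._healthy = healthy

    def ping(self):
        # docker.APIClient.ping() returns True when the daemon answers.
        return self._healthy


def create_client(base_url, healthy=True):
    """Create a client and verify the daemon responds before returning it."""
    client = FakeAPIClient(base_url, healthy=healthy)
    # Fail fast, mirroring the snippet in the review comment above.
    if not client.ping():
        msg = "Unable to retrieve success response on ping docker daemon"
        raise ConnectionError(msg)
    return client


print(create_client("unix://var/run/docker.sock").base_url)
```

With the check inside the hook, every caller gets the fail-fast behaviour for free instead of each operator repeating the ping.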
Re: [PR] fixing the timezone in next run id info [airflow]
Bowrna commented on code in PR #38482: URL: https://github.com/apache/airflow/pull/38482#discussion_r1538667087

## airflow/www/templates/airflow/dag.html: ##

@@ -151,7 +151,8 @@
-Next Run ID: {{ dag_model.next_dagrun }}
+Next Run ID: {{ dag_model.next_dagrun }}
+ {{ dag_model.next_dagrun.tzinfo }}

Review Comment: The above image is during load; below I am attaching the image that shows up after the page has loaded: https://github.com/apache/airflow/assets/10162465/ddab1af1-adc7-4e48-84a0-a59dfd39d950
Re: [PR] fixing the timezone in next run id info [airflow]
Bowrna commented on code in PR #38482: URL: https://github.com/apache/airflow/pull/38482#discussion_r153852

## airflow/www/templates/airflow/dag.html: ##

@@ -151,7 +151,8 @@
-Next Run ID: {{ dag_model.next_dagrun }}
+Next Run ID: {{ dag_model.next_dagrun }}
+ {{ dag_model.next_dagrun.tzinfo }}

Review Comment: @bbovenzi @pierrejeambrun With the fix added in dag.html, I can see that the time has timezone info (both the text UTC and the numerical +00:00 representation) on loading, but after loading it goes back to the old state. Am I missing something? https://github.com/apache/airflow/assets/10162465/cc7d4882-c6f6-4c99-8758-68869eec6217
Re: [PR] Allow users to write dag_id and task_id in their national characters, added display name for dag / task (v2) [airflow]
jscheffl commented on PR #38446: URL: https://github.com/apache/airflow/pull/38446#issuecomment-2019508432

> One UI bug
>
> Also, we should include display_name in the REST API for GET `/dag/{dag_id}` and `/dag/{dag_id}/details`

Please re-review. I have now extended the schema definition in the API for DAG and Task. This is the first time I have changed something in the API, so please let me know if this was the "right" way. Docs on this were missing in the contribution guide... but the fields seem to be showing up locally now.
Re: [PR] fixing the timezone in next run id info [airflow]
Bowrna commented on code in PR #38482: URL: https://github.com/apache/airflow/pull/38482#discussion_r1538663724

## airflow/www/templates/airflow/dag.html: ##

@@ -151,7 +151,8 @@
-Next Run ID: {{ dag_model.next_dagrun }}
+Next Run ID: {{ dag_model.next_dagrun }}
+ {{ dag_model.next_dagrun.tzinfo }}

Review Comment: @bbovenzi @pierrejeambrun With the fix added in dag.html, I can see that the time has timezone info (both the text `UTC` and the numerical `+00:00` representation) on loading, but after loading it goes back to the old state. Am I missing something?
Re: [PR] Allow users to write dag_id and task_id in their national characters, added display name for dag / task (v2) [airflow]
jscheffl commented on code in PR #38446: URL: https://github.com/apache/airflow/pull/38446#discussion_r1538663599

## airflow/www/static/js/dag/details/Header.tsx: ##

@@ -105,7 +106,7 @@ const Header = () => {
     onClick={clearSelection}
     _hover={isDagDetails ? { cursor: "default" } : undefined}
 >
-
+

Review Comment: And... yes, the model always falls back to the task ID / DAG ID if the display name is not present.
Re: [PR] Allow users to write dag_id and task_id in their national characters, added display name for dag / task (v2) [airflow]
jscheffl commented on code in PR #38446: URL: https://github.com/apache/airflow/pull/38446#discussion_r1538663273

## airflow/www/static/js/dag/details/Header.tsx: ##

@@ -105,7 +106,7 @@ const Header = () => {
     onClick={clearSelection}
     _hover={isDagDetails ? { cursor: "default" } : undefined}
 >
-
+

Review Comment: Thanks! I did not notice it, or was this a late change from your rework over the last days? Anyway, thanks... fixed!
Re: [PR] Add method to get metadata from GCS blob in GCSHook [airflow]
amoghrajesh commented on code in PR #38398: URL: https://github.com/apache/airflow/pull/38398#discussion_r1538658019

## tests/providers/google/cloud/hooks/test_gcs.py: ##

@@ -565,6 +565,20 @@ def test_object_get_md5hash(self, mock_service):
         assert response == returned_file_md5hash

+    @mock.patch(GCS_STRING.format("GCSHook.get_conn"))
+    def test_object_get_metadata(self, mock_service):
+        test_bucket = "test_bucket"
+        test_object = "test_object"
+        returned_file_metadata = {"test_metadata_key": "test_metadata_val"}
+
+        bucket_method = mock_service.return_value.bucket
+        get_blob_method = bucket_method.return_value.get_blob
+        get_blob_method.return_value.metadata = returned_file_metadata
+
+        response = self.gcs_hook.get_metadata(bucket_name=test_bucket, object_name=test_object)
+
+        assert response == returned_file_metadata

Review Comment: Can we add a negative-scenario test as well?
Re: [PR] Suppress error if no tests collected in pydantic special Tests [airflow]
Taragolis commented on code in PR #38470: URL: https://github.com/apache/airflow/pull/38470#discussion_r1538655531

## dev/breeze/src/airflow_breeze/commands/testing_commands.py: ##

@@ -679,6 +679,11 @@ def _run_test_command(
     fix_ownership_using_docker()
     cleanup_python_generated_files()
     perform_environment_checks()
+    if pydantic:

Review Comment: Check `pydantic != "v2"` here instead, rather than checking it every time.
[PR] fixing the timezone in next run id info [airflow]
Bowrna opened a new pull request, #38482: URL: https://github.com/apache/airflow/pull/38482 closes: #38172 related: #38172 --- **^ Add meaningful description above** Read the **[Pull Request Guidelines](https://github.com/apache/airflow/blob/main/contributing-docs/05_pull_requests.rst#pull-request-guidelines)** for more information. In case of fundamental code changes, an Airflow Improvement Proposal ([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvement+Proposals)) is needed. In case of a new dependency, check compliance with the [ASF 3rd Party License Policy](https://www.apache.org/legal/resolved.html#category-x). In case of backwards incompatible changes please leave a note in a newsfragment file, named `{pr_number}.significant.rst` or `{issue_number}.significant.rst`, in [newsfragments](https://github.com/apache/airflow/tree/main/newsfragments).
(airflow) branch suppress-no-tests-collected updated (ddb102ba5d -> eeed47f9f3)
This is an automated email from the ASF dual-hosted git repository. taragolis pushed a change to branch suppress-no-tests-collected in repository https://gitbox.apache.org/repos/asf/airflow.git from ddb102ba5d Add --suppress-no-test-exit-code in case of pydantic tests add eeed47f9f3 Run only if pydantic != v2 (default) No new revisions were added by this update. Summary of changes: dev/breeze/src/airflow_breeze/commands/testing_commands.py | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-)
(airflow) branch bump-importlib-metadata updated (f53303f1e1 -> 9b53dd760b)
This is an automated email from the ASF dual-hosted git repository. taragolis pushed a change to branch bump-importlib-metadata in repository https://gitbox.apache.org/repos/asf/airflow.git discard f53303f1e1 Fixup tests discard a869e33ee0 Use `importlib_metadata` with compat to Python 3.10/3.12 stdlib add 1a9b71a129 Create KubernetesPatchJobOperator operator (#38146) add bce63b281d Add 2.9.0b1 to issue template (#38364) add ea951afb69 Add check in AWS auth manager to check if the Amazon Verified Permissions schema is up to date (#38333) add a0c1985089 Chart: Fix cluster-wide RBAC naming clash when using multiple multiNamespace releases with the same name (#37197) add a1671f1f7d Retrieve dataset event created through RESTful API when creating dag run (#38332) add 13356f8d4d Add missing deprecated Fab auth manager (#38376) add 4e33ceb400 Add option to lower-bind preinstalled provider (#38363) add 7fcc1b4bf9 Add average duration markline in task and dagrun duration charts. (#38214) add 6fa7b726d6 Use current time to calculate duration when end date is not present. 
(#38375) add d6e9d48ea9 Add entry to INTHEWILD.md (#38380) add 32c573f390 remove section as it's no longer true with dataset expressions PR (#38370) add 3840ec690c Release notes for helm chart 1.13.1 (#38356) add ae6fec927c Fix docs link for helm chart 1.13.1 (#38381) add 095c5fe313 Ensure __exit__ is called in decorator context managers (#38383) add fa3265becf doc: Use sys.version_info for determine Python Major.Minor (#38372) add 0aee6813db Fix set deprecated slack operators arguments in `MappedOperator` (#38345) add c893cb3bfb Fix set deprecated amazon operators arguments in `MappedOperator` (#38346) add 72c0911ede Fix deprecated apache.hive operators arguments in `MappedOperator` (#38351) add 87faf3144f docs(openlineage): fix quotation around openlineage transport value (#38378) add 6296f7e224 Allow to use `redis`>=5 (#38385) add c9867bbbff Use common image build workflows in pull-request-target workflow (#38231) add 623939b002 remove K8S 1.25 support (#38367) add 9721e0b82d Resolve PT018: Assertion should be broken down into multiple parts (#38138) add c74947a69d refactor(databricks): remove redundant else block (#38397) add 7ecd5faddf fix(core): add return statement to yield within a while loop in triggers (#38389) add 99166a94e9 fix(airbyte): add return statement to yield within a while loop in triggers (#38390) add 5bf60bcf56 fix(sftp): add return statement to yield within a while loop in triggers (#38391) add eead6c2479 fix(http): add return statement to yield within a while loop in triggers (#38392) add da4f6f077c fix(google): add return statement to yield within a while loop in triggers (#38394) add 9ea4050d41 fix(amazon): add return statement to yield within a while loop in triggers (#38396) add a550742c6e Separate out additional CI image checks (#38225) add fc868f4be2 Implement deferrable mode for KubernetesJobOperator (#38251) add 315360691c Removed/Updated Outdated UI screenshots from documentation (#38403) add cddf1cc7dc fix(dbt): add return 
statement to yield within a while loop in triggers (#38395) add 51c94e12eb fix(microsoft/azure): add return statement to yield within a while loop in triggers (#38393) add 612382e368 Add guard checks to dataset expressions (#38388) add 83d62cae69 Create AWS auth manager documentation. Part: setup identity center (#38273) add 947c48b2fb DockerOperator: use DOCKER_HOST as default for docker_url (#38387) add 40559a315e Fix typo in verification process for Helm Chart (#38410) add 5344881715 Make postgresql default engine args comply with SA 2.0 (#38362) add 6a225ccb2a to avoid type's mismatch durind concatenation modified to f-strings (#38412) add 36704fb9a7 Fix release docs for the helm chart (#38413) add 0d11f3ca2f Add experimental warning in AWS auth manager documentation (#38414) add 9b2d8e0fcd Upgrade to latest chart dependencies detected by canary run (#38416) add bcd7d35ff8 Update Airflow decorators stub to reflect changes in #38387 (#38406) add fd5fe8d2c6 Revert ObjectStorage config variables name (#38415) add 30817a5c6d support iam token from metadata, simplify code (#38411) add a7d7ef6433 Don't allow defaults other than None in context parameters, and improve error message (#38015) add 88ea87d3d0 Update UV to latest version released (#38419) add ac50669c82 update to latest service bus (#38384) add 665d46ce17 Resolve PT012 in `telegram` provider (#38386) add 48c8f35acf `ExternalPythonOperator` use version from `sys.version_info` (#38377) add bf28b3ed40 Update `azure-serivcebus` in pyproject.toml (#38422) add fac6aa4870 Use `python-on-whales` in docker tests (#38421) add 98933ed34a Ex
Re: [PR] Implement the breeze tag_providers command [airflow]
amoghrajesh commented on code in PR #38447: URL: https://github.com/apache/airflow/pull/38447#discussion_r1538612419

## dev/breeze/doc/09_release_management_tasks.rst: ##

@@ -189,6 +189,26 @@ These are all of the available flags for the ``release-prod-images`` command:
     :width: 100%
     :alt: Breeze release management release prod images

+Adding git tags for providers
+"""""""""""""""""""""""""""""
+
+This command can be used to manage git tags for providers in the airflow remote repository during provider releases.
+Sometimes, when there is a connectivity issue to GitHub, local tags may get created and lead to annoying errors.
+The default behaviour is to clean up such local tags.
+
+If you want to disable this behaviour, set the environment variable CLEAN_LOCAL_TAGS to false.

Review Comment: You also need to document the new click option here: the flag and the environment variable.
(airflow) branch constraints-main updated: Updating constraints. Github run id:8430762665
This is an automated email from the ASF dual-hosted git repository. github-bot pushed a commit to branch constraints-main in repository https://gitbox.apache.org/repos/asf/airflow.git The following commit(s) were added to refs/heads/constraints-main by this push: new e7af2381bf Updating constraints. Github run id:8430762665 e7af2381bf is described below commit e7af2381bf22d91cffd370df935ec6f3dd2adc2a Author: Automated GitHub Actions commit AuthorDate: Tue Mar 26 05:06:51 2024 + Updating constraints. Github run id:8430762665 This update in constraints is automatically committed by the CI 'constraints-push' step based on 'refs/heads/main' in the 'apache/airflow' repository with commit sha 49a76ec30ad96d2aa976a3a46a4cb57be5ef85e3. The action that build those constraints can be found at https://github.com/apache/airflow/actions/runs/8430762665/ The image tag used for that build was: 49a76ec30ad96d2aa976a3a46a4cb57be5ef85e3. You can enter Breeze environment with this image by running 'breeze shell --image-tag 49a76ec30ad96d2aa976a3a46a4cb57be5ef85e3' All tests passed in this build so we determined we can push the updated constraints. See https://github.com/apache/airflow/blob/main/README.md#installing-from-pypi for details. 
--- constraints-3.10.txt | 6 +++--- constraints-3.11.txt | 6 +++--- constraints-3.12.txt | 6 +++--- constraints-3.8.txt | 6 +++--- constraints-3.9.txt | 6 +++--- constraints-no-providers-3.10.txt | 2 +- constraints-no-providers-3.11.txt | 2 +- constraints-no-providers-3.12.txt | 2 +- constraints-no-providers-3.8.txt | 2 +- constraints-no-providers-3.9.txt | 2 +- constraints-source-providers-3.10.txt | 6 +++--- constraints-source-providers-3.11.txt | 6 +++--- constraints-source-providers-3.12.txt | 6 +++--- constraints-source-providers-3.8.txt | 6 +++--- constraints-source-providers-3.9.txt | 6 +++--- 15 files changed, 35 insertions(+), 35 deletions(-) diff --git a/constraints-3.10.txt b/constraints-3.10.txt index 31298b82bc..a263d3f155 100644 --- a/constraints-3.10.txt +++ b/constraints-3.10.txt @@ -1,6 +1,6 @@ # -# This constraints file was automatically generated on 2024-03-25T19:39:28.405656 +# This constraints file was automatically generated on 2024-03-26T04:44:21.002431 # via "eager-upgrade" mechanism of PIP. For the "main" branch of Airflow. # This variant of constraints install uses the HEAD of the branch version for 'apache-airflow' but installs # the providers from PIP-released packages at the moment of the constraint generation. 
@@ -270,7 +270,7 @@ db-dtypes==1.2.0 debugpy==1.8.1 decorator==5.1.1 defusedxml==0.7.1 -deltalake==0.16.2 +deltalake==0.16.3 dill==0.3.1.1 distlib==0.3.8 distro==1.9.0 @@ -592,7 +592,7 @@ requests-file==2.0.0 requests-kerberos==0.14.0 requests-mock==1.11.0 requests-ntlm==1.2.0 -requests-oauthlib==1.4.0 +requests-oauthlib==1.4.1 requests-toolbelt==1.0.0 requests==2.31.0 responses==0.25.0 diff --git a/constraints-3.11.txt b/constraints-3.11.txt index 82915165fb..b72158e8d9 100644 --- a/constraints-3.11.txt +++ b/constraints-3.11.txt @@ -1,6 +1,6 @@ # -# This constraints file was automatically generated on 2024-03-25T19:39:28.238334 +# This constraints file was automatically generated on 2024-03-26T04:44:20.953130 # via "eager-upgrade" mechanism of PIP. For the "main" branch of Airflow. # This variant of constraints install uses the HEAD of the branch version for 'apache-airflow' but installs # the providers from PIP-released packages at the moment of the constraint generation. @@ -269,7 +269,7 @@ db-dtypes==1.2.0 debugpy==1.8.1 decorator==5.1.1 defusedxml==0.7.1 -deltalake==0.16.2 +deltalake==0.16.3 dill==0.3.1.1 distlib==0.3.8 distro==1.9.0 @@ -590,7 +590,7 @@ requests-file==2.0.0 requests-kerberos==0.14.0 requests-mock==1.11.0 requests-ntlm==1.2.0 -requests-oauthlib==1.4.0 +requests-oauthlib==1.4.1 requests-toolbelt==1.0.0 requests==2.31.0 responses==0.25.0 diff --git a/constraints-3.12.txt b/constraints-3.12.txt index 4a16254248..d8146de414 100644 --- a/constraints-3.12.txt +++ b/constraints-3.12.txt @@ -1,6 +1,6 @@ # -# This constraints file was automatically generated on 2024-03-25T19:40:32.080252 +# This constraints file was automatically generated on 2024-03-26T04:45:11.374951 # via "eager-upgrade" mechanism of PIP. For the "main" branch of Airflow. # This variant of constraints install uses the HEAD of the branch version for 'apache-airflow' but installs # the providers from PIP-released packages at the moment of the constraint generation. 
@@ -262,7 +262,7 @@ datadog==0.49.1 db-dtypes==1.2.0 decorator==5.1.1 defusedxml==0.7.1 -deltalake==0.16.2 +deltalake==0.16.3 dill==0.3.8 distlib==0.3.8 distro==1.9.0 @@ -568,7 +568,7 @@ requests-file==2.0.0 requests-kerberos==0.14.0 requests-mock==1
Re: [PR] changing dag_processing.processes from UpDownCounter to guage [airflow]
Bowrna commented on code in PR #38400: URL: https://github.com/apache/airflow/pull/38400#discussion_r1538589431

## airflow/dag_processing/manager.py: ##

@@ -1207,6 +1209,7 @@ def _kill_timed_out_processors(self):
                 processor.start_time.isoformat(),
             )
             Stats.decr("dag_processing.processes", tags={"file_path": file_path, "action": "timeout"})
+            Stats.gauge("dag_processing.processes_count", -1, delta=True, tags={"file_path": file_path, "action": "timeout"})
             Stats.incr("dag_processing.processor_timeouts", tags={"file_path": file_path})

Review Comment: @ferruzzi I can see that the metric "dag_processing.processor_timeouts" is only ever increased. I still don't get the logic behind why it is incremented as a counter but specified as a gauge in the documentation. Can you help me understand?
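For the counter-vs-gauge question, the usual StatsD distinction is: a counter only ever accumulates increments (the backend derives a rate from them), while a gauge holds a current level that delta updates can move both up and down. The toy model below is not Airflow's real `Stats` API, just an illustration of the two semantics the comment asks about:

```python
class ToyStats:
    """Minimal illustration of StatsD counter vs gauge semantics."""

    def __init__(self):
        self.counters = {}
        self.gauges = {}

    def incr(self, name, count=1):
        # Counters only accumulate; there is no "decrement to a level".
        self.counters[name] = self.counters.get(name, 0) + count

    def gauge(self, name, value, delta=False):
        # A gauge is a current level; delta updates adjust it up or down.
        if delta:
            self.gauges[name] = self.gauges.get(name, 0) + value
        else:
            self.gauges[name] = value


stats = ToyStats()
stats.incr("dag_processing.processor_timeouts")                 # counters only go up
stats.gauge("dag_processing.processes_count", 1, delta=True)    # a process started
stats.gauge("dag_processing.processes_count", -1, delta=True)   # a process timed out
print(stats.counters["dag_processing.processor_timeouts"])      # 1
print(stats.gauges["dag_processing.processes_count"])           # 0
```

Under this model, a timeouts metric that is only ever incremented behaves like a counter regardless of what the documentation calls it, which is the apparent mismatch being raised.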
(airflow) branch main updated: Chart: Default airflow version to 2.8.4 (#38478)
This is an automated email from the ASF dual-hosted git repository. jedcunningham pushed a commit to branch main in repository https://gitbox.apache.org/repos/asf/airflow.git The following commit(s) were added to refs/heads/main by this push: new 49a76ec30a Chart: Default airflow version to 2.8.4 (#38478) 49a76ec30a is described below commit 49a76ec30ad96d2aa976a3a46a4cb57be5ef85e3 Author: Jed Cunningham <66968678+jedcunning...@users.noreply.github.com> AuthorDate: Tue Mar 26 00:19:58 2024 -0400 Chart: Default airflow version to 2.8.4 (#38478) --- chart/Chart.yaml | 20 ++-- chart/newsfragments/38478.significant.rst | 3 +++ chart/values.schema.json | 4 ++-- chart/values.yaml | 4 ++-- 4 files changed, 17 insertions(+), 14 deletions(-) diff --git a/chart/Chart.yaml b/chart/Chart.yaml index d65defcdbd..659a36a628 100644 --- a/chart/Chart.yaml +++ b/chart/Chart.yaml @@ -20,7 +20,7 @@ apiVersion: v2 name: airflow version: 1.14.0 -appVersion: 2.8.3 +appVersion: 2.8.4 description: The official Helm chart to deploy Apache Airflow, a platform to programmatically author, schedule, and monitor workflows home: https://airflow.apache.org/ @@ -47,23 +47,23 @@ annotations: url: https://airflow.apache.org/docs/helm-chart/1.14.0/ artifacthub.io/screenshots: | - title: DAGs View - url: https://airflow.apache.org/docs/apache-airflow/2.8.3/_images/dags.png + url: https://airflow.apache.org/docs/apache-airflow/2.8.4/_images/dags.png - title: Datasets View - url: https://airflow.apache.org/docs/apache-airflow/2.8.3/_images/datasets.png + url: https://airflow.apache.org/docs/apache-airflow/2.8.4/_images/datasets.png - title: Grid View - url: https://airflow.apache.org/docs/apache-airflow/2.8.3/_images/grid.png + url: https://airflow.apache.org/docs/apache-airflow/2.8.4/_images/grid.png - title: Graph View - url: https://airflow.apache.org/docs/apache-airflow/2.8.3/_images/graph.png + url: https://airflow.apache.org/docs/apache-airflow/2.8.4/_images/graph.png - title: Calendar View - url: 
https://airflow.apache.org/docs/apache-airflow/2.8.3/_images/calendar.png + url: https://airflow.apache.org/docs/apache-airflow/2.8.4/_images/calendar.png - title: Variable View - url: https://airflow.apache.org/docs/apache-airflow/2.8.3/_images/variable_hidden.png + url: https://airflow.apache.org/docs/apache-airflow/2.8.4/_images/variable_hidden.png - title: Gantt Chart - url: https://airflow.apache.org/docs/apache-airflow/2.8.3/_images/gantt.png + url: https://airflow.apache.org/docs/apache-airflow/2.8.4/_images/gantt.png - title: Task Duration - url: https://airflow.apache.org/docs/apache-airflow/2.8.3/_images/duration.png + url: https://airflow.apache.org/docs/apache-airflow/2.8.4/_images/duration.png - title: Code View - url: https://airflow.apache.org/docs/apache-airflow/2.8.3/_images/code.png + url: https://airflow.apache.org/docs/apache-airflow/2.8.4/_images/code.png artifacthub.io/changes: | - description: Don't overwrite ``.Values.airflowPodAnnotations`` kind: fixed diff --git a/chart/newsfragments/38478.significant.rst b/chart/newsfragments/38478.significant.rst new file mode 100644 index 00..9200834d00 --- /dev/null +++ b/chart/newsfragments/38478.significant.rst @@ -0,0 +1,3 @@ +Default Airflow image is updated to ``2.8.4`` + +The default Airflow image that is used with the Chart is now ``2.8.4``, previously it was ``2.8.3``. 
diff --git a/chart/values.schema.json b/chart/values.schema.json index 1d132d7f10..47b07c7d44 100644 --- a/chart/values.schema.json +++ b/chart/values.schema.json @@ -77,7 +77,7 @@ "defaultAirflowTag": { "description": "Default airflow tag to deploy.", "type": "string", -"default": "2.8.3", +"default": "2.8.4", "x-docsSection": "Common" }, "defaultAirflowDigest": { @@ -92,7 +92,7 @@ "airflowVersion": { "description": "Airflow version (Used to make some decisions based on Airflow Version being deployed).", "type": "string", -"default": "2.8.3", +"default": "2.8.4", "x-docsSection": "Common" }, "securityContext": { diff --git a/chart/values.yaml b/chart/values.yaml index caa72fcc57..26bc9e6db4 100644 --- a/chart/values.yaml +++ b/chart/values.yaml @@ -68,13 +68,13 @@ airflowHome: /opt/airflow defaultAirflowRepository: apache/airflow # Default airflow tag to deploy -defaultAirflowTag: "2.8.3" +defaultAirflowTag: "2.8.4" # Default airflow digest. If specified, it takes precedence over tag defaultAirflowDigest: ~ # Airflow version (Used to make some decisions based on Airflow Version being deployed) -airflowVersion: "2.8.3" +airflo
Re: [PR] Airflow 2.8.4 has been released [airflow]
jedcunningham merged PR #38477: URL: https://github.com/apache/airflow/pull/38477
Re: [PR] Chart: Default airflow version to 2.8.4 [airflow]
jedcunningham merged PR #38478: URL: https://github.com/apache/airflow/pull/38478
(airflow) branch main updated: Airflow 2.8.4 has been released (#38477)
This is an automated email from the ASF dual-hosted git repository. jedcunningham pushed a commit to branch main in repository https://gitbox.apache.org/repos/asf/airflow.git The following commit(s) were added to refs/heads/main by this push: new ac08cb3148 Airflow 2.8.4 has been released (#38477) ac08cb3148 is described below commit ac08cb31485afeac9a323981e07b9b33d368bcea Author: Jed Cunningham <66968678+jedcunning...@users.noreply.github.com> AuthorDate: Tue Mar 26 00:19:43 2024 -0400 Airflow 2.8.4 has been released (#38477) --- .github/ISSUE_TEMPLATE/airflow_bug_report.yml | 2 +- Dockerfile | 2 +- README.md | 10 +++ RELEASE_NOTES.rst | 31 ++ airflow/reproducible_build.yaml| 4 +-- .../installation/supported-versions.rst| 2 +- generated/PYPI_README.md | 8 +++--- .../ci/pre_commit/pre_commit_supported_versions.py | 2 +- 8 files changed, 46 insertions(+), 15 deletions(-) diff --git a/.github/ISSUE_TEMPLATE/airflow_bug_report.yml b/.github/ISSUE_TEMPLATE/airflow_bug_report.yml index f199fcc5d8..02ad1d180b 100644 --- a/.github/ISSUE_TEMPLATE/airflow_bug_report.yml +++ b/.github/ISSUE_TEMPLATE/airflow_bug_report.yml @@ -26,7 +26,7 @@ body: multiple: false options: - "2.9.0b1" -- "2.8.3" +- "2.8.4" - "main (development)" - "Other Airflow 2 version (please specify below)" validations: diff --git a/Dockerfile b/Dockerfile index 6b86ea8035..c286695514 100644 --- a/Dockerfile +++ b/Dockerfile @@ -45,7 +45,7 @@ ARG AIRFLOW_UID="5" ARG AIRFLOW_USER_HOME_DIR=/home/airflow # latest released version here -ARG AIRFLOW_VERSION="2.8.3" +ARG AIRFLOW_VERSION="2.8.4" ARG PYTHON_BASE_IMAGE="python:3.8-slim-bookworm" diff --git a/README.md b/README.md index 6a53a3404d..6613eca799 100644 --- a/README.md +++ b/README.md @@ -98,7 +98,7 @@ Airflow is not a streaming solution, but it is often used to process real-time d Apache Airflow is tested with: -| | Main version (dev) | Stable version (2.8.3) | +| | Main version (dev) | Stable version (2.8.4) | |-||-| | Python | 3.8, 3.9, 3.10, 3.11, 
3.12 | 3.8, 3.9, 3.10, 3.11| | Platform| AMD64/ARM64(\*)| AMD64/ARM64(\*) | @@ -180,15 +180,15 @@ them to the appropriate format and workflow that your tool requires. ```bash -pip install 'apache-airflow==2.8.3' \ - --constraint "https://raw.githubusercontent.com/apache/airflow/constraints-2.8.3/constraints-3.8.txt"; +pip install 'apache-airflow==2.8.4' \ + --constraint "https://raw.githubusercontent.com/apache/airflow/constraints-2.8.4/constraints-3.8.txt"; ``` 2. Installing with extras (i.e., postgres, google) ```bash pip install 'apache-airflow[postgres,google]==2.8.3' \ - --constraint "https://raw.githubusercontent.com/apache/airflow/constraints-2.8.3/constraints-3.8.txt"; + --constraint "https://raw.githubusercontent.com/apache/airflow/constraints-2.8.4/constraints-3.8.txt"; ``` For information on installing provider packages, check @@ -293,7 +293,7 @@ Apache Airflow version life cycle: | Version | Current Patch/Minor | State | First Release | Limited Support | EOL/Terminated | |---|---|---|-|---|--| -| 2 | 2.8.3 | Supported | Dec 17, 2020| TBD | TBD | +| 2 | 2.8.4 | Supported | Dec 17, 2020| TBD | TBD | | 1.10 | 1.10.15 | EOL | Aug 27, 2018| Dec 17, 2020 | June 17, 2021| | 1.9 | 1.9.0 | EOL | Jan 03, 2018| Aug 27, 2018 | Aug 27, 2018 | | 1.8 | 1.8.2 | EOL | Mar 19, 2017| Jan 03, 2018 | Jan 03, 2018 | diff --git a/RELEASE_NOTES.rst b/RELEASE_NOTES.rst index 93f3090d81..9b127af3da 100644 --- a/RELEASE_NOTES.rst +++ b/RELEASE_NOTES.rst @@ -21,6 +21,37 @@ .. towncrier release notes start + +Airflow 2.8.4 (2024-03-25) +-- + +Significant Changes +^^^ + +No significant changes. 
+ +Bug Fixes +" +- Fix incorrect serialization of ``FixedTimezone`` (#38139) +- Fix excessive permission changing for log task handler (#38164) +- Fix task instances list link (#38096) +- Fix a bug where scheduler heartrate parameter was not used (#37992) +- Add padding to prevent grid horizontal scroll overlapping tasks (#37942) +- Fix hash caching in ``ObjectStoragePath`` (#37769) + +Miscellaneous +" +- Limit importlib_resources as it breaks ``pytest_rewrites`` (#38095, #38139) +- Limi
Re: [PR] Implement the breeze tag_providers command [airflow]
Lee-W commented on code in PR #38447: URL: https://github.com/apache/airflow/pull/38447#discussion_r1538540678 ## dev/breeze/src/airflow_breeze/commands/release_management_commands.py: ## @@ -949,6 +950,81 @@ def run_generate_constraints_in_parallel( ) +@release_management.command( +name="tag-providers", +help="Generates tags for airflow provider releases.", +) +@click.option( +"--clean-local-tags", +default=True, +is_flag=True, +envvar="CLEAN_LOCAL_TAGS", +help="Delete local tags that are created due to github connectivity issues to avoid errors. " +"The default behaviour would be to clean such local tags.", +show_default=True, +) +@option_dry_run +@option_verbose +def tag_providers( +clean_local_tags: bool, +): +found_remote = None +remotes = ["origin", "apache"] +for remote in remotes: +try: +command = ["git", "remote", "get-url", "--push", shlex.quote(remote)] +result = run_command(command, stdout=subprocess.PIPE, stderr=subprocess.DEVNULL, text=True) +if "apache/airflow.git" in result.stdout: +found_remote = remote +break +except subprocess.CalledProcessError: +pass + +if found_remote is None: +raise ValueError("Could not find remote configured to push to apache/airflow") + +tags = [] +for file in os.listdir(os.path.join(SOURCE_DIR_PATH, "dist")): +if file.endswith(".whl"): +match = re.match(r".*airflow_providers_(.*)-(.*)-py3.*", file) +if match: +provider = f"providers-{match.group(1).replace('_', '-')}" +tag = f"{provider}/{match.group(2)}" +try: +run_command( +["git", "tag", shlex.quote(tag), "-m", f"Release {date.today()} of providers"], +check=True, +) +tags.append(tag) +except subprocess.CalledProcessError: +pass + +if tags and len(tags) > 0: Review Comment: I think we might not need `len(tags) > 0`. ```suggestion if tags: ``` -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. 
To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org For queries about this service, please contact Infrastructure at: us...@infra.apache.org
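The suggestion above leans on Python's truthiness rules: an empty list is falsy, so the extra `len(tags) > 0` check adds nothing. A minimal illustration (the tag strings are made up):

```python
def describe_tags(tags):
    # For lists, "if tags" is equivalent to "if len(tags) > 0":
    # empty sequences are falsy, non-empty ones are truthy.
    if tags:
        return f"pushing {len(tags)} tag(s)"
    return "nothing to push"

print(describe_tags([]))                                          # nothing to push
print(describe_tags(["providers-http/4.10.0", "providers-ftp/3.7.0"]))  # pushing 2 tag(s)
```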
Re: [PR] Fix BigQuery connection and add docs [airflow]
Lee-W commented on code in PR #38430: URL: https://github.com/apache/airflow/pull/38430#discussion_r1538535382 ## airflow/providers/google/cloud/hooks/bigquery.py: ## @@ -116,17 +146,24 @@ def __init__( "The `delegate_to` parameter has been deprecated before and finally removed in this version" " of Google Provider. You MUST convert it to `impersonate_chain`" ) -super().__init__( -gcp_conn_id=gcp_conn_id, -impersonation_chain=impersonation_chain, -) -self.use_legacy_sql = use_legacy_sql -self.location = location -self.priority = priority +super().__init__(**kwargs) +self.use_legacy_sql: bool = self._get_field("use_legacy_sql", use_legacy_sql) +self.location: str | None = self._get_field("location", location) +self.priority: str = self._get_field("priority", priority) +self.api_resource_configs: dict = self._get_field("api_resource_configs", api_resource_configs or {}) +self.labels = self._get_field("labels", labels or {}) self.running_job_id: str | None = None -self.api_resource_configs: dict = api_resource_configs or {} -self.labels = labels -self.credentials_path = "bigquery_hook_credentials.json" + +@cached_property +@deprecated( +reason=( +"`BigQueryHook.credentials_path` property is deprecated and will be removed in the future. " +"This property used for obtaining credentials path but no longer in actual use. " +), +category=AirflowProviderDeprecationWarning, +) +def credentials_path(self): Review Comment: ```suggestion def credentials_path(self) -> str: ``` -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org For queries about this service, please contact Infrastructure at: us...@infra.apache.org
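For context on the review above: the provider keeps `credentials_path` only for backward compatibility, with a deprecation decorator stacked under `@cached_property`. A stdlib-only sketch of the same pattern — the class name is invented, and a direct `warnings.warn` call stands in for the `deprecated` package's decorator:

```python
import warnings
from functools import cached_property

class LegacyHook:
    """Sketch: a hook attribute kept only for backward compatibility."""

    @cached_property
    def credentials_path(self) -> str:
        # Warn on first access; the cached value is returned on
        # subsequent accesses without re-warning.
        warnings.warn(
            "`credentials_path` is deprecated and no longer used.",
            DeprecationWarning,
            stacklevel=2,
        )
        return "bigquery_hook_credentials.json"

with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    path = LegacyHook().credentials_path

print(path)                         # bigquery_hook_credentials.json
print(caught[0].category.__name__)  # DeprecationWarning
```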
Re: [PR] Implement on_status parameter for KubernetesDeleteJobOperator [airflow]
Lee-W commented on code in PR #38458: URL: https://github.com/apache/airflow/pull/38458#discussion_r1538532111 ## airflow/providers/cncf/kubernetes/operators/job.py: ## @@ -366,10 +366,17 @@ class KubernetesDeleteJobOperator(BaseOperator): :param in_cluster: run kubernetes client with in_cluster configuration. :param cluster_context: context that points to kubernetes cluster. Ignored when in_cluster is True. If None, current-context is used. (templated) +:param on_status: Condition for performing delete operation depending on the job status. Values: Review Comment: how about `delete_on_status`?
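A sketch of the semantics such a parameter could have under the suggested name. The function name, the accepted statuses, and the None-means-always-delete default are assumptions for illustration, not the merged API:

```python
from typing import Optional

def should_delete(job_status: str, delete_on_status: Optional[str] = None) -> bool:
    """Decide whether to delete a finished Kubernetes Job.

    delete_on_status=None deletes unconditionally; otherwise delete only
    when the observed job status matches (e.g. "Complete" or "Failed").
    """
    if delete_on_status is None:
        return True
    if delete_on_status not in ("Complete", "Failed"):
        raise ValueError("delete_on_status must be None, 'Complete' or 'Failed'")
    return job_status == delete_on_status

print(should_delete("Complete"))                             # True
print(should_delete("Failed", delete_on_status="Complete"))  # False
```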
Re: [PR] Implement deferrable mode for GKEStartJobOperator [airflow]
Lee-W commented on code in PR #38454: URL: https://github.com/apache/airflow/pull/38454#discussion_r1538521679 ## airflow/providers/google/cloud/triggers/kubernetes_engine.py: ## @@ -237,3 +243,67 @@ def _get_hook(self) -> GKEAsyncHook: impersonation_chain=self.impersonation_chain, ) return self._hook + + +class GKEJobTrigger(BaseTrigger): +"""GKEJobTrigger run on the trigger worker to check the state of Job.""" + +def __init__( +self, +cluster_url: str, +ssl_ca_cert: str, +job_name: str, +job_namespace: str, +gcp_conn_id: str = "google_cloud_default", +poll_interval: float = 2, +impersonation_chain: str | Sequence[str] | None = None, +): Review Comment: ```suggestion ) -> None: ``` ## airflow/providers/google/cloud/triggers/kubernetes_engine.py: ## @@ -237,3 +243,67 @@ def _get_hook(self) -> GKEAsyncHook: impersonation_chain=self.impersonation_chain, ) return self._hook + + +class GKEJobTrigger(BaseTrigger): +"""GKEJobTrigger run on the trigger worker to check the state of Job.""" + +def __init__( +self, +cluster_url: str, +ssl_ca_cert: str, +job_name: str, +job_namespace: str, +gcp_conn_id: str = "google_cloud_default", +poll_interval: float = 2, +impersonation_chain: str | Sequence[str] | None = None, +): +super().__init__() +self.cluster_url = cluster_url +self.ssl_ca_cert = ssl_ca_cert +self.job_name = job_name +self.job_namespace = job_namespace +self.gcp_conn_id = gcp_conn_id +self.poll_interval = poll_interval +self.impersonation_chain = impersonation_chain + +def serialize(self) -> tuple[str, dict[str, Any]]: +"""Serialize KubernetesCreateJobTrigger arguments and classpath.""" +return ( + "airflow.providers.google.cloud.triggers.kubernetes_engine.GKEJobTrigger", +{ +"cluster_url": self.cluster_url, +"ssl_ca_cert": self.ssl_ca_cert, +"job_name": self.job_name, +"job_namespace": self.job_namespace, +"gcp_conn_id": self.gcp_conn_id, +"poll_interval": self.poll_interval, +"impersonation_chain": self.impersonation_chain, +}, +) + +async def run(self) -> 
AsyncIterator[TriggerEvent]: # type: ignore[override] +"""Get current job status and yield a TriggerEvent.""" +job: V1Job = await self.hook.wait_until_job_complete(name=self.job_name, namespace=self.job_namespace) +job_dict = job.to_dict() +error_message = self.hook.is_job_failed(job=job) +yield TriggerEvent( +{ +"name": job.metadata.name, +"namespace": job.metadata.namespace, +"status": "error" if error_message else "success", +"message": f"Job failed with error: {error_message}" +if error_message +else "Job completed successfully", +"job": job_dict, +} +) Review Comment: ```suggestion status = "error" if error_message else "success" message = f"Job failed with error: {error_message}" if error_message else "Job completed successfully" yield TriggerEvent( { "name": job.metadata.name, "namespace": job.metadata.namespace, "status": status, "message": message, "job": job_dict, } ) ``` I would suggest we move the if-else block out of the dict for better readability ## tests/providers/google/cloud/triggers/test_kubernetes_engine.py: ## @@ -54,7 +61,7 @@ LOCATION = "us-central1-c" GCP_CONN_ID = "test-non-existing-project-id" IMPERSONATION_CHAIN = ["impersonate", "this", "test"] -TRIGGER_PATH = "airflow.providers.google.cloud.triggers.kubernetes_engine.GKEOperationTrigger" +TRIGGER_PATH = GKE_TRIGGERS_PATH + ".GKEOperationTrigger" Review Comment: ```suggestion TRIGGER_PATH = f"{GKE_TRIGGERS_PATH}.GKEOperationTrigger" ```
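For background on the `serialize` method being reviewed: a trigger is persisted as the classpath/kwargs pair it returns, and the triggerer later rebuilds an equivalent instance from them. A self-contained sketch of that round-trip, with a registry dict standing in for Airflow's dynamic import and hypothetical class/field names:

```python
from typing import Any, Dict, Tuple

class DemoJobTrigger:
    """Stand-in for a trigger whose state must survive serialization."""

    def __init__(self, job_name: str, job_namespace: str, poll_interval: float = 2.0) -> None:
        self.job_name = job_name
        self.job_namespace = job_namespace
        self.poll_interval = poll_interval

    def serialize(self) -> Tuple[str, Dict[str, Any]]:
        # Everything passed to __init__ must round-trip through this dict.
        return (
            "example.triggers.DemoJobTrigger",  # hypothetical classpath
            {
                "job_name": self.job_name,
                "job_namespace": self.job_namespace,
                "poll_interval": self.poll_interval,
            },
        )

# Airflow resolves the classpath with importlib; a dict stands in here.
REGISTRY = {"example.triggers.DemoJobTrigger": DemoJobTrigger}

def rebuild(classpath: str, kwargs: Dict[str, Any]) -> "DemoJobTrigger":
    return REGISTRY[classpath](**kwargs)

original = DemoJobTrigger("sync-job", "default", poll_interval=5.0)
clone = rebuild(*original.serialize())
print(clone.job_name, clone.poll_interval)  # sync-job 5.0
```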
Re: [I] Airflow + Advanced Logging Configuration to stdout + CeleryExecutor does not write logs to stdout as expected [airflow]
boring-cyborg[bot] commented on issue #38479: URL: https://github.com/apache/airflow/issues/38479#issuecomment-2019286036 Thanks for opening your first issue here! Be sure to follow the issue template! If you are willing to raise PR to address this issue please do so, no need to wait for approval.
[I] Airflow + Advanced Logging Configuration to stdout + CeleryExecutor does not write logs to stdout as expected [airflow]
shivshav opened a new issue, #38479: URL: https://github.com/apache/airflow/issues/38479 ### Apache Airflow version 2.8.3 ### If "Other Airflow 2 version" selected, which one? _No response_ ### What happened? We would like to set up airflow to output task logs to `stdout` along with its usual mechanism of logging task logs to a file which then get pushed to some remote logs destination. Setting up a custom log configuration to do this with a custom handler does not work **specifically** when using the `CeleryExecutor`. Instead, no logs appear at all on stdout and we only get the logs normally generated in the usual task log files the `task` log handler writes to. This configuration _does_ work if using the `LocalExecutor` (haven't tried other executors to see if this is a problem with more than just the `CeleryExecutor` fwiw, as we only use the `LocalExecutor` in local development and the `CeleryExecutor` in our deployed environments). ### What you think should happen instead? Based on the information noted [here](https://airflow.apache.org/docs/apache-airflow/2.8.3/administration-and-deployment/logging-monitoring/advanced-logging-configuration.html) and our setup, I would've expected logs to appear in both the task log files and on `stdout` so our usual log collectors can collect/ship them as normal.
### How to reproduce Docker Compose set up (note, we normally utilize our own custom airflow images based on the official ones, but I was able to repro with the official ones so the images for the airflow components point to that just to remove any extra moving parts) ```yaml version: '2.4' ## Shared YAML anchors for configuration ## Note: top-level keys prefixed with 'x-' are ignored by docker-compose for parsing, hence the naming # Common config for postgres connection x-pg-envs: &pg-envs POSTGRES_USER: airflow POSTGRES_PASSWORD: airflow POSTGRES_DB: airflow PGUSER: airflow # Common configuration for airflow containers shared as a YAML anchor x-airflow-app: &airflow-app image: apache/airflow:2.8.3-python3.11 build: context: . restart: always env_file: - .env environment: <<: *pg-envs _AIRFLOW_WWW_USER_CREATE: 'true' _AIRFLOW_WWW_USER_USERNAME: ${_AIRFLOW_WWW_USER_USERNAME:-airflow} _AIRFLOW_WWW_USER_PASSWORD: ${_AIRFLOW_WWW_USER_PASSWORD:-airflow} depends_on: airflow_postgres: condition: service_healthy redis: condition: service_healthy volumes: - airflow_logs:/opt/airflow/logs - ./config/airflow.cfg.dev:/opt/airflow/airflow.cfg - ./config/local:/opt/airflow/config - ./test-dags:/opt/airflow/dags/repo services: airflow_postgres: image: postgres:16 environment: <<: *pg-envs healthcheck: test: ["CMD-SHELL", "pg_isready -U airflow -d airflow"] interval: 1s timeout: 5s retries: 10 ports: - "5435:5432" volumes: - airflow_local_postgres:/var/lib/postgresql/data redis: image: redis:6 healthcheck: test: ["CMD", "redis-cli", "ping"] interval: 2s retries: 5 start_period: 3s volumes: - redis_data:/data webserver: <<: *airflow-app command: ["webserver"] ports: - "8080:8080" healthcheck: test: ["CMD-SHELL", "[ -f /opt/airflow/airflow-webserver.pid ]"] interval: 30s timeout: 30s retries: 3 scheduler: <<: *airflow-app command: ["scheduler"] # The worker and flower services aren't relevant for the LocalExecutor setup, just the CeleryExecutor setup worker: <<: *airflow-app 
command: ["celery", "worker"] healthcheck: test: ["CMD-SHELL", "[-f /opt/airflow/airflow-worker.pid"] interval: 30s timeout: 30s retries: 3 flower: <<: *airflow-app command: ["celery", "flower"] healthcheck: test: ["CMD-SHELL", "[ -f /usr/local/airflow/airflow-flower.pid ]"] interval: 30s timeout: 30s retries: 3 ports: - ":" migrate_db: <<: *airflow-app command: ["db", "init"] restart: on-failure volumes: airflow_local_postgres: airflow_logs: redis_data: ``` Custom log configuration, located in `config/local` and mounted under `/opt/airflow/config` in the above docker-compose.yaml ```python from copy import deepcopy import sys from airflow.config_templates.airflow_local_settings import DEFAULT_LOGGING_CONFIG # code taken from "https://github.com/apache/airflow/discussions/29920#discussioncomment-5208504"; LOGGING_CONFIG = deepcopy(DEFAU
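The custom log config above is truncated, but it follows the documented pattern: deep-copy the default logging dict and attach an extra stdout handler to the `airflow.task` logger. A stdlib-only sketch of the mechanism — `BASE_CONFIG` is an invented placeholder for Airflow's `DEFAULT_LOGGING_CONFIG`, not the real structure:

```python
import logging
import logging.config
from copy import deepcopy

BASE_CONFIG = {  # stand-in for airflow's DEFAULT_LOGGING_CONFIG
    "version": 1,
    "disable_existing_loggers": False,
    "handlers": {
        # placeholder for the file-based task handler
        "task_file": {"class": "logging.NullHandler"},
    },
    "loggers": {
        "airflow.task": {"handlers": ["task_file"], "level": "INFO", "propagate": False},
    },
}

LOGGING_CONFIG = deepcopy(BASE_CONFIG)
# Add a second handler that mirrors task logs to stdout.
LOGGING_CONFIG["handlers"]["stdout"] = {
    "class": "logging.StreamHandler",
    "stream": "ext://sys.stdout",
}
LOGGING_CONFIG["loggers"]["airflow.task"]["handlers"].append("stdout")

logging.config.dictConfig(LOGGING_CONFIG)
logging.getLogger("airflow.task").info("task log line")  # now also reaches stdout
```

Whether the Celery worker actually picks such a config up for forked task processes is the open question in this issue; the sketch only shows the configuration side.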
(airflow-site) branch gh-pages updated (ae2897195d -> 4aa7329f23)
This is an automated email from the ASF dual-hosted git repository. github-bot pushed a change to branch gh-pages in repository https://gitbox.apache.org/repos/asf/airflow-site.git discard ae2897195d Rewritten history to remove past gh-pages deployments new 4aa7329f23 Rewritten history to remove past gh-pages deployments This update added new revisions after undoing existing revisions. That is to say, some revisions that were in the old version of the branch are not in the new version. This situation occurs when a user --force pushes a change and generates a repository containing something like this: * -- * -- B -- O -- O -- O (ae2897195d) \ N -- N -- N refs/heads/gh-pages (4aa7329f23) You should already have received notification emails for all of the O revisions, and so the following emails describe only the N revisions from the common base, B. Any revisions marked "omit" are not gone; other references still refer to them. Any revisions marked "discard" are gone forever. The 1 revisions listed above as "new" are entirely new to this repository and will be described in separate emails. The revisions listed as "add" were already present in the repository and have only been added to this reference. 
Summary of changes: announcements/index.html | 6 + blog/airflow-1.10.10/index.html| 4 +- blog/airflow-1.10.12/index.html| 4 +- blog/airflow-1.10.8-1.10.9/index.html | 4 +- blog/airflow-2.2.0/index.html | 4 +- blog/airflow-2.3.0/index.html | 4 +- blog/airflow-2.4.0/index.html | 4 +- blog/airflow-2.5.0/index.html | 4 +- blog/airflow-2.6.0/index.html | 4 +- blog/airflow-2.7.0/index.html | 4 +- blog/airflow-2.8.0/index.html | 4 +- blog/airflow-survey-2020/index.html| 4 +- blog/airflow-survey-2022/index.html| 4 +- blog/airflow-survey/index.html | 4 +- blog/airflow-two-point-oh-is-here/index.html | 4 +- blog/airflow_summit_2021/index.html| 4 +- blog/airflow_summit_2022/index.html| 4 +- blog/announcing-new-website/index.html | 4 +- blog/apache-airflow-for-newcomers/index.html | 4 +- .../index.html | 4 +- .../index.html | 4 +- .../index.html | 4 +- .../index.html | 4 +- blog/fab-oid-vulnerability/index.html | 4 +- .../index.html | 4 +- blog/introducing_setup_teardown/index.html | 4 +- .../index.html | 4 +- search/index.html | 4 +- sitemap.xml| 134 ++--- use-cases/adobe/index.html | 4 +- use-cases/adyen/index.html | 4 +- use-cases/big-fish-games/index.html| 4 +- use-cases/business_operations/index.html | 4 +- use-cases/dish/index.html | 4 +- use-cases/etl_analytics/index.html | 4 +- use-cases/experity/index.html | 4 +- use-cases/infrastructure-management/index.html | 4 +- use-cases/mlops/index.html | 4 +- use-cases/onefootball/index.html | 4 +- use-cases/plarium-krasnodar/index.html | 4 +- use-cases/seniorlink/index.html| 4 +- use-cases/sift/index.html | 4 +- use-cases/snapp/index.html | 4 +- use-cases/suse/index.html | 4 +- 44 files changed, 157 insertions(+), 151 deletions(-)
Re: [I] DatabricksSQLOperator struggles with parsing the data [airflow]
github-actions[bot] commented on issue #36838: URL: https://github.com/apache/airflow/issues/36838#issuecomment-2019139299 This issue has been automatically marked as stale because it has been open for 14 days with no response from the author. It will be closed in next 7 days if no further activity occurs from the issue author.
Re: [PR] wip: Avoid SQLAchemy Query API usage in tests [airflow]
github-actions[bot] commented on PR #37242: URL: https://github.com/apache/airflow/pull/37242#issuecomment-2019139260 This pull request has been automatically marked as stale because it has not had recent activity. It will be closed in 5 days if no further activity occurs. Thank you for your contributions.
Re: [PR] FIX: Update Airflow Airbyte Provider to use the new Airbyte API [airflow]
github-actions[bot] commented on PR #37244: URL: https://github.com/apache/airflow/pull/37244#issuecomment-2019139236 This pull request has been automatically marked as stale because it has not had recent activity. It will be closed in 5 days if no further activity occurs. Thank you for your contributions.
(airflow-site) branch announce_2.8.4 deleted (was ef7fc51919)
This is an automated email from the ASF dual-hosted git repository. jedcunningham pushed a change to branch announce_2.8.4 in repository https://gitbox.apache.org/repos/asf/airflow-site.git was ef7fc51919 Announce 2.8.4 The revisions that were on this branch are still contained in other references; therefore, this change does not discard any commits from the repository.
Re: [PR] Announce 2.8.4 [airflow-site]
potiuk merged PR #989: URL: https://github.com/apache/airflow-site/pull/989
(airflow-site) branch main updated: Announce 2.8.4 (#989)
This is an automated email from the ASF dual-hosted git repository. potiuk pushed a commit to branch main in repository https://gitbox.apache.org/repos/asf/airflow-site.git The following commit(s) were added to refs/heads/main by this push: new 12401831c2 Announce 2.8.4 (#989) 12401831c2 is described below commit 12401831c2362c9a43602338ee5d24d6fcf72e2b Author: Jed Cunningham <66968678+jedcunning...@users.noreply.github.com> AuthorDate: Mon Mar 25 20:04:28 2024 -0400 Announce 2.8.4 (#989) --- landing-pages/site/content/en/announcements/_index.md | 9 + 1 file changed, 9 insertions(+) diff --git a/landing-pages/site/content/en/announcements/_index.md b/landing-pages/site/content/en/announcements/_index.md index 96f3c42d5b..a96189e340 100644 --- a/landing-pages/site/content/en/announcements/_index.md +++ b/landing-pages/site/content/en/announcements/_index.md @@ -15,6 +15,15 @@ menu: # March 25, 2024 +We’ve just released Apache **Airflow 2.8.4**. + +📦 PyPI: https://pypi.org/project/apache-airflow/2.8.4/ \ +📚 Docs: https://airflow.apache.org/docs/apache-airflow/2.8.4 \ +🛠️ Release Notes: https://airflow.apache.org/docs/apache-airflow/2.8.4/release_notes.html \ +🪶 Sources: https://airflow.apache.org/docs/apache-airflow/2.8.4/installation/installing-from-sources.html + +# March 25, 2024 + We've just released Apache **Airflow Helm chart 1.13.1**. 📦 ArtifactHub: https://artifacthub.io/packages/helm/apache-airflow/airflow \
(airflow) branch main updated: Apply task instance mutation hook consistently (#38440)
This is an automated email from the ASF dual-hosted git repository. potiuk pushed a commit to branch main in repository https://gitbox.apache.org/repos/asf/airflow.git The following commit(s) were added to refs/heads/main by this push: new 615e1eceff Apply task instance mutation hook consistently (#38440) 615e1eceff is described below commit 615e1eceffcb5c3f30b7f137d4f9d2b482fffcbc Author: Jens Scheffler <95105677+jsche...@users.noreply.github.com> AuthorDate: Tue Mar 26 01:04:07 2024 +0100 Apply task instance mutation hook consistently (#38440) * Apply task instance mutation hook consistently * Add test for cluster policy applied in pytest --- airflow/models/taskinstance.py| 3 +++ tests/models/test_taskinstance.py | 16 +--- 2 files changed, 16 insertions(+), 3 deletions(-) diff --git a/airflow/models/taskinstance.py b/airflow/models/taskinstance.py index 7619d06989..9968da5898 100644 --- a/airflow/models/taskinstance.py +++ b/airflow/models/taskinstance.py @@ -98,6 +98,7 @@ from airflow.models.taskreschedule import TaskReschedule from airflow.models.xcom import LazyXComAccess, XCom from airflow.plugins_manager import integrate_macros_plugins from airflow.sentry import Sentry +from airflow.settings import task_instance_mutation_hook from airflow.stats import Stats from airflow.templates import SandboxedEnvironment from airflow.ti_deps.dep_context import DepContext @@ -943,6 +944,8 @@ def _refresh_from_task( task_instance.executor_config = task.executor_config task_instance.operator = task.task_type task_instance.custom_operator_name = getattr(task, "custom_operator_name", None) +# Re-apply cluster policy here so that task default do not overload previous data +task_instance_mutation_hook(task_instance) def _record_task_map_for_downstreams( diff --git a/tests/models/test_taskinstance.py b/tests/models/test_taskinstance.py index 74a803f941..02069d382c 100644 --- a/tests/models/test_taskinstance.py +++ b/tests/models/test_taskinstance.py @@ -3348,10 +3348,20 @@ class 
TestTaskInstance: @pytest.mark.parametrize("pool_override", [None, "test_pool2"]) -def test_refresh_from_task(pool_override): +@pytest.mark.parametrize("queue_by_policy", [None, "forced_queue"]) +def test_refresh_from_task(pool_override, queue_by_policy, monkeypatch): +default_queue = "test_queue" +expected_queue = queue_by_policy or default_queue +if queue_by_policy: +# Apply a dummy cluster policy to check if it is always applied +def mock_policy(task_instance: TaskInstance): +task_instance.queue = queue_by_policy + + monkeypatch.setattr("airflow.models.taskinstance.task_instance_mutation_hook", mock_policy) + task = EmptyOperator( task_id="empty", -queue="test_queue", +queue=default_queue, pool="test_pool1", pool_slots=3, priority_weight=10, @@ -3362,7 +3372,7 @@ def test_refresh_from_task(pool_override): ti = TI(task, run_id=None) ti.refresh_from_task(task, pool_override=pool_override) -assert ti.queue == task.queue +assert ti.queue == expected_queue if pool_override: assert ti.pool == pool_override
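A simplified model of what the commit above changes: the mutation hook is re-applied after task defaults are copied onto the task instance, so a cluster policy's settings are not overwritten. Class and queue names are illustrative, mirroring the test:

```python
class FakeTaskInstance:
    def __init__(self):
        self.queue = None

def task_instance_mutation_hook(ti):
    # A cluster policy: route every task instance to one queue.
    ti.queue = "forced_queue"

def refresh_from_task(ti, task_queue):
    ti.queue = task_queue            # task defaults are copied first...
    task_instance_mutation_hook(ti)  # ...then the policy is re-applied,
                                     # so defaults cannot overwrite it (#38440)

ti = FakeTaskInstance()
refresh_from_task(ti, "test_queue")
print(ti.queue)  # forced_queue
```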
Re: [PR] Apply task instance mutation hook consistently [airflow]
potiuk merged PR #38440: URL: https://github.com/apache/airflow/pull/38440
[PR] Chart: Default airflow version to 2.8.4 [airflow]
jedcunningham opened a new pull request, #38478: URL: https://github.com/apache/airflow/pull/38478 (no comment)
Re: [PR] Fix image cache optimizations - speeding up the build (#38442) [airflow]
potiuk commented on PR #38473: URL: https://github.com/apache/airflow/pull/38473#issuecomment-2019108636 Back green - this time with the **right** runner.
Re: [I] Apache Beam pipeline option parsed incorrectly if value is False [airflow]
potiuk commented on issue #38457: URL: https://github.com/apache/airflow/issues/38457#issuecomment-2019104029 Assigned you @dondaum
Re: [PR] Allow users to write dag_id and task_id in their national characters, added display name for dag / task (v2) [airflow]
bbovenzi commented on code in PR #38446: URL: https://github.com/apache/airflow/pull/38446#discussion_r1538363623 ## airflow/www/static/js/dag/details/Header.tsx: ## @@ -105,7 +106,7 @@ const Header = () => { onClick={clearSelection} _hover={isDagDetails ? { cursor: "default" } : undefined} > - + Review Comment: Should we use dagId as a backup or will displayName always exist?
Re: [PR] Allow users to write dag_id and task_id in their national characters, added display name for dag / task (v2) [airflow]
bbovenzi commented on code in PR #38446: URL: https://github.com/apache/airflow/pull/38446#discussion_r1538363397 ## airflow/www/static/js/dag/details/Header.tsx: ## @@ -34,6 +34,7 @@ import RunTypeIcon from "src/components/RunTypeIcon"; import BreadcrumbText from "./BreadcrumbText"; const dagId = getMetaValue("dag_id"); +const dagDisplayName = getMetaValue("dag_display_name"); Review Comment: We actually have to add this to the meta tags of `dag.html`, right now this is blank.
[PR] Airflow 2.8.4 has been released [airflow]
jedcunningham opened a new pull request, #38477: URL: https://github.com/apache/airflow/pull/38477 (no comment)
Re: [PR] Migrate to connexion v3 [airflow]
Satoshi-Sh commented on PR #37638: URL: https://github.com/apache/airflow/pull/37638#issuecomment-2019084388 I just fixed test_cors.py. There might be room for improvement in the refactoring. If someone can review it, that would be great. https://github.com/sudiptob2/airflow/pull/35
Re: [PR] Allow users to write dag_id and task_id in their national characters, added display name for dag / task (v2) [airflow]
potiuk commented on PR #38446: URL: https://github.com/apache/airflow/pull/38446#issuecomment-2019081969 LGTM. @ephraimbuddy @uranusjr ?
Re: [PR] Add executor field to the DB and parameter to the operators [airflow]
potiuk commented on PR #38474: URL: https://github.com/apache/airflow/pull/38474#issuecomment-2019075905 Left an unresolved conversation to avoid accidental merge
Re: [PR] Add executor field to the DB and parameter to the operators [airflow]
potiuk commented on code in PR #38474: URL: https://github.com/apache/airflow/pull/38474#discussion_r1538347137 ## airflow/migrations/versions/0139_2_10_0_add_new_executor_field_to_db.py: ## @@ -0,0 +1,46 @@ +# Review Comment: Please do not merge before 2.0.9b2 is cut!
[PR] Improve datasets graph UX [airflow]
bbovenzi opened a new pull request, #38476: URL: https://github.com/apache/airflow/pull/38476 We had a bunch of complicated logic trying to separate out unconnected dataset "graphs" from each other in the UI. It was buggy and could cause the UI to crash if there were too many datasets. So I decided to refactor a few things with the tools I have available:
- Always render the full datasets dependencies graph
- Merge the Dataset and DAG search bars into one, with a single selection vs multiselect
- Can click to select a DAG
- Hover on a DAG shows a popover linking to its page
- Add loading indicator to the graph
- Zoom the graph to the node upon selection
- Highlight edges connected to a DAG

https://github.com/apache/airflow/assets/4600967/3e2ea7f6-3fd8-4fb5-932e-23b459b8a669 ![Mar-25-2024 19-03-17](https://github.com/apache/airflow/assets/4600967/5b172900-32a7-448f-ad43-f46baadeba40) --- **^ Add meaningful description above** Read the **[Pull Request Guidelines](https://github.com/apache/airflow/blob/main/contributing-docs/05_pull_requests.rst#pull-request-guidelines)** for more information. In case of fundamental code changes, an Airflow Improvement Proposal ([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvement+Proposals)) is needed. In case of a new dependency, check compliance with the [ASF 3rd Party License Policy](https://www.apache.org/legal/resolved.html#category-x). In case of backwards incompatible changes please leave a note in a newsfragment file, named `{pr_number}.significant.rst` or `{issue_number}.significant.rst`, in [newsfragments](https://github.com/apache/airflow/tree/main/newsfragments).
Re: [PR] Apply task instance mutation hook consistently [airflow]
potiuk commented on PR #38440: URL: https://github.com/apache/airflow/pull/38440#issuecomment-2019069077 nice!
Re: [PR] Fix image cache optimizations - speeding up the build (#38442) [airflow]
potiuk commented on PR #38473: URL: https://github.com/apache/airflow/pull/38473#issuecomment-2019066412 The reason it failed before in https://github.com/apache/airflow/pull/38468 was that the "build-images" workflow used "self-hosted" runners to build images, but we should switch to "public" runners. This has been fixed in #38463
Re: [PR] Add executor field to the DB and parameter to the operators [airflow]
ferruzzi commented on PR #38474: URL: https://github.com/apache/airflow/pull/38474#issuecomment-2019064093 Previously had approval from @potiuk and @eladkal, safe to merge after https://github.com/apache/airflow/pull/38446 "and one other PR" See [here](https://github.com/apache/airflow/pull/38054#issuecomment-2018507597) for details. ## Don't be a ferruzzi! :P
Re: [PR] Helm chart 1.13.1 has been released [airflow]
jedcunningham merged PR #38469: URL: https://github.com/apache/airflow/pull/38469
Re: [PR] Implement the breeze tag_providers command [airflow]
potiuk commented on code in PR #38447: URL: https://github.com/apache/airflow/pull/38447#discussion_r1538336861 ## dev/breeze/src/airflow_breeze/commands/release_management_commands.py: ## @@ -949,6 +950,81 @@ def run_generate_constraints_in_parallel( ) +@release_management.command( +name="tag-providers", +help="Generates tags for airflow provider releases.", +) +@click.option( +"--clean-local-tags", Review Comment: https://click.palletsprojects.com/en/8.1.x/options/#boolean-flags
(airflow) branch main updated: Helm chart 1.13.1 has been released (#38469)
This is an automated email from the ASF dual-hosted git repository. jedcunningham pushed a commit to branch main in repository https://gitbox.apache.org/repos/asf/airflow.git The following commit(s) were added to refs/heads/main by this push: new e92e0f7a79 Helm chart 1.13.1 has been released (#38469) e92e0f7a79 is described below commit e92e0f7a79c3e602d5d81dd84d3ac3ddd112efd2 Author: Jed Cunningham <66968678+jedcunning...@users.noreply.github.com> AuthorDate: Mon Mar 25 18:58:52 2024 -0400 Helm chart 1.13.1 has been released (#38469) --- .github/ISSUE_TEMPLATE/airflow_helmchart_bug_report.yml | 3 ++- chart/Chart.yaml| 4 ++-- 2 files changed, 4 insertions(+), 3 deletions(-) diff --git a/.github/ISSUE_TEMPLATE/airflow_helmchart_bug_report.yml b/.github/ISSUE_TEMPLATE/airflow_helmchart_bug_report.yml index edb4b4a54a..5f89251010 100644 --- a/.github/ISSUE_TEMPLATE/airflow_helmchart_bug_report.yml +++ b/.github/ISSUE_TEMPLATE/airflow_helmchart_bug_report.yml @@ -28,7 +28,8 @@ body: What Apache Airflow Helm Chart version are you using? multiple: false options: -- "1.13.0 (latest released)" +- "1.13.1 (latest released)" +- "1.13.0" - "1.12.0" - "1.11.0" - "1.10.0" diff --git a/chart/Chart.yaml b/chart/Chart.yaml index 3f35faffb2..d65defcdbd 100644 --- a/chart/Chart.yaml +++ b/chart/Chart.yaml @@ -19,7 +19,7 @@ --- apiVersion: v2 name: airflow -version: 1.13.1 +version: 1.14.0 appVersion: 2.8.3 description: The official Helm chart to deploy Apache Airflow, a platform to programmatically author, schedule, and monitor workflows @@ -44,7 +44,7 @@ type: application annotations: artifacthub.io/links: | - name: Documentation - url: https://airflow.apache.org/docs/helm-chart/1.13.1/ + url: https://airflow.apache.org/docs/helm-chart/1.14.0/ artifacthub.io/screenshots: | - title: DAGs View url: https://airflow.apache.org/docs/apache-airflow/2.8.3/_images/dags.png
Re: [PR] Use `importlib_metadata` with compat to Python 3.10/3.12 stdlib [airflow]
Taragolis commented on PR #38366: URL: https://github.com/apache/airflow/pull/38366#issuecomment-2019062556 > Rebase and merge @Taragolis ? Yep, I will do it in the morning; I'm away from my laptop
Re: [PR] Add executor field to the DB and parameter to the operators [airflow]
ferruzzi commented on PR #38054: URL: https://github.com/apache/airflow/pull/38054#issuecomment-2019061460 I have access to the fork/branch he submitted it from, [aws-mwaa:onikolas/aip-61/db_migration*](https://github.com/aws-mwaa/upstream-to-airflow/tree/onikolas/aip-61/db_migration), so I resubmitted it [here](https://github.com/apache/airflow/pull/38474) and verified that it shows his credit on the commits. * I got that origin fork/branch from here: ![image](https://github.com/apache/airflow/assets/1920178/ae87d0ca-6111-4bdd-966b-ea83a6e109d4)
Re: [PR] Implement the breeze tag_providers command [airflow]
potiuk commented on code in PR #38447: URL: https://github.com/apache/airflow/pull/38447#discussion_r1538335745 ## dev/breeze/src/airflow_breeze/commands/release_management_commands.py: ## @@ -949,6 +950,81 @@ def run_generate_constraints_in_parallel( ) +@release_management.command( +name="tag-providers", +help="Generates tags for airflow provider releases.", +) +@click.option( +"--clean-local-tags", Review Comment: NIT: ```suggestion "--clean-local-tags/--no-clean-local-tags", ``` This will allow you to toggle it to false by specifying `--no-clean-local-tags`.
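The review suggestion above uses click's paired boolean-flag syntax. A minimal sketch of how it behaves, assuming a `default=True` and mirroring the command/option names from the review (not the actual breeze implementation):

```python
import click
from click.testing import CliRunner


@click.command(name="tag-providers")
@click.option(
    "--clean-local-tags/--no-clean-local-tags",
    default=True,
    show_default=True,
    help="Delete local tags if pushing them fails.",
)
def tag_providers(clean_local_tags: bool) -> None:
    # The slash syntax registers a paired on/off flag, so the option can be
    # switched off with --no-clean-local-tags instead of passing a value.
    click.echo(f"clean_local_tags={clean_local_tags}")


if __name__ == "__main__":
    runner = CliRunner()
    print(runner.invoke(tag_providers, []).output.strip())  # clean_local_tags=True
    print(runner.invoke(tag_providers, ["--no-clean-local-tags"]).output.strip())  # clean_local_tags=False
```

With a plain `--clean-local-tags` option there is no built-in way to negate it on the command line; the slash form adds the `--no-…` variant for free.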
[PR] Announce 2.8.4 [airflow-site]
jedcunningham opened a new pull request, #989: URL: https://github.com/apache/airflow-site/pull/989 (no comment)
(airflow-site) branch announce_2.8.4 created (now ef7fc51919)
jedcunningham pushed a change to branch announce_2.8.4 in repository https://gitbox.apache.org/repos/asf/airflow-site.git at ef7fc51919 Announce 2.8.4 This branch includes the following new commits: new ef7fc51919 Announce 2.8.4 The 1 revisions listed above as "new" are entirely new to this repository and will be described in separate emails. The revisions listed as "add" were already present in the repository and have only been added to this reference.
[PR] Add executor field to the DB and parameter to the operators [airflow]
ferruzzi opened a new pull request, #38474: URL: https://github.com/apache/airflow/pull/38474 Resubmitting https://github.com/apache/airflow/pull/38054
(airflow-site) 01/01: Announce 2.8.4
This is an automated email from the ASF dual-hosted git repository. jedcunningham pushed a commit to branch announce_2.8.4 in repository https://gitbox.apache.org/repos/asf/airflow-site.git commit ef7fc51919fc130b3e8a63ff5eadc200f3e8e8d5 Author: Jed Cunningham AuthorDate: Mon Mar 25 18:53:07 2024 -0400 Announce 2.8.4 --- landing-pages/site/content/en/announcements/_index.md | 9 + 1 file changed, 9 insertions(+) diff --git a/landing-pages/site/content/en/announcements/_index.md b/landing-pages/site/content/en/announcements/_index.md index 96f3c42d5b..a96189e340 100644 --- a/landing-pages/site/content/en/announcements/_index.md +++ b/landing-pages/site/content/en/announcements/_index.md @@ -15,6 +15,15 @@ menu: # March 25, 2024 +We’ve just released Apache **Airflow 2.8.4**. + +📦 PyPI: https://pypi.org/project/apache-airflow/2.8.4/ \ +📚 Docs: https://airflow.apache.org/docs/apache-airflow/2.8.4 \ +🛠️ Release Notes: https://airflow.apache.org/docs/apache-airflow/2.8.4/release_notes.html \ +🪶 Sources: https://airflow.apache.org/docs/apache-airflow/2.8.4/installation/installing-from-sources.html + +# March 25, 2024 + We've just released Apache **Airflow Helm chart 1.13.1**. 📦 ArtifactHub: https://artifacthub.io/packages/helm/apache-airflow/airflow \
Re: [PR] Use `importlib_metadata` with compat to Python 3.10/3.12 stdlib [airflow]
potiuk commented on PR #38366: URL: https://github.com/apache/airflow/pull/38366#issuecomment-2019057969 Rebase and merge @Taragolis ?
Re: [PR] Add executor field to the DB and parameter to the operators [airflow]
potiuk commented on PR #38054: URL: https://github.com/apache/airflow/pull/38054#issuecomment-2019056738 yep. @o-nikolas to open a PR or you can do it by cherry-picking the reverted commit
Re: [PR] Suppress error if no tests collected in pydantic special Tests [airflow]
potiuk commented on code in PR #38470: URL: https://github.com/apache/airflow/pull/38470#discussion_r1538332876 ## dev/breeze/src/airflow_breeze/commands/testing_commands.py: ## @@ -679,6 +679,11 @@ def _run_test_command( fix_ownership_using_docker() cleanup_python_generated_files() perform_environment_checks() +if pydantic: Review Comment: yeah. Would be ok now.
Re: [PR] Add executor field to the DB and parameter to the operators [airflow]
ferruzzi commented on PR #38054: URL: https://github.com/apache/airflow/pull/38054#issuecomment-2019055358 I have reverted my accidental merge here. I don't see a way to reopen this PR, so we may need to apply it again later by reverting the revert, unless @o-nikolas opens a new PR?
(airflow) branch main updated: Revert "Add executor field to the DB and parameter to the operators (#38054)" (#38472)
This is an automated email from the ASF dual-hosted git repository. ferruzzi pushed a commit to branch main in repository https://gitbox.apache.org/repos/asf/airflow.git The following commit(s) were added to refs/heads/main by this push: new cbca35918b Revert "Add executor field to the DB and parameter to the operators (#38054)" (#38472) cbca35918b is described below commit cbca35918ba9f9edd233c38072a92aed37f08c3c Author: D. Ferruzzi AuthorDate: Mon Mar 25 15:49:04 2024 -0700 Revert "Add executor field to the DB and parameter to the operators (#38054)" (#38472) This reverts commit 41d5e2226c10c78ee6f493f8e54637dca2f72e32. Co-authored-by: Jarek Potiuk --- .../0139_2_10_0_add_new_executor_field_to_db.py| 46 -- airflow/models/abstractoperator.py | 1 - airflow/models/baseoperator.py | 13 - airflow/models/mappedoperator.py | 5 - airflow/models/taskinstance.py | 7 - airflow/serialization/pydantic/taskinstance.py | 1 - airflow/serialization/schema.json | 1 - docs/apache-airflow/img/airflow_erd.sha256 | 2 +- docs/apache-airflow/img/airflow_erd.svg| 811 ++--- docs/apache-airflow/migrations-ref.rst | 4 +- tests/models/test_taskinstance.py | 1 - tests/serialization/test_dag_serialization.py | 1 - tests/www/views/test_views_tasks.py| 7 - 13 files changed, 406 insertions(+), 494 deletions(-) diff --git a/airflow/migrations/versions/0139_2_10_0_add_new_executor_field_to_db.py b/airflow/migrations/versions/0139_2_10_0_add_new_executor_field_to_db.py deleted file mode 100644 index 9e3d615f50..00 --- a/airflow/migrations/versions/0139_2_10_0_add_new_executor_field_to_db.py +++ /dev/null @@ -1,46 +0,0 @@ -# -# Licensed to the Apache Software Foundation (ASF) under one -# or more contributor license agreements. See the NOTICE file -# distributed with this work for additional information -# regarding copyright ownership. The ASF licenses this file -# to you under the Apache License, Version 2.0 (the -# "License"); you may not use this file except in compliance -# with the License. 
You may obtain a copy of the License at -# -# http://www.apache.org/licenses/LICENSE-2.0 -# -# Unless required by applicable law or agreed to in writing, -# software distributed under the License is distributed on an -# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY -# KIND, either express or implied. See the License for the -# specific language governing permissions and limitations -# under the License. - -"""add new executor field to db - -Revision ID: 677fdbb7fc54 -Revises: b4078ac230a1 -Create Date: 2024-03-25 15:26:59.186579 - -""" - -import sqlalchemy as sa -from alembic import op - - -# revision identifiers, used by Alembic. -revision = '677fdbb7fc54' -down_revision = 'b4078ac230a1' -branch_labels = None -depends_on = None -airflow_version = '2.10.0' - - -def upgrade(): -"""Apply add executor field to task instance""" -op.add_column('task_instance', sa.Column('executor', sa.String(length=1000), default=None)) - - -def downgrade(): -"""Unapply add executor field to task instance""" -op.drop_column('task_instance', 'executor') diff --git a/airflow/models/abstractoperator.py b/airflow/models/abstractoperator.py index 74911fc27c..f2d179f01b 100644 --- a/airflow/models/abstractoperator.py +++ b/airflow/models/abstractoperator.py @@ -59,7 +59,6 @@ if TYPE_CHECKING: DEFAULT_OWNER: str = conf.get_mandatory_value("operators", "default_owner") DEFAULT_POOL_SLOTS: int = 1 DEFAULT_PRIORITY_WEIGHT: int = 1 -DEFAULT_EXECUTOR: str | None = None DEFAULT_QUEUE: str = conf.get_mandatory_value("operators", "default_queue") DEFAULT_IGNORE_FIRST_DEPENDS_ON_PAST: bool = conf.getboolean( "scheduler", "ignore_first_depends_on_past_by_default" diff --git a/airflow/models/baseoperator.py b/airflow/models/baseoperator.py index c59b66309a..8636dd6c2e 100644 --- a/airflow/models/baseoperator.py +++ b/airflow/models/baseoperator.py @@ -63,7 +63,6 @@ from airflow.exceptions import ( ) from airflow.lineage import apply_lineage, prepare_lineage from airflow.models.abstractoperator 
import ( -DEFAULT_EXECUTOR, DEFAULT_IGNORE_FIRST_DEPENDS_ON_PAST, DEFAULT_OWNER, DEFAULT_POOL_SLOTS, @@ -209,7 +208,6 @@ _PARTIAL_DEFAULTS: dict[str, Any] = { "wait_for_past_depends_before_skipping": DEFAULT_WAIT_FOR_PAST_DEPENDS_BEFORE_SKIPPING, "wait_for_downstream": False, "retries": DEFAULT_RETRIES, -"executor": DEFAULT_EXECUTOR, "queue": DEFAULT_QUEUE, "pool_slots": DEFAULT_POOL_SLOTS, "execution_timeout": DEFAULT_TASK_EXECUTION_TIMEOUT, @@ -261,7 +259,6 @@ def partial( on_retry_callback: None | TaskStateChangeCallback | list[TaskStateChangeCallback] | ArgNotSet = NOTSET, on_skipped_callback: None | TaskStateChangeCallback
Re: [PR] Revert "Add executor field to the DB and parameter to the operators" [airflow]
ferruzzi merged PR #38472: URL: https://github.com/apache/airflow/pull/38472
Re: [PR] Add UV_REQUEST_TIMEOUT argument/envvar to building CI/PROD images [airflow]
potiuk merged PR #38467: URL: https://github.com/apache/airflow/pull/38467
Re: [I] Status of testing of Apache Airflow 2.8.4rc1 [airflow]
jedcunningham commented on issue #38334: URL: https://github.com/apache/airflow/issues/38334#issuecomment-2019052053 Thanks for testing! Release is in progress now 🍺
Re: [I] Status of testing of Apache Airflow 2.8.4rc1 [airflow]
jedcunningham closed issue #38334: Status of testing of Apache Airflow 2.8.4rc1 URL: https://github.com/apache/airflow/issues/38334
Re: [PR] Suppress error if no tests collected in pydantic special Tests [airflow]
Taragolis commented on code in PR #38470: URL: https://github.com/apache/airflow/pull/38470#discussion_r1538331411 ## dev/breeze/src/airflow_breeze/commands/testing_commands.py: ## @@ -679,6 +679,11 @@ def _run_test_command( fix_ownership_using_docker() cleanup_python_generated_files() perform_environment_checks() +if pydantic: Review Comment: So maybe `pydantic != "v2"` would be enough for now, and we could extend it if we find it happens in other cases
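For context on the "no tests collected" case being discussed: pytest exits with code 5 (`pytest.ExitCode.NO_TESTS_COLLECTED`) when its collection phase finds nothing, so a runner that wants to tolerate empty collections has to remap that exit code. A minimal sketch of the idea, with hypothetical helper names (this is not the actual breeze implementation):

```python
import subprocess
import sys

# pytest's documented exit code for "no tests were collected"
# (pytest.ExitCode.NO_TESTS_COLLECTED).
NO_TESTS_COLLECTED = 5


def normalize_pytest_exit_code(code: int, suppress_no_tests: bool) -> int:
    """Treat 'no tests collected' as success when the caller opts in."""
    if suppress_no_tests and code == NO_TESTS_COLLECTED:
        return 0
    return code


def run_pytest(args: list[str], suppress_no_tests: bool = False) -> int:
    # Hypothetical wrapper: run pytest in a subprocess and remap its exit code
    # before propagating it to CI, so an empty collection does not fail the job.
    result = subprocess.run([sys.executable, "-m", "pytest", *args])
    return normalize_pytest_exit_code(result.returncode, suppress_no_tests)
```

Gating the suppression on a condition such as `pydantic != "v2"` (as suggested in the review) would mean only the special pydantic runs tolerate empty collections, while a genuinely empty test run elsewhere still fails loudly.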
(airflow) branch fix-image-cache-2 updated (73e1923443 -> 422ee3a2a6)
This is an automated email from the ASF dual-hosted git repository. potiuk pushed a change to branch fix-image-cache-2 in repository https://gitbox.apache.org/repos/asf/airflow.git discard 73e1923443 Fix image cache optimizations - speeding up the build (#38442) omit bafe734a10 Cleans up runs-on in workflows add ff28969ff3 fix: EmrServerlessStartJobOperator not serializing DAGs correctly when partial/expand is used. (#38022) add 41d5e2226c Add executor field to the DB and parameter to the operators (#38054) add d83dd02816 Cleans up runs-on in workflows (#38463) new 422ee3a2a6 Fix image cache optimizations - speeding up the build (#38442) This update added new revisions after undoing existing revisions. That is to say, some revisions that were in the old version of the branch are not in the new version. This situation occurs when a user --force pushes a change and generates a repository containing something like this: * -- * -- B -- O -- O -- O (73e1923443) \ N -- N -- N refs/heads/fix-image-cache-2 (422ee3a2a6) You should already have received notification emails for all of the O revisions, and so the following emails describe only the N revisions from the common base, B. Any revisions marked "omit" are not gone; other references still refer to them. Any revisions marked "discard" are gone forever. The 1 revisions listed above as "new" are entirely new to this repository and will be described in separate emails. The revisions listed as "add" were already present in the repository and have only been added to this reference. 
Summary of changes: ...=> 0139_2_10_0_add_new_executor_field_to_db.py} | 23 +- airflow/models/abstractoperator.py | 1 + airflow/models/baseoperator.py | 13 + airflow/models/mappedoperator.py | 5 + airflow/models/taskinstance.py | 7 + airflow/providers/amazon/aws/operators/emr.py | 62 +- airflow/serialization/pydantic/taskinstance.py | 1 + airflow/serialization/schema.json | 1 + docs/apache-airflow/img/airflow_erd.sha256 | 2 +- docs/apache-airflow/img/airflow_erd.svg| 811 +++-- docs/apache-airflow/migrations-ref.rst | 4 +- tests/models/test_taskinstance.py | 1 + .../amazon/aws/operators/test_emr_serverless.py| 55 ++ tests/serialization/test_dag_serialization.py | 1 + tests/www/views/test_views_tasks.py| 7 + 15 files changed, 572 insertions(+), 422 deletions(-) copy airflow/migrations/versions/{0010_1_6_2_add_password_column_to_user.py => 0139_2_10_0_add_new_executor_field_to_db.py} (67%)
Re: [PR] Fix image cache optimizations - speeding up the build (#38442) [airflow]
potiuk commented on PR #38473: URL: https://github.com/apache/airflow/pull/38473#issuecomment-2019049908 Second attempt at restoring the cache optimizations
(airflow) 01/01: Fix image cache optimizations - speeding up the build (#38442)
This is an automated email from the ASF dual-hosted git repository. potiuk pushed a commit to branch fix-image-cache-2 in repository https://gitbox.apache.org/repos/asf/airflow.git commit 422ee3a2a63a419971dd862d39ac2ce7587e1614 Author: Jarek Potiuk AuthorDate: Mon Mar 25 18:16:06 2024 +0100 Fix image cache optimizations - speeding up the build (#38442) The recent refactors in workflows broke the way how cache had been used in the CI builds. This PR brings back the optimizations by using the cache and rebuilding it. --- .github/workflows/additional-ci-image-checks.yml | 56 +- .github/workflows/build-images.yml | 3 +- .github/workflows/ci-image-build.yml | 17 +++--- .github/workflows/ci.yml | 2 + .github/workflows/finalize-tests.yml | 67 + .github/workflows/prod-image-build.yml | 9 ++- .github/workflows/prod-image-extra-checks.yml | 3 + .github/workflows/push-image-cache.yml | 68 ++ .../airflow_breeze/utils/docker_command_utils.py | 6 +- dev/breeze/src/airflow_breeze/utils/image.py | 14 + 10 files changed, 128 insertions(+), 117 deletions(-) diff --git a/.github/workflows/additional-ci-image-checks.yml b/.github/workflows/additional-ci-image-checks.yml index 84012fb24f..f73fe08ec7 100644 --- a/.github/workflows/additional-ci-image-checks.yml +++ b/.github/workflows/additional-ci-image-checks.yml @@ -89,42 +89,40 @@ jobs: # delay cache refresh. It does not attempt to upgrade to newer dependencies. # We only push CI cache as PROD cache usually does not gain as much from fresh cache because # it uses prepared airflow and provider packages that invalidate the cache anyway most of the time - # push-early-buildx-cache-to-github-registry: - # name: Push Early Image Cache - # uses: ./.github/workflows/push-image-cache.yml - # permissions: - # contents: read - # # This write is only given here for `push` events from "apache/airflow" repo. It is not given for PRs - # # from forks. This is to prevent malicious PRs from creating images in the "apache/airflow" repo. 
- # # For regular build for PRS this "build-prod-images" workflow will be skipped anyway by the - # # "in-workflow-build" condition - # packages: write - # secrets: inherit - # with: - # runs-on: ${{ inputs.runs-on }} - # cache-type: "Early" - # include-prod-images: "false" - # push-latest-images: "false" - # image-tag: ${{ inputs.image-tag }} - # python-versions: ${{ inputs.python-versions }} - # branch: ${{ inputs.branch }} - # use-uv: "true" - # include-success-outputs: ${{ inputs.include-success-outputs }} - # constraints-branch: ${{ inputs.constraints-branch }} - # docker-cache: ${{ inputs.docker-cache }} - # if: inputs.canary-run == 'true' && inputs.branch == 'main' + push-early-buildx-cache-to-github-registry: +name: Push Early Image Cache +uses: ./.github/workflows/push-image-cache.yml +permissions: + contents: read + # This write is only given here for `push` events from "apache/airflow" repo. It is not given for PRs + # from forks. This is to prevent malicious PRs from creating images in the "apache/airflow" repo. 
+ # For regular build for PRS this "build-prod-images" workflow will be skipped anyway by the + # "in-workflow-build" condition + packages: write +secrets: inherit +with: + # Runs on Public runners + cache-type: "Early" + include-prod-images: "false" + push-latest-images: "false" + platform: "linux/amd64" + python-versions: ${{ inputs.python-versions }} + branch: ${{ inputs.branch }} + constraints-branch: ${{ inputs.constraints-branch }} + use-uv: "true" + include-success-outputs: ${{ inputs.include-success-outputs }} + docker-cache: ${{ inputs.docker-cache }} +if: inputs.canary-run == 'true' && inputs.branch == 'main' # Check that after earlier cache push, breeze command will build quickly check-that-image-builds-quickly: -timeout-minutes: 5 +timeout-minutes: 11 name: Check that image builds quickly runs-on: ["ubuntu-22.04"] env: UPGRADE_TO_NEWER_DEPENDENCIES: false - PLATFORM: "linux/amd64" PYTHON_MAJOR_MINOR_VERSION: ${{ inputs.default-python-version }} PYTHON_VERSION: ${{ inputs.default-python-version }} - IMAGE_TAG: ${{ inputs.image-tag }} GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} if: inputs.canary-run == 'true' && inputs.branch == 'main' steps: @@ -142,7 +140,7 @@ jobs: - name: "Login to ghcr.io" run: echo "${{ env.GITHUB_TOKEN }}" | docker login ghcr.io -u ${{ github.actor }} --password-stdin - name: "Check that image builds quickly" -run: breeze shell --max-time 120 +run: breeze
[PR] Fix image cache optimizations - speeding up the build (#38442) [airflow]
potiuk opened a new pull request, #38473: URL: https://github.com/apache/airflow/pull/38473 The recent refactors in the workflows broke the way the cache was used in the CI builds. This PR brings back the optimizations by using the cache and rebuilding it. --- **^ Add meaningful description above** Read the **[Pull Request Guidelines](https://github.com/apache/airflow/blob/main/contributing-docs/05_pull_requests.rst#pull-request-guidelines)** for more information. In case of fundamental code changes, an Airflow Improvement Proposal ([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvement+Proposals)) is needed. In case of a new dependency, check compliance with the [ASF 3rd Party License Policy](https://www.apache.org/legal/resolved.html#category-x). In case of backwards incompatible changes please leave a note in a newsfragment file, named `{pr_number}.significant.rst` or `{issue_number}.significant.rst`, in [newsfragments](https://github.com/apache/airflow/tree/main/newsfragments). -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org For queries about this service, please contact Infrastructure at: us...@infra.apache.org
Re: [PR] Suppress error if no tests collected in pydantic special Tests [airflow]
Taragolis commented on code in PR #38470: URL: https://github.com/apache/airflow/pull/38470#discussion_r1538328070 ## dev/breeze/src/airflow_breeze/commands/testing_commands.py: ## @@ -679,6 +679,11 @@ def _run_test_command( fix_ownership_using_docker() cleanup_python_generated_files() perform_environment_checks() +if pydantic: Review Comment: Unlikely to happen in cases other than pydantic. This happens because Weaviate depends on pydantic, so all of its tests are simply skipped. In the pendulum and sqlalchemy cases it should work with both versions. boto3 relates mostly to the Amazon provider (if any others even depend on it), and there are hundreds of tests that do not require aiobotocore.
(airflow-site) branch 2.8.4-docs deleted (was 0abbb1cb78)
jedcunningham pushed a change to branch 2.8.4-docs in repository https://gitbox.apache.org/repos/asf/airflow-site.git was 0abbb1cb78 Add documentation for Apache Airflow 2.8.4 The revisions that were on this branch are still contained in other references; therefore, this change does not discard any commits from the repository.
(airflow) branch main updated (41d5e2226c -> d83dd02816)
potiuk pushed a change to branch main in repository https://gitbox.apache.org/repos/asf/airflow.git from 41d5e2226c Add executor field to the DB and parameter to the operators (#38054) add d83dd02816 Cleans up runs-on in workflows (#38463) No new revisions were added by this update. Summary of changes: .github/workflows/additional-ci-image-checks.yml | 2 +- .github/workflows/build-images.yml | 6 +++--- .github/workflows/check-providers.yml| 3 --- .github/workflows/ci.yml | 6 -- .github/workflows/helm-tests.yml | 4 +--- .github/workflows/integration-tests.yml | 1 - .github/workflows/k8s-tests.yml | 1 - .github/workflows/release_dockerhub_image.yml| 4 +--- .github/workflows/run-unit-tests.yml | 1 - 9 files changed, 10 insertions(+), 18 deletions(-)
Re: [PR] Cleans up runs-on in workflows [airflow]
potiuk merged PR #38463: URL: https://github.com/apache/airflow/pull/38463
Re: [PR] Add documentation for Apache Airflow 2.8.4 [airflow-site]
jedcunningham merged PR #988: URL: https://github.com/apache/airflow-site/pull/988
(airflow) branch revert-38054-onikolas/aip-61/db_migration created (now f87998fdde)
potiuk pushed a change to branch revert-38054-onikolas/aip-61/db_migration in repository https://gitbox.apache.org/repos/asf/airflow.git at f87998fdde Revert "Add executor field to the DB and parameter to the operators (#38054)" This branch includes the following new commits: new f87998fdde Revert "Add executor field to the DB and parameter to the operators (#38054)" The 1 revision listed above as "new" is entirely new to this repository and will be described in separate emails. The revisions listed as "add" were already present in the repository and have only been added to this reference.
Re: [PR] Suppress error if no tests collected in pydantic special Tests [airflow]
potiuk commented on code in PR #38470: URL: https://github.com/apache/airflow/pull/38470#discussion_r1538321814 ## dev/breeze/src/airflow_breeze/commands/testing_commands.py: ## @@ -679,6 +679,11 @@ def _run_test_command( fix_ownership_using_docker() cleanup_python_generated_files() perform_environment_checks() +if pydantic: Review Comment: (Maybe we extract those into a single property? `is_special_test`?)
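The extraction potiuk suggests could look something like the following sketch. This is hypothetical: the real `_run_test_command` in Breeze takes these as plain function parameters, and the exact names and grouping may differ.

```python
from dataclasses import dataclass


@dataclass
class SpecialTestFlags:
    """Hypothetical container for the flags that mark a 'special' test run."""

    pydantic: str = "v2"
    downgrade_sqlalchemy: bool = False
    downgrade_pendulum: bool = False
    remove_arm_packages: bool = False
    upgrade_boto: bool = False

    @property
    def is_special_test(self) -> bool:
        # Any non-default flag means a reduced/special test matrix,
        # where some providers may collect zero tests.
        return (
            self.pydantic != "v2"
            or self.downgrade_sqlalchemy
            or self.downgrade_pendulum
            or self.remove_arm_packages
            or self.upgrade_boto
        )
```

With such a property, the call site collapses to `if flags.is_special_test:` instead of repeating the five-way condition.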
[PR] Revert "Add executor field to the DB and parameter to the operators" [airflow]
ferruzzi opened a new pull request, #38472: URL: https://github.com/apache/airflow/pull/38472 Reverts apache/airflow#38054. ferruzzi pulled the trigger too early; this PR depends on others.
(airflow) 01/01: Revert "Add executor field to the DB and parameter to the operators (#38054)"
This is an automated email from the ASF dual-hosted git repository. potiuk pushed a commit to branch revert-38054-onikolas/aip-61/db_migration in repository https://gitbox.apache.org/repos/asf/airflow.git commit f87998fddee91bcaa2b63cbb6951e5d56d7b472a Author: Jarek Potiuk AuthorDate: Mon Mar 25 23:38:13 2024 +0100 Revert "Add executor field to the DB and parameter to the operators (#38054)" This reverts commit 41d5e2226c10c78ee6f493f8e54637dca2f72e32. --- .../0139_2_10_0_add_new_executor_field_to_db.py| 46 -- airflow/models/abstractoperator.py | 1 - airflow/models/baseoperator.py | 13 - airflow/models/mappedoperator.py | 5 - airflow/models/taskinstance.py | 7 - airflow/serialization/pydantic/taskinstance.py | 1 - airflow/serialization/schema.json | 1 - docs/apache-airflow/img/airflow_erd.sha256 | 2 +- docs/apache-airflow/img/airflow_erd.svg| 811 ++--- docs/apache-airflow/migrations-ref.rst | 4 +- tests/models/test_taskinstance.py | 1 - tests/serialization/test_dag_serialization.py | 1 - tests/www/views/test_views_tasks.py| 7 - 13 files changed, 406 insertions(+), 494 deletions(-) diff --git a/airflow/migrations/versions/0139_2_10_0_add_new_executor_field_to_db.py b/airflow/migrations/versions/0139_2_10_0_add_new_executor_field_to_db.py deleted file mode 100644 index 9e3d615f50..00 --- a/airflow/migrations/versions/0139_2_10_0_add_new_executor_field_to_db.py +++ /dev/null @@ -1,46 +0,0 @@ -# -# Licensed to the Apache Software Foundation (ASF) under one -# or more contributor license agreements. See the NOTICE file -# distributed with this work for additional information -# regarding copyright ownership. The ASF licenses this file -# to you under the Apache License, Version 2.0 (the -# "License"); you may not use this file except in compliance -# with the License. 
You may obtain a copy of the License at -# -# http://www.apache.org/licenses/LICENSE-2.0 -# -# Unless required by applicable law or agreed to in writing, -# software distributed under the License is distributed on an -# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY -# KIND, either express or implied. See the License for the -# specific language governing permissions and limitations -# under the License. - -"""add new executor field to db - -Revision ID: 677fdbb7fc54 -Revises: b4078ac230a1 -Create Date: 2024-03-25 15:26:59.186579 - -""" - -import sqlalchemy as sa -from alembic import op - - -# revision identifiers, used by Alembic. -revision = '677fdbb7fc54' -down_revision = 'b4078ac230a1' -branch_labels = None -depends_on = None -airflow_version = '2.10.0' - - -def upgrade(): -"""Apply add executor field to task instance""" -op.add_column('task_instance', sa.Column('executor', sa.String(length=1000), default=None)) - - -def downgrade(): -"""Unapply add executor field to task instance""" -op.drop_column('task_instance', 'executor') diff --git a/airflow/models/abstractoperator.py b/airflow/models/abstractoperator.py index 74911fc27c..f2d179f01b 100644 --- a/airflow/models/abstractoperator.py +++ b/airflow/models/abstractoperator.py @@ -59,7 +59,6 @@ if TYPE_CHECKING: DEFAULT_OWNER: str = conf.get_mandatory_value("operators", "default_owner") DEFAULT_POOL_SLOTS: int = 1 DEFAULT_PRIORITY_WEIGHT: int = 1 -DEFAULT_EXECUTOR: str | None = None DEFAULT_QUEUE: str = conf.get_mandatory_value("operators", "default_queue") DEFAULT_IGNORE_FIRST_DEPENDS_ON_PAST: bool = conf.getboolean( "scheduler", "ignore_first_depends_on_past_by_default" diff --git a/airflow/models/baseoperator.py b/airflow/models/baseoperator.py index c59b66309a..8636dd6c2e 100644 --- a/airflow/models/baseoperator.py +++ b/airflow/models/baseoperator.py @@ -63,7 +63,6 @@ from airflow.exceptions import ( ) from airflow.lineage import apply_lineage, prepare_lineage from airflow.models.abstractoperator 
import ( -DEFAULT_EXECUTOR, DEFAULT_IGNORE_FIRST_DEPENDS_ON_PAST, DEFAULT_OWNER, DEFAULT_POOL_SLOTS, @@ -209,7 +208,6 @@ _PARTIAL_DEFAULTS: dict[str, Any] = { "wait_for_past_depends_before_skipping": DEFAULT_WAIT_FOR_PAST_DEPENDS_BEFORE_SKIPPING, "wait_for_downstream": False, "retries": DEFAULT_RETRIES, -"executor": DEFAULT_EXECUTOR, "queue": DEFAULT_QUEUE, "pool_slots": DEFAULT_POOL_SLOTS, "execution_timeout": DEFAULT_TASK_EXECUTION_TIMEOUT, @@ -261,7 +259,6 @@ def partial( on_retry_callback: None | TaskStateChangeCallback | list[TaskStateChangeCallback] | ArgNotSet = NOTSET, on_skipped_callback: None | TaskStateChangeCallback | list[TaskStateChangeCallback] | ArgNotSet = NOTSET, run_as_user: str | None | ArgNotSet = NOTSET, -executor: str | None | ArgNotSet = NOTSET, executor_config: dict | None | ArgNotSet = NOTSET, inle
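The `NOTSET`/`ArgNotSet` pattern visible in the reverted `partial()` signature distinguishes "argument not passed" from an explicit `None`. A minimal, self-contained sketch of the idea follows; the real Airflow implementation lives elsewhere (in `airflow.utils.types`) and handles many more parameters.

```python
class ArgNotSet:
    """Sentinel type marking an argument the caller did not supply."""


NOTSET = ArgNotSet()

DEFAULT_RETRIES = 0  # stand-in for the configured default


def partial(retries=NOTSET):
    # An explicit None must survive; only a truly missing argument
    # falls back to the default value.
    if isinstance(retries, ArgNotSet):
        retries = DEFAULT_RETRIES
    return retries
```

This is why the diff can remove `"executor": DEFAULT_EXECUTOR` from `_PARTIAL_DEFAULTS` cleanly: defaults are applied only to arguments still carrying the sentinel.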
Re: [PR] Add executor field to the DB and parameter to the operators [airflow]
potiuk commented on PR #38054: URL: https://github.com/apache/airflow/pull/38054#issuecomment-2019042536 Yeah, I think it's best now. BTW, I had no idea the button was there :)
Re: [PR] Suppress error if no tests collected in pydantic special Tests [airflow]
potiuk commented on code in PR #38470: URL: https://github.com/apache/airflow/pull/38470#discussion_r1538319244 ## dev/breeze/src/airflow_breeze/commands/testing_commands.py: ## @@ -679,6 +679,11 @@ def _run_test_command( fix_ownership_using_docker() cleanup_python_generated_files() perform_environment_checks() +if pydantic: Review Comment: ```suggestion if pydantic != "v2" or downgrade_sqlalchemy or downgrade_pendulum or remove_arm_packages or upgrade_boto: ```
Re: [PR] Suppress error if no tests collected in pydantic special Tests [airflow]
potiuk commented on code in PR #38470: URL: https://github.com/apache/airflow/pull/38470#discussion_r1538320204 ## dev/breeze/src/airflow_breeze/commands/testing_commands.py: ## @@ -679,6 +679,11 @@ def _run_test_command( fix_ownership_using_docker() cleanup_python_generated_files() perform_environment_checks() +if pydantic: Review Comment: (and pydantic == 'v2' is the default)
Re: [PR] Suppress error if no tests collected in pydantic special Tests [airflow]
potiuk commented on code in PR #38470: URL: https://github.com/apache/airflow/pull/38470#discussion_r1538319654 ## dev/breeze/src/airflow_breeze/commands/testing_commands.py: ## @@ -679,6 +679,11 @@ def _run_test_command( fix_ownership_using_docker() cleanup_python_generated_files() perform_environment_checks() +if pydantic: Review Comment: I think this might **potentially** happen in those cases too.
Re: [PR] Suppress error if no tests collected in pydantic special Tests [airflow]
potiuk commented on code in PR #38470: URL: https://github.com/apache/airflow/pull/38470#discussion_r1538319244 ## dev/breeze/src/airflow_breeze/commands/testing_commands.py: ## @@ -679,6 +679,11 @@ def _run_test_command( fix_ownership_using_docker() cleanup_python_generated_files() perform_environment_checks() +if pydantic: Review Comment: ```suggestion if pydantic != "v2" or downgrade_sqlalchemy or downgrade_pendulum or remove_arm_packages: ```
Re: [PR] Add a task instance dependency for mapped dependencies (#37091) [airflow]
stevenschaerer commented on PR #37498: URL: https://github.com/apache/airflow/pull/37498#issuecomment-2019036100 @uranusjr Could you please review my latest changes and responses to your comments? Thanks!
Re: [PR] openlineage: add `opt-in` option [airflow]
JDarDagran commented on code in PR #37725: URL: https://github.com/apache/airflow/pull/37725#discussion_r1538317937 ## airflow/providers/openlineage/plugins/listener.py: ## @@ -51,6 +57,16 @@ def __init__(self): self.log = logging.getLogger(__name__) self.extractor_manager = ExtractorManager() self.adapter = OpenLineageAdapter() +self._selective_enable = conf.getboolean("openlineage", "selective_enable", fallback=False) + Review Comment: Moved logic to `conf.py` and related. ## tests/providers/openlineage/utils/test_selective_enable.py: ## @@ -0,0 +1,72 @@ +# +# Licensed to the Apache Software Foundation (ASF) under one +# or more contributor license agreements. See the NOTICE file +# distributed with this work for additional information +# regarding copyright ownership. The ASF licenses this file +# to you under the Apache License, Version 2.0 (the +# "License"); you may not use this file except in compliance +# with the License. You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, +# software distributed under the License is distributed on an +# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY +# KIND, either express or implied. See the License for the +# specific language governing permissions and limitations +# under the License. +from __future__ import annotations + +from airflow.decorators import dag, task +from airflow.models import DAG +from airflow.operators.empty import EmptyOperator +from airflow.providers.openlineage.utils.selective_enable import ( +DISABLE_OL_PARAM, +ENABLE_OL_PARAM, +ENABLE_OL_PARAM_NAME, +disable_lineage, +enable_lineage, +) + + +class TestOpenLineageSelectiveEnable: Review Comment: Added test case. -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. 
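The `selective_enable` option read in the listener uses the `fallback=` idiom; Airflow's `conf` object wraps the standard-library `ConfigParser`, which behaves the same way. A small illustration with a made-up config section:

```python
from configparser import ConfigParser

parser = ConfigParser()
parser.read_string(
    """
[openlineage]
selective_enable = true
"""
)

# A present option is parsed into a real boolean...
enabled = parser.getboolean("openlineage", "selective_enable", fallback=False)

# ...while a missing option returns the fallback instead of raising
# configparser.NoOptionError.
missing = parser.getboolean("openlineage", "not_configured", fallback=False)
```

The `fallback=False` default is what makes the selective-enable feature opt-in: absent configuration means the old behavior.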
Re: [PR] Suppress error if no tests collected in pydantic special Tests [airflow]
Taragolis commented on code in PR #38470: URL: https://github.com/apache/airflow/pull/38470#discussion_r1538314421 ## dev/breeze/src/airflow_breeze/utils/run_tests.py: ## @@ -394,6 +394,10 @@ def generate_args_for_pytest( "--disable-warnings", ] ) +else: +# Avoid edge cases when there are no available tests, e.g. No-Pydantic for Weaviate provider. +# https://docs.pytest.org/en/stable/reference/exit-codes.html +args.append("--suppress-no-test-exit-code") Review Comment: Done
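The `--suppress-no-test-exit-code` flag added above comes from the `pytest-custom_exit_code` plugin: it turns pytest's "no tests collected" exit code (5, per the exit-codes page linked in the diff) into a success. The effect can be sketched without pytest itself; the function below is illustrative, not part of Breeze:

```python
NO_TESTS_COLLECTED = 5  # pytest.ExitCode.NO_TESTS_COLLECTED


def effective_exit_code(pytest_exit_code: int, suppress_no_tests: bool) -> int:
    # With suppression enabled, an empty collection (e.g. the Weaviate
    # provider under the no-Pydantic matrix) no longer fails the CI job.
    if suppress_no_tests and pytest_exit_code == NO_TESTS_COLLECTED:
        return 0
    return pytest_exit_code
```

Genuine failures (exit code 1) and other error codes pass through unchanged, so only the empty-collection edge case is masked.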
Re: [PR] Add executor field to the DB and parameter to the operators [airflow]
ferruzzi commented on PR #38054: URL: https://github.com/apache/airflow/pull/38054#issuecomment-2019030828 I'm sorry! Want me to revert the commit?
(airflow) branch suppress-no-tests-collected updated (9095a20839 -> ddb102ba5d)
taragolis pushed a change to branch suppress-no-tests-collected in repository https://gitbox.apache.org/repos/asf/airflow.git from 9095a20839 Suppress error if no tests collected add ddb102ba5d Add --suppress-no-test-exit-code in case of pydantic tests No new revisions were added by this update. Summary of changes: dev/breeze/src/airflow_breeze/commands/testing_commands.py | 5 + dev/breeze/src/airflow_breeze/utils/run_tests.py | 4 2 files changed, 5 insertions(+), 4 deletions(-)
Re: [PR] Add executor field to the DB and parameter to the operators [airflow]
potiuk commented on PR #38054: URL: https://github.com/apache/airflow/pull/38054#issuecomment-2019023197 Oops. We were supposed to merge it after we moved the branch :)
Re: [PR] Support multiple Docker hosts in `docker_url` attribute. [airflow]
oboki commented on PR #38466: URL: https://github.com/apache/airflow/pull/38466#issuecomment-2019021779 @Taragolis Thank you for the detailed review. 👍
Re: [PR] Support multiple Docker hosts in `docker_url` attribute. [airflow]
oboki commented on code in PR #38466: URL: https://github.com/apache/airflow/pull/38466#discussion_r1538300908 ## airflow/providers/docker/operators/docker.py: ## @@ -343,13 +345,21 @@ def hook(self) -> DockerHook: assert_hostname=self.tls_hostname, ssl_version=self.tls_ssl_version, ) -return DockerHook( -docker_conn_id=self.docker_conn_id, -base_url=self.docker_url, -version=self.api_version, -tls=tls_config, -timeout=self.timeout, -) +hook = None +for url in self.docker_url: +hook = DockerHook( +docker_conn_id=self.docker_conn_id, +base_url=url, +version=self.api_version, +tls=tls_config, +timeout=self.timeout +) +try: +hook.get_conn() +return hook +except Exception as e: +self.log.error(f"Failed to establish connection to Docker host {url}: {e}") +return hook Review Comment: Adding an error-raising step instead of returning a nullable hook causes the following two tests to fail: ```log == short test summary info === FAILED tests/providers/docker/operators/test_docker.py::test_on_kill_client_not_created - Exception: Failed to establish connection to Docker host. FAILED tests/providers/docker/operators/test_docker_swarm.py::TestDockerSwarmOperator::test_on_kill_client_not_created - Exception: Failed to establish connection to Docker host. = 2 failed, 133 passed, 2 warnings in 21.62s = ``` I'm not very good at dealing with mocks; could I get some additional support?
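The fallback loop in the diff tries each Docker host in turn and should raise once all of them fail (per the review feedback, an `AirflowException` rather than a bare `Exception`). A standalone sketch of that pattern follows, with a stub class standing in for `DockerHook`; the "unreachable" URL check simulates a connection failure and is purely illustrative:

```python
class AirflowException(Exception):
    """Stand-in for airflow.exceptions.AirflowException."""


class StubDockerHook:
    """Stand-in for DockerHook: connecting to an 'unreachable' URL raises."""

    def __init__(self, base_url: str):
        self.base_url = base_url

    def get_conn(self):
        if "unreachable" in self.base_url:
            raise ConnectionError(f"cannot reach {self.base_url}")
        return object()  # a real hook would return a docker APIClient


def first_reachable_hook(docker_urls):
    for url in docker_urls:
        hook = StubDockerHook(base_url=url)
        try:
            hook.get_conn()
            return hook  # first host that answers wins
        except Exception as exc:
            # Lazy %-style logging would be used in Airflow instead of print.
            print("Failed to establish connection to Docker host %s: %s" % (url, exc))
    raise AirflowException("Failed to establish connection to any Docker host.")
```

Raising a specific exception at the end (instead of returning a possibly unusable hook) is what makes the failing `test_on_kill_client_not_created` tests need updated mocks: they construct the operator without any reachable host.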
(airflow) 01/01: Fix image cache optimizations - speeding up the build (#38442)
This is an automated email from the ASF dual-hosted git repository. potiuk pushed a commit to branch fix-image-cache-2 in repository https://gitbox.apache.org/repos/asf/airflow.git commit 73e192344322947caf02da8c3ccf5550efaef4bd Author: Jarek Potiuk AuthorDate: Mon Mar 25 18:16:06 2024 +0100 Fix image cache optimizations - speeding up the build (#38442) The recent refactors in workflows broke the way how cache had been used in the CI builds. This PR brings back the optimizations by using the cache and rebuilding it. --- .github/workflows/additional-ci-image-checks.yml | 56 +- .github/workflows/build-images.yml | 3 +- .github/workflows/ci-image-build.yml | 17 +++--- .github/workflows/ci.yml | 2 + .github/workflows/finalize-tests.yml | 67 + .github/workflows/prod-image-build.yml | 9 ++- .github/workflows/prod-image-extra-checks.yml | 3 + .github/workflows/push-image-cache.yml | 68 ++ .../airflow_breeze/utils/docker_command_utils.py | 6 +- dev/breeze/src/airflow_breeze/utils/image.py | 14 + 10 files changed, 128 insertions(+), 117 deletions(-) diff --git a/.github/workflows/additional-ci-image-checks.yml b/.github/workflows/additional-ci-image-checks.yml index 84012fb24f..f73fe08ec7 100644 --- a/.github/workflows/additional-ci-image-checks.yml +++ b/.github/workflows/additional-ci-image-checks.yml @@ -89,42 +89,40 @@ jobs: # delay cache refresh. It does not attempt to upgrade to newer dependencies. # We only push CI cache as PROD cache usually does not gain as much from fresh cache because # it uses prepared airflow and provider packages that invalidate the cache anyway most of the time - # push-early-buildx-cache-to-github-registry: - # name: Push Early Image Cache - # uses: ./.github/workflows/push-image-cache.yml - # permissions: - # contents: read - # # This write is only given here for `push` events from "apache/airflow" repo. It is not given for PRs - # # from forks. This is to prevent malicious PRs from creating images in the "apache/airflow" repo. 
- # # For regular build for PRS this "build-prod-images" workflow will be skipped anyway by the - # # "in-workflow-build" condition - # packages: write - # secrets: inherit - # with: - # runs-on: ${{ inputs.runs-on }} - # cache-type: "Early" - # include-prod-images: "false" - # push-latest-images: "false" - # image-tag: ${{ inputs.image-tag }} - # python-versions: ${{ inputs.python-versions }} - # branch: ${{ inputs.branch }} - # use-uv: "true" - # include-success-outputs: ${{ inputs.include-success-outputs }} - # constraints-branch: ${{ inputs.constraints-branch }} - # docker-cache: ${{ inputs.docker-cache }} - # if: inputs.canary-run == 'true' && inputs.branch == 'main' + push-early-buildx-cache-to-github-registry: +name: Push Early Image Cache +uses: ./.github/workflows/push-image-cache.yml +permissions: + contents: read + # This write is only given here for `push` events from "apache/airflow" repo. It is not given for PRs + # from forks. This is to prevent malicious PRs from creating images in the "apache/airflow" repo. 
+ # For regular build for PRS this "build-prod-images" workflow will be skipped anyway by the + # "in-workflow-build" condition + packages: write +secrets: inherit +with: + # Runs on Public runners + cache-type: "Early" + include-prod-images: "false" + push-latest-images: "false" + platform: "linux/amd64" + python-versions: ${{ inputs.python-versions }} + branch: ${{ inputs.branch }} + constraints-branch: ${{ inputs.constraints-branch }} + use-uv: "true" + include-success-outputs: ${{ inputs.include-success-outputs }} + docker-cache: ${{ inputs.docker-cache }} +if: inputs.canary-run == 'true' && inputs.branch == 'main' # Check that after earlier cache push, breeze command will build quickly check-that-image-builds-quickly: -timeout-minutes: 5 +timeout-minutes: 11 name: Check that image builds quickly runs-on: ["ubuntu-22.04"] env: UPGRADE_TO_NEWER_DEPENDENCIES: false - PLATFORM: "linux/amd64" PYTHON_MAJOR_MINOR_VERSION: ${{ inputs.default-python-version }} PYTHON_VERSION: ${{ inputs.default-python-version }} - IMAGE_TAG: ${{ inputs.image-tag }} GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} if: inputs.canary-run == 'true' && inputs.branch == 'main' steps: @@ -142,7 +140,7 @@ jobs: - name: "Login to ghcr.io" run: echo "${{ env.GITHUB_TOKEN }}" | docker login ghcr.io -u ${{ github.actor }} --password-stdin - name: "Check that image builds quickly" -run: breeze shell --max-time 120 +run: breeze
(airflow) branch fix-image-cache-2 created (now 73e1923443)
potiuk pushed a change to branch fix-image-cache-2 in repository https://gitbox.apache.org/repos/asf/airflow.git at 73e1923443 Fix image cache optimizations - speeding up the build (#38442) This branch includes the following new commits: new 73e1923443 Fix image cache optimizations - speeding up the build (#38442) The 1 revision listed above as "new" is entirely new to this repository and will be described in separate emails. The revisions listed as "add" were already present in the repository and have only been added to this reference.
[PR] Add documentation for Apache Airflow 2.8.4 [airflow-site]
jedcunningham opened a new pull request, #988: URL: https://github.com/apache/airflow-site/pull/988 (no comment)