svn commit: r65275 - /release/airflow/clients/python/2.7.3/

2023-11-14 Thread ephraimanierobi
Author: ephraimanierobi
Date: Tue Nov 14 08:46:50 2023
New Revision: 65275

Log:
Release Apache Airflow Python Client 2.7.3 from 2.7.3rc1

Added:
release/airflow/clients/python/2.7.3/
release/airflow/clients/python/2.7.3/apache-airflow-client-2.7.3-bin.tar.gz
  - copied unchanged from r65274, 
dev/airflow/clients/python/2.7.3rc1/apache-airflow-client-2.7.3-bin.tar.gz

release/airflow/clients/python/2.7.3/apache-airflow-client-2.7.3-bin.tar.gz.asc
  - copied unchanged from r65274, 
dev/airflow/clients/python/2.7.3rc1/apache-airflow-client-2.7.3-bin.tar.gz.asc

release/airflow/clients/python/2.7.3/apache-airflow-client-2.7.3-bin.tar.gz.sha512
  - copied unchanged from r65274, 
dev/airflow/clients/python/2.7.3rc1/apache-airflow-client-2.7.3-bin.tar.gz.sha512

release/airflow/clients/python/2.7.3/apache-airflow-client-2.7.3-source.tar.gz
  - copied unchanged from r65274, 
dev/airflow/clients/python/2.7.3rc1/apache-airflow-client-2.7.3-source.tar.gz

release/airflow/clients/python/2.7.3/apache-airflow-client-2.7.3-source.tar.gz.asc
  - copied unchanged from r65274, 
dev/airflow/clients/python/2.7.3rc1/apache-airflow-client-2.7.3-source.tar.gz.asc

release/airflow/clients/python/2.7.3/apache-airflow-client-2.7.3-source.tar.gz.sha512
  - copied unchanged from r65274, 
dev/airflow/clients/python/2.7.3rc1/apache-airflow-client-2.7.3-source.tar.gz.sha512

release/airflow/clients/python/2.7.3/apache_airflow_client-2.7.3-py3-none-any.whl
  - copied unchanged from r65274, 
dev/airflow/clients/python/2.7.3rc1/apache_airflow_client-2.7.3-py3-none-any.whl

release/airflow/clients/python/2.7.3/apache_airflow_client-2.7.3-py3-none-any.whl.asc
  - copied unchanged from r65274, 
dev/airflow/clients/python/2.7.3rc1/apache_airflow_client-2.7.3-py3-none-any.whl.asc

release/airflow/clients/python/2.7.3/apache_airflow_client-2.7.3-py3-none-any.whl.sha512
  - copied unchanged from r65274, 
dev/airflow/clients/python/2.7.3rc1/apache_airflow_client-2.7.3-py3-none-any.whl.sha512
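
For anyone consuming the release, a minimal usage sketch of the published client (assumes the PyPI package name ``apache-airflow-client`` and its generated ``airflow_client.client`` module; host and credentials are illustrative):

```python
# Hypothetical quick-start for the released client; install first with:
#   pip install apache-airflow-client==2.7.3
import airflow_client.client
from airflow_client.client.api import dag_api

# Point the client at an Airflow REST API; credentials are placeholders.
configuration = airflow_client.client.Configuration(
    host="http://localhost:8080/api/v1",
    username="admin",
    password="admin",
)

with airflow_client.client.ApiClient(configuration) as api_client:
    # List the DAGs known to the target Airflow deployment.
    dags = dag_api.DAGApi(api_client).get_dags()
    print(dags)
```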



(airflow-client-python) tag 2.7.3 created (now 3e7248a)

2023-11-14 Thread ephraimanierobi
This is an automated email from the ASF dual-hosted git repository.

ephraimanierobi pushed a change to tag 2.7.3
in repository https://gitbox.apache.org/repos/asf/airflow-client-python.git


  at 3e7248a  (commit)
No new revisions were added by this update.



(airflow) branch main updated: Add `python_kubernetes_script.jinja2` to package_data (#35626)

2023-11-14 Thread ephraimanierobi
This is an automated email from the ASF dual-hosted git repository.

ephraimanierobi pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/airflow.git


The following commit(s) were added to refs/heads/main by this push:
 new 9ae57d023b Add `python_kubernetes_script.jinja2` to package_data (#35626)
9ae57d023b is described below

commit 9ae57d023b84907c6c6ec62a7d43f2d41cb2ebca
Author: Ephraim Anierobi 
AuthorDate: Tue Nov 14 21:10:57 2023 +0100

Add `python_kubernetes_script.jinja2` to package_data (#35626)

When providers are installed from sources, the `python_kubernetes_script.jinja2`
is missing in the kubernetes provider package.

This commit adds the file to package_data so it's available in the kubernetes
provider package when installed from sources.
---
 setup.py | 2 ++
 1 file changed, 2 insertions(+)

diff --git a/setup.py b/setup.py
index 8a3abf52ec..9b58f3c49b 100644
--- a/setup.py
+++ b/setup.py
@@ -895,6 +895,8 @@ class AirflowDistribution(Distribution):
 provider_yaml_file, str(AIRFLOW_SOURCES_ROOT / "airflow")
 )
 self.package_data["airflow"].append(provider_relative_path)
+# Add python_kubernetes_script.jinja2 to package data
+self.package_data["airflow"].append("providers/cncf/kubernetes/python_kubernetes_script.jinja2")
 else:
 self.install_requires.extend(
 [
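
For context, ``package_data`` is the setuptools mechanism the commit relies on to ship non-Python files inside the distribution; a minimal, hypothetical sketch (illustrative names, not Airflow's actual setup.py):

```python
# Hypothetical sketch of the setuptools package_data mechanism.
from setuptools import find_packages, setup

setup(
    name="example-dist",
    version="0.1.0",
    packages=find_packages(),
    package_data={
        # Glob patterns resolve relative to the named package, so the
        # template file ends up inside the built wheel/sdist.
        "example_pkg": ["templates/*.jinja2"],
    },
)
```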



(airflow) branch main updated: Revert "Fix pre-mature evaluation of tasks in mapped task group (#34337)" (#35651)

2023-11-15 Thread ephraimanierobi
This is an automated email from the ASF dual-hosted git repository.

ephraimanierobi pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/airflow.git


The following commit(s) were added to refs/heads/main by this push:
 new 95bf5dd620 Revert "Fix pre-mature evaluation of tasks in mapped task group (#34337)" (#35651)
95bf5dd620 is described below

commit 95bf5dd620ec996dda834ba13048a77314d6d915
Author: Ephraim Anierobi 
AuthorDate: Wed Nov 15 12:53:02 2023 +0100

Revert "Fix pre-mature evaluation of tasks in mapped task group (#34337)" 
(#35651)

This reverts commit 69938fd163045d750b8c218500d79bc89858f9c1.
---
 airflow/ti_deps/deps/trigger_rule_dep.py| 18 ---
 tests/models/test_mappedoperator.py |  4 +--
 tests/ti_deps/deps/test_trigger_rule_dep.py | 47 -
 3 files changed, 8 insertions(+), 61 deletions(-)

diff --git a/airflow/ti_deps/deps/trigger_rule_dep.py b/airflow/ti_deps/deps/trigger_rule_dep.py
index 6203b2a79b..ca2a6100a2 100644
--- a/airflow/ti_deps/deps/trigger_rule_dep.py
+++ b/airflow/ti_deps/deps/trigger_rule_dep.py
@@ -27,7 +27,6 @@ from sqlalchemy import and_, func, or_, select
 from airflow.models.taskinstance import PAST_DEPENDS_MET
 from airflow.ti_deps.deps.base_ti_dep import BaseTIDep
 from airflow.utils.state import TaskInstanceState
-from airflow.utils.task_group import MappedTaskGroup
 from airflow.utils.trigger_rule import TriggerRule as TR
 
 if TYPE_CHECKING:
@@ -133,20 +132,6 @@ class TriggerRuleDep(BaseTIDep):
 """
 return ti.task.get_mapped_ti_count(ti.run_id, session=session)
 
-def _iter_expansion_dependencies() -> Iterator[str]:
-from airflow.models.mappedoperator import MappedOperator
-
-if isinstance(ti.task, MappedOperator):
-for op in ti.task.iter_mapped_dependencies():
-yield op.task_id
-task_group = ti.task.task_group
-if task_group and task_group.iter_mapped_task_groups():
-yield from (
-op.task_id
-for tg in task_group.iter_mapped_task_groups()
-for op in tg.iter_mapped_dependencies()
-)
-
 @functools.lru_cache
 def _get_relevant_upstream_map_indexes(upstream_id: str) -> int | range | None:
 """Get the given task's map indexes relevant to the current ti.
@@ -157,9 +142,6 @@ class TriggerRuleDep(BaseTIDep):
 """
 if TYPE_CHECKING:
 assert isinstance(ti.task.dag, DAG)
-if isinstance(ti.task.task_group, MappedTaskGroup):
-if upstream_id not in set(_iter_expansion_dependencies()):
-return None
 try:
 expanded_ti_count = _get_expanded_ti_count()
 except (NotFullyPopulated, NotMapped):
diff --git a/tests/models/test_mappedoperator.py b/tests/models/test_mappedoperator.py
index 5c2e23c1f9..7244c55774 100644
--- a/tests/models/test_mappedoperator.py
+++ b/tests/models/test_mappedoperator.py
@@ -1305,8 +1305,8 @@ class TestMappedSetupTeardown:
 states = self.get_states(dr)
 expected = {
 "file_transforms.my_setup": {0: "success", 1: "failed", 2: 
"skipped"},
-"file_transforms.my_work": {2: "upstream_failed", 1: 
"upstream_failed", 0: "upstream_failed"},
-"file_transforms.my_teardown": {2: "success", 1: "success", 0: 
"success"},
+"file_transforms.my_work": {0: "success", 1: "upstream_failed", 2: 
"skipped"},
+"file_transforms.my_teardown": {0: "success", 1: 
"upstream_failed", 2: "skipped"},
 }
 
 assert states == expected
diff --git a/tests/ti_deps/deps/test_trigger_rule_dep.py b/tests/ti_deps/deps/test_trigger_rule_dep.py
index 1bc8808cb8..00cbcd449a 100644
--- a/tests/ti_deps/deps/test_trigger_rule_dep.py
+++ b/tests/ti_deps/deps/test_trigger_rule_dep.py
@@ -1165,23 +1165,19 @@ def test_upstream_in_mapped_group_triggers_only_relevant(dag_maker, session):
 tis = _one_scheduling_decision_iteration()
 assert sorted(tis) == [("tg.t1", 0), ("tg.t1", 1), ("tg.t1", 2)]
 
-# After running the first t1, the remaining t1 must be run before t2 is available.
+# After running the first t1, the first t2 becomes immediately available.
 tis["tg.t1", 0].run()
 tis = _one_scheduling_decision_iteration()
-assert sorted(tis) == [("tg.t1", 1), ("tg.t1", 2)]
+assert sorted(tis) == [("tg.t1", 1), ("tg.t1", 2), ("tg.t2", 0)]
 
-# After running all 
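
For readers without the original PRs at hand, a hypothetical sketch of the DAG shape these tests exercise — a task group expanded over several inputs, where each map index of ``t2`` consumes the matching ``t1`` output (assumes Airflow 2.5+ dynamic task mapping):

```python
from datetime import datetime

from airflow import DAG
from airflow.decorators import task, task_group

with DAG(dag_id="mapped_tg_sketch", start_date=datetime(2023, 1, 1), schedule=None):

    @task
    def t1(x: int) -> int:
        return x * 2

    @task
    def t2(y: int) -> int:
        return y + 1

    @task_group
    def tg(x: int):
        # Within one map index, t2 depends only on that index's t1.
        t2(t1(x))

    # Expanding the group over three inputs yields three (t1, t2) pairs.
    tg.expand(x=[1, 2, 3])
```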

(airflow) branch switch-building-airflow-packages-to-generic-image deleted (was 1947165883)

2023-11-20 Thread ephraimanierobi
This is an automated email from the ASF dual-hosted git repository.

ephraimanierobi pushed a change to branch switch-building-airflow-packages-to-generic-image
in repository https://gitbox.apache.org/repos/asf/airflow.git


 was 1947165883 Switch building airflow packages to generic images instead of CI image

The revisions that were on this branch are still contained in
other references; therefore, this change does not discard any commits
from the repository.



(airflow) branch v2-8-test created (now 6fc4a9cb3c)

2023-11-20 Thread ephraimanierobi
This is an automated email from the ASF dual-hosted git repository.

ephraimanierobi pushed a change to branch v2-8-test
in repository https://gitbox.apache.org/repos/asf/airflow.git


  at 6fc4a9cb3c Update default branches for 2-8

This branch includes the following new commits:

 new 6fc4a9cb3c Update default branches for 2-8

The 1 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.




(airflow) 01/01: Update default branches for 2-8

2023-11-20 Thread ephraimanierobi
This is an automated email from the ASF dual-hosted git repository.

ephraimanierobi pushed a commit to branch v2-8-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 6fc4a9cb3c8923c4455039756bfbbd636fbc7883
Author: Ephraim Anierobi 
AuthorDate: Mon Nov 20 13:31:03 2023 +0100

Update default branches for 2-8
---
 dev/breeze/src/airflow_breeze/branch_defaults.py  | 4 ++--
 images/breeze/output_ci-image_build.txt   | 2 +-
 images/breeze/output_prod-image_build.txt | 2 +-
 images/breeze/output_release-management_install-provider-packages.txt | 2 +-
 images/breeze/output_release-management_verify-provider-packages.txt  | 2 +-
 images/breeze/output_shell.txt| 2 +-
 images/breeze/output_start-airflow.txt| 2 +-
 7 files changed, 8 insertions(+), 8 deletions(-)

diff --git a/dev/breeze/src/airflow_breeze/branch_defaults.py b/dev/breeze/src/airflow_breeze/branch_defaults.py
index c9dbaa8080..755c8cf835 100644
--- a/dev/breeze/src/airflow_breeze/branch_defaults.py
+++ b/dev/breeze/src/airflow_breeze/branch_defaults.py
@@ -37,6 +37,6 @@ Examples:
 """
 from __future__ import annotations
 
-AIRFLOW_BRANCH = "main"
-DEFAULT_AIRFLOW_CONSTRAINTS_BRANCH = "constraints-main"
+AIRFLOW_BRANCH = "v2-8-test"
+DEFAULT_AIRFLOW_CONSTRAINTS_BRANCH = "constraints-2-8"
 DEBIAN_VERSION = "bookworm"
diff --git a/images/breeze/output_ci-image_build.txt b/images/breeze/output_ci-image_build.txt
index ab2c24a011..71765b3d76 100644
--- a/images/breeze/output_ci-image_build.txt
+++ b/images/breeze/output_ci-image_build.txt
@@ -1 +1 @@
-26e91cab59fc5836def7404484c619a4
+5a1b532b087c0feb24ffdc9da3ec4e81
diff --git a/images/breeze/output_prod-image_build.txt b/images/breeze/output_prod-image_build.txt
index 456dc95358..e511dfeeb4 100644
--- a/images/breeze/output_prod-image_build.txt
+++ b/images/breeze/output_prod-image_build.txt
@@ -1 +1 @@
-1628f7bff3e7e369f0358a646682e674
+bdbf7fe98a65e3384825cc99c9ea03e9
diff --git a/images/breeze/output_release-management_install-provider-packages.txt b/images/breeze/output_release-management_install-provider-packages.txt
index 47f58a6341..b36dc86e24 100644
--- a/images/breeze/output_release-management_install-provider-packages.txt
+++ b/images/breeze/output_release-management_install-provider-packages.txt
@@ -1 +1 @@
-34c38aca17d23dbb454fe7a6bfd8e630
+05ff214ada04958a95f2aedc1953079e
diff --git a/images/breeze/output_release-management_verify-provider-packages.txt b/images/breeze/output_release-management_verify-provider-packages.txt
index 88ef90c79e..20070bef37 100644
--- a/images/breeze/output_release-management_verify-provider-packages.txt
+++ b/images/breeze/output_release-management_verify-provider-packages.txt
@@ -1 +1 @@
-13083dc08dc69b40015b61f8be607918
+f7fe0f6356904e7c8ca2a3bdbe841f6e
diff --git a/images/breeze/output_shell.txt b/images/breeze/output_shell.txt
index 7ae7c30c84..0eb563b346 100644
--- a/images/breeze/output_shell.txt
+++ b/images/breeze/output_shell.txt
@@ -1 +1 @@
-64d63199a5c877a0bf9e1da29de02b67
+6c2dd7e3326135c4dd8a781e305a4cd4
diff --git a/images/breeze/output_start-airflow.txt b/images/breeze/output_start-airflow.txt
index a6792d243a..6bb4853d68 100644
--- a/images/breeze/output_start-airflow.txt
+++ b/images/breeze/output_start-airflow.txt
@@ -1 +1 @@
-313d97eb6459fe153d4e1cbaa1082d46
+2ee0824b571f94c942f78aa692954e79



(airflow) branch v2-8-stable created (now 6fc4a9cb3c)

2023-11-20 Thread ephraimanierobi
This is an automated email from the ASF dual-hosted git repository.

ephraimanierobi pushed a change to branch v2-8-stable
in repository https://gitbox.apache.org/repos/asf/airflow.git


  at 6fc4a9cb3c Update default branches for 2-8

No new revisions were added by this update.



(airflow) branch constraints-2-8 created (now c7abfb4fb1)

2023-11-20 Thread ephraimanierobi
This is an automated email from the ASF dual-hosted git repository.

ephraimanierobi pushed a change to branch constraints-2-8
in repository https://gitbox.apache.org/repos/asf/airflow.git


  at c7abfb4fb1 Updating constraints. Github run id:4693629049

No new revisions were added by this update.



(airflow) branch main updated: Add v2-8 branches to codecov.yml and .asf.yaml (#35750)

2023-11-20 Thread ephraimanierobi
This is an automated email from the ASF dual-hosted git repository.

ephraimanierobi pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/airflow.git


The following commit(s) were added to refs/heads/main by this push:
 new c07c5925e9 Add v2-8 branches to codecov.yml and .asf.yaml (#35750)
c07c5925e9 is described below

commit c07c5925e93ba0a8f37f20c7deb814d0f6925705
Author: Ephraim Anierobi 
AuthorDate: Mon Nov 20 15:10:37 2023 +0100

Add v2-8 branches to codecov.yml and .asf.yaml (#35750)
---
 .asf.yaml   | 3 +++
 codecov.yml | 2 ++
 2 files changed, 5 insertions(+)

diff --git a/.asf.yaml b/.asf.yaml
index 6166687a5d..094299d2c4 100644
--- a/.asf.yaml
+++ b/.asf.yaml
@@ -71,6 +71,9 @@ github:
     v2-7-stable:
       required_pull_request_reviews:
         required_approving_review_count: 1
+    v2-8-stable:
+      required_pull_request_reviews:
+        required_approving_review_count: 1
 
   collaborators:
 - mhenc
diff --git a/codecov.yml b/codecov.yml
index 67ea777302..d1ed5fb446 100644
--- a/codecov.yml
+++ b/codecov.yml
@@ -55,6 +55,8 @@ coverage:
   - v2-6-test
   - v2-7-stable
   - v2-7-test
+  - v2-8-stable
+  - v2-8-test
 if_not_found: success
 if_ci_failed: error
 informational: true



(airflow) branch main updated (c07c5925e9 -> 0d1c8de78c)

2023-11-20 Thread ephraimanierobi
This is an automated email from the ASF dual-hosted git repository.

ephraimanierobi pushed a change to branch main
in repository https://gitbox.apache.org/repos/asf/airflow.git


from c07c5925e9 Add v2-8 branches to codecov.yml and .asf.yaml (#35750)
 add 0d1c8de78c Update minor release command (#35751)

No new revisions were added by this update.

Summary of changes:
 dev/breeze/src/airflow_breeze/commands/minor_release_command.py | 1 +
 1 file changed, 1 insertion(+)



(airflow) 02/02: Update RELEASE_NOTES.rst

2023-11-20 Thread ephraimanierobi
This is an automated email from the ASF dual-hosted git repository.

ephraimanierobi pushed a commit to branch v2-8-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 78c5029ce1d8d9e119164d1b6cd9a5c92636a03f
Author: Ephraim Anierobi 
AuthorDate: Mon Nov 20 22:51:30 2023 +0100

Update RELEASE_NOTES.rst
---
 RELEASE_NOTES.rst   | 193 +++-
 newsfragments/35460.significant.rst |  10 --
 2 files changed, 192 insertions(+), 11 deletions(-)

diff --git a/RELEASE_NOTES.rst b/RELEASE_NOTES.rst
index 62183c8b58..c6c0d1f0e1 100644
--- a/RELEASE_NOTES.rst
+++ b/RELEASE_NOTES.rst
@@ -21,7 +21,198 @@
 
 .. towncrier release notes start
 
-Airflow 2.7.3 (2023-11-04)
+Airflow 2.8.0 (2023-12-14)
+--------------------------
+
+Significant Changes
+^^^^^^^^^^^^^^^^^^^
+
+- Raw HTML code in DAG docs and DAG params descriptions is disabled by default
+
+  To ensure that no malicious javascript can be injected with DAG descriptions or trigger UI forms by DAG authors,
+  a new parameter ``webserver.allow_raw_html_descriptions`` was added with a default value of ``False``.
+  If you trust your DAG authors' code and want to allow using raw HTML in DAG descriptions and params, you can restore the previous
+  behavior by setting the configuration value to ``True``.
+
+  To ensure Airflow is secure by default, the raw HTML support in the trigger UI has been superseded by markdown support via
+  the ``description_md`` attribute. If you have been using ``description_html``, please migrate to ``description_md``.
+  The ``custom_html_form`` is now deprecated. (#35460)
+
+New Features
+""""""""""""
+- AIP-58: Add Airflow ObjectStore (AFS) (#34729)
+- Add "literal" wrapper to disable field templating (#35017)
+- Add task context logging feature to allow forwarding messages to task logs (#32646)
+- Add Listener hooks for Datasets (#34418)
+- Allow override of navbar text color (#35505)
+- Add lightweight serialization for deltalake tables (#35462)
+- Add support for serialization of iceberg tables (#35456)
+- ``prev_end_date_success`` method access (#34528)
+- Add task parameter to set custom logger name (#34964)
+- Add pyspark decorator (#35247)
+- Add trigger as a valid option for the db clean command (#34908)
+- Add decorators for external and venv python branching operators (#35043)
+- Allow PythonVenvOperator using other index url (#33017)
+- Add Python Virtualenv Operator Caching (#33355)
+- Introduce a generic export for containerized executor logging (#34903)
+- Add ability to clear downstream tis in ``List Task Instances`` view  (#34529)
+- Attribute ``clear_number`` to track DAG run being cleared (#34126)
+- Add BranchPythonVirtualenvOperator (#33356)
+- Allow PythonVenvOperator using other index url (#33017)
+- Add CLI notification commands to providers (#33116)
+
+Improvements
+""""""""""""
+- Move external logs links to top of react logs page (#35668)
+- Change terminal mode to cbreak in ``execute_interactive`` and handle ``SIGINT`` (#35602)
+- Make raw HTML descriptions configurable (#35460)
+- Allow email field to be templated (#35546)
+- Hide logical date and run id in trigger UI form (#35284)
+- Improved instructions for adding dependencies in TaskFlow (#35406)
+- Add optional exit code to list import errors (#35378)
+- Limit query result on DB rather than client in ``synchronize_log_template`` function (#35366)
+- Feature: Allow description to be passed in when using variables CLI (#34791)
+- Allow optional defaults in required fields with manual triggered dags (#31301)
+- Permitting airflow kerberos to run in different modes (#35146)
+- Refactor commands to unify daemon context handling (#34945)
+- Add extra fields to plugins endpoint (#34913)
+- Add description to pools view (#34862)
+- Move cli's Connection export and Variable export command print logic to a separate function (#34647)
+- Extract and reuse get_kerberos_principle func from get_kerberos_principle (#34936)
+- Change type annotation for ``BaseOperatorLink.operators`` (#35003)
+- Optimise and migrate to ``SA2-compatible`` syntax for TaskReschedule (#33720)
+- Consolidate the permissions name in SlaMissModelView (#34949)
+- Add debug log saying what's being run to ``EventScheduler`` (#34808)
+- Increase log reader stream loop sleep duration to 1 second (#34789)
+- Resolve pydantic deprecation warnings re ``update_forward_refs`` (#34657)
+- Unify mapped task group lookup logic (#34637)
+- Allow filtering event logs by attributes (#34417)
+- Make connection login and password TEXT (#32815)
+- Ban import ``Dataset`` from ``airflow`` package in codebase (#34610)
+- Use ``airflow.datasets.Dataset`` in examples and tests (#34605)
+- Enhance task status visibility (#34486)
+- Simplify DAG trigger UI (#34567)
+- Ban import AirflowException fr
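
Connecting the ``description_md`` migration from the Significant Changes note above to code, a hypothetical sketch of a DAG param whose trigger-form description uses markdown instead of raw HTML (the extra ``Param`` keys and the ``description_md`` attribute are taken from the release note, not a verified API):

```python
# Hypothetical sketch of migrating description_html -> description_md.
from airflow.models.param import Param

environment = Param(
    "dev",
    type="string",
    enum=["dev", "staging", "prod"],
    # Before (deprecated): description_html="<b>Target environment</b>"
    description_md="**Target environment** for this run",
)
```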

(airflow) branch v2-8-test updated (6fc4a9cb3c -> 78c5029ce1)

2023-11-20 Thread ephraimanierobi
This is an automated email from the ASF dual-hosted git repository.

ephraimanierobi pushed a change to branch v2-8-test
in repository https://gitbox.apache.org/repos/asf/airflow.git


from 6fc4a9cb3c Update default branches for 2-8
 new 0137ae90be Update version to 2.8.0
 new 78c5029ce1 Update RELEASE_NOTES.rst

The 2 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.


Summary of changes:
 README.md  |  16 +-
 RELEASE_NOTES.rst  | 193 -
 airflow/__init__.py|   2 +-
 airflow/api_connexion/openapi/v1.yaml  |   2 +-
 .../logging-monitoring/logging-tasks.rst   |   2 +-
 .../installation/supported-versions.rst|   2 +-
 docs/docker-stack/README.md|  10 +-
 .../customizing/own-requirements.sh|   2 +-
 .../extending/add-airflow-configuration/Dockerfile |   2 +-
 .../extending/add-apt-packages/Dockerfile  |   2 +-
 .../add-build-essential-extend/Dockerfile  |   2 +-
 .../extending/add-providers/Dockerfile |   2 +-
 .../extending/add-pypi-packages/Dockerfile |   2 +-
 .../extending/add-requirement-packages/Dockerfile  |   2 +-
 .../extending/custom-providers/Dockerfile  |   2 +-
 .../extending/embedding-dags/Dockerfile|   2 +-
 .../extending/writable-directory/Dockerfile|   2 +-
 docs/docker-stack/entrypoint.rst   |  18 +-
 generated/PYPI_README.md   |  14 +-
 newsfragments/35460.significant.rst|  10 --
 .../ci/pre_commit/pre_commit_supported_versions.py |   2 +-
 21 files changed, 236 insertions(+), 55 deletions(-)
 delete mode 100644 newsfragments/35460.significant.rst



(airflow) 01/02: Update version to 2.8.0

2023-11-20 Thread ephraimanierobi
This is an automated email from the ASF dual-hosted git repository.

ephraimanierobi pushed a commit to branch v2-8-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 0137ae90be060457b76bf6c786ab91e2056be155
Author: Ephraim Anierobi 
AuthorDate: Mon Nov 20 22:48:45 2023 +0100

Update version to 2.8.0
---
 README.md  | 16 
 airflow/__init__.py|  2 +-
 airflow/api_connexion/openapi/v1.yaml  |  2 +-
 .../logging-monitoring/logging-tasks.rst   |  2 +-
 .../apache-airflow/installation/supported-versions.rst |  2 +-
 docs/docker-stack/README.md| 10 +-
 .../docker-examples/customizing/own-requirements.sh|  2 +-
 .../extending/add-airflow-configuration/Dockerfile |  2 +-
 .../extending/add-apt-packages/Dockerfile  |  2 +-
 .../extending/add-build-essential-extend/Dockerfile|  2 +-
 .../docker-examples/extending/add-providers/Dockerfile |  2 +-
 .../extending/add-pypi-packages/Dockerfile |  2 +-
 .../extending/add-requirement-packages/Dockerfile  |  2 +-
 .../extending/custom-providers/Dockerfile  |  2 +-
 .../extending/embedding-dags/Dockerfile|  2 +-
 .../extending/writable-directory/Dockerfile|  2 +-
 docs/docker-stack/entrypoint.rst   | 18 +-
 generated/PYPI_README.md   | 14 +++---
 scripts/ci/pre_commit/pre_commit_supported_versions.py |  2 +-
 19 files changed, 44 insertions(+), 44 deletions(-)

diff --git a/README.md b/README.md
index 294e52d81a..f2364bee42 100644
--- a/README.md
+++ b/README.md
@@ -90,13 +90,13 @@ Airflow is not a streaming solution, but it is often used to process real-time d
 
 Apache Airflow is tested with:
 
-| | Main version (dev) | Stable version (2.7.3)   |
+| | Main version (dev) | Stable version (2.8.0)   |
|-------------|------------------------|------------------------------|
 | Python  | 3.8, 3.9, 3.10, 3.11   | 3.8, 3.9, 3.10, 3.11 |
 | Platform| AMD64/ARM64(\*)| AMD64/ARM64(\*)  |
 | Kubernetes  | 1.25, 1.26, 1.27, 1.28 | 1.24, 1.25, 1.26, 1.27, 1.28 |
-| PostgreSQL  | 11, 12, 13, 14, 15, 16 | 11, 12, 13, 14, 15   |
-| MySQL   | 8.0, Innovation| 5.7, 8.0 |
+| PostgreSQL  | 11, 12, 13, 14, 15, 16 | 11, 12, 13, 14, 15, 16   |
+| MySQL   | 8.0, Innovation| 8.0, Innovation  |
 | SQLite  | 3.15.0+| 3.15.0+  |
 | MSSQL   | 2017(\*\*), 2019(\*\*) | 2017(\*\*), 2019(\*\*)   |
 
@@ -175,15 +175,15 @@ them to the appropriate format and workflow that your tool requires.
 
 
 ```bash
-pip install 'apache-airflow==2.7.3' \
-  --constraint "https://raw.githubusercontent.com/apache/airflow/constraints-2.7.3/constraints-3.8.txt"
+pip install 'apache-airflow==2.8.0' \
+  --constraint "https://raw.githubusercontent.com/apache/airflow/constraints-2.8.0/constraints-3.8.txt"
 ```
 
 2. Installing with extras (i.e., postgres, google)
 
 ```bash
-pip install 'apache-airflow[postgres,google]==2.7.3' \
-  --constraint "https://raw.githubusercontent.com/apache/airflow/constraints-2.7.3/constraints-3.8.txt"
+pip install 'apache-airflow[postgres,google]==2.8.0' \
+  --constraint "https://raw.githubusercontent.com/apache/airflow/constraints-2.8.0/constraints-3.8.txt"
 ```
 
 For information on installing provider packages, check
@@ -295,7 +295,7 @@ Apache Airflow version life cycle:
 
 | Version   | Current Patch/Minor   | State     | First Release   | Limited Support   | EOL/Terminated   |
 |-----------|-----------------------|-----------|-----------------|-------------------|------------------|
-| 2         | 2.7.3                 | Supported | Dec 17, 2020    | TBD               | TBD              |
+| 2         | 2.8.0                 | Supported | Dec 17, 2020    | TBD               | TBD              |
 | 1.10      | 1.10.15               | EOL       | Aug 27, 2018    | Dec 17, 2020      | June 17, 2021    |
 | 1.9       | 1.9.0                 | EOL       | Jan 03, 2018    | Aug 27, 2018      | Aug 27, 2018     |
 | 1.8       | 1.8.2                 | EOL       | Mar 19, 2017    | Jan 03, 2018      | Jan 03, 2018     |
diff --git a/airflow/__init__.py b/airflow/__init__.py
index b63ff1dc05..59117d2950 100644
--- a/airflow/__init__.py
+++ b/airflow/__init__.py
@@ -26,7 +26,7 @@ isort:skip_file
 """
 from __future__ import annotations
 
-__version__ = "2.8.0.dev0"
+__version__ = "2.8.0"
 
 # flake8: noqa: F401
 
diff --git a/airflow/api_connexion/openapi/v1.yaml b/airflow/api_connexion/openapi/v1.yaml
index e4ae9c776f..a2f1c938ad 100644
--- a/ai

(airflow) 01/02: Update version to 2.8.0

2023-11-21 Thread ephraimanierobi
This is an automated email from the ASF dual-hosted git repository.

ephraimanierobi pushed a commit to branch v2-8-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 729c1df9ebac0d697875c8878ebe66a137759010
Author: Ephraim Anierobi 
AuthorDate: Mon Nov 20 22:48:45 2023 +0100

Update version to 2.8.0
---
 README.md  | 18 +-
 airflow/__init__.py|  2 +-
 airflow/api_connexion/openapi/v1.yaml  |  2 +-
 .../logging-monitoring/logging-tasks.rst   |  2 +-
 .../apache-airflow/installation/supported-versions.rst |  2 +-
 docs/docker-stack/README.md| 10 +-
 .../docker-examples/customizing/own-requirements.sh|  2 +-
 .../extending/add-airflow-configuration/Dockerfile |  2 +-
 .../extending/add-apt-packages/Dockerfile  |  2 +-
 .../extending/add-build-essential-extend/Dockerfile|  2 +-
 .../docker-examples/extending/add-providers/Dockerfile |  2 +-
 .../extending/add-pypi-packages/Dockerfile |  2 +-
 .../extending/add-requirement-packages/Dockerfile  |  2 +-
 .../extending/custom-providers/Dockerfile  |  2 +-
 .../extending/embedding-dags/Dockerfile|  2 +-
 .../extending/writable-directory/Dockerfile|  2 +-
 docs/docker-stack/entrypoint.rst   | 18 +-
 generated/PYPI_README.md   | 16 
 scripts/ci/pre_commit/pre_commit_supported_versions.py |  2 +-
 19 files changed, 46 insertions(+), 46 deletions(-)

diff --git a/README.md b/README.md
index 294e52d81a..378e792766 100644
--- a/README.md
+++ b/README.md
@@ -90,13 +90,13 @@ Airflow is not a streaming solution, but it is often used to process real-time d
 
 Apache Airflow is tested with:
 
-| | Main version (dev) | Stable version (2.7.3)   |
+| | Main version (dev) | Stable version (2.8.0)   |
|-------------|------------------------|------------------------------|
 | Python  | 3.8, 3.9, 3.10, 3.11   | 3.8, 3.9, 3.10, 3.11 |
 | Platform| AMD64/ARM64(\*)| AMD64/ARM64(\*)  |
-| Kubernetes  | 1.25, 1.26, 1.27, 1.28 | 1.24, 1.25, 1.26, 1.27, 1.28 |
-| PostgreSQL  | 11, 12, 13, 14, 15, 16 | 11, 12, 13, 14, 15   |
-| MySQL   | 8.0, Innovation| 5.7, 8.0 |
+| Kubernetes  | 1.25, 1.26, 1.27, 1.28 | 1.25, 1.26, 1.27, 1.28   |
+| PostgreSQL  | 11, 12, 13, 14, 15, 16 | 11, 12, 13, 14, 15, 16   |
+| MySQL   | 8.0, Innovation| 8.0, Innovation  |
 | SQLite  | 3.15.0+| 3.15.0+  |
 | MSSQL   | 2017(\*\*), 2019(\*\*) | 2017(\*\*), 2019(\*\*)   |
 
@@ -175,15 +175,15 @@ them to the appropriate format and workflow that your tool requires.
 
 
 ```bash
-pip install 'apache-airflow==2.7.3' \
-  --constraint "https://raw.githubusercontent.com/apache/airflow/constraints-2.7.3/constraints-3.8.txt"
+pip install 'apache-airflow==2.8.0' \
+  --constraint "https://raw.githubusercontent.com/apache/airflow/constraints-2.8.0/constraints-3.8.txt"
 ```
 
 2. Installing with extras (i.e., postgres, google)
 
 ```bash
-pip install 'apache-airflow[postgres,google]==2.7.3' \
-  --constraint "https://raw.githubusercontent.com/apache/airflow/constraints-2.7.3/constraints-3.8.txt"
+pip install 'apache-airflow[postgres,google]==2.8.0' \
+  --constraint "https://raw.githubusercontent.com/apache/airflow/constraints-2.8.0/constraints-3.8.txt"
 ```
 
 For information on installing provider packages, check
@@ -295,7 +295,7 @@ Apache Airflow version life cycle:
 
 | Version   | Current Patch/Minor   | State     | First Release   | Limited Support   | EOL/Terminated   |
 |-----------|-----------------------|-----------|-----------------|-------------------|------------------|
-| 2         | 2.7.3                 | Supported | Dec 17, 2020    | TBD               | TBD              |
+| 2         | 2.8.0                 | Supported | Dec 17, 2020    | TBD               | TBD              |
 | 1.10      | 1.10.15               | EOL       | Aug 27, 2018    | Dec 17, 2020      | June 17, 2021    |
 | 1.9       | 1.9.0                 | EOL       | Jan 03, 2018    | Aug 27, 2018      | Aug 27, 2018     |
 | 1.8       | 1.8.2                 | EOL       | Mar 19, 2017    | Jan 03, 2018      | Jan 03, 2018     |
diff --git a/airflow/__init__.py b/airflow/__init__.py
index b63ff1dc05..59117d2950 100644
--- a/airflow/__init__.py
+++ b/airflow/__init__.py
@@ -26,7 +26,7 @@ isort:skip_file
 """
 from __future__ import annotations
 
-__version__ = "2.8.0.dev0"
+__version__ = "2.8.0"
 
 # flake8: noqa: F401
 
diff --git a/airflow/api_connexion/openapi/v1.yaml b/airflow/

(airflow) branch v2-8-test updated (78c5029ce1 -> d6c7d33d43)

2023-11-21 Thread ephraimanierobi
This is an automated email from the ASF dual-hosted git repository.

ephraimanierobi pushed a change to branch v2-8-test
in repository https://gitbox.apache.org/repos/asf/airflow.git


omit 78c5029ce1 Update RELEASE_NOTES.rst
omit 0137ae90be Update version to 2.8.0
 new 729c1df9eb Update version to 2.8.0
 new d6c7d33d43 Update RELEASE_NOTES.rst

This update added new revisions after undoing existing revisions.
That is to say, some revisions that were in the old version of the
branch are not in the new version.  This situation occurs
when a user --force pushes a change and generates a repository
containing something like this:

 * -- * -- B -- O -- O -- O   (78c5029ce1)
            \
             N -- N -- N   refs/heads/v2-8-test (d6c7d33d43)

You should already have received notification emails for all of the O
revisions, and so the following emails describe only the N revisions
from the common base, B.

Any revisions marked "omit" are not gone; other references still
refer to them.  Any revisions marked "discard" are gone forever.

The 2 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.


Summary of changes:
 README.md|  2 +-
 RELEASE_NOTES.rst| 65 +---
 generated/PYPI_README.md |  2 +-
 3 files changed, 19 insertions(+), 50 deletions(-)



(airflow) 02/02: Update RELEASE_NOTES.rst

2023-11-21 Thread ephraimanierobi
This is an automated email from the ASF dual-hosted git repository.

ephraimanierobi pushed a commit to branch v2-8-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit d6c7d33d435d8aa2d2ea4b8a3893defaf3ba6bc8
Author: Ephraim Anierobi 
AuthorDate: Mon Nov 20 22:51:30 2023 +0100

Update RELEASE_NOTES.rst
---
 RELEASE_NOTES.rst   | 162 +++-
 newsfragments/35460.significant.rst |  10 ---
 2 files changed, 161 insertions(+), 11 deletions(-)

diff --git a/RELEASE_NOTES.rst b/RELEASE_NOTES.rst
index 62183c8b58..4c374c3814 100644
--- a/RELEASE_NOTES.rst
+++ b/RELEASE_NOTES.rst
@@ -21,7 +21,167 @@
 
 .. towncrier release notes start
 
-Airflow 2.7.3 (2023-11-04)
+Airflow 2.8.0 (2023-12-14)
+--------------------------
+
+Significant Changes
+^^^^^^^^^^^^^^^^^^^
+
+- Raw HTML code in DAG docs and DAG params descriptions is disabled by default
+
+  To ensure that no malicious javascript can be injected with DAG descriptions or trigger UI forms by DAG authors,
+  a new parameter ``webserver.allow_raw_html_descriptions`` was added with a default value of ``False``.
+  If you trust your DAG authors' code and want to allow using raw HTML in DAG descriptions and params, you can restore the previous
+  behavior by setting the configuration value to ``True``.
+
+  To ensure Airflow is secure by default, the raw HTML support in the trigger UI has been superseded by markdown support via
+  the ``description_md`` attribute. If you have been using ``description_html``, please migrate to ``description_md``.
+  The ``custom_html_form`` is now deprecated. (#35460)
+
+New Features
+""""""""""""
+- AIP-58: Add Airflow ObjectStore (AFS) (`AIP-58 <https://github.com/apache/airflow/pulls?q=is%3Apr+is%3Amerged+label%3AAIP-58+milestone%3A%22Airflow+2.8.0%22>`_)
+- Add "literal" wrapper to disable field templating (#35017)
+- Add task context logging feature to allow forwarding messages to task logs (#32646, #32693)
+- Add Listener hooks for Datasets (#34418)
+- Allow override of navbar text color (#35505)
+- Add lightweight serialization for deltalake tables (#35462)
+- Add support for serialization of iceberg tables (#35456)
+- ``prev_end_date_success`` method access (#34528)
+- Add task parameter to set custom logger name (#34964)
+- Add pyspark decorator (#35247)
+- Add trigger as a valid option for the db clean command (#34908)
+- Add decorators for external and venv python branching operators (#35043)
+- Allow PythonVenvOperator using other index url (#33017)
+- Add Python Virtualenv Operator Caching (#33355)
+- Introduce a generic export for containerized executor logging (#34903)
+- Add ability to clear downstream tis in ``List Task Instances`` view  (#34529)
+- Attribute ``clear_number`` to track DAG run being cleared (#34126)
+- Add BranchPythonVirtualenvOperator (#33356)
+- Allow PythonVenvOperator using other index url (#33017)
+- Add CLI notification commands to providers (#33116)
+
+Improvements
+""""""""""""
+- Move external logs links to top of react logs page (#35668)
+- Change terminal mode to ``cbreak`` in ``execute_interactive`` and handle ``SIGINT`` (#35602)
+- Make raw HTML descriptions configurable (#35460)
+- Allow email field to be templated (#35546)
+- Hide logical date and run id in trigger UI form (#35284)
+- Improved instructions for adding dependencies in TaskFlow (#35406)
+- Add optional exit code to list import errors (#35378)
+- Limit query result on DB rather than client in ``synchronize_log_template`` function (#35366)
+- Feature: Allow description to be passed in when using variables CLI (#34791)
+- Allow optional defaults in required fields with manual triggered dags (#31301)
+- Permitting airflow kerberos to run in different modes (#35146)
+- Refactor commands to unify daemon context handling (#34945)
+- Add extra fields to plugins endpoint (#34913)
+- Add description to pools view (#34862)
+- Move cli's Connection export and Variable export command print logic to a separate function (#34647)
+- Extract and reuse get_kerberos_principle func from get_kerberos_principle (#34936)
+- Change type annotation for ``BaseOperatorLink.operators`` (#35003)
+- Optimise and migrate to ``SA2-compatible`` syntax for TaskReschedule (#33720)
+- Consolidate the permissions name in SlaMissModelView (#34949)
+- Add debug log saying what's being run to ``EventScheduler`` (#34808)
+- Increase log reader stream loop sleep duration to 1 second (#34789)
+- Resolve pydantic deprecation warnings re ``update_forward_refs`` (#34657)
+- Unify mapped task group lookup logic (#34637)
+- Allow filtering event logs by attributes (#34417)
+- Make connection login and password TEXT (#32815)
+- Ban import ``Dataset`` from ``airflow`` package in codebase (#34610)
+- Use ``airflow.datasets.Dataset`` in examples and

(airflow) 03/37: Remove backcompat with Airflow 2.3/2.4 in providers (#35727)

2023-11-22 Thread ephraimanierobi
This is an automated email from the ASF dual-hosted git repository.

ephraimanierobi pushed a commit to branch v2-8-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 619855884356abbd2e6d8a4f7956e4a0510b8c9e
Author: Andrey Anshin 
AuthorDate: Mon Nov 20 18:11:54 2023 +0400

Remove backcompat with Airflow 2.3/2.4 in providers (#35727)

* Remove backcompat with Airflow 2.3/2.4 in providers

* Revert changes in sql.py
---
 airflow/providers/google/cloud/hooks/gcs.py  |  9 +
 airflow/providers/google/cloud/secrets/secret_manager.py | 14 ++
 airflow/providers/microsoft/azure/secrets/key_vault.py   | 14 ++
 3 files changed, 13 insertions(+), 24 deletions(-)

diff --git a/airflow/providers/google/cloud/hooks/gcs.py b/airflow/providers/google/cloud/hooks/gcs.py
index d8bb36037f..1a74d09d55 100644
--- a/airflow/providers/google/cloud/hooks/gcs.py
+++ b/airflow/providers/google/cloud/hooks/gcs.py
@@ -45,6 +45,7 @@ from airflow.exceptions import AirflowException, AirflowProviderDeprecationWarni
 from airflow.providers.google.cloud.utils.helpers import normalize_directory_path
 from airflow.providers.google.common.consts import CLIENT_INFO
 from airflow.providers.google.common.hooks.base_google import GoogleBaseAsyncHook, GoogleBaseHook
+from airflow.typing_compat import ParamSpec
 from airflow.utils import timezone
 from airflow.version import version
 
@@ -54,14 +55,6 @@ if TYPE_CHECKING:
 from aiohttp import ClientSession
 from google.api_core.retry import Retry
 
-try:
-# Airflow 2.3 doesn't have this yet
-from airflow.typing_compat import ParamSpec
-except ImportError:
-try:
-from typing import ParamSpec  # type: ignore[no-redef, attr-defined]
-except ImportError:
-from typing_extensions import ParamSpec
 
 RT = TypeVar("RT")
 T = TypeVar("T", bound=Callable)
diff --git a/airflow/providers/google/cloud/secrets/secret_manager.py b/airflow/providers/google/cloud/secrets/secret_manager.py
index fd8b8e33e2..a40c6bfbe5 100644
--- a/airflow/providers/google/cloud/secrets/secret_manager.py
+++ b/airflow/providers/google/cloud/secrets/secret_manager.py
@@ -28,7 +28,6 @@ from airflow.providers.google.cloud._internal_client.secret_manager_client impor
 from airflow.providers.google.cloud.utils.credentials_provider import get_credentials_and_project_id
 from airflow.secrets import BaseSecretsBackend
 from airflow.utils.log.logging_mixin import LoggingMixin
-from airflow.version import version as airflow_version
 
 log = logging.getLogger(__name__)
 
@@ -154,13 +153,12 @@ class CloudSecretManagerBackend(BaseSecretsBackend, LoggingMixin):
 :param conn_id: the connection id
 :return: deserialized Connection
 """
-if _parse_version(airflow_version) >= (2, 3):
-warnings.warn(
-f"Method `{self.__class__.__name__}.get_conn_uri` is deprecated and will be removed "
-"in a future release.  Please use method `get_conn_value` instead.",
-AirflowProviderDeprecationWarning,
-stacklevel=2,
-)
+warnings.warn(
+f"Method `{self.__class__.__name__}.get_conn_uri` is deprecated and will be removed "
+"in a future release.  Please use method `get_conn_value` instead.",
+AirflowProviderDeprecationWarning,
+stacklevel=2,
+)
 return self.get_conn_value(conn_id)
 
 def get_variable(self, key: str) -> str | None:
diff --git a/airflow/providers/microsoft/azure/secrets/key_vault.py b/airflow/providers/microsoft/azure/secrets/key_vault.py
index 794788206c..bfa9117b11 100644
--- a/airflow/providers/microsoft/azure/secrets/key_vault.py
+++ b/airflow/providers/microsoft/azure/secrets/key_vault.py
@@ -38,7 +38,6 @@ from airflow.exceptions import AirflowProviderDeprecationWarning
 from airflow.providers.microsoft.azure.utils import get_sync_default_azure_credential
 from airflow.secrets import BaseSecretsBackend
 from airflow.utils.log.logging_mixin import LoggingMixin
-from airflow.version import version as airflow_version
 
 
 def _parse_version(val):
@@ -170,13 +169,12 @@ class AzureKeyVaultBackend(BaseSecretsBackend, LoggingMixin):
 :param conn_id: the connection id
 :return: deserialized Connection
 """
-if _parse_version(airflow_version) >= (2, 3):
-warnings.warn(
-f"Method `{self.__class__.__name__}.get_conn_uri` is deprecated and will be removed "
-"in a future release.  Please use method `get_conn_value` instead.",
-AirflowProviderDeprecationWarning,
-stacklevel=2,
-)
+warnings.warn(
+f"Method `{self.__class__.__name__}.g

(airflow) 15/37: Set mark_end_on_close after set_context (#35761)

2023-11-22 Thread ephraimanierobi
This is an automated email from the ASF dual-hosted git repository.

ephraimanierobi pushed a commit to branch v2-8-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit f40e1c17ee07ebeb6ebf1494e1fc845a00e599dc
Author: Daniel Standish <15932138+dstand...@users.noreply.github.com>
AuthorDate: Tue Nov 21 05:42:18 2023 -0800

Set mark_end_on_close after set_context (#35761)

In ES task handler, set_context applies its own logic for mark_end_on_close, so we must set the attr after for our override to persist.
---
 airflow/utils/log/task_context_logger.py | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/airflow/utils/log/task_context_logger.py b/airflow/utils/log/task_context_logger.py
index 0661789f5b..84ed207e3a 100644
--- a/airflow/utils/log/task_context_logger.py
+++ b/airflow/utils/log/task_context_logger.py
@@ -98,9 +98,9 @@ class TaskContextLogger:
 
 task_handler = copy(self.task_handler)
 try:
+task_handler.set_context(ti, identifier=self.component_name)
 if hasattr(task_handler, "mark_end_on_close"):
 task_handler.mark_end_on_close = False
-task_handler.set_context(ti, identifier=self.component_name)
 filename, lineno, func, stackinfo = logger.findCaller()
 record = logging.LogRecord(
 self.component_name, level, filename, lineno, msg, args, None, func=func
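
The ordering matters because ``set_context`` can itself assign ``mark_end_on_close``; a minimal, self-contained sketch of the pitfall (illustrative class, not the real handler):

```python
# Why the override must come after set_context: the handler's own
# set_context logic would otherwise clobber it.
class Handler:
    def set_context(self, ti):
        # Stands in for the ES handler's internal mark_end_on_close logic.
        self.mark_end_on_close = True

h = Handler()
h.mark_end_on_close = False  # set first...
h.set_context(ti=None)       # ...then overwritten by set_context
assert h.mark_end_on_close is True

h.set_context(ti=None)       # set context first,
h.mark_end_on_close = False  # then override, so it persists
assert h.mark_end_on_close is False
```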



(airflow) 14/37: Fix broken link to Weaviate docs (#35776)

2023-11-22 Thread ephraimanierobi
This is an automated email from the ASF dual-hosted git repository.

ephraimanierobi pushed a commit to branch v2-8-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 1df306a337c8951b7e8cf06ae6331313b8b23e9c
Author: Pankaj Koti 
AuthorDate: Tue Nov 21 17:34:50 2023 +0530

Fix broken link to Weaviate docs (#35776)
---
 docs/apache-airflow-providers-weaviate/connections.rst | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/docs/apache-airflow-providers-weaviate/connections.rst b/docs/apache-airflow-providers-weaviate/connections.rst
index fc46c972a8..5e16164ff6 100644
--- a/docs/apache-airflow-providers-weaviate/connections.rst
+++ b/docs/apache-airflow-providers-weaviate/connections.rst
@@ -46,7 +46,7 @@ Extra (optional)
 * If you'd like to use Vectorizers for your class, configure the API keys to use the corresponding
   embedding API. The extras accepts a key ``additional_headers`` containing the dictionary
   of API keys for the embedding API authentication. They are mentioned in a section here:
-  `addtional_headers <https://weaviate.io/developers/academy/zero_to_mvp/hello_weaviate/hands_on#-client-instantiation>__`
+  `addtional_headers <https://weaviate.io/developers/academy/zero_to_mvp/hello_weaviate/hands_on#-client-instantiation>`__
 
 Weaviate API Token (optional)
Specify your Weaviate API Key to connect when API Key option is to be used for authentication.



(airflow) 09/37: Update README.md to reflect changes we agreed to the versioning (#35764)

2023-11-22 Thread ephraimanierobi
This is an automated email from the ASF dual-hosted git repository.

ephraimanierobi pushed a commit to branch v2-8-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 7642b29f5c0eaef898839579304ba0c8223801f1
Author: Jarek Potiuk 
AuthorDate: Mon Nov 20 23:59:16 2023 +0100

Update README.md to reflect changes we agreed to the versioning (#35764)

There were two changes that we agreed and voted in the community
that have not been properly reflected yet in README.md (they are
reflected in detailed documentation though):

* for providers, bumping minimum version of Airflow is not a reason
  to bump MAJOR version - only MINOR

* for API clients - they follow their own versioning SemVer scheme,
  independent from Airflow.
---
 README.md | 15 ---
 1 file changed, 4 insertions(+), 11 deletions(-)

diff --git a/README.md b/README.md
index 294e52d81a..f68615dcd1 100644
--- a/README.md
+++ b/README.md
@@ -269,22 +269,15 @@ packages:
  they are present in providers as `install_requires` limitations. We aim to keep backwards
  compatibility of providers with all previously released Airflow 2 versions but
  there will sometimes be breaking changes that might make some, or all
-  providers, have minimum Airflow version specified. Change of that minimum supported Airflow version
-  is a breaking change for provider because installing the new provider might automatically
-  upgrade Airflow (which might be an undesired side effect of upgrading provider).
+  providers, have minimum Airflow version specified.
* **Airflow Helm Chart**: SemVer rules apply to changes in the chart only. SemVer MAJOR and MINOR
  versions for the chart are independent of the Airflow version. We aim to keep backwards
  compatibility of the Helm Chart with all released Airflow 2 versions, but some new features might
  only work starting from specific Airflow releases. We might however limit the Helm
  Chart to depend on minimal Airflow version.
-* **Airflow API clients**: SemVer MAJOR and MINOR versions follow MAJOR and MINOR versions of Airflow.
-  The first MAJOR or MINOR X.Y.0 release of Airflow should always be followed by X.Y.0 release of
-  all clients. An airflow PATCH X.Y.Z release can be followed by a PATCH release of API clients, only
-  if this PATCH is relevant to the clients.
-  The clients then can release their own PATCH releases with bugfixes, independently of Airflow PATCH releases.
-  As a consequence, each API client will have its own PATCH version that may or may not be in sync with the Airflow
-  PATCH version. For a specific MAJOR/MINOR Airflow version, users should favor the latest PATCH version of clients
-  independently of their Airflow PATCH version.
+* **Airflow API clients**: Their versioning is independent from Airflow versions. They follow their own
+  SemVer rules for breaking changes and new features - which for example allows to change the way we generate
+  the clients.
 
 ## Version Life Cycle
 



(airflow) 08/37: More detail on mandatory task arguments (#35740)

2023-11-22 Thread ephraimanierobi
This is an automated email from the ASF dual-hosted git repository.

ephraimanierobi pushed a commit to branch v2-8-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 9b832f6acd79ec1329ad235e65a7abd2be93a0eb
Author: Cai Parry-Jones <97813242+cai...@users.noreply.github.com>
AuthorDate: Mon Nov 20 22:02:59 2023 +

More detail on mandatory task arguments (#35740)

* More detail on mandatory task arguments

Current documentation notes that the arguments 'task_id' and 'owner' are
both mandatory. This might confuse new users to believe that both arguments
require user input to avoid an error. But 'owner' has a default value,
so this argument should be less of a concern for user task and dag creation.
This commit aims to communicate that.

* fix typo

* Update fundamentals.rst

-

Co-authored-by: Jarek Potiuk 
---
 docs/apache-airflow/tutorial/fundamentals.rst | 7 +--
 1 file changed, 5 insertions(+), 2 deletions(-)

diff --git a/docs/apache-airflow/tutorial/fundamentals.rst b/docs/apache-airflow/tutorial/fundamentals.rst
index 2c710530b2..e68205aca9 100644
--- a/docs/apache-airflow/tutorial/fundamentals.rst
+++ b/docs/apache-airflow/tutorial/fundamentals.rst
@@ -146,8 +146,11 @@ The precedence rules for a task are as follows:
 2.  Values that exist in the ``default_args`` dictionary
 3.  The operator's default value, if one exists
 
-A task must include or inherit the arguments ``task_id`` and ``owner``,
-otherwise Airflow will raise an exception.
+.. note::
+A task must include or inherit the arguments ``task_id`` and ``owner``,
+otherwise Airflow will raise an exception. A fresh install of Airflow will
+have a default value of 'airflow' set for ``owner``, so you only really need
+to worry about ensuring ``task_id`` has a value.
 
 Templating with Jinja
 ---------------------
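
A tiny, hypothetical illustration of the note above (``EmptyOperator`` stands in for any operator; the ``owner`` default comes from Airflow's configuration):

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.empty import EmptyOperator

with DAG(dag_id="mandatory_args_sketch", start_date=datetime(2023, 1, 1), schedule=None):
    # task_id is mandatory and must be supplied explicitly; owner is also
    # required, but a fresh install defaults it to "airflow", so only
    # task_id normally needs attention.
    EmptyOperator(task_id="my_task")
```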



(airflow) 20/37: Check attr on parent not self re TaskContextLogger set_context (#35780)

2023-11-22 Thread ephraimanierobi
This is an automated email from the ASF dual-hosted git repository.

ephraimanierobi pushed a commit to branch v2-8-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit feaeb8c5fb246f945be5aa96ba49441c6700ae39
Author: Daniel Standish <15932138+dstand...@users.noreply.github.com>
AuthorDate: Tue Nov 21 09:55:16 2023 -0800

Check attr on parent not self re TaskContextLogger set_context (#35780)

To know whether we should supply the `identifier` param, we need to check the parent class.
---
 airflow/providers/amazon/aws/log/s3_task_handler.py| 4 +++-
 airflow/providers/elasticsearch/log/es_task_handler.py | 4 +++-
 airflow/providers/google/cloud/log/gcs_task_handler.py | 4 +++-
 airflow/providers/microsoft/azure/log/wasb_task_handler.py | 4 +++-
 4 files changed, 12 insertions(+), 4 deletions(-)

diff --git a/airflow/providers/amazon/aws/log/s3_task_handler.py b/airflow/providers/amazon/aws/log/s3_task_handler.py
index 761c4ce463..f3664f7c41 100644
--- a/airflow/providers/amazon/aws/log/s3_task_handler.py
+++ b/airflow/providers/amazon/aws/log/s3_task_handler.py
@@ -78,7 +78,9 @@ class S3TaskHandler(FileTaskHandler, LoggingMixin):
 )
 
 def set_context(self, ti: TaskInstance, *, identifier: str | None = None) -> None:
-if getattr(self, "supports_task_context_logging", False):
+# todo: remove-at-min-airflow-version-2.8
+#   after Airflow 2.8 can always pass `identifier`
+if getattr(super(), "supports_task_context_logging", False):
 super().set_context(ti, identifier=identifier)
 else:
 super().set_context(ti)
diff --git a/airflow/providers/elasticsearch/log/es_task_handler.py b/airflow/providers/elasticsearch/log/es_task_handler.py
index 1e8c75b7e3..c9d3a180e1 100644
--- a/airflow/providers/elasticsearch/log/es_task_handler.py
+++ b/airflow/providers/elasticsearch/log/es_task_handler.py
@@ -443,7 +443,9 @@ class ElasticsearchTaskHandler(FileTaskHandler, ExternalLoggingMixin, LoggingMix
 self.handler.setLevel(self.level)
 self.handler.setFormatter(self.formatter)
 else:
-if getattr(self, "supports_task_context_logging", False):
+# todo: remove-at-min-airflow-version-2.8
+#   after Airflow 2.8 can always pass `identifier`
+if getattr(super(), "supports_task_context_logging", False):
 super().set_context(ti, identifier=identifier)
 else:
 super().set_context(ti)
diff --git a/airflow/providers/google/cloud/log/gcs_task_handler.py b/airflow/providers/google/cloud/log/gcs_task_handler.py
index 39d0f072a8..9921bb8753 100644
--- a/airflow/providers/google/cloud/log/gcs_task_handler.py
+++ b/airflow/providers/google/cloud/log/gcs_task_handler.py
@@ -142,7 +142,9 @@ class GCSTaskHandler(FileTaskHandler, LoggingMixin):
 )
 
 def set_context(self, ti: TaskInstance, *, identifier: str | None = None) -> None:
-if getattr(self, "supports_task_context_logging", False):
+# todo: remove-at-min-airflow-version-2.8
+#   after Airflow 2.8 can always pass `identifier`
+if getattr(super(), "supports_task_context_logging", False):
 super().set_context(ti, identifier=identifier)
 else:
 super().set_context(ti)
diff --git a/airflow/providers/microsoft/azure/log/wasb_task_handler.py b/airflow/providers/microsoft/azure/log/wasb_task_handler.py
index f3a00e8432..c57de1acb1 100644
--- a/airflow/providers/microsoft/azure/log/wasb_task_handler.py
+++ b/airflow/providers/microsoft/azure/log/wasb_task_handler.py
@@ -96,7 +96,9 @@ class WasbTaskHandler(FileTaskHandler, LoggingMixin):
 return None
 
 def set_context(self, ti: TaskInstance, *, identifier: str | None = None) -> None:
-if getattr(self, "supports_task_context_logging", False):
+# todo: remove-at-min-airflow-version-2.8
+#   after Airflow 2.8 can always pass `identifier`
+if getattr(super(), "supports_task_context_logging", False):
 super().set_context(ti, identifier=identifier)
 else:
 super().set_context(ti)
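
The distinction is that these handlers declare the capability on themselves, while what matters is whether the parent ``FileTaskHandler`` has it; a minimal sketch of the ``self`` vs. ``super()`` lookup difference (illustrative classes):

```python
# Why getattr must target the parent, not self.
class OldFileTaskHandler:           # stands in for a pre-2.8 FileTaskHandler
    def set_context(self, ti):      # no identifier support here
        ...

class S3Handler(OldFileTaskHandler):
    supports_task_context_logging = True  # declared by the subclass itself

    def check(self):
        on_self = getattr(self, "supports_task_context_logging", False)
        on_parent = getattr(super(), "supports_task_context_logging", False)
        return on_self, on_parent

# self sees the subclass attribute (True); super() correctly reports that
# the parent lacks it (False), so identifier must not be passed through.
assert S3Handler().check() == (True, False)
```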



(airflow) 35/37: Add borderWidthRight to grid for Firefox scrollbar (#35346)

2023-11-22 Thread ephraimanierobi
This is an automated email from the ASF dual-hosted git repository.

ephraimanierobi pushed a commit to branch v2-8-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 55914e14cbb587e6712c027e039929bb5bb105c0
Author: Victor Chiapaikeo 
AuthorDate: Wed Nov 22 11:58:10 2023 -0500

Add borderWidthRight to grid for Firefox scrollbar (#35346)
---
 airflow/www/static/js/dag/grid/index.tsx | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/airflow/www/static/js/dag/grid/index.tsx b/airflow/www/static/js/dag/grid/index.tsx
index 638f4268c0..dc73aecee3 100644
--- a/airflow/www/static/js/dag/grid/index.tsx
+++ b/airflow/www/static/js/dag/grid/index.tsx
@@ -169,7 +169,7 @@ const Grid = ({
 mt={8}
 overscrollBehavior="contain"
   >
-
+
   
 

(airflow) 19/37: Implement login and logout in AWS auth manager (#35488)

2023-11-22 Thread ephraimanierobi
This is an automated email from the ASF dual-hosted git repository.

ephraimanierobi pushed a commit to branch v2-8-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit e2e89668d46363bc0ecfa31058d02bc3e041de88
Author: Vincent <97131062+vincb...@users.noreply.github.com>
AuthorDate: Tue Nov 21 12:35:10 2023 -0500

Implement login and logout in AWS auth manager (#35488)
---
 CONTRIBUTING.rst   |   6 +-
 Dockerfile |   4 +-
 Dockerfile.ci  |   4 +-
 INSTALL|   6 +-
 airflow/auth/managers/base_auth_manager.py |  32 +++--
 airflow/auth/managers/fab/fab_auth_manager.py  |  16 +--
 airflow/auth/managers/fab/models/__init__.py   |   3 +
 airflow/auth/managers/models/base_user.py  |   7 +-
 .../amazon/aws/auth_manager/__init__.py}   |  17 ---
 .../amazon/aws/auth_manager/aws_auth_manager.py| 143 +++
 .../amazon/aws/auth_manager/constants.py}  |  20 +--
 .../aws/auth_manager/security_manager/__init__.py} |  17 ---
 .../aws_security_manager_override.py}  |  25 ++--
 airflow/providers/amazon/aws/auth_manager/user.py  |  51 +++
 .../amazon/aws/auth_manager/views/__init__.py} |  17 ---
 .../amazon/aws/auth_manager/views/auth.py  | 149 
 airflow/providers/amazon/provider.yaml |  25 
 airflow/www/extensions/init_appbuilder.py  |   2 +-
 airflow/www/extensions/init_auth_manager.py|   6 +-
 airflow/www/extensions/init_security.py|   4 +-
 airflow/www/security_manager.py|  33 +++--
 airflow/www/utils.py   |   5 +-
 docs/apache-airflow/extra-packages-ref.rst |   2 +
 docs/docker-stack/changelog.rst|   3 +
 docs/spelling_wordlist.txt |   1 +
 scripts/docker/install_os_dependencies.sh  |   4 +-
 setup.py   |   7 +
 .../endpoints/test_forward_to_fab_endpoint.py  |   2 +-
 tests/auth/managers/fab/test_fab_auth_manager.py   |  38 +
 tests/auth/managers/test_base_auth_manager.py  |  48 +--
 .../providers/amazon/aws/auth_manager/__init__.py  |  17 ---
 .../aws/auth_manager/security_manager/__init__.py  |  17 ---
 .../test_aws_security_manager_override.py  |  56 
 .../aws/auth_manager/test_aws_auth_manager.py  | 114 +++
 .../amazon/aws/auth_manager/test_constants.py  |  23 +--
 .../providers/amazon/aws/auth_manager/test_user.py |  32 +++--
 .../amazon/aws/auth_manager/views/__init__.py  |  17 ---
 .../amazon/aws/auth_manager/views/test_auth.py | 156 +
 38 files changed, 865 insertions(+), 264 deletions(-)

diff --git a/CONTRIBUTING.rst b/CONTRIBUTING.rst
index 5a5fae65ca..1114c16074 100644
--- a/CONTRIBUTING.rst
+++ b/CONTRIBUTING.rst
@@ -679,9 +679,9 @@ github_enterprise, google, google_auth, grpc, hashicorp, hdfs, hive, http, imap,
 jenkins, kerberos, kubernetes, ldap, leveldb, microsoft.azure, microsoft.mssql, microsoft.psrp,
 microsoft.winrm, mongo, mssql, mysql, neo4j, odbc, openai, openfaas, openlineage, opensearch,
 opsgenie, oracle, otel, pagerduty, pandas, papermill, password, pgvector, pinecone, pinot, plexus,
-postgres, presto, rabbitmq, redis, s3, s3fs, salesforce, samba, segment, sendgrid, sentry, sftp,
-singularity, slack, smtp, snowflake, spark, sqlite, ssh, statsd, tableau, tabular, telegram, trino,
-vertica, virtualenv, weaviate, webhdfs, winrm, yandex, zendesk
+postgres, presto, rabbitmq, redis, s3, s3fs, salesforce, samba, saml, segment, sendgrid, sentry,
+sftp, singularity, slack, smtp, snowflake, spark, sqlite, ssh, statsd, tableau, tabular, telegram,
+trino, vertica, virtualenv, weaviate, webhdfs, winrm, yandex, zendesk
   .. END EXTRAS HERE
 
 Provider packages
diff --git a/Dockerfile b/Dockerfile
index 94794b3735..b9b358d0c7 100644
--- a/Dockerfile
+++ b/Dockerfile
@@ -97,7 +97,7 @@ function get_dev_apt_deps() {
 DEV_APT_DEPS="apt-transport-https apt-utils build-essential 
ca-certificates dirmngr \
 freetds-bin freetds-dev git gosu graphviz graphviz-dev krb5-user ldap-utils 
libffi-dev libgeos-dev \
 libkrb5-dev libldap2-dev libleveldb1d libleveldb-dev libsasl2-2 libsasl2-dev 
libsasl2-modules \
-libssl-dev locales lsb-release openssh-client pkgconf sasl2-bin \
+libssl-dev libxmlsec1 libxmlsec1-dev locales lsb-release openssh-client 
pkgconf sasl2-bin \
 software-properties-common sqlite3 sudo unixodbc unixodbc-dev"
 export DEV_APT_DEPS
 fi
@@ -123,7 +123,7 @@ function get_runtime_apt_deps() {
 if [[ "${RUNTIME_APT_DEPS=}" == "" ]]; then
 RUNTIME_APT_DEPS="apt-transport-https apt-utils ca-certificates \
 curl dumb-init freetds-bin gosu krb5-user libgeos-dev \
-ldap-

(airflow) 30/37: Fix for infinite recursion due to secrets_masker (#35048)

2023-11-22 Thread ephraimanierobi
This is an automated email from the ASF dual-hosted git repository.

ephraimanierobi pushed a commit to branch v2-8-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 673f7f837fa00471008ebae88b192f7cec729d55
Author: Usiel Riedl 
AuthorDate: Wed Nov 22 18:19:04 2023 +0800

Fix for infinite recursion due to secrets_masker (#35048)

* Fix for infinite recursion due to secrets_masker

We can get into trouble with types that cannot be instantiated via re2's
`type(obj)()` call. The `secrets_masker` then fails, which triggers a warning
log; that log also fails because we pass the object to the logger, where it is
masked again, and so forth.

We can break the recursion by emitting the log without trying to redact the
value again (this ensures no new bug can cause a stack overflow). This issue
has occurred previously:
https://github.com/apache/airflow/issues/19816#issuecomment-983311373
Additionally, we fix this particular bug by ensuring whatever re2 receives
is a plain `str`.

I noticed this issue while working with a DAG that calls Airflow's DB 
cleanup function.

Example DAG:
```
from datetime import datetime

from airflow import DAG
from airflow.models import Variable
from airflow.operators.python import PythonOperator


class MyStringClass(str):
    def __init__(self, required_arg):
        pass


def fail(task_instance):
    # make sure the `SecretsMasker` has a replacer
    Variable.set(key="secret", value="secret_value")
    Variable.get("secret")
    # trigger the infinite recursion
    task_instance.log.info("%s", MyStringClass("secret_value"))


with DAG(
    dag_id="secrets_masker_recursion",
    start_date=datetime(2023, 9, 26),
):
    PythonOperator(task_id="fail", python_callable=fail)
```

* Improve error message

-

Co-authored-by: Tzu-ping Chung 
---
 airflow/utils/log/secrets_masker.py|  9 +
 tests/utils/log/test_secrets_masker.py | 18 ++
 2 files changed, 23 insertions(+), 4 deletions(-)

diff --git a/airflow/utils/log/secrets_masker.py 
b/airflow/utils/log/secrets_masker.py
index 246377c169..0b1b65f840 100644
--- a/airflow/utils/log/secrets_masker.py
+++ b/airflow/utils/log/secrets_masker.py
@@ -261,7 +261,7 @@ class SecretsMasker(logging.Filter):
 # We can't replace specific values, but the key-based 
redacting
 # can still happen, so we can't short-circuit, we need to 
walk
 # the structure.
-return self.replacer.sub("***", item)
+return self.replacer.sub("***", str(item))
 return item
 elif isinstance(item, (tuple, set)):
 # Turn set in to tuple!
@@ -276,14 +276,15 @@ class SecretsMasker(logging.Filter):
 return item
 # I think this should never happen, but it does not hurt to leave it 
just in case
 # Well. It happened (see 
https://github.com/apache/airflow/issues/19816#issuecomment-983311373)
-# but it caused infinite recursion, so we need to cast it to str first.
+# but it caused infinite recursion; to avoid this we mark the log as
already filtered.
 except Exception as exc:
 log.warning(
-"Unable to redact %r, please report this via 
<https://github.com/apache/airflow/issues>. "
-"Error was: %s: %s",
+"Unable to redact value of type %s, please report this via "
+"<https://github.com/apache/airflow/issues>. Error was: %s: 
%s",
 item,
 type(exc).__name__,
 exc,
+extra={self.ALREADY_FILTERED_FLAG: True},
 )
 return item
 
diff --git a/tests/utils/log/test_secrets_masker.py 
b/tests/utils/log/test_secrets_masker.py
index ffaf2977ae..4657a7c1f3 100644
--- a/tests/utils/log/test_secrets_masker.py
+++ b/tests/utils/log/test_secrets_masker.py
@@ -305,6 +305,24 @@ class TestSecretsMasker:
 got = redact(val, max_depth=max_depth)
 assert got == expected
 
+def test_redact_with_str_type(self, logger, caplog):
+"""
+SecretsMasker's re2 replacer has issues handling a redactable item of 
type
+`str` with required constructor args. This test ensures there is a 
shim in
+place that avoids any issues.
+See: 
https://github.com/apache/airflow/issues/19816#issuecomment-983311373
+"""
+
+class StrLikeClassWithRequiredConstructorArg(str):
+def __init__(self, required_arg):
+pass
+
+text = StrLikeClassWithR
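
For context, a minimal, illustrative sketch of the recursion-breaking idea: a plain `logging.Filter` that flags every record it has processed, so a failure inside the filter can be reported without re-entering it. Only the flag-on-the-record trick and the `str()` cast mirror the diff above; everything else is a simplified stand-in, not Airflow's actual `SecretsMasker`.

```
import logging

ALREADY_FILTERED_FLAG = "already_filtered"


class RecursionSafeFilter(logging.Filter):
    """Skip records that have already been through this filter once."""

    def filter(self, record: logging.LogRecord) -> bool:
        if getattr(record, ALREADY_FILTERED_FLAG, False):
            return True  # never redact (or warn about) the same record twice
        try:
            record.msg = str(record.msg)  # give the re2 replacer a plain str
        except Exception as exc:
            # warn WITHOUT re-redacting: the flag keeps this warning from
            # re-entering the filter, so a redaction bug cannot recurse
            logging.getLogger(__name__).warning(
                "Unable to redact value of type %s. Error was: %s",
                type(record.msg).__name__,
                exc,
                extra={ALREADY_FILTERED_FLAG: True},
            )
        return True
```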

(airflow) 33/37: Fix HttpOperator pagination with `str` data (#35782)

2023-11-22 Thread ephraimanierobi
This is an automated email from the ASF dual-hosted git repository.

ephraimanierobi pushed a commit to branch v2-8-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit e02271003931c2d9df58a2b769c89cac8d4eb0a2
Author: Joffrey Bienvenu 
AuthorDate: Wed Nov 22 15:34:34 2023 +0100

Fix HttpOperator pagination with `str` data (#35782)

* feat: Restrict `data` parameter typing

Follows the hook's typing

* feat: Implement `data` override when string

* feat: Improve docstring about merging and overriding behavior

* fix: Add correct typing for mypy

* feat: add test

* fix: remove unused imports

* fix: Update SimpleHttpOperator docstring

* feat: Correctly test parameters overriding
---
 airflow/providers/http/operators/http.py| 50 +++
 airflow/providers/http/triggers/http.py |  2 +-
 tests/providers/http/operators/test_http.py | 63 ++---
 3 files changed, 84 insertions(+), 31 deletions(-)

diff --git a/airflow/providers/http/operators/http.py 
b/airflow/providers/http/operators/http.py
index 96415ed977..524de8c585 100644
--- a/airflow/providers/http/operators/http.py
+++ b/airflow/providers/http/operators/http.py
@@ -56,12 +56,17 @@ class HttpOperator(BaseOperator):
 :param pagination_function: A callable that generates the parameters used 
to call the API again,
 based on the previous response. Typically used when the API is 
paginated and returns for e.g a
 cursor, a 'next page id', or a 'next page URL'. When provided, the 
Operator will call the API
-repeatedly until this callable returns None. Also, the result of the 
Operator will become by
-default a list of Response.text objects (instead of a single response 
object). Same with the
-other injected functions (like response_check, response_filter, ...) 
which will also receive a
-list of Response object. This function receives a Response object form 
previous call, and should
-return a dict of parameters (`endpoint`, `data`, `headers`, 
`extra_options`), which will be merged
-and will override the one used in the initial API call.
+repeatedly until this callable returns None. The result of the 
Operator will become by default a
+list of Response.text objects (instead of a single response object). 
Same with the other injected
+functions (like response_check, response_filter, ...) which will also 
receive a list of Response
objects. This function receives a Response object from the previous call,
and should return a nested
dictionary with the following optional keys: `endpoint`, `data`,
`headers` and `extra_options`.
Those keys will be merged with and/or override the parameters provided in
the HttpOperator declaration.
Parameters are merged when they are both a dictionary (e.g.:
HttpOperator.headers will be merged
with the `headers` dict provided by this function). When merging, dict
items returned by this
function will override initial ones (e.g.: if both HttpOperator.headers
and `headers` have a 'cookie'
item, the one provided by `headers` is kept). Parameters are simply
overridden when any of them is a
string (e.g.: HttpOperator.endpoint is overridden by `endpoint`).
 :param response_check: A check against the 'requests' response object.
 The callable takes the response object as the first positional argument
 and optionally any number of keyword arguments available in the 
context dictionary.
@@ -101,7 +106,7 @@ class HttpOperator(BaseOperator):
 *,
 endpoint: str | None = None,
 method: str = "POST",
-data: Any = None,
+data: dict[str, Any] | str | None = None,
 headers: dict[str, str] | None = None,
 pagination_function: Callable[..., Any] | None = None,
 response_check: Callable[..., bool] | None = None,
@@ -271,9 +276,16 @@ class HttpOperator(BaseOperator):
 :param next_page_params: A dictionary containing the parameters for 
the next page.
 :return: A dictionary containing the merged parameters.
 """
+data: str | dict | None = None  # makes mypy happy
+next_page_data_param = next_page_params.get("data")
+if isinstance(self.data, dict) and isinstance(next_page_data_param, 
dict):
+data = merge_dicts(self.data, next_page_data_param)
+else:
+data = next_page_data_param or self.data
+
 return dict(
 endpoint=next_page_params.get("endpoint") or self.endpoint,
-data=merge_dicts(self.data, next_page_params.get("data", {})),
+data=data,
 headers=merge_dicts(self.headers, next_page_params.get("headers", 
{})),
 ex
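
A hedged usage sketch of the pagination behavior described in the docstring above; the connection id, endpoint, and `next_cursor` response field are hypothetical, while the operator and its parameters come from the diff:

```
from __future__ import annotations

from airflow.providers.http.operators.http import HttpOperator


def get_next_page(response) -> dict | None:
    """Return parameters for the next call, or None to stop paginating."""
    cursor = response.json().get("next_cursor")  # hypothetical response field
    if not cursor:
        return None
    # a dict here is merged with the operator's own `data`; a plain string
    # would simply override it, which is the case this commit fixes
    return dict(data={"cursor": cursor})


fetch_all = HttpOperator(
    task_id="fetch_all_pages",
    http_conn_id="my_api",  # hypothetical connection id
    endpoint="/items",      # hypothetical endpoint
    method="GET",
    data={"page_size": "100"},
    pagination_function=get_next_page,
)
```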

(airflow) 28/37: added Topic params for schema_settings and message_retention_duration. (#35767)

2023-11-22 Thread ephraimanierobi
This is an automated email from the ASF dual-hosted git repository.

ephraimanierobi pushed a commit to branch v2-8-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 08785f50f15a419e91a3eec03a60d9d0b9173343
Author: dmedora <21352998+dmed...@users.noreply.github.com>
AuthorDate: Wed Nov 22 01:21:59 2023 -0800

added Topic params for schema_settings and message_retention_duration. 
(#35767)
---
 airflow/providers/google/cloud/hooks/pubsub.py| 10 ++
 airflow/providers/google/cloud/operators/pubsub.py|  7 +++
 tests/providers/google/cloud/hooks/test_pubsub.py |  9 -
 tests/providers/google/cloud/operators/test_pubsub.py |  4 
 4 files changed, 29 insertions(+), 1 deletion(-)

diff --git a/airflow/providers/google/cloud/hooks/pubsub.py 
b/airflow/providers/google/cloud/hooks/pubsub.py
index 1573f32841..0f6e5fcff8 100644
--- a/airflow/providers/google/cloud/hooks/pubsub.py
+++ b/airflow/providers/google/cloud/hooks/pubsub.py
@@ -57,6 +57,7 @@ if TYPE_CHECKING:
 PushConfig,
 ReceivedMessage,
 RetryPolicy,
+SchemaSettings,
 )
 
 
@@ -182,6 +183,8 @@ class PubSubHook(GoogleBaseHook):
 labels: dict[str, str] | None = None,
 message_storage_policy: dict | MessageStoragePolicy = None,
 kms_key_name: str | None = None,
+schema_settings: dict | SchemaSettings = None,
+message_retention_duration: str | None = None,
 retry: Retry | _MethodDefault = DEFAULT,
 timeout: float | None = None,
 metadata: Sequence[tuple[str, str]] = (),
@@ -206,6 +209,11 @@ class PubSubHook(GoogleBaseHook):
 to be used to protect access to messages published on this topic.
 The expected format is
 ``projects/*/locations/*/keyRings/*/cryptoKeys/*``.
+:param schema_settings: (Optional) Settings for validating messages 
published against an
+existing schema. The expected format is ``projects/*/schemas/*``.
+:param message_retention_duration: (Optional) Indicates the minimum 
duration to retain a
+message after it is published to the topic. The expected format is 
a duration in
+seconds with up to nine fractional digits, ending with 's'. 
Example: "3.5s".
 :param retry: (Optional) A retry object used to retry requests.
 If None is specified, requests will not be retried.
 :param timeout: (Optional) The amount of time, in seconds, to wait for 
the request
@@ -228,6 +236,8 @@ class PubSubHook(GoogleBaseHook):
 "labels": labels,
 "message_storage_policy": message_storage_policy,
 "kms_key_name": kms_key_name,
+"schema_settings": schema_settings,
+"message_retention_duration": message_retention_duration,
 },
 retry=retry,
 timeout=timeout,
diff --git a/airflow/providers/google/cloud/operators/pubsub.py 
b/airflow/providers/google/cloud/operators/pubsub.py
index b8c90be0fe..3751e9a371 100644
--- a/airflow/providers/google/cloud/operators/pubsub.py
+++ b/airflow/providers/google/cloud/operators/pubsub.py
@@ -35,6 +35,7 @@ from google.cloud.pubsub_v1.types import (
 PushConfig,
 ReceivedMessage,
 RetryPolicy,
+SchemaSettings,
 )
 
 from airflow.providers.google.cloud.hooks.pubsub import PubSubHook
@@ -130,6 +131,8 @@ class PubSubCreateTopicOperator(GoogleCloudBaseOperator):
 labels: dict[str, str] | None = None,
 message_storage_policy: dict | MessageStoragePolicy = None,
 kms_key_name: str | None = None,
+schema_settings: dict | SchemaSettings = None,
+message_retention_duration: str | None = None,
 retry: Retry | _MethodDefault = DEFAULT,
 timeout: float | None = None,
 metadata: Sequence[tuple[str, str]] = (),
@@ -144,6 +147,8 @@ class PubSubCreateTopicOperator(GoogleCloudBaseOperator):
 self.labels = labels
 self.message_storage_policy = message_storage_policy
 self.kms_key_name = kms_key_name
+self.schema_settings = schema_settings
+self.message_retention_duration = message_retention_duration
 self.retry = retry
 self.timeout = timeout
 self.metadata = metadata
@@ -163,6 +168,8 @@ class PubSubCreateTopicOperator(GoogleCloudBaseOperator):
 labels=self.labels,
 message_storage_policy=self.message_storage_policy,
 kms_key_name=self.kms_key_name,
+schema_settings=self.schema_settings,
+message_retention_duration=self.message_retention_duration,
 retry=self.retry,
 timeout=self.timeout,
 metadata=self.metadata,
diff --git a/tests/providers/google/cloud/hooks/test_pubsub.py 
b/tests/providers/google/cloud/hooks/test_pubs
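
A hedged usage sketch of the two new parameters; the project, topic, and schema ids are placeholders, while the parameter names and formats come from the diff above:

```
from airflow.providers.google.cloud.operators.pubsub import PubSubCreateTopicOperator

create_topic = PubSubCreateTopicOperator(
    task_id="create_topic",
    project_id="my-gcp-project",  # placeholder
    topic="my-topic",             # placeholder
    schema_settings={
        "schema": "projects/my-gcp-project/schemas/my-schema",  # placeholder
        "encoding": "JSON",
    },
    # duration in seconds, up to nine fractional digits, ending with 's'
    message_retention_duration="600s",
)
```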

(airflow) 12/37: Remove backcompat inheritance for DbApiHook (#35754)

2023-11-22 Thread ephraimanierobi
This is an automated email from the ASF dual-hosted git repository.

ephraimanierobi pushed a commit to branch v2-8-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 73106932771ba3a1e8a08de3905b569d5462c416
Author: Andrey Anshin 
AuthorDate: Tue Nov 21 11:12:56 2023 +0400

Remove backcompat inheritance for DbApiHook (#35754)

* Remove backcompat inheritance for DbApiHook

* jwt_file > jwt__file

* simplify trino test
---
 airflow/providers/apache/impala/hooks/impala.py|  3 +-
 airflow/providers/common/sql/hooks/sql.py  | 18 +-
 airflow/providers/common/sql/hooks/sql.pyi |  4 +--
 .../providers/elasticsearch/hooks/elasticsearch.py |  4 +--
 airflow/providers/trino/hooks/trino.py | 19 ---
 tests/providers/odbc/hooks/test_odbc.py|  3 +-
 tests/providers/trino/hooks/test_trino.py  | 38 +++---
 7 files changed, 49 insertions(+), 40 deletions(-)

diff --git a/airflow/providers/apache/impala/hooks/impala.py 
b/airflow/providers/apache/impala/hooks/impala.py
index ab19865a9e..b8c79b4e25 100644
--- a/airflow/providers/apache/impala/hooks/impala.py
+++ b/airflow/providers/apache/impala/hooks/impala.py
@@ -35,7 +35,8 @@ class ImpalaHook(DbApiHook):
 hook_name = "Impala"
 
 def get_conn(self) -> Connection:
-connection = self.get_connection(self.impala_conn_id)  # pylint: 
disable=no-member
+conn_id: str = getattr(self, self.conn_name_attr)
+connection = self.get_connection(conn_id)
 return connect(
 host=connection.host,
 port=connection.port,
diff --git a/airflow/providers/common/sql/hooks/sql.py 
b/airflow/providers/common/sql/hooks/sql.py
index ab4eda5d8e..bb85dedc1c 100644
--- a/airflow/providers/common/sql/hooks/sql.py
+++ b/airflow/providers/common/sql/hooks/sql.py
@@ -34,12 +34,10 @@ from typing import (
 from urllib.parse import urlparse
 
 import sqlparse
-from packaging.version import Version
 from sqlalchemy import create_engine
 
 from airflow.exceptions import AirflowException
 from airflow.hooks.base import BaseHook
-from airflow.version import version
 
 if TYPE_CHECKING:
 from pandas import DataFrame
@@ -120,21 +118,7 @@ class ConnectorProtocol(Protocol):
 """
 
 
-# In case we are running it on Airflow 2.4+, we should use BaseHook, but on 
Airflow 2.3 and below
-# We want the DbApiHook to derive from the original DbApiHook from airflow, 
because otherwise
-# SqlSensor and BaseSqlOperator from "airflow.operators" and "airflow.sensors" 
will refuse to
-# accept the new Hooks as not derived from the original DbApiHook
-if Version(version) < Version("2.4"):
-try:
-from airflow.hooks.dbapi import DbApiHook as BaseForDbApiHook
-except ImportError:
-# just in case we have a problem with circular import
-BaseForDbApiHook: type[BaseHook] = BaseHook  # type: ignore[no-redef]
-else:
-BaseForDbApiHook: type[BaseHook] = BaseHook  # type: ignore[no-redef]
-
-
-class DbApiHook(BaseForDbApiHook):
+class DbApiHook(BaseHook):
 """
 Abstract base class for sql hooks.
 
diff --git a/airflow/providers/common/sql/hooks/sql.pyi 
b/airflow/providers/common/sql/hooks/sql.pyi
index dedac037df..41bd6ebf47 100644
--- a/airflow/providers/common/sql/hooks/sql.pyi
+++ b/airflow/providers/common/sql/hooks/sql.pyi
@@ -32,8 +32,8 @@ Definition of the public interface for 
airflow.providers.common.sql.hooks.sql
 isort:skip_file
 """
 from _typeshed import Incomplete
-from airflow.hooks.dbapi import DbApiHook as BaseForDbApiHook
-from typing import Any, Callable, Iterable, Mapping, Sequence
+from airflow.hooks.base import BaseHook as BaseForDbApiHook
+from typing import Any, Callable, Iterable, Mapping, Sequence, Union
 from typing_extensions import Protocol
 
 def return_single_query_results(
diff --git a/airflow/providers/elasticsearch/hooks/elasticsearch.py 
b/airflow/providers/elasticsearch/hooks/elasticsearch.py
index 6c93586892..2d9fca4a97 100644
--- a/airflow/providers/elasticsearch/hooks/elasticsearch.py
+++ b/airflow/providers/elasticsearch/hooks/elasticsearch.py
@@ -108,9 +108,7 @@ class ElasticsearchSQLHook(DbApiHook):
 if conn.extra_dejson.get("timeout", False):
 conn_args["timeout"] = conn.extra_dejson["timeout"]
 
-conn = connect(**conn_args)
-
-return conn
+return connect(**conn_args)
 
 def get_uri(self) -> str:
 conn_id = getattr(self, self.conn_name_attr)
diff --git a/airflow/providers/trino/hooks/trino.py 
b/airflow/providers/trino/hooks/trino.py
index 03195fe452..798109dc3f 100644
--- a/airflow/providers/trino/hooks/trino.py
+++ b/airflow/providers/trino/hooks/trino.py
@@ -19,6 +19,7 @@ from __future__ import annotations
 
 import json
 import os
+from pathlib import Path
 fr
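
A hedged sketch of the `conn_name_attr` lookup pattern the Impala hook switches to above; the subclass itself is illustrative:

```
from airflow.providers.common.sql.hooks.sql import DbApiHook


class MyDbHook(DbApiHook):  # illustrative subclass
    conn_name_attr = "my_conn_id"
    default_conn_name = "my_default"

    def get_conn(self):
        # resolve the connection id generically rather than hard-coding
        # an attribute that only the old base class guaranteed
        conn_id: str = getattr(self, self.conn_name_attr)
        connection = self.get_connection(conn_id)
        return connection  # a real hook would build a DB-API connection here
```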

(airflow) 27/37: Add missing docker test_exceptions.py (#35674)

2023-11-22 Thread ephraimanierobi
This is an automated email from the ASF dual-hosted git repository.

ephraimanierobi pushed a commit to branch v2-8-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 7937cba0dc7e2d6db80209cfaf80aafe1d3b1a4c
Author: Riyas P K <16943217+rp...@users.noreply.github.com>
AuthorDate: Wed Nov 22 06:08:45 2023 +0530

Add missing docker test_exceptions.py (#35674)

* test case for docker exceptions

* removed docker test exception

* rewrote test case based on feedback
---
 tests/always/test_project_structure.py|  1 -
 tests/providers/docker/test_exceptions.py | 80 +++
 2 files changed, 80 insertions(+), 1 deletion(-)

diff --git a/tests/always/test_project_structure.py 
b/tests/always/test_project_structure.py
index a58f33d755..967c3ecbb5 100644
--- a/tests/always/test_project_structure.py
+++ b/tests/always/test_project_structure.py
@@ -108,7 +108,6 @@ class TestProjectStructure:
 "tests/providers/cncf/kubernetes/utils/test_xcom_sidecar.py",
 "tests/providers/daskexecutor/executors/test_dask_executor.py",
 "tests/providers/databricks/hooks/test_databricks_base.py",
-"tests/providers/docker/test_exceptions.py",
 "tests/providers/google/cloud/fs/test_gcs.py",
 "tests/providers/google/cloud/links/test_automl.py",
 "tests/providers/google/cloud/links/test_base.py",
diff --git a/tests/providers/docker/test_exceptions.py 
b/tests/providers/docker/test_exceptions.py
new file mode 100644
index 00..1f3653c39d
--- /dev/null
+++ b/tests/providers/docker/test_exceptions.py
@@ -0,0 +1,80 @@
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+"""Test for Exceptions used by Docker provider."""
+
+from __future__ import annotations
+
+from unittest import mock
+
+import pytest
+from docker import APIClient
+
+from airflow.providers.docker.exceptions import (
+DockerContainerFailedException,
+DockerContainerFailedSkipException,
+)
+from airflow.providers.docker.operators.docker import DockerOperator
+
+FAILED_MESSAGE = {"StatusCode": 1}
+FAILED_LOGS = ["unicode container log 😁   ", b"byte string container log"]
+EXPECTED_MESSAGE = f"Docker container failed: {FAILED_MESSAGE}"
+FAILED_SKIP_MESSAGE = {"StatusCode": 2}
+SKIP_ON_EXIT_CODE = 2
+EXPECTED_SKIP_MESSAGE = f"Docker container returned exit code 
{[SKIP_ON_EXIT_CODE]}. Skipping."
+
+
+@pytest.mark.parametrize(
+"failed_msg, log_line, expected_message, skip_on_exit_code",
+[
+(FAILED_MESSAGE, FAILED_LOGS, EXPECTED_MESSAGE, None),
+(FAILED_SKIP_MESSAGE, FAILED_LOGS, EXPECTED_SKIP_MESSAGE, 
SKIP_ON_EXIT_CODE),
+],
+)
+class TestDockerContainerExceptions:
+@pytest.fixture(autouse=True, scope="function")
+def setup_patchers(self, docker_api_client_patcher):
+self.client_mock = mock.MagicMock(spec=APIClient)
+self.client_mock.wait.return_value = {"StatusCode": 0}
+self.log_messages = ["container log  😁   ", b"byte string container 
log"]
+self.client_mock.attach.return_value = self.log_messages
+
+self.client_mock.logs.side_effect = (
+lambda **kwargs: iter(self.log_messages[-kwargs["tail"] :])
+if "tail" in kwargs
+else iter(self.log_messages)
+)
+
+docker_api_client_patcher.return_value = self.client_mock
+
+def test_docker_failed_exception(self, failed_msg, log_line, 
expected_message, skip_on_exit_code):
+self.client_mock.attach.return_value = log_line
+self.client_mock.wait.return_value = failed_msg
+
+operator = DockerOperator(
+image="ubuntu", owner="unittest", task_id="unittest", 
skip_on_exit_code=skip_on_exit_code
+)
+
+if skip_on_exit_code:
+with pytest.raises(DockerContainerFailedSkipException) as 
raised_exception:
+operator.execute(None)
+else:
+with pytest.raises(DockerContainerFailedException) as 
raised_exception:
+operator.execute(None)
+
+assert str(raised_exception.value) == expected_message
+assert raised_exception.value.logs == [log_line[0].strip(), 
log_line[1].decode("utf-8")]
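
The behavior under test, as a hedged usage sketch; the task id and command are placeholders, while the image matches the tests above:

```
from airflow.providers.docker.operators.docker import DockerOperator

run_container = DockerOperator(
    task_id="run_container",           # placeholder
    image="ubuntu",                    # as in the tests above
    command="bash -c 'exit 2'",        # placeholder command
    # an exit code listed here raises DockerContainerFailedSkipException and
    # the task is skipped; other non-zero codes raise DockerContainerFailedException
    skip_on_exit_code=2,
)
```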



(airflow) 32/37: Upgrade to Pydantic v2 (#35551)

2023-11-22 Thread ephraimanierobi
This is an automated email from the ASF dual-hosted git repository.

ephraimanierobi pushed a commit to branch v2-8-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit b331cb2e0e3c47e3e2322decba29cba1d6f1aecf
Author: Sin-Woo Bang 
AuthorDate: Wed Nov 22 20:56:43 2023 +0900

Upgrade to Pydantic v2 (#35551)

* Replace deprecated Config with ConfigDict

* Drop Pydantic v1 compatibility, bumping the minimum version to 2.3.0
---
 airflow/configuration.py   |  4 
 airflow/serialization/pydantic/dag.py  | 29 +-
 airflow/serialization/pydantic/dag_run.py  |  9 ++--
 airflow/serialization/pydantic/dataset.py  | 27 +---
 airflow/serialization/pydantic/job.py  |  8 ++-
 airflow/serialization/pydantic/taskinstance.py |  9 ++--
 airflow/serialization/serde.py |  9 +---
 airflow/serialization/serialized_objects.py|  5 +
 setup.cfg  |  6 +-
 9 files changed, 24 insertions(+), 82 deletions(-)

diff --git a/airflow/configuration.py b/airflow/configuration.py
index 6b03759033..1df62c9b99 100644
--- a/airflow/configuration.py
+++ b/airflow/configuration.py
@@ -62,10 +62,6 @@ if not sys.warnoptions:
 warnings.filterwarnings(action="default", category=DeprecationWarning, 
module="airflow")
 warnings.filterwarnings(action="default", 
category=PendingDeprecationWarning, module="airflow")
 
-# Temporarily suppress warnings from pydantic until we upgrade minimum 
version of pydantic to v2
-# Which should happen in Airflow 2.8.0
-warnings.filterwarnings(action="ignore", category=UserWarning, 
module=r"pydantic._internal._config")
-
 _SQLITE3_VERSION_PATTERN = re2.compile(r"(?P^\d+(?:\.\d+)*)\D?.*$")
 
 ConfigType = Union[str, int, float, bool]
diff --git a/airflow/serialization/pydantic/dag.py 
b/airflow/serialization/pydantic/dag.py
index 6631afdf73..04b2472355 100644
--- a/airflow/serialization/pydantic/dag.py
+++ b/airflow/serialization/pydantic/dag.py
@@ -21,7 +21,13 @@ from datetime import datetime, timedelta
 from typing import Any, List, Optional
 
 from dateutil import relativedelta
-from pydantic import BaseModel as BaseModelPydantic, PlainSerializer, 
PlainValidator, ValidationInfo
+from pydantic import (
+BaseModel as BaseModelPydantic,
+ConfigDict,
+PlainSerializer,
+PlainValidator,
+ValidationInfo,
+)
 from typing_extensions import Annotated
 
 from airflow import DAG, settings
@@ -86,12 +92,7 @@ class DagOwnerAttributesPydantic(BaseModelPydantic):
 owner: str
 link: str
 
-class Config:
-"""Make sure it deals automatically with SQLAlchemy ORM classes."""
-
-from_attributes = True
-orm_mode = True  # Pydantic 1.x compatibility.
-arbitrary_types_allowed = True
+model_config = ConfigDict(from_attributes=True, 
arbitrary_types_allowed=True)
 
 
 class DagTagPydantic(BaseModelPydantic):
@@ -100,12 +101,7 @@ class DagTagPydantic(BaseModelPydantic):
 name: str
 dag_id: str
 
-class Config:
-"""Make sure it deals automatically with SQLAlchemy ORM classes."""
-
-from_attributes = True
-orm_mode = True  # Pydantic 1.x compatibility.
-arbitrary_types_allowed = True
+model_config = ConfigDict(from_attributes=True, 
arbitrary_types_allowed=True)
 
 
 class DagModelPydantic(BaseModelPydantic):
@@ -141,12 +137,7 @@ class DagModelPydantic(BaseModelPydantic):
 
 _processor_dags_folder: Optional[str] = None
 
-class Config:
-"""Make sure it deals automatically with SQLAlchemy ORM classes."""
-
-from_attributes = True
-orm_mode = True  # Pydantic 1.x compatibility.
-arbitrary_types_allowed = True
+model_config = ConfigDict(from_attributes=True, 
arbitrary_types_allowed=True)
 
 @property
 def relative_fileloc(self) -> pathlib.Path:
diff --git a/airflow/serialization/pydantic/dag_run.py 
b/airflow/serialization/pydantic/dag_run.py
index aaa4372a50..cd0886ecaf 100644
--- a/airflow/serialization/pydantic/dag_run.py
+++ b/airflow/serialization/pydantic/dag_run.py
@@ -19,7 +19,7 @@ from __future__ import annotations
 from datetime import datetime
 from typing import TYPE_CHECKING, Iterable, List, Optional
 
-from pydantic import BaseModel as BaseModelPydantic
+from pydantic import BaseModel as BaseModelPydantic, ConfigDict
 
 from airflow.serialization.pydantic.dag import PydanticDag
 from airflow.serialization.pydantic.dataset import DatasetEventPydantic
@@ -56,12 +56,7 @@ class DagRunPydantic(BaseModelPydantic):
 dag: Optional[PydanticDag]
 consumed_dataset_events: List[DatasetEventPydantic]  # noqa
 
-class Config:
-"""Make sure it deals automatically w

(airflow) 29/37: Added retry strategy parameter to Amazon AWS provider Batch Operator to allow dynamic Batch retry strategies (#35789)

2023-11-22 Thread ephraimanierobi
This is an automated email from the ASF dual-hosted git repository.

ephraimanierobi pushed a commit to branch v2-8-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 0e6b5d8f8e71df326d237c2028940d5cbf3bac2c
Author: Evgeny 
AuthorDate: Wed Nov 22 04:26:42 2023 -0500

Added retry strategy parameter to Amazon AWS provider Batch Operator to 
allow dynamic Batch retry strategies (#35789)
---
 airflow/providers/amazon/aws/operators/batch.py| 5 +
 tests/providers/amazon/aws/operators/test_batch.py | 6 ++
 2 files changed, 11 insertions(+)

diff --git a/airflow/providers/amazon/aws/operators/batch.py 
b/airflow/providers/amazon/aws/operators/batch.py
index 7cf2cbfeac..e917d1d81d 100644
--- a/airflow/providers/amazon/aws/operators/batch.py
+++ b/airflow/providers/amazon/aws/operators/batch.py
@@ -112,6 +112,7 @@ class BatchOperator(BaseOperator):
 "array_properties",
 "node_overrides",
 "parameters",
+"retry_strategy",
 "waiters",
 "tags",
 "wait_for_completion",
@@ -122,6 +123,7 @@ class BatchOperator(BaseOperator):
 "container_overrides": "json",
 "parameters": "json",
 "node_overrides": "json",
+"retry_strategy": "json",
 }
 
 @property
@@ -160,6 +162,7 @@ class BatchOperator(BaseOperator):
 share_identifier: str | None = None,
 scheduling_priority_override: int | None = None,
 parameters: dict | None = None,
+retry_strategy: dict | None = None,
 job_id: str | None = None,
 waiters: Any | None = None,
 max_retries: int = 4200,
@@ -201,6 +204,7 @@ class BatchOperator(BaseOperator):
 self.scheduling_priority_override = scheduling_priority_override
 self.array_properties = array_properties
 self.parameters = parameters or {}
+self.retry_strategy = retry_strategy or {}
 self.waiters = waiters
 self.tags = tags or {}
 self.wait_for_completion = wait_for_completion
@@ -287,6 +291,7 @@ class BatchOperator(BaseOperator):
 "tags": self.tags,
 "containerOverrides": self.container_overrides,
 "nodeOverrides": self.node_overrides,
+"retryStrategy": self.retry_strategy,
 "shareIdentifier": self.share_identifier,
 "schedulingPriorityOverride": self.scheduling_priority_override,
 }
diff --git a/tests/providers/amazon/aws/operators/test_batch.py 
b/tests/providers/amazon/aws/operators/test_batch.py
index 8eb6601dfd..8a0d0e788a 100644
--- a/tests/providers/amazon/aws/operators/test_batch.py
+++ b/tests/providers/amazon/aws/operators/test_batch.py
@@ -63,6 +63,7 @@ class TestBatchOperator:
 max_retries=self.MAX_RETRIES,
 status_retries=self.STATUS_RETRIES,
 parameters=None,
+retry_strategy=None,
 container_overrides={},
 array_properties=None,
 aws_conn_id="airflow_test",
@@ -96,6 +97,7 @@ class TestBatchOperator:
 assert self.batch.hook.max_retries == self.MAX_RETRIES
 assert self.batch.hook.status_retries == self.STATUS_RETRIES
 assert self.batch.parameters == {}
+assert self.batch.retry_strategy == {}
 assert self.batch.container_overrides == {}
 assert self.batch.array_properties is None
 assert self.batch.node_overrides is None
@@ -119,6 +121,7 @@ class TestBatchOperator:
 "array_properties",
 "node_overrides",
 "parameters",
+"retry_strategy",
 "waiters",
 "tags",
 "wait_for_completion",
@@ -143,6 +146,7 @@ class TestBatchOperator:
 containerOverrides={},
 jobDefinition="hello-world",
 parameters={},
+retryStrategy={},
 tags={},
 )
 
@@ -166,6 +170,7 @@ class TestBatchOperator:
 containerOverrides={},
 jobDefinition="hello-world",
 parameters={},
+retryStrategy={},
 tags={},
 )
 
@@ -232,6 +237,7 @@ class TestBatchOperator:
 "jobName": JOB_NAME,
 "jobDefinition": "hello-world",
 "parameters": {},
+"retryStrategy": {},
 "tags": {},
 }
 if override == "overrides":



(airflow) branch v2-8-test updated (d6c7d33d43 -> 5bbc46005f)

2023-11-22 Thread ephraimanierobi
This is an automated email from the ASF dual-hosted git repository.

ephraimanierobi pushed a change to branch v2-8-test
in repository https://gitbox.apache.org/repos/asf/airflow.git


omit d6c7d33d43 Update RELEASE_NOTES.rst
omit 729c1df9eb Update version to 2.8.0
 new 1fc0633f89 Add v2-8 branches to codecov.yml and .asf.yaml (#35750)
 new 70e2f419e6 Update minor release command (#35751)
 new 6198558843 Remove backcompat with Airflow 2.3/2.4 in providers (#35727)
 new b88d0d7ce2 OpenLineage integration tried to use non-existed method in 
SnowflakeHook (#35752)
 new 25990d159c Remove usage of deprecated method from 
BigQueryToBigQueryOperator (#35605)
 new bd644fab3e feature(providers): added `OpsgenieNotifier` (#35530)
 new 38afe9ffb1 Rename --additional-extras flag to 
--additional-airflow-extras (#35760)
 new 9b832f6acd More detail on mandatory task arguments (#35740)
 new 7642b29f5c Update README.md to reflect changes we agreed to the 
versioning (#35764)
 new 5d69fc142d Add basic metrics to stats collector. (#35368)
 new d9cfdd8131 Extend task context logging support for remote logging 
using Elasticsearch (#32977)
 new 7310693277 Remove backcompat inheritance for DbApiHook (#35754)
 new 26d5e3f4e7 Improve ownership fixing for Breeze (#35759)
 new 1df306a337 Fix broken link to Weaviate docs (#35776)
 new f40e1c17ee Set mark_end_on_close after set_context (#35761)
 new c0a1dfe9ff Make passing build args explicit in ci/prod builds (#35768)
 new 4454fc870c Move `BaseOperatorLink` into the separate module (#35032)
 new b51aaf59d2 Add OpenLineage support to GCSToBigQueryOperator (#35778)
 new e2e89668d4 Implement login and logout in AWS auth manager (#35488)
 new feaeb8c5fb Check attr on parent not self re TaskContextLogger 
set_context (#35780)
 new 528d2bc51b Remove --force-build command in cache steps in CI (#35784)
 new 599189e41e Fix DataFusion example type annotations (#35753)
 new b904523c5a Fix permission check on menus (#35781)
 new 2a240e65d5 Remove pendulum as dependency of breeze (#35786)
 new 58743f28d5 Reflect drop/add support of DB Backends versions in 
documentation (#35785)
 new e0736092a8 Update emr.py (#35787)
 new 7937cba0dc Add missing docker test_exceptions.py (#35674)
 new 08785f50f1 added Topic params for schema_settings and 
message_retention_duration. (#35767)
 new 0e6b5d8f8e Added retry strategy parameter to Amazon AWS provider Batch 
Operator to allow dynamic Batch retry strategies (#35789)
 new 673f7f837f Fix for infinite recursion due to secrets_masker (#35048)
 new d568543d87 feat: K8S resource operator - CRD (#35600)
 new b331cb2e0e Upgrade to Pydantic v2 (#35551)
 new e022710039 Fix HttpOperator pagination with `str` data (#35782)
 new 840707cedf Updated docstring: `check_key_async` is now in line with 
description of `_check_key_async` (#35799)
 new 55914e14cb Add borderWidthRight to grid for Firefox scrollbar (#35346)
 new 20cb92e52a Update version to 2.8.0
 new 5bbc46005f Update RELEASE_NOTES.rst

This update added new revisions after undoing existing revisions.
That is to say, some revisions that were in the old version of the
branch are not in the new version.  This situation occurs
when a user --force pushes a change and generates a repository
containing something like this:

 * -- * -- B -- O -- O -- O   (d6c7d33d43)
\
 N -- N -- N   refs/heads/v2-8-test (5bbc46005f)

You should already have received notification emails for all of the O
revisions, and so the following emails describe only the N revisions
from the common base, B.

Any revisions marked "omit" are not gone; other references still
refer to them.  Any revisions marked "discard" are gone forever.

The 37 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.


Summary of changes:
 .asf.yaml  |   3 +
 .github/actions/build-ci-images/action.yml |   4 -
 .github/actions/build-prod-images/action.yml   |   4 -
 .github/actions/post_tests_failure/action.yml  |   3 -
 .github/actions/post_tests_success/action.yml  |   3 -
 .github/workflows/build-images.yml |   3 -
 .github/workflows/ci.yml   |  55 ---
 .github/workflows/release_dockerhub_image.yml  |   3 -
 .pre-commit-config.yaml|  23 +-
 BREEZE.rst |   2 +-
 CONTRIBUTING.rst   |   6 +-
 Dockerfile |   4 +-
 Dockerfile.ci  |   6 +-
 IMAGES.rst  

(airflow) 05/37: Remove usage of deprecated method from BigQueryToBigQueryOperator (#35605)

2023-11-22 Thread ephraimanierobi
This is an automated email from the ASF dual-hosted git repository.

ephraimanierobi pushed a commit to branch v2-8-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 25990d159c1f2350a2ee678d719251da888f27cd
Author: Maksim 
AuthorDate: Mon Nov 20 21:12:02 2023 +0100

Remove usage of deprecated method from BigQueryToBigQueryOperator (#35605)
---
 .../google/cloud/transfers/bigquery_to_bigquery.py | 88 +++-
 .../cloud/transfers/test_bigquery_to_bigquery.py   | 94 ++
 2 files changed, 127 insertions(+), 55 deletions(-)

diff --git a/airflow/providers/google/cloud/transfers/bigquery_to_bigquery.py 
b/airflow/providers/google/cloud/transfers/bigquery_to_bigquery.py
index a54ced136f..671ad33488 100644
--- a/airflow/providers/google/cloud/transfers/bigquery_to_bigquery.py
+++ b/airflow/providers/google/cloud/transfers/bigquery_to_bigquery.py
@@ -18,10 +18,8 @@
 """This module contains Google BigQuery to BigQuery operator."""
 from __future__ import annotations
 
-import warnings
 from typing import TYPE_CHECKING, Sequence
 
-from airflow.exceptions import AirflowProviderDeprecationWarning
 from airflow.models import BaseOperator
 from airflow.providers.google.cloud.hooks.bigquery import BigQueryHook
 from airflow.providers.google.cloud.links.bigquery import BigQueryTableLink
@@ -110,6 +108,58 @@ class BigQueryToBigQueryOperator(BaseOperator):
 self.encryption_configuration = encryption_configuration
 self.location = location
 self.impersonation_chain = impersonation_chain
+self.hook: BigQueryHook | None = None
+
+def _prepare_job_configuration(self):
+self.source_project_dataset_tables = (
+[self.source_project_dataset_tables]
+if not isinstance(self.source_project_dataset_tables, list)
+else self.source_project_dataset_tables
+)
+
+source_project_dataset_tables_fixup = []
+for source_project_dataset_table in self.source_project_dataset_tables:
+source_project, source_dataset, source_table = 
self.hook.split_tablename(
+table_input=source_project_dataset_table,
+default_project_id=self.hook.project_id,
+var_name="source_project_dataset_table",
+)
+source_project_dataset_tables_fixup.append(
+{"projectId": source_project, "datasetId": source_dataset, 
"tableId": source_table}
+)
+
+destination_project, destination_dataset, destination_table = 
self.hook.split_tablename(
+table_input=self.destination_project_dataset_table,
+default_project_id=self.hook.project_id,
+)
+configuration = {
+"copy": {
+"createDisposition": self.create_disposition,
+"writeDisposition": self.write_disposition,
+"sourceTables": source_project_dataset_tables_fixup,
+"destinationTable": {
+"projectId": destination_project,
+"datasetId": destination_dataset,
+"tableId": destination_table,
+},
+}
+}
+
+if self.labels:
+configuration["labels"] = self.labels
+
+if self.encryption_configuration:
+configuration["copy"]["destinationEncryptionConfiguration"] = 
self.encryption_configuration
+
+return configuration
+
+def _submit_job(
+self,
+hook: BigQueryHook,
+configuration: dict,
+) -> str:
+job = hook.insert_job(configuration=configuration, 
project_id=hook.project_id)
+return job.job_id
 
 def execute(self, context: Context) -> None:
 self.log.info(
@@ -122,24 +172,20 @@ class BigQueryToBigQueryOperator(BaseOperator):
 location=self.location,
 impersonation_chain=self.impersonation_chain,
 )
+self.hook = hook
 
-with warnings.catch_warnings():
-warnings.simplefilter("ignore", AirflowProviderDeprecationWarning)
-job_id = hook.run_copy(
-
source_project_dataset_tables=self.source_project_dataset_tables,
-
destination_project_dataset_table=self.destination_project_dataset_table,
-write_disposition=self.write_disposition,
-create_disposition=self.create_disposition,
-labels=self.labels,
-encryption_configuration=self.encryption_configuration,
-)
+if not hook.project_id:
+raise ValueError("The project_id should be set")
 
-job = hook.get_job(job_id=job_id, 
location=self.location).to_api_repr()
-conf = job["configuration"

(airflow) 04/37: OpenLineage integration tried to use non-existed method in SnowflakeHook (#35752)

2023-11-22 Thread ephraimanierobi
This is an automated email from the ASF dual-hosted git repository.

ephraimanierobi pushed a commit to branch v2-8-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit b88d0d7ce2944b5ed3c04d30fffcac37eaaa1859
Author: Andrey Anshin 
AuthorDate: Mon Nov 20 19:55:30 2023 +0400

OpenLineage integration tried to use non-existed method in SnowflakeHook 
(#35752)
---
 airflow/providers/snowflake/hooks/snowflake.py | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/airflow/providers/snowflake/hooks/snowflake.py 
b/airflow/providers/snowflake/hooks/snowflake.py
index f7f7a9a16f..081d734ac3 100644
--- a/airflow/providers/snowflake/hooks/snowflake.py
+++ b/airflow/providers/snowflake/hooks/snowflake.py
@@ -480,7 +480,7 @@ class SnowflakeHook(DbApiHook):
 from airflow.providers.openlineage.sqlparser import SQLParser
 
 connection = self.get_connection(getattr(self, self.conn_name_attr))
-namespace = 
SQLParser.create_namespace(self.get_database_info(connection))
+namespace = 
SQLParser.create_namespace(self.get_openlineage_database_info(connection))
 
 if self.query_ids:
 return OperatorLineage(



(airflow) 07/37: Rename --additional-extras flag to --additional-airflow-extras (#35760)

2023-11-22 Thread ephraimanierobi
This is an automated email from the ASF dual-hosted git repository.

ephraimanierobi pushed a commit to branch v2-8-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 38afe9ffb1e723528ebaa3f005f880aeadd8e926
Author: Jarek Potiuk 
AuthorDate: Mon Nov 20 22:38:35 2023 +0100

Rename --additional-extras flag to --additional-airflow-extras (#35760)

The previous flag was wrongly named, so it was converted to the
ADDITIONAL_EXTRAS ARG rather than ADDITIONAL_AIRFLOW_EXTRAS.
---
 BREEZE.rst |  2 +-
 IMAGES.rst | 10 +-
 .../src/airflow_breeze/commands/ci_image_commands_config.py|  2 +-
 .../commands/production_image_commands_config.py   |  2 +-
 dev/breeze/src/airflow_breeze/params/build_ci_params.py|  4 ++--
 dev/breeze/src/airflow_breeze/params/build_prod_params.py  |  2 +-
 dev/breeze/src/airflow_breeze/params/common_build_params.py|  2 +-
 dev/breeze/src/airflow_breeze/utils/common_options.py  |  2 +-
 images/breeze/output_ci-image_build.svg|  2 +-
 images/breeze/output_ci-image_build.txt|  2 +-
 images/breeze/output_prod-image_build.svg  |  2 +-
 images/breeze/output_prod-image_build.txt  |  2 +-
 12 files changed, 17 insertions(+), 17 deletions(-)

diff --git a/BREEZE.rst b/BREEZE.rst
index 04eecace44..d5bd3c73a8 100644
--- a/BREEZE.rst
+++ b/BREEZE.rst
@@ -1626,7 +1626,7 @@ but here typical examples are presented:
 
 .. code-block:: bash
 
- breeze prod-image build --additional-extras "jira"
+ breeze prod-image build --additional-airflow-extras "jira"
 
 This installs additional ``jira`` extra while installing airflow in the image.
 
diff --git a/IMAGES.rst b/IMAGES.rst
index 240952a0e3..7c69d58e70 100644
--- a/IMAGES.rst
+++ b/IMAGES.rst
@@ -97,7 +97,7 @@ By adding ``--python `` parameter 
you can build the
 image version for the chosen Python version.
 
 The images are built with default extras - different extras for CI and 
production image and you
-can change the extras via the ``--extras`` parameters and add new ones with 
``--additional-extras``.
+can change the extras via the ``--extras`` parameters and add new ones with 
``--additional-airflow-extras``.
 
 For example if you want to build Python 3.8 version of production image with
 "all" extras installed you should run this command:
@@ -110,7 +110,7 @@ If you just want to add new extras you can add them like 
that:
 
 .. code-block:: bash
 
-  breeze prod-image build --python 3.8 --additional-extras "all"
+  breeze prod-image build --python 3.8 --additional-airflow-extras "all"
 
 The command that builds the CI image is optimized to minimize the time needed 
to rebuild the image when
 the source code of Airflow evolves. This means that if you already have the 
image locally downloaded and
@@ -128,7 +128,7 @@ parameter to Breeze:
 
 .. code-block:: bash
 
-  breeze prod-image build --python 3.8 --additional-extras=trino 
--install-airflow-version=2.0.0
+  breeze prod-image build --python 3.8 --additional-airflow-extras=trino 
--install-airflow-version=2.0.0
 
 This will build the image using command similar to:
 
@@ -170,7 +170,7 @@ You can also skip installing airflow and install it from 
locally provided files
 
 .. code-block:: bash
 
-  breeze prod-image build --python 3.8 --additional-extras=trino 
--install-packages-from-context
+  breeze prod-image build --python 3.8 --additional-airflow-extras=trino 
--install-packages-from-context
 
 In this case you airflow and all packages (.whl files) should be placed in 
``docker-context-files`` folder.
 
@@ -331,7 +331,7 @@ the same image can be built using ``breeze`` (it supports 
auto-completion of the
 
 .. code-block:: bash
 
-  breeze ci-image build --python 3.8 --additional-extras=jdbc 
--additional-python-deps="pandas" \
+  breeze ci-image build --python 3.8 --additional-airflow-extras=jdbc 
--additional-python-deps="pandas" \
   --additional-dev-apt-deps="gcc g++"
 
 You can customize more aspects of the image - such as additional commands 
executed before apt dependencies
diff --git a/dev/breeze/src/airflow_breeze/commands/ci_image_commands_config.py 
b/dev/breeze/src/airflow_breeze/commands/ci_image_commands_config.py
index f256406ea4..87e6ab6e91 100644
--- a/dev/breeze/src/airflow_breeze/commands/ci_image_commands_config.py
+++ b/dev/breeze/src/airflow_breeze/commands/ci_image_commands_config.py
@@ -60,7 +60,7 @@ CI_IMAGE_TOOLS_PARAMETERS: dict[str, list[dict[str, str | 
list[str = {
 "--airflow-constraints-reference",
 "--python-image",
 "--additional-python-deps",
-"--additional-extras",
+"

(airflow) 23/37: Fix permission check on menus (#35781)

2023-11-22 Thread ephraimanierobi
This is an automated email from the ASF dual-hosted git repository.

ephraimanierobi pushed a commit to branch v2-8-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit b904523c5a2d8e7d0c359d45eef8514297b1f52c
Author: Vincent <97131062+vincb...@users.noreply.github.com>
AuthorDate: Tue Nov 21 14:44:29 2023 -0500

Fix permission check on menus (#35781)
---
 airflow/www/security_manager.py|  23 
 tests/www/test_security_manager.py | 115 +
 2 files changed, 126 insertions(+), 12 deletions(-)

diff --git a/airflow/www/security_manager.py b/airflow/www/security_manager.py
index 78d8ef00fe..13342ff7f7 100644
--- a/airflow/www/security_manager.py
+++ b/airflow/www/security_manager.py
@@ -137,15 +137,7 @@ class AirflowSecurityManagerV2(LoggingMixin):
 user = g.user
 
 is_authorized_method = 
self._get_auth_manager_is_authorized_method(resource_name)
-if is_authorized_method:
-return is_authorized_method(action_name, resource_pk, user)
-else:
-# This means the page the user is trying to access is specific to 
the auth manager used
-# Example: the user list view in FabAuthManager
-action_name = ACTION_CAN_READ if action_name == 
ACTION_CAN_ACCESS_MENU else action_name
-return get_auth_manager().is_authorized_custom_view(
-fab_action_name=action_name, fab_resource_name=resource_name, 
user=user
-)
+return is_authorized_method(action_name, resource_pk, user)
 
 def create_admin_standalone(self) -> tuple[str | None, str | None]:
 """Perform the required steps when initializing airflow for standalone 
mode.
@@ -331,7 +323,7 @@ class AirflowSecurityManagerV2(LoggingMixin):
 ),
 }
 
-def _get_auth_manager_is_authorized_method(self, fab_resource_name: str) 
-> Callable | None:
+def _get_auth_manager_is_authorized_method(self, fab_resource_name: str) 
-> Callable:
 is_authorized_method = 
self._auth_manager_is_authorized_map.get(fab_resource_name)
 if is_authorized_method:
 return is_authorized_method
@@ -340,12 +332,19 @@ class AirflowSecurityManagerV2(LoggingMixin):
 # least one dropdown child
 return self._is_authorized_category_menu(fab_resource_name)
 else:
-return None
+# This means the page the user is trying to access is specific to 
the auth manager used
+# Example: the user list view in FabAuthManager
+return lambda action, resource_pk, user: 
get_auth_manager().is_authorized_custom_view(
+fab_action_name=ACTION_CAN_READ if action == 
ACTION_CAN_ACCESS_MENU else action,
+fab_resource_name=fab_resource_name,
+user=user,
+)
 
 def _is_authorized_category_menu(self, category: str) -> Callable:
 items = {item.name for item in 
self.appbuilder.menu.find(category).childs}
 return lambda action, resource_pk, user: any(
-
self._get_auth_manager_is_authorized_method(fab_resource_name=item) for item in 
items
+
self._get_auth_manager_is_authorized_method(fab_resource_name=item)(action, 
resource_pk, user)
+for item in items
 )
 
 """
diff --git a/tests/www/test_security_manager.py 
b/tests/www/test_security_manager.py
new file mode 100644
index 00..fafb539ee3
--- /dev/null
+++ b/tests/www/test_security_manager.py
@@ -0,0 +1,115 @@
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+from __future__ import annotations
+
+from unittest import mock
+from unittest.mock import Mock
+
+import pytest
+
+from airflow.security.permissions import (
+ACTION_CAN_READ,
+RESOURCE_ADMIN_MENU,
+RESOURCE_BROWSE_MENU,
+RESOURCE_DOCS_MENU,
+RESOURCE_VARIABLE,
+)
+from airflow.www import app as application
+
+
+@pytest.fixture()
+def app():
+return application.create_app(testing=True)
+
+
+@pytest.fixture()
+def app_builder(app):
+return app.appbuilder
+
+
+@pytest.fixture()
+def security_manager(app_builder):
+return app_buil

(airflow) 02/37: Update minor release command (#35751)

2023-11-22 Thread ephraimanierobi
This is an automated email from the ASF dual-hosted git repository.

ephraimanierobi pushed a commit to branch v2-8-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 70e2f419e69987715b195a5fae7b3f73d10024f9
Author: Ephraim Anierobi 
AuthorDate: Mon Nov 20 15:10:55 2023 +0100

Update minor release command (#35751)

When creating the constraints file for the new release, pull the
constraints-main branch first.
---
 dev/breeze/src/airflow_breeze/commands/minor_release_command.py | 1 +
 1 file changed, 1 insertion(+)

diff --git a/dev/breeze/src/airflow_breeze/commands/minor_release_command.py 
b/dev/breeze/src/airflow_breeze/commands/minor_release_command.py
index 0ccb9edb25..2fcbd621ac 100644
--- a/dev/breeze/src/airflow_breeze/commands/minor_release_command.py
+++ b/dev/breeze/src/airflow_breeze/commands/minor_release_command.py
@@ -138,6 +138,7 @@ def instruction_update_version_branch(version_branch):
 def create_constraints(version_branch):
 if confirm_action("Do you want to create branches from the constraints 
main?"):
 run_command(["git", "checkout", "constraints-main"], 
dry_run_override=DRY_RUN, check=True)
+run_command(["git", "pull", "origin", "constraints-main"], 
dry_run_override=DRY_RUN, check=True)
 run_command(
 ["git", "checkout", "-b", f"constraints-{version_branch}"], 
dry_run_override=DRY_RUN, check=True
 )



(airflow) 10/37: Add basic metrics to stats collector. (#35368)

2023-11-22 Thread ephraimanierobi
This is an automated email from the ASF dual-hosted git repository.

ephraimanierobi pushed a commit to branch v2-8-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 5d69fc142d9c475940d11b1bbb56bf1a6c36c7fb
Author: Jakub Dardzinski 
AuthorDate: Tue Nov 21 06:45:04 2023 +0100

Add basic metrics to stats collector. (#35368)

Signed-off-by: Jakub Dardzinski 
Co-authored-by: Elad Kalif <45845474+elad...@users.noreply.github.com>
---
 airflow/providers/openlineage/plugins/adapter.py   |  5 +-
 .../plugins/test_openlineage_adapter.py| 75 +++---
 2 files changed, 69 insertions(+), 11 deletions(-)

diff --git a/airflow/providers/openlineage/plugins/adapter.py 
b/airflow/providers/openlineage/plugins/adapter.py
index fabb14eaa3..a925ddf8e6 100644
--- a/airflow/providers/openlineage/plugins/adapter.py
+++ b/airflow/providers/openlineage/plugins/adapter.py
@@ -38,6 +38,7 @@ from openlineage.client.run import Job, Run, RunEvent, 
RunState
 from airflow.configuration import conf
 from airflow.providers.openlineage import __version__ as 
OPENLINEAGE_PROVIDER_VERSION
 from airflow.providers.openlineage.utils.utils import OpenLineageRedactor
+from airflow.stats import Stats
 from airflow.utils.log.logging_mixin import LoggingMixin
 
 if TYPE_CHECKING:
@@ -113,8 +114,10 @@ class OpenLineageAdapter(LoggingMixin):
 self._client = self.get_or_create_openlineage_client()
 redacted_event: RunEvent = self._redacter.redact(event, max_depth=20)  
# type: ignore[assignment]
 try:
-return self._client.emit(redacted_event)
+with Stats.timer("ol.emit.attempts"):
+return self._client.emit(redacted_event)
 except Exception as e:
+Stats.incr("ol.emit.failed")
 self.log.warning("Failed to emit OpenLineage event of id %s", 
event.run.runId)
 self.log.debug("OpenLineage emission failure: %s", e)
 
diff --git a/tests/providers/openlineage/plugins/test_openlineage_adapter.py 
b/tests/providers/openlineage/plugins/test_openlineage_adapter.py
index 685e88c725..bcb92b2b9b 100644
--- a/tests/providers/openlineage/plugins/test_openlineage_adapter.py
+++ b/tests/providers/openlineage/plugins/test_openlineage_adapter.py
@@ -39,7 +39,10 @@ from tests.test_utils.config import conf_vars
 pytestmark = pytest.mark.db_test
 
 
-@patch.dict(os.environ, {"OPENLINEAGE_URL": "http://ol-api:5000", 
"OPENLINEAGE_API_KEY": "api-key"})
+@patch.dict(
+os.environ,
+{"OPENLINEAGE_URL": "http://ol-api:5000";, "OPENLINEAGE_API_KEY": 
"api-key"},
+)
 def test_create_client_from_ol_env():
 client = OpenLineageAdapter().get_or_create_openlineage_client()
 
@@ -90,7 +93,11 @@ def test_create_client_from_env_var_config():
 
 
 @patch.dict(
-os.environ, {"OPENLINEAGE_URL": "http://ol-from-env:5000";, 
"OPENLINEAGE_API_KEY": "api-key-from-env"}
+os.environ,
+{
+"OPENLINEAGE_URL": "http://ol-from-env:5000";,
+"OPENLINEAGE_API_KEY": "api-key-from-env",
+},
 )
 @patch.dict(os.environ, {"OPENLINEAGE_CONFIG": "some/config.yml"})
 def test_create_client_overrides_env_vars():
@@ -108,7 +115,9 @@ def test_create_client_overrides_env_vars():
 assert client.transport.kind == "console"
 
 
-def test_emit_start_event():
+@mock.patch("airflow.providers.openlineage.plugins.adapter.Stats.timer")
+@mock.patch("airflow.providers.openlineage.plugins.adapter.Stats.incr")
+def test_emit_start_event(mock_stats_incr, mock_stats_timer):
 client = MagicMock()
 adapter = OpenLineageAdapter(client)
 
@@ -138,7 +147,8 @@ def test_emit_start_event():
 runId=run_id,
 facets={
 "nominalTime": NominalTimeRunFacet(
-nominalStartTime="2022-01-01T00:00:00", 
nominalEndTime="2022-01-01T00:00:00"
+nominalStartTime="2022-01-01T00:00:00",
+nominalEndTime="2022-01-01T00:00:00",
 ),
 "processing_engine": ProcessingEngineRunFacet(
 version=ANY, name="Airflow", 
openlineageAdapterVersion=ANY
@@ -158,8 +168,13 @@ def test_emit_start_event():
 in client.emit.mock_calls
 )
 
+mock_stats_incr.assert_not_called()
+mock_stats_timer.assert_called_with("ol.emit.attempts")
+
 
-def test_emit_complete_event():
+@mock.patch("airflow.providers.openlineage.plugins.adapter.Stats.timer")
+@mock.patch("airflow.providers.openlineage.plugins.adapter.Stats.incr")
+def test_emit_complete_ev
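
The change above follows a common metrics idiom: the timer wraps every
emission attempt (success or failure), while the counter is bumped only on
the exception path, so successes can be derived as attempts minus failures.
A minimal sketch of the same idea outside Airflow - the toy stats client
below is a stand-in for Airflow's `Stats` interface, and all names here are
illustrative:

```python
import time
from contextlib import contextmanager


class ToyStats:
    """Toy stand-in for a statsd-style client, for illustration only."""

    def __init__(self):
        self.counters = {}
        self.timings = {}

    @contextmanager
    def timer(self, name):
        start = time.monotonic()
        try:
            yield
        finally:
            # Recorded whether the wrapped call returns or raises.
            self.timings.setdefault(name, []).append(time.monotonic() - start)

    def incr(self, name):
        self.counters[name] = self.counters.get(name, 0) + 1


stats = ToyStats()


def emit(client, event):
    try:
        with stats.timer("ol.emit.attempts"):
            return client.emit(event)
    except Exception:
        # Only the failure path bumps the counter; the real adapter also
        # logs a warning here instead of re-raising.
        stats.incr("ol.emit.failed")
```

This is also why the tests patch both `Stats.timer` and `Stats.incr` and
assert that the failure counter stays untouched on the happy path.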

(airflow) 21/37: Remove --force-build command in cache steps in CI (#35784)

2023-11-22 Thread ephraimanierobi
This is an automated email from the ASF dual-hosted git repository.

ephraimanierobi pushed a commit to branch v2-8-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 528d2bc51be862950ff86e815c33fdd78c71c175
Author: Jarek Potiuk 
AuthorDate: Tue Nov 21 19:40:02 2023 +0100

Remove --force-build command in cache steps in CI (#35784)

The `--force-build` flag has not been useful in the `ci-image build`
command for quite some time, because the sheer fact that you run the
command means that you want to run the build (so it made no sense to
have it). The flag only makes sense when you want to force a build
while running the `breeze` or `breeze start-airflow` command.

However, the commands that build the cache in CI still used this flag,
which caused the builds to fail after #35768 was merged and the option
removed (this was not detected earlier, because the cache steps only
run in the main build).
---
 .github/workflows/ci.yml | 2 --
 1 file changed, 2 deletions(-)

diff --git a/.github/workflows/ci.yml b/.github/workflows/ci.yml
index 45e0a9fca0..1e8eabc7a5 100644
--- a/.github/workflows/ci.yml
+++ b/.github/workflows/ci.yml
@@ -232,7 +232,6 @@ jobs:
   --builder airflow_cache
   --prepare-buildx-cache
   --run-in-parallel
-  --force-build
   --platform ${{ matrix.platform }}
 env:
   DEBUG_RESOURCES: ${{needs.build-info.outputs.debug-resources}}
@@ -1890,7 +1889,6 @@ jobs:
   --builder airflow_cache
   --prepare-buildx-cache
   --run-in-parallel
-  --force-build
   --platform ${{ matrix.platform }}
 env:
   DEBUG_RESOURCES: ${{needs.build-info.outputs.debug-resources}}



(airflow) 22/37: Fix DataFusion example type annotations (#35753)

2023-11-22 Thread ephraimanierobi
This is an automated email from the ASF dual-hosted git repository.

ephraimanierobi pushed a commit to branch v2-8-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 599189e41e27a3b30a2e2d3f598f9b45a96b8547
Author: Andrey Anshin 
AuthorDate: Tue Nov 21 23:44:09 2023 +0400

Fix DataFusion example type annotations (#35753)
---
 tests/system/providers/google/cloud/datafusion/example_datafusion.py | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)

diff --git 
a/tests/system/providers/google/cloud/datafusion/example_datafusion.py 
b/tests/system/providers/google/cloud/datafusion/example_datafusion.py
index 38f443a706..506323dccb 100644
--- a/tests/system/providers/google/cloud/datafusion/example_datafusion.py
+++ b/tests/system/providers/google/cloud/datafusion/example_datafusion.py
@@ -215,7 +215,7 @@ with DAG(
 # [END howto_cloud_data_fusion_update_instance_operator]
 
 @task(task_id="get_artifacts_versions")
-def get_artifacts_versions(ti) -> dict:
+def get_artifacts_versions(ti=None):
 hook = DataFusionHook()
 instance_url = ti.xcom_pull(task_ids="get_instance", 
key="return_value")["apiEndpoint"]
 artifacts = hook.get_instance_artifacts(instance_url=instance_url, 
namespace="default")
@@ -319,7 +319,7 @@ with DAG(
 # TEST BODY
 >> create_instance
 >> get_instance
->> get_artifacts_versions()  # type: ignore[call-arg]
+>> get_artifacts_versions()
 >> restart_instance
 >> update_instance
 >> create_pipeline
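
The fix relies on how the TaskFlow API injects context objects: a parameter
named after a context key (``ti``, ``dag_run``, ...) is filled in by Airflow
at execution time, and giving it a ``None`` default means type checkers see
a callable that needs no arguments, so the ``# type: ignore[call-arg]`` at
the call site can be dropped. A minimal sketch of the pattern (task and
upstream ids are illustrative):

```python
from airflow.decorators import task


@task
def read_endpoint(ti=None):
    # Airflow injects `ti` (the running TaskInstance) at execution time;
    # the None default keeps a bare `read_endpoint()` call valid statically.
    return ti.xcom_pull(task_ids="get_instance", key="return_value")["apiEndpoint"]
```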



(airflow) 25/37: Reflect drop/add support of DB Backends versions in documentation (#35785)

2023-11-22 Thread ephraimanierobi
This is an automated email from the ASF dual-hosted git repository.

ephraimanierobi pushed a commit to branch v2-8-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 58743f28d5d362c6f4eda9cc10c51eaecd9ad6ab
Author: Andrey Anshin 
AuthorDate: Wed Nov 22 01:43:16 2023 +0400

Reflect drop/add support of DB Backends versions in documentation (#35785)
---
 .../administration-and-deployment/scheduler.rst| 14 --
 docs/apache-airflow/howto/set-up-database.rst  | 10 +++---
 docs/apache-airflow/installation/prerequisites.rst | 11 +++
 3 files changed, 14 insertions(+), 21 deletions(-)

diff --git a/docs/apache-airflow/administration-and-deployment/scheduler.rst 
b/docs/apache-airflow/administration-and-deployment/scheduler.rst
index 06769d1595..a01020a686 100644
--- a/docs/apache-airflow/administration-and-deployment/scheduler.rst
+++ b/docs/apache-airflow/administration-and-deployment/scheduler.rst
@@ -118,7 +118,7 @@ This does, however, place some requirements on the Database.
 Database Requirements
 """""""""""""""""""""
 
-The short version is that users of PostgreSQL 10+ or MySQL 8+ are all ready to 
go -- you can start running as
+The short version is that users of PostgreSQL 12+ or MySQL 8.0+ are all ready 
to go -- you can start running as
 many copies of the scheduler as you like -- there is no further set up or 
config options needed. If you are
 using a different database please read on.
 
@@ -134,8 +134,8 @@ UPDATE NOWAIT`` but the exact query is slightly different).
 
 The following databases are fully supported and provide an "optimal" 
experience:
 
-- PostgreSQL 10+
-- MySQL 8+
+- PostgreSQL 12+
+- MySQL 8.0+
 
 .. warning::
 
@@ -144,15 +144,9 @@ The following databases are fully supported and provide an 
"optimal" experience:
   Without these features, running multiple schedulers is not supported and 
deadlock errors have been reported. MariaDB
   10.6.0 and following may work appropriately with multiple schedulers, but 
this has not been tested.
 
-.. warning::
-
-  MySQL 5.x does not support ``SKIP LOCKED`` or ``NOWAIT``, and additionally 
is more prone to deciding
-  queries are deadlocked, so running with more than a single scheduler on 
MySQL 5.x is not supported or
-  recommended.
-
 .. note::
 
-  Microsoft SQLServer has not been tested with HA.
+  Microsoft SQL Server has not been tested with HA.
 
 .. _fine-tuning-scheduler:
 
diff --git a/docs/apache-airflow/howto/set-up-database.rst 
b/docs/apache-airflow/howto/set-up-database.rst
index 3c8f7a9bb1..b164552814 100644
--- a/docs/apache-airflow/howto/set-up-database.rst
+++ b/docs/apache-airflow/howto/set-up-database.rst
@@ -32,8 +32,8 @@ By default, Airflow uses **SQLite**, which is intended for 
development purposes
 
 Airflow supports the following database engine versions, so make sure which 
version you have. Old versions may not support all SQL statements.
 
-* PostgreSQL: 11, 12, 13, 14, 15
-* MySQL: 5.7, 8
+* PostgreSQL: 12, 13, 14, 15, 16
+* MySQL: 8.0, `Innovation 
<https://dev.mysql.com/blog-archive/introducing-mysql-innovation-and-long-term-support-lts-versions>`_
 * MSSQL (Experimental, **Discontinued soon**): 2017, 2019
 * SQLite: 3.15.0+
 
@@ -309,7 +309,11 @@ In addition, you also should pay particular attention to 
MySQL's encoding. Altho
 
 .. note::
 
-   In strict mode, MySQL doesn't allow ``0000-00-00`` as a valid date. Then 
you might get errors like ``"Invalid default value for 'end_date'"`` in some 
cases (some Airflow tables use ``0000-00-00 00:00:00`` as timestamp field 
default value). To avoid this error, you could disable ``NO_ZERO_DATE`` mode on 
your MySQL server. Read 
https://stackoverflow.com/questions/9192027/invalid-default-value-for-create-date-timestamp-field
 for how to disable it. See `SQL Mode - NO_ZERO_DATE <https://dev [...]
+In strict mode, MySQL doesn't allow ``0000-00-00`` as a valid date. Then 
you might get errors like
+``"Invalid default value for 'end_date'"`` in some cases (some Airflow 
tables use ``0000-00-00 00:00:00`` as timestamp field default value).
+To avoid this error, you could disable ``NO_ZERO_DATE`` mode on your MySQL 
server.
+Read 
https://stackoverflow.com/questions/9192027/invalid-default-value-for-create-date-timestamp-field
 for how to disable it.
+See `SQL Mode - NO_ZERO_DATE 
<https://dev.mysql.com/doc/refman/8.0/en/sql-mode.html#sqlmode_no_zero_date>`__ 
for more information.
 
 Setting up a MsSQL Database
 ---
diff --git a/docs/apache-airflow/installation/prerequisites.rst 
b/docs/apache-airflow/installation/prerequisites.rst
index 3b8f5c84d9..f2fefbfbaa 100644
--- a/docs/apache-airflow/installation/prerequisites.rst
+++ b/docs/apache-airflow/install
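
The ``SELECT ... FOR UPDATE SKIP LOCKED`` support behind the PostgreSQL 12+
/ MySQL 8.0+ requirement is what lets several schedulers claim disjoint
rows without blocking one another. A rough sketch of the query shape in
SQLAlchemy terms - this is not Airflow's actual scheduler query, and the
session handling is illustrative:

```python
from sqlalchemy import select
from sqlalchemy.orm import Session

from airflow.models.dagrun import DagRun


def claim_queued_runs(session: Session, batch_size: int = 10):
    # Rows already locked by another scheduler are skipped rather than
    # waited on, so concurrent schedulers each lock a disjoint batch.
    stmt = (
        select(DagRun)
        .where(DagRun.state == "queued")
        .limit(batch_size)
        .with_for_update(skip_locked=True)
    )
    return session.scalars(stmt).all()
```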

(airflow) 18/37: Add OpenLineage support to GCSToBigQueryOperator (#35778)

2023-11-22 Thread ephraimanierobi
This is an automated email from the ASF dual-hosted git repository.

ephraimanierobi pushed a commit to branch v2-8-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit b51aaf59d2280e4341807ed39e3a110a26627426
Author: Kacper Muda 
AuthorDate: Tue Nov 21 18:14:36 2023 +0100

Add OpenLineage support to GCSToBigQueryOperator (#35778)
---
 .../google/cloud/transfers/gcs_to_bigquery.py  |  77 +
 .../google/cloud/transfers/test_gcs_to_bigquery.py | 380 -
 2 files changed, 455 insertions(+), 2 deletions(-)

diff --git a/airflow/providers/google/cloud/transfers/gcs_to_bigquery.py 
b/airflow/providers/google/cloud/transfers/gcs_to_bigquery.py
index 94c233d6c5..798bc8a52b 100644
--- a/airflow/providers/google/cloud/transfers/gcs_to_bigquery.py
+++ b/airflow/providers/google/cloud/transfers/gcs_to_bigquery.py
@@ -41,6 +41,7 @@ from airflow.providers.google.cloud.hooks.bigquery import 
BigQueryHook, BigQuery
 from airflow.providers.google.cloud.hooks.gcs import GCSHook
 from airflow.providers.google.cloud.links.bigquery import BigQueryTableLink
 from airflow.providers.google.cloud.triggers.bigquery import 
BigQueryInsertJobTrigger
+from airflow.utils.helpers import merge_dicts
 
 if TYPE_CHECKING:
 from google.api_core.retry import Retry
@@ -294,6 +295,8 @@ class GCSToBigQueryOperator(BaseOperator):
 self.reattach_states: set[str] = reattach_states or set()
 self.cancel_on_kill = cancel_on_kill
 
+self.source_uris: list[str] = []
+
 def _submit_job(
 self,
 hook: BigQueryHook,
@@ -731,3 +734,77 @@ class GCSToBigQueryOperator(BaseOperator):
 self.hook.cancel_job(job_id=self.job_id, location=self.location)  
# type: ignore[union-attr]
 else:
 self.log.info("Skipping to cancel job: %s.%s", self.location, 
self.job_id)
+
+def get_openlineage_facets_on_complete(self, task_instance):
+"""Implementing on_complete as we will include final BQ job id."""
+from pathlib import Path
+
+from openlineage.client.facet import (
+ExternalQueryRunFacet,
+SymlinksDatasetFacet,
+SymlinksDatasetFacetIdentifiers,
+)
+from openlineage.client.run import Dataset
+
+from airflow.providers.google.cloud.hooks.gcs import _parse_gcs_url
+from airflow.providers.google.cloud.utils.openlineage import (
+get_facets_from_bq_table,
+get_identity_column_lineage_facet,
+)
+from airflow.providers.openlineage.extractors import OperatorLineage
+
+table_object = self.hook.get_client(self.hook.project_id).get_table(
+self.destination_project_dataset_table
+)
+
+output_dataset_facets = get_facets_from_bq_table(table_object)
+
+input_dataset_facets = {
+"schema": output_dataset_facets["schema"],
+}
+input_datasets = []
+for uri in sorted(self.source_uris):
+bucket, blob = _parse_gcs_url(uri)
+additional_facets = {}
+
+if "*" in blob:
+# If wildcard ("*") is used in gcs path, we want the name of 
dataset to be directory name,
+# but we create a symlink to the full object path with 
wildcard.
+additional_facets = {
+"symlink": SymlinksDatasetFacet(
+identifiers=[
+SymlinksDatasetFacetIdentifiers(
+namespace=f"gs://{bucket}", name=blob, 
type="file"
+)
+]
+),
+}
+blob = Path(blob).parent.as_posix()
+if blob == ".":
+# blob path does not have leading slash, but we need root 
dataset name to be "/"
+blob = "/"
+
+dataset = Dataset(
+namespace=f"gs://{bucket}",
+name=blob,
+facets=merge_dicts(input_dataset_facets, additional_facets),
+)
+input_datasets.append(dataset)
+
+output_dataset_facets["columnLineage"] = 
get_identity_column_lineage_facet(
+field_names=[field.name for field in table_object.schema], 
input_datasets=input_datasets
+)
+
+output_dataset = Dataset(
+namespace="bigquery",
+name=str(table_object.reference),
+facets=output_dataset_facets,
+)
+
+run_facets = {}
+if self.job_id:
+run_facets = {
+"externalQuery": 
ExternalQueryRunFacet(externalQueryId=self.job_id, source="bigquery"),
+}
+
+return OperatorLineage(inputs=input_datasets, 
outputs=[output_d
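
The wildcard branch above reduces to a small naming rule: when the GCS blob
contains ``*``, the dataset is named after the containing directory (a
top-level wildcard maps to ``/``), and the full wildcard path survives only
as a symlink facet. That rule in isolation, as a runnable sketch with no
OpenLineage imports:

```python
from pathlib import Path


def dataset_name_for_blob(blob: str) -> str:
    """Dataset name for a GCS blob, per the wildcard rule above."""
    if "*" in blob:
        # Wildcard: name the dataset after the parent directory ...
        blob = Path(blob).parent.as_posix()
        if blob == ".":
            # ... and a top-level wildcard maps to the bucket root.
            blob = "/"
    return blob


assert dataset_name_for_blob("data/part-*.csv") == "data"
assert dataset_name_for_blob("part-*.csv") == "/"
assert dataset_name_for_blob("data/file.csv") == "data/file.csv"
```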

(airflow) 06/37: feature(providers): added `OpsgenieNotifier` (#35530)

2023-11-22 Thread ephraimanierobi
This is an automated email from the ASF dual-hosted git repository.

ephraimanierobi pushed a commit to branch v2-8-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit bd644fab3e548fa8d1cedc56bf35787772d0f14b
Author: Eric Mendes <42689328+eric-men...@users.noreply.github.com>
AuthorDate: Mon Nov 20 17:49:27 2023 -0300

feature(providers): added `OpsgenieNotifier` (#35530)



-

Co-authored-by: Utkarsh Sharma 
Co-authored-by: Andrey Anshin 
---
 .../providers/opsgenie/notifications/__init__.py   |  16 +++
 .../providers/opsgenie/notifications/opsgenie.py   |  81 ++
 airflow/providers/opsgenie/provider.yaml   |   3 +
 airflow/providers/opsgenie/typing/__init__.py  |  16 +++
 airflow/providers/opsgenie/typing/opsgenie.py  |  59 ++
 docs/apache-airflow-providers-opsgenie/index.rst   |   1 +
 .../notifications/index.rst|  30 ++
 .../notifications/opsgenie_notifier.rst|  33 ++
 tests/providers/opsgenie/notifications/__init__.py |  16 +++
 .../opsgenie/notifications/test_opsgenie.py| 119 +
 tests/providers/opsgenie/typing/__init__.py|  16 +++
 tests/providers/opsgenie/typing/test_opsgenie.py   |  29 +
 .../opsgenie/example_opsgenie_notifier.py  |  42 
 13 files changed, 461 insertions(+)

diff --git a/airflow/providers/opsgenie/notifications/__init__.py 
b/airflow/providers/opsgenie/notifications/__init__.py
new file mode 100644
index 00..13a83393a9
--- /dev/null
+++ b/airflow/providers/opsgenie/notifications/__init__.py
@@ -0,0 +1,16 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
diff --git a/airflow/providers/opsgenie/notifications/opsgenie.py 
b/airflow/providers/opsgenie/notifications/opsgenie.py
new file mode 100644
index 00..950d92939e
--- /dev/null
+++ b/airflow/providers/opsgenie/notifications/opsgenie.py
@@ -0,0 +1,81 @@
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+from __future__ import annotations
+
+from functools import cached_property
+from typing import TYPE_CHECKING, Sequence
+
+from airflow.exceptions import AirflowOptionalProviderFeatureException
+
+try:
+from airflow.notifications.basenotifier import BaseNotifier
+except ImportError:
+raise AirflowOptionalProviderFeatureException(
+"Failed to import BaseNotifier. This feature is only available in 
Airflow versions >= 2.6.0"
+)
+
+from airflow.providers.opsgenie.hooks.opsgenie import OpsgenieAlertHook
+
+if TYPE_CHECKING:
+from airflow.providers.opsgenie.typing.opsgenie import CreateAlertPayload
+from airflow.utils.context import Context
+
+
+class OpsgenieNotifier(BaseNotifier):
+"""
+This notifier allows you to post alerts to Opsgenie.
+
+Accepts a connection that has an Opsgenie API key as the connection's 
password.
+This notifier sets the domain to conn_id.host, and if not set will default
+to ``https://api.opsgenie.com``.
+
+Each Opsgenie API key can be pre-configured to a team integration.
+You can override these defaults in this notifier.
+
+.. seealso::
+For more information on how to use this notifier, take a look at the 
guide:
+:ref:`howto/notifier:OpsgenieNotifier`
+
+:param payload: The payload necessary for creating an alert.
+:param opsgenie_conn_id: Optional. T
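
Like other notifiers, this class is meant to be handed to DAG- or
task-level callbacks. A hedged usage sketch - the payload keys follow
Opsgenie's create-alert API (``message`` is the required field), and the
DAG wiring in the comment is illustrative:

```python
from airflow.providers.opsgenie.notifications.opsgenie import OpsgenieNotifier

# Other CreateAlertPayload fields (alias, priority, tags, ...) omitted.
notify_opsgenie = OpsgenieNotifier(payload={"message": "Task failed"})

# Typical wiring - fire the alert whenever a task in the DAG fails:
# DAG(..., on_failure_callback=notify_opsgenie)
```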

(airflow) 11/37: Extend task context logging support for remote logging using Elasticsearch (#32977)

2023-11-22 Thread ephraimanierobi
This is an automated email from the ASF dual-hosted git repository.

ephraimanierobi pushed a commit to branch v2-8-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit d9cfdd8131d797735b406554674282ea0f8dcbae
Author: Pankaj Koti 
AuthorDate: Tue Nov 21 11:55:03 2023 +0530

Extend task context logging support for remote logging using Elasticsearch 
(#32977)

* Extend task context logging support for remote logging using Elasticsearch

With the addition of the task context logging feature in PR #32646,
this PR extends the feature to Elasticsearch when it is set as the
remote logging store. Backward compatibility is ensured for older
versions of Airflow that do not include the feature in Airflow Core.

* update ensure_ti

-

Co-authored-by: Daniel Standish 
<15932138+dstand...@users.noreply.github.com>
---
 .../providers/elasticsearch/log/es_task_handler.py | 46 +++---
 1 file changed, 41 insertions(+), 5 deletions(-)

diff --git a/airflow/providers/elasticsearch/log/es_task_handler.py 
b/airflow/providers/elasticsearch/log/es_task_handler.py
index 79f9ad0b41..1e8c75b7e3 100644
--- a/airflow/providers/elasticsearch/log/es_task_handler.py
+++ b/airflow/providers/elasticsearch/log/es_task_handler.py
@@ -34,7 +34,7 @@ import pendulum
 from elasticsearch.exceptions import NotFoundError
 
 from airflow.configuration import conf
-from airflow.exceptions import AirflowProviderDeprecationWarning
+from airflow.exceptions import AirflowException, 
AirflowProviderDeprecationWarning
 from airflow.models.dagrun import DagRun
 from airflow.providers.elasticsearch.log.es_json_formatter import 
ElasticsearchJSONFormatter
 from airflow.providers.elasticsearch.log.es_response import 
ElasticSearchResponse, Hit
@@ -46,7 +46,8 @@ from airflow.utils.session import create_session
 if TYPE_CHECKING:
 from datetime import datetime
 
-from airflow.models.taskinstance import TaskInstance
+from airflow.models.taskinstance import TaskInstance, TaskInstanceKey
+
 
 LOG_LINE_DEFAULTS = {"exc_text": "", "stack_info": ""}
 # Elasticsearch hosted log type
@@ -84,6 +85,32 @@ def get_es_kwargs_from_config() -> dict[str, Any]:
 return kwargs_dict
 
 
+def _ensure_ti(ti: TaskInstanceKey | TaskInstance, session) -> TaskInstance:
+"""Given TI | TIKey, return a TI object.
+
+Will raise exception if no TI is found in the database.
+"""
+from airflow.models.taskinstance import TaskInstance, TaskInstanceKey
+
+if not isinstance(ti, TaskInstanceKey):
+return ti
+val = (
+session.query(TaskInstance)
+.filter(
+TaskInstance.task_id == ti.task_id,
+TaskInstance.dag_id == ti.dag_id,
+TaskInstance.run_id == ti.run_id,
+TaskInstance.map_index == ti.map_index,
+)
+.one_or_none()
+)
+if isinstance(val, TaskInstance):
+val._try_number = ti.try_number
+return val
+else:
+raise AirflowException(f"Could not find TaskInstance for {ti}")
+
+
 class ElasticsearchTaskHandler(FileTaskHandler, ExternalLoggingMixin, 
LoggingMixin):
 """
 ElasticsearchTaskHandler is a python log handler that reads logs from 
Elasticsearch.
@@ -182,8 +209,12 @@ class ElasticsearchTaskHandler(FileTaskHandler, 
ExternalLoggingMixin, LoggingMix
 
 return host
 
-def _render_log_id(self, ti: TaskInstance, try_number: int) -> str:
+def _render_log_id(self, ti: TaskInstance | TaskInstanceKey, try_number: 
int) -> str:
+from airflow.models.taskinstance import TaskInstanceKey
+
 with create_session() as session:
+if isinstance(ti, TaskInstanceKey):
+ti = _ensure_ti(ti, session)
 dag_run = ti.get_dagrun(session=session)
 if USE_PER_RUN_LOG_ID:
 log_id_template = 
dag_run.get_log_template(session=session).elasticsearch_id
@@ -377,11 +408,13 @@ class ElasticsearchTaskHandler(FileTaskHandler, 
ExternalLoggingMixin, LoggingMix
 setattr(record, self.offset_field, int(time.time() * (10**9)))
 self.handler.emit(record)
 
-def set_context(self, ti: TaskInstance, **kwargs) -> None:
+def set_context(self, ti: TaskInstance, *, identifier: str | None = None) 
-> None:
 """
 Provide task_instance context to airflow task handler.
 
 :param ti: task instance object
+:param identifier: if set, identifies the Airflow component which is 
relaying logs from
+exceptional scenarios related to the task instance
 """
 is_trigger_log_context = getattr(ti, "is_trigger_log_context", None)
 is_ti_raw = getattr(ti, "raw", None)
@@ -410,7 +443,10 @@ class ElasticsearchTaskHandl

(airflow) 17/37: Move `BaseOperatorLink` into the separate module (#35032)

2023-11-22 Thread ephraimanierobi
This is an automated email from the ASF dual-hosted git repository.

ephraimanierobi pushed a commit to branch v2-8-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 4454fc870cda6924cdac100c3ca8e81d53716d40
Author: Andrey Anshin 
AuthorDate: Tue Nov 21 21:10:40 2023 +0400

Move `BaseOperatorLink` into the separate module (#35032)

* Move `BaseOperatorLink` into the separate module

* Add `airflow.models.baseoperatorlink` as part of Public Interface of 
Airflow

* Ban `airflow.models.baseoperator.BaseOperatorLink` usage in codebase

* Return back check-base-operator-usage pre-commit hooks
---
 .pre-commit-config.yaml| 23 +++--
 STATIC_CODE_CHECKS.rst |  3 +-
 airflow/models/__init__.py |  5 +-
 airflow/models/abstractoperator.py |  3 +-
 airflow/models/baseoperator.py | 49 +--
 airflow/models/baseoperatorlink.py | 57 ++
 airflow/models/mappedoperator.py   |  3 +-
 airflow/operators/trigger_dagrun.py|  3 +-
 airflow/sensors/external_task.py   |  2 +-
 airflow/serialization/serialized_objects.py|  2 +-
 docs/apache-airflow/howto/define-extra-link.rst|  9 ++--
 docs/apache-airflow/public-airflow-interface.rst   |  7 +++
 docs/conf.py   |  1 +
 pyproject.toml |  1 +
 .../endpoints/test_extra_link_endpoint.py  |  2 +-
 .../endpoints/test_plugin_endpoint.py  |  2 +-
 tests/api_connexion/schemas/test_plugin_schema.py  |  2 +-
 tests/serialization/test_dag_serialization.py  |  3 +-
 tests/test_utils/mock_operators.py |  3 +-
 tests/www/views/test_views_extra_links.py  |  3 +-
 20 files changed, 135 insertions(+), 48 deletions(-)

diff --git a/.pre-commit-config.yaml b/.pre-commit-config.yaml
index 606f56947e..f7e886afd1 100644
--- a/.pre-commit-config.yaml
+++ b/.pre-commit-config.yaml
@@ -576,9 +576,24 @@ repos:
 files: 
^airflow/models/dag\.py$|^airflow/(?:decorators|utils)/task_group\.py$
   - id: check-base-operator-usage
 language: pygrep
-name: Check BaseOperator[Link] core imports
-description: Make sure BaseOperator[Link] is imported from 
airflow.models.baseoperator in core
-entry: "from airflow\\.models import.* BaseOperator"
+name: Check BaseOperator core imports
+description: Make sure BaseOperator is imported from 
airflow.models.baseoperator in core
+entry: "from airflow\\.models import.* BaseOperator\\b"
+files: \.py$
+pass_filenames: true
+exclude: >
+  (?x)
+  ^airflow/decorators/.*$|
+  ^airflow/hooks/.*$|
+  ^airflow/operators/.*$|
+  ^airflow/providers/.*$|
+  ^airflow/sensors/.*$|
+  ^dev/provider_packages/.*$
+  - id: check-base-operator-usage
+language: pygrep
+name: Check BaseOperatorLink core imports
+description: Make sure BaseOperatorLink is imported from 
airflow.models.baseoperatorlink in core
+entry: "from airflow\\.models import.* BaseOperatorLink"
 files: \.py$
 pass_filenames: true
 exclude: >
@@ -593,7 +608,7 @@ repos:
 language: pygrep
 name: Check BaseOperator[Link] other imports
 description: Make sure BaseOperator[Link] is imported from 
airflow.models outside of core
-entry: "from airflow\\.models\\.baseoperator import.* BaseOperator"
+entry: "from airflow\\.models\\.baseoperator(link)? import.* 
BaseOperator"
 pass_filenames: true
 files: >
   (?x)
diff --git a/STATIC_CODE_CHECKS.rst b/STATIC_CODE_CHECKS.rst
index da606c9b40..56533885de 100644
--- a/STATIC_CODE_CHECKS.rst
+++ b/STATIC_CODE_CHECKS.rst
@@ -154,7 +154,8 @@ require Breeze Docker image to be built locally.
 
+---+--+-+
 | check-base-operator-partial-arguments | Check 
BaseOperator and partial() arguments   | |
 
+---+--+-+
-| check-base-operator-usage | * Check 
BaseOperator[Link] core imports  | |
+| check-base-operator-usage | * Check 
BaseOperator core imports| |
+|   | * Check 
BaseOperatorLink core imports| |
 |   
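
The split into two pygrep hooks hinges on the ``\b`` word boundary:
``BaseOperator\b`` matches the plain import without also flagging
``BaseOperatorLink``, which now has its own dedicated hook. A quick check
of that behaviour with Python's ``re``, using the entry patterns from the
hooks above:

```python
import re

base_op = re.compile(r"from airflow\.models import.* BaseOperator\b")
base_op_link = re.compile(r"from airflow\.models import.* BaseOperatorLink")

assert base_op.search("from airflow.models import BaseOperator")
# \b rejects the boundary between "...Operator" and "Link" ...
assert not base_op.search("from airflow.models import BaseOperatorLink")
# ... so the Link import is caught only by the second hook.
assert base_op_link.search("from airflow.models import BaseOperatorLink")
```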

(airflow) 01/37: Add v2-8 branches to codecov.yml and .asf.yaml (#35750)

2023-11-22 Thread ephraimanierobi
This is an automated email from the ASF dual-hosted git repository.

ephraimanierobi pushed a commit to branch v2-8-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 1fc0633f894ed8902b8a73af21421eeb380af839
Author: Ephraim Anierobi 
AuthorDate: Mon Nov 20 15:10:37 2023 +0100

Add v2-8 branches to codecov.yml and .asf.yaml (#35750)
---
 .asf.yaml   | 3 +++
 codecov.yml | 2 ++
 2 files changed, 5 insertions(+)

diff --git a/.asf.yaml b/.asf.yaml
index 6166687a5d..094299d2c4 100644
--- a/.asf.yaml
+++ b/.asf.yaml
@@ -71,6 +71,9 @@ github:
 v2-7-stable:
   required_pull_request_reviews:
 required_approving_review_count: 1
+v2-8-stable:
+  required_pull_request_reviews:
+required_approving_review_count: 1
 
   collaborators:
 - mhenc
diff --git a/codecov.yml b/codecov.yml
index 67ea777302..d1ed5fb446 100644
--- a/codecov.yml
+++ b/codecov.yml
@@ -55,6 +55,8 @@ coverage:
   - v2-6-test
   - v2-7-stable
   - v2-7-test
+  - v2-8-stable
+  - v2-8-test
 if_not_found: success
 if_ci_failed: error
 informational: true



(airflow) 24/37: Remove pendulum as dependency of breeze (#35786)

2023-11-22 Thread ephraimanierobi
This is an automated email from the ASF dual-hosted git repository.

ephraimanierobi pushed a commit to branch v2-8-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 2a240e65d5a9da93eb1008e8e0a9ef93eb216190
Author: Jarek Potiuk 
AuthorDate: Tue Nov 21 22:29:37 2023 +0100

Remove pendulum as dependency of breeze (#35786)

For some reason (likely because we imported some things from setup.py
in the old days) pendulum was added as a dependency of breeze - which
still caused a problem when `pipx` decided to use Python 3.12 to
install Breeze (despite #35652, which was supposed to supersede #35620).

#35620, which made it necessary to additionally specify the Python
version when installing breeze, added its own complexity (which
Python?). It turns out that breeze no longer needs pendulum installed
at all: we stopped depending on airflow being installed and stopped
importing things from setup.py or __version__, in favour of directly
parsing the __version__ variable from the Python code.

This PR removes pendulum entirely as a Breeze dependency.
---
 dev/breeze/README.md  | 2 +-
 dev/breeze/pyproject.toml | 1 -
 2 files changed, 1 insertion(+), 2 deletions(-)

diff --git a/dev/breeze/README.md b/dev/breeze/README.md
index b2d7130210..9db81c1fd4 100644
--- a/dev/breeze/README.md
+++ b/dev/breeze/README.md
@@ -66,6 +66,6 @@ PLEASE DO NOT MODIFY THE HASH BELOW! IT IS AUTOMATICALLY 
UPDATED BY PRE-COMMIT.
 
 
-
 
-Package config hash: 
a5878ba073fa5924f21660531f0988f287269f0d3aca741095cad62b3a1f3ccb262f76df203aff1f02cfec691f839da02bc6844342f49e40f896a1c9b3c450d8
+Package config hash: 
c7d80ab49c6dc4bf2b54957663b0126ab9c8f48df28a34c0eb56340540cb1f52d063ef99ee5f9cacbd375b1a711278884f9ef9aab41e620fa70fffd81f7ece3c
 
 
-
diff --git a/dev/breeze/pyproject.toml b/dev/breeze/pyproject.toml
index 0f0e47a517..aca59140ac 100644
--- a/dev/breeze/pyproject.toml
+++ b/dev/breeze/pyproject.toml
@@ -54,7 +54,6 @@ dependencies = [
 "jinja2>=3.1.0",
 "jsonschema>=4.19.1",
 "packaging>=23.2",
-"pendulum>=2.1.2,<3",
 "pre-commit>=3.5.0",
 "psutil>=5.9.6",
 "pygithub>=2.1.1",



(airflow) 13/37: Improve ownership fixing for Breeze (#35759)

2023-11-22 Thread ephraimanierobi
This is an automated email from the ASF dual-hosted git repository.

ephraimanierobi pushed a commit to branch v2-8-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 26d5e3f4e72c349b9f041fbb11f2e4791e6da1db
Author: Jarek Potiuk 
AuthorDate: Tue Nov 21 10:30:50 2023 +0100

Improve ownership fixing for Breeze (#35759)

When we run airflow package preparation and asset compilation, we use
docker containers - similarly to the breeze CI image, those containers
run as the root user inside the container. This means that the files
created by those containers are owned by root on Linux - which means
we should change the ownership of these files after they have been
generated.

Used the opportunity to rewrite the "fix_ownership" part in Python
and remove the bash scripts used for it so far.
---
 .github/actions/build-ci-images/action.yml |  4 --
 .github/actions/build-prod-images/action.yml   |  4 --
 .github/actions/post_tests_failure/action.yml  |  3 -
 .github/actions/post_tests_success/action.yml  |  3 -
 .github/workflows/build-images.yml |  3 -
 .github/workflows/ci.yml   | 53 --
 .github/workflows/release_dockerhub_image.yml  |  3 -
 Dockerfile.ci  |  2 -
 ...-root-ownership-after-exiting-docker-command.md | 60 
 .../airflow_breeze/commands/developer_commands.py  |  7 ++
 .../commands/release_management_commands.py| 21 ++
 .../airflow_breeze/commands/testing_commands.py|  5 ++
 .../airflow_breeze/utils/docker_command_utils.py   | 17 +++--
 dev/breeze/src/airflow_breeze/utils/parallel.py|  6 ++
 dev/breeze/src/airflow_breeze/utils/run_utils.py   |  2 +-
 scripts/docker/entrypoint_ci.sh|  2 -
 scripts/in_container/_in_container_utils.sh| 40 ---
 scripts/in_container/run_fix_ownership.py  | 84 ++
 scripts/in_container/run_fix_ownership.sh  | 21 --
 .../in_container/run_prepare_airflow_packages.py   | 12 ++--
 20 files changed, 201 insertions(+), 151 deletions(-)

diff --git a/.github/actions/build-ci-images/action.yml 
b/.github/actions/build-ci-images/action.yml
index d8c0216bb4..f950c53b4e 100644
--- a/.github/actions/build-ci-images/action.yml
+++ b/.github/actions/build-ci-images/action.yml
@@ -45,7 +45,3 @@ runs:
 name: source-constraints
 path: ./files/constraints-*/constraints-*.txt
 retention-days: 7
-- name: "Fix ownership"
-  shell: bash
-  run: breeze ci fix-ownership
-  if: always()
diff --git a/.github/actions/build-prod-images/action.yml 
b/.github/actions/build-prod-images/action.yml
index 7572153169..5fdbb795c4 100644
--- a/.github/actions/build-prod-images/action.yml
+++ b/.github/actions/build-prod-images/action.yml
@@ -72,7 +72,3 @@ runs:
   env:
 COMMIT_SHA: ${{ github.sha }}
   if: ${{ inputs.build-provider-packages != 'true' }}
-- name: "Fix ownership"
-  shell: bash
-  run: breeze ci fix-ownership
-  if: always()
diff --git a/.github/actions/post_tests_failure/action.yml 
b/.github/actions/post_tests_failure/action.yml
index 96e43bbe0e..5b51db97ed 100644
--- a/.github/actions/post_tests_failure/action.yml
+++ b/.github/actions/post_tests_failure/action.yml
@@ -39,6 +39,3 @@ runs:
 name: container-logs-${{env.JOB_ID}}
 path: "./files/other_logs*"
 retention-days: 7
-- name: "Fix ownership"
-  shell: bash
-  run: breeze ci fix-ownership
diff --git a/.github/actions/post_tests_success/action.yml 
b/.github/actions/post_tests_success/action.yml
index 1325c5dde3..ac1e4fb8d2 100644
--- a/.github/actions/post_tests_success/action.yml
+++ b/.github/actions/post_tests_success/action.yml
@@ -40,6 +40,3 @@ runs:
 name: coverage-${{env.JOB_ID}}
 flags: 
python-${{env.PYTHON_MAJOR_MINOR_VERSION}},${{env.BACKEND}}-${{env.BACKEND_VERSION}}
 directory: "./files/coverage-reposts/"
-- name: "Fix ownership"
-  shell: bash
-  run: breeze ci fix-ownership
diff --git a/.github/workflows/build-images.yml 
b/.github/workflows/build-images.yml
index 82498ebde0..b29d49de17 100644
--- a/.github/workflows/build-images.yml
+++ b/.github/workflows/build-images.yml
@@ -362,6 +362,3 @@ jobs:
   - name: "Stop ARM instance"
 run: ./scripts/ci/images/ci_stop_arm_instance.sh
 if: always()
-  - name: "Fix ownership"
-run: breeze ci fix-ownership
-if: always()
diff --git a/.github/workflows/ci.yml b/.github/workflows/ci.yml
index da4dbfde53..45e0a9fca0 100644
--- a/.github/workflows/ci.yml
+++ b/.github/workflows/ci.yml
@@ -260,11 +260,6 @@ jobs:
 if: >
   matrix.platform == 'linux/amd64' && 
needs.build-info.outputs.canary-run == &
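
The bash-to-Python rewrite boils down to one idea: after a root-running
container exits, everything it created under the source tree is chowned
back to the host user. A minimal sketch of that idea - this is not the
actual `run_fix_ownership.py`, and the path and env var names are
illustrative:

```python
import os
from pathlib import Path


def fix_ownership(root: Path, uid: int, gid: int) -> None:
    """Hand root-owned files created by a container back to the host user."""
    for path in root.rglob("*"):
        try:
            st = path.lstat()
        except OSError:
            continue  # file vanished or is unreadable; skip it
        if st.st_uid == 0:
            os.chown(path, uid, gid, follow_symlinks=False)


# The host uid/gid are typically passed into the container as env vars:
# fix_ownership(Path("."), int(os.environ["HOST_USER_ID"]),
#               int(os.environ["HOST_GROUP_ID"]))
```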

(airflow) 26/37: Update emr.py (#35787)

2023-11-22 Thread ephraimanierobi
This is an automated email from the ASF dual-hosted git repository.

ephraimanierobi pushed a commit to branch v2-8-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit e0736092a89b1b8c46d389554c5e6abadcf8de21
Author: Tony Zhang <45370652+t...@users.noreply.github.com>
AuthorDate: Tue Nov 21 15:24:49 2023 -0800

Update emr.py (#35787)
---
 airflow/providers/amazon/aws/operators/emr.py | 18 +++---
 1 file changed, 11 insertions(+), 7 deletions(-)

diff --git a/airflow/providers/amazon/aws/operators/emr.py 
b/airflow/providers/amazon/aws/operators/emr.py
index 1067464474..1f0b247b75 100644
--- a/airflow/providers/amazon/aws/operators/emr.py
+++ b/airflow/providers/amazon/aws/operators/emr.py
@@ -1270,14 +1270,18 @@ class EmrServerlessStartJobOperator(BaseOperator):
 )
 self.log.info("Starting job on Application: %s", self.application_id)
 self.name = self.name or self.config.pop("name", 
f"emr_serverless_job_airflow_{uuid4()}")
-response = self.hook.conn.start_job_run(
-clientToken=self.client_request_token,
-applicationId=self.application_id,
-executionRoleArn=self.execution_role_arn,
-jobDriver=self.job_driver,
-configurationOverrides=self.configuration_overrides,
-name=self.name,
+args = {
+"clientToken": self.client_request_token,
+"applicationId": self.application_id,
+"executionRoleArn": self.execution_role_arn,
+"jobDriver": self.job_driver,
+"name": self.name,
 **self.config,
+}
+if self.configuration_overrides is not None:
+args["configurationOverrides"] = self.configuration_overrides
+response = self.hook.conn.start_job_run(
+**args,
 )
 
 if response["ResponseMetadata"]["HTTPStatusCode"] != 200:



(airflow) 37/37: Update RELEASE_NOTES.rst

2023-11-22 Thread ephraimanierobi
This is an automated email from the ASF dual-hosted git repository.

ephraimanierobi pushed a commit to branch v2-8-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 5bbc46005f20c20380d18907899b0ddff33eecde
Author: Ephraim Anierobi 
AuthorDate: Mon Nov 20 22:51:30 2023 +0100

Update RELEASE_NOTES.rst
---
 RELEASE_NOTES.rst   | 169 +++-
 newsfragments/35460.significant.rst |  10 ---
 2 files changed, 168 insertions(+), 11 deletions(-)

diff --git a/RELEASE_NOTES.rst b/RELEASE_NOTES.rst
index 62183c8b58..7a5883154f 100644
--- a/RELEASE_NOTES.rst
+++ b/RELEASE_NOTES.rst
@@ -21,7 +21,174 @@
 
 .. towncrier release notes start
 
-Airflow 2.7.3 (2023-11-04)
+Airflow 2.8.0 (2023-12-14)
+--
+
+Significant Changes
+^^^
+
+- Raw HTML code in DAG docs and DAG params descriptions is disabled by default
+
+  To ensure that no malicious JavaScript can be injected by DAG authors via DAG 
descriptions or trigger UI forms,
+  a new parameter ``webserver.allow_raw_html_descriptions`` was added with a 
default value of ``False``.
+  If you trust your DAG authors' code and want to allow raw HTML in DAG 
descriptions and params, you can restore the previous
+  behavior by setting the configuration value to ``True``.
+
+  To ensure Airflow is secure by default, the raw HTML support in the trigger UI 
has been superseded by markdown support via
+  the ``description_md`` attribute. If you have been using 
``description_html`` please migrate to ``description_md``.
+  The ``custom_html_form`` is now deprecated. (#35460)
+
+New Features
+""""""""""""
+- AIP-58: Add Airflow ObjectStore (AFS) (`AIP-58 
<https://github.com/apache/airflow/pulls?q=is%3Apr+is%3Amerged+label%3AAIP-58+milestone%3A%22Airflow+2.8.0%22>`_)
+- Add "literal" wrapper to disable field templating (#35017)
+- Add task context logging feature to allow forwarding messages to task logs 
(#32646, #32693)
+- Add Listener hooks for Datasets (#34418)
+- Allow override of navbar text color (#35505)
+- Add lightweight serialization for deltalake tables (#35462)
+- Add support for serialization of iceberg tables (#35456)
+- ``prev_end_date_success`` method access (#34528)
+- Add task parameter to set custom logger name (#34964)
+- Add pyspark decorator (#35247)
+- Add trigger as a valid option for the db clean command (#34908)
+- Add decorators for external and venv python branching operators (#35043)
+- Allow PythonVenvOperator using other index url (#33017)
+- Add Python Virtualenv Operator Caching (#33355)
+- Introduce a generic export for containerized executor logging (#34903)
+- Add ability to clear downstream tis in ``List Task Instances`` view  (#34529)
+- Attribute ``clear_number`` to track DAG run being cleared (#34126)
+- Add BranchPythonVirtualenvOperator (#33356)
+- Allow PythonVenvOperator using other index url (#33017)
+- Add CLI notification commands to providers (#33116)
+
+Improvements
+""""""""""""
+- Move ``BaseOperatorLink`` into the separate module (#35032)
+- Set mark_end_on_close after set_context (#35761)
+- Move external logs links to top of react logs page (#35668)
+- Change terminal mode to ``cbreak`` in ``execute_interactive`` and handle 
``SIGINT`` (#35602)
+- Make raw HTML descriptions configurable (#35460)
+- Allow email field to be templated (#35546)
+- Hide logical date and run id in trigger UI form (#35284)
+- Improved instructions for adding dependencies in TaskFlow (#35406)
+- Add optional exit code to list import errors (#35378)
+- Limit query result on DB rather than client in ``synchronize_log_template`` 
function (#35366)
+- Feature: Allow description to be passed in when using variables CLI (#34791)
+- Allow optional defaults in required fields with manual triggered dags 
(#31301)
+- Permitting airflow kerberos to run in different modes (#35146)
+- Refactor commands to unify daemon context handling (#34945)
+- Add extra fields to plugins endpoint (#34913)
+- Add description to pools view (#34862)
+- Move cli's Connection export and Variable export command print logic to a 
separate function (#34647)
+- Extract and reuse get_kerberos_principle func from get_kerberos_principle 
(#34936)
+- Change type annotation for ``BaseOperatorLink.operators`` (#35003)
+- Optimise and migrate to ``SA2-compatible`` syntax for TaskReschedule (#33720)
+- Consolidate the permissions name in SlaMissModelView (#34949)
+- Add debug log saying what's being run to ``EventScheduler`` (#34808)
+- Increase log reader stream loop sleep duration to 1 second (#34789)
+- Resolve pydantic deprecation warnings re ``update_forward_refs`` (#34657)
+- Unify mapped task group lookup logic (#34637)
+- Allow filtering event logs by attributes (#34417)
+- Make connection login and password TEXT (#32815)
+- Ban i

(airflow) 34/37: Updated docstring: `check_key_async` is now in line with description of `_check_key_async` (#35799)

2023-11-22 Thread ephraimanierobi
This is an automated email from the ASF dual-hosted git repository.

ephraimanierobi pushed a commit to branch v2-8-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 840707cedf75a65b297645e671f467148b704f10
Author: Aman Gupta 
AuthorDate: Wed Nov 22 22:24:29 2023 +0530

Updated docstring: `check_key_async` is now in line with description of 
`_check_key_async` (#35799)
---
 airflow/providers/amazon/aws/hooks/s3.py | 6 +-
 1 file changed, 5 insertions(+), 1 deletion(-)

diff --git a/airflow/providers/amazon/aws/hooks/s3.py 
b/airflow/providers/amazon/aws/hooks/s3.py
index ef54dc0099..ad25c5b5c5 100644
--- a/airflow/providers/amazon/aws/hooks/s3.py
+++ b/airflow/providers/amazon/aws/hooks/s3.py
@@ -518,7 +518,11 @@ class S3Hook(AwsBaseHook):
 wildcard_match: bool,
 ) -> bool:
 """
-Check for all keys in bucket and returns boolean value.
+Get a list of files matching a wildcard expression, or get the head object.
+
+If wildcard_match is True, asynchronously check whether a key matching the
+wildcard expression exists in the bucket and return the boolean value.
+If wildcard_match is False, get the head object from the bucket and
+return the boolean value.
value.
 
 :param client: aiobotocore client
 :param bucket: the name of the bucket



(airflow) 31/37: feat: K8S resource operator - CRD (#35600)

2023-11-22 Thread ephraimanierobi
This is an automated email from the ASF dual-hosted git repository.

ephraimanierobi pushed a commit to branch v2-8-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit d568543d879d4a56c675dd1d510e14143f201c5e
Author: raphaelauv 
AuthorDate: Wed Nov 22 11:23:46 2023 +0100

feat: K8S resource operator - CRD (#35600)

* feat: K8S resource operator - CRD

* clean

* tests

* remove sensor ( for another PR )

* clean

* test on k8s_resource_iterator
---
 .../cncf/kubernetes/operators/resource.py  | 62 +
 .../cncf/kubernetes/utils/k8s_resource_iterator.py | 46 +++
 .../cncf/kubernetes/operators/test_resource.py | 63 +
 .../kubernetes/utils/test_k8s_resource_iterator.py | 65 ++
 4 files changed, 225 insertions(+), 11 deletions(-)

diff --git a/airflow/providers/cncf/kubernetes/operators/resource.py 
b/airflow/providers/cncf/kubernetes/operators/resource.py
index 598731b639..569b5861a6 100644
--- a/airflow/providers/cncf/kubernetes/operators/resource.py
+++ b/airflow/providers/cncf/kubernetes/operators/resource.py
@@ -27,9 +27,10 @@ from kubernetes.utils import create_from_yaml
 from airflow.models import BaseOperator
 from airflow.providers.cncf.kubernetes.hooks.kubernetes import KubernetesHook
 from airflow.providers.cncf.kubernetes.utils.delete_from import 
delete_from_yaml
+from airflow.providers.cncf.kubernetes.utils.k8s_resource_iterator import 
k8s_resource_iterator
 
 if TYPE_CHECKING:
-from kubernetes.client import ApiClient
+from kubernetes.client import ApiClient, CustomObjectsApi
 
 __all__ = ["KubernetesCreateResourceOperator", 
"KubernetesDeleteResourceOperator"]
 
@@ -56,17 +57,23 @@ class KubernetesResourceBaseOperator(BaseOperator):
 yaml_conf: str,
 namespace: str | None = None,
 kubernetes_conn_id: str | None = KubernetesHook.default_conn_name,
+custom_resource_definition: bool = False,
 **kwargs,
 ) -> None:
 super().__init__(**kwargs)
 self._namespace = namespace
 self.kubernetes_conn_id = kubernetes_conn_id
 self.yaml_conf = yaml_conf
+self.custom_resource_definition = custom_resource_definition
 
 @cached_property
 def client(self) -> ApiClient:
 return self.hook.api_client
 
+@cached_property
+def custom_object_client(self) -> CustomObjectsApi:
+return self.hook.custom_object_client
+
 @cached_property
 def hook(self) -> KubernetesHook:
 hook = KubernetesHook(conn_id=self.kubernetes_conn_id)
@@ -78,24 +85,57 @@ class KubernetesResourceBaseOperator(BaseOperator):
 else:
 return self.hook.get_namespace() or "default"
 
+def get_crd_fields(self, body: dict) -> tuple[str, str, str, str]:
+api_version = body["apiVersion"]
+group = api_version[0 : api_version.find("/")]
+version = api_version[api_version.find("/") + 1 :]
+
+namespace = None
+if body.get("metadata"):
+metadata: dict = body.get("metadata", None)
+namespace = metadata.get("namespace", None)
+if namespace is None:
+namespace = self.get_namespace()
+
+plural = body["kind"].lower() + "s"
+
+return group, version, namespace, plural
+
 
 class KubernetesCreateResourceOperator(KubernetesResourceBaseOperator):
 """Create a resource in a kubernetes."""
 
+def create_custom_from_yaml_object(self, body: dict):
+group, version, namespace, plural = self.get_crd_fields(body)
+self.custom_object_client.create_namespaced_custom_object(group, 
version, namespace, plural, body)
+
 def execute(self, context) -> None:
-create_from_yaml(
-k8s_client=self.client,
-yaml_objects=yaml.safe_load_all(self.yaml_conf),
-namespace=self.get_namespace(),
-)
+resources = yaml.safe_load_all(self.yaml_conf)
+if not self.custom_resource_definition:
+create_from_yaml(
+k8s_client=self.client,
+yaml_objects=resources,
+namespace=self.get_namespace(),
+)
+else:
+k8s_resource_iterator(self.create_custom_from_yaml_object, 
resources)
 
 
 class KubernetesDeleteResourceOperator(KubernetesResourceBaseOperator):
 """Delete a resource in a kubernetes."""
 
+def delete_custom_from_yaml_object(self, body: dict):
+name = body["metadata"]["name"]
+group, version, namespace, plural = self.get_crd_fields(body)
+self.custom_object_client.delete_namespaced_custom_object(group, 
version, namespace, plural, name)
+
 def execute(self, context) -
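
``get_crd_fields`` derives everything the ``CustomObjectsApi`` calls need
from the manifest itself: ``apiVersion`` splits into group and version, the
namespace falls back to the hook default, and the plural is simply the
lower-cased ``kind`` plus ``s`` - the same naive pluralisation the operator
assumes. The parsing in isolation, as a runnable sketch:

```python
def crd_fields(body: dict, default_namespace: str = "default"):
    group, _, version = body["apiVersion"].partition("/")
    namespace = body.get("metadata", {}).get("namespace") or default_namespace
    plural = body["kind"].lower() + "s"  # naive pluralisation, as in the operator
    return group, version, namespace, plural


body = {
    "apiVersion": "sparkoperator.k8s.io/v1beta2",
    "kind": "SparkApplication",
    "metadata": {"name": "spark-pi"},
}
assert crd_fields(body) == (
    "sparkoperator.k8s.io", "v1beta2", "default", "sparkapplications"
)
```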

(airflow) 36/37: Update version to 2.8.0

2023-11-22 Thread ephraimanierobi
This is an automated email from the ASF dual-hosted git repository.

ephraimanierobi pushed a commit to branch v2-8-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 20cb92e52a9762da3eb1e64cf9c3791b9dadfa38
Author: Ephraim Anierobi 
AuthorDate: Mon Nov 20 22:48:45 2023 +0100

Update version to 2.8.0
---
 README.md  | 18 +-
 airflow/__init__.py|  2 +-
 airflow/api_connexion/openapi/v1.yaml  |  2 +-
 .../logging-monitoring/logging-tasks.rst   |  2 +-
 .../apache-airflow/installation/supported-versions.rst |  2 +-
 docs/docker-stack/README.md| 10 +-
 .../docker-examples/customizing/own-requirements.sh|  2 +-
 .../extending/add-airflow-configuration/Dockerfile |  2 +-
 .../extending/add-apt-packages/Dockerfile  |  2 +-
 .../extending/add-build-essential-extend/Dockerfile|  2 +-
 .../docker-examples/extending/add-providers/Dockerfile |  2 +-
 .../extending/add-pypi-packages/Dockerfile |  2 +-
 .../extending/add-requirement-packages/Dockerfile  |  2 +-
 .../extending/custom-providers/Dockerfile  |  2 +-
 .../extending/embedding-dags/Dockerfile|  2 +-
 .../extending/writable-directory/Dockerfile|  2 +-
 docs/docker-stack/entrypoint.rst   | 18 +-
 generated/PYPI_README.md   | 16 
 scripts/ci/pre_commit/pre_commit_supported_versions.py |  2 +-
 19 files changed, 46 insertions(+), 46 deletions(-)

diff --git a/README.md b/README.md
index f68615dcd1..5cc0968245 100644
--- a/README.md
+++ b/README.md
@@ -90,13 +90,13 @@ Airflow is not a streaming solution, but it is often used 
to process real-time d
 
 Apache Airflow is tested with:
 
-| | Main version (dev) | Stable version (2.7.3)   |
+| | Main version (dev) | Stable version (2.8.0)   |
 |-||--|
 | Python  | 3.8, 3.9, 3.10, 3.11   | 3.8, 3.9, 3.10, 3.11 |
 | Platform| AMD64/ARM64(\*)| AMD64/ARM64(\*)  |
-| Kubernetes  | 1.25, 1.26, 1.27, 1.28 | 1.24, 1.25, 1.26, 1.27, 1.28 |
-| PostgreSQL  | 11, 12, 13, 14, 15, 16 | 11, 12, 13, 14, 15   |
-| MySQL   | 8.0, Innovation| 5.7, 8.0 |
+| Kubernetes  | 1.25, 1.26, 1.27, 1.28 | 1.25, 1.26, 1.27, 1.28   |
+| PostgreSQL  | 12, 13, 14, 15, 16 | 12, 13, 14, 15, 16   |
+| MySQL   | 8.0, Innovation| 8.0, Innovation  |
 | SQLite  | 3.15.0+| 3.15.0+  |
 | MSSQL   | 2017(\*\*), 2019(\*\*) | 2017(\*\*), 2019(\*\*)   |
 
@@ -175,15 +175,15 @@ them to the appropriate format and workflow that your 
tool requires.
 
 
 ```bash
-pip install 'apache-airflow==2.7.3' \
- --constraint 
"https://raw.githubusercontent.com/apache/airflow/constraints-2.7.3/constraints-3.8.txt";
+pip install 'apache-airflow==2.8.0' \
+ --constraint 
"https://raw.githubusercontent.com/apache/airflow/constraints-2.8.0/constraints-3.8.txt";
 ```
 
 2. Installing with extras (i.e., postgres, google)
 
 ```bash
-pip install 'apache-airflow[postgres,google]==2.7.3' \
- --constraint 
"https://raw.githubusercontent.com/apache/airflow/constraints-2.7.3/constraints-3.8.txt";
+pip install 'apache-airflow[postgres,google]==2.8.0' \
+ --constraint 
"https://raw.githubusercontent.com/apache/airflow/constraints-2.8.0/constraints-3.8.txt";
 ```
 
 For information on installing provider packages, check
@@ -288,7 +288,7 @@ Apache Airflow version life cycle:
 
 | Version   | Current Patch/Minor   | State | First Release   | Limited 
Support   | EOL/Terminated   |
 
|---|---|---|-|---|--|
-| 2 | 2.7.3 | Supported | Dec 17, 2020| TBD
   | TBD  |
+| 2 | 2.8.0 | Supported | Dec 17, 2020| TBD
   | TBD  |
 | 1.10  | 1.10.15   | EOL   | Aug 27, 2018| Dec 17, 
2020  | June 17, 2021|
 | 1.9   | 1.9.0 | EOL   | Jan 03, 2018| Aug 27, 
2018  | Aug 27, 2018 |
 | 1.8   | 1.8.2 | EOL   | Mar 19, 2017| Jan 03, 
2018  | Jan 03, 2018 |
diff --git a/airflow/__init__.py b/airflow/__init__.py
index b63ff1dc05..59117d2950 100644
--- a/airflow/__init__.py
+++ b/airflow/__init__.py
@@ -26,7 +26,7 @@ isort:skip_file
 """
 from __future__ import annotations
 
-__version__ = "2.8.0.dev0"
+__version__ = "2.8.0"
 
 # flake8: noqa: F401
 
diff --git a/airflow/api_connexion/openapi/v1.yaml 
b/airflow/

(airflow) branch v2-8-test updated (5bbc46005f -> 02ffb4b415)

2023-11-25 Thread ephraimanierobi
This is an automated email from the ASF dual-hosted git repository.

ephraimanierobi pushed a change to branch v2-8-test
in repository https://gitbox.apache.org/repos/asf/airflow.git


 discard 5bbc46005f Update RELEASE_NOTES.rst
 discard 20cb92e52a Update version to 2.8.0
 discard 55914e14cb Add borderWidthRight to grid for Firefox scrollbar (#35346)
 discard 840707cedf Updated docstring: `check_key_async` is now in line with 
description of `_check_key_async` (#35799)
 discard e022710039 Fix HttpOperator pagination with `str` data (#35782)
 discard b331cb2e0e Upgrade to Pydantic v2 (#35551)
 discard d568543d87 feat: K8S resource operator - CRD (#35600)
 discard 673f7f837f Fix for infinite recursion due to secrets_masker (#35048)
 discard 0e6b5d8f8e Added retry strategy parameter to Amazon AWS provider Batch 
Operator to allow dynamic Batch retry strategies (#35789)
 discard 08785f50f1 added Topic params for schema_settings and 
message_retention_duration. (#35767)
 discard 7937cba0dc Add missing docker test_exceptions.py (#35674)
 discard e0736092a8 Update emr.py (#35787)
 discard 58743f28d5 Reflect drop/add support of DB Backends versions in 
documentation (#35785)
 discard 2a240e65d5 Remove pendulum as dependency of breeze (#35786)
 discard b904523c5a Fix permission check on menus (#35781)
 discard 599189e41e Fix DataFusion example type annotations (#35753)
 discard 528d2bc51b Remove --force-build command in cache steps in CI (#35784)
 discard feaeb8c5fb Check attr on parent not self re TaskContextLogger 
set_context (#35780)
 discard e2e89668d4 Implement login and logout in AWS auth manager (#35488)
 discard b51aaf59d2 Add OpenLineage support to GCSToBigQueryOperator (#35778)
 discard 4454fc870c Move `BaseOperatorLink` into the separate module (#35032)
 discard c0a1dfe9ff Make passing build args explicit in ci/prod builds (#35768)
 discard f40e1c17ee Set mark_end_on_close after set_context (#35761)
 discard 1df306a337 Fix broken link to Weaviate docs (#35776)
 discard 26d5e3f4e7 Improve ownership fixing for Breeze (#35759)
 discard 7310693277 Remove backcompat inheritance for DbApiHook (#35754)
 discard d9cfdd8131 Extend task context logging support for remote logging 
using Elasticsearch (#32977)
 discard 5d69fc142d Add basic metrics to stats collector. (#35368)
 discard 7642b29f5c Update README.md to reflect changes we agreed to the 
versioning (#35764)
 discard 9b832f6acd More detail on mandatory task arguments (#35740)
 discard 38afe9ffb1 Rename --aditional-extras flag to 
--aditional-airflow-extras (#35760)
 discard bd644fab3e feature(providers): added `OpsgenieNotifier` (#35530)
 discard 25990d159c Remove usage of deprecated method from 
BigQueryToBigQueryOperator (#35605)
 discard b88d0d7ce2 OpenLineage integration tried to use non-existed method in 
SnowflakeHook (#35752)
 discard 6198558843 Remove backcompat with Airflow 2.3/2.4 in providers (#35727)
 discard 70e2f419e6 Update minor release command (#35751)
 discard 1fc0633f89 Add v2-8 branches to codecov.yml and .asf.yaml (#35750)
 add 4d6eb837e3 Add v2-8 branches to codecov.yml and .asf.yaml (#35750)
 add 8357765f38 Update minor release command (#35751)
 add 1be1ed3060 Remove backcompat with Airflow 2.3/2.4 in providers (#35727)
 add c1ece7f174 OpenLineage integration tried to use non-existed method in 
SnowflakeHook (#35752)
 add c122379058 Remove usage of deprecated method from 
BigQueryToBigQueryOperator (#35605)
 add 5369fdc819 feature(providers): added `OpsgenieNotifier` (#35530)
 add 10a40a2a54 Rename --aditional-extras flag to 
--aditional-airflow-extras (#35760)
 add 96da30a4f7 More detail on mandatory task arguments (#35740)
 add e81309e3b6 Update README.md to reflect changes we agreed to the 
versioning (#35764)
 add 546e55a3d9 Add basic metrics to stats collector. (#35368)
 add 6a64883876 Extend task context logging support for remote logging 
using Elasticsearch (#32977)
 add 3b288fa391 Remove backcompat inheritance for DbApiHook (#35754)
 add f95cb427d8 Improve ownership fixing for Breeze (#35759)
 add 4e342d2485 Fix broken link to Weaviate docs (#35776)
 add 82d00f1601 Set mark_end_on_close after set_context (#35761)
 add b959de605d Make passing build args explicit in ci/prod builds (#35768)
 add cfaeb69f76 Move `BaseOperatorLink` into the separate module (#35032)
 add efb4694144 Add OpenLineage support to GCSToBigQueryOperator (#35778)
 add 3e1d7bc57c Implement login and logout in AWS auth manager (#35488)
 add 7c7b6ddee6 Check attr on parent not self re TaskContextLogger 
set_context (#35780)
 add 83c66f6aea Remove --force-build command in cache steps in CI (#35784)
 add c068b869c9 Fix DataFusion example type annotations (#35753)
 add d35e982a31 Fix permission check on menus (#35781)
 add 07a4b8b26b Remove pendulum as dependency of breeze (#35786)
 add 8d42d9c697 Reflect drop/add support of DB Backends versions in

(airflow) branch main updated: Revert "Add v2-8 branches to codecov.yml and .asf.yaml (#35750)" (#35883)

2023-11-27 Thread ephraimanierobi
This is an automated email from the ASF dual-hosted git repository.

ephraimanierobi pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/airflow.git


The following commit(s) were added to refs/heads/main by this push:
 new 1e730f21cd Revert "Add v2-8 branches to codecov.yml and .asf.yaml 
(#35750)" (#35883)
1e730f21cd is described below

commit 1e730f21cdb46344eb1e4fc01342e494ef411a33
Author: Ephraim Anierobi 
AuthorDate: Mon Nov 27 09:55:13 2023 +0100

Revert "Add v2-8 branches to codecov.yml and .asf.yaml (#35750)" (#35883)

This reverts commit c07c5925e93ba0a8f37f20c7deb814d0f6925705.

Temporarily revert this so I can push to v2-8-stable branch
---
 .asf.yaml   | 3 ---
 codecov.yml | 2 --
 2 files changed, 5 deletions(-)

diff --git a/.asf.yaml b/.asf.yaml
index 094299d2c4..6166687a5d 100644
--- a/.asf.yaml
+++ b/.asf.yaml
@@ -71,9 +71,6 @@ github:
 v2-7-stable:
   required_pull_request_reviews:
 required_approving_review_count: 1
-v2-8-stable:
-  required_pull_request_reviews:
-required_approving_review_count: 1
 
   collaborators:
 - mhenc
diff --git a/codecov.yml b/codecov.yml
index d1ed5fb446..67ea777302 100644
--- a/codecov.yml
+++ b/codecov.yml
@@ -55,8 +55,6 @@ coverage:
   - v2-6-test
   - v2-7-stable
   - v2-7-test
-  - v2-8-stable
-  - v2-8-test
 if_not_found: success
 if_ci_failed: error
 informational: true



(airflow) 01/01: Update default branches for 2-8

2023-11-27 Thread ephraimanierobi
This is an automated email from the ASF dual-hosted git repository.

ephraimanierobi pushed a commit to branch v2-8-stable
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 9a4857bd8701effbd49c750123db08edff77aaed
Author: Ephraim Anierobi 
AuthorDate: Mon Nov 20 13:31:03 2023 +0100

Update default branches for 2-8
---
 dev/breeze/src/airflow_breeze/branch_defaults.py  | 4 ++--
 images/breeze/output_ci-image_build.txt   | 2 +-
 images/breeze/output_prod-image_build.txt | 2 +-
 images/breeze/output_release-management_install-provider-packages.txt | 2 +-
 images/breeze/output_release-management_verify-provider-packages.txt  | 2 +-
 images/breeze/output_shell.txt| 2 +-
 images/breeze/output_start-airflow.txt| 2 +-
 7 files changed, 8 insertions(+), 8 deletions(-)

diff --git a/dev/breeze/src/airflow_breeze/branch_defaults.py 
b/dev/breeze/src/airflow_breeze/branch_defaults.py
index c9dbaa8080..755c8cf835 100644
--- a/dev/breeze/src/airflow_breeze/branch_defaults.py
+++ b/dev/breeze/src/airflow_breeze/branch_defaults.py
@@ -37,6 +37,6 @@ Examples:
 """
 from __future__ import annotations
 
-AIRFLOW_BRANCH = "main"
-DEFAULT_AIRFLOW_CONSTRAINTS_BRANCH = "constraints-main"
+AIRFLOW_BRANCH = "v2-8-test"
+DEFAULT_AIRFLOW_CONSTRAINTS_BRANCH = "constraints-2-8"
 DEBIAN_VERSION = "bookworm"
diff --git a/images/breeze/output_ci-image_build.txt 
b/images/breeze/output_ci-image_build.txt
index 9c882681f1..8b648a93f6 100644
--- a/images/breeze/output_ci-image_build.txt
+++ b/images/breeze/output_ci-image_build.txt
@@ -1 +1 @@
-3109e8a0e9ff98e6d0b802e6ce09192a
+8c6cf9446a9ee3920f03c33f5eeb8d77
diff --git a/images/breeze/output_prod-image_build.txt 
b/images/breeze/output_prod-image_build.txt
index a145933340..5ef5d86aad 100644
--- a/images/breeze/output_prod-image_build.txt
+++ b/images/breeze/output_prod-image_build.txt
@@ -1 +1 @@
-efdd516ea49fd7f8f9c770fa466d879b
+1a2cec141b5b0b9ecd704e484ac5b4e3
diff --git 
a/images/breeze/output_release-management_install-provider-packages.txt 
b/images/breeze/output_release-management_install-provider-packages.txt
index 47f58a6341..b36dc86e24 100644
--- a/images/breeze/output_release-management_install-provider-packages.txt
+++ b/images/breeze/output_release-management_install-provider-packages.txt
@@ -1 +1 @@
-34c38aca17d23dbb454fe7a6bfd8e630
+05ff214ada04958a95f2aedc1953079e
diff --git 
a/images/breeze/output_release-management_verify-provider-packages.txt 
b/images/breeze/output_release-management_verify-provider-packages.txt
index 88ef90c79e..20070bef37 100644
--- a/images/breeze/output_release-management_verify-provider-packages.txt
+++ b/images/breeze/output_release-management_verify-provider-packages.txt
@@ -1 +1 @@
-13083dc08dc69b40015b61f8be607918
+f7fe0f6356904e7c8ca2a3bdbe841f6e
diff --git a/images/breeze/output_shell.txt b/images/breeze/output_shell.txt
index 287cbec8da..872abb221b 100644
--- a/images/breeze/output_shell.txt
+++ b/images/breeze/output_shell.txt
@@ -1 +1 @@
-9dd3658bf3e2e6e605c2bae9d350f162
+5da947ec063544915e0d12bf27c242ed
diff --git a/images/breeze/output_start-airflow.txt 
b/images/breeze/output_start-airflow.txt
index b95bf95d4d..0b1d67480c 100644
--- a/images/breeze/output_start-airflow.txt
+++ b/images/breeze/output_start-airflow.txt
@@ -1 +1 @@
-69fb2419a12c9b3feb208c3337443e1a
+5654d42e06be1f13051157d5db79e87d



(airflow) branch v2-8-stable updated (6fc4a9cb3c -> 9a4857bd87)

2023-11-27 Thread ephraimanierobi
This is an automated email from the ASF dual-hosted git repository.

ephraimanierobi pushed a change to branch v2-8-stable
in repository https://gitbox.apache.org/repos/asf/airflow.git


omit 6fc4a9cb3c Update default branches for 2-8
 add c07c5925e9 Add v2-8 branches to codecov.yml and .asf.yaml (#35750)
 add 0d1c8de78c Update minor release command (#35751)
 add d8075cd04c Remove backcompat with Airflow 2.3/2.4 in providers (#35727)
 add f8dd192483 OpenLineage integration tried to use non-existed method in 
SnowflakeHook (#35752)
 add 9207e7d5e5 Remove usage of deprecated method from 
BigQueryToBigQueryOperator (#35605)
 add 2d01cfd4a8 feature(providers): added `OpsgenieNotifier` (#35530)
 add 31aeed048b Rename --aditional-extras flag to 
--aditional-airflow-extras (#35760)
 add 0711d50597 More detail on mandatory task arguments (#35740)
 add 1e95b06948 Update README.md to reflect changes we agreed to the 
versioning (#35764)
 add ecbf02386a Add basic metrics to stats collector. (#35368)
 add 747f00f2aa Extend task context logging support for remote logging 
using Elasticsearch (#32977)
 add 2a469b3713 Remove backcompat inheritance for DbApiHook (#35754)
 add de95d0fb5a Improve ownership fixing for Breeze (#35759)
 add b75dc62620 Fix broken link to Weaviate docs (#35776)
 add 7389782fb8 Set mark_end_on_close after set_context (#35761)
 add 50e0b928be Make passing build args explicit in ci/prod builds (#35768)
 add b1a9ebb2fe Move `BaseOperatorLink` into the separate module (#35032)
 add 1fae1a50e9 Add OpenLineage support to GCSToBigQueryOperator (#35778)
 add 379b7c09d1 Implement login and logout in AWS auth manager (#35488)
 add 2a06e278d2 Check attr on parent not self re TaskContextLogger 
set_context (#35780)
 add 5a8d9d644d Remove --force-build command in cache steps in CI (#35784)
 add 67ebc3a6cd Fix DataFusion example type annotations (#35753)
 add d58e0450b2 Fix permission check on menus (#35781)
 add 14a58e2dae Remove pendulum as dependency of breeze (#35786)
 add b31b8c08c4 Reflect drop/add support of DB Backends versions in 
documentation (#35785)
 add 2d811d526a Update emr.py (#35787)
 add f4e5571384 Add missing docker test_exceptions.py (#35674)
 add 72ba63e0b9 added Topic params for schema_settings and 
message_retention_duration. (#35767)
 add b71c14c74a Added retry strategy parameter to Amazon AWS provider Batch 
Operator to allow dynamic Batch retry strategies (#35789)
 add 6e8f646bf9 Fix for infinite recursion due to secrets_masker (#35048)
 add 8dc1b23116 feat: K8S resource operator - CRD (#35600)
 add 172f57355e Upgrade to Pydantic v2 (#35551)
 add 5588a956c0 Fix HttpOperator pagination with `str` data (#35782)
 add bcb5eebd62 Updated docstring: `check_key_async` is now in line with 
description of `_check_key_async` (#35799)
 add b06c4b0f04 Add borderWidthRight to grid for Firefox scrollbar (#35346)
 add fcb91f47c9 Avoid breeze self-upgrade run in some pre-commits. (#35802)
 add 9e159fc48d Add OpenLineage support to S3Operators - Copy, Delete and 
Create Object (#35796)
 add ac977c4e57 Make EksPodOperator exec config not  rely on log level 
(#35771)
 add d35578e210 Fix pre-commit script output (#35807)
 add a794e0d020 Fix Batch operator's retry_strategy (#35808)
 add 910c95ea64 Consolidate environment variable calculation in ShellParams 
in Breeze (#35801)
 add ef2ad070c2 Add OpenLineage support to `S3FileTransformOperator` 
(#35819)
 add ca97feed18 Revert Remove PodLoggingStatus object #35422 (#35822)
 add ca1202fd31 Add `EC2HibernateInstanceOperator` and 
`EC2RebootInstanceOperator` (#35790)
 add eb691fc013 Use `OpenSearch` instead of `Open Search` and `Opensearch` 
(#35821)
 add b07d79908c Implements JSON-string connection representation generator 
(#35723)
 add afe14ceb62 breeze - docs added pipx python flag (#35827)
 add 0e157b38a3 Fix K8S executor override config using pod_override_object 
(#35185)
 add e8f62e8ee5 Add DagModel attributes before dumping DagDetailSchema for 
get_dag_details API endpoint (#34947)
 add 0b23d5601c Prepare docs 2nd wave of Providers November 2023 (#35836)
 add cc042a2b7e Create directories based on `AIRFLOW_CONFIG` path (#35818)
 add 4298c433bc Add missing default for version suffix (#35842)
 add c068089c65 Update README_RELEASE_PROVIDER_PACKAGES.md (#35846)
 add 39107dfeb4 change indent to 4 (#35824)
 add c905fe88de Update information about links into the provider.yaml files 
(#35837)
 add 770f16425c Add support for service account impersonation with 
computeEngineSSHHook (google provider) and IAP tunnel (#35136)
 add e2a5dbf8b4 allow multiple elements in impersonation chain (#35694)
 add 9059f72668 Enhance `attribute_value` in `DynamoDBValueSensor` to 
accept list (#35831)
 add 196a235358 fix: set dry_run to be optional. (#35167)

(airflow) 01/02: Update version to 2.8.0

2023-11-27 Thread ephraimanierobi
This is an automated email from the ASF dual-hosted git repository.

ephraimanierobi pushed a commit to branch v2-8-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 4fad6e586213c998d81fd7f28eff428d300c0b3c
Author: Ephraim Anierobi 
AuthorDate: Mon Nov 20 22:48:45 2023 +0100

Update version to 2.8.0
---
 README.md  | 18 +-
 airflow/__init__.py|  2 +-
 airflow/api_connexion/openapi/v1.yaml  |  2 +-
 .../logging-monitoring/logging-tasks.rst   |  2 +-
 .../apache-airflow/installation/supported-versions.rst |  2 +-
 docs/docker-stack/README.md| 10 +-
 .../docker-examples/customizing/own-requirements.sh|  2 +-
 .../extending/add-airflow-configuration/Dockerfile |  2 +-
 .../extending/add-apt-packages/Dockerfile  |  2 +-
 .../extending/add-build-essential-extend/Dockerfile|  2 +-
 .../docker-examples/extending/add-providers/Dockerfile |  2 +-
 .../extending/add-pypi-packages/Dockerfile |  2 +-
 .../extending/add-requirement-packages/Dockerfile  |  2 +-
 .../extending/custom-providers/Dockerfile  |  2 +-
 .../extending/embedding-dags/Dockerfile|  2 +-
 .../extending/writable-directory/Dockerfile|  2 +-
 docs/docker-stack/entrypoint.rst   | 18 +-
 generated/PYPI_README.md   | 16 
 scripts/ci/pre_commit/pre_commit_supported_versions.py |  2 +-
 19 files changed, 46 insertions(+), 46 deletions(-)

diff --git a/README.md b/README.md
index f68615dcd1..5cc0968245 100644
--- a/README.md
+++ b/README.md
@@ -90,13 +90,13 @@ Airflow is not a streaming solution, but it is often used 
to process real-time data
 
 Apache Airflow is tested with:
 
-| | Main version (dev) | Stable version (2.7.3)   |
+| | Main version (dev) | Stable version (2.8.0)   |
|-------------|------------------------|--------------------------|
 | Python  | 3.8, 3.9, 3.10, 3.11   | 3.8, 3.9, 3.10, 3.11 |
 | Platform| AMD64/ARM64(\*)| AMD64/ARM64(\*)  |
-| Kubernetes  | 1.25, 1.26, 1.27, 1.28 | 1.24, 1.25, 1.26, 1.27, 1.28 |
-| PostgreSQL  | 11, 12, 13, 14, 15, 16 | 11, 12, 13, 14, 15   |
-| MySQL   | 8.0, Innovation| 5.7, 8.0 |
+| Kubernetes  | 1.25, 1.26, 1.27, 1.28 | 1.25, 1.26, 1.27, 1.28   |
+| PostgreSQL  | 12, 13, 14, 15, 16 | 12, 13, 14, 15, 16   |
+| MySQL   | 8.0, Innovation| 8.0, Innovation  |
 | SQLite  | 3.15.0+| 3.15.0+  |
 | MSSQL   | 2017(\*\*), 2019(\*\*) | 2017(\*\*), 2019(\*\*)   |
 
@@ -175,15 +175,15 @@ them to the appropriate format and workflow that your 
tool requires.
 
 
 ```bash
-pip install 'apache-airflow==2.7.3' \
- --constraint 
"https://raw.githubusercontent.com/apache/airflow/constraints-2.7.3/constraints-3.8.txt";
+pip install 'apache-airflow==2.8.0' \
+ --constraint 
"https://raw.githubusercontent.com/apache/airflow/constraints-2.8.0/constraints-3.8.txt";
 ```
 
 2. Installing with extras (i.e., postgres, google)
 
 ```bash
-pip install 'apache-airflow[postgres,google]==2.7.3' \
- --constraint 
"https://raw.githubusercontent.com/apache/airflow/constraints-2.7.3/constraints-3.8.txt";
+pip install 'apache-airflow[postgres,google]==2.8.0' \
+ --constraint 
"https://raw.githubusercontent.com/apache/airflow/constraints-2.8.0/constraints-3.8.txt";
 ```
 
 For information on installing provider packages, check
@@ -288,7 +288,7 @@ Apache Airflow version life cycle:
 
 | Version   | Current Patch/Minor   | State | First Release   | Limited 
Support   | EOL/Terminated   |
 
|-----------|-----------------------|-----------|-----------------|-------------------|------------------|
-| 2 | 2.7.3 | Supported | Dec 17, 2020| TBD
   | TBD  |
+| 2 | 2.8.0 | Supported | Dec 17, 2020| TBD
   | TBD  |
 | 1.10  | 1.10.15   | EOL   | Aug 27, 2018| Dec 17, 
2020  | June 17, 2021|
 | 1.9   | 1.9.0 | EOL   | Jan 03, 2018| Aug 27, 
2018  | Aug 27, 2018 |
 | 1.8   | 1.8.2 | EOL   | Mar 19, 2017| Jan 03, 
2018  | Jan 03, 2018 |
diff --git a/airflow/__init__.py b/airflow/__init__.py
index b63ff1dc05..59117d2950 100644
--- a/airflow/__init__.py
+++ b/airflow/__init__.py
@@ -26,7 +26,7 @@ isort:skip_file
 """
 from __future__ import annotations
 
-__version__ = "2.8.0.dev0"
+__version__ = "2.8.0"
 
 # flake8: noqa: F401
 
diff --git a/airflow/api_connexion/openapi/v1.yaml 
b/airflow/

(airflow) 02/02: Update RELEASE_NOTES.rst

2023-11-27 Thread ephraimanierobi
This is an automated email from the ASF dual-hosted git repository.

ephraimanierobi pushed a commit to branch v2-8-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 26990e2ed640039e14458cb223e2d6801e315770
Author: Ephraim Anierobi 
AuthorDate: Mon Nov 20 22:51:30 2023 +0100

Update RELEASE_NOTES.rst
---
 RELEASE_NOTES.rst   | 172 +++-
 newsfragments/35460.significant.rst |  10 ---
 2 files changed, 171 insertions(+), 11 deletions(-)

diff --git a/RELEASE_NOTES.rst b/RELEASE_NOTES.rst
index 62183c8b58..b2c1c20a64 100644
--- a/RELEASE_NOTES.rst
+++ b/RELEASE_NOTES.rst
@@ -21,7 +21,177 @@
 
 .. towncrier release notes start
 
-Airflow 2.7.3 (2023-11-04)
+Airflow 2.8.0 (2023-12-14)
+--
+
+Significant Changes
+^^^
+
+- Raw HTML code in DAG docs and DAG params descriptions is disabled by default
+
+  To ensure that no malicious JavaScript can be injected by DAG authors via 
DAG descriptions or trigger UI forms,
+  a new parameter ``webserver.allow_raw_html_descriptions`` was added with a 
default value of ``False``.
+  If you trust your DAG authors' code and want to allow raw HTML in DAG 
descriptions and params, you can restore the previous
+  behavior by setting the configuration value to ``True``.
+
+  To ensure Airflow is secure by default, the raw HTML support in the trigger 
UI has been superseded by markdown support via
+  the ``description_md`` attribute. If you have been using 
``description_html``, please migrate to ``description_md``.
+  The ``custom_html_form`` attribute is now deprecated. (#35460)
+
+New Features
+""""""""""""
+- AIP-58: Add Airflow ObjectStore (AFS) (`AIP-58 
<https://github.com/apache/airflow/pulls?q=is%3Apr+is%3Amerged+label%3AAIP-58+milestone%3A%22Airflow+2.8.0%22>`_)
+- Add "literal" wrapper to disable field templating (#35017)
+- Add task context logging feature to allow forwarding messages to task logs 
(#32646, #32693, #35857)
+- Add Listener hooks for Datasets (#34418)
+- Allow override of navbar text color (#35505)
+- Add lightweight serialization for deltalake tables (#35462)
+- Add support for serialization of iceberg tables (#35456)
+- ``prev_end_date_success`` method access (#34528)
+- Add task parameter to set custom logger name (#34964)
+- Add pyspark decorator (#35247)
+- Add trigger as a valid option for the db clean command (#34908)
+- Add decorators for external and venv python branching operators (#35043)
+- Allow PythonVenvOperator using other index url (#33017)
+- Add Python Virtualenv Operator Caching (#33355)
+- Introduce a generic export for containerized executor logging (#34903)
+- Add ability to clear downstream tis in ``List Task Instances`` view  (#34529)
+- Attribute ``clear_number`` to track DAG run being cleared (#34126)
+- Add BranchPythonVirtualenvOperator (#33356)
+- Allow PythonVenvOperator using other index url (#33017)
+- Add CLI notification commands to providers (#33116)
+
+Improvements
+""""""""""""
+- Create directories based on ``AIRFLOW_CONFIG`` path (#35818)
+- Implements ``JSON-string`` connection representation generator (#35723)
+- Move ``BaseOperatorLink`` into the separate module (#35032)
+- Set mark_end_on_close after set_context (#35761)
+- Move external logs links to top of react logs page (#35668)
+- Change terminal mode to ``cbreak`` in ``execute_interactive`` and handle 
``SIGINT`` (#35602)
+- Make raw HTML descriptions configurable (#35460)
+- Allow email field to be templated (#35546)
+- Hide logical date and run id in trigger UI form (#35284)
+- Improved instructions for adding dependencies in TaskFlow (#35406)
+- Add optional exit code to list import errors (#35378)
+- Limit query result on DB rather than client in ``synchronize_log_template`` 
function (#35366)
+- Feature: Allow description to be passed in when using variables CLI (#34791)
+- Allow optional defaults in required fields with manual triggered dags 
(#31301)
+- Permitting airflow kerberos to run in different modes (#35146)
+- Refactor commands to unify daemon context handling (#34945)
+- Add extra fields to plugins endpoint (#34913)
+- Add description to pools view (#34862)
+- Move cli's Connection export and Variable export command print logic to a 
separate function (#34647)
+- Extract and reuse get_kerberos_principle func from get_kerberos_principle 
(#34936)
+- Change type annotation for ``BaseOperatorLink.operators`` (#35003)
+- Optimise and migrate to ``SA2-compatible`` syntax for TaskReschedule (#33720)
+- Consolidate the permissions name in SlaMissModelView (#34949)
+- Add debug log saying what's being run to ``EventScheduler`` (#34808)
+- Increase log reader stream loop sleep duration to 1 second (#34789)
+- Resolve pydantic deprecation warnings re ``update_forward_refs`` (#34657)
+- Unify mappe
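
The ``webserver.allow_raw_html_descriptions`` option named in the release
notes above is a plain airflow.cfg setting. A minimal fragment, assuming all
other settings keep their defaults, that restores the pre-2.8 raw-HTML
behavior (only advisable if every DAG author is trusted):

    [webserver]
    allow_raw_html_descriptions = True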

(airflow) branch v2-8-test updated (02ffb4b415 -> 26990e2ed6)

2023-11-27 Thread ephraimanierobi
This is an automated email from the ASF dual-hosted git repository.

ephraimanierobi pushed a change to branch v2-8-test
in repository https://gitbox.apache.org/repos/asf/airflow.git


omit 02ffb4b415 Update RELEASE_NOTES.rst
omit 7a95e92120 Update version to 2.8.0
omit 3a2940b3c8 Disable mypy check for referencing package (#35850)
omit 2fa40aae8e Add a new config to configure the host of the scheduler 
health check server (#35616)
omit 3ab871e46d Enhance docs for zombie tasks (#35825)
omit 05837c9cb6 Make livy connection failure test less picky (#35852)
omit 99ecb3d9e9 Feature pass dictionary configuration in application_file 
in SparkKubernetesOperator (#35848)
omit e33d626bd8 Deprecate `CloudComposerEnvironmentSensor` in favor of 
`CloudComposerCreateEnvironmentOperator` with defer mode (#35775)
omit 8d2832beb0 fix: set dry_run to be optional. (#35167)
omit 64ffd38bf0 Enhance `attribute_value` in `DynamoDBValueSensor` to 
accept list (#35831)
omit 4375ac6d44 allow multiple elements in impersonation chain (#35694)
omit efc94598f2 Add support for service account impersonation with 
computeEngineSSHHook (google provider) and IAP tunnel (#35136)
omit bac3448945 Update information about links into the provider.yaml files 
(#35837)
omit 5f3222c796 change indent to 4 (#35824)
omit 934bf75014 Update README_RELEASE_PROVIDER_PACKAGES.md (#35846)
omit db697701f7 Add missing default for version suffix (#35842)
omit 4fd6166a73 Create directories based on `AIRFLOW_CONFIG` path (#35818)
omit beb9455390 Prepare docs 2nd wave of Providers November 2023 (#35836)
omit 36de48ee69 Add DagModel attributes before dumping DagDetailSchema for 
get_dag_details API endpoint (#34947)
omit f1e5109da2 Fix K8S executor override config using pod_override_object 
(#35185)
omit edc91c6739 breeze - docs added pipx python flag (#35827)
omit 4843d7f657 Implements JSON-string connection representation generator 
(#35723)
omit d4065dd0b9 Use `OpenSearch` instead of `Open Search` and `Opensearch` 
(#35821)
omit 9e2a8e24c3 Add `EC2HibernateInstanceOperator` and 
`EC2RebootInstanceOperator` (#35790)
omit 658a51fb8c Revert Remove PodLoggingStatus object #35422 (#35822)
omit 1a668ef763 Add OpenLineage support to `S3FileTransformOperator` 
(#35819)
omit 6c45f70216 Consolidate environment variable calculation in ShellParams 
in Breeze (#35801)
omit dcd7bf65fe Fix Batch operator's retry_strategy (#35808)
omit d4e5a9bdf9 Fix pre-commit script output (#35807)
omit 3808ec6855 Make EksPodOperator exec config not  rely on log level 
(#35771)
omit ce59b31c52 Add OpenLineage support to S3Operators - Copy, Delete and 
Create Object (#35796)
omit e4c0062f0c Avoid breeze self-upgrade run in some pre-commits. (#35802)
omit d5d0a6ecf2 Add borderWidthRight to grid for Firefox scrollbar (#35346)
omit f496a3f930 Updated docstring: `check_key_async` is now in line with 
description of `_check_key_async` (#35799)
omit 3eec6eeb68 Fix HttpOperator pagination with `str` data (#35782)
omit 7a03e81d8e Upgrade to Pydantic v2 (#35551)
omit 7e59f53cb3 feat: K8S resource operator - CRD (#35600)
omit 349c7e6d23 Fix for infinite recursion due to secrets_masker (#35048)
omit 3a3d0a4008 Added retry strategy parameter to Amazon AWS provider Batch 
Operator to allow dynamic Batch retry strategies (#35789)
omit 756c17549c added Topic params for schema_settings and 
message_retention_duration. (#35767)
omit ea889614b1 Add missing docker test_exceptions.py (#35674)
omit bcb6a56cd5 Update emr.py (#35787)
omit 8d42d9c697 Reflect drop/add support of DB Backends versions in 
documentation (#35785)
omit 07a4b8b26b Remove pendulum as dependency of breeze (#35786)
omit d35e982a31 Fix permission check on menus (#35781)
omit c068b869c9 Fix DataFusion example type annotations (#35753)
omit 83c66f6aea Remove --force-build command in cache steps in CI (#35784)
omit 7c7b6ddee6 Check attr on parent not self re TaskContextLogger 
set_context (#35780)
omit 3e1d7bc57c Implement login and logout in AWS auth manager (#35488)
omit efb4694144 Add OpenLineage support to GCSToBigQueryOperator (#35778)
omit cfaeb69f76 Move `BaseOperatorLink` into the separate module (#35032)
omit b959de605d Make passing build args explicit in ci/prod builds (#35768)
omit 82d00f1601 Set mark_end_on_close after set_context (#35761)
omit 4e342d2485 Fix broken link to Weaviate docs (#35776)
omit f95cb427d8 Improve ownership fixing for Breeze (#35759)
omit 3b288fa391 Remove backcompat inheritance for DbApiHook (#35754)
omit 6a64883876 Extend task context logging support for remote logging 
using Elasticsearch (#32977)
omit 546e55a3d9 Add basic metrics to stats collector. (#35368)
omit e81309e3b6 Update README.md to reflect changes we agreed to the 
versioning (#35764)
omit 96da30a4f7

(airflow) branch main updated (99b68e2db2 -> c41088b922)

2023-11-27 Thread ephraimanierobi
This is an automated email from the ASF dual-hosted git repository.

ephraimanierobi pushed a change to branch main
in repository https://gitbox.apache.org/repos/asf/airflow.git


from 99b68e2db2 Add OpenLineage support to GcsOperators - Delete, Transform 
and TimeSpanTransform (#35838)
 add c41088b922 Revert "Revert "Add v2-8 branches to codecov.yml and 
.asf.yaml (#35750)" (#35883)" (#35886)

No new revisions were added by this update.

Summary of changes:
 .asf.yaml   | 3 +++
 codecov.yml | 2 ++
 2 files changed, 5 insertions(+)



(airflow) branch v2-8-test updated: Exclude common-io provider just for test

2023-11-27 Thread ephraimanierobi
This is an automated email from the ASF dual-hosted git repository.

ephraimanierobi pushed a commit to branch v2-8-test
in repository https://gitbox.apache.org/repos/asf/airflow.git


The following commit(s) were added to refs/heads/v2-8-test by this push:
 new 21d83eb2ce Exclude common-io provider just for test
21d83eb2ce is described below

commit 21d83eb2ce68b759990b407a6551259924343ea2
Author: Ephraim Anierobi 
AuthorDate: Mon Nov 27 13:08:24 2023 +0100

Exclude common-io provider just for test
---
 setup.py | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/setup.py b/setup.py
index bc624e21fa..9255b0fdb8 100644
--- a/setup.py
+++ b/setup.py
@@ -870,7 +870,7 @@ def get_provider_package_name_from_package_id(package_id: 
str) -> str:
 
 def get_excluded_providers() -> list[str]:
 """Return packages excluded for the current python version."""
-return []
+return ["apache-airflow-providers-common-io"]
 
 
 def get_all_provider_packages() -> str:



(airflow) 01/01: Exclude common-io provider just for test

2023-11-27 Thread ephraimanierobi
This is an automated email from the ASF dual-hosted git repository.

ephraimanierobi pushed a commit to branch v2-8-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit c3f2236795472ac4c5ff63644e99d1b04a4f0e66
Author: Ephraim Anierobi 
AuthorDate: Mon Nov 27 13:08:24 2023 +0100

Exclude common-io provider just for test
---
 airflow/providers/installed_providers.txt | 2 +-
 setup.py  | 2 +-
 2 files changed, 2 insertions(+), 2 deletions(-)

diff --git a/airflow/providers/installed_providers.txt 
b/airflow/providers/installed_providers.txt
index 0d9b03a55e..a338685b36 100644
--- a/airflow/providers/installed_providers.txt
+++ b/airflow/providers/installed_providers.txt
@@ -1,7 +1,7 @@
+#common.io
 amazon
 celery
 cncf.kubernetes
-common.io
 common.sql
 daskexecutor
 docker
diff --git a/setup.py b/setup.py
index bc624e21fa..9255b0fdb8 100644
--- a/setup.py
+++ b/setup.py
@@ -870,7 +870,7 @@ def get_provider_package_name_from_package_id(package_id: 
str) -> str:
 
 def get_excluded_providers() -> list[str]:
 """Return packages excluded for the current python version."""
-return []
+return ["apache-airflow-providers-common-io"]
 
 
 def get_all_provider_packages() -> str:



(airflow) branch v2-8-test updated (21d83eb2ce -> c3f2236795)

2023-11-27 Thread ephraimanierobi
This is an automated email from the ASF dual-hosted git repository.

ephraimanierobi pushed a change to branch v2-8-test
in repository https://gitbox.apache.org/repos/asf/airflow.git


omit 21d83eb2ce Exclude common-io provider just for test
 new c3f2236795 Exclude common-io provider just for test

This update added new revisions after undoing existing revisions.
That is to say, some revisions that were in the old version of the
branch are not in the new version.  This situation occurs
when a user --force pushes a change and generates a repository
containing something like this:

 * -- * -- B -- O -- O -- O   (21d83eb2ce)
\
 N -- N -- N   refs/heads/v2-8-test (c3f2236795)

You should already have received notification emails for all of the O
revisions, and so the following emails describe only the N revisions
from the common base, B.

Any revisions marked "omit" are not gone; other references still
refer to them.  Any revisions marked "discard" are gone forever.

The 1 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.


Summary of changes:
 airflow/providers/installed_providers.txt | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)



(airflow) branch v2-8-test updated (c3f2236795 -> c28ba46e13)

2023-11-27 Thread ephraimanierobi
This is an automated email from the ASF dual-hosted git repository.

ephraimanierobi pushed a change to branch v2-8-test
in repository https://gitbox.apache.org/repos/asf/airflow.git


omit c3f2236795 Exclude common-io provider just for test
 new c28ba46e13 Exclude common-io provider

This update added new revisions after undoing existing revisions.
That is to say, some revisions that were in the old version of the
branch are not in the new version.  This situation occurs
when a user --force pushes a change and generates a repository
containing something like this:

 * -- * -- B -- O -- O -- O   (c3f2236795)
\
 N -- N -- N   refs/heads/v2-8-test (c28ba46e13)

You should already have received notification emails for all of the O
revisions, and so the following emails describe only the N revisions
from the common base, B.

Any revisions marked "omit" are not gone; other references still
refer to them.  Any revisions marked "discard" are gone forever.

The 1 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.


Summary of changes:
 airflow/providers/installed_providers.txt | 1 -
 setup.py  | 2 +-
 2 files changed, 1 insertion(+), 2 deletions(-)



(airflow) 01/01: Exclude common-io provider

2023-11-27 Thread ephraimanierobi
This is an automated email from the ASF dual-hosted git repository.

ephraimanierobi pushed a commit to branch v2-8-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit c28ba46e131c96fcd9804dca28589cc782fe
Author: Ephraim Anierobi 
AuthorDate: Mon Nov 27 13:08:24 2023 +0100

Exclude common-io provider
---
 airflow/providers/installed_providers.txt | 1 -
 1 file changed, 1 deletion(-)

diff --git a/airflow/providers/installed_providers.txt 
b/airflow/providers/installed_providers.txt
index 0d9b03a55e..bd32056d5f 100644
--- a/airflow/providers/installed_providers.txt
+++ b/airflow/providers/installed_providers.txt
@@ -1,7 +1,6 @@
 amazon
 celery
 cncf.kubernetes
-common.io
 common.sql
 daskexecutor
 docker



(airflow) branch v2-8-stable updated (9a4857bd87 -> c28ba46e13)

2023-11-27 Thread ephraimanierobi
This is an automated email from the ASF dual-hosted git repository.

ephraimanierobi pushed a change to branch v2-8-stable
in repository https://gitbox.apache.org/repos/asf/airflow.git


from 9a4857bd87 Update default branches for 2-8
 add 4fad6e5862 Update version to 2.8.0
 add 26990e2ed6 Update RELEASE_NOTES.rst
 add c28ba46e13 Exclude common-io provider

No new revisions were added by this update.

Summary of changes:
 README.md  |  18 +--
 RELEASE_NOTES.rst  | 172 -
 airflow/__init__.py|   2 +-
 airflow/api_connexion/openapi/v1.yaml  |   2 +-
 airflow/providers/installed_providers.txt  |   1 -
 .../logging-monitoring/logging-tasks.rst   |   2 +-
 .../installation/supported-versions.rst|   2 +-
 docs/docker-stack/README.md|  10 +-
 .../customizing/own-requirements.sh|   2 +-
 .../extending/add-airflow-configuration/Dockerfile |   2 +-
 .../extending/add-apt-packages/Dockerfile  |   2 +-
 .../add-build-essential-extend/Dockerfile  |   2 +-
 .../extending/add-providers/Dockerfile |   2 +-
 .../extending/add-pypi-packages/Dockerfile |   2 +-
 .../extending/add-requirement-packages/Dockerfile  |   2 +-
 .../extending/custom-providers/Dockerfile  |   2 +-
 .../extending/embedding-dags/Dockerfile|   2 +-
 .../extending/writable-directory/Dockerfile|   2 +-
 docs/docker-stack/entrypoint.rst   |  18 +--
 generated/PYPI_README.md   |  16 +-
 newsfragments/35460.significant.rst|  10 --
 .../ci/pre_commit/pre_commit_supported_versions.py |   2 +-
 22 files changed, 217 insertions(+), 58 deletions(-)
 delete mode 100644 newsfragments/35460.significant.rst



(airflow) annotated tag constraints-2.8.0b1 updated (5051db48f6 -> 20223d0d04)

2023-11-27 Thread ephraimanierobi
This is an automated email from the ASF dual-hosted git repository.

ephraimanierobi pushed a change to annotated tag constraints-2.8.0b1
in repository https://gitbox.apache.org/repos/asf/airflow.git


*** WARNING: tag constraints-2.8.0b1 was modified! ***

from 5051db48f6 (commit)
  to 20223d0d04 (tag)
 tagging 5051db48f6d6e9a3d0a8b47c50f9cd9ef7271f64 (commit)
 replaces constraints-2.4.0b1
  by Ephraim Anierobi
  on Mon Nov 27 16:55:02 2023 +0100

- Log -
Constraints for Apache Airflow 2.8.0b1
-BEGIN PGP SIGNATURE-

iHUEABYKAB0WIQS9oZEc9zL8clyUp/4Ip9yRa47wgAUCZWS71gAKCRAIp9yRa47w
gCeGAQCoc8bh3XjkBCHm/QPGDALJ4fMyUN+CsaVPkS9ZhAGxIQD/XtAwtFESUOhW
wm8rYyFAyuS6YtOGIWrwiq3BE3NnJw4=
=pA1S
-END PGP SIGNATURE-
---


No new revisions were added by this update.

Summary of changes:



svn commit: r65566 - /dev/airflow/2.8.0b1/

2023-11-27 Thread ephraimanierobi
Author: ephraimanierobi
Date: Mon Nov 27 16:00:24 2023
New Revision: 65566

Log:
Add artifacts for Airflow 2.8.0b1

Added:
dev/airflow/2.8.0b1/
dev/airflow/2.8.0b1/apache-airflow-2.8.0-source.tar.gz   (with props)
dev/airflow/2.8.0b1/apache-airflow-2.8.0-source.tar.gz.asc
dev/airflow/2.8.0b1/apache-airflow-2.8.0-source.tar.gz.sha512
dev/airflow/2.8.0b1/apache-airflow-2.8.0.tar.gz   (with props)
dev/airflow/2.8.0b1/apache-airflow-2.8.0.tar.gz.asc
dev/airflow/2.8.0b1/apache-airflow-2.8.0.tar.gz.sha512
dev/airflow/2.8.0b1/apache_airflow-2.8.0-py3-none-any.whl   (with props)
dev/airflow/2.8.0b1/apache_airflow-2.8.0-py3-none-any.whl.asc
dev/airflow/2.8.0b1/apache_airflow-2.8.0-py3-none-any.whl.sha512

Added: dev/airflow/2.8.0b1/apache-airflow-2.8.0-source.tar.gz
==
Binary file - no diff available.

Propchange: dev/airflow/2.8.0b1/apache-airflow-2.8.0-source.tar.gz
--
svn:mime-type = application/octet-stream

Added: dev/airflow/2.8.0b1/apache-airflow-2.8.0-source.tar.gz.asc
==
--- dev/airflow/2.8.0b1/apache-airflow-2.8.0-source.tar.gz.asc (added)
+++ dev/airflow/2.8.0b1/apache-airflow-2.8.0-source.tar.gz.asc Mon Nov 27 
16:00:24 2023
@@ -0,0 +1,8 @@
+-BEGIN PGP SIGNATURE-
+
+iJEEABYKADkWIQS9oZEc9zL8clyUp/4Ip9yRa47wgAUCZWS7xRscZXBocmFpbWFu
+aWVyb2JpQGFwYWNoZS5vcmcACgkQCKfckWuO8IDeDgEA5+WTevglsQrvlz4TkM9G
+zqf/V50yWRaa6LlT3LRRCZkA/AuT5fOlWxU4M65CqON1RqIiuLuoprFL5P3SoD5I
+WIYO
+=koZz
+-END PGP SIGNATURE-

Added: dev/airflow/2.8.0b1/apache-airflow-2.8.0-source.tar.gz.sha512
==
--- dev/airflow/2.8.0b1/apache-airflow-2.8.0-source.tar.gz.sha512 (added)
+++ dev/airflow/2.8.0b1/apache-airflow-2.8.0-source.tar.gz.sha512 Mon Nov 27 
16:00:24 2023
@@ -0,0 +1 @@
+b97370d11e039a0c1b0945f690ce12afccd7aa5a12d09f76d64ae98f9226470c3f8884034574ecbf7104a1c9d5f9f04aefa7b5fa176fc26c6f6f752a76f62d16
  apache-airflow-2.8.0-source.tar.gz

Added: dev/airflow/2.8.0b1/apache-airflow-2.8.0.tar.gz
==
Binary file - no diff available.

Propchange: dev/airflow/2.8.0b1/apache-airflow-2.8.0.tar.gz
--
svn:mime-type = application/octet-stream

Added: dev/airflow/2.8.0b1/apache-airflow-2.8.0.tar.gz.asc
==
--- dev/airflow/2.8.0b1/apache-airflow-2.8.0.tar.gz.asc (added)
+++ dev/airflow/2.8.0b1/apache-airflow-2.8.0.tar.gz.asc Mon Nov 27 16:00:24 2023
@@ -0,0 +1,8 @@
+-BEGIN PGP SIGNATURE-
+
+iJEEABYKADkWIQS9oZEc9zL8clyUp/4Ip9yRa47wgAUCZWS7xhscZXBocmFpbWFu
+aWVyb2JpQGFwYWNoZS5vcmcACgkQCKfckWuO8ICBsQEAjH9unuvx7uFa64SQAbpU
+KiuArlzfx+dGFukRnUGxR88BAOuaSeTS2NZ7h2vvnkK2cCxIAgB86o+/ZY1oID+2
+PcUD
+=T25N
+-END PGP SIGNATURE-

Added: dev/airflow/2.8.0b1/apache-airflow-2.8.0.tar.gz.sha512
==
--- dev/airflow/2.8.0b1/apache-airflow-2.8.0.tar.gz.sha512 (added)
+++ dev/airflow/2.8.0b1/apache-airflow-2.8.0.tar.gz.sha512 Mon Nov 27 16:00:24 
2023
@@ -0,0 +1 @@
+4bc8e7297256a1563fddfd118977ba81146b764e28b101f7556b1171a08dfaf48572e4757eba7d88468a3588b8ed84d026f8720615ae85a714dc7bbe291de911
  apache-airflow-2.8.0.tar.gz

Added: dev/airflow/2.8.0b1/apache_airflow-2.8.0-py3-none-any.whl
==
Binary file - no diff available.

Propchange: dev/airflow/2.8.0b1/apache_airflow-2.8.0-py3-none-any.whl
--
svn:mime-type = application/octet-stream

Added: dev/airflow/2.8.0b1/apache_airflow-2.8.0-py3-none-any.whl.asc
==
--- dev/airflow/2.8.0b1/apache_airflow-2.8.0-py3-none-any.whl.asc (added)
+++ dev/airflow/2.8.0b1/apache_airflow-2.8.0-py3-none-any.whl.asc Mon Nov 27 
16:00:24 2023
@@ -0,0 +1,8 @@
+-BEGIN PGP SIGNATURE-
+
+iJEEABYKADkWIQS9oZEc9zL8clyUp/4Ip9yRa47wgAUCZWS7xxscZXBocmFpbWFu
+aWVyb2JpQGFwYWNoZS5vcmcACgkQCKfckWuO8IADrgEA1HL2idZlhbK9RDW1LUrZ
+3NbL1hPc10V7YB/cC5rxojIA/2tnIm44nfZ4JNTp5mhInFhLRKHjlaSvg624saGu
+SYwO
+=L0e4
+-END PGP SIGNATURE-

Added: dev/airflow/2.8.0b1/apache_airflow-2.8.0-py3-none-any.whl.sha512
==
--- dev/airflow/2.8.0b1/apache_airflow-2.8.0-py3-none-any.whl.sha512 (added)
+++ dev/airflow/2.8.0b1/apache_airflow-2.8.0-py3-none-any.whl.sha512 Mon Nov 27 
16:00:24 2023
@@ -0,0 +1
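
Each *.sha512 file published above holds a hex digest followed by the
artifact name, as the snippets show. A minimal Python verification sketch,
assuming the tarball and its checksum file have been downloaded into the
current directory:

    import hashlib
    from pathlib import Path

    # verify apache-airflow-2.8.0-source.tar.gz against its published digest
    artifact = Path("apache-airflow-2.8.0-source.tar.gz")
    expected = Path(artifact.name + ".sha512").read_text().split()[0]
    actual = hashlib.sha512(artifact.read_bytes()).hexdigest()
    assert actual == expected, "sha512 checksum mismatch"
    print("sha512 OK")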

(airflow) annotated tag 2.8.0b1 updated (c28ba46e13 -> 0557f37a97)

2023-11-27 Thread ephraimanierobi
This is an automated email from the ASF dual-hosted git repository.

ephraimanierobi pushed a change to annotated tag 2.8.0b1
in repository https://gitbox.apache.org/repos/asf/airflow.git


*** WARNING: tag 2.8.0b1 was modified! ***

from c28ba46e13 (commit)
  to 0557f37a97 (tag)
 tagging c28ba46e131c96fcd9804dca28589cc782fe (commit)
 replaces providers-amazon/8.12.0rc1
  by Ephraim Anierobi
  on Mon Nov 27 16:36:51 2023 +0100

- Log -
Apache Airflow 2.8.0b1
-BEGIN PGP SIGNATURE-

iHUEABYKAB0WIQS9oZEc9zL8clyUp/4Ip9yRa47wgAUCZWS3kwAKCRAIp9yRa47w
gG8bAP96AkrviZlZvX6I9SxCdzKC4dkKsvClixAoGqFS5tzZ/QD+Jzg/8PSOQVZD
17lobsEzQu6gYgW5Ju0g3FGwrWFthQw=
=AL3c
-END PGP SIGNATURE-
---


No new revisions were added by this update.

Summary of changes:



(airflow) branch main updated: Add 2.8.0b1 to issue template (#35912)

2023-11-28 Thread ephraimanierobi
This is an automated email from the ASF dual-hosted git repository.

ephraimanierobi pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/airflow.git


The following commit(s) were added to refs/heads/main by this push:
 new 491f0134ac Add 2.8.0b1 to issue template (#35912)
491f0134ac is described below

commit 491f0134ac0522e6829a4817bca1e1c1e5de811e
Author: Ephraim Anierobi 
AuthorDate: Tue Nov 28 10:40:25 2023 +0100

Add 2.8.0b1 to issue template (#35912)
---
 .github/ISSUE_TEMPLATE/airflow_bug_report.yml | 1 +
 1 file changed, 1 insertion(+)

diff --git a/.github/ISSUE_TEMPLATE/airflow_bug_report.yml 
b/.github/ISSUE_TEMPLATE/airflow_bug_report.yml
index 309151c8d7..c7895fe299 100644
--- a/.github/ISSUE_TEMPLATE/airflow_bug_report.yml
+++ b/.github/ISSUE_TEMPLATE/airflow_bug_report.yml
@@ -25,6 +25,7 @@ body:
 the latest release or main to see if the issue is fixed before 
reporting it.
   multiple: false
   options:
+- "2.8.0b1"
 - "2.7.3"
 - "main (development)"
 - "Other Airflow 2 version (please specify below)"



(airflow) branch main updated: Add more ways to connect to weaviate (#35864)

2023-11-28 Thread ephraimanierobi
This is an automated email from the ASF dual-hosted git repository.

ephraimanierobi pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/airflow.git


The following commit(s) were added to refs/heads/main by this push:
 new 2919abe5b3 Add more ways to connect to weaviate (#35864)
2919abe5b3 is described below

commit 2919abe5b3f2d186c896aebbc51acf98d554ef33
Author: Ephraim Anierobi 
AuthorDate: Tue Nov 28 20:30:06 2023 +0100

Add more ways to connect to weaviate (#35864)

* Add more ways to connect to weaviate

There are other options for connecting to Weaviate. This commit adds
these other options and also improves the imports/typing.

* fixup! Add more ways to connect to weaviate

* fixup! fixup! Add more ways to connect to weaviate

* add deprecation

* remove mark as dbtest
---
 airflow/providers/weaviate/hooks/weaviate.py   |  50 ---
 .../connections.rst|  19 +++
 tests/providers/weaviate/hooks/test_weaviate.py| 148 -
 3 files changed, 198 insertions(+), 19 deletions(-)

diff --git a/airflow/providers/weaviate/hooks/weaviate.py 
b/airflow/providers/weaviate/hooks/weaviate.py
index c8b0ed05d4..151aaabea6 100644
--- a/airflow/providers/weaviate/hooks/weaviate.py
+++ b/airflow/providers/weaviate/hooks/weaviate.py
@@ -17,10 +17,13 @@
 
 from __future__ import annotations
 
+import warnings
 from typing import Any
 
-import weaviate
+from weaviate import Client as WeaviateClient
+from weaviate.auth import AuthApiKey, AuthBearerToken, AuthClientCredentials, 
AuthClientPassword
 
+from airflow.exceptions import AirflowProviderDeprecationWarning
 from airflow.hooks.base import BaseHook
 
 
@@ -40,19 +43,19 @@ class WeaviateHook(BaseHook):
 super().__init__(*args, **kwargs)
 self.conn_id = conn_id
 
-@staticmethod
-def get_connection_form_widgets() -> dict[str, Any]:
+@classmethod
+def get_connection_form_widgets(cls) -> dict[str, Any]:
 """Returns connection widgets to add to connection form."""
 from flask_appbuilder.fieldwidgets import BS3PasswordFieldWidget
 from flask_babel import lazy_gettext
 from wtforms import PasswordField
 
 return {
-"token": PasswordField(lazy_gettext("Weaviate API Token"), 
widget=BS3PasswordFieldWidget()),
+"token": PasswordField(lazy_gettext("Weaviate API Key"), 
widget=BS3PasswordFieldWidget()),
 }
 
-@staticmethod
-def get_ui_field_behaviour() -> dict[str, Any]:
+@classmethod
+def get_ui_field_behaviour(cls) -> dict[str, Any]:
 """Returns custom field behaviour."""
 return {
 "hidden_fields": ["port", "schema"],
@@ -62,28 +65,43 @@ class WeaviateHook(BaseHook):
 },
 }
 
-def get_client(self) -> weaviate.Client:
+def get_conn(self) -> WeaviateClient:
 conn = self.get_connection(self.conn_id)
 url = conn.host
 username = conn.login or ""
 password = conn.password or ""
 extras = conn.extra_dejson
-token = extras.pop("token", "")
+access_token = extras.get("access_token", None)
+refresh_token = extras.get("refresh_token", None)
+expires_in = extras.get("expires_in", 60)
+# previously token was used as api_key(backwards compatibility)
+api_key = extras.get("api_key", None) or extras.get("token", None)
+client_secret = extras.get("client_secret", None)
 additional_headers = extras.pop("additional_headers", {})
-scope = conn.extra_dejson.get("oidc_scope", "offline_access")
-
-if token == "" and username != "":
-auth_client_secret = weaviate.AuthClientPassword(
-username=username, password=password, scope=scope
+scope = extras.get("scope", None) or extras.get("oidc_scope", None)
+if api_key:
+auth_client_secret = AuthApiKey(api_key)
+elif access_token:
+auth_client_secret = AuthBearerToken(
+access_token, expires_in=expires_in, 
refresh_token=refresh_token
 )
+elif client_secret:
+auth_client_secret = 
AuthClientCredentials(client_secret=client_secret, scope=scope)
 else:
-auth_client_secret = weaviate.AuthApiKey(token)
+auth_client_secret = AuthClientPassword(username=username, 
password=password, scope=scope)
 
-client = weaviate.Client(
+return WeaviateClient(
 url=url, auth_client_secret=auth_client_secret, 
additional_headers=ad
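
As the hook code above shows, the auth credential is picked from the
connection's extra JSON: ``api_key`` (or the legacy ``token``) first, then
``access_token``, then ``client_secret``, falling back to username/password.
A hypothetical sketch of an API-key style connection (the conn id, host and
key are made-up values):

    from airflow.models.connection import Connection

    conn = Connection(
        conn_id="weaviate_default",
        conn_type="weaviate",
        host="https://my-cluster.example.com",
        extra='{"api_key": "my-secret-key"}',
    )
    # mirrors the lookup order in WeaviateHook.get_conn()
    extras = conn.extra_dejson
    api_key = extras.get("api_key", None) or extras.get("token", None)
    print(api_key)  # -> my-secret-key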

(airflow) branch main updated: Revert "Prevent assignment of non JSON serializable values to DagRun.conf dict (#35096)" (#35959)

2023-11-29 Thread ephraimanierobi
This is an automated email from the ASF dual-hosted git repository.

ephraimanierobi pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/airflow.git


The following commit(s) were added to refs/heads/main by this push:
 new 4a7c7460bf Revert "Prevent assignment of non JSON serializable values 
to DagRun.conf dict (#35096)" (#35959)
4a7c7460bf is described below

commit 4a7c7460bf1734b76497280f5a2adc3e30a7820c
Author: Ephraim Anierobi 
AuthorDate: Wed Nov 29 19:31:43 2023 +0100

Revert "Prevent assignment of non JSON serializable values to DagRun.conf 
dict (#35096)" (#35959)

This reverts commit 84c40a7877e5ea9dbee03b707065cb590f872111.
---
 airflow/models/dagrun.py| 51 ++---
 tests/models/test_dagrun.py | 14 -
 2 files changed, 2 insertions(+), 63 deletions(-)

diff --git a/airflow/models/dagrun.py b/airflow/models/dagrun.py
index b2e70b37a5..b7d9b05e82 100644
--- a/airflow/models/dagrun.py
+++ b/airflow/models/dagrun.py
@@ -18,7 +18,6 @@
 from __future__ import annotations
 
 import itertools
-import json
 import os
 import warnings
 from collections import defaultdict
@@ -98,37 +97,6 @@ class TISchedulingDecision(NamedTuple):
 finished_tis: list[TI]
 
 
-class ConfDict(dict):
-"""Custom dictionary for storing only JSON serializable values."""
-
-def __init__(self, val=None):
-super().__init__(self.is_jsonable(val))
-
-def __setitem__(self, key, value):
-self.is_jsonable({key: value})
-super().__setitem__(key, value)
-
-@staticmethod
-def is_jsonable(conf: dict) -> dict | None:
-"""Prevent setting non-json attributes."""
-try:
-json.dumps(conf)
-except TypeError:
-raise AirflowException("Cannot assign non JSON Serializable value")
-if isinstance(conf, dict):
-return conf
-else:
-raise AirflowException(f"Object of type {type(conf)} must be a 
dict")
-
-@staticmethod
-def dump_check(conf: str) -> str:
-val = json.loads(conf)
-if isinstance(val, dict):
-return conf
-else:
-raise TypeError(f"Object of type {type(val)} must be a dict")
-
-
 def _creator_note(val):
 """Creator the ``note`` association proxy."""
 if isinstance(val, str):
@@ -159,7 +127,7 @@ class DagRun(Base, LoggingMixin):
 creating_job_id = Column(Integer)
 external_trigger = Column(Boolean, default=True)
 run_type = Column(String(50), nullable=False)
-_conf = Column("conf", PickleType)
+conf = Column(PickleType)
 # These two must be either both NULL or both datetime.
 data_interval_start = Column(UtcDateTime)
 data_interval_end = Column(UtcDateTime)
@@ -261,12 +229,7 @@ class DagRun(Base, LoggingMixin):
 self.execution_date = execution_date
 self.start_date = start_date
 self.external_trigger = external_trigger
-
-if isinstance(conf, str):
-self._conf = ConfDict.dump_check(conf)
-else:
-self._conf = ConfDict(conf or {})
-
+self.conf = conf or {}
 if state is not None:
 self.state = state
 if queued_at is NOTSET:
@@ -296,16 +259,6 @@ class DagRun(Base, LoggingMixin):
 )
 return run_id
 
-def get_conf(self):
-return self._conf
-
-def set_conf(self, value):
-self._conf = ConfDict(value)
-
-@declared_attr
-def conf(self):
-return synonym("_conf", descriptor=property(self.get_conf, 
self.set_conf))
-
 @property
 def stats_tags(self) -> dict[str, str]:
 return prune_dict({"dag_id": self.dag_id, "run_type": self.run_type})
diff --git a/tests/models/test_dagrun.py b/tests/models/test_dagrun.py
index cb873b0bc3..5732e0d565 100644
--- a/tests/models/test_dagrun.py
+++ b/tests/models/test_dagrun.py
@@ -2618,17 +2618,3 @@ def test_dag_run_id_config(session, dag_maker, pattern, 
run_id, result):
 else:
 with pytest.raises(AirflowException):
 dag_maker.create_dagrun(run_id=run_id)
-
-
-def test_dagrun_conf():
-dag_run = DagRun(conf={"test": 1234})
-assert dag_run.conf == {"test": 1234}
-
-with pytest.raises(AirflowException) as err:
-dag_run.conf["non_json"] = timezone.utcnow()
-assert str(err.value) == "Cannot assign non JSON Serializable value"
-
-with pytest.raises(AirflowException) as err:
-value = 1
-dag_run.conf = value
-assert str(err.value) == f"Object of type {type(value)} must be a dict"
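
The reverted ConfDict guarded every assignment with a ``json.dumps`` probe;
after this revert, ``conf`` is a plain dict again and non-serializable values
only surface later, at serialization time. The check it performed, as a
standalone sketch:

    import json
    from datetime import datetime, timezone

    conf = {"when": datetime.now(timezone.utc)}
    try:
        json.dumps(conf)  # the probe ConfDict.is_jsonable() used
    except TypeError:
        print("Cannot assign non JSON Serializable value")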



(airflow) branch main updated: Add a cache for weaviate client (#35983)

2023-12-01 Thread ephraimanierobi
This is an automated email from the ASF dual-hosted git repository.

ephraimanierobi pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/airflow.git


The following commit(s) were added to refs/heads/main by this push:
 new 8be03c9937 Add a cache for weaviate client (#35983)
8be03c9937 is described below

commit 8be03c99372cfaf7a86f31464959338f6f9b900f
Author: Ephraim Anierobi 
AuthorDate: Fri Dec 1 15:01:53 2023 +0100

Add a cache for weaviate client (#35983)

* Add a cache for weaviate client

While working on another issue, I realized how often I had to call get_conn.
So instead of deprecating this, we can use it as a cache within the code so we
don't connect every time a method is called.

* change cache to be on _conn
---
 airflow/providers/weaviate/hooks/weaviate.py | 23 +++
 1 file changed, 15 insertions(+), 8 deletions(-)

diff --git a/airflow/providers/weaviate/hooks/weaviate.py 
b/airflow/providers/weaviate/hooks/weaviate.py
index 151aaabea6..66d820bbbe 100644
--- a/airflow/providers/weaviate/hooks/weaviate.py
+++ b/airflow/providers/weaviate/hooks/weaviate.py
@@ -18,6 +18,7 @@
 from __future__ import annotations
 
 import warnings
+from functools import cached_property
 from typing import Any
 
 from weaviate import Client as WeaviateClient
@@ -94,18 +95,24 @@ class WeaviateHook(BaseHook):
 url=url, auth_client_secret=auth_client_secret, 
additional_headers=additional_headers
 )
 
+@cached_property
+def conn(self) -> WeaviateClient:
+"""Returns a Weaviate client."""
+return self.get_conn()
+
 def get_client(self) -> WeaviateClient:
+"""Returns a Weaviate client."""
 # Keeping this for backwards compatibility
 warnings.warn(
 "The `get_client` method has been renamed to `get_conn`",
 AirflowProviderDeprecationWarning,
 stacklevel=2,
 )
-return self.get_conn()
+return self.conn
 
 def test_connection(self) -> tuple[bool, str]:
 try:
-client = self.get_client()
+client = self.conn
 client.schema.get()
 return True, "Connection established!"
 except Exception as e:
@@ -114,7 +121,7 @@ class WeaviateHook(BaseHook):
 
 def create_class(self, class_json: dict[str, Any]) -> None:
 """Create a new class."""
-client = self.get_client()
+client = self.conn
 client.schema.create_class(class_json)
 
 def create_schema(self, schema_json: dict[str, Any]) -> None:
@@ -125,13 +132,13 @@ class WeaviateHook(BaseHook):
 
 :param schema_json: The schema to create
 """
-client = self.get_client()
+client = self.conn
 client.schema.create(schema_json)
 
 def batch_data(
 self, class_name: str, data: list[dict[str, Any]], 
batch_config_params: dict[str, Any] | None = None
 ) -> None:
-client = self.get_client()
+client = self.conn
 if not batch_config_params:
 batch_config_params = {}
 client.batch.configure(**batch_config_params)
@@ -147,7 +154,7 @@ class WeaviateHook(BaseHook):
 
 def delete_class(self, class_name: str) -> None:
 """Delete an existing class."""
-client = self.get_client()
+client = self.conn
 client.schema.delete_class(class_name)
 
 def query_with_vector(
@@ -166,7 +173,7 @@ class WeaviateHook(BaseHook):
 external vectorizer. Weaviate then converts this into a vector through 
the inference API
 (OpenAI in this particular example) and uses that vector as the basis 
for a vector search.
 """
-client = self.get_client()
+client = self.conn
 results: dict[str, dict[Any, Any]] = (
 client.query.get(class_name, properties[0])
 .with_near_vector({"vector": embeddings, "certainty": certainty})
@@ -185,7 +192,7 @@ class WeaviateHook(BaseHook):
 weaviate with a query search_text. Weaviate then converts this into a 
vector through the inference
 API (OpenAI in this particular example) and uses that vector as the 
basis for a vector search.
 """
-client = self.get_client()
+client = self.conn
 results: dict[str, dict[Any, Any]] = (
 client.query.get(class_name, properties[0])
 .with_near_text({"concepts": [search_text]})
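
The change hinges on ``functools.cached_property``: the first access to
``conn`` builds the client and every later access returns the same object.
A minimal standalone sketch of that behavior (a toy class, not the actual
hook):

    from functools import cached_property

    class Demo:
        @cached_property
        def conn(self):
            print("connecting once")
            return object()

    d = Demo()
    first = d.conn   # prints "connecting once"
    second = d.conn  # cached; no new connection is made
    assert first is second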



(airflow) 02/34: improved visibility of tasks in ActionModal for taskInstance (#35810)

2023-12-05 Thread ephraimanierobi
This is an automated email from the ASF dual-hosted git repository.

ephraimanierobi pushed a commit to branch v2-8-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 881d802629174409d1daa440dc5279781a88a112
Author: Aadya <101169283+theaa...@users.noreply.github.com>
AuthorDate: Mon Nov 27 21:01:07 2023 +0530

improved visibility of tasks in ActionModal for taskInstance (#35810)

* change max box height in ActionModal.tsx

* changed max box height in ActionModal.tsx

(cherry picked from commit 9a1dceb031aa0ab44a7c996c267128bd4c61a5bf)
---
 .../www/static/js/dag/details/taskInstance/taskActions/ActionModal.tsx  | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git 
a/airflow/www/static/js/dag/details/taskInstance/taskActions/ActionModal.tsx 
b/airflow/www/static/js/dag/details/taskInstance/taskActions/ActionModal.tsx
index 57128bacd1..e52cd30aae 100644
--- a/airflow/www/static/js/dag/details/taskInstance/taskActions/ActionModal.tsx
+++ b/airflow/www/static/js/dag/details/taskInstance/taskActions/ActionModal.tsx
@@ -108,7 +108,7 @@ const ActionModal = ({
(The JSX of this one-line change was stripped by the mail archiver; per the
commit message it adjusts the Box max-height in ActionModal.tsx so more of
the task list is visible.)



(airflow) 01/34: Run triggers inline with dag test (#34642)

2023-12-05 Thread ephraimanierobi
This is an automated email from the ASF dual-hosted git repository.

ephraimanierobi pushed a commit to branch v2-8-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 28897f7a424f0c44b340a17591d54854aa192dd5
Author: Daniel Standish <15932138+dstand...@users.noreply.github.com>
AuthorDate: Mon Nov 27 06:48:17 2023 -0800

Run triggers inline with dag test (#34642)

No need to have the triggerer running -- triggers will just be run async.

(cherry picked from commit 7b37a785d0b74d1e83c7ce84729febffd6e26821)
---
 airflow/models/dag.py  | 68 +---
 airflow/models/taskinstance.py |  3 ++
 tests/cli/commands/test_dag_command.py | 81 --
 tests/models/test_mappedoperator.py|  2 +-
 4 files changed, 81 insertions(+), 73 deletions(-)

diff --git a/airflow/models/dag.py b/airflow/models/dag.py
index 26c83754a8..27e8258a6d 100644
--- a/airflow/models/dag.py
+++ b/airflow/models/dag.py
@@ -17,7 +17,8 @@
 # under the License.
 from __future__ import annotations
 
-import collections.abc
+import asyncio
+import collections
 import copy
 import functools
 import itertools
@@ -82,11 +83,11 @@ from airflow.datasets.manager import dataset_manager
 from airflow.exceptions import (
 AirflowDagInconsistent,
 AirflowException,
-AirflowSkipException,
 DuplicateTaskIdFound,
 FailStopDagInvalidTriggerRule,
 ParamValidationError,
 RemovedInAirflow3Warning,
+TaskDeferred,
 TaskNotFound,
 )
 from airflow.jobs.job import run_job
@@ -101,7 +102,6 @@ from airflow.models.taskinstance import (
 Context,
 TaskInstance,
 TaskInstanceKey,
-TaskReturnCode,
 clear_task_instances,
 )
 from airflow.secrets.local_filesystem import LocalFilesystemBackend
@@ -285,12 +285,11 @@ def get_dataset_triggered_next_run_info(
 }
 
 
-class _StopDagTest(Exception):
-"""
-Raise when DAG.test should stop immediately.
+def _triggerer_is_healthy():
+from airflow.jobs.triggerer_job_runner import TriggererJobRunner
 
-:meta private:
-"""
+job = TriggererJobRunner.most_recent_job()
+return job and job.is_alive()
 
 
 @functools.total_ordering
@@ -2844,21 +2843,12 @@ class DAG(LoggingMixin):
 if not scheduled_tis and ids_unrunnable:
 self.log.warning("No tasks to run. unrunnable tasks: %s", 
ids_unrunnable)
 time.sleep(1)
+triggerer_running = _triggerer_is_healthy()
 for ti in scheduled_tis:
 try:
 add_logger_if_needed(ti)
 ti.task = tasks[ti.task_id]
-ret = _run_task(ti, session=session)
-if ret is TaskReturnCode.DEFERRED:
-if not _triggerer_is_healthy():
-raise _StopDagTest(
-"Task has deferred but triggerer component is 
not running. "
-"You can start the triggerer by running 
`airflow triggerer` in a terminal."
-)
-except _StopDagTest:
-# Let this exception bubble out and not be swallowed by the
-# except block below.
-raise
+_run_task(ti=ti, inline_trigger=not triggerer_running, 
session=session)
 except Exception:
 self.log.exception("Task failed; ti=%s", ti)
 if conn_file_path or variable_file_path:
@@ -3992,14 +3982,15 @@ class DagContext:
 return None
 
 
-def _triggerer_is_healthy():
-from airflow.jobs.triggerer_job_runner import TriggererJobRunner
+def _run_trigger(trigger):
+async def _run_trigger_main():
+async for event in trigger.run():
+return event
 
-job = TriggererJobRunner.most_recent_job()
-return job and job.is_alive()
+return asyncio.run(_run_trigger_main())
 
 
-def _run_task(ti: TaskInstance, session) -> TaskReturnCode | None:
+def _run_task(*, ti: TaskInstance, inline_trigger: bool = False, session: 
Session):
 """
 Run a single task instance, and push result to Xcom for downstream tasks.
 
@@ -4009,20 +4000,21 @@ def _run_task(ti: TaskInstance, session) -> 
TaskReturnCode | None:
 Args:
 ti: TaskInstance to run
 """
-ret = None
-log.info("*")
-if ti.map_index > 0:
-log.info("Running task %s index %d", ti.task_id, ti.map_index)
-else:
-log.info("Running task %s", ti.task_id)
-try:
-ret = ti._run_raw_task(session=session)
-session.flush()
-log.info("%s ran successfully!", ti.task_id)
-except AirflowSkipException:
-log.info("Task Skipped, continuing&

(airflow) 08/34: Remove workaround for pymssql failing compilation with new Cython (#35924)

2023-12-05 Thread ephraimanierobi
This is an automated email from the ASF dual-hosted git repository.

ephraimanierobi pushed a commit to branch v2-8-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 2fba7bd0b38e6f5e2ec842e24878d217e195101c
Author: Jarek Potiuk 
AuthorDate: Tue Nov 28 21:37:58 2023 +0100

Remove workaround for pymssql failing compilation with new Cython (#35924)

A recent Cython release caused pymssql package failures when the package
was installed on the ARM platform. This had been worked around in
#32748, but since pymssql as of 2.1.8 supports the new Cython, we
can remove the workaround and bump the minimum version of pymssql.

This also makes it possible to remove the whole MSSQL client section
from the image if we decide to, because this section will only install
the ODBC client that has been pre-installed to support MSSQL as a
metadata DB for Airflow core.

(cherry picked from commit 4f060a482c3233504e7905b3ab2d00fe56ea43cd)
---
 Dockerfile  | 42 -
 Dockerfile.ci   | 36 -
 airflow/providers/microsoft/mssql/provider.yaml |  2 +-
 generated/provider_dependencies.json|  2 +-
 scripts/docker/install_mssql.sh | 39 ---
 5 files changed, 2 insertions(+), 119 deletions(-)

diff --git a/Dockerfile b/Dockerfile
index b9b358d0c7..7a7cb89225 100644
--- a/Dockerfile
+++ b/Dockerfile
@@ -347,8 +347,6 @@ COPY <<"EOF" /install_mssql.sh
 #!/usr/bin/env bash
 set -euo pipefail
 
-. "$( dirname "${BASH_SOURCE[0]}" )/common.sh"
-
 : "${AIRFLOW_PIP_VERSION:?Should be set}"
 
 : "${INSTALL_MSSQL_CLIENT:?Should be true or false}"
@@ -384,40 +382,6 @@ function install_mssql_client() {
 rm -rf /var/lib/apt/lists/*
 apt-get autoremove -yqq --purge
 apt-get clean && rm -rf /var/lib/apt/lists/*
-
-# Workaround an issue with installing pymssql on ARM architecture 
triggered by Cython 3.0.0 release as of
-# 18 July 2023. The problem is that pip uses latest Cython to compile 
pymssql and since we are using
-# setuptools, there is no easy way to fix version of Cython used to 
compile packages.
-#
-# This triggers a problem with newer `pip` versions that have build 
isolation enabled by default, because
-# there is no (easy) way to pin build dependencies for dependent packages. 
If a package does not have
-# limit on build dependencies, it will use the latest version of them to 
build that particular package.
-#
-# The workaround to the problem suggested in the last thread by Pradyun 
Gedam - pip maintainer - is to
-# use the PIP_CONSTRAINT environment variable and constrain the version of 
Cython used while installing
-# the package. Which is precisely what we are doing here.
-#
-# Note that it does not work if we pass ``--constraint`` option to pip 
because it will not be passed to
-# the package being build in isolation. The fact that the PIP_CONSTRAINT 
env variable works in the isolation
-# is a bit of side-effect on how env variables work and that they are 
passed to subprocesses as pip
-# launches a subprocess `pip` to build the package.
-#
-# This is a temporary solution until the issue is resolved in pymssql or 
Cython
-# Issues/discussions that track it:
-#
-# * https://github.com/cython/cython/issues/5541
-# * https://github.com/pymssql/pymssql/pull/827
-# * https://discuss.python.org/t/no-way-to-pin-build-dependencies/29833
-#
-# TODO: Remove this workaround when the issue is resolved.
-#   ALSO REMOVE THE TOP LINES ABOVE WITH common.sh IMPORT AS WELL AS 
COPYING common.sh ib
-#   Dockerfile AND Dockerfile.ci (look for capital PYMSSQL - there are 
several places to remove)
-if [[ "${1}" == "dev" ]]; then
-common::install_pip_version
-echo "Cython==0.29.36" >> /tmp/mssql-constraints.txt
-PIP_CONSTRAINT=/tmp/mssql-constraints.txt pip install pymssql
-rm /tmp/mssql-constraints.txt
-fi
 }
 
 install_mssql_client "${@}"
@@ -1272,12 +1236,6 @@ ENV INSTALL_MYSQL_CLIENT=${INSTALL_MYSQL_CLIENT} \
 # scripts which are needed much later will not invalidate the docker layer here
 COPY --from=scripts install_mysql.sh install_mssql.sh install_postgres.sh 
/scripts/docker/
 
-# THE 3 LINES ARE ONLY NEEDED IN ORDER TO MAKE PYMSSQL BUILD WORK WITH LATEST 
CYTHON
-# AND SHOULD BE REMOVED WHEN WORKAROUND IN install_mssql.sh IS REMOVED
-ARG AIRFLOW_PIP_VERSION=23.3.1
-ENV AIRFLOW_PIP_VERSION=${AIRFLOW_PIP_VERSION}
-COPY --from=scripts common.sh /scripts/docker/
-
 RUN bash /scripts/docker/install_mysql.sh dev && \
 bash /scripts/docker/install_mssql.sh dev && \
 bash /scripts/docker/install_postgres.sh dev
diff --git a/Dockerfile.
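
The removed comments above describe the mechanism precisely: pip builds the
package in an isolated subprocess, which inherits environment variables but
not command-line flags, so `PIP_CONSTRAINT` reaches the build environment
while `--constraint` does not. A minimal sketch of that trick (illustrative
only, not part of the patch):

    import os
    import subprocess

    # The nested pip that builds pymssql in isolation inherits PIP_CONSTRAINT,
    # so the Cython pin in the constraints file applies to the build step too.
    env = {**os.environ, "PIP_CONSTRAINT": "/tmp/mssql-constraints.txt"}
    subprocess.run(["pip", "install", "pymssql"], env=env, check=True)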

(airflow) 07/34: Consolidate the call of change_state to fail or success in the core executors (#35901)

2023-12-05 Thread ephraimanierobi
This is an automated email from the ASF dual-hosted git repository.

ephraimanierobi pushed a commit to branch v2-8-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 9798f314ddc23ab129d8e52988eab08251e755c7
Author: Hussein Awala 
AuthorDate: Tue Nov 28 20:14:13 2023 +0200

Consolidate the call of change_state to fail or success in the core 
executors (#35901)

(cherry picked from commit ce7f043e15534c3d9ba6d59c3bb6b851e36a60b9)
---
 airflow/executors/debug_executor.py  | 6 +++---
 airflow/executors/sequential_executor.py | 5 ++---
 tests/executors/test_debug_executor.py   | 4 ++--
 3 files changed, 7 insertions(+), 8 deletions(-)

diff --git a/airflow/executors/debug_executor.py 
b/airflow/executors/debug_executor.py
index b601c2b7c9..bb5f46b1f7 100644
--- a/airflow/executors/debug_executor.py
+++ b/airflow/executors/debug_executor.py
@@ -74,7 +74,7 @@ class DebugExecutor(BaseExecutor):
 elif self._terminated.is_set():
 self.log.info("Executor is terminated! Stopping %s to %s", 
ti.key, TaskInstanceState.FAILED)
 ti.set_state(TaskInstanceState.FAILED)
-self.change_state(ti.key, TaskInstanceState.FAILED)
+self.fail(ti.key)
 else:
 task_succeeded = self._run_task(ti)
 
@@ -84,11 +84,11 @@ class DebugExecutor(BaseExecutor):
 try:
 params = self.tasks_params.pop(ti.key, {})
 ti.run(job_id=ti.job_id, **params)
-self.change_state(key, TaskInstanceState.SUCCESS)
+self.success(key)
 return True
 except Exception as e:
 ti.set_state(TaskInstanceState.FAILED)
-self.change_state(key, TaskInstanceState.FAILED)
+self.fail(key)
 self.log.exception("Failed to execute task: %s.", e)
 return False
 
diff --git a/airflow/executors/sequential_executor.py 
b/airflow/executors/sequential_executor.py
index 8ea3e42dc5..227bf879f3 100644
--- a/airflow/executors/sequential_executor.py
+++ b/airflow/executors/sequential_executor.py
@@ -28,7 +28,6 @@ import subprocess
 from typing import TYPE_CHECKING, Any
 
 from airflow.executors.base_executor import BaseExecutor
-from airflow.utils.state import TaskInstanceState
 
 if TYPE_CHECKING:
 from airflow.executors.base_executor import CommandType
@@ -75,9 +74,9 @@ class SequentialExecutor(BaseExecutor):
 
 try:
 subprocess.check_call(command, close_fds=True)
-self.change_state(key, TaskInstanceState.SUCCESS)
+self.success(key)
 except subprocess.CalledProcessError as e:
-self.change_state(key, TaskInstanceState.FAILED)
+self.fail(key)
 self.log.error("Failed to execute task %s.", e)
 
 self.commands_to_run = []
diff --git a/tests/executors/test_debug_executor.py 
b/tests/executors/test_debug_executor.py
index 03a91f9c92..20ee821842 100644
--- a/tests/executors/test_debug_executor.py
+++ b/tests/executors/test_debug_executor.py
@@ -111,7 +111,7 @@ class TestDebugExecutor:
 assert not executor.tasks_to_run
 change_state_mock.assert_has_calls(
 [
-mock.call(ti1.key, State.FAILED),
+mock.call(ti1.key, State.FAILED, None),
 mock.call(ti2.key, State.UPSTREAM_FAILED),
 ]
 )
@@ -145,6 +145,6 @@ class TestDebugExecutor:
 
 change_state_mock.assert_has_calls(
 [
-mock.call(ti1.key, State.FAILED),
+mock.call(ti1.key, State.FAILED, None),
 ]
 )
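
The `success`/`fail` wrappers the commit switches to are thin conveniences
over `change_state`; a sketch of their assumed shape, inferred from the
updated test expectations above (the trailing `None` is the optional `info`
argument):

    # Hedged sketch of the BaseExecutor convenience methods:
    def success(self, key, info=None):
        self.change_state(key, TaskInstanceState.SUCCESS, info)

    def fail(self, key, info=None):
        self.change_state(key, TaskInstanceState.FAILED, info)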



(airflow) 26/34: Limit pytest-asyncio even more - to <0.23.0 (#36040)

2023-12-05 Thread ephraimanierobi
This is an automated email from the ASF dual-hosted git repository.

ephraimanierobi pushed a commit to branch v2-8-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit be0fb8b11c48568f6cf86558e18cdc21296d6e98
Author: Jarek Potiuk 
AuthorDate: Mon Dec 4 06:17:39 2023 +0100

Limit pytest-asyncio even more - to <0.23.0 (#36040)

It seems that the pytest-asyncio problem was already introduced in
0.23.0. In order to allow tests to pass now, we should limit it
to below that version (follow-up to #36037).

(cherry picked from commit cc2521cf6c01363f0e1c96bbd6ed0231406ecd63)
---
 setup.py | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/setup.py b/setup.py
index ecd7167fb6..b3c5c0b0a5 100644
--- a/setup.py
+++ b/setup.py
@@ -486,7 +486,7 @@ _devel_only_tests = [
 "pytest>=7.1",
 # Pytest-asyncio 0.23.1 breaks our tests. The limitation should be removed 
when the issue is fixed:
 # https://github.com/pytest-dev/pytest-asyncio/issues/703
-"pytest-asyncio<0.23.1",
+"pytest-asyncio<0.23.0",
 "pytest-cov",
 "pytest-httpx",
 "pytest-icdiff",



(airflow) 14/34: Fix airflow db shell needing an extra keypress to exit (#35982)

2023-12-05 Thread ephraimanierobi
This is an automated email from the ASF dual-hosted git repository.

ephraimanierobi pushed a commit to branch v2-8-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit f6cfd45f339c7382cd1e7c77608d477b45e8e16b
Author: Renze Post 
AuthorDate: Thu Nov 30 21:28:10 2023 +0100

Fix airflow db shell needing an extra keypress to exit (#35982)

(cherry picked from commit cbb9c4f8ccadd5fbd01a2f0072343764eab03497)
---
 airflow/utils/process_utils.py | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/airflow/utils/process_utils.py b/airflow/utils/process_utils.py
index 15844d9227..18798e85d0 100644
--- a/airflow/utils/process_utils.py
+++ b/airflow/utils/process_utils.py
@@ -222,7 +222,7 @@ def execute_interactive(cmd: list[str], **kwargs) -> None:
 # ignore SIGINT in the parent process
 signal.signal(signal.SIGINT, signal.SIG_IGN)
 while proc.poll() is None:
-readable_fbs, _, _ = select.select([sys.stdin, primary_fd], 
[], [])
+readable_fbs, _, _ = select.select([sys.stdin, primary_fd], 
[], [], 0)
 if sys.stdin in readable_fbs:
 input_data = os.read(sys.stdin.fileno(), 10240)
 os.write(primary_fd, input_data)
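
The fix works because `select.select` with no timeout blocks until one of the
watched descriptors becomes readable, so the surrounding `while proc.poll() is
None` loop could not observe the child shell exiting until one more keypress
arrived. A timeout of `0` turns the call into a non-blocking poll:

    import select
    import sys

    # Blocks until stdin (or another watched fd) is readable - the old behaviour:
    # readable, _, _ = select.select([sys.stdin], [], [])

    # Returns immediately with whatever is ready right now, letting the loop
    # re-check proc.poll() - the fixed behaviour:
    readable, _, _ = select.select([sys.stdin], [], [], 0)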



(airflow) 19/34: Add support for chicken-egg providers to dockerhub release process (#36002)

2023-12-05 Thread ephraimanierobi
This is an automated email from the ASF dual-hosted git repository.

ephraimanierobi pushed a commit to branch v2-8-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit e1f469bbd315bd08becb828bdc06f2ff9a329416
Author: Jarek Potiuk 
AuthorDate: Fri Dec 1 15:33:18 2023 +0100

Add support for chicken-egg providers to dockerhub release process (#36002)

(cherry picked from commit 8829d1732c7f210c9d2ab0cc20ebfb1861ae9f84)
---
 .github/workflows/release_dockerhub_image.yml  | 17 ++
 Dockerfile |  7 +++
 airflow/providers/MANAGING_PROVIDERS_LIFECYCLE.rst | 58 
 .../commands/release_management_commands.py| 33 
 .../commands/release_management_commands_config.py |  1 +
 .../src/airflow_breeze/utils/common_options.py |  8 +++
 dev/breeze/src/airflow_breeze/utils/versions.py|  6 +++
 ...tput_release-management_release-prod-images.svg | 62 ++
 ...tput_release-management_release-prod-images.txt |  2 +-
 .../docker/install_from_docker_context_files.sh|  7 +++
 10 files changed, 170 insertions(+), 31 deletions(-)

diff --git a/.github/workflows/release_dockerhub_image.yml 
b/.github/workflows/release_dockerhub_image.yml
index 6889539387..3d6d4e065e 100644
--- a/.github/workflows/release_dockerhub_image.yml
+++ b/.github/workflows/release_dockerhub_image.yml
@@ -35,6 +35,7 @@ concurrency:
   cancel-in-progress: true
 env:
   GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
+  VERBOSE: true
 jobs:
   build-info:
 timeout-minutes: 10
@@ -46,6 +47,7 @@ jobs:
   pythonVersions: ${{ steps.selective-checks.outputs.python-versions }}
   allPythonVersions: ${{ 
steps.selective-checks.outputs.all-python-versions }}
   defaultPythonVersion: ${{ 
steps.selective-checks.outputs.default-python-version }}
+  chicken-egg-providers: ${{ 
steps.selective-checks.outputs.chicken-egg-providers }}
   skipLatest: ${{ github.event.inputs.skipLatest == '' && ' ' || 
'--skip-latest' }}
   limitPlatform: ${{ github.repository == 'apache/airflow' && ' ' || 
'--limit-platform linux/amd64' }}
 env:
@@ -107,6 +109,20 @@ jobs:
 run: >
   echo ${{ secrets.DOCKERHUB_TOKEN }} |
   docker login --password-stdin --username ${{ secrets.DOCKERHUB_USER 
}}
+  - name: "Prepare chicken-eggs provider packages"
+# In case of provider packages which use latest dev0 version of 
providers, we should prepare them
+# from the source code, not from the PyPI because they have 
apache-airflow>=X.Y.Z dependency
+# And when we prepare them from sources they will have 
apache-airflow>=X.Y.Z.dev0
+shell: bash
+run: >
+  breeze release-management prepare-provider-packages
+  --package-format wheel
+  --version-suffix-for-pypi dev0 ${{ 
needs.build-info.outputs.chicken-egg-providers }}
+if: needs.build-info.outputs.chicken-egg-providers != ''
+  - name: "Copy dist packages to docker-context files"
+shell: bash
+run: cp -v --no-preserve=mode,ownership ./dist/*.whl 
./docker-context-files
+if: needs.build-info.outputs.chicken-egg-providers != ''
   - name: >
   Release regular images: ${{ github.event.inputs.airflowVersion }}, 
${{ matrix.python-version }}
 run: >
@@ -116,6 +132,7 @@ jobs:
   ${{ needs.build-info.outputs.skipLatest }}
   ${{ needs.build-info.outputs.limitPlatform }}
   --limit-python ${{ matrix.python-version }}
+  --chicken-egg-providers "${{ 
needs.build-info.outputs.chicken-egg-providers }}"
 env:
   COMMIT_SHA: ${{ github.sha }}
   - name: >
diff --git a/Dockerfile b/Dockerfile
index 7a7cb89225..61b95f7d7c 100644
--- a/Dockerfile
+++ b/Dockerfile
@@ -627,6 +627,13 @@ function 
install_airflow_and_providers_from_docker_context_files(){
 
reinstalling_apache_airflow_package="apache-airflow[${AIRFLOW_EXTRAS}]==$ver"
 fi
 
+if [[ -z "${reinstalling_apache_airflow_package}" && ${AIRFLOW_VERSION=} 
!= "" ]]; then
+# When we install only provider packages from docker-context files, we 
need to still
+# install airflow from PyPI when AIRFLOW_VERSION is set. This handles 
the case where
+# pre-release dockerhub image of airflow is built, but we want to 
install some providers from
+# docker-context files
+
reinstalling_apache_airflow_package="apache-airflow[${AIRFLOW_EXTRAS}]==${AIRFLOW_VERSION}"
+fi
 # Find Apache Airflow packages in docker-context files
 local reinstalling_apache_airflow_providers_packages
 reinstalling_apache_airflow_providers_packages=$(ls \
diff --git a/airflow/providers/MANAGING_PROVIDERS_LIFECYCLE.rst 
b/airfl
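
The versioning rule behind the "chicken-egg" problem: under PEP 440 a `.dev0`
build sorts below its release, so a provider published to PyPI with an
`apache-airflow>=X.Y.Z` pin cannot be installed on an `X.Y.Z.dev0` image,
while the same provider rebuilt from sources carries an
`apache-airflow>=X.Y.Z.dev0` pin that can. Illustrated with the `packaging`
library (not part of the patch):

    from packaging.version import Version

    # Dev builds precede the release they lead up to:
    assert Version("2.8.0.dev0") < Version("2.8.0")
    # Hence apache-airflow>=2.8.0 is unsatisfiable on a 2.8.0.dev0 image,
    # while a rebuilt provider pinned to >=2.8.0.dev0 installs fine.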

(airflow) 15/34: Bump FAB to 4.3.10 (#35991)

2023-12-05 Thread ephraimanierobi
This is an automated email from the ASF dual-hosted git repository.

ephraimanierobi pushed a commit to branch v2-8-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 9ea67c8749fe813b97898e1f1b7859c96160e32e
Author: Jarek Potiuk 
AuthorDate: Thu Nov 30 23:48:27 2023 +0100

Bump FAB to 4.3.10 (#35991)

(cherry picked from commit 9bcee9d439ada9104e702e090b20d5d1eeafa035)
---
 airflow/auth/managers/fab/security_manager/override.py | 2 +-
 setup.cfg  | 6 --
 setup.py   | 2 +-
 3 files changed, 6 insertions(+), 4 deletions(-)

diff --git a/airflow/auth/managers/fab/security_manager/override.py 
b/airflow/auth/managers/fab/security_manager/override.py
index 2dc023f1da..3814036bb5 100644
--- a/airflow/auth/managers/fab/security_manager/override.py
+++ b/airflow/auth/managers/fab/security_manager/override.py
@@ -2149,7 +2149,7 @@ class 
FabAirflowSecurityManagerOverride(AirflowSecurityManagerV2):
 log.debug("User info from Azure: %s", me)
 # 
https://learn.microsoft.com/en-us/azure/active-directory/develop/id-token-claims-reference#payload-claims
 return {
-"email": me["email"],
+"email": me["upn"] if "upn" in me else me["email"],
 "first_name": me.get("given_name", ""),
 "last_name": me.get("family_name", ""),
 "username": me["oid"],
diff --git a/setup.cfg b/setup.cfg
index 4f7144ca5d..47bcddf04c 100644
--- a/setup.cfg
+++ b/setup.cfg
@@ -101,7 +101,7 @@ install_requires =
 # `airflow/www/fab_security` with their upstream counterparts. In 
particular, make sure any breaking changes,
 # for example any new methods, are accounted for.
 # NOTE! When you change the value here, you also have to update 
flask-appbuilder[oauth] in setup.py
-flask-appbuilder==4.3.9
+flask-appbuilder==4.3.10
 flask-caching>=1.5.0
 flask-login>=0.6.2
 flask-session>=0.4.0
@@ -160,7 +160,9 @@ install_requires =
 # We should also remove "licenses/LICENSE-unicodecsv.txt" file when we 
remove this dependency
 unicodecsv>=0.14.1
 universal_pathlib>=0.1.4
-werkzeug>=2.0
+# Werkzug 3 breaks Flask-Login 0.6.2
+# we should remove this limitation when FAB supports Flask 2.3
+werkzeug>=2.0,<3
 
 [options.packages.find]
 include =
diff --git a/setup.py b/setup.py
index bc624e21fa..9d71ea6194 100644
--- a/setup.py
+++ b/setup.py
@@ -328,7 +328,7 @@ doc_gen = [
 flask_appbuilder_oauth = [
 "authlib>=1.0.0",
 # The version here should be upgraded at the same time as flask-appbuilder 
in setup.cfg
-"flask-appbuilder[oauth]==4.3.9",
+"flask-appbuilder[oauth]==4.3.10",
 ]
 kerberos = [
 "pykerberos>=1.1.13",



(airflow) 24/34: 34058: Fix UI Grid error when DAG has been removed. (#36028)

2023-12-05 Thread ephraimanierobi
This is an automated email from the ASF dual-hosted git repository.

ephraimanierobi pushed a commit to branch v2-8-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit d26510099537b7f2f99eae32ef1fc5f41dc195e0
Author: Aleksey Kirilishin <54231417+avkirilis...@users.noreply.github.com>
AuthorDate: Sun Dec 3 05:11:15 2023 +0400

34058: Fix UI Grid error when DAG has been removed. (#36028)

(cherry picked from commit 549fac30eeefaa449df9bfdf58eb40a008e9fe75)
---
 airflow/www/views.py | 3 +++
 1 file changed, 3 insertions(+)

diff --git a/airflow/www/views.py b/airflow/www/views.py
index c4230c1900..f51fbb9e79 100644
--- a/airflow/www/views.py
+++ b/airflow/www/views.py
@@ -3013,6 +3013,9 @@ class Airflow(AirflowBaseView):
 def graph(self, dag_id: str, session: Session = NEW_SESSION):
 """Redirect to the replacement - grid + graph. Kept for backwards 
compatibility."""
 dag = get_airflow_app().dag_bag.get_dag(dag_id, session=session)
+if not dag:
+flash(f'DAG "{dag_id}" seems to be missing from DagBag.', "error")
+return redirect(url_for("Airflow.index"))
 dt_nr_dr_data = get_date_time_num_runs_dag_runs_form_data(request, 
session, dag)
 dttm = dt_nr_dr_data["dttm"]
 dag_run = dag.get_dagrun(execution_date=dttm)



(airflow) 10/34: Revert "Prevent assignment of non JSON serializable values to DagRun.conf dict (#35096)" (#35959)

2023-12-05 Thread ephraimanierobi
This is an automated email from the ASF dual-hosted git repository.

ephraimanierobi pushed a commit to branch v2-8-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 7e9b6a4f68b5d2d6989ec7275e96e7ea46a26f09
Author: Ephraim Anierobi 
AuthorDate: Wed Nov 29 19:31:43 2023 +0100

Revert "Prevent assignment of non JSON serializable values to DagRun.conf 
dict (#35096)" (#35959)

This reverts commit 84c40a7877e5ea9dbee03b707065cb590f872111.

(cherry picked from commit 4a7c7460bf1734b76497280f5a2adc3e30a7820c)
---
 airflow/models/dagrun.py| 51 ++---
 tests/models/test_dagrun.py | 14 -
 2 files changed, 2 insertions(+), 63 deletions(-)

diff --git a/airflow/models/dagrun.py b/airflow/models/dagrun.py
index b2e70b37a5..b7d9b05e82 100644
--- a/airflow/models/dagrun.py
+++ b/airflow/models/dagrun.py
@@ -18,7 +18,6 @@
 from __future__ import annotations
 
 import itertools
-import json
 import os
 import warnings
 from collections import defaultdict
@@ -98,37 +97,6 @@ class TISchedulingDecision(NamedTuple):
 finished_tis: list[TI]
 
 
-class ConfDict(dict):
-"""Custom dictionary for storing only JSON serializable values."""
-
-def __init__(self, val=None):
-super().__init__(self.is_jsonable(val))
-
-def __setitem__(self, key, value):
-self.is_jsonable({key: value})
-super().__setitem__(key, value)
-
-@staticmethod
-def is_jsonable(conf: dict) -> dict | None:
-"""Prevent setting non-json attributes."""
-try:
-json.dumps(conf)
-except TypeError:
-raise AirflowException("Cannot assign non JSON Serializable value")
-if isinstance(conf, dict):
-return conf
-else:
-raise AirflowException(f"Object of type {type(conf)} must be a 
dict")
-
-@staticmethod
-def dump_check(conf: str) -> str:
-val = json.loads(conf)
-if isinstance(val, dict):
-return conf
-else:
-raise TypeError(f"Object of type {type(val)} must be a dict")
-
-
 def _creator_note(val):
 """Creator the ``note`` association proxy."""
 if isinstance(val, str):
@@ -159,7 +127,7 @@ class DagRun(Base, LoggingMixin):
 creating_job_id = Column(Integer)
 external_trigger = Column(Boolean, default=True)
 run_type = Column(String(50), nullable=False)
-_conf = Column("conf", PickleType)
+conf = Column(PickleType)
 # These two must be either both NULL or both datetime.
 data_interval_start = Column(UtcDateTime)
 data_interval_end = Column(UtcDateTime)
@@ -261,12 +229,7 @@ class DagRun(Base, LoggingMixin):
 self.execution_date = execution_date
 self.start_date = start_date
 self.external_trigger = external_trigger
-
-if isinstance(conf, str):
-self._conf = ConfDict.dump_check(conf)
-else:
-self._conf = ConfDict(conf or {})
-
+self.conf = conf or {}
 if state is not None:
 self.state = state
 if queued_at is NOTSET:
@@ -296,16 +259,6 @@ class DagRun(Base, LoggingMixin):
 )
 return run_id
 
-def get_conf(self):
-return self._conf
-
-def set_conf(self, value):
-self._conf = ConfDict(value)
-
-@declared_attr
-def conf(self):
-return synonym("_conf", descriptor=property(self.get_conf, 
self.set_conf))
-
 @property
 def stats_tags(self) -> dict[str, str]:
 return prune_dict({"dag_id": self.dag_id, "run_type": self.run_type})
diff --git a/tests/models/test_dagrun.py b/tests/models/test_dagrun.py
index cb873b0bc3..5732e0d565 100644
--- a/tests/models/test_dagrun.py
+++ b/tests/models/test_dagrun.py
@@ -2618,17 +2618,3 @@ def test_dag_run_id_config(session, dag_maker, pattern, 
run_id, result):
 else:
 with pytest.raises(AirflowException):
 dag_maker.create_dagrun(run_id=run_id)
-
-
-def test_dagrun_conf():
-dag_run = DagRun(conf={"test": 1234})
-assert dag_run.conf == {"test": 1234}
-
-with pytest.raises(AirflowException) as err:
-dag_run.conf["non_json"] = timezone.utcnow()
-assert str(err.value) == "Cannot assign non JSON Serializable value"
-
-with pytest.raises(AirflowException) as err:
-value = 1
-dag_run.conf = value
-assert str(err.value) == f"Object of type {type(value)} must be a dict"
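
Net effect of the revert, sketched: `DagRun.conf` is a plain `PickleType`
column again, so assigned values only need to be picklable rather than
JSON-serializable (illustrative):

    from datetime import datetime

    from airflow.models.dagrun import DagRun

    # Accepted again after the revert: datetime is picklable, and no
    # JSON-serializability check runs on assignment.
    run = DagRun(conf={"now": datetime.utcnow()})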



(airflow) 18/34: Switch "latest" image to point to newest supported Python version (#36003)

2023-12-05 Thread ephraimanierobi
This is an automated email from the ASF dual-hosted git repository.

ephraimanierobi pushed a commit to branch v2-8-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 72d610a35472806c4c330a7c77d2c02aa2d9272c
Author: Jarek Potiuk 
AuthorDate: Fri Dec 1 14:08:59 2023 +0100

Switch "latest" image to point to newest supported Python version (#36003)

Following the lazy consensus to change the "latest" image
to point to the "newest" Python version, we are changing the release
method to follow it.

https://lists.apache.org/thread/0oxnvct24xlqsj76z42w2ttw2d043oy3
(cherry picked from commit 4117f1b013323d851613ba7b69b3d987cf213ead)
---
 .../commands/release_management_commands.py| 26 +++---
 docs/docker-stack/changelog.rst|  8 +++
 2 files changed, 21 insertions(+), 13 deletions(-)

diff --git 
a/dev/breeze/src/airflow_breeze/commands/release_management_commands.py 
b/dev/breeze/src/airflow_breeze/commands/release_management_commands.py
index 5aa1f65bf1..42a55111ea 100644
--- a/dev/breeze/src/airflow_breeze/commands/release_management_commands.py
+++ b/dev/breeze/src/airflow_breeze/commands/release_management_commands.py
@@ -38,6 +38,7 @@ from airflow_breeze.commands.release_management_group import 
release_management
 from airflow_breeze.global_constants import (
 ALLOWED_DEBIAN_VERSIONS,
 ALLOWED_PLATFORMS,
+ALLOWED_PYTHON_MAJOR_MINOR_VERSIONS,
 APACHE_AIRFLOW_GITHUB_REPOSITORY,
 CURRENT_PYTHON_MAJOR_MINOR_VERSIONS,
 DEFAULT_PYTHON_MAJOR_MINOR_VERSION,
@@ -1368,19 +1369,18 @@ def release_prod_images(
 f"{dockerhub_repo}:{airflow_version}-python{python}",
 f"{dockerhub_repo}:latest-python{python}",
 )
-if python == DEFAULT_PYTHON_MAJOR_MINOR_VERSION:
-# only tag latest  "default" image when we build default python 
version
-# otherwise if the non-default images complete before the default 
one, their jobs will fail
-if slim_images:
-alias_image(
-f"{dockerhub_repo}:slim-{airflow_version}",
-f"{dockerhub_repo}:slim-latest",
-)
-else:
-alias_image(
-f"{dockerhub_repo}:{airflow_version}",
-f"{dockerhub_repo}:latest",
-)
+if python == ALLOWED_PYTHON_MAJOR_MINOR_VERSIONS[-1]:
+# only tag latest "default" image when we build the latest 
allowed python version
+if slim_images:
+alias_image(
+f"{dockerhub_repo}:slim-{airflow_version}",
+f"{dockerhub_repo}:slim-latest",
+)
+else:
+alias_image(
+f"{dockerhub_repo}:{airflow_version}",
+f"{dockerhub_repo}:latest",
+)
 
 
 def is_package_in_dist(dist_files: list[str], package: str) -> bool:
diff --git a/docs/docker-stack/changelog.rst b/docs/docker-stack/changelog.rst
index ad71d49221..6e77799779 100644
--- a/docs/docker-stack/changelog.rst
+++ b/docs/docker-stack/changelog.rst
@@ -47,6 +47,14 @@ Airflow 2.8
  working with ``Debian Bookworm``. While all reference images of Airflow 
2.8.0 are built on ``Debian Bookworm``,
  it is still possible to build deprecated custom ``Debian Bullseye`` based 
image in 2.8.0 following the
 
+   * The "latest" image (i.e. default Airflow image when ``apache/airflow`` is 
used or
+ ``apache/airflow:slim-latest``) uses now the newest supported Python 
version. Previously it was using
+ the "default" Python version which was Python 3.8 as of Airflow 2.7. With 
Airflow reference images
+ released for Airflow 2.8.0, the images are going to use Python 3.11 as 
this is the latest supported
+ version for Airflow 2.8 line. Users can use Python 3.8 by using 
``apache/airflow:2.8.0-python3.8`` and
+ ``apache/airflow:slim-2.8.0-python-3.8`` images respectively so while the 
change is potentially
+ breaking, it is very easy to switch to the previous behaviour.
+
 
 Airflow 2.7
 ~~~



(airflow) 30/34: Update supported-versions.rst (#36058)

2023-12-05 Thread ephraimanierobi
This is an automated email from the ASF dual-hosted git repository.

ephraimanierobi pushed a commit to branch v2-8-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 44302838fd1422bf7f8ff3ca3e7484c34b7cfd0f
Author: andar9 
AuthorDate: Mon Dec 4 12:15:17 2023 -0800

Update supported-versions.rst (#36058)

I think the date 14.19.2023 is wrong--should it be 14.09.2023 (= September 
14, 2023)?

(cherry picked from commit 55d81378b0fa7488bfdca68a4693821109c8fd1e)
---
 docs/apache-airflow/installation/supported-versions.rst | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/docs/apache-airflow/installation/supported-versions.rst 
b/docs/apache-airflow/installation/supported-versions.rst
index c3c1da4f47..19c24074bb 100644
--- a/docs/apache-airflow/installation/supported-versions.rst
+++ b/docs/apache-airflow/installation/supported-versions.rst
@@ -61,7 +61,7 @@ They are based on the official release schedule of Python and 
Kubernetes, nicely
 2. The "oldest" supported version of Python/Kubernetes is the default one. 
"Default" is only meaningful
in terms of "smoke tests" in CI PRs which are run using this default 
version and default reference
image available in DockerHub. Currently the ``apache/airflow:latest`` and 
``apache/airflow:2.5.2`` images
-   are Python 3.8 images, however, in the first MINOR/MAJOR release of Airflow 
released after 14.19.2023,
+   are Python 3.8 images, however, in the first MINOR/MAJOR release of Airflow 
released after 14.09.2023,
they will become Python 3.9 images.
 
 3. We support a new version of Python/Kubernetes in main after they are 
officially released, as soon as we



(airflow) 03/34: Use ExitStack to manage mutation of secrets_backend_list in dag.test (#34620)

2023-12-05 Thread ephraimanierobi
This is an automated email from the ASF dual-hosted git repository.

ephraimanierobi pushed a commit to branch v2-8-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit ecbc95981213ce8178af249a891e2b3065e3f028
Author: Daniel Standish <15932138+dstand...@users.noreply.github.com>
AuthorDate: Mon Nov 27 11:07:10 2023 -0800

Use ExitStack to manage mutation of secrets_backend_list in dag.test 
(#34620)

Although it requires another indent, it's cleaner, and more importantly it 
makes sure that the mutation is undone after failure.

(cherry picked from commit 99b4eb769d2a3b6692de9c0d83ba64041abf5789)
---
 airflow/models/dag.py | 103 +-
 1 file changed, 52 insertions(+), 51 deletions(-)

diff --git a/airflow/models/dag.py b/airflow/models/dag.py
index 27e8258a6d..5daa7bb805 100644
--- a/airflow/models/dag.py
+++ b/airflow/models/dag.py
@@ -32,6 +32,7 @@ import traceback
 import warnings
 import weakref
 from collections import deque
+from contextlib import ExitStack
 from datetime import datetime, timedelta
 from inspect import signature
 from typing import (
@@ -2797,63 +2798,63 @@ class DAG(LoggingMixin):
 self.log.debug("Adding Streamhandler to taskinstance %s", 
ti.task_id)
 ti.log.addHandler(handler)
 
+exit_stack = ExitStack()
 if conn_file_path or variable_file_path:
 local_secrets = LocalFilesystemBackend(
 variables_file_path=variable_file_path, 
connections_file_path=conn_file_path
 )
 secrets_backend_list.insert(0, local_secrets)
+exit_stack.callback(lambda: secrets_backend_list.pop(0))
+
+with exit_stack:
+execution_date = execution_date or timezone.utcnow()
+self.validate()
+self.log.debug("Clearing existing task instances for execution 
date %s", execution_date)
+self.clear(
+start_date=execution_date,
+end_date=execution_date,
+dag_run_state=False,  # type: ignore
+session=session,
+)
+self.log.debug("Getting dagrun for dag %s", self.dag_id)
+logical_date = timezone.coerce_datetime(execution_date)
+data_interval = 
self.timetable.infer_manual_data_interval(run_after=logical_date)
+dr: DagRun = _get_or_create_dagrun(
+dag=self,
+start_date=execution_date,
+execution_date=execution_date,
+run_id=DagRun.generate_run_id(DagRunType.MANUAL, 
execution_date),
+session=session,
+conf=run_conf,
+data_interval=data_interval,
+)
 
-execution_date = execution_date or timezone.utcnow()
-self.validate()
-self.log.debug("Clearing existing task instances for execution date 
%s", execution_date)
-self.clear(
-start_date=execution_date,
-end_date=execution_date,
-dag_run_state=False,  # type: ignore
-session=session,
-)
-self.log.debug("Getting dagrun for dag %s", self.dag_id)
-logical_date = timezone.coerce_datetime(execution_date)
-data_interval = 
self.timetable.infer_manual_data_interval(run_after=logical_date)
-dr: DagRun = _get_or_create_dagrun(
-dag=self,
-start_date=execution_date,
-execution_date=execution_date,
-run_id=DagRun.generate_run_id(DagRunType.MANUAL, execution_date),
-session=session,
-conf=run_conf,
-data_interval=data_interval,
-)
-
-tasks = self.task_dict
-self.log.debug("starting dagrun")
-# Instead of starting a scheduler, we run the minimal loop possible to 
check
-# for task readiness and dependency management. This is notably faster
-# than creating a BackfillJob and allows us to surface logs to the user
-while dr.state == DagRunState.RUNNING:
-session.expire_all()
-schedulable_tis, _ = dr.update_state(session=session)
-for s in schedulable_tis:
-s.state = TaskInstanceState.SCHEDULED
-session.commit()
-# triggerer may mark tasks scheduled so we read from DB
-all_tis = set(dr.get_task_instances(session=session))
-scheduled_tis = {x for x in all_tis if x.state == 
TaskInstanceState.SCHEDULED}
-ids_unrunnable = {x for x in all_tis if x.state not in 
State.finished} - scheduled_tis
-if not scheduled_tis and ids_unrunnable:
-self.log.warning("No tasks to run. unrunnable tasks: %s", 
ids_unrunnable)
-time.sleep(1)
-triggerer_running = _triggerer_is_healthy()
-for ti in 
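
The `ExitStack` pattern above registers the undo action right where the
mutation happens, so the cleanup runs whether the `with` body returns or
raises. A self-contained sketch of the same idiom (names here are
illustrative):

    from contextlib import ExitStack

    backends = ["env"]                        # stand-in for secrets_backend_list
    stack = ExitStack()
    backends.insert(0, "local")               # the mutation...
    stack.callback(lambda: backends.pop(0))   # ...and its undo, registered up front
    with stack:
        pass                                  # undo fires on exit, even on exceptions
    assert backends == ["env"]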

(airflow) 16/34: Add feature to build "chicken-egg" packages from sources (#35890)

2023-12-05 Thread ephraimanierobi
This is an automated email from the ASF dual-hosted git repository.

ephraimanierobi pushed a commit to branch v2-8-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 8a0252d419cd2ae94557cbdc026513804698a894
Author: Jarek Potiuk 
AuthorDate: Fri Dec 1 01:09:48 2023 +0100

Add feature to build "chicken-egg" packages from sources (#35890)

When we build a pre-release version of a provider on CI, and we want
to include packages that require >= CURRENTLY_RELEASED_VERSION,
we have to make sure that those packages are built from sources
while building the PROD image. Otherwise they will not be installable
on CURRENT_VERSION.dev0, CURRENT_VERSION.rc*, etc.

Until we "actually" release a provider, we should have a way to
build such a provider from sources.

This is the CI version of it; once we have it working, we can also
apply it to the workflow that releases images to DockerHub.

(cherry picked from commit cf052dc64f00e851427a41a34ffe576fd39be51b)
---
 .github/actions/build-prod-images/action.yml   | 12 +
 .github/workflows/build-images.yml |  2 +
 .github/workflows/ci.yml   | 15 ++
 airflow/providers/amazon/provider.yaml |  2 +-
 airflow/providers/google/provider.yaml |  2 +-
 airflow/providers/microsoft/azure/provider.yaml|  2 +-
 .../commands/release_management_commands.py| 25 +++---
 .../commands/release_management_commands_config.py |  1 +
 .../src/airflow_breeze/params/shell_params.py  |  2 +
 dev/breeze/src/airflow_breeze/utils/packages.py|  4 --
 .../src/airflow_breeze/utils/selective_checks.py   |  5 ++
 docs/apache-airflow-providers-google/index.rst |  2 +-
 .../index.rst  |  2 +-
 generated/provider_dependencies.json   |  4 +-
 ...put_release-management_generate-constraints.svg | 58 ++
 ...put_release-management_generate-constraints.txt |  2 +-
 scripts/in_container/_in_container_utils.sh| 23 +++--
 setup.cfg  |  2 +-
 setup.py   |  2 +-
 19 files changed, 120 insertions(+), 47 deletions(-)

diff --git a/.github/actions/build-prod-images/action.yml 
b/.github/actions/build-prod-images/action.yml
index 5fdbb795c4..f038234087 100644
--- a/.github/actions/build-prod-images/action.yml
+++ b/.github/actions/build-prod-images/action.yml
@@ -22,6 +22,9 @@ inputs:
   build-provider-packages:
 description: 'Whether to build provider packages from sources'
 required: true
+  chicken-egg-providers:
+description: 'List of chicken-egg provider packages to build from sources'
+required: true
 runs:
   using: "composite"
   steps:
@@ -41,6 +44,15 @@ runs:
 --package-list-file ./airflow/providers/installed_providers.txt
 --package-format wheel --version-suffix-for-pypi dev0
   if: ${{ inputs.build-provider-packages == 'true' }}
+- name: "Prepare chicken-eggs provider packages"
+  # In case of provider packages which use latest dev0 version of 
providers, we should prepare them
+  # from the source code, not from the PyPI because they have 
apache-airflow>=X.Y.Z dependency
+  # And when we prepare them from sources they will have 
apache-airflow>=X.Y.Z.dev0
+  shell: bash
+  run: >
+breeze release-management prepare-provider-packages
+--package-format wheel --version-suffix-for-pypi dev0 ${{ 
inputs.chicken-egg-providers }}
+  if: ${{ inputs.build-provider-packages != 'true' && 
inputs.chicken-egg-providers != '' }}
 - name: "Prepare airflow package"
   shell: bash
   run: >
diff --git a/.github/workflows/build-images.yml 
b/.github/workflows/build-images.yml
index b29d49de17..82cc6e5987 100644
--- a/.github/workflows/build-images.yml
+++ b/.github/workflows/build-images.yml
@@ -74,6 +74,7 @@ jobs:
   is-arm-runner: ${{ steps.selective-checks.outputs.is-arm-runner }}
   is-vm-runner: ${{ steps.selective-checks.outputs.is-vm-runner }}
   is-k8s-runner: ${{ steps.selective-checks.outputs.is-k8s-runner }}
+  chicken-egg-providers: ${{ 
steps.selective-checks.outputs.chicken-egg-providers }}
   target-commit-sha: 
"${{steps.discover-pr-merge-commit.outputs.target-commit-sha ||
   github.event.pull_request.head.sha ||
   github.sha
@@ -293,6 +294,7 @@ jobs:
 uses: ./.github/actions/build-prod-images
 with:
   build-provider-packages: ${{ needs.build-info.outputs.default-branch 
== 'main' }}
+  chicken-egg-providers: ${{ 
needs.build-info.outputs.chicken-egg-providers }}
 env:
   UPGRADE_TO_NEWER_DEPENDENCIES: ${{ 
needs.build-info.outputs.upgrade-to-newer-dependencies }}
 

(airflow) 11/34: Rename `Connection.to_json_dict` to `Connection.to_dict` (#35894)

2023-12-05 Thread ephraimanierobi
This is an automated email from the ASF dual-hosted git repository.

ephraimanierobi pushed a commit to branch v2-8-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 90f10b199d9b689b532f6b9228b79564bd346cbe
Author: Andrey Anshin 
AuthorDate: Thu Nov 30 10:43:13 2023 +0400

Rename `Connection.to_json_dict` to `Connection.to_dict` (#35894)

(cherry picked from commit 7594b7a8eecd216dbaf31fcbd958ba22a97d6709)
---
 airflow/models/connection.py| 11 ---
 airflow/serialization/serialized_objects.py |  2 +-
 2 files changed, 5 insertions(+), 8 deletions(-)

diff --git a/airflow/models/connection.py b/airflow/models/connection.py
index 1e835b4673..4e8e3c7aaf 100644
--- a/airflow/models/connection.py
+++ b/airflow/models/connection.py
@@ -478,10 +478,7 @@ class Connection(Base, LoggingMixin):
 
 raise AirflowNotFoundException(f"The conn_id `{conn_id}` isn't 
defined")
 
-def to_dict(self) -> dict[str, Any]:
-return {"conn_id": self.conn_id, "description": self.description, 
"uri": self.get_uri()}
-
-def to_json_dict(self, *, prune_empty: bool = False, validate: bool = 
True) -> dict[str, Any]:
+def to_dict(self, *, prune_empty: bool = False, validate: bool = True) -> 
dict[str, Any]:
 """
 Convert Connection to json-serializable dictionary.
 
@@ -528,6 +525,6 @@ class Connection(Base, LoggingMixin):
 
 def as_json(self) -> str:
 """Convert Connection to JSON-string object."""
-conn = self.to_json_dict(prune_empty=True, validate=False)
-conn.pop("conn_id", None)
-return json.dumps(conn)
+conn_repr = self.to_dict(prune_empty=True, validate=False)
+conn_repr.pop("conn_id", None)
+return json.dumps(conn_repr)
diff --git a/airflow/serialization/serialized_objects.py 
b/airflow/serialization/serialized_objects.py
index c40d4703ee..6f0e88cae2 100644
--- a/airflow/serialization/serialized_objects.py
+++ b/airflow/serialization/serialized_objects.py
@@ -498,7 +498,7 @@ class BaseSerialization:
 type_=DAT.SIMPLE_TASK_INSTANCE,
 )
 elif isinstance(var, Connection):
-return cls._encode(var.to_json_dict(validate=True), 
type_=DAT.CONNECTION)
+return cls._encode(var.to_dict(validate=True), 
type_=DAT.CONNECTION)
 elif use_pydantic_models and _ENABLE_AIP_44:
 
 def _pydantic_model_dump(model_cls: type[BaseModel], var: Any) -> 
dict[str, Any]:
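
After the rename there is a single `to_dict` covering both forms; usage
roughly as follows (illustrative):

    from airflow.models.connection import Connection

    conn = Connection(conn_id="my_db", conn_type="postgres", host="localhost")
    conn.to_dict(prune_empty=True, validate=False)  # JSON-safe dict, empty fields dropped
    conn.as_json()                                  # same content serialized, minus conn_id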



(airflow) 12/34: Move `duckdb` & `pandas` import in tutorial DAG into task (#35964)

2023-12-05 Thread ephraimanierobi
This is an automated email from the ASF dual-hosted git repository.

ephraimanierobi pushed a commit to branch v2-8-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 9ba72a2e0ce323a14ed402913dfc9a6b4d7aa67d
Author: Ephraim Anierobi 
AuthorDate: Thu Nov 30 07:46:35 2023 +0100

Move `duckdb` & `pandas` import in tutorial DAG into task (#35964)

This improves the code as per best practices and avoids an import
error if duckdb is not installed.

(cherry picked from commit f0ba2dced92c767367aaf0fa3147942b4a576f92)
---
 airflow/example_dags/tutorial_objectstorage.py | 8 
 1 file changed, 4 insertions(+), 4 deletions(-)

diff --git a/airflow/example_dags/tutorial_objectstorage.py 
b/airflow/example_dags/tutorial_objectstorage.py
index 47db595c24..11d817400d 100644
--- a/airflow/example_dags/tutorial_objectstorage.py
+++ b/airflow/example_dags/tutorial_objectstorage.py
@@ -47,7 +47,6 @@ base = ObjectStoragePath("s3://airflow-tutorial-data/", 
conn_id="aws_default")
 # [END create_object_storage_path]
 
 
-# [START instantiate_dag]
 @dag(
 schedule=None,
 start_date=pendulum.datetime(2021, 1, 1, tz="UTC"),
@@ -62,9 +61,6 @@ def tutorial_objectstorage():
 located
 
[here](https://airflow.apache.org/docs/apache-airflow/stable/tutorial/objectstorage.html)
 """
-# [END instantiate_dag]
-import duckdb
-import pandas as pd
 
 # [START get_air_quality_data]
 @task
@@ -74,6 +70,8 @@ def tutorial_objectstorage():
 This task gets air quality data from the Finnish Meteorological 
Institute's
 open data API. The data is saved as parquet.
 """
+import pandas as pd
+
 execution_date = kwargs["logical_date"]
 start_time = kwargs["data_interval_start"]
 
@@ -113,6 +111,8 @@ def tutorial_objectstorage():
  Analyze
 This task analyzes the air quality data, prints the results
 """
+import duckdb
+
 conn = duckdb.connect(database=":memory:")
 conn.register_filesystem(path.fs)
 conn.execute(f"CREATE OR REPLACE TABLE airquality_urban AS SELECT * 
FROM read_parquet('{path}')")



(airflow) 04/34: Implement `is_authorized_variable` in AWS auth manager (#35804)

2023-12-05 Thread ephraimanierobi
This is an automated email from the ASF dual-hosted git repository.

ephraimanierobi pushed a commit to branch v2-8-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 713dfdfa234bafaebd70495c8f2be56ed09cd164
Author: Vincent <97131062+vincb...@users.noreply.github.com>
AuthorDate: Mon Nov 27 16:11:09 2023 -0500

Implement `is_authorized_variable` in AWS auth manager (#35804)

(cherry picked from commit 3b3ebafdce440952d2406955de290092ca0e361d)
---
 .../auth_manager/{constants.py => avp/__init__.py} |   7 -
 .../amazon/aws/auth_manager/avp/entities.py|  57 ++
 .../amazon/aws/auth_manager/avp/facade.py  | 126 +
 .../amazon/aws/auth_manager/aws_auth_manager.py|  14 +-
 .../providers/amazon/aws/auth_manager/constants.py |   4 +-
 airflow/providers/amazon/aws/auth_manager/user.py  |   3 +
 .../amazon/aws/hooks/verified_permissions.py   |  44 +
 airflow/providers/amazon/provider.yaml |  23 +++
 airflow/www/auth.py|  23 ++-
 .../aws/Amazon-Verified-Permissions.png| Bin 0 -> 13986 bytes
 .../amazon/aws/auth_manager/avp/__init__.py|   7 -
 .../amazon/aws/auth_manager/avp/test_entities.py   |  14 +-
 .../amazon/aws/auth_manager/avp/test_facade.py | 203 +
 .../aws/auth_manager/test_aws_auth_manager.py  |  38 +++-
 .../amazon/aws/auth_manager/test_constants.py  |  12 +-
 .../providers/amazon/aws/auth_manager/test_user.py |   3 +
 .../amazon/aws/hooks/test_verified_permissions.py  |  12 +-
 17 files changed, 549 insertions(+), 41 deletions(-)

diff --git a/airflow/providers/amazon/aws/auth_manager/constants.py 
b/airflow/providers/amazon/aws/auth_manager/avp/__init__.py
similarity index 81%
copy from airflow/providers/amazon/aws/auth_manager/constants.py
copy to airflow/providers/amazon/aws/auth_manager/avp/__init__.py
index f2f9c1da07..13a83393a9 100644
--- a/airflow/providers/amazon/aws/auth_manager/constants.py
+++ b/airflow/providers/amazon/aws/auth_manager/avp/__init__.py
@@ -14,10 +14,3 @@
 # KIND, either express or implied.  See the License for the
 # specific language governing permissions and limitations
 # under the License.
-
-# Configuration keys
-from __future__ import annotations
-
-CONF_SECTION_NAME = "aws_auth_manager"
-CONF_SAML_METADATA_URL_KEY = "saml_metadata_url"
-CONF_ENABLE_KEY = "enable"
diff --git a/airflow/providers/amazon/aws/auth_manager/avp/entities.py 
b/airflow/providers/amazon/aws/auth_manager/avp/entities.py
new file mode 100644
index 00..fad5ee1c3f
--- /dev/null
+++ b/airflow/providers/amazon/aws/auth_manager/avp/entities.py
@@ -0,0 +1,57 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+from __future__ import annotations
+
+from enum import Enum
+from typing import TYPE_CHECKING
+
+if TYPE_CHECKING:
+from airflow.auth.managers.base_auth_manager import ResourceMethod
+
+AVP_PREFIX_ENTITIES = "Airflow::"
+
+
+class AvpEntities(Enum):
+"""Enum of Amazon Verified Permissions entities."""
+
+ACTION = "Action"
+ROLE = "Role"
+VARIABLE = "Variable"
+USER = "User"
+
+
+def get_entity_type(resource_type: AvpEntities) -> str:
+"""
+Return entity type.
+
+:param resource_type: Resource type.
+
+Example: Airflow::Action, Airflow::Role, Airflow::Variable, Airflow::User.
+"""
+return AVP_PREFIX_ENTITIES + resource_type.value
+
+
+def get_action_id(resource_type: AvpEntities, method: ResourceMethod):
+"""
+Return action id.
+
+Convention for action ID is <resource_type>::<method>. Example: Variable::GET.
+
+:param resource_type: Resource type.
+:param method: Resource method.
+"""
+return f"{resource_type.value}::{method}"
diff --git a/airflow/providers/amazon/aws/auth_manager/avp/facade.py 
b/airflow/providers/amazon/aws/auth_manager/avp/facade.py
new file mode 100644
index 00..63ed9f5c70
--- /dev/null
+++ b/airflow/providers/amazon/aws/auth_manager/avp/facade.py
@@ -0,0 +1,126 @@
+# Licensed to the Apach
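
From the `entities.py` module added above, the two helpers compose Amazon
Verified Permissions identifiers exactly as their docstrings state; for
example:

    from airflow.providers.amazon.aws.auth_manager.avp.entities import (
        AvpEntities,
        get_action_id,
        get_entity_type,
    )

    get_entity_type(AvpEntities.VARIABLE)       # "Airflow::Variable"
    get_action_id(AvpEntities.VARIABLE, "GET")  # "Variable::GET"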

(airflow) 29/34: Add XCom tab to Grid (#35719)

2023-12-05 Thread ephraimanierobi
This is an automated email from the ASF dual-hosted git repository.

ephraimanierobi pushed a commit to branch v2-8-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit be86dd37ebc393ef5eee721b15e172e4e1bfd848
Author: Huy Duong <148540755+hduong-m...@users.noreply.github.com>
AuthorDate: Mon Dec 4 15:59:37 2023 +

Add XCom tab to Grid (#35719)

* Add XCom tab to Grid

* Combine showLogs and showXcom logic evaluation to isIndividualTaskInstance

* Remove link to /xcom page from UI grid view

* Use consistent naming to distinguish XcomCollection and XcomEntry

* Refactor boolean vars

(cherry picked from commit 77c01031d6c569d26f6fabd331597b7e87274baa)
---
 airflow/www/static/js/api/index.ts |   3 +
 airflow/www/static/js/api/useTaskXcom.ts   |  71 +
 airflow/www/static/js/dag/details/index.tsx|  71 ++---
 .../www/static/js/dag/details/taskInstance/Nav.tsx |   3 -
 .../js/dag/details/taskInstance/Xcom/XcomEntry.tsx |  82 +++
 .../js/dag/details/taskInstance/Xcom/index.tsx | 116 +
 airflow/www/templates/airflow/dag.html |   3 +-
 7 files changed, 329 insertions(+), 20 deletions(-)

diff --git a/airflow/www/static/js/api/index.ts 
b/airflow/www/static/js/api/index.ts
index 782a4f99a1..6369a819d2 100644
--- a/airflow/www/static/js/api/index.ts
+++ b/airflow/www/static/js/api/index.ts
@@ -48,6 +48,7 @@ import usePools from "./usePools";
 import useDags from "./useDags";
 import useDagRuns from "./useDagRuns";
 import useHistoricalMetricsData from "./useHistoricalMetricsData";
+import { useTaskXcomEntry, useTaskXcomCollection } from "./useTaskXcom";
 
 axios.interceptors.request.use((config) => {
   config.paramsSerializer = {
@@ -91,4 +92,6 @@ export {
   useTaskInstance,
   useUpstreamDatasetEvents,
   useHistoricalMetricsData,
+  useTaskXcomEntry,
+  useTaskXcomCollection,
 };
diff --git a/airflow/www/static/js/api/useTaskXcom.ts 
b/airflow/www/static/js/api/useTaskXcom.ts
new file mode 100644
index 00..1faa19005a
--- /dev/null
+++ b/airflow/www/static/js/api/useTaskXcom.ts
@@ -0,0 +1,71 @@
+/*!
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *   http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing,
+ * software distributed under the License is distributed on an
+ * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+ * KIND, either express or implied.  See the License for the
+ * specific language governing permissions and limitations
+ * under the License.
+ */
+
+import type { API } from "src/types";
+import { getMetaValue } from "src/utils";
+import { useQuery } from "react-query";
+import axios, { AxiosResponse } from "axios";
+
+// tryNumber is not required to get XCom keys or values but is used
+// in query key so refetch will occur if new tries are available
+interface TaskXcomCollectionProps extends API.GetXcomEntriesVariables {
+  tryNumber: number;
+}
+interface TaskXcomProps extends API.GetXcomEntryVariables {
+  tryNumber: number;
+}
+
+export const useTaskXcomCollection = ({
+  dagId,
+  dagRunId,
+  taskId,
+  mapIndex,
+  tryNumber,
+}: TaskXcomCollectionProps) =>
+  useQuery(["taskXcoms", dagId, dagRunId, taskId, mapIndex, tryNumber], () =>
+axios.get(
+  getMetaValue("task_xcom_entries_api")
+.replace("_DAG_RUN_ID_", dagRunId)
+.replace("_TASK_ID_", taskId),
+  { params: { map_index: mapIndex } }
+)
+  );
+
+export const useTaskXcomEntry = ({
+  dagId,
+  dagRunId,
+  taskId,
+  mapIndex,
+  xcomKey,
+  tryNumber,
+}: TaskXcomProps) =>
+  useQuery(
+["taskXcom", dagId, dagRunId, taskId, mapIndex, xcomKey, tryNumber],
+() =>
+  axios.get(
+getMetaValue("task_xcom_entry_api")
+  .replace("_DAG_RUN_ID_", dagRunId)
+  .replace("_TASK_ID_", taskId)
+  .replace("_XCOM_KEY_", xcomKey),
+{ params: { map_index: mapIndex } }
+  ),
+{
+  enabled: !!xcomKey,
+}
+  );
diff --git a/airflow/www/static/js/dag/details/index.tsx 
b/airflow/www/static/js/dag/details/index.tsx
index b476d61950..3c555c701e 100644
--- a/airflow/www/static/js/dag/details/index.tsx
+++ b/airflow/www/static/js/dag/details/index.tsx
@@ -39,6 +39,7 @@ import {
   MdReorder,
   MdCode,
   MdOutl

(airflow) 34/34: Update RELEASE_NOTES.rst

2023-12-05 Thread ephraimanierobi
This is an automated email from the ASF dual-hosted git repository.

ephraimanierobi pushed a commit to branch v2-8-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 786ae6bf332b1dd5ad193ccadb7bac110d3ecf66
Author: Ephraim Anierobi 
AuthorDate: Tue Dec 5 11:31:27 2023 +0100

Update RELEASE_NOTES.rst
---
 RELEASE_NOTES.rst | 20 
 1 file changed, 20 insertions(+)

diff --git a/RELEASE_NOTES.rst b/RELEASE_NOTES.rst
index b2c1c20a64..0657776984 100644
--- a/RELEASE_NOTES.rst
+++ b/RELEASE_NOTES.rst
@@ -41,6 +41,8 @@ Significant Changes
 New Features
 """"""""""""
 - AIP-58: Add Airflow ObjectStore (AFS) (`AIP-58 
<https://github.com/apache/airflow/pulls?q=is%3Apr+is%3Amerged+label%3AAIP-58+milestone%3A%22Airflow+2.8.0%22>`_)
+- Add XCom tab to Grid (#35719)
+- Add a public interface for custom ``weight_rule`` implementation (#35210)
 - Add "literal" wrapper to disable field templating (#35017)
 - Add task context logging feature to allow forwarding messages to task logs 
(#32646, #32693, #35857)
 - Add Listener hooks for Datasets (#34418)
@@ -63,6 +65,13 @@ New Features
 
 Improvements
 """"""""""""
+- Add multiselect to run state in grid view (#35403)
+- Fix warning message in ``Connection.get_hook`` in case of ImportError 
(#36005)
+- Add processor_subdir to import_error table to handle multiple dag processors 
(#35956)
+- Consolidate the call of change_state to fail or success in the core 
executors (#35901)
+- Relax mandatory requirement for start_date when schedule=None (#35356)
+- Use ExitStack to manage mutation of secrets_backend_list in dag.test (#34620)
+- improved visibility of tasks in ActionModal for taskInstance (#35810)
 - Create directories based on ``AIRFLOW_CONFIG`` path (#35818)
 - Implements ``JSON-string`` connection representation generator (#35723)
 - Move ``BaseOperatorLink`` into the separate module (#35032)
@@ -122,6 +131,13 @@ Improvements
 
 Bug Fixes
 """""""""
+- Update ``reset_user_sessions`` to work from either CLI or web (#36056)
+- Fix UI Grid error when DAG has been removed. (#36028)
+- Use dropdown instead of buttons when there are more than 10 retries in log 
tab (#36025)
+- Change Trigger UI to use HTTP POST in web ui (#36026)
+- Fix airflow db shell needing an extra keypress to exit (#35982)
+- Change dag grid overscroll behaviour to auto (#35717)
+- Run triggers inline with dag test (#34642)
 - Add ``borderWidthRight`` to grid for Firefox ``scrollbar`` (#35346)
 - Fix for infinite recursion due to secrets_masker (#35048)
 - Fix write ``processor_subdir`` in serialized_dag table (#35661)
@@ -146,6 +162,9 @@ Bug Fixes
 
 Misc/Internal
 """""""""""""
+- Bump FAB to ``4.3.10`` (#35991)
+- Mark daskexecutor provider as removed (#35965)
+- Rename ``Connection.to_json_dict`` to ``Connection.to_dict`` (#35894)
 - Upgrade to Pydantic v2 (#35551)
 - Bump ``moto`` version to ``>= 4.2.9`` (#35687)
 - Use ``pyarrow-hotfix`` to mitigate CVE-2023-47248 (#35650)
@@ -177,6 +196,7 @@ Misc/Internal
 
 Doc Only Changes
 """"""""""""""""
+- Add the section describing the security model of DAG Author capabilities 
(#36022)
 - Enhance docs for zombie tasks (#35825)
 - Reflect drop/add support of DB Backends versions in documentation (#35785)
 - More detail on mandatory task arguments (#35740)



(airflow) 25/34: Limit Pytest-asyncio to < 0.23.1 (#36037)

2023-12-05 Thread ephraimanierobi
This is an automated email from the ASF dual-hosted git repository.

ephraimanierobi pushed a commit to branch v2-8-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit f4c4a06bd8f41d7767d00a64d83dc6d07cf68555
Author: Jarek Potiuk 
AuthorDate: Sun Dec 3 23:56:47 2023 +0100

Limit Pytest-asyncio to < 0.23.1 (#36037)

It seems that pytest-asyncio 0.23.1 breaks our asyncio tests. We are
temporarily limiting it until
https://github.com/pytest-dev/pytest-asyncio/issues/703 is
solved or answered.

(cherry picked from commit 9845b40a7551703537ac2c2676511ec54689)
---
 setup.py | 4 +++-
 1 file changed, 3 insertions(+), 1 deletion(-)

diff --git a/setup.py b/setup.py
index eacde7a499..ecd7167fb6 100644
--- a/setup.py
+++ b/setup.py
@@ -484,7 +484,9 @@ _devel_only_tests = [
 "beautifulsoup4>=4.7.1",
 "coverage>=7.2",
 "pytest>=7.1",
-"pytest-asyncio",
+# Pytest-asyncio 0.23.1 breaks our tests. The limitation should be removed 
when the issue is fixed:
+# https://github.com/pytest-dev/pytest-asyncio/issues/703
+"pytest-asyncio<0.23.1",
 "pytest-cov",
 "pytest-httpx",
 "pytest-icdiff",



(airflow) 22/34: Change Trigger UI to use HTTP POST in web ui (#36026)

2023-12-05 Thread ephraimanierobi
This is an automated email from the ASF dual-hosted git repository.

ephraimanierobi pushed a commit to branch v2-8-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit d6ce328397be19ae2dece4664c2ffd4836c9493f
Author: Jens Scheffler <95105677+jsche...@users.noreply.github.com>
AuthorDate: Sun Dec 3 00:38:20 2023 +0100

Change Trigger UI to use HTTP POST in web ui (#36026)

* Change Trigger UI to use HTTP POST in web ui, GET always shows trigger 
form
* Adjust tests to changed behavior of trigger handling, expects data 
submitted in POST

(cherry picked from commit f5d802791fa5f6b13b635f06a1ea2eccc22a9ba7)
---
 airflow/www/templates/airflow/dag.html|  7 ++-
 airflow/www/templates/airflow/dags.html   |  7 ++-
 airflow/www/views.py  |  4 +++-
 tests/www/views/test_views_trigger_dag.py | 20 ++--
 4 files changed, 29 insertions(+), 9 deletions(-)

diff --git a/airflow/www/templates/airflow/dag.html 
b/airflow/www/templates/airflow/dag.html
index 40440d3fd6..435dacb50c 100644
--- a/airflow/www/templates/airflow/dag.html
+++ b/airflow/www/templates/airflow/dag.html
@@ -254,7 +254,7 @@
   {% else %}
 
   play_arrow
@@ -289,5 +289,10 @@
   }
   return false;
 }
+
+function triggerDag(link, dagId) {
+  postAsForm(link.href, {});
+  return false;
+}
   
 {% endblock %}
diff --git a/airflow/www/templates/airflow/dags.html 
b/airflow/www/templates/airflow/dags.html
index c2ccb03e5d..1ee168996c 100644
--- a/airflow/www/templates/airflow/dags.html
+++ b/airflow/www/templates/airflow/dags.html
@@ -385,7 +385,7 @@
   
 {% else %}
   
 play_arrow
@@ -483,5 +483,10 @@
   }
   return false;
 }
+
+function triggerDag(link, dagId) {
+  postAsForm(link.href, {});
+  return false;
+}
   
 {% endblock %}
diff --git a/airflow/www/views.py b/airflow/www/views.py
index 649440865c..c4230c1900 100644
--- a/airflow/www/views.py
+++ b/airflow/www/views.py
@@ -2047,7 +2047,9 @@ class Airflow(AirflowBaseView):
 if isinstance(run_conf, dict) and any(run_conf)
 }
 
-if request.method == "GET" and (ui_fields_defined or 
show_trigger_form_if_no_params):
+if request.method == "GET" or (
+not request_conf and (ui_fields_defined or 
show_trigger_form_if_no_params)
+):
 # Populate conf textarea with conf requests parameter, or 
dag.params
 default_conf = ""
 
diff --git a/tests/www/views/test_views_trigger_dag.py 
b/tests/www/views/test_views_trigger_dag.py
index 65ad8734d5..6471c092bd 100644
--- a/tests/www/views/test_views_trigger_dag.py
+++ b/tests/www/views/test_views_trigger_dag.py
@@ -57,7 +57,7 @@ def test_trigger_dag_button_normal_exist(admin_client):
 )
 def test_trigger_dag_button(admin_client, req, expected_run_id):
 test_dag_id = "example_bash_operator"
-admin_client.post(f"dags/{test_dag_id}/trigger?{req}")
+admin_client.post(f"dags/{test_dag_id}/trigger?{req}", data={"conf": "{}"})
 with create_session() as session:
 run = session.query(DagRun).filter(DagRun.dag_id == 
test_dag_id).first()
 assert run is not None
@@ -68,8 +68,12 @@ def test_trigger_dag_button(admin_client, req, 
expected_run_id):
 def test_duplicate_run_id(admin_client):
 test_dag_id = "example_bash_operator"
 run_id = "test_run"
-admin_client.post(f"dags/{test_dag_id}/trigger?run_id={run_id}", 
follow_redirects=True)
-response = 
admin_client.post(f"dags/{test_dag_id}/trigger?run_id={run_id}", 
follow_redirects=True)
+admin_client.post(
+f"dags/{test_dag_id}/trigger?run_id={run_id}", data={"conf": "{}"}, 
follow_redirects=True
+)
+response = admin_client.post(
+f"dags/{test_dag_id}/trigger?run_id={run_id}", data={"conf": "{}"}, 
follow_redirects=True
+)
 check_content_in_response(f"The run ID {run_id} already exists", response)
 
 
@@ -112,7 +116,9 @@ def test_trigger_dag_conf_not_dict(admin_client):
 def test_trigger_dag_wrong_execution_date(admin_client):
 test_dag_id = "example_bash_operator"
 
-response = admin_client.post(f"dags/{test_dag_id}/trigger", 
data={"execution_date": "not_a_date"})
+response = admin_client.post(
+f"dags/{test_dag_id}/trigger", data={"conf": "{}", "execution_date": 
"not_a_date"}
+)
 check_content_in_response("Invalid execution date", response)
 
 with create_session() as session:
@@ -124,7 +130,9 @@ def 
test_trigger_dag_execution_date_data_interval(admin_client):
 test_dag_id = "example_bash_operator

(airflow) 33/34: Mark daskexecutor provider as removed (#35965)

2023-12-05 Thread ephraimanierobi
This is an automated email from the ASF dual-hosted git repository.

ephraimanierobi pushed a commit to branch v2-8-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 48d7ac44b4d831fdaab2bc9d32e5cbba331c6632
Author: Jarek Potiuk 
AuthorDate: Thu Nov 30 14:08:18 2023 +0100

Mark daskexecutor provider as removed (#35965)

Following the discussion in airflow devlist we mark daskexecutor
as removed. See:
https://lists.apache.org/thread/fxv44cqqljrrhll3fdpdgc9h9fz5ghcy

(cherry picked from commit 9c1c9f450e289b40f94639db3f0686f592c8841e)
---
 CONTRIBUTING.rst   | 20 +++
 Dockerfile |  2 +-
 IMAGES.rst |  2 +-
 INSTALL| 20 +++
 airflow/config_templates/config.yml|  2 +-
 airflow/executors/executor_constants.py|  1 -
 airflow/executors/executor_loader.py   |  3 -
 airflow/providers/daskexecutor/CHANGELOG.rst   |  6 ++
 airflow/providers/daskexecutor/provider.yaml   |  4 +-
 airflow/providers/installed_providers.txt  |  1 -
 dev/breeze/src/airflow_breeze/global_constants.py  |  1 -
 dev/breeze/tests/test_packages.py  |  6 +-
 docker_tests/test_prod_image.py|  1 -
 docs/apache-airflow/extra-packages-ref.rst |  4 --
 docs/docker-stack/build-arg-ref.rst|  1 -
 generated/provider_dependencies.json   | 10 
 images/breeze/output_build-docs.svg|  6 +-
 images/breeze/output_build-docs.txt|  2 +-
 images/breeze/output_prod-image_build.svg  |  2 +-
 images/breeze/output_prod-image_build.txt  |  2 +-
 ...tput_release-management_add-back-references.svg |  6 +-
 ...tput_release-management_add-back-references.txt |  2 +-
 ...management_generate-issue-content-providers.svg |  6 +-
 ...management_generate-issue-content-providers.txt |  2 +-
 ...e-management_prepare-provider-documentation.svg |  6 +-
 ...e-management_prepare-provider-documentation.txt |  2 +-
 ...elease-management_prepare-provider-packages.svg |  6 +-
 ...elease-management_prepare-provider-packages.txt |  2 +-
 .../output_release-management_publish-docs.svg |  6 +-
 .../output_release-management_publish-docs.txt |  2 +-
 ...output_sbom_generate-providers-requirements.svg | 64 ++
 ...output_sbom_generate-providers-requirements.txt |  2 +-
 setup.py   | 13 +
 tests/cli/commands/test_standalone_command.py  |  3 -
 tests/sensors/test_base.py |  3 -
 35 files changed, 93 insertions(+), 128 deletions(-)

diff --git a/CONTRIBUTING.rst b/CONTRIBUTING.rst
index 1114c16074..25633c0995 100644
--- a/CONTRIBUTING.rst
+++ b/CONTRIBUTING.rst
@@ -672,16 +672,16 @@ aiobotocore, airbyte, alibaba, all, all_dbs, amazon, apache.atlas, apache.beam,
 apache.drill, apache.druid, apache.flink, apache.hdfs, apache.hive, apache.impala, apache.kafka,
 apache.kylin, apache.livy, apache.pig, apache.pinot, apache.spark, apache.sqoop, apache.webhdfs,
 apprise, arangodb, asana, async, atlas, atlassian.jira, aws, azure, cassandra, celery, cgroups,
-cloudant, cncf.kubernetes, cohere, common.io, common.sql, crypto, dask, daskexecutor, databricks,
-datadog, dbt.cloud, deprecated_api, devel, devel_all, devel_ci, devel_hadoop, dingding, discord,
-doc, doc_gen, docker, druid, elasticsearch, exasol, facebook, ftp, gcp, gcp_api, github,
-github_enterprise, google, google_auth, grpc, hashicorp, hdfs, hive, http, imap, influxdb, jdbc,
-jenkins, kerberos, kubernetes, ldap, leveldb, microsoft.azure, microsoft.mssql, microsoft.psrp,
-microsoft.winrm, mongo, mssql, mysql, neo4j, odbc, openai, openfaas, openlineage, opensearch,
-opsgenie, oracle, otel, pagerduty, pandas, papermill, password, pgvector, pinecone, pinot, plexus,
-postgres, presto, rabbitmq, redis, s3, s3fs, salesforce, samba, saml, segment, sendgrid, sentry,
-sftp, singularity, slack, smtp, snowflake, spark, sqlite, ssh, statsd, tableau, tabular, telegram,
-trino, vertica, virtualenv, weaviate, webhdfs, winrm, yandex, zendesk
+cloudant, cncf.kubernetes, cohere, common.io, common.sql, crypto, databricks, datadog, dbt.cloud,
+deprecated_api, devel, devel_all, devel_ci, devel_hadoop, dingding, discord, doc, doc_gen, docker,
+druid, elasticsearch, exasol, facebook, ftp, gcp, gcp_api, github, github_enterprise, google,
+google_auth, grpc, hashicorp, hdfs, hive, http, imap, influxdb, jdbc, jenkins, kerberos, kubernetes,
+ldap, leveldb, microsoft.azure, microsoft.mssql, microsoft.psrp, microsoft.winrm, mongo, mssql,
+mysql, neo4j, odbc, openai, openfaas, openlineage, opensearch, opsgenie, oracle, otel, pagerduty,
+pandas, papermill, password, pgvector, pinecone, pinot, plexus, postgres, presto, rabbitmq, redis,
+s3, s3fs, salesforce, samba, saml

(airflow) 31/34: Avoid crushing container when directory is not found on rm (#36050)

2023-12-05 Thread ephraimanierobi
This is an automated email from the ASF dual-hosted git repository.

ephraimanierobi pushed a commit to branch v2-8-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit a42d3d86c77e43c7a762b145cb17fb9e6c81a8f6
Author: rom sharon <33751805+romsharo...@users.noreply.github.com>
AuthorDate: Mon Dec 4 23:50:39 2023 +0200

Avoid crushing container when directory is not found on rm (#36050)

(cherry picked from commit 61fd166a4662d67bc914949f9cf07ceab7d55686)
---
 Dockerfile   | 2 +-
 scripts/docker/clean-logs.sh | 2 +-
 2 files changed, 2 insertions(+), 2 deletions(-)

diff --git a/Dockerfile b/Dockerfile
index 61b95f7d7c..b1ba16db53 100644
--- a/Dockerfile
+++ b/Dockerfile
@@ -1174,7 +1174,7 @@ while true; do
 -type f -mtime +"${RETENTION}" -name '*.log' -print0 | \
 xargs -0 rm -f
 
-  find "${DIRECTORY}"/logs -type d -empty -delete
+  find "${DIRECTORY}"/logs -type d -empty -delete || true
 
   seconds=$(( $(date -u +%s) % EVERY))
   (( seconds < 1 )) || sleep $((EVERY - seconds - 1))
diff --git a/scripts/docker/clean-logs.sh b/scripts/docker/clean-logs.sh
index 53f5407d96..df138e4a6f 100644
--- a/scripts/docker/clean-logs.sh
+++ b/scripts/docker/clean-logs.sh
@@ -35,7 +35,7 @@ while true; do
 -type f -mtime +"${RETENTION}" -name '*.log' -print0 | \
 xargs -0 rm -f
 
-  find "${DIRECTORY}"/logs -type d -empty -delete
+  find "${DIRECTORY}"/logs -type d -empty -delete || true
 
   seconds=$(( $(date -u +%s) % EVERY))
   (( seconds < 1 )) || sleep $((EVERY - seconds - 1))
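
The `|| true` guard keeps the cleanup loop alive when find exits non-zero because a
log directory vanished between scans. The same tolerance, expressed as a Python
sketch (a hypothetical helper, not part of Airflow):

    from pathlib import Path

    def prune_empty_log_dirs(base: Path) -> None:
        # Walk deepest-first so parents emptied by this pass can be removed too.
        for d in sorted((p for p in base.rglob("*") if p.is_dir()), reverse=True):
            try:
                d.rmdir()  # succeeds only when the directory is empty
            except OSError:
                # Mirrors `|| true`: a vanished or non-empty directory is not fatal.
                pass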



(airflow) 06/34: Add a public interface for custom weight_rule implementation (#35210)

2023-12-05 Thread ephraimanierobi
This is an automated email from the ASF dual-hosted git repository.

ephraimanierobi pushed a commit to branch v2-8-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 9be2ffc30411693d069074c6a51a243d1e61fdc0
Author: Hussein Awala 
AuthorDate: Tue Nov 28 19:24:21 2023 +0200

Add a public interface for custom weight_rule implementation (#35210)

* Add a public interface for custom weight_rule implementation

* Remove _weight_strategy attribute

* Move priority weight calculation to TI to support advanced strategies

* Fix loading the var from mapped operators and simplify loading it from task

* Update default value and deprecated the other one

* Update task endpoint API spec

* fix tests

* Update docs and add dag example

* Fix serialization test

* revert change in spark provider

* Update unit tests

(cherry picked from commit 3385113e277f86b5f163a3509ba61590cfe7d8cc)
---
 airflow/api_connexion/openapi/v1.yaml  |  7 ++
 airflow/api_connexion/schemas/task_schema.py   |  1 +
 airflow/config_templates/config.yml| 11 +++
 .../example_priority_weight_strategy.py| 69 
 airflow/executors/base_executor.py |  2 +-
 airflow/executors/debug_executor.py|  2 +-
 ...2_8_0_add_priority_weight_strategy_to_task_.py} | 42 +-
 airflow/models/abstractoperator.py | 20 -
 airflow/models/baseoperator.py | 36 ++---
 airflow/models/mappedoperator.py   | 16 +++-
 airflow/models/taskinstance.py | 22 +-
 airflow/serialization/pydantic/taskinstance.py |  1 +
 airflow/task/priority_strategy.py  | 91 ++
 airflow/utils/db.py|  2 +-
 airflow/utils/weight_rule.py   |  6 +-
 airflow/www/static/js/types/api-generated.ts   | 10 ++-
 .../priority-weight.rst| 12 +--
 docs/apache-airflow/img/airflow_erd.sha256 |  2 +-
 docs/apache-airflow/migrations-ref.rst |  4 +-
 .../api_connexion/endpoints/test_task_endpoint.py  | 21 +++--
 tests/api_connexion/schemas/test_task_schema.py|  6 +-
 tests/models/test_baseoperator.py  | 12 ++-
 tests/models/test_dag.py   | 20 +
 tests/models/test_taskinstance.py  |  1 +
 tests/serialization/test_dag_serialization.py  |  3 +-
 tests/www/views/test_views_tasks.py|  7 ++
 26 files changed, 362 insertions(+), 64 deletions(-)

diff --git a/airflow/api_connexion/openapi/v1.yaml b/airflow/api_connexion/openapi/v1.yaml
index 5d0c58102a..1653470d91 100644
--- a/airflow/api_connexion/openapi/v1.yaml
+++ b/airflow/api_connexion/openapi/v1.yaml
@@ -3738,6 +3738,8 @@ components:
   readOnly: true
 weight_rule:
   $ref: "#/components/schemas/WeightRule"
+priority_weight_strategy:
+  $ref: "#/components/schemas/PriorityWeightStrategy"
 ui_color:
   $ref: "#/components/schemas/Color"
 ui_fgcolor:
@@ -4767,11 +4769,16 @@ components:
 WeightRule:
   description: Weight rule.
   type: string
+  nullable: true
   enum:
 - downstream
 - upstream
 - absolute
 
+PriorityWeightStrategy:
+  description: Priority weight strategy.
+  type: string
+
 HealthStatus:
   description: Health status
   type: string
diff --git a/airflow/api_connexion/schemas/task_schema.py b/airflow/api_connexion/schemas/task_schema.py
index ac1b465bb2..cd8ccdfd3b 100644
--- a/airflow/api_connexion/schemas/task_schema.py
+++ b/airflow/api_connexion/schemas/task_schema.py
@@ -57,6 +57,7 @@ class TaskSchema(Schema):
 retry_exponential_backoff = fields.Boolean(dump_only=True)
 priority_weight = fields.Number(dump_only=True)
 weight_rule = WeightRuleField(dump_only=True)
+priority_weight_strategy = fields.String(dump_only=True)
 ui_color = ColorField(dump_only=True)
 ui_fgcolor = ColorField(dump_only=True)
 template_fields = fields.List(fields.String(), dump_only=True)
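
For readers unfamiliar with marshmallow: dump_only=True makes a field
serialization-only, so it shows up in API responses but is rejected if supplied as
input. A minimal standalone illustration (not Airflow's actual schema wiring):

    from marshmallow import Schema, fields

    class ExampleTaskSchema(Schema):
        priority_weight_strategy = fields.String(dump_only=True)

    # Serializes fine:
    print(ExampleTaskSchema().dump({"priority_weight_strategy": "downstream"}))
    # Loading the same payload would raise a ValidationError instead.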
diff --git a/airflow/config_templates/config.yml b/airflow/config_templates/config.yml
index a25adc7206..072eaea86d 100644
--- a/airflow/config_templates/config.yml
+++ b/airflow/config_templates/config.yml
@@ -306,6 +306,17 @@ core:
   description: |
    The weighting method used for the effective total priority weight of the task
   version_added: 2.2.0
+  version_deprecated: 2.8.0
+  deprecation_reason: |
+This option is deprecated and will be removed in Airflow 3.0.
+Please use ``default_task_priority_weight_strategy`` instead.
+  type: string
+  example: ~
+  default: ~
+default_task_priority_weight_strategy:
+  descri
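
Per the diffstat above, the public hook lives in the new
airflow/task/priority_strategy.py module. A rough sketch of what a custom strategy
might look like; the base-class import path and the get_weight(ti) signature are
inferred from this commit and should be treated as assumptions:

    from airflow.task.priority_strategy import PriorityWeightStrategy

    class DecreasingOnRetries(PriorityWeightStrategy):
        # Deprioritize a task instance a little more on each retry (illustrative).
        def get_weight(self, ti):
            # try_number grows with each attempt; clamp so the weight stays positive.
            return max(1, 10 - ti.try_number)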

(airflow) branch v2-8-test updated (c28ba46e13 -> 786ae6bf33)

2023-12-05 Thread ephraimanierobi
This is an automated email from the ASF dual-hosted git repository.

ephraimanierobi pushed a change to branch v2-8-test
in repository https://gitbox.apache.org/repos/asf/airflow.git


from c28ba46e13 Exclude common-io provider
 new 28897f7a42 Run triggers inline with dag test (#34642)
 new 881d802629 improved visibility of tasks in ActionModal for taskInstance (#35810)
 new ecbc959812 Use ExitStack to manage mutation of secrets_backend_list in dag.test (#34620)
 new 713dfdfa23 Implement `is_authorized_variable` in AWS auth manager (#35804)
 new bc95360902 Relax mandatory requirement for start_date when schedule=None (#35356)
 new 9be2ffc304 Add a public interface for custom weight_rule implementation (#35210)
 new 9798f314dd Consolidate the call of change_state to fail or success in the core executors (#35901)
 new 2fba7bd0b3 Remove workaround for pymssql failing compilation with new Cython (#35924)
 new 54dc2b9127 Change dag grid overscroll behaviour to auto (#35717)
 new 7e9b6a4f68 Revert "Prevent assignment of non JSON serializable values to DagRun.conf dict (#35096)" (#35959)
 new 90f10b199d Rename `Connection.to_json_dict` to `Connection.to_dict` (#35894)
 new 9ba72a2e0c Move `duckdb` & `pandas` import in tutorial DAG into task (#35964)
 new 7b03a11dc9 Add processor_subdir to import_error table to handle multiple dag processors (#35956)
 new f6cfd45f33 Fix airflow db shell needing an extra keypress to exit (#35982)
 new 9ea67c8749 Bump FAB to 4.3.10 (#35991)
 new 8a0252d419 Add feature to build "chicken-egg" packages from sources (#35890)
 new 9281ccb6f9 Pass conn ID to ObjectStoragePath via URI (#35913)
 new 72d610a354 Switch "latest" image to point to newest supported Python version (#36003)
 new e1f469bbd3 Add support for chicken-egg providers to dockerhub release process (#36002)
 new 08188ed880 [AIP-44] Introduce Pydantic model for LogTemplate (#36004)
 new 4652d7fc00 Add multiselect to run state in grid view (#35403)
 new d6ce328397 Change Trigger UI to use HTTP POST in web ui (#36026)
 new 552fbe3120 Use dropdown instead of buttons when there are more than 10 retries in log tab (#36025)
 new d265100995 34058: Fix UI Grid error when DAG has been removed. (#36028)
 new f4c4a06bd8 Limit Pytest-asyncio to < 0.23.1 (#36037)
 new be0fb8b11c Limit pytest-asyncio even more - to <0.23.0 (#36040)
 new a2573503a6 Add the section describing the security model of DAG Author capabilities (#36022)
 new beba3b80a1 Remove pytest-asyncio upper-binding limitatin (#36046)
 new be86dd37eb Add XCom tab to Grid (#35719)
 new 44302838fd Update supported-versions.rst (#36058)
 new a42d3d86c7 Avoid crushing container when directory is not found on rm (#36050)
 new 2a008cfb57 Update reset_user_sessions to work from either CLI or web (#36056)
 new 48d7ac44b4 Mark daskexecutor provider as removed (#35965)
 new 786ae6bf33 Update RELEASE_NOTES.rst

The 34 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.


Summary of changes:
 .github/actions/build-prod-images/action.yml   |   12 +
 .github/workflows/build-images.yml |2 +
 .github/workflows/ci.yml   |   15 +
 .github/workflows/release_dockerhub_image.yml  |   17 +
 CONTRIBUTING.rst   |   20 +-
 Dockerfile |   53 +-
 Dockerfile.ci  |   36 -
 IMAGES.rst |2 +-
 INSTALL|   20 +-
 RELEASE_NOTES.rst  |   20 +
 airflow/api_connexion/openapi/v1.yaml  |7 +
 airflow/api_connexion/schemas/task_schema.py   |1 +
 airflow/api_internal/endpoints/rpc_api_endpoint.py |1 +
 .../auth/managers/fab/security_manager/override.py |   48 +-
 airflow/config_templates/config.yml|   13 +-
 airflow/dag_processing/manager.py  |   13 +-
 airflow/dag_processing/processor.py|   14 +-
 .../example_priority_weight_strategy.py|   69 +
 airflow/example_dags/tutorial_objectstorage.py |   10 +-
 airflow/executors/base_executor.py |2 +-
 airflow/executors/debug_executor.py|8 +-
 airflow/executors/executor_constants.py|1 -
 airflow/executors/executor_loader.py   |3 -
 airflow/executors/sequential_executor.py   |5 +-
 airflow/io/path.py |   10 +-
 airflow/io/store/__init__.py   |2 +
