[GitHub] [airflow] MrGeorgeOwl commented on a diff in pull request #27776: Add deferrable mode to dataflow operators

2023-01-23 Thread via GitHub


MrGeorgeOwl commented on code in PR #27776:
URL: https://github.com/apache/airflow/pull/27776#discussion_r1083733347


##
airflow/providers/google/cloud/hooks/dataflow.py:
##
@@ -1037,7 +1050,7 @@ def start_sql_job(
 def get_job(
 self,
 job_id: str,
-project_id: str,
+project_id: str = PROVIDE_PROJECT_ID,

Review Comment:
   Drive-by improvement
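For context, the change above makes `project_id` optional by defaulting it to a sentinel that is resolved later. A minimal sketch of that pattern, assuming a hypothetical hook class and default project (only `PROVIDE_PROJECT_ID` mirrors the provider's actual name):

```python
from typing import cast

# Sentinel meaning "resolve the project at call time". The provider
# defines it as None cast to str so type checkers accept the default.
PROVIDE_PROJECT_ID: str = cast(str, None)


class DataflowHookSketch:
    """Hypothetical stand-in for the real hook."""

    def __init__(self, default_project_id: str = "default-project"):
        self.default_project_id = default_project_id

    def get_job(self, job_id: str, project_id: str = PROVIDE_PROJECT_ID) -> dict:
        # Fall back to the connection's default project when the caller
        # did not pass one explicitly.
        effective = project_id or self.default_project_id
        return {"job_id": job_id, "project_id": effective}


hook = DataflowHookSketch()
print(hook.get_job("job-1")["project_id"])              # default-project
print(hook.get_job("job-1", "explicit")["project_id"])  # explicit
```

In the real provider this fallback is applied by a decorator on the hook method; the inline `or` here is just the simplest way to show the idea.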



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] MrGeorgeOwl commented on a diff in pull request #27776: Add deferrable mode to dataflow operators

2023-01-23 Thread via GitHub


MrGeorgeOwl commented on code in PR #27776:
URL: https://github.com/apache/airflow/pull/27776#discussion_r1083736807


##
airflow/providers/google/cloud/links/dataflow.py:
##
@@ -48,5 +48,5 @@ def persist(
 operator_instance.xcom_push(
 context,
 key=DataflowJobLink.key,
-value={"project_id": project_id, "location": region, "job_id": job_id},
+value={"project_id": project_id, "region": region, "job_id": job_id},

Review Comment:
   Not an improvement, but a fix for the link: without this change the link for 
Dataflow won't work because of a KeyError exception, if I remember correctly.
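The KeyError mentioned here is the usual failure mode when an XCom dict and a link's format string disagree on a key name. A small sketch under the assumption that the link is built with `str.format` (the URL template below is made up, not the provider's real one):

```python
# Hypothetical link template keyed on "region"; the real Dataflow link
# template lives in the provider and may differ.
DATAFLOW_JOB_LINK = (
    "https://console.cloud.google.com/dataflow/jobs/{region}/{job_id}"
    "?project={project_id}"
)


def build_link(conf: dict) -> str:
    return DATAFLOW_JOB_LINK.format(**conf)


good = {"project_id": "p", "region": "us-central1", "job_id": "j"}
print(build_link(good))

# Pushing the value under "location" instead of "region" leaves the
# template's "region" placeholder unfilled, so str.format raises.
bad = {"project_id": "p", "location": "us-central1", "job_id": "j"}
try:
    build_link(bad)
except KeyError as exc:
    print(f"KeyError: {exc}")
```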



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] snjypl commented on pull request #28394: Fix manual task trigger failing for k8s.

2023-01-23 Thread via GitHub


snjypl commented on PR #28394:
URL: https://github.com/apache/airflow/pull/28394#issuecomment-1399944689

   > I think we need a lot more context in the commit - what happens here.
   
   I have added a more descriptive commit message. 


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] tanelk commented on a diff in pull request #29094: Add `max_active_tis_per_dagrun` for Dynamic Task Mapping

2023-01-23 Thread via GitHub


tanelk commented on code in PR #29094:
URL: https://github.com/apache/airflow/pull/29094#discussion_r1083763018


##
airflow/jobs/scheduler_job.py:
##
@@ -480,13 +490,40 @@ def _executable_task_instances_to_queued(self, max_tis: int, session: Session) -
 " this task has been reached.",
 task_instance,
 )
-starved_tasks.add((task_instance.dag_id, task_instance.task_id))

Review Comment:
   I think that instead of modifying `starved_tasks`, we should add a new 
variable that keeps track of the `(dag_id, run_id, task_id)` triples. 
   
   When using the `task_concurrency_limit`, the current behaviour is that 
all runs are filtered out, but after your change the filtering is done one 
dagrun at a time, and it could take more iterations to get to the TIs that can 
be queued.
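The suggestion can be illustrated with plain sets: one set starves a task across all runs, a second set starves it only within a specific dagrun, so other runs stay eligible in the same scheduling loop. The data shapes below are a simplified sketch, not the scheduler's actual structures:

```python
# Starved across every run (task-level concurrency limit reached).
starved_tasks: set = set()
# Starved only in one dagrun (per-dagrun limit reached).
starved_tasks_per_run: set = set()

candidates = [
    ("dag_a", "run_1", "task_x"),
    ("dag_a", "run_2", "task_x"),
    ("dag_b", "run_1", "task_y"),
]

starved_tasks.add(("dag_b", "task_y"))                    # all runs blocked
starved_tasks_per_run.add(("dag_a", "run_1", "task_x"))   # only run_1 blocked

queueable = [
    (dag_id, run_id, task_id)
    for dag_id, run_id, task_id in candidates
    if (dag_id, task_id) not in starved_tasks
    and (dag_id, run_id, task_id) not in starved_tasks_per_run
]
print(queueable)  # only ("dag_a", "run_2", "task_x") survives
```

Keeping the pair-keyed set intact while adding the triple-keyed one preserves the old all-runs filtering for the task-level limit, which is the point of the review comment.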



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] tanelk commented on a diff in pull request #29094: Add `max_active_tis_per_dagrun` for Dynamic Task Mapping

2023-01-23 Thread via GitHub


tanelk commented on code in PR #29094:
URL: https://github.com/apache/airflow/pull/29094#discussion_r1083766525


##
airflow/jobs/scheduler_job.py:
##
@@ -336,7 +344,9 @@ def _executable_task_instances_to_queued(self, max_tis: int, session: Session) -
 query = query.filter(not_(TI.dag_id.in_(starved_dags)))
 
 if starved_tasks:
-task_filter = tuple_in_condition((TaskInstance.dag_id, TaskInstance.task_id), starved_tasks)
+task_filter = tuple_in_condition(
+(TaskInstance.dag_id, TaskInstance.run_id, TaskInstance.task_id), starved_tasks

Review Comment:
   This method has a mixed use of `TaskInstance` and `TI`. While you are at it, 
could you unify the usage, at least in this method, and use `TI` everywhere?



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] mattinbits commented on issue #21171: Airflow schedules tasks in wrong timezone with an MSSQL Metadata DB on a non-UTC server

2023-01-23 Thread via GitHub


mattinbits commented on issue #21171:
URL: https://github.com/apache/airflow/issues/21171#issuecomment-147539

   As someone who has been running Airflow on MSSQL in production for several 
years, I tend to agree that explicitly removing support would be better. We're 
planning to move away from it and I wouldn't recommend it to anyone else. 
   
   Although the GitHub README makes it clear that the support is experimental, 
I don't see this mentioned here, which I assume is probably most people's entry 
point: 
https://airflow.apache.org/docs/apache-airflow/stable/howto/set-up-database.html#choosing-database-backend


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] ashb commented on a diff in pull request #27063: AIP-50 Trigger UI based on FAB

2023-01-23 Thread via GitHub


ashb commented on code in PR #27063:
URL: https://github.com/apache/airflow/pull/27063#discussion_r1083793113


##
airflow/config_templates/config.yml:
##
@@ -1529,6 +1529,19 @@ webserver:
   type: boolean
   example: ~
   default: "False"
+trigger_dag_url:

Review Comment:
   Yeah, if you could separate this that would be good.
   
   (One option for when re-doing it _might_ be store the route name in config, 
i.e. the value of the first argument to `url_for`?)
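The idea of storing a route *name* rather than a full URL can be sketched without Flask: the config holds the stable endpoint name (what one would pass as the first argument to `url_for`), and the URL is resolved at render time. Route names and templates below are invented for illustration:

```python
# Hypothetical route table; in Flask this mapping is what url_for
# consults, keyed by endpoint name.
ROUTES = {
    "DagRunModelView.trigger": "/dagrun/trigger?dag_id={dag_id}",
    "Airflow.trigger": "/trigger?dag_id={dag_id}",
}


def url_for_sketch(route_name: str, **params: str) -> str:
    """Resolve an endpoint name plus params to a URL, like url_for."""
    return ROUTES[route_name].format(**params)


# Config stores only the route name, not a hard-coded URL, so URL
# layout changes don't invalidate existing configs.
config = {"webserver": {"trigger_dag_route": "DagRunModelView.trigger"}}
route = config["webserver"]["trigger_dag_route"]
print(url_for_sketch(route, dag_id="example"))
```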



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] ashb commented on a diff in pull request #27063: AIP-50 Trigger UI based on FAB

2023-01-23 Thread via GitHub


ashb commented on code in PR #27063:
URL: https://github.com/apache/airflow/pull/27063#discussion_r1083798621


##
airflow/example_dags/example_params_ui_tutorial.py:
##
@@ -0,0 +1,234 @@
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+"""DAG demonstrating various options for a trigger form generated by DAG 
params.
+
+The DAG attribute `params` is used to define a default dictionary of 
parameters which are usually passed
+to the DAG and which are used to render a trigger form.
+"""
+from __future__ import annotations
+
+import datetime
+import json
+from pathlib import Path
+
+from airflow import DAG
+from airflow.decorators import task
+from airflow.exceptions import AirflowSkipException
+from airflow.models.dagrun import DagRun
+from airflow.models.param import Param
+from airflow.models.taskinstance import TaskInstance
+
+HTML_COLOR_PICKER_CUSTOM_CODE = """
+
+Red:
+
+
+
+ 
+
+
+Green:
+
+
+
+Blue:
+
+
+
+
+const hex_chars = "0123456789ABCDEF";
+function i2hex(name) {
+var i = document.getElementById(name).value;
+return hex_chars.substr(parseInt(i / 16), 1) + 
hex_chars.substr(parseInt(i % 16), 1)
+}
+function u_{name}() {
+document.getElementById("{name}").value = 
"#"+i2hex("r_{name}")+i2hex("g_{name}")+i2hex("b_{name}");
+document.getElementById("preview_{name}").style.background = 
document.getElementById("{name}").value;
+updateJSONconf();
+}
+function hex2i(text) {
+return hex_chars.indexOf(text.substr(0,1)) * 16 + 
hex_chars.indexOf(text.substr(1,1));
+}
+function v_{name}() {
+var value = document.getElementById("{name}").value.toUpperCase();
+document.getElementById("r_{name}").value = hex2i(value.substr(1,2));
+document.getElementById("g_{name}").value = hex2i(value.substr(3,2));
+document.getElementById("b_{name}").value = hex2i(value.substr(5,2));
+document.getElementById("preview_{name}").style.background = 
document.getElementById("{name}").value;
+}
+v_{name}();
+
+"""
+
+with DAG(
+dag_id=Path(__file__).stem,
+description=__doc__[0 : __doc__.find(".")],
+doc_md=__doc__,
+schedule=None,
+start_date=datetime.datetime(2022, 3, 4),
+catchup=False,
+tags=["example_ui"],
+params={
+# Let's start simple: Standard dict values are detected from type and 
offered as entry form fields.
+# Detected types are numbers, text, boolean, lists and dicts.
+# Note that such auto-detected parameters are treated as optional (not 
required to contain a value)
+"x": 3,
+"text": "Hello World!",
+"flag": False,
+"a_simple_list": ["one", "two", "three", "actually one value is made 
per line"],
+# But of course you might want to have it nicer! Let's add some 
description to parameters.
+# Note if you can add any HTML formatting to the description, you need 
to use the description_html
+# attribute.
+"most_loved_number": Param(
+42,
+type="integer",
+title="You favorite number",
+description_html="""Everybody should have a favorite number. Not 
only math teachers.
+If you can not think of any at the moment please think of the 42 
which is very famous because
+of the book
+
+The Hitchhiker's Guide to the Galaxy""",
+),
+# If you want to have a selection list box then you can use the enum 
feature of JSON schema
+"pick_one": Param(
+"value 1",
+type="string",
+title="Select one Value",
+description="You can use JSON schema enum's to generate drop down 
selection boxes.",
+enum=[f"value {i}" for i in range(1, 42)],
+),
+# Boolean as proper parameter with description
+"bool": Param(
+True,
+type="boolean",
+title="Please confirm",
+description="A On/Off selection with a proper description.",
+),
+# Dates and Times are also supported
+"date_time": Param(
+f"{datetime.date.today()} {datetime.time(hour=

[GitHub] [airflow] ashb commented on a diff in pull request #29080: Make static checks generated file more stable across the board

2023-01-23 Thread via GitHub


ashb commented on code in PR #29080:
URL: https://github.com/apache/airflow/pull/29080#discussion_r1083806540


##
.pre-commit-config.yaml:
##
@@ -146,6 +146,13 @@ repos:
   - --fuzzy-match-generates-todo
 files: >
   
\.cfg$|\.conf$|\.ini$|\.ldif$|\.properties$|\.readthedocs$|\.service$|\.tf$|Dockerfile.*$
+  - repo: https://github.com/psf/black
+rev: 22.12.0
+hooks:
+  - id: black
+name: Run black (python formatter)
+args: [--config=./pyproject.toml]

Review Comment:
   Nit: this is the default and shouldn't be needed



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] ephraimbuddy commented on a diff in pull request #29098: Explicit a few steps in the release process

2023-01-23 Thread via GitHub


ephraimbuddy commented on code in PR #29098:
URL: https://github.com/apache/airflow/pull/29098#discussion_r1083817696


##
dev/README_RELEASE_AIRFLOW.md:
##
@@ -119,8 +120,24 @@ git log --oneline --decorate 
apache/v2-2-stable..apache/main -- docs/apache-airf
 Those changes that are "doc-only" changes should be marked with 
`type:doc-only` label so that they
 land in documentation part of the changelog. The tool to review and assign the 
labels is described below.
 
+## Making the cherry picking
+
+To see cherry picking candidates (unmerged PR with the appropriate milestone) 
you can run:
+
+```shell
+./dev/airflow-github compare 2.1.2 --unmerged
+```
+
Be careful and verify the commit hash specified. This is a 'best effort' to 
find it, and it could be inaccurate if the PR was referenced in other commits 
after it was merged.

Review Comment:
   Let's add a note to start cherry-picking from the bottom of the list



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] ashb commented on a diff in pull request #27063: AIP-50 Trigger UI based on FAB

2023-01-23 Thread via GitHub


ashb commented on code in PR #27063:
URL: https://github.com/apache/airflow/pull/27063#discussion_r1083797766


##
airflow/example_dags/example_params_ui_tutorial.py:
##
@@ -0,0 +1,234 @@
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+"""DAG demonstrating various options for a trigger form generated by DAG 
params.
+
+The DAG attribute `params` is used to define a default dictionary of 
parameters which are usually passed
+to the DAG and which are used to render a trigger form.
+"""
+from __future__ import annotations
+
+import datetime
+import json
+from pathlib import Path
+
+from airflow import DAG
+from airflow.decorators import task
+from airflow.exceptions import AirflowSkipException
+from airflow.models.dagrun import DagRun
+from airflow.models.param import Param
+from airflow.models.taskinstance import TaskInstance
+
+HTML_COLOR_PICKER_CUSTOM_CODE = """
+
+Red:
+
+
+
+ 
+
+
+Green:
+
+
+
+Blue:
+
+
+
+
+const hex_chars = "0123456789ABCDEF";
+function i2hex(name) {
+var i = document.getElementById(name).value;
+return hex_chars.substr(parseInt(i / 16), 1) + 
hex_chars.substr(parseInt(i % 16), 1)
+}
+function u_{name}() {
+document.getElementById("{name}").value = 
"#"+i2hex("r_{name}")+i2hex("g_{name}")+i2hex("b_{name}");
+document.getElementById("preview_{name}").style.background = 
document.getElementById("{name}").value;
+updateJSONconf();
+}
+function hex2i(text) {
+return hex_chars.indexOf(text.substr(0,1)) * 16 + 
hex_chars.indexOf(text.substr(1,1));
+}
+function v_{name}() {
+var value = document.getElementById("{name}").value.toUpperCase();
+document.getElementById("r_{name}").value = hex2i(value.substr(1,2));
+document.getElementById("g_{name}").value = hex2i(value.substr(3,2));
+document.getElementById("b_{name}").value = hex2i(value.substr(5,2));
+document.getElementById("preview_{name}").style.background = 
document.getElementById("{name}").value;
+}
+v_{name}();
+
+"""
+
+with DAG(
+dag_id=Path(__file__).stem,
+description=__doc__[0 : __doc__.find(".")],
+doc_md=__doc__,
+schedule=None,
+start_date=datetime.datetime(2022, 3, 4),
+catchup=False,
+tags=["example_ui"],
+params={
+# Let's start simple: Standard dict values are detected from type and 
offered as entry form fields.
+# Detected types are numbers, text, boolean, lists and dicts.
+# Note that such auto-detected parameters are treated as optional (not 
required to contain a value)
+"x": 3,
+"text": "Hello World!",
+"flag": False,
+"a_simple_list": ["one", "two", "three", "actually one value is made 
per line"],
+# But of course you might want to have it nicer! Let's add some 
description to parameters.
+# Note if you can add any HTML formatting to the description, you need 
to use the description_html
+# attribute.
+"most_loved_number": Param(
+42,
+type="integer",
+title="You favorite number",
+description_html="""Everybody should have a favorite number. Not 
only math teachers.
+If you can not think of any at the moment please think of the 42 
which is very famous because
+of the book
+
+The Hitchhiker's Guide to the Galaxy""",
+),
+# If you want to have a selection list box then you can use the enum 
feature of JSON schema
+"pick_one": Param(
+"value 1",
+type="string",
+title="Select one Value",
+description="You can use JSON schema enum's to generate drop down 
selection boxes.",
+enum=[f"value {i}" for i in range(1, 42)],
+),
+# Boolean as proper parameter with description
+"bool": Param(
+True,
+type="boolean",
+title="Please confirm",
+description="A On/Off selection with a proper description.",
+),
+# Dates and Times are also supported
+"date_time": Param(
+f"{datetime.date.today()} {datetime.time(hour=

[GitHub] [airflow] Taragolis commented on pull request #29086: Remove upper bound limitation for `pytest`

2023-01-23 Thread via GitHub


Taragolis commented on PR #29086:
URL: https://github.com/apache/airflow/pull/29086#issuecomment-1400130898

   The CI error is not relevant to this PR. Looks like the internal_api test is flaky.
   
   ```console
   if not ignore_running:
   raise AssertionError(
   >   "Background processes are running that prevent the test 
from passing successfully."
   )
   E   AssertionError: Background processes are running that 
prevent the test from passing successfully.
   
   tests/cli/commands/test_internal_api_command.py:112: AssertionError
   ```


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] BasPH commented on a diff in pull request #29078: enhance production guide with a few Argo specific guidelines [#20999][#28637]

2023-01-23 Thread via GitHub


BasPH commented on code in PR #29078:
URL: https://github.com/apache/airflow/pull/29078#discussion_r1083850885


##
docs/helm-chart/index.rst:
##
@@ -140,3 +140,18 @@ will not start as the migrations will not be run:
 This is so these CI/CD services can perform updates without issues and 
preserve the immutability of Kubernetes Job manifests.
 
 This also applies if you install the chart using ``--wait`` in your ``helm 
install`` command.
+
+.. note::
+While deploying this Helm chart with Argo, you might encounter issues with 
database migrations not running automatically on upgrade.
+
+To ensure database migrations with Argo CD, you will need to add:
+
+.. code-block::yaml

Review Comment:
   ```suggestion
   .. code-block:: yaml
   ```
   
   RST won't display this code block without whitespace between `::` and 
`yaml`.
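For reference, a minimal before/after of the directive (indented body shown the way Sphinx expects it; the YAML values come from the diff under review):

```rst
.. Broken: no space after "::", so Sphinx does not render a literal block.
.. code-block::yaml

.. Correct: a space after "::" and an indented, blank-line-separated body.
.. code-block:: yaml

   migrateDatabaseJob:
     jobAnnotations:
       "argocd.argoproj.io/hook": Sync
```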



##
docs/helm-chart/index.rst:
##
@@ -140,3 +140,18 @@ will not start as the migrations will not be run:
 This is so these CI/CD services can perform updates without issues and 
preserve the immutability of Kubernetes Job manifests.
 
 This also applies if you install the chart using ``--wait`` in your ``helm 
install`` command.
+
+.. note::
+While deploying this Helm chart with Argo, you might encounter issues with 
database migrations not running automatically on upgrade.
+
+To ensure database migrations with Argo CD, you will need to add:
+
+.. code-block::yaml
+
migrateDatabaseJob:
  jobAnnotations:
    "argocd.argoproj.io/hook": Sync
+
+This will ensure database migrations run when the Airflow Docker image is 
upgraded. This approach has a limitation in that the database migrations will 
run every time there is a ``Sync`` event in Argo. This is a trade-off for 
automation at the cost of some computational loss.

Review Comment:
   ```suggestion
   This will run database migrations when the Airflow Docker image is upgraded. 
This approach has a limitation in that the database migrations will run every 
time there is a ``Sync`` event in Argo. This is a trade-off for automation at 
the cost of some computational loss.
   ```
   
   Avoid double usage of "ensure database migrations", reads a bit nicer.



##
docs/helm-chart/index.rst:
##
@@ -140,3 +140,18 @@ will not start as the migrations will not be run:
 This is so these CI/CD services can perform updates without issues and 
preserve the immutability of Kubernetes Job manifests.
 
 This also applies if you install the chart using ``--wait`` in your ``helm 
install`` command.
+
+.. note::
+While deploying this Helm chart with Argo, you might encounter issues with 
database migrations not running automatically on upgrade.
+
+To ensure database migrations with Argo CD, you will need to add:
+
+.. code-block::yaml
+
migrateDatabaseJob:
  jobAnnotations:
    "argocd.argoproj.io/hook": Sync
+
+This will ensure database migrations run when the Airflow Docker image is 
upgraded. This approach has a limitation in that the database migrations will 
run every time there is a ``Sync`` event in Argo. This is a trade-off for 
automation at the cost of some computational loss.
+
+If you use the Celery(Kubernetes)Executor, and using the built-in Redis, it is 
recommended that you setup a static Redis password either by supplying 
``redis.passwordSecretName`` and ``redis.data.brokerUrlSecretName`` or 
``redis.password``.  See `Celery Backend `__ 
for more information about managing Celery Backend

Review Comment:
   ```suggestion
   If you use the Celery(Kubernetes)Executor with the built-in Redis, it is 
recommended that you set up a static Redis password either by supplying 
``redis.passwordSecretName`` and ``redis.data.brokerUrlSecretName`` or 
``redis.password``.
   ```
   
   Minor spelling nitpicks.
   
   Also, would remove the last reference. That links to a section about 
managing your own Celery Backend, while the prior sentence speaks about 
managing the built-in Redis.
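To make the recommendation concrete, a hedged values fragment using the key paths as quoted in the review (check the chart's values schema for the exact paths; the secret names are placeholders):

```yaml
# Static Redis credentials via pre-created secrets, so the broker URL
# does not change between chart upgrades.
redis:
  passwordSecretName: airflow-redis-password
data:
  brokerUrlSecretName: airflow-broker-url

# Alternatively, set a literal password instead of secret references:
# redis:
#   password: my-static-password
```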



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] ashb merged pull request #29092: Fix warning in migrations about old config.

2023-01-23 Thread via GitHub


ashb merged PR #29092:
URL: https://github.com/apache/airflow/pull/29092


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] shahar1 commented on pull request #27156: Add documentation for BigQuery transfer operators

2023-01-23 Thread via GitHub


shahar1 commented on PR #27156:
URL: https://github.com/apache/airflow/pull/27156#issuecomment-1400154592

   @eladkal Please assign me


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[airflow] branch main updated (736f2e898a -> 9c3cd3803f)

2023-01-23 Thread ash
This is an automated email from the ASF dual-hosted git repository.

ash pushed a change to branch main
in repository https://gitbox.apache.org/repos/asf/airflow.git


from 736f2e898a Fix rendering parameters in PapermillOperator (#28979)
 add 9c3cd3803f Fix warning in migrations about old config. (#29092)

No new revisions were added by this update.

Summary of changes:
 .../versions/0069_2_0_0_add_scheduling_decision_to_dagrun_and_.py   | 2 +-
 docs/apache-airflow/img/airflow_erd.sha256  | 2 +-
 2 files changed, 2 insertions(+), 2 deletions(-)



[GitHub] [airflow] potiuk merged pull request #28795: Migrate Models Variable to Internal API

2023-01-23 Thread via GitHub


potiuk merged PR #28795:
URL: https://github.com/apache/airflow/pull/28795


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[airflow] branch main updated (9c3cd3803f -> bea49094be)

2023-01-23 Thread potiuk
This is an automated email from the ASF dual-hosted git repository.

potiuk pushed a change to branch main
in repository https://gitbox.apache.org/repos/asf/airflow.git


from 9c3cd3803f Fix warning in migrations about old config. (#29092)
 add bea49094be Migrate Models Variable to Internal API (#28795)

No new revisions were added by this update.

Summary of changes:
 airflow/api_internal/endpoints/rpc_api_endpoint.py |  5 +++-
 airflow/models/variable.py | 29 --
 2 files changed, 20 insertions(+), 14 deletions(-)



[GitHub] [airflow] potiuk closed issue #28271: AIP-44 Migrate Variable to Internal API

2023-01-23 Thread via GitHub


potiuk closed issue #28271: AIP-44 Migrate Variable to Internal API
URL: https://github.com/apache/airflow/issues/28271


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] auvipy commented on pull request #29086: Remove upper bound limitation for `pytest`

2023-01-23 Thread via GitHub


auvipy commented on PR #29086:
URL: https://github.com/apache/airflow/pull/29086#issuecomment-1400183570

   I think we can handle that in another PR?


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] potiuk commented on pull request #29099: Trigger Class migration to internal API

2023-01-23 Thread via GitHub


potiuk commented on PR #29099:
URL: https://github.com/apache/airflow/pull/29099#issuecomment-1400195705

   And again a conflict to resolve. (BTW, I think maybe we could split the RPC 
endpoint to be more modular and less conflict-prone; however, this is just a 
temporary problem of having several parallel PRs, so it might not be worth it.)
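One way to make such an endpoint less conflict-prone is a decorator-based registry: each module registers its own handlers instead of every PR editing one shared list. A sketch with invented names, not the internal API's actual design:

```python
from typing import Any, Callable

# In a modular layout each module would own its registry and the
# endpoint would merge them; a single dict keeps the sketch short.
_REGISTRY: dict = {}


def rpc_method(func: Callable[..., Any]) -> Callable[..., Any]:
    """Register a function as an internal-API method by name."""
    _REGISTRY[func.__name__] = func
    return func


@rpc_method
def get_variable(key: str) -> str:
    # Stand-in body; a real handler would hit the metadata DB.
    return f"value-of-{key}"


def dispatch(method: str, **params: Any) -> Any:
    if method not in _REGISTRY:
        raise ValueError(f"Unknown RPC method: {method}")
    return _REGISTRY[method](**params)


print(dispatch("get_variable", key="foo"))  # value-of-foo
```

Because registration happens next to each handler's definition, two PRs adding handlers in different modules no longer touch the same lines.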


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] potiuk commented on issue #19627: Providing a metadata secret conflicts with the use of pgbouncer

2023-01-23 Thread via GitHub


potiuk commented on issue #19627:
URL: https://github.com/apache/airflow/issues/19627#issuecomment-1400209990

   Please do. Assigned you :)


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] MattiaGallegati commented on issue #16163: Confusing log for long running tasks: "dependency 'Task Instance Not Running' FAILED: Task is in the running state"

2023-01-23 Thread via GitHub


MattiaGallegati commented on issue #16163:
URL: https://github.com/apache/airflow/issues/16163#issuecomment-1400214542

   To help: I noticed similar behaviour in 2.5.0 (with Celery, Redis, 
Postgres, default configs).
   I also found related issues that could hypothetically be closed on completion 
of this issue ( #5935 #6229 ).
   
   In my case I have a long-running task (e.g. 1 day) that logs reschedule 
attempts.
   It seems to have no practical effect on the task or the DAG: the 
task still runs and logs correctly till the end, and no other runs are created.
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[airflow] branch main updated (8805f37a7f -> 129f0820cd)

2023-01-23 Thread potiuk
This is an automated email from the ASF dual-hosted git repository.

potiuk pushed a change to branch main
in repository https://gitbox.apache.org/repos/asf/airflow.git


from 8805f37a7f Update Airflow version to 2.5.1 (#29074)
 add 129f0820cd Make static checks generated file more stable across the 
board (#29080)

No new revisions were added by this update.

Summary of changes:
 .pre-commit-config.yaml|  21 +-
 airflow/providers/common/sql/operators/sql.pyi |   4 +-
 dev/deprecations/generate_deprecated_dicts.py  | 217 -
 dev/provider_packages/prepare_provider_packages.py |  18 +-
 .../ci/pre_commit/common_precommit_black_utils.py  |  31 +--
 scripts/ci/pre_commit/common_precommit_utils.py|   3 +-
 .../pre_commit_check_pre_commit_hooks.py   |  70 +++
 .../ci/pre_commit/pre_commit_compile_www_assets.py |   3 +-
 scripts/ci/pre_commit/pre_commit_insert_extras.py  |   4 +-
 .../ci/pre_commit/pre_commit_local_yml_mounts.py   |  18 +-
 scripts/ci/pre_commit/pre_commit_mypy.py   |  13 +-
 .../pre_commit_update_common_sql_api_stubs.py  | 175 +
 12 files changed, 177 insertions(+), 400 deletions(-)
 delete mode 100644 dev/deprecations/generate_deprecated_dicts.py
 copy airflow/utils/cli_app_builder.py => 
scripts/ci/pre_commit/common_precommit_black_utils.py (51%)



[GitHub] [airflow] potiuk commented on a diff in pull request #29080: Make static checks generated file more stable across the board

2023-01-23 Thread via GitHub


potiuk commented on code in PR #29080:
URL: https://github.com/apache/airflow/pull/29080#discussion_r1083960155


##
.pre-commit-config.yaml:
##
@@ -146,6 +146,13 @@ repos:
   - --fuzzy-match-generates-todo
 files: >
   
\.cfg$|\.conf$|\.ini$|\.ldif$|\.properties$|\.readthedocs$|\.service$|\.tf$|Dockerfile.*$
+  - repo: https://github.com/psf/black
+rev: 22.12.0
+hooks:
+  - id: black
+name: Run black (python formatter)
+args: [--config=./pyproject.toml]

Review Comment:
   Yeah, we could remove it, but I also think having it explicit here might make 
it easier for anyone looking for the configuration in the future. There are a 
couple of ways it can be specified, and the `./pyproject.toml` configuration is not 
yet as popular as it should be, so this gives a hint that we are using it.






[GitHub] [airflow] potiuk merged pull request #29080: Make static checks generated file more stable across the board

2023-01-23 Thread via GitHub


potiuk merged PR #29080:
URL: https://github.com/apache/airflow/pull/29080





[airflow] branch main updated (bea49094be -> 8805f37a7f)

2023-01-23 Thread potiuk
This is an automated email from the ASF dual-hosted git repository.

potiuk pushed a change to branch main
in repository https://gitbox.apache.org/repos/asf/airflow.git


from bea49094be Migrate Models Variable to Internal API (#28795)
 add 8805f37a7f Update Airflow version to 2.5.1 (#29074)

No new revisions were added by this update.

Summary of changes:
 .github/ISSUE_TEMPLATE/airflow_bug_report.yml  |   2 +-
 Dockerfile |   2 +-
 README.md  |  14 +--
 RELEASE_NOTES.rst  | 113 +
 airflow/api_connexion/openapi/v1.yaml  |   2 +-
 airflow/utils/db.py|   3 +-
 chart/Chart.yaml   |  22 ++--
 chart/newsfragments/28074.significant.rst  |   3 -
 chart/newsfragments/29074.significant.rst  |   3 +
 chart/values.schema.json   |   4 +-
 chart/values.yaml  |   4 +-
 .../installation/supported-versions.rst|   2 +-
 newsfragments/08212.misc.rst   |   1 -
 .../ci/pre_commit/pre_commit_supported_versions.py |   2 +-
 14 files changed, 144 insertions(+), 33 deletions(-)
 delete mode 100644 chart/newsfragments/28074.significant.rst
 create mode 100644 chart/newsfragments/29074.significant.rst
 delete mode 100644 newsfragments/08212.misc.rst



[GitHub] [airflow] MrGeorgeOwl commented on pull request #25466: Auto ML assets

2023-01-23 Thread via GitHub


MrGeorgeOwl commented on PR #25466:
URL: https://github.com/apache/airflow/pull/25466#issuecomment-1400125528

   @potiuk I think that PR can be merged. I can't do that because I am not the 
author of the PR and I don't have write access.





[GitHub] [airflow] kaxil merged pull request #29014: Add deferrable mode to `DbtCloudRunJobOperator`

2023-01-23 Thread via GitHub


kaxil merged PR #29014:
URL: https://github.com/apache/airflow/pull/29014





[GitHub] [airflow] potiuk merged pull request #29074: Update Airflow version to 2.5.1

2023-01-23 Thread via GitHub


potiuk merged PR #29074:
URL: https://github.com/apache/airflow/pull/29074





[airflow] branch main updated: Add deferrable mode to `DbtCloudRunJobOperator` (#29014)

2023-01-23 Thread kaxilnaik
This is an automated email from the ASF dual-hosted git repository.

kaxilnaik pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/airflow.git


The following commit(s) were added to refs/heads/main by this push:
 new 55049c50d5 Add deferrable mode to `DbtCloudRunJobOperator` (#29014)
55049c50d5 is described below

commit 55049c50d52323e242c2387f285f0591ea38cde7
Author: Phani Kumar <94376113+phanik...@users.noreply.github.com>
AuthorDate: Mon Jan 23 17:36:24 2023 +0530

Add deferrable mode to `DbtCloudRunJobOperator` (#29014)

This PR donates the `DbtCloudRunJobOperatorAsync` from 
[astronomer-providers](https://github.com/astronomer/astronomer-providers) repo
---
 airflow/providers/dbt/cloud/hooks/dbt.py   | 110 ++-
 airflow/providers/dbt/cloud/operators/dbt.py   |  67 +---
 airflow/providers/dbt/cloud/provider.yaml  |   2 +
 airflow/providers/dbt/cloud/triggers/__init__.py   |  16 +++
 airflow/providers/dbt/cloud/triggers/dbt.py| 119 +
 .../operators.rst  |  12 +++
 generated/provider_dependencies.json   |   4 +-
 7 files changed, 314 insertions(+), 16 deletions(-)

diff --git a/airflow/providers/dbt/cloud/hooks/dbt.py 
b/airflow/providers/dbt/cloud/hooks/dbt.py
index 4b6ac2151a..3ddeeb222b 100644
--- a/airflow/providers/dbt/cloud/hooks/dbt.py
+++ b/airflow/providers/dbt/cloud/hooks/dbt.py
@@ -22,8 +22,11 @@ import warnings
 from enum import Enum
 from functools import wraps
 from inspect import signature
-from typing import Any, Callable, Sequence, Set
+from typing import Any, Callable, Sequence, Set, TypeVar, cast
 
+import aiohttp
+from aiohttp import ClientResponseError
+from asgiref.sync import sync_to_async
 from requests import PreparedRequest, Session
 from requests.auth import AuthBase
 from requests.models import Response
@@ -125,6 +128,34 @@ class DbtCloudJobRunException(AirflowException):
 """An exception that indicates a job run failed to complete."""
 
 
+T = TypeVar("T", bound=Any)
+
+
+def provide_account_id(func: T) -> T:
+"""
+Decorator which provides a fallback value for ``account_id``. If the 
``account_id`` is None or not passed
+to the decorated function, the value will be taken from the configured dbt 
Cloud Airflow Connection.
+"""
+function_signature = signature(func)
+
+@wraps(func)
+async def wrapper(*args: Any, **kwargs: Any) -> Any:
+bound_args = function_signature.bind(*args, **kwargs)
+
+if bound_args.arguments.get("account_id") is None:
+self = args[0]
+if self.dbt_cloud_conn_id:
+connection = await 
sync_to_async(self.get_connection)(self.dbt_cloud_conn_id)
+default_account_id = connection.login
+if not default_account_id:
+raise AirflowException("Could not determine the dbt Cloud 
account.")
+bound_args.arguments["account_id"] = int(default_account_id)
+
+return await func(*bound_args.args, **bound_args.kwargs)
+
+return cast(T, wrapper)
+
+
 class DbtCloudHook(HttpHook):
 """
 Interact with dbt Cloud using the V2 API.
@@ -150,6 +181,83 @@ class DbtCloudHook(HttpHook):
 super().__init__(auth_type=TokenAuth)
 self.dbt_cloud_conn_id = dbt_cloud_conn_id
 
+@staticmethod
+def get_request_url_params(
+tenant: str, endpoint: str, include_related: list[str] | None = None
+) -> tuple[str, dict[str, Any]]:
+"""
+Form URL from base url and endpoint url
+
+:param tenant: The tenant name which is need to be replaced in base 
url.
+:param endpoint: Endpoint url to be requested.
+:param include_related: Optional. List of related fields to pull with 
the run.
+Valid values are "trigger", "job", "repository", and "environment".
+"""
+data: dict[str, Any] = {}
+base_url = f"https://{tenant}.getdbt.com/api/v2/accounts/";
+if include_related:
+data = {"include_related": include_related}
+if base_url and not base_url.endswith("/") and endpoint and not 
endpoint.startswith("/"):
+url = base_url + "/" + endpoint
+else:
+url = (base_url or "") + (endpoint or "")
+return url, data
+
+async def get_headers_tenants_from_connection(self) -> tuple[dict[str, 
Any], str]:
+"""Get Headers, tenants from the connection details"""
+headers: dict[str, Any] = {}
+connection: Connection = await 
sync_to_async(self.get_connection)(self.dbt_cloud_conn_id)
+tenant: str = connection.schema if connection.schema else "cloud"
+package_name, provider_version = _get_provider_info()
+headers["User-Agent"] = f"{package_name}-v{provider_version}"
+headers["Content-Type"] = "application/json"
+headers["Authorization"] = f"Token {connection.passwo
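
The `provide_account_id` decorator added in this commit binds the call
arguments, fills in `account_id` from the configured connection when it was not
passed, and then awaits the wrapped coroutine. A generalized, standalone sketch
of that pattern is below; the `provide_default` and `Hook` names are
hypothetical illustrations, not the provider's actual API, and the dbt
Cloud/Airflow connection lookup is replaced by a plain attribute:

```python
import asyncio
from functools import wraps
from inspect import signature


def provide_default(param, fallback):
    """Decorator factory: if `param` is None or omitted on an async method,
    fill it in via `fallback(self)` before the call (hypothetical helper)."""
    def decorator(func):
        sig = signature(func)

        @wraps(func)
        async def wrapper(*args, **kwargs):
            bound = sig.bind(*args, **kwargs)
            bound.apply_defaults()
            if bound.arguments.get(param) is None:
                # args[0] is `self`, as in the provider's decorator
                bound.arguments[param] = fallback(args[0])
            return await func(*bound.args, **bound.kwargs)

        return wrapper
    return decorator


class Hook:
    """Stand-in for a hook whose connection carries a default account id."""
    default_account = 42

    @provide_default("account_id", lambda self: self.default_account)
    async def get_job(self, job_id, account_id=None):
        return account_id, job_id


print(asyncio.run(Hook().get_job("j1")))  # → (42, 'j1')
```

When `account_id` is passed explicitly it is used unchanged; only a missing or
`None` value triggers the fallback, mirroring the commit's behavior.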

[GitHub] [airflow] Taragolis commented on pull request #29086: Remove upper bound limitation for `pytest`

2023-01-23 Thread via GitHub


Taragolis commented on PR #29086:
URL: https://github.com/apache/airflow/pull/29086#issuecomment-1400225586

   > I think we can handle that in another PR?
   
   It is definitely for another PR and related only to AIP-44; this flaky error 
also appeared in `pytest` 6 yesterday.





[GitHub] [airflow] Taragolis commented on pull request #29087: Unquarantine receive SIGTERM on Task Runner test (second attempt)

2023-01-23 Thread via GitHub


Taragolis commented on PR #29087:
URL: https://github.com/apache/airflow/pull/29087#issuecomment-1400282901

   Rebase again





[GitHub] [airflow] boring-cyborg[bot] commented on issue #29102: Airflow UI not showing mapped tasks

2023-01-23 Thread boring-cyborg


boring-cyborg[bot] commented on issue #29102:
URL: https://github.com/apache/airflow/issues/29102#issuecomment-1400275979

   Thanks for opening your first issue here! Be sure to follow the issue 
template!
   





[GitHub] [airflow] ephraimbuddy commented on pull request #27829: Improving the release process

2023-01-23 Thread via GitHub


ephraimbuddy commented on PR #27829:
URL: https://github.com/apache/airflow/pull/27829#issuecomment-1400284160

   Previously, some of the commands used to check out branches were executed in 
the CI. However, because this change was not in the checked-out branch, the code 
could not be found and the process failed. To resolve this issue, the solution 
was to run the branch checkout commands in dry-run mode.





[GitHub] [airflow] mattinbits commented on issue #21171: Airflow schedules tasks in wrong timezone with an MSSQL Metadata DB on a non-UTC server

2023-01-23 Thread via GitHub


mattinbits commented on issue #21171:
URL: https://github.com/apache/airflow/issues/21171#issuecomment-1400287367

   > > @potiuk As an initial fix for our internal system, I replaced 
`func.now()` with `func.GETDATE()` in an internal fork which is enough to fix 
it for SQL Server. This is not enough for cross-database compatibility of 
course. I have considered creating something similar to the functionality of 
sqlalchemy-utc to airflow.utils.sqlalchemy, but I had second thoughts whether 
this additional complexity is necessary. Do we need to rely on the DB's version 
of "now" in this case? Can we instead use `timezone.utcnow()`, i.e. let the 
application server decide what "now" is and pass it as a literal to the DB?
   > > Here are two places where `func.now()` is used in a filter, exposing 
this issue: 
https://github.com/apache/airflow/blob/main/airflow/models/dagrun.py#L294 
https://github.com/apache/airflow/blob/main/airflow/models/dag.py#L2872
   > > I can see in other places that `timezone.utcnow()` is used: 
https://github.com/apache/airflow/blob/main/airflow/models/taskinstance.py#L293 
https://github.com/apache/airflow/blob/main/airflow/models/trigger.py#L179
   > > I'm not sure if there is a particular reason why `func.now()` is needed 
in the first two instances?
   > 
   > @mattinbits thanks for the internal fix just to clarify do you mean 
replacing func.now() with func.GETUTCDATE() instead of func.GETDATE()? When I 
made those changes you suggested with func.GETUTCDATE() it worked.
   
   You're right, it was a mistake in my initial comment. 
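
The alternative discussed above, computing "now" in the application with
`timezone.utcnow()` and passing it as a bound parameter instead of evaluating
`func.now()` on the database server, can be sketched as follows. This is a
minimal illustration of the idea, not the actual Airflow change; the commented
query line uses a hypothetical `DagRun` filter:

```python
from datetime import datetime, timezone

# Application-side UTC "now": independent of the metadata DB server's timezone,
# unlike func.now(), which MSSQL evaluates as the server-local GETDATE().
now_utc = datetime.now(timezone.utc)
assert now_utc.utcoffset().total_seconds() == 0  # always UTC, by construction

# Passed as a literal, the DB only compares values and never computes "now":
# session.query(DagRun).filter(DagRun.execution_date <= now_utc)
```

Letting the application decide what "now" is avoids per-dialect functions such
as `GETUTCDATE()` entirely, at the cost of relying on the application server's
clock.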





[GitHub] [airflow] anneadb opened a new issue, #29102: Airflow UI not showing mapped tasks

2023-01-23 Thread via GitHub


anneadb opened a new issue, #29102:
URL: https://github.com/apache/airflow/issues/29102

   ### Apache Airflow version
   
   2.5.1
   
   ### What happened
   
   I am using dynamic task mapping to process a list of files in the same way.
   The tasks are generated but I cannot see them listed in the UI tab "Mapped 
Tasks".
   When I go to the graph view I can look at the logs for the last run with all 
mapped instances.
   
   (Screenshots: 
https://user-images.githubusercontent.com/46241952/214037237-82381300-11c1-43ce-a4b9-b8f16294a74c.png 
and 
https://user-images.githubusercontent.com/46241952/214037258-026d4074-40c5-40a1-96de-74c2bd11173f.png)
   
   
   ### What you think should happen instead
   
   On the tab "Mapped Tasks" I expect to see a list of the generated tasks with 
links to their logs.
   
   ### How to reproduce
   
   * Use a docker-compose with apache/airflow:2.5.1-python3.8 as the base image.
   * Expand a python operator based on a list of file names from another task
   
   Dag code:
   
   ```
   def prep_args(file_name):
   return {"file_name": file_name}
   
   with DAG(
   "imparted_orders_lascana",
   default_args=default_args,
   schedule="19 */3 * * *",
   ) as dag:
   
   t0 = PythonOperator(
   task_id="get_file_list",
   python_callable=get_file_list,
   )
   
   t1 = PythonOperator.partial(
   task_id="transfer_data",
   python_callable=transfer_data,
   ).expand(op_kwargs=t0.output.map(prep_args))
   
   ```
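
   For context, what `.expand(op_kwargs=t0.output.map(prep_args))` computes can
be sketched in pure Python with no Airflow dependency; the file names below are
hypothetical. `.map` applies `prep_args` to each element of the upstream task's
output list, and `.expand` then creates one mapped task instance per resulting
kwargs dict:

   ```
   def prep_args(file_name):
       return {"file_name": file_name}

   # hypothetical output of the get_file_list task
   file_list = ["orders_1.csv", "orders_2.csv"]

   # one op_kwargs dict per mapped task instance
   op_kwargs_per_task = [prep_args(f) for f in file_list]
   print(op_kwargs_per_task)
   # → [{'file_name': 'orders_1.csv'}, {'file_name': 'orders_2.csv'}]
   ```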
   
   ### Operating System
   
   CentOS 7.9
   
   ### Versions of Apache Airflow Providers
   
   _No response_
   
   ### Deployment
   
   Docker-Compose
   
   ### Deployment details
   
   We're creating our own docker image based on the provided image to add a few 
additional packages.
   
   The webserver config looks like this:
   
   ```
   from __future__ import annotations
   
   import os
   
   from airflow.www.fab_security.manager import AUTH_DB
   
   basedir = os.path.abspath(os.path.dirname(__file__))
   
   # Flask-WTF flag for CSRF
   WTF_CSRF_ENABLED = True
   
   # The authentication type
   # AUTH_DB : Is for database
   AUTH_TYPE = AUTH_DB
   
   # Uncomment and set to desired role to enable access without authentication
   AUTH_ROLE_PUBLIC = "Admin"
   ```
   
   
   ### Anything else
   
   _No response_
   
   ### Are you willing to submit PR?
   
   - [ ] Yes I am willing to submit a PR!
   
   ### Code of Conduct
   
   - [X] I agree to follow this project's [Code of 
Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
   





[GitHub] [airflow] schwartzpub commented on issue #21171: Airflow schedules tasks in wrong timezone with an MSSQL Metadata DB on a non-UTC server

2023-01-23 Thread via GitHub


schwartzpub commented on issue #21171:
URL: https://github.com/apache/airflow/issues/21171#issuecomment-1400295104

   
   
   
   
   > As someone who has been running Airflow on MSSQL in production for several 
years, I tend to agree that explicitly removing support would be better. We're 
planning to move away from it and I wouldn't recommend it to anyone else.
   > 
   > Although the Github README makes it clear that the support is 
experimental, I don't see this mentioned here, which I assume is probably most 
people's entry point: 
https://airflow.apache.org/docs/apache-airflow/stable/howto/set-up-database.html#choosing-database-backend
   
   That's how I wound up using MSSQL; it wasn't immediately clear that it was 
an experimental feature. Thankfully, in our case we're still in the PoC stage, so 
making the swap to Postgres was painless. I appreciate the feedback, comments, 
and guidance.





[airflow] branch main updated: Prepare ad hoc provider release for Docker, Cassandra, Papermill (#28999)

2023-01-23 Thread eladkal
This is an automated email from the ASF dual-hosted git repository.

eladkal pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/airflow.git


The following commit(s) were added to refs/heads/main by this push:
 new dd6cef7889 Prepare ad hoc provider release for Docker, Cassandra, 
Papermill (#28999)
dd6cef7889 is described below

commit dd6cef7889884bd15d4caca8aae61f3b73c29b1e
Author: eladkal <45845474+elad...@users.noreply.github.com>
AuthorDate: Mon Jan 23 14:54:12 2023 +0200

Prepare ad hoc provider release for Docker, Cassandra, Papermill (#28999)
---
 airflow/providers/apache/cassandra/CHANGELOG.rst   |  9 +
 airflow/providers/apache/cassandra/provider.yaml   |  1 +
 airflow/providers/docker/CHANGELOG.rst | 18 ++
 airflow/providers/docker/provider.yaml |  2 +-
 airflow/providers/papermill/CHANGELOG.rst  |  9 +
 airflow/providers/papermill/provider.yaml  |  1 +
 .../commits.rst| 15 ++-
 .../index.rst  |  2 +-
 docs/apache-airflow-providers-docker/commits.rst   | 13 -
 docs/apache-airflow-providers-docker/index.rst |  2 +-
 docs/apache-airflow-providers-papermill/commits.rst| 14 +-
 docs/apache-airflow-providers-papermill/index.rst  |  2 +-
 12 files changed, 69 insertions(+), 19 deletions(-)

diff --git a/airflow/providers/apache/cassandra/CHANGELOG.rst 
b/airflow/providers/apache/cassandra/CHANGELOG.rst
index 51d14af93e..00f4b3632c 100644
--- a/airflow/providers/apache/cassandra/CHANGELOG.rst
+++ b/airflow/providers/apache/cassandra/CHANGELOG.rst
@@ -24,6 +24,15 @@
 Changelog
 -
 
+3.1.1
+.
+
+Misc
+
+
+* ``Limit dnspython to < 2.3.0 until eventlet incompatibitliy is solved 
(#28962)``
+* ``Remove limit for dnspython after eventlet got fixed (#29004)``
+
 3.1.0
 .
 
diff --git a/airflow/providers/apache/cassandra/provider.yaml 
b/airflow/providers/apache/cassandra/provider.yaml
index 961bbd3679..2055bd6c60 100644
--- a/airflow/providers/apache/cassandra/provider.yaml
+++ b/airflow/providers/apache/cassandra/provider.yaml
@@ -22,6 +22,7 @@ description: |
 `Apache Cassandra `__.
 
 versions:
+  - 3.1.1
   - 3.1.0
   - 3.0.0
   - 2.1.3
diff --git a/airflow/providers/docker/CHANGELOG.rst 
b/airflow/providers/docker/CHANGELOG.rst
index 93831e6140..4cf8bae119 100644
--- a/airflow/providers/docker/CHANGELOG.rst
+++ b/airflow/providers/docker/CHANGELOG.rst
@@ -24,17 +24,24 @@
 Changelog
 -
 
-3.4.1
+3.5.0
 .
 
-Bug Fixes
-~
+Features
+
 
 * ``Add correct widgets in Docker Hook (#28700)``
 * ``Make docker operators always use 'DockerHook' for API calls (#28363)``
+* ``Skip DockerOperator task when it returns a provided exit code (#28996)``
+
+Bug Fixes
+~
+
+* ``Fix label name for 'reauth' field in Docker Connection (#28974)``
 
 .. Below changes are excluded from the changelog. Move them to
appropriate section above if needed. Do not delete the lines(!):
+   * ``Prepare docs for Jan 2023 mid-month wave of Providers (#28929)``
 
 3.4.0
 .
@@ -45,11 +52,6 @@ Features
 * ``add hostname argument to DockerOperator (#27822)``
 * ``Move min airflow version down for Docker Provider to 2.3.0 (#28648)``
 
-
-.. Below changes are excluded from the changelog. Move them to
-   appropriate section above if needed. Do not delete the lines(!):
-
-
 3.3.0
 .
 
diff --git a/airflow/providers/docker/provider.yaml 
b/airflow/providers/docker/provider.yaml
index 2f5b3b2c33..560db294d1 100644
--- a/airflow/providers/docker/provider.yaml
+++ b/airflow/providers/docker/provider.yaml
@@ -22,7 +22,7 @@ description: |
 `Docker `__
 
 versions:
-  - 3.4.1
+  - 3.5.0
   - 3.4.0
   - 3.3.0
   - 3.2.0
diff --git a/airflow/providers/papermill/CHANGELOG.rst 
b/airflow/providers/papermill/CHANGELOG.rst
index 56f7031005..e245ae7610 100644
--- a/airflow/providers/papermill/CHANGELOG.rst
+++ b/airflow/providers/papermill/CHANGELOG.rst
@@ -24,6 +24,15 @@
 Changelog
 -
 
+3.1.1
+.
+
+
+Bug Fixes
+~
+
+* ``Fix rendering parameters in PapermillOperator (#28979)``
+
 3.1.0
 .
 
diff --git a/airflow/providers/papermill/provider.yaml 
b/airflow/providers/papermill/provider.yaml
index 966a171b50..afcfe41276 100644
--- a/airflow/providers/papermill/provider.yaml
+++ b/airflow/providers/papermill/provider.yaml
@@ -22,6 +22,7 @@ description: |
 `Papermill `__
 
 versions:
+  - 3.1.1
   - 3.1.0
   - 3.0.0
   - 2.2.3
diff --git a/docs/apache-airflow-providers-apache-cassandra/commits.rst 
b/docs/apache-airflow-providers-apache-cassandra/commits.rst
index 3e3b8439e5..e16df52f42 100644
--- a/docs/apache-airflow-providers-apache-cassandra/commits.rst
+++ b/docs/apache-airflow-providers-apache-c

[GitHub] [airflow] Taragolis commented on issue #21171: Airflow schedules tasks in wrong timezone with an MSSQL Metadata DB on a non-UTC server

2023-01-23 Thread via GitHub


Taragolis commented on issue #21171:
URL: https://github.com/apache/airflow/issues/21171#issuecomment-1400302771

   Yep, it seems there is an inconsistency between the 
[README.md](https://github.com/apache/airflow#requirements) and the 
[Documentation](https://airflow.apache.org/docs/apache-airflow/stable/howto/set-up-database.html#choosing-database-backend).
   
   Feel free to change it; you can even do it by clicking "Suggest a 
change on this page".





svn commit: r59539 - /dev/airflow/providers/

2023-01-23 Thread eladkal
Author: eladkal
Date: Mon Jan 23 13:00:01 2023
New Revision: 59539

Log:
Add artifacts for Airflow Providers 2023-01-23

Added:

dev/airflow/providers/apache-airflow-providers-apache-cassandra-3.1.1.tar.gz   
(with props)

dev/airflow/providers/apache-airflow-providers-apache-cassandra-3.1.1.tar.gz.asc

dev/airflow/providers/apache-airflow-providers-apache-cassandra-3.1.1.tar.gz.sha512
dev/airflow/providers/apache-airflow-providers-docker-3.5.0.tar.gz   (with 
props)
dev/airflow/providers/apache-airflow-providers-docker-3.5.0.tar.gz.asc
dev/airflow/providers/apache-airflow-providers-docker-3.5.0.tar.gz.sha512
dev/airflow/providers/apache-airflow-providers-papermill-3.1.1.tar.gz   
(with props)
dev/airflow/providers/apache-airflow-providers-papermill-3.1.1.tar.gz.asc
dev/airflow/providers/apache-airflow-providers-papermill-3.1.1.tar.gz.sha512

dev/airflow/providers/apache_airflow_providers_apache_cassandra-3.1.1-py3-none-any.whl
   (with props)

dev/airflow/providers/apache_airflow_providers_apache_cassandra-3.1.1-py3-none-any.whl.asc

dev/airflow/providers/apache_airflow_providers_apache_cassandra-3.1.1-py3-none-any.whl.sha512

dev/airflow/providers/apache_airflow_providers_docker-3.5.0-py3-none-any.whl   
(with props)

dev/airflow/providers/apache_airflow_providers_docker-3.5.0-py3-none-any.whl.asc

dev/airflow/providers/apache_airflow_providers_docker-3.5.0-py3-none-any.whl.sha512

dev/airflow/providers/apache_airflow_providers_papermill-3.1.1-py3-none-any.whl 
  (with props)

dev/airflow/providers/apache_airflow_providers_papermill-3.1.1-py3-none-any.whl.asc

dev/airflow/providers/apache_airflow_providers_papermill-3.1.1-py3-none-any.whl.sha512

Added: 
dev/airflow/providers/apache-airflow-providers-apache-cassandra-3.1.1.tar.gz
==
Binary file - no diff available.

Propchange: 
dev/airflow/providers/apache-airflow-providers-apache-cassandra-3.1.1.tar.gz
--
svn:mime-type = application/octet-stream

Added: 
dev/airflow/providers/apache-airflow-providers-apache-cassandra-3.1.1.tar.gz.asc
==
--- 
dev/airflow/providers/apache-airflow-providers-apache-cassandra-3.1.1.tar.gz.asc
 (added)
+++ 
dev/airflow/providers/apache-airflow-providers-apache-cassandra-3.1.1.tar.gz.asc
 Mon Jan 23 13:00:01 2023
@@ -0,0 +1,7 @@
+-BEGIN PGP SIGNATURE-
+
+iIkEABYKADEWIQSDQO8ECQokO9vDRUWG4IhmPszevgUCY86EXhMcZWxhZGthbEBh
+cGFjaGUub3JnAAoJEIbgiGY+zN6+pt8BAN8w5Fpx9zgiSsgViYHGNt6EvT7IfOuZ
+CjqLPPfu9cMKAP9rHscWOQEANggbqkXYe6LdZt3i7q96We7nsU46jridDg==
+=BzVJ
+-END PGP SIGNATURE-

Added: 
dev/airflow/providers/apache-airflow-providers-apache-cassandra-3.1.1.tar.gz.sha512
==
--- 
dev/airflow/providers/apache-airflow-providers-apache-cassandra-3.1.1.tar.gz.sha512
 (added)
+++ 
dev/airflow/providers/apache-airflow-providers-apache-cassandra-3.1.1.tar.gz.sha512
 Mon Jan 23 13:00:01 2023
@@ -0,0 +1 @@
+7ad9865b50324ab1bd919b2293b2afb9804c5c41d0b9e9fbeb0a633a79a2ba6aa018a8fbc1e01f1a0fae9cae8fe83eabdcb5a173dae98baf954af45969e01852
  apache-airflow-providers-apache-cassandra-3.1.1.tar.gz

Added: dev/airflow/providers/apache-airflow-providers-docker-3.5.0.tar.gz
==
Binary file - no diff available.

Propchange: dev/airflow/providers/apache-airflow-providers-docker-3.5.0.tar.gz
--
svn:mime-type = application/octet-stream

Added: dev/airflow/providers/apache-airflow-providers-docker-3.5.0.tar.gz.asc
==
--- dev/airflow/providers/apache-airflow-providers-docker-3.5.0.tar.gz.asc 
(added)
+++ dev/airflow/providers/apache-airflow-providers-docker-3.5.0.tar.gz.asc Mon 
Jan 23 13:00:01 2023
@@ -0,0 +1,7 @@
+-BEGIN PGP SIGNATURE-
+
+iIkEABYKADEWIQSDQO8ECQokO9vDRUWG4IhmPszevgUCY86EZxMcZWxhZGthbEBh
+cGFjaGUub3JnAAoJEIbgiGY+zN6+FMAA/ioYSXy2MChGoD2R0mnStdPBy/O+Zo+E
+VLTNjTWCJuf9AQCmnMOTFwQPYGHYdKnsOE4hX5RE4ey3rqEgHE+t/UDkBw==
+=GSXX
+-END PGP SIGNATURE-

Added: dev/airflow/providers/apache-airflow-providers-docker-3.5.0.tar.gz.sha512
==
--- dev/airflow/providers/apache-airflow-providers-docker-3.5.0.tar.gz.sha512 
(added)
+++ dev/airflow/providers/apache-airflow-providers-docker-3.5.0.tar.gz.sha512 
Mon Jan 23 13:00:01 2023
@@ -0,0 +1 @@
+c73836dc88d4c967398ecd49f1641c2b883460aca792b51ab31159cb0d0e83fa400d26fb55fbbcec8583ec9f6a60708a01ead9aae9efe363335aca06e311750a
  apache-airflow-providers-docker-3.5.0.tar.gz

Added: dev/airflow/providers/apache-airf

[airflow] annotated tag providers-papermill/3.1.1rc1 updated (dd6cef7889 -> e5a2bc0b13)

2023-01-23 Thread eladkal
This is an automated email from the ASF dual-hosted git repository.

eladkal pushed a change to annotated tag providers-papermill/3.1.1rc1
in repository https://gitbox.apache.org/repos/asf/airflow.git


*** WARNING: tag providers-papermill/3.1.1rc1 was modified! ***

from dd6cef7889 (commit)
  to e5a2bc0b13 (tag)
 tagging dd6cef7889884bd15d4caca8aae61f3b73c29b1e (commit)
 replaces providers-amazon/7.1.0
  by Elad Kalif
  on Mon Jan 23 15:23:50 2023 +0200

- Log -
Release 2023-01-23 of providers
---


No new revisions were added by this update.

Summary of changes:



[airflow] annotated tag providers-docker/3.5.0rc1 updated (dd6cef7889 -> 8f9d3b9984)

2023-01-23 Thread eladkal
This is an automated email from the ASF dual-hosted git repository.

eladkal pushed a change to annotated tag providers-docker/3.5.0rc1
in repository https://gitbox.apache.org/repos/asf/airflow.git


*** WARNING: tag providers-docker/3.5.0rc1 was modified! ***

from dd6cef7889 (commit)
  to 8f9d3b9984 (tag)
 tagging dd6cef7889884bd15d4caca8aae61f3b73c29b1e (commit)
 replaces providers-amazon/7.1.0
  by Elad Kalif
  on Mon Jan 23 15:23:50 2023 +0200

- Log -
Release 2023-01-23 of providers
---


No new revisions were added by this update.

Summary of changes:



[airflow] annotated tag providers-apache-cassandra/3.1.1rc1 updated (dd6cef7889 -> ab84fb8d4c)

2023-01-23 Thread eladkal
This is an automated email from the ASF dual-hosted git repository.

eladkal pushed a change to annotated tag providers-apache-cassandra/3.1.1rc1
in repository https://gitbox.apache.org/repos/asf/airflow.git


*** WARNING: tag providers-apache-cassandra/3.1.1rc1 was modified! ***

from dd6cef7889 (commit)
  to ab84fb8d4c (tag)
 tagging dd6cef7889884bd15d4caca8aae61f3b73c29b1e (commit)
 replaces providers-amazon/7.1.0
  by Elad Kalif
  on Mon Jan 23 15:23:50 2023 +0200

- Log -
Release 2023-01-23 of providers
---


No new revisions were added by this update.

Summary of changes:



[GitHub] [airflow] eladkal merged pull request #28999: Prepare ad hoc provider release for Docker, Cassandra, Papermill

2023-01-23 Thread via GitHub


eladkal merged PR #28999:
URL: https://github.com/apache/airflow/pull/28999





[airflow-site] branch add-documentation-2023-01-23 created (now 7a8706e300)

2023-01-23 Thread eladkal
This is an automated email from the ASF dual-hosted git repository.

eladkal pushed a change to branch add-documentation-2023-01-23
in repository https://gitbox.apache.org/repos/asf/airflow-site.git


  at 7a8706e300 Add documentation for packages - 2023-01-23

This branch includes the following new commits:

 new 7a8706e300 Add documentation for packages - 2023-01-23

The 1 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.




[GitHub] [airflow-site] eladkal closed pull request #727: Add documentation for packages - 2023-01-23

2023-01-23 Thread via GitHub


eladkal closed pull request #727: Add documentation for packages - 2023-01-23
URL: https://github.com/apache/airflow-site/pull/727





[airflow-site] branch add-documentation-2023-01-23 created (now f6b01e2405)

2023-01-23 Thread eladkal
This is an automated email from the ASF dual-hosted git repository.

eladkal pushed a change to branch add-documentation-2023-01-23
in repository https://gitbox.apache.org/repos/asf/airflow-site.git


  at f6b01e2405 Add documentation for packages - 2023-01-23

This branch includes the following new commits:

 new f6b01e2405 Add documentation for packages - 2023-01-23

The 1 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.




[GitHub] [airflow] auvipy commented on pull request #8545: [AIRFLOW-249] Refactor the SLA mechanism (Continuation from #3584 )

2023-01-23 Thread via GitHub


auvipy commented on PR #8545:
URL: https://github.com/apache/airflow/pull/8545#issuecomment-1400340680

   To take it over, where should someone focus? Fixing conflicts first?
   





[GitHub] [airflow] eladkal opened a new issue, #29103: Status of testing Providers that were prepared on January 23, 2023

2023-01-23 Thread via GitHub


eladkal opened a new issue, #29103:
URL: https://github.com/apache/airflow/issues/29103

   ### Body
   
   
   I have a kind request for all the contributors to the latest provider 
packages release.
   Could you please help us to test the RC versions of the providers?
   
   Let us know in a comment whether the issue is addressed.
   
   Those are providers that require testing as there were some substantial 
changes introduced:
   
   
   ## Provider [apache.cassandra: 
3.1.1rc1](https://pypi.org/project/apache-airflow-providers-apache-cassandra/3.1.1rc1)
  - [ ] [Limit dnspython to < 2.3.0 until eventlet incompatibitliy is 
solved (#28962)](https://github.com/apache/airflow/pull/28962): @potiuk
  - [ ] [Remove limit for dnspython after eventlet got fixed 
(#29004)](https://github.com/apache/airflow/pull/29004): @potiuk
   ## Provider [docker: 
3.5.0rc1](https://pypi.org/project/apache-airflow-providers-docker/3.5.0rc1)
  - [ ] [Add correct widgets in Docker Hook 
(#28700)](https://github.com/apache/airflow/pull/28700): @potiuk
  - [ ] [make docker operators always use `DockerHook` for API calls 
(#28363)](https://github.com/apache/airflow/pull/28363): @Taragolis
  - [ ] [Skip DockerOperator task when it returns a provided exit code 
(#28996)](https://github.com/apache/airflow/pull/28996): @hussein-awala
  - [ ] [Fix label name for `reauth` field in Docker Connection 
(#28974)](https://github.com/apache/airflow/pull/28974): @Taragolis
   ## Provider [papermill: 
3.1.1rc1](https://pypi.org/project/apache-airflow-providers-papermill/3.1.1rc1)
  - [ ] [Fix rendering parameters in PapermillOperator 
(#28979)](https://github.com/apache/airflow/pull/28979): @Taragolis
   
   The guidelines on how to test providers can be found in
   
   [Verify providers by 
contributors](https://github.com/apache/airflow/blob/main/dev/README_RELEASE_PROVIDER_PACKAGES.md#verify-by-contributors)
   
   All users involved in the PRs:
   @hussein-awala @Taragolis @potiuk
   
   
   ### Committer
   
   - [X] I acknowledge that I am a maintainer/committer of the Apache Airflow 
project.





[GitHub] [airflow] schwartzpub opened a new pull request, #29104: Update set-up-database.rst

2023-01-23 Thread via GitHub


schwartzpub opened a new pull request, #29104:
URL: https://github.com/apache/airflow/pull/29104

   Add experimental notation to MSSQL to be consistent with previous 
documentation.
   
   
   
   ---
   **^ Add meaningful description above**
   
   Read the **[Pull Request 
Guidelines](https://github.com/apache/airflow/blob/main/CONTRIBUTING.rst#pull-request-guidelines)**
 for more information.
   In case of fundamental code changes, an Airflow Improvement Proposal 
([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvement+Proposals))
 is needed.
   In case of a new dependency, check compliance with the [ASF 3rd Party 
License Policy](https://www.apache.org/legal/resolved.html#category-x).
   In case of backwards incompatible changes please leave a note in a 
newsfragment file, named `{pr_number}.significant.rst` or 
`{issue_number}.significant.rst`, in 
[newsfragments](https://github.com/apache/airflow/tree/main/newsfragments).
   





[GitHub] [airflow] boring-cyborg[bot] commented on pull request #29104: Update set-up-database.rst

2023-01-23 Thread boring-cyborg


boring-cyborg[bot] commented on PR #29104:
URL: https://github.com/apache/airflow/pull/29104#issuecomment-1400425389

   Congratulations on your first Pull Request and welcome to the Apache Airflow 
community! If you have any issues or are unsure about anything, please check
our Contribution Guide 
(https://github.com/apache/airflow/blob/main/CONTRIBUTING.rst)
   Here are some useful points:
   - Pay attention to the quality of your code (ruff, mypy and type 
annotations). Our [pre-commits]( 
https://github.com/apache/airflow/blob/main/STATIC_CODE_CHECKS.rst#prerequisites-for-pre-commit-hooks)
 will help you with that.
   - In case of a new feature add useful documentation (in docstrings or in 
`docs/` directory). Adding a new operator? Check this short 
[guide](https://github.com/apache/airflow/blob/main/docs/apache-airflow/howto/custom-operator.rst)
 Consider adding an example DAG that shows how users should use it.
   - Consider using [Breeze 
environment](https://github.com/apache/airflow/blob/main/BREEZE.rst) for 
testing locally, it's a heavy docker but it ships with a working Airflow and a 
lot of integrations.
   - Be patient and persistent. It might take some time to get a review or get 
the final approval from Committers.
   - Please follow [ASF Code of 
Conduct](https://www.apache.org/foundation/policies/conduct) for all 
communication including (but not limited to) comments on Pull Requests, Mailing 
list and Slack.
   - Be sure to read the [Airflow Coding style]( 
https://github.com/apache/airflow/blob/main/CONTRIBUTING.rst#coding-style-and-best-practices).
   Apache Airflow is a community-driven project and together we are making it 
better 🚀.
   In case of doubts contact the developers at:
   Mailing List: d...@airflow.apache.org
   Slack: https://s.apache.org/airflow-slack
   





[GitHub] [airflow] eladkal commented on issue #26424: `POST /taskInstances/list` with wildcards returns unhelpful error

2023-01-23 Thread via GitHub


eladkal commented on issue #26424:
URL: https://github.com/apache/airflow/issues/26424#issuecomment-1400453634

   @maheshsv are you still working on this issue?





[GitHub] [airflow] talnagar opened a new issue, #29105: graph disappears when using branch_task and a dynamic classic operator

2023-01-23 Thread via GitHub


talnagar opened a new issue, #29105:
URL: https://github.com/apache/airflow/issues/29105

   ### Apache Airflow version
   
   2.5.1
   
   ### What happened
   
   when using a dynamically generated task after a branch_task the graph 
doesn't render. tried with BashOperator and a KubernetesPodOperator. 
   
   the developer console in the browser shows the error:
    ```
    Uncaught TypeError: Cannot read properties of undefined (reading 'length')
        at z (graph.1c0596dfced26c638bfe.js:2:17499)
        at graph.1c0596dfced26c638bfe.js:2:17654
        at Array.map ()
        at z (graph.1c0596dfced26c638bfe.js:2:17646)
        at graph.1c0596dfced26c638bfe.js:2:26602
        at graph.1c0596dfced26c638bfe.js:2:26655
        at graph.1c0596dfced26c638bfe.js:2:26661
        at graph.1c0596dfced26c638bfe.js:2:222
        at graph.1c0596dfced26c638bfe.js:2:227
    z @ graph.1c0596dfced26c638bfe.js:2
    (anonymous) @ graph.1c0596dfced26c638bfe.js:2
    z @ graph.1c0596dfced26c638bfe.js:2
    (anonymous) @ graph.1c0596dfced26c638bfe.js:2
    (anonymous) @ graph.1c0596dfced26c638bfe.js:2
    (anonymous) @ graph.1c0596dfced26c638bfe.js:2
    (anonymous) @ graph.1c0596dfced26c638bfe.js:2
    (anonymous) @ graph.1c0596dfced26c638bfe.js:2
    ```
   
   grid view renders fine. 
   
   ### What you think should happen instead
   
   graph should be rendered. 
   
   ### How to reproduce
   
    ```python
    @dag('branch_dynamic', schedule_interval=None, default_args=default_args, catchup=False)
    def branch_dynamic_flow():
        @branch_task
        def choose_path():
            return 'b'

        @task
        def a():
            print('a')

        b = BashOperator.partial(task_id="b").expand(bash_command=["echo 1", "echo 2"])

        path = choose_path()
        path >> a()
        path >> b


    dag = branch_dynamic_flow()
    ```
   
   ### Operating System
   
   red hat
   
   ### Versions of Apache Airflow Providers
   
   apache-airflow-providers-cncf-kubernetes | 5.1.1 | Kubernetes
   
   
   ### Deployment
   
   Official Apache Airflow Helm Chart
   
   ### Deployment details
   
   _No response_
   
   ### Anything else
   
   _No response_
   
   ### Are you willing to submit PR?
   
   - [X] Yes I am willing to submit a PR!
   
   ### Code of Conduct
   
   - [X] I agree to follow this project's [Code of 
Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
   





[GitHub] [airflow] Taragolis commented on pull request #29035: Renaming nose compatible methods in flavour of regular pytest naming

2023-01-23 Thread via GitHub


Taragolis commented on PR #29035:
URL: https://github.com/apache/airflow/pull/29035#issuecomment-1400499360

   > Ah - isn't that the "airflow.jobs.backfill_job.BackfillJob" generated log ?
   
   Yeah, it is the `distributed.client:client.py` logger that creates this message
   
   ```console
   ERROR    distributed.client:client.py:1312 Failed to reconnect to scheduler 
after 30.00 seconds, closing client
   ```
   
   Let's see if it happens more often and then do something about it. I don't 
know the probability; maybe 1:1000, maybe 1:1M.





[GitHub] [airflow] Taragolis commented on issue #29103: Status of testing Providers that were prepared on January 23, 2023

2023-01-23 Thread via GitHub


Taragolis commented on issue #29103:
URL: https://github.com/apache/airflow/issues/29103#issuecomment-1400502773

   @TPapajCin @marvinfretly @nicnguyen3103 could you also have a look at the 
PapermillOperator? 
   ```pip install apache-airflow-providers-papermill==3.1.1rc1```





[GitHub] [airflow] bbovenzi commented on issue #29100: Unnecessary scrollbars in grid view

2023-01-23 Thread via GitHub


bbovenzi commented on issue #29100:
URL: https://github.com/apache/airflow/issues/29100#issuecomment-1400508370

   I've been back and forth on the best scrolling solution for the grid view 
because the right or left side could be significantly longer than the other. 
This was most notable with logs. Scrolling the entire window would mean half 
the page would just be blank and you'd lose some context.
   I'd also like to make progress on cleaning up a lot of the menus above the 
grid view, so it all fits better in a typical window height.





[GitHub] [airflow] raphaelauv opened a new pull request, #29106: [FIX] README K8S support 2.5.1

2023-01-23 Thread via GitHub


raphaelauv opened a new pull request, #29106:
URL: https://github.com/apache/airflow/pull/29106

   https://github.com/apache/airflow/pull/29074 did not update the K8S support 
in the README





[GitHub] [airflow] raphaelauv commented on pull request #29106: [FIX] README K8S support 2.5.1

2023-01-23 Thread via GitHub


raphaelauv commented on PR #29106:
URL: https://github.com/apache/airflow/pull/29106#issuecomment-1400514307

   @pierrejeambrun could you review, thank you :+1: 





[GitHub] [airflow] potiuk commented on a diff in pull request #29098: Explicit a few steps in the release process

2023-01-23 Thread via GitHub


potiuk commented on code in PR #29098:
URL: https://github.com/apache/airflow/pull/29098#discussion_r1084184091


##
dev/README_RELEASE_AIRFLOW.md:
##
@@ -477,8 +496,12 @@ To do this we need to
 twine upload -r pypitest dist/*
 ```
 
-- Verify that the test package looks good by downloading it and installing it 
into a virtual environment. The package download link is available at:
-https://test.pypi.org/project/apache-airflow/#files
+- Copy the package tar.gz download link available at 
https://test.pypi.org/project/apache-airflow/#files. Verify that the test 
package looks good by
+installing it with the constraint file into a new virtual environment. (adapt 
the constraint file python version to your virtual env setup)
+
+```shell script
+pip install 
https://test-files.pythonhosted.org/packages/98/2c/82009f7760f341cc4e12c44da657c906415130ca60cd201d05e9b38caa38/apache-airflow-2.2.1rc1.tar.gz
 --constraint 
https://raw.githubusercontent.com/apache/airflow/constraints-2.2.1rc1/constraints-3.8.txt

Review Comment:
   Maybe we can do it much simpler:
   
   ```
   python3 -m pip install --index-url https://test.pypi.org/simple/ 
 --extra-index-url https://pypi.org/simple/ apache-airflow-2.2.1rc1.tar.gz
   ```
   
   This will use both test pypi (to install RC) and regular PyPI (for all other 
packages). 
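
   As an illustration, the full verification flow with that dual-index approach 
might look like the following (a sketch only: the `2.2.1rc1` version string and 
the venv path are placeholders, and it requires network access to both test 
PyPI and regular PyPI):

```shell
# Throwaway virtual environment for verifying the RC
python3 -m venv /tmp/airflow-rc-check
. /tmp/airflow-rc-check/bin/activate

# Resolve the RC from test PyPI; let all other dependencies come from regular PyPI
python3 -m pip install \
    --index-url https://test.pypi.org/simple/ \
    --extra-index-url https://pypi.org/simple/ \
    "apache-airflow==2.2.1rc1"

# Sanity-check what got installed
airflow version
```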







[GitHub] [airflow] potiuk merged pull request #25466: Auto ML assets

2023-01-23 Thread via GitHub


potiuk merged PR #25466:
URL: https://github.com/apache/airflow/pull/25466





[airflow] branch main updated: Auto ML assets (#25466)

2023-01-23 Thread potiuk
This is an automated email from the ASF dual-hosted git repository.

potiuk pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/airflow.git


The following commit(s) were added to refs/heads/main by this push:
 new 90e6277da6 Auto ML assets (#25466)
90e6277da6 is described below

commit 90e6277da6b4102cf565134739af10bafa9d3894
Author: Maksim 
AuthorDate: Mon Jan 23 18:19:28 2023 +0300

Auto ML assets (#25466)
---
 .../cloud/example_dags/example_automl_tables.py| 319 -
 airflow/providers/google/cloud/links/automl.py | 163 +++
 airflow/providers/google/cloud/operators/automl.py | 103 ++-
 airflow/providers/google/provider.yaml |   5 +
 .../operators/cloud/automl.rst |  26 +-
 .../google/cloud/operators/test_automl.py  |  32 +--
 .../google/cloud/operators/test_automl_system.py   |  41 ---
 .../google/cloud/utils/gcp_authenticator.py|   1 -
 .../google/cloud/automl/example_automl_dataset.py  | 201 +
 .../google/cloud/automl/example_automl_model.py| 285 ++
 .../google/cloud/automl/resources/__init__.py  |  16 ++
 11 files changed, 797 insertions(+), 395 deletions(-)

diff --git 
a/airflow/providers/google/cloud/example_dags/example_automl_tables.py 
b/airflow/providers/google/cloud/example_dags/example_automl_tables.py
deleted file mode 100644
index 89006402f7..00
--- a/airflow/providers/google/cloud/example_dags/example_automl_tables.py
+++ /dev/null
@@ -1,319 +0,0 @@
-#
-# Licensed to the Apache Software Foundation (ASF) under one
-# or more contributor license agreements.  See the NOTICE file
-# distributed with this work for additional information
-# regarding copyright ownership.  The ASF licenses this file
-# to you under the Apache License, Version 2.0 (the
-# "License"); you may not use this file except in compliance
-# with the License.  You may obtain a copy of the License at
-#
-#   http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing,
-# software distributed under the License is distributed on an
-# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-# KIND, either express or implied.  See the License for the
-# specific language governing permissions and limitations
-# under the License.
-"""
-Example Airflow DAG that uses Google AutoML services.
-"""
-from __future__ import annotations
-
-import os
-from copy import deepcopy
-from datetime import datetime
-from typing import cast
-
-from airflow import models
-from airflow.models.xcom_arg import XComArg
-from airflow.providers.google.cloud.hooks.automl import CloudAutoMLHook
-from airflow.providers.google.cloud.operators.automl import (
-AutoMLBatchPredictOperator,
-AutoMLCreateDatasetOperator,
-AutoMLDeleteDatasetOperator,
-AutoMLDeleteModelOperator,
-AutoMLDeployModelOperator,
-AutoMLGetModelOperator,
-AutoMLImportDataOperator,
-AutoMLListDatasetOperator,
-AutoMLPredictOperator,
-AutoMLTablesListColumnSpecsOperator,
-AutoMLTablesListTableSpecsOperator,
-AutoMLTablesUpdateDatasetOperator,
-AutoMLTrainModelOperator,
-)
-
-START_DATE = datetime(2021, 1, 1)
-
-GCP_PROJECT_ID = os.environ.get("GCP_PROJECT_ID", "your-project-id")
-GCP_AUTOML_LOCATION = os.environ.get("GCP_AUTOML_LOCATION", "us-central1")
-GCP_AUTOML_DATASET_BUCKET = os.environ.get(
-"GCP_AUTOML_DATASET_BUCKET", "gs://INVALID BUCKET NAME/bank-marketing.csv"
-)
-TARGET = os.environ.get("GCP_AUTOML_TARGET", "Deposit")
-
-# Example values
-MODEL_ID = "TBL123456"
-DATASET_ID = "TBL123456"
-
-# Example model
-MODEL = {
-"display_name": "auto_model_1",
-"dataset_id": DATASET_ID,
-"tables_model_metadata": {"train_budget_milli_node_hours": 1000},
-}
-
-# Example dataset
-DATASET = {
-"display_name": "test_set",
-"tables_dataset_metadata": {"target_column_spec_id": ""},
-}
-
-IMPORT_INPUT_CONFIG = {"gcs_source": {"input_uris": 
[GCP_AUTOML_DATASET_BUCKET]}}
-
-extract_object_id = CloudAutoMLHook.extract_object_id
-
-
-def get_target_column_spec(columns_specs: list[dict], column_name: str) -> str:
-"""
-Using column name returns spec of the column.
-"""
-for column in columns_specs:
-if column["display_name"] == column_name:
-return extract_object_id(column)
-raise Exception(f"Unknown target column: {column_name}")
-
-
-# Example DAG to create dataset, train model_id and deploy it.
-with models.DAG(
-"example_create_and_deploy",
-start_date=START_DATE,
-catchup=False,
-user_defined_macros={
-"get_target_column_spec": get_target_column_spec,
-"target": TARGET,
-"extract_object_id": extract_object_id,
-},
-tags=["example"],
-) as create_deploy_dag:
-# [START howto_operator_automl_create_dataset]
-create_dataset_task = AutoMLCreateDatasetOperator(
-task_id="create_dataset_task",
-dataset=DA

[GitHub] [airflow] sudohainguyen commented on issue #28399: Dataproc Generator should allow creating spot instances

2023-01-23 Thread via GitHub


sudohainguyen commented on issue #28399:
URL: https://github.com/apache/airflow/issues/28399#issuecomment-1400553157

   Due to the latest change from Google 
(https://github.com/googleapis/python-dataproc/pull/512),
   I'm resuming work on this now.





[GitHub] [airflow] nirutgupta opened a new issue, #29108: Unable to login in iframe

2023-01-23 Thread via GitHub


nirutgupta opened a new issue, #29108:
URL: https://github.com/apache/airflow/issues/29108

   ### Apache Airflow version
   
   Other Airflow 2 version (please specify below)
   
   ### What happened
   
   Airflow Version : 2.3.2
   Helmchart Version: 1.3.0
   
   When trying to log in with admin credentials via an iframe, it gets 
redirected back to the login page.
   Auth backend used: airflow.api.auth.backend.basic_auth
   
   ### What you think should happen instead
   
   Should be able to log in, just as when the page is opened in a new tab.
   
   ### How to reproduce
   
   ```yaml
   webserver:
     webserverConfig:
       WTF_CSRF_ENABLED = False
   ```
   
   I am using Kubernetes Executor and auth as 
   ```
   api:
   auth_backend: airflow.api.auth.backend.basic_auth
   ```
   
   There are multiple webservers pods though it can be reproducible even in one 
webserver instance setup.
   
   ### Operating System
   
   apache/airflow:2.3.2-python3.8 this is the image I am using.
   
   ### Versions of Apache Airflow Providers
   
   2.3.2
   
   ### Deployment
   
   Official Apache Airflow Helm Chart
   
   ### Deployment details
   
   I have added the related details under the "How to reproduce" section. Putting 
the ingress section here as well.
   
   ```
   ingress:
 # Enable ingress resource
 enabled: true
   
 # Configs for the Ingress of the web Service
 web:
   # Annotations for the web Ingress
   annotations:
 kubernetes.io/ingress.class: nginx
 nginx.ingress.kubernetes.io/ssl-redirect: "false"
 host: ""
   ```
   
   ### Anything else
   
   Proper documentation is lacking for enabling the iframe setup. I know this 
behavior was added for security purposes, but it should be possible to disable 
it with a flag.

   After searching, I have tried adding these two settings under the webserver 
config section, but no luck:
   SESSION_COOKIE_SAMESITE = 'None'
   SAMESITE = 'None'
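
   For reference, a hypothetical `webserver_config.py` fragment combining the 
settings discussed above (a sketch, not a verified fix: browsers only honor 
`SameSite=None` cookies when they are also marked `Secure`, i.e. served over 
HTTPS, and disabling CSRF protection is a security trade-off):

```python
# Hypothetical webserver_config.py sketch -- not a verified fix for this issue.
# Browsers only accept SameSite=None cookies together with the Secure flag,
# so the webserver must be served over HTTPS for iframe embedding to work.
SESSION_COOKIE_SAMESITE = "None"  # allow the session cookie inside a cross-site iframe
SESSION_COOKIE_SECURE = True      # required by browsers when SameSite is "None"
WTF_CSRF_ENABLED = False          # disables CSRF protection (security trade-off)
```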
   
   ### Are you willing to submit PR?
   
   - [ ] Yes I am willing to submit a PR!
   
   ### Code of Conduct
   
   - [X] I agree to follow this project's [Code of 
Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
   





[GitHub] [airflow] saulbein commented on issue #27657: Zombie tasks detected after service restart sometimes do not retry

2023-01-23 Thread via GitHub


saulbein commented on issue #27657:
URL: https://github.com/apache/airflow/issues/27657#issuecomment-1400584723

   Updated to 2.5.0 (unfortunately right before 2.5.1). I updated the issue 
description with the current providers that we use and the log itself. Now we 
don't really get tasks restarting anymore; anything that is running during 
shutdown gets killed and doesn't attempt to retry.





[GitHub] [airflow] jedcunningham commented on a diff in pull request #29078: enhance production guide with a few Argo specific guidelines [#20999][#28637]

2023-01-23 Thread via GitHub


jedcunningham commented on code in PR #29078:
URL: https://github.com/apache/airflow/pull/29078#discussion_r1084232512


##
docs/helm-chart/index.rst:
##
@@ -140,3 +140,18 @@ will not start as the migrations will not be run:
 This is so these CI/CD services can perform updates without issues and 
preserve the immutability of Kubernetes Job manifests.
 
 This also applies if you install the chart using ``--wait`` in your ``helm 
install`` command.
+
+.. note::
+While deploying this Helm chart with Argo, you might encounter issues with 
database migrations not running automatically on upgrade.
+
+To ensure database migrations with Argo CD, you will need to add:

Review Comment:
   ```suggestion
   To run database migrations with Argo CD automatically, you will need to add:
   ```
   
   nit



##
docs/helm-chart/index.rst:
##
@@ -140,3 +140,18 @@ will not start as the migrations will not be run:
 This is so these CI/CD services can perform updates without issues and 
preserve the immutability of Kubernetes Job manifests.
 
 This also applies if you install the chart using ``--wait`` in your ``helm 
install`` command.
+
+.. note::
+While deploying this Helm chart with Argo, you might encounter issues with 
database migrations not running automatically on upgrade.
+
+To ensure database migrations with Argo CD, you will need to add:
+
+.. code-block:: yaml
+
+migrateDatabaseJob:
+jobAnnotations:
+"argocd.argoproj.io/hook": Sync
+
+This will run database migrations when the Airflow Docker image is upgraded. 
This approach has a limitation in that the database migrations will run every 
time there is a ``Sync`` event in Argo. This is a trade-off for automation at 
the cost of some computational loss.

Review Comment:
   ```suggestion
   This will run database migrations every time there is a ``Sync`` event in 
Argo CD. While it is not ideal to run the migrations on every sync, it is a 
tradeoff that allows them to be run automatically.
   ```
   
   Maybe something like this?
   
   I think we could probably do without this, as this is the same situation 
with the helm hook. Though this might be my maintainer bias speaking here, so 
happy to leave it.






[GitHub] [airflow] bbovenzi commented on a diff in pull request #27063: AIP-50 Trigger UI based on FAB

2023-01-23 Thread via GitHub


bbovenzi commented on code in PR #27063:
URL: https://github.com/apache/airflow/pull/27063#discussion_r1084245521


##
airflow/www/templates/airflow/trigger.html:
##
@@ -27,11 +27,93 @@
   
 {% endblock %}
 
+{% macro form_element(form_key, form_details) %}
+
+  
+
+  {%- if "title" in form_details.schema and form_details.schema.title -%}
+{{ form_details.schema.title }}
+  {%- else -%}
+{{ form_key }}
+  {%- endif -%}
+  {%- if form_details.schema and form_details.schema.type and not "null" 
in form_details.schema.type and not "boolean" in form_details.schema.type -%}
+ *
+  {%- endif -%}
+: 
+  
+  
+{% if "custom_html_form" in form_details.schema %}
+  {{ form_details.schema.custom_html_form | replace("{name}", "element_" + 
form_key) | replace("{value}", form_details.value) }}
+{% elif "type" in form_details.schema and form_details.schema.type == 
"boolean" %}
+  
+
+
+  
+{% elif "format" in form_details.schema and "date-time" in 
form_details.schema.format %}
+  
+calendar_today
+
+  
+{% elif "format" in form_details.schema and "date" in 
form_details.schema.format %}
+  
+calendar_today
+
+  
+{% elif "format" in form_details.schema and "time" in 
form_details.schema.format %}
+  
+calendar_today
+
+  
+{% elif "enum" in form_details.schema and form_details.schema.enum %}
+  
+{% for option in form_details.schema.enum -%}
+{{ option }}
+{% endfor -%}
+  
+{% elif form_details.schema and "array" in form_details.schema.type %}
+  
+{%- for txt in form_details.value -%}
+  {{ txt }}{{ "\n" }}
+{%- endfor -%}
+  
+{% elif form_details.schema and "object" in form_details.schema.type %}
+  
+{{- form_details.value | tojson() -}}
+  
+{% elif form_details.schema and ("integer" in form_details.schema.type or 
"number" in form_details.schema.type) %}
+  
+{% else %}
+  
+{% endif %}
+{% if form_details.description -%}

Review Comment:
   Could you include some screenshots of what all this looks like?
   I _feel_ like the description would be best above all of these fields. 






[GitHub] [airflow] boring-cyborg[bot] commented on issue #29109: [Google Cloud] DataprocCreateBatchOperator returns incorrect results and does not reattach

2023-01-23 Thread boring-cyborg


boring-cyborg[bot] commented on issue #29109:
URL: https://github.com/apache/airflow/issues/29109#issuecomment-1400596078

   Thanks for opening your first issue here! Be sure to follow the issue 
template!
   





[GitHub] [airflow] kristopherkane opened a new issue, #29109: [Google Cloud] DataprocCreateBatchOperator returns incorrect results and does not reattach

2023-01-23 Thread via GitHub


kristopherkane opened a new issue, #29109:
URL: https://github.com/apache/airflow/issues/29109

   ### Apache Airflow version
   
   main (development)
   
   ### What happened
   
   The provider operator and hooks for Google Cloud Dataproc have two bugs: 
   
   1. The running 
[operator](https://github.com/apache/airflow/blob/main/airflow/providers/google/cloud/operators/dataproc.py#L2123-L2124)
 returns success even if the job transitions to State.CANCELLED or 
State.CANCELLING 
   2. It 
[attempts](https://github.com/apache/airflow/blob/main/airflow/providers/google/cloud/operators/dataproc.py#L2154)
 to 'reattach' to a potentially running job if it AlreadyExists, but it passes 
the wrong type, since 'result' is a Batch while an Operation is expected
   
   ### What you think should happen instead
   
   A new hook that polls for batch job completion.  There is precedent for it 
in traditional dataproc with 'wait_for_job'.
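   A hedged sketch of what such a polling helper could look like (the state 
names and the `get_state` callable are illustrative only, not the real 
google-cloud-dataproc client API):

```python
import time

# Hypothetical polling helper -- state names and the get_state callable are
# illustrative, not the real google-cloud-dataproc client API.
TERMINAL_STATES = {"SUCCEEDED", "FAILED", "CANCELLED"}

def wait_for_batch(get_state, timeout=600, poll_interval=10, sleep=time.sleep):
    """Poll get_state() until the batch reaches a terminal state or we time out."""
    elapsed = 0
    while elapsed < timeout:
        state = get_state()
        if state in TERMINAL_STATES:
            # Surfacing the final state lets the operator decide whether
            # CANCELLED should fail the task instead of silently succeeding.
            return state
        sleep(poll_interval)
        elapsed += poll_interval
    raise TimeoutError("batch did not reach a terminal state in time")
```

   The point is that the caller sees the terminal state and can raise on 
CANCELLED/FAILED rather than treating any completion as success.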
   
   ### How to reproduce
   
   Use the Breeze environment and a DAG that runs DataprocCreateBatchOperator.  
Allow the first instance to start. 
   
   Use the gcloud CLI to cancel the job. 
   
   `gcloud dataproc batches cancel <batch_id> --project <project> --region 
<region>`
   
   Observe that the task completes successfully after a 3-5 minute timeout, 
even though the job was cancelled. 
   
   Run the task again with the same batch_id.  Observe the ValueError where it 
expects Operation but receives Batch
   
   
   
   ### Operating System
   
   Darwin 5806 21.6.0 Darwin Kernel Version 21.6.0: Mon Aug 22 20:17:10 PDT 
2022; root:xnu-8020.140.49~2/RELEASE_X86_64 x86_64
   
   ### Versions of Apache Airflow Providers
   
   Same as dev (main) version. 
   
   ### Deployment
   
   Other Docker-based deployment
   
   ### Deployment details
   
   Observable in the Breeze environment, when running against real Google 
Infrastructure. 
   
   ### Anything else
   
   Every time. 
   
   ### Are you willing to submit PR?
   
   - [X] Yes I am willing to submit a PR!
   
   ### Code of Conduct
   
   - [X] I agree to follow this project's [Code of 
Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
   





[GitHub] [airflow] MrGeorgeOwl commented on pull request #28406: Add defer mode to GKECreateClusterOperator and GKEDeleteClusterOperator

2023-01-23 Thread via GitHub


MrGeorgeOwl commented on PR #28406:
URL: https://github.com/apache/airflow/pull/28406#issuecomment-1400598190

   @potiuk, I think the PR is good and can be merged





[GitHub] [airflow] bbovenzi commented on a diff in pull request #27063: AIP-50 Trigger UI based on FAB

2023-01-23 Thread via GitHub


bbovenzi commented on code in PR #27063:
URL: https://github.com/apache/airflow/pull/27063#discussion_r1084252178


##
airflow/www/static/js/trigger.js:
##
@@ -19,33 +19,190 @@
 
 /* global document, CodeMirror, window */
 
-const textArea = document.getElementById('json');
+let jsonForm;
+const objectFields = new Map();
 const recentConfigList = document.getElementById('recent_configs');
-const minHeight = 300;
-const maxHeight = window.innerHeight - 450;
-const height = maxHeight > minHeight ? maxHeight : minHeight;
-
-CodeMirror.fromTextArea(textArea, {
-  lineNumbers: true,
-  mode: {
-name: 'javascript',
-json: true,
-  },
-  gutters: ['CodeMirror-lint-markers'],
-  lint: true,
-})
-  .setSize(null, height);
+
+/**
+ * Update the generated JSON DagRun.conf JSON field if any field changed
+ */
+function updateJSONconf() {
+  const jsonStart = document.getElementById('json_start').value;
+  const params = JSON.parse(jsonStart);
+  const elements = document.getElementById('trigger_form');
+  for (let i = 0; i < elements.length; i += 1) {
+if (elements[i].name && elements[i].name.startsWith('element_')) {
+  const keyName = elements[i].name.substr(8);
+  if (elements[i].type === 'checkbox') {
+params[keyName] = elements[i].checked;
+  } else if (elements[i].attributes.valuetype && 
elements[i].attributes.valuetype.value === 'array') {
+const lines = elements[i].value.split('\n');
+const values = [];
+for (let j = 0; j < lines.length; j += 1) {
+  if (lines[j].trim().length > 0) {
+values[values.length] = lines[j].trim();
+  }
+}
+params[keyName] = values;
+  } else if (elements[i].value.length === 0) {
+params[keyName] = null;
+  } else if (elements[i].attributes.valuetype && 
elements[i].attributes.valuetype.value === 'object') {
+try {
+  const textValue = objectFields.get(elements[i].name).getValue();
+  if (textValue.length > 0) {
+const objValue = JSON.parse(textValue);
+params[keyName] = objValue;
+
objectFields.get(elements[i].name).setValue(JSON.stringify(objValue, null, 4));
+  } else {
+params[keyName] = null;
+  }
+} catch (e) {
+  // ignore JSON parsing errors

Review Comment:
   Do we not show anything if there's an error with json parsing?






[GitHub] [airflow] bbovenzi commented on a diff in pull request #27063: AIP-50 Trigger UI based on FAB

2023-01-23 Thread via GitHub


bbovenzi commented on code in PR #27063:
URL: https://github.com/apache/airflow/pull/27063#discussion_r1084254664


##
airflow/www/static/js/trigger.js:
##
@@ -19,33 +19,190 @@
 
 /* global document, CodeMirror, window */
 
-const textArea = document.getElementById('json');
+let jsonForm;
+const objectFields = new Map();
 const recentConfigList = document.getElementById('recent_configs');
-const minHeight = 300;
-const maxHeight = window.innerHeight - 450;
-const height = maxHeight > minHeight ? maxHeight : minHeight;
-
-CodeMirror.fromTextArea(textArea, {
-  lineNumbers: true,
-  mode: {
-name: 'javascript',
-json: true,
-  },
-  gutters: ['CodeMirror-lint-markers'],
-  lint: true,
-})
-  .setSize(null, height);
+
+/**
+ * Update the generated JSON DagRun.conf JSON field if any field changed
+ */
+function updateJSONconf() {
+  const jsonStart = document.getElementById('json_start').value;
+  const params = JSON.parse(jsonStart);
+  const elements = document.getElementById('trigger_form');
+  for (let i = 0; i < elements.length; i += 1) {
+if (elements[i].name && elements[i].name.startsWith('element_')) {
+  const keyName = elements[i].name.substr(8);
+  if (elements[i].type === 'checkbox') {
+params[keyName] = elements[i].checked;
+  } else if (elements[i].attributes.valuetype && 
elements[i].attributes.valuetype.value === 'array') {
+const lines = elements[i].value.split('\n');
+const values = [];
+for (let j = 0; j < lines.length; j += 1) {
+  if (lines[j].trim().length > 0) {
+values[values.length] = lines[j].trim();
+  }
+}
+params[keyName] = values;
+  } else if (elements[i].value.length === 0) {
+params[keyName] = null;
+  } else if (elements[i].attributes.valuetype && 
elements[i].attributes.valuetype.value === 'object') {
+try {
+  const textValue = objectFields.get(elements[i].name).getValue();
+  if (textValue.length > 0) {
+const objValue = JSON.parse(textValue);
+params[keyName] = objValue;
+
objectFields.get(elements[i].name).setValue(JSON.stringify(objValue, null, 4));
+  } else {
+params[keyName] = null;
+  }
+} catch (e) {
+  // ignore JSON parsing errors
+}
+  } else if (Number.isNaN(elements[i].value)) {
+params[keyName] = elements[i].value;
+  } else if (elements[i].attributes.valuetype && 
elements[i].attributes.valuetype.value === 'number') {
+params[keyName] = Number(elements[i].value);
+  } else {
+params[keyName] = elements[i].value;
+  }
+}
+  }
+  jsonForm.setValue(JSON.stringify(params, null, 4));
+}
+
+/**
+ * Initialize the form during load of the web page
+ */
+function initForm() {
+  const formSectionsElement = document.getElementById('form_sections');
+  const formWithFields = (formSectionsElement != null);

Review Comment:
   ```suggestion
 const formHasFields = (formSectionsElement != null);
   ```
   Just nitpicking.






[GitHub] [airflow] kristopherkane commented on issue #29109: [Google Cloud] DataprocCreateBatchOperator returns incorrect results and does not reattach

2023-01-23 Thread via GitHub


kristopherkane commented on issue #29109:
URL: https://github.com/apache/airflow/issues/29109#issuecomment-1400609969

   That label should probably be 'providers'





[GitHub] [airflow] bbovenzi commented on a diff in pull request #27063: AIP-50 Trigger UI based on FAB

2023-01-23 Thread via GitHub


bbovenzi commented on code in PR #27063:
URL: https://github.com/apache/airflow/pull/27063#discussion_r1084258122


##
airflow/www/static/js/trigger.js:
##
@@ -19,33 +19,190 @@
 
 /* global document, CodeMirror, window */
 
-const textArea = document.getElementById('json');
+let jsonForm;
+const objectFields = new Map();
 const recentConfigList = document.getElementById('recent_configs');
-const minHeight = 300;
-const maxHeight = window.innerHeight - 450;
-const height = maxHeight > minHeight ? maxHeight : minHeight;
-
-CodeMirror.fromTextArea(textArea, {
-  lineNumbers: true,
-  mode: {
-name: 'javascript',
-json: true,
-  },
-  gutters: ['CodeMirror-lint-markers'],
-  lint: true,
-})
-  .setSize(null, height);
+
+/**
+ * Update the generated JSON DagRun.conf JSON field if any field changed
+ */
+function updateJSONconf() {
+  const jsonStart = document.getElementById('json_start').value;
+  const params = JSON.parse(jsonStart);
+  const elements = document.getElementById('trigger_form');
+  for (let i = 0; i < elements.length; i += 1) {
+if (elements[i].name && elements[i].name.startsWith('element_')) {
+  const keyName = elements[i].name.substr(8);
+  if (elements[i].type === 'checkbox') {
+params[keyName] = elements[i].checked;
+  } else if (elements[i].attributes.valuetype && 
elements[i].attributes.valuetype.value === 'array') {
+const lines = elements[i].value.split('\n');
+const values = [];
+for (let j = 0; j < lines.length; j += 1) {
+  if (lines[j].trim().length > 0) {
+values[values.length] = lines[j].trim();
+  }
+}
+params[keyName] = values;
+  } else if (elements[i].value.length === 0) {
+params[keyName] = null;
+  } else if (elements[i].attributes.valuetype && 
elements[i].attributes.valuetype.value === 'object') {
+try {
+  const textValue = objectFields.get(elements[i].name).getValue();
+  if (textValue.length > 0) {
+const objValue = JSON.parse(textValue);
+params[keyName] = objValue;
+
objectFields.get(elements[i].name).setValue(JSON.stringify(objValue, null, 4));
+  } else {
+params[keyName] = null;
+  }
+} catch (e) {
+  // ignore JSON parsing errors
+}
+  } else if (Number.isNaN(elements[i].value)) {
+params[keyName] = elements[i].value;
+  } else if (elements[i].attributes.valuetype && 
elements[i].attributes.valuetype.value === 'number') {
+params[keyName] = Number(elements[i].value);
+  } else {
+params[keyName] = elements[i].value;
+  }
+}
+  }
+  jsonForm.setValue(JSON.stringify(params, null, 4));
+}
+
+/**
+ * Initialize the form during load of the web page
+ */
+function initForm() {
+  const formSectionsElement = document.getElementById('form_sections');
+  const formWithFields = (formSectionsElement != null);
+
+  // Initialize the Generated JSON form or JSON entry form
+  const minHeight = 300;
+  const maxHeight = (formWithFields ? window.innerHeight / 2 : 
window.innerHeight) - 550;
+  const height = maxHeight > minHeight ? maxHeight : minHeight;
+  jsonForm = CodeMirror.fromTextArea(document.getElementById('json'), {
+lineNumbers: true,
+mode: { name: 'javascript', json: true },
+gutters: ['CodeMirror-lint-markers'],
+lint: true,
+  });
+  jsonForm.setSize(null, height);
+
+  if (formWithFields) {
+// Apply JSON formatting and linting to all object fields in the form
+const elements = document.getElementById('trigger_form');
+for (let i = 0; i < elements.length; i += 1) {
+  if (elements[i].name && elements[i].name.startsWith('element_')) {
+if (elements[i].attributes.valuetype
+&& elements[i].attributes.valuetype.value === 'object') {
+  const field = CodeMirror.fromTextArea(elements[i], {
+lineNumbers: true,
+mode: { name: 'javascript', json: true },
+gutters: ['CodeMirror-lint-markers'],
+lint: true,
+  });
+  /* eslint-disable no-unused-vars */
+  field.on('blur', (cm, change) => { updateJSONconf(); });

Review Comment:
   ```suggestion
 field.on('blur', updateJSONconf);
   ```
   I think this should remove the need for the `eslint-disable`






[GitHub] [airflow] potiuk commented on pull request #27156: Add documentation for BigQuery transfer operators

2023-01-23 Thread via GitHub


potiuk commented on PR #27156:
URL: https://github.com/apache/airflow/pull/27156#issuecomment-1400616833

   > @eladkal Please assign me
   
   You can just take the PR and start working on yours (and make reference to 
this PR). There is no need to assign it.





[GitHub] [airflow] alicj commented on a diff in pull request #21877: AIP-45 Remove dag parsing in airflow run local

2023-01-23 Thread via GitHub


alicj commented on code in PR #21877:
URL: https://github.com/apache/airflow/pull/21877#discussion_r1084263956


##
airflow/task/task_runner/base_task_runner.py:
##
@@ -95,24 +89,16 @@ def __init__(self, local_task_job):
 cfg_path = tmp_configuration_copy(chmod=0o600, include_env=False, 
include_cmds=False)
 
 self._cfg_path = cfg_path
-self._command = (
-popen_prepend
-+ self._task_instance.command_as_list(
-raw=True,
-pickle_id=local_task_job.pickle_id,
-mark_success=local_task_job.mark_success,
-job_id=local_task_job.id,
-pool=local_task_job.pool,
-cfg_path=cfg_path,
-)
-+ ["--error-file", self._error_file.name]

Review Comment:
   Hi @pingzh, thanks for working on Airflow! I'm curious to know why the 
`--error-file` parameter was removed here.



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] potiuk merged pull request #29086: Remove upper bound limitation for `pytest`

2023-01-23 Thread via GitHub


potiuk merged PR #29086:
URL: https://github.com/apache/airflow/pull/29086





[airflow] branch main updated (90e6277da6 -> c0f6bfa35b)

2023-01-23 Thread potiuk
This is an automated email from the ASF dual-hosted git repository.

potiuk pushed a change to branch main
in repository https://gitbox.apache.org/repos/asf/airflow.git


from 90e6277da6 Auto ML assets (#25466)
 add c0f6bfa35b Remove upper bound limitation for `pytest` (#29086)

No new revisions were added by this update.

Summary of changes:
 setup.py | 10 ++
 1 file changed, 2 insertions(+), 8 deletions(-)



[GitHub] [airflow] potiuk commented on issue #29102: Airflow UI not showing mapped tasks

2023-01-23 Thread via GitHub


potiuk commented on issue #29102:
URL: https://github.com/apache/airflow/issues/29102#issuecomment-1400621201

   CC: @bbovenzi @pierrejeambrun @ephraimbuddy - if that one is confirmed, we 
might have a need for 2.5.2 rather quickly





[GitHub] [airflow] potiuk commented on pull request #29087: Unquarantine receive SIGTERM on Task Runner test (second attempt)

2023-01-23 Thread via GitHub


potiuk commented on PR #29087:
URL: https://github.com/apache/airflow/pull/29087#issuecomment-1400623052

   All was good. Rebased again with "use public runners".





[GitHub] [airflow] potiuk commented on pull request #27829: Improving the release process

2023-01-23 Thread via GitHub


potiuk commented on PR #27829:
URL: https://github.com/apache/airflow/pull/27829#issuecomment-1400624961

   Nice!





[GitHub] [airflow] potiuk commented on pull request #27829: Improving the release process

2023-01-23 Thread via GitHub


potiuk commented on PR #27829:
URL: https://github.com/apache/airflow/pull/27829#issuecomment-1400625486

   Feel free to merge if you think it's GOOD and useful :).





[GitHub] [airflow] auvipy commented on pull request #29086: Remove upper bound limitation for `pytest`

2023-01-23 Thread via GitHub


auvipy commented on PR #29086:
URL: https://github.com/apache/airflow/pull/29086#issuecomment-1400626127

   thank you both





[GitHub] [airflow] sudohainguyen commented on issue #28399: Dataproc Generator should allow creating spot instances

2023-01-23 Thread via GitHub


sudohainguyen commented on issue #28399:
URL: https://github.com/apache/airflow/issues/28399#issuecomment-1400637319

   To create secondary workers of the SPOT type, the `google-cloud-dataproc` 
version must be `5.3.0`
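   For illustration, the cluster generator input might carry something roughly 
like this (field names assumed from the Dataproc API, not verified against a 
specific client version):

```python
# Illustrative only -- field names assumed from the Dataproc
# InstanceGroupConfig API, not verified against a specific client version.
secondary_worker_config = {
    "num_instances": 2,
    "preemptibility": "SPOT",  # assumed to require google-cloud-dataproc 5.3.0
}
```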





[GitHub] [airflow] josh-fell commented on pull request #29048: Allow downloading of dbt Cloud artifacts to non-existent paths

2023-01-23 Thread via GitHub


josh-fell commented on PR #29048:
URL: https://github.com/apache/airflow/pull/29048#issuecomment-1400637565

   Rebased and resolved conflicts.





[GitHub] [airflow] jedcunningham commented on a diff in pull request #28808: Allow setting the name for the base container within K8s Pod Operator

2023-01-23 Thread via GitHub


jedcunningham commented on code in PR #28808:
URL: https://github.com/apache/airflow/pull/28808#discussion_r1084275635


##
kubernetes_tests/test_kubernetes_pod_operator.py:
##
@@ -1153,3 +1153,134 @@ def test_using_resources(self):
 do_xcom_push=False,
 resources=resources,
 )
+
+def test_changing_base_container_name_with_get_logs(self):
+k = KubernetesPodOperator(
+namespace="default",
+image="ubuntu:16.04",
+cmds=["bash", "-cx"],
+arguments=["echo 10"],
+labels=self.labels,
+task_id=str(uuid4()),
+in_cluster=False,
+do_xcom_push=False,
+get_logs=True,
+base_container_name="apple-sauce",
+)
+assert k.base_container_name == "apple-sauce"
+context = create_context(k)
+with mock.patch.object(
+k.pod_manager, "fetch_container_logs", 
wraps=k.pod_manager.fetch_container_logs
+) as mock_fetch_container_logs:
+k.execute(context)
+
+assert mock_fetch_container_logs.call_args[1]["container_name"] == 
"apple-sauce"
+actual_pod = self.api_client.sanitize_for_serialization(k.pod)
+self.expected_pod["spec"]["containers"][0]["name"] = "apple-sauce"
+assert self.expected_pod["spec"] == actual_pod["spec"]
+
+def test_changing_base_container_name_no_logs(self):
+"""
+This test checks BOTH a modified base container name AND the 
get_logs=False flow..
+  and as a result, also checks that the flow works with fast containers
+  See #26796

Review Comment:
   ```suggestion
   This test checks BOTH a modified base container name AND the 
get_logs=False flow,
   and as a result, also checks that the flow works with fast containers
   See https://github.com/apache/airflow/issues/26796
   ```



##
kubernetes_tests/test_kubernetes_pod_operator.py:
##
@@ -1153,3 +1153,134 @@ def test_using_resources(self):
 do_xcom_push=False,
 resources=resources,
 )
+
+def test_changing_base_container_name_with_get_logs(self):
+k = KubernetesPodOperator(
+namespace="default",
+image="ubuntu:16.04",
+cmds=["bash", "-cx"],
+arguments=["echo 10"],
+labels=self.labels,
+task_id=str(uuid4()),
+in_cluster=False,
+do_xcom_push=False,
+get_logs=True,
+base_container_name="apple-sauce",
+)
+assert k.base_container_name == "apple-sauce"
+context = create_context(k)
+with mock.patch.object(
+k.pod_manager, "fetch_container_logs", 
wraps=k.pod_manager.fetch_container_logs
+) as mock_fetch_container_logs:
+k.execute(context)
+
+assert mock_fetch_container_logs.call_args[1]["container_name"] == 
"apple-sauce"
+actual_pod = self.api_client.sanitize_for_serialization(k.pod)
+self.expected_pod["spec"]["containers"][0]["name"] = "apple-sauce"
+assert self.expected_pod["spec"] == actual_pod["spec"]
+
+def test_changing_base_container_name_no_logs(self):
+"""
+This test checks BOTH a modified base container name AND the 
get_logs=False flow..
+  and as a result, also checks that the flow works with fast containers
+  See #26796
+"""
+k = KubernetesPodOperator(
+namespace="default",
+image="ubuntu:16.04",
+cmds=["bash", "-cx"],
+arguments=["echo 10"],
+labels=self.labels,
+task_id=str(uuid4()),
+in_cluster=False,
+do_xcom_push=False,
+get_logs=False,
+base_container_name="apple-sauce",
+)
+assert k.base_container_name == "apple-sauce"
+context = create_context(k)
+with mock.patch.object(
+k.pod_manager, "await_container_completion", 
wraps=k.pod_manager.await_container_completion
+) as mock_await_container_completion:
+k.execute(context)
+
+assert mock_await_container_completion.call_args[1]["container_name"] 
== "apple-sauce"
+actual_pod = self.api_client.sanitize_for_serialization(k.pod)
+self.expected_pod["spec"]["containers"][0]["name"] = "apple-sauce"
+assert self.expected_pod["spec"] == actual_pod["spec"]
+
+def test_changing_base_container_name_no_logs_long(self):
+"""
+Similar to test_changing_base_container_name_no_logs, but ensures that
+pods running longer than 1 second work too. See #26796

Review Comment:
   ```suggestion
   pods running longer than 1 second work too.
   See https://github.com/apache/airflow/issues/26796
   ```




[GitHub] [airflow] sean-rose commented on a diff in pull request #29065: When clearing task instances try to get associated DAGs from database

2023-01-23 Thread via GitHub


sean-rose commented on code in PR #29065:
URL: https://github.com/apache/airflow/pull/29065#discussion_r1084283710


##
tests/models/test_cleartasks.py:
##
@@ -202,9 +203,9 @@ def test_clear_task_instances_without_task(self, dag_maker):
         # but it works for our case because we specifically constructed test DAGS
         # in the way that those two sort methods are equivalent
         qry = session.query(TI).filter(TI.dag_id == dag.dag_id).order_by(TI.task_id).all()
-        clear_task_instances(qry, session)
+        clear_task_instances(qry, session, dag=dag)

Review Comment:
   It wasn't failing outright, but it wasn't testing the intended logic.  With 
no DAG passed in it was essentially testing the exact same codepath as 
`test_clear_task_instances_without_dag`, where it short circuited [this 
conditional](https://github.com/apache/airflow/blob/c0f6bfa35b82f9a756b31bbba739c9918a28c771/airflow/models/taskinstance.py#L193)
 and never ran `dag.has_task(task_id)`.
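
The short circuit in question is ordinary `X and Y` evaluation: when the left operand is falsy, the right side is never executed, so passing `dag=None` silently skips the `has_task` lookup. A minimal illustration of the pattern (hypothetical names, not Airflow's actual code):

```python
class Dag:
    """Toy DAG that counts how often has_task is consulted."""

    def __init__(self, task_ids):
        self.task_ids = set(task_ids)
        self.calls = 0

    def has_task(self, task_id):
        self.calls += 1
        return task_id in self.task_ids


def resolve_task(dag, task_id):
    # Mirrors the guarded lookup: with dag=None the conditional
    # short-circuits and has_task is never called.
    if dag and dag.has_task(task_id):
        return "task found on dag"
    return "fallback path"


dag = Dag({"extract", "load"})
assert resolve_task(None, "extract") == "fallback path"  # has_task never runs
assert resolve_task(dag, "extract") == "task found on dag"
assert dag.calls == 1  # only the second call consulted the DAG
```

That is why a test that never passes a `dag` exercises only the fallback path, exactly as described above.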






[GitHub] [airflow] kristopherkane opened a new pull request, #29111: Dataproc batches

2023-01-23 Thread via GitHub


kristopherkane opened a new pull request, #29111:
URL: https://github.com/apache/airflow/pull/29111

   #28399 





[GitHub] [airflow] boring-cyborg[bot] commented on pull request #29111: Dataproc batches

2023-01-23 Thread boring-cyborg


boring-cyborg[bot] commented on PR #29111:
URL: https://github.com/apache/airflow/pull/29111#issuecomment-1400644355

   Congratulations on your first Pull Request and welcome to the Apache Airflow 
community! If you have any issues or are unsure about anything, please check 
our Contribution Guide 
(https://github.com/apache/airflow/blob/main/CONTRIBUTING.rst)
   Here are some useful points:
   - Pay attention to the quality of your code (ruff, mypy and type 
annotations). Our [pre-commits]( 
https://github.com/apache/airflow/blob/main/STATIC_CODE_CHECKS.rst#prerequisites-for-pre-commit-hooks)
 will help you with that.
   - In case of a new feature add useful documentation (in docstrings or in 
`docs/` directory). Adding a new operator? Check this short 
[guide](https://github.com/apache/airflow/blob/main/docs/apache-airflow/howto/custom-operator.rst)
 Consider adding an example DAG that shows how users should use it.
   - Consider using [Breeze 
environment](https://github.com/apache/airflow/blob/main/BREEZE.rst) for 
testing locally, it's a heavy docker but it ships with a working Airflow and a 
lot of integrations.
   - Be patient and persistent. It might take some time to get a review or get 
the final approval from Committers.
   - Please follow [ASF Code of 
Conduct](https://www.apache.org/foundation/policies/conduct) for all 
communication including (but not limited to) comments on Pull Requests, Mailing 
list and Slack.
   - Be sure to read the [Airflow Coding style]( 
https://github.com/apache/airflow/blob/main/CONTRIBUTING.rst#coding-style-and-best-practices).
   Apache Airflow is a community-driven project and together we are making it 
better 🚀.
   In case of doubts contact the developers at:
   Mailing List: d...@airflow.apache.org
   Slack: https://s.apache.org/airflow-slack
   





[GitHub] [airflow] kristopherkane commented on issue #29109: [Google Cloud] DataprocCreateBatchOperator returns incorrect results and does not reattach

2023-01-23 Thread via GitHub


kristopherkane commented on issue #29109:
URL: https://github.com/apache/airflow/issues/29109#issuecomment-1400653822

   How I tested this with the Breeze environment and a real GCP project: 
   
   - Batches DAG run from start to finish successfully
   - Batches DAG run that is cancelled after submission, returns 
AirflowException
   - Batches DAG run where the Airflow task PID is `kill -9` while the job is 
running.  Task retry happens, AlreadyExists is caught, reattach happens on 
first retry since job is in State.RUNNING
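
The reattach behaviour exercised in the last scenario follows a common submit-or-reattach pattern: a retried task that hits `AlreadyExists` reattaches to the running job instead of failing or duplicating work. A generic sketch of that pattern — the service class, states, and exception here are illustrative, not the Google provider's API:

```python
class AlreadyExists(Exception):
    """Raised when a batch with the same id was already submitted."""


class FakeBatchService:
    """Stand-in for a remote batch service keyed by batch_id."""

    def __init__(self):
        self.jobs = {}

    def create(self, batch_id):
        if batch_id in self.jobs:
            raise AlreadyExists(batch_id)
        self.jobs[batch_id] = "RUNNING"

    def get_state(self, batch_id):
        return self.jobs[batch_id]


def submit_or_reattach(service, batch_id):
    # First attempt creates the job; a retry after a killed worker
    # catches AlreadyExists and reattaches to the running job.
    try:
        service.create(batch_id)
        return "created"
    except AlreadyExists:
        if service.get_state(batch_id) == "RUNNING":
            return "reattached"
        raise


svc = FakeBatchService()
assert submit_or_reattach(svc, "batch-1") == "created"
# Simulate a task retry after the worker was killed mid-run:
assert submit_or_reattach(svc, "batch-1") == "reattached"
```

The `kill -9` test above verifies exactly this: the retry does not resubmit, it reattaches because the remote job is still in a running state.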





[GitHub] [airflow] potiuk commented on pull request #8545: [AIRFLOW-249] Refactor the SLA mechanism (Continuation from #3584 )

2023-01-23 Thread via GitHub


potiuk commented on PR #8545:
URL: https://github.com/apache/airflow/pull/8545#issuecomment-1400660574

   > to take it over where should some one focus? fixing conflicts first
   
   I guess, with how far behind this change is and how many conflicts it has, 
starting from scratch and following the ideas here is a far better idea.





[GitHub] [airflow] potiuk merged pull request #29077: add retries to stop_pipeline on conflict

2023-01-23 Thread via GitHub


potiuk merged PR #29077:
URL: https://github.com/apache/airflow/pull/29077





[GitHub] [airflow] ephraimbuddy commented on pull request #27829: Improving the release process

2023-01-23 Thread via GitHub


ephraimbuddy commented on PR #27829:
URL: https://github.com/apache/airflow/pull/27829#issuecomment-1400675244

   > Feel free to merge if you think it's GOOD and useful :).
   
   Been thinking about the useful side of it :) let @jedcunningham decide 





[airflow] branch main updated (c0f6bfa35b -> 6190e34388)

2023-01-23 Thread potiuk
This is an automated email from the ASF dual-hosted git repository.

potiuk pushed a change to branch main
in repository https://gitbox.apache.org/repos/asf/airflow.git


from c0f6bfa35b Remove upper bound limitation for `pytest` (#29086)
 add 6190e34388 add retries to stop_pipeline on conflict (#29077)

No new revisions were added by this update.

Summary of changes:
 airflow/providers/amazon/aws/hooks/sagemaker.py| 42 +++---
 tests/providers/amazon/aws/hooks/test_sagemaker.py | 40 +++--
 2 files changed, 67 insertions(+), 15 deletions(-)



[GitHub] [airflow] potiuk commented on a diff in pull request #29077: add retries to stop_pipeline on conflict

2023-01-23 Thread via GitHub


potiuk commented on code in PR #29077:
URL: https://github.com/apache/airflow/pull/29077#discussion_r1084317558


##
airflow/providers/amazon/aws/hooks/sagemaker.py:
##
@@ -1154,19 +1154,35 @@ def stop_pipeline(
         :return: Status of the pipeline execution after the operation.
             One of 'Executing'|'Stopping'|'Stopped'|'Failed'|'Succeeded'.
         """
-        try:
-            self.conn.stop_pipeline_execution(PipelineExecutionArn=pipeline_exec_arn)
-        except ClientError as ce:
-            # we have to rely on the message to catch the right error here, because its type
-            # (ValidationException) is shared with other kinds of error (for instance, badly formatted ARN)
-            if (
-                not fail_if_not_running
-                and "Only pipelines with 'Executing' status can be stopped" in ce.response["Error"]["Message"]
-            ):
-                self.log.warning("Cannot stop pipeline execution, as it was not running: %s", ce)
-            else:
-                self.log.error(ce)
-                raise
+        retries = 2  # i.e. 3 calls max, 1 initial + 2 retries

Review Comment:
   For those cases, I think we should only add configurable options if we 
cannot figure out sensible defaults - we already have too many options across 
Airflow code. 
   
   Number of retries in this case is kind of reasonable (even if it is magic). 
And I also think having a comment like that:
   ```
retries = 2  # i.e. 3 calls max, 1 initial + 2 retries
   ```
   
   is FAR better than a constant defined somewhere at the top of the file. You 
do not have to look it up in the file. There is absolutely no reason to create 
a constant if it is used in one place and comment describing it. It does not 
solve any problem, it only adds the need to make an extra lookup.
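
The inline-comment style argued for above, applied to a minimal retry loop. This is a sketch of the retry-on-conflict idea, not the actual SageMaker hook code:

```python
def stop_with_retries(stop_fn, is_conflict):
    """Call stop_fn, retrying only on conflict-style errors."""
    retries = 2  # i.e. 3 calls max, 1 initial + 2 retries
    for attempt in range(retries + 1):
        try:
            return stop_fn()
        except RuntimeError as err:
            # Retry only conflicts; re-raise anything else, and
            # re-raise conflicts too once the budget is exhausted.
            if not is_conflict(err) or attempt == retries:
                raise


calls = []


def flaky_stop():
    # Fails with a conflict twice, then succeeds on the third call.
    calls.append(1)
    if len(calls) < 3:
        raise RuntimeError("conflict")
    return "Stopped"


assert stop_with_retries(flaky_stop, lambda e: "conflict" in str(e)) == "Stopped"
assert len(calls) == 3  # 1 initial call + 2 retries
```

The budget lives next to the loop that consumes it, so the comment explains the magic number exactly where a reader meets it — no file-top constant lookup required.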






[GitHub] [airflow] josh-fell closed issue #27107: Dbt cloud download artifact to a path not present fails

2023-01-23 Thread via GitHub


josh-fell closed issue #27107: Dbt cloud download artifact to a path not 
present fails
URL: https://github.com/apache/airflow/issues/27107





[GitHub] [airflow] josh-fell merged pull request #29048: Allow downloading of dbt Cloud artifacts to non-existent paths

2023-01-23 Thread via GitHub


josh-fell merged PR #29048:
URL: https://github.com/apache/airflow/pull/29048





[airflow] branch main updated (6190e34388 -> f805b4154a)

2023-01-23 Thread joshfell
This is an automated email from the ASF dual-hosted git repository.

joshfell pushed a change to branch main
in repository https://gitbox.apache.org/repos/asf/airflow.git


from 6190e34388 add retries to stop_pipeline on conflict (#29077)
 add f805b4154a Allow downloading of dbt Cloud artifacts to non-existent 
paths (#29048)

No new revisions were added by this update.

Summary of changes:
 airflow/providers/dbt/cloud/operators/dbt.py   | 12 -
 .../dbt/cloud/operators/test_dbt_cloud.py  | 56 --
 2 files changed, 62 insertions(+), 6 deletions(-)


