[GitHub] [airflow] mlnsharma commented on issue #27282: KubernetesPodOperator: Option to show logs from all containers in a pod

2022-12-20 Thread GitBox


mlnsharma commented on issue #27282:
URL: https://github.com/apache/airflow/issues/27282#issuecomment-1360965724

   Hi @pulquero @bdsoha @potiuk 
   I have my code changes ready; could you please review and merge them?


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[airflow] branch constraints-main updated: Updating constraints. Build id:

2022-12-20 Thread github-bot
This is an automated email from the ASF dual-hosted git repository.

github-bot pushed a commit to branch constraints-main
in repository https://gitbox.apache.org/repos/asf/airflow.git


The following commit(s) were added to refs/heads/constraints-main by this push:
 new 80e826fd5a Updating constraints. Build id:
80e826fd5a is described below

commit 80e826fd5a1fde8ac0a023889c2b70c0210e4bcd
Author: Automated GitHub Actions commit 
AuthorDate: Wed Dec 21 07:23:10 2022 +

Updating constraints. Build id:

This update in constraints is automatically committed by the CI 
'constraints-push' step based on
HEAD of '' in ''
with commit sha .

All tests passed in this build so we determined we can push the updated 
constraints.

See 
https://github.com/apache/airflow/blob/main/README.md#installing-from-pypi for 
details.
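
For context, a hedged example of how these generated constraints files are typically consumed (the Airflow and Python versions below are placeholders; see the README link above for the canonical form):

```bash
# Illustrative only: install Airflow pinned by the constraints file generated
# on the constraints-main branch for the chosen Python version.
pip install "apache-airflow==2.5.0" \
  --constraint "https://raw.githubusercontent.com/apache/airflow/constraints-main/constraints-3.8.txt"
```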
---
 constraints-3.10.txt  | 12 ++--
 constraints-3.7.txt   | 12 ++--
 constraints-3.8.txt   | 12 ++--
 constraints-3.9.txt   | 12 ++--
 constraints-no-providers-3.10.txt |  4 ++--
 constraints-no-providers-3.7.txt  |  4 ++--
 constraints-no-providers-3.8.txt  |  4 ++--
 constraints-no-providers-3.9.txt  |  4 ++--
 constraints-source-providers-3.10.txt | 14 +++---
 constraints-source-providers-3.7.txt  | 14 +++---
 constraints-source-providers-3.8.txt  | 14 +++---
 constraints-source-providers-3.9.txt  | 14 +++---
 12 files changed, 60 insertions(+), 60 deletions(-)

diff --git a/constraints-3.10.txt b/constraints-3.10.txt
index 5e65d892b8..e63669a98a 100644
--- a/constraints-3.10.txt
+++ b/constraints-3.10.txt
@@ -1,5 +1,5 @@
 #
-# This constraints file was automatically generated on 2022-12-20T09:16:44Z
+# This constraints file was automatically generated on 2022-12-21T07:22:29Z
 # via "eager-upgrade" mechanism of PIP. For the "main" branch of Airflow.
 # This variant of constraints install uses the HEAD of the branch version for 
'apache-airflow' but installs
 # the providers from PIP-released packages at the moment of the constraint 
generation.
@@ -173,9 +173,9 @@ billiard==3.6.4.0
 black==23.1a1
 bleach==5.0.1
 blinker==1.5
-boto3==1.26.33
+boto3==1.26.34
 boto==2.49.0
-botocore==1.29.33
+botocore==1.29.34
 bowler==0.9.0
 cachelib==0.9.0
 cachetools==4.2.2
@@ -313,7 +313,7 @@ gunicorn==20.1.0
 h11==0.14.0
 hdfs==2.7.0
 hmsclient==0.1.1
-httpcore==0.16.2
+httpcore==0.16.3
 httplib2==0.20.4
 httpx==0.23.1
 humanize==4.4.0
@@ -595,7 +595,7 @@ types-python-dateutil==2.8.19.5
 types-python-slugify==7.0.0.1
 types-pytz==2022.7.0.0
 types-redis==4.3.21.6
-types-requests==2.28.11.5
+types-requests==2.28.11.6
 types-setuptools==65.6.0.2
 types-tabulate==0.9.0.0
 types-termcolor==1.1.6
@@ -610,7 +610,7 @@ unicodecsv==0.14.1
 uritemplate==3.0.1
 urllib3==1.26.13
 userpath==1.8.0
-vertica-python==1.1.1
+vertica-python==1.2.0
 vine==5.0.0
 virtualenv==20.17.1
 volatile==2.1.0
diff --git a/constraints-3.7.txt b/constraints-3.7.txt
index 2add0a574c..774987c1c6 100644
--- a/constraints-3.7.txt
+++ b/constraints-3.7.txt
@@ -1,5 +1,5 @@
 #
-# This constraints file was automatically generated on 2022-12-20T09:17:22Z
+# This constraints file was automatically generated on 2022-12-21T07:23:06Z
 # via "eager-upgrade" mechanism of PIP. For the "main" branch of Airflow.
 # This variant of constraints install uses the HEAD of the branch version for 
'apache-airflow' but installs
 # the providers from PIP-released packages at the moment of the constraint 
generation.
@@ -173,9 +173,9 @@ billiard==3.6.4.0
 black==23.1a1
 bleach==5.0.1
 blinker==1.5
-boto3==1.26.33
+boto3==1.26.34
 boto==2.49.0
-botocore==1.29.33
+botocore==1.29.34
 bowler==0.9.0
 cached-property==1.5.2
 cachelib==0.9.0
@@ -313,7 +313,7 @@ gunicorn==20.1.0
 h11==0.14.0
 hdfs==2.7.0
 hmsclient==0.1.1
-httpcore==0.16.2
+httpcore==0.16.3
 httplib2==0.20.4
 httpx==0.23.1
 humanize==4.4.0
@@ -596,7 +596,7 @@ types-python-dateutil==2.8.19.5
 types-python-slugify==7.0.0.1
 types-pytz==2022.7.0.0
 types-redis==4.3.21.6
-types-requests==2.28.11.5
+types-requests==2.28.11.6
 types-setuptools==65.6.0.2
 types-tabulate==0.9.0.0
 types-termcolor==1.1.6
@@ -611,7 +611,7 @@ unicodecsv==0.14.1
 uritemplate==3.0.1
 urllib3==1.26.13
 userpath==1.8.0
-vertica-python==1.1.1
+vertica-python==1.2.0
 vine==5.0.0
 virtualenv==20.17.1
 volatile==2.1.0
diff --git a/constraints-3.8.txt b/constraints-3.8.txt
index e8349047a7..a4fd167aa2 100644
--- a/constraints-3.8.txt
+++ b/constraints-3.8.txt
@@ -1,5 +1,5 @@
 #
-# This constraints file was automatically generated on 2022-12-20T09:17:16Z
+# This constraints file was automatically generated on 2022-12-21T07:22:57Z
 # via "eager-upgrade" mechanism of PIP. For the "main" branch of Airflow.
 # This variant of constraints install uses the HEAD of the branch version for 
'apache-airflow' but installs
 # the providers from PIP-released packages at the m

[GitHub] [airflow] petrosbaltzis commented on issue #28465: Airflow 2.2.4 Jenkins Connection - unable to set as the hook expects to be

2022-12-20 Thread GitBox


petrosbaltzis commented on issue #28465:
URL: https://github.com/apache/airflow/issues/28465#issuecomment-1360940066

   Hi @Taragolis, yes, please assign it to me; I can work on it.
   
   The release version is not a problem either 👍 


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[airflow] branch main updated: Fix selective checks handling error tracebacks in CI (#28514)

2022-12-20 Thread jedcunningham
This is an automated email from the ASF dual-hosted git repository.

jedcunningham pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/airflow.git


The following commit(s) were added to refs/heads/main by this push:
 new bc7feda66e Fix selective checks handling error tracebacks in CI 
(#28514)
bc7feda66e is described below

commit bc7feda66ed7bb2f2940fa90ef26ff90dd7a8c80
Author: Jarek Potiuk 
AuthorDate: Wed Dec 21 07:17:19 2022 +0100

Fix selective checks handling error tracebacks in CI (#28514)

Initially, selective check was implemented in such a way that it printed
diagnostic output on stdout and the GITHUB_OUTPUT-compatible set of
outputs on stderr, so that stderr could be redirected to GITHUB_OUTPUT
in its entirety. This turned out to be a bad idea: when an error was
raised in selective-checks themselves, the traceback was printed to stderr,
and redirecting stderr to GITHUB_OUTPUT swallowed the traceback.

This change reverses the behaviour:

* diagnostic output is printed to stderr
* GITHUB_OUTPUT compatible output is printed to stdout

This way, when a traceback happens, it is printed to stderr and is not
swallowed by the redirection to GITHUB_OUTPUT.
---
 .github/workflows/ci.yml   |  3 +-
 .../src/airflow_breeze/commands/ci_commands.py | 12 
 .../src/airflow_breeze/utils/github_actions.py |  4 +--
 .../src/airflow_breeze/utils/selective_checks.py   | 36 +++---
 4 files changed, 28 insertions(+), 27 deletions(-)

diff --git a/.github/workflows/ci.yml b/.github/workflows/ci.yml
index 7cf54ccb7b..83d55acfea 100644
--- a/.github/workflows/ci.yml
+++ b/.github/workflows/ci.yml
@@ -221,7 +221,8 @@ jobs:
 env:
   PR_LABELS: "${{ steps.source-run-info.outputs.pr-labels }}"
   COMMIT_REF: "${{ github.sha }}"
-run: breeze ci selective-check 2>> ${GITHUB_OUTPUT}
+  VERBOSE: "false"
+run: breeze ci selective-check >> ${GITHUB_OUTPUT}
   - name: env
 run: printenv
 env:
diff --git a/dev/breeze/src/airflow_breeze/commands/ci_commands.py 
b/dev/breeze/src/airflow_breeze/commands/ci_commands.py
index 39de15a76a..e62875630b 100644
--- a/dev/breeze/src/airflow_breeze/commands/ci_commands.py
+++ b/dev/breeze/src/airflow_breeze/commands/ci_commands.py
@@ -50,7 +50,7 @@ from airflow_breeze.utils.common_options import (
 option_verbose,
 )
 from airflow_breeze.utils.confirm import Answer, user_confirm
-from airflow_breeze.utils.console import get_console
+from airflow_breeze.utils.console import get_console, get_stderr_console
 from airflow_breeze.utils.custom_param_types import BetterChoice
 from airflow_breeze.utils.docker_command_utils import (
 check_docker_resources,
@@ -179,14 +179,14 @@ def get_changed_files(commit_ref: str | None) -> 
tuple[str, ...]:
 ]
 result = run_command(cmd, check=False, capture_output=True, text=True)
 if result.returncode != 0:
-get_console().print(
+get_stderr_console().print(
 f"[warning] Error when running diff-tree command 
[/]\n{result.stdout}\n{result.stderr}"
 )
 return ()
 changed_files = tuple(result.stdout.splitlines()) if result.stdout else ()
-get_console().print("\n[info]Changed files:[/]\n")
-get_console().print(changed_files)
-get_console().print()
+get_stderr_console().print("\n[info]Changed files:[/]\n")
+get_stderr_console().print(changed_files)
+get_stderr_console().print()
 return changed_files
 
 
@@ -250,7 +250,7 @@ def selective_check(
 pr_labels=tuple(ast.literal_eval(pr_labels)) if pr_labels else (),
 github_event=github_event,
 )
-print(str(sc), file=sys.stderr)
+print(str(sc), file=sys.stdout)
 
 
 @ci_group.command(name="find-newer-dependencies", help="Finds which 
dependencies are being upgraded.")
diff --git a/dev/breeze/src/airflow_breeze/utils/github_actions.py 
b/dev/breeze/src/airflow_breeze/utils/github_actions.py
index 6b8043aa7e..1566bbe81c 100644
--- a/dev/breeze/src/airflow_breeze/utils/github_actions.py
+++ b/dev/breeze/src/airflow_breeze/utils/github_actions.py
@@ -20,11 +20,11 @@ from typing import Any
 
 from rich.markup import escape
 
-from airflow_breeze.utils.console import get_console
+from airflow_breeze.utils.console import get_stderr_console
 
 
 def get_ga_output(name: str, value: Any) -> str:
 output_name = name.replace("_", "-")
 printed_value = str(value).lower() if isinstance(value, bool) else value
-get_console().print(f"[info]{output_name}[/] = 
[green]{escape(str(printed_value))}[/]")
+get_stderr_console().print(f"[info]{output_name}[/] = 
[green]{escape(str(printed_value))}[/]")
 return f"{output_name}={printed_value}"
diff --git a/dev/breeze/src/airflow_breeze/utils/selective_checks.py 
b/dev/breeze/src/airflow_breeze/utils/selective_checks.py
index eeaf651

[GitHub] [airflow] jedcunningham merged pull request #28514: Fix selective checks handling error tracebacks in CI

2022-12-20 Thread GitBox


jedcunningham merged PR #28514:
URL: https://github.com/apache/airflow/pull/28514


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[airflow] branch main updated: Remove deprecated AIPlatformConsoleLinkk from google/provider.yaml (#28449)

2022-12-20 Thread eladkal
This is an automated email from the ASF dual-hosted git repository.

eladkal pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/airflow.git


The following commit(s) were added to refs/heads/main by this push:
 new 7950fb9711 Remove deprecated AIPlatformConsoleLinkk from 
google/provider.yaml (#28449)
7950fb9711 is described below

commit 7950fb9711384f8ac4609fc19f319edb17e296ef
Author: Victor Chiapaikeo 
AuthorDate: Wed Dec 21 00:29:56 2022 -0500

Remove deprecated AIPlatformConsoleLinkk from google/provider.yaml (#28449)
---
 airflow/providers/google/provider.yaml | 1 -
 1 file changed, 1 deletion(-)

diff --git a/airflow/providers/google/provider.yaml 
b/airflow/providers/google/provider.yaml
index 7c3efe2d5a..cca854d1d8 100644
--- a/airflow/providers/google/provider.yaml
+++ b/airflow/providers/google/provider.yaml
@@ -965,7 +965,6 @@ connection-types:
 extra-links:
   - airflow.providers.google.cloud.operators.bigquery.BigQueryConsoleLink
   - 
airflow.providers.google.cloud.operators.bigquery.BigQueryConsoleIndexableLink
-  - airflow.providers.google.cloud.operators.mlengine.AIPlatformConsoleLink
   - airflow.providers.google.cloud.links.dataform.DataformRepositoryLink
   - airflow.providers.google.cloud.links.dataform.DataformWorkspaceLink
   - 
airflow.providers.google.cloud.links.dataform.DataformWorkflowInvocationLink



[GitHub] [airflow] eladkal closed issue #28393: Webserver reports "ImportError: Module "airflow.providers.google.cloud.operators.mlengine" does not define a "AIPlatformConsoleLink" attribute/class"

2022-12-20 Thread GitBox


eladkal closed issue #28393: Webserver reports "ImportError: Module 
"airflow.providers.google.cloud.operators.mlengine" does not define a 
"AIPlatformConsoleLink" attribute/class"
URL: https://github.com/apache/airflow/issues/28393


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] eladkal merged pull request #28449: Remove deprecated AIPlatformConsoleLink from google/provider.yaml

2022-12-20 Thread GitBox


eladkal merged PR #28449:
URL: https://github.com/apache/airflow/pull/28449


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] BobDu commented on pull request #28198: Fix scheduler orm DetachedInstanceError when find zombies in standalone dag processor mode

2022-12-20 Thread GitBox


BobDu commented on PR #28198:
URL: https://github.com/apache/airflow/pull/28198#issuecomment-1360797515

   @potiuk 
   This PR is a bug fix; I hope it doesn't miss the 2.5.1 release. 
   The bug causes the scheduler to never start unless the database is modified 
manually.


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] xinbinhuang commented on a diff in pull request #28512: [Fixed] "Adding Flink on K8s Operator"

2022-12-20 Thread GitBox


xinbinhuang commented on code in PR #28512:
URL: https://github.com/apache/airflow/pull/28512#discussion_r1053911449


##
airflow/providers/apache/flink/operators/flink_kubernetes.py:
##
@@ -0,0 +1,143 @@
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+from __future__ import annotations
+
+from typing import TYPE_CHECKING, Sequence
+
+from kubernetes.client import CoreV1Api
+
+from airflow.compat.functools import cached_property
+from airflow.configuration import conf
+from airflow.models import BaseOperator
+from airflow.providers.cncf.kubernetes.hooks.kubernetes import KubernetesHook
+
+if TYPE_CHECKING:
+from airflow.utils.context import Context
+
+
+class FlinkKubernetesOperator(BaseOperator):
+"""
+Creates flinkDeployment object in kubernetes cluster:
+
+.. seealso::
+For more detail about Flink Deployment Object have a look at the 
reference:
+
https://nightlies.apache.org/flink/flink-kubernetes-operator-docs-main/docs/custom-resource/reference/#flinkdeployment
+
+:param application_file: Defines Kubernetes 'custom_resource_definition' 
of 'flinkDeployment' as either a
+path to a '.yaml' file, '.json' file, YAML string or JSON string.
+:param namespace: kubernetes namespace to put flinkDeployment
+:param kubernetes_conn_id: The :ref:`kubernetes connection id `
+for the Kubernetes cluster.
+:param api_group: kubernetes api group of flinkDeployment
+:param api_version: kubernetes api version of flinkDeployment
+:param in_cluster: run kubernetes client with in_cluster configuration.
+:param cluster_context: context that points to kubernetes cluster.
+Ignored when in_cluster is True. If None, current-context is used.
+:param config_file: The path to the Kubernetes config file. (templated)
+If not specified, default value is ``~/.kube/config``
+"""
+
+template_fields: Sequence[str] = ("application_file", "namespace")
+template_ext: Sequence[str] = (".yaml", ".yml", ".json")
+ui_color = "#f4a460"
+
+def __init__(
+self,
+*,
+application_file: str,
+namespace: str | None = None,
+kubernetes_conn_id: str = "kubernetes_default",
+api_group: str = "flink.apache.org",
+api_version: str = "v1beta1",
+in_cluster: bool | None = None,
+cluster_context: str | None = None,
+config_file: str | None = None,
+plural: str = "flinkdeployments",
+**kwargs,
+) -> None:
+super().__init__(**kwargs)
+self.application_file = application_file
+self.namespace = namespace
+self.kubernetes_conn_id = kubernetes_conn_id
+self.api_group = api_group
+self.api_version = api_version
+self.plural = plural
+self.in_cluster = in_cluster
+self.cluster_context = cluster_context
+self.config_file = config_file
+
+@cached_property
+def hook(self) -> KubernetesHook:
+hook = KubernetesHook(
+conn_id=self.kubernetes_conn_id,
+in_cluster=self.in_cluster,
+config_file=self.config_file,
+cluster_context=self.cluster_context,
+)
+self._patch_deprecated_k8s_settings(hook)
+return hook
+
+def _patch_deprecated_k8s_settings(self, hook: KubernetesHook):
+"""
+Here we read config from core Airflow config [kubernetes] section.
+In a future release we will stop looking at this section and require 
users
+to use Airflow connections to configure KPO.
+
+When we find values there that we need to apply on the hook, we patch 
special
+hook attributes here.
+"""
+# default for enable_tcp_keepalive is True; patch if False
+if conf.getboolean("kubernetes", "enable_tcp_keepalive") is False:
+hook._deprecated_core_disable_tcp_keepalive = True  # noqa
+
+# default verify_ssl is True; patch if False.
+if conf.getboolean("kubernetes", "verify_ssl") is False:
+hook._deprecated_core_disable_verify_ssl = True  # noqa
+
+# default for in_cluster is True; patch if False and no KPO param.
+conf_in_cluster =
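
Not part of the diff itself, but a hypothetical usage sketch based on the constructor shown above may help reviewers; the DAG id, application file path and connection id are illustrative only:

```python
# Hypothetical usage sketch of the operator added in this PR; all names and
# paths below are illustrative, not taken from the PR.
from datetime import datetime

from airflow import DAG
from airflow.providers.apache.flink.operators.flink_kubernetes import FlinkKubernetesOperator

with DAG(dag_id="flink_k8s_example", start_date=datetime(2022, 12, 1), schedule=None) as dag:
    submit_flink_job = FlinkKubernetesOperator(
        task_id="submit_flink_job",
        application_file="flink-deployment.yaml",  # templated: .yaml/.yml/.json per template_ext
        namespace="default",
        kubernetes_conn_id="kubernetes_default",
    )
```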

[GitHub] [airflow] potiuk commented on pull request #28512: [Fixed] "Adding Flink on K8s Operator"

2022-12-20 Thread GitBox


potiuk commented on PR #28512:
URL: https://github.com/apache/airflow/pull/28512#issuecomment-1360653166

   Good one. Unforeseen interaction of stderr redirection to GITHUB_OUTPUT.
   
   Fix in https://github.com/apache/airflow/pull/28514 (explanation in the 
commit description - and you can see the changes in the PR :)


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] potiuk opened a new pull request, #28514: Fix selective checks handling error tracebacks in CI

2022-12-20 Thread GitBox


potiuk opened a new pull request, #28514:
URL: https://github.com/apache/airflow/pull/28514

   Initially, selective check was implemented in such a way that it printed 
diagnostic output on stdout and the GITHUB_OUTPUT-compatible set of outputs on 
stderr, so that stderr could be redirected to GITHUB_OUTPUT in its entirety. This 
turned out to be a bad idea: when an error was raised in selective-checks 
themselves, the traceback was printed to stderr, and redirecting stderr to 
GITHUB_OUTPUT swallowed the traceback.
   
   This change reverses the behaviour:
   
   * diagnostic output is printed to stderr
   * GITHUB_OUTPUT compatible output is printed to stdout
   
   This way, when a traceback happens, it is printed to stderr and is not 
swallowed by the redirection to GITHUB_OUTPUT.
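   
   For reviewers unfamiliar with the mechanism, a minimal sketch of the difference 
(mirroring the ci.yml change in this PR; GITHUB_OUTPUT is the file GitHub Actions 
reads step outputs from):
   
   ```bash
   # Before this PR: machine-readable output went to stderr, so redirecting
   # stderr to GITHUB_OUTPUT also hid any Python traceback from the job log.
   breeze ci selective-check 2>> "${GITHUB_OUTPUT}"
   
   # After this PR: machine-readable output goes to stdout; diagnostics and
   # tracebacks go to stderr and stay visible in the job log.
   breeze ci selective-check >> "${GITHUB_OUTPUT}"
   ```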
   
   
   
   ---
   **^ Add meaningful description above**
   
   Read the **[Pull Request 
Guidelines](https://github.com/apache/airflow/blob/main/CONTRIBUTING.rst#pull-request-guidelines)**
 for more information.
   In case of fundamental code changes, an Airflow Improvement Proposal 
([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvement+Proposals))
 is needed.
   In case of a new dependency, check compliance with the [ASF 3rd Party 
License Policy](https://www.apache.org/legal/resolved.html#category-x).
   In case of backwards incompatible changes please leave a note in a 
newsfragment file, named `{pr_number}.significant.rst` or 
`{issue_number}.significant.rst`, in 
[newsfragments](https://github.com/apache/airflow/tree/main/newsfragments).
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] dstandish commented on pull request #28511: allow a configurable prefix for pod names created by kubernetes executor

2022-12-20 Thread GitBox


dstandish commented on PR #28511:
URL: https://github.com/apache/airflow/pull/28511#issuecomment-1360641959

   Hi @melugoyal 
   
   Thanks for the PR
   
   This seems like something you could use [task 
policy](https://airflow.apache.org/docs/apache-airflow/stable/concepts/cluster-policies.html#task-policies)
 to accomplish.  What do you think?
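   
   For reference, a hedged sketch of what the cluster-policy route could look like 
in `airflow_local_settings.py`; whether an `executor_config`/`pod_override` tweak 
covers the name-prefix use case is exactly the open question here, and the label 
below is illustrative only:
   
   ```python
   # Hypothetical cluster-policy sketch (airflow_local_settings.py); not the
   # implementation from this PR, and the label value is made up.
   from kubernetes.client import models as k8s
   
   from airflow.models.baseoperator import BaseOperator
   
   
   def task_policy(task: BaseOperator) -> None:
       # Called for every task at DAG parse time; attach a pod_override so the
       # KubernetesExecutor builds worker pods from a customised spec.
       task.executor_config = {
           **(task.executor_config or {}),
           "pod_override": k8s.V1Pod(metadata=k8s.V1ObjectMeta(labels={"team": "myteam"})),
       }
   ```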


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] jedcunningham commented on a diff in pull request #27829: Improving the release process

2022-12-20 Thread GitBox


jedcunningham commented on code in PR #27829:
URL: https://github.com/apache/airflow/pull/27829#discussion_r1053602147


##
dev/breeze/src/airflow_breeze/commands/minor_release_command.py:
##
@@ -0,0 +1,181 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+from __future__ import annotations
+
+import os
+
+import click
+
+from airflow_breeze.utils.common_options import option_answer
+from airflow_breeze.utils.confirm import confirm_action
+from airflow_breeze.utils.console import console_print
+from airflow_breeze.utils.path_utils import AIRFLOW_SOURCES_ROOT
+from airflow_breeze.utils.run_utils import run_command
+
+CI = os.environ.get("CI")
+
+
+def create_branch(version_branch):
+if CI:
+console_print("Skipping creating branch in CI")
+return
+if confirm_action(f"Create version branch: {version_branch}?"):
+run_command(["git", "checkout", "main"])
+run_command(["git", "checkout", "-b", f"v{version_branch}-test"])
+console_print(f"Created branch: v{version_branch}-test")
+
+
+def update_default_branch(version_branch):
+if confirm_action("Update default branches?"):
+console_print()
+console_print("You need to update the default branch at:")
+console_print("./dev/breeze/src/airflow_breeze/branch_defaults.py")
+console_print("Change the following:")
+console_print("AIRFLOW_BRANCH = 'main'")
+console_print("DEFAULT_AIRFLOW_CONSTRAINTS_BRANCH = 
'constraints-main'")
+console_print()
+console_print("To:")
+console_print()
+console_print(f"AIRFLOW_BRANCH = 'v{version_branch}-test'")
+console_print(f"DEFAULT_AIRFLOW_CONSTRAINTS_BRANCH = 
'constraints-{version_branch}'")
+
+
+def commit_changes(version_branch):
+if CI:
+console_print("Skipping committing changes in CI")
+return
+if confirm_action("Commit the above changes?"):
+run_command(["git", "add", "-p", "."])
+run_command(["git", "commit", "-m", f"Update default branches for 
{version_branch}"])
+
+
+def create_stable_branch(version_branch):
+if CI:
+console_print("Skipping creating stable branch in CI")
+return
+if confirm_action(f"Create stable branch: v{version_branch}-stable?"):

Review Comment:
   Should we checkout `v{version_branch}-stable` explicitly here, in case we've 
answered no to "Create version branch: {version_branch}?"?



##
dev/breeze/src/airflow_breeze/commands/minor_release_command.py:
##
@@ -0,0 +1,181 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+from __future__ import annotations
+
+import os
+
+import click
+
+from airflow_breeze.utils.common_options import option_answer
+from airflow_breeze.utils.confirm import confirm_action
+from airflow_breeze.utils.console import console_print
+from airflow_breeze.utils.path_utils import AIRFLOW_SOURCES_ROOT
+from airflow_breeze.utils.run_utils import run_command
+
+CI = os.environ.get("CI")
+
+
+def create_branch(version_branch):
+if CI:
+console_print("Skipping creating branch in CI")
+return
+if confirm_action(f"Create version branch: {version_branch}?"):
+run_command(["git", "checkout", "main"])
+run_command(["git", "checkout", "-b", f"v{version_branch}-test"])
+console_print(f"Created branch: v{version_branch}-test")
+
+
+def update_default_branch(version_branch):
+if confirm_action("Update default branches?"):
+

[GitHub] [airflow] xinbinhuang commented on pull request #28512: [Fixed] "Adding Flink on K8s Operator"

2022-12-20 Thread GitBox


xinbinhuang commented on PR #28512:
URL: https://github.com/apache/airflow/pull/28512#issuecomment-1360628421

   @potiuk I think the CI has some issue reporting back the error ([CI 
log](https://github.com/apache/airflow/actions/runs/3745083722/jobs/6359128624))
   
   When I ran it locally, I got this traceback:
   
   ```
   Traceback (most recent call last):
 File "/Users/binh/.local/bin/breeze", line 8, in 
   sys.exit(main())
 File 
"/Users/binh/.local/pipx/venvs/apache-airflow-breeze/lib/python3.9/site-packages/click/core.py",
 line 1130, in __call__
   return self.main(*args, **kwargs)
 File 
"/Users/binh/.local/pipx/venvs/apache-airflow-breeze/lib/python3.9/site-packages/rich_click/rich_group.py",
 line 21, in main
   rv = super().main(*args, standalone_mode=False, **kwargs)
 File 
"/Users/binh/.local/pipx/venvs/apache-airflow-breeze/lib/python3.9/site-packages/click/core.py",
 line 1055, in main
   rv = self.invoke(ctx)
 File 
"/Users/binh/.local/pipx/venvs/apache-airflow-breeze/lib/python3.9/site-packages/click/core.py",
 line 1657, in invoke
   return _process_result(sub_ctx.command.invoke(sub_ctx))
 File 
"/Users/binh/.local/pipx/venvs/apache-airflow-breeze/lib/python3.9/site-packages/click/core.py",
 line 1657, in invoke
   return _process_result(sub_ctx.command.invoke(sub_ctx))
 File 
"/Users/binh/.local/pipx/venvs/apache-airflow-breeze/lib/python3.9/site-packages/click/core.py",
 line 1404, in invoke
   return ctx.invoke(self.callback, **ctx.params)
 File 
"/Users/binh/.local/pipx/venvs/apache-airflow-breeze/lib/python3.9/site-packages/click/core.py",
 line 760, in invoke
   return __callback(*args, **kwargs)
 File 
"/Users/binh/projects/airflow/dev/breeze/src/airflow_breeze/commands/ci_commands.py",
 line 253, in selective_check
   print(str(sc), file=sys.stderr)
 File 
"/Users/binh/projects/airflow/dev/breeze/src/airflow_breeze/utils/selective_checks.py",
 line 275, in __str__
   output.append(get_ga_output(field_name, getattr(self, field_name)))
 File "/Users/binh/.pyenv/versions/3.9.11/lib/python3.9/functools.py", line 
993, in __get__
   val = self.func(instance)
 File 
"/Users/binh/projects/airflow/dev/breeze/src/airflow_breeze/utils/selective_checks.py",
 line 558, in test_types
   current_test_types = set(self._get_test_types_to_run())
 File 
"/Users/binh/projects/airflow/dev/breeze/src/airflow_breeze/utils/selective_checks.py",
 line 539, in _get_test_types_to_run
   affected_providers = 
find_all_providers_affected(changed_files=self._files)
 File 
"/Users/binh/projects/airflow/dev/breeze/src/airflow_breeze/utils/selective_checks.py",
 line 237, in find_all_providers_affected
   add_dependent_providers(all_providers, provider, dependencies)
 File 
"/Users/binh/projects/airflow/dev/breeze/src/airflow_breeze/utils/selective_checks.py",
 line 223, in add_dependent_providers
   for dep_name in dependencies[provider_to_check]["cross-providers-deps"]:
   KeyError: 'apache.flink'
   ```
   
   Any clues on how we can fix the CI so it reports the traceback properly?


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] boring-cyborg[bot] commented on issue #28513: GCSToBigQueryOperator no longer loads DATASTORE_BACKUP formats

2022-12-20 Thread GitBox


boring-cyborg[bot] commented on issue #28513:
URL: https://github.com/apache/airflow/issues/28513#issuecomment-1360586362

   Thanks for opening your first issue here! Be sure to follow the issue 
template!
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] watertree opened a new issue, #28513: GCSToBigQueryOperator no longer loads DATASTORE_BACKUP formats

2022-12-20 Thread GitBox


watertree opened a new issue, #28513:
URL: https://github.com/apache/airflow/issues/28513

   ### Apache Airflow Provider(s)
   
   google
   
   ### Versions of Apache Airflow Providers
   
   ```bash
   airflow@airflow-worker-XX-XX:~$ pip freeze | grep google-cloud 
   google-cloud-aiplatform @ 
file:///usr/local/lib/airflow-pypi-dependencies-2.3.4/python3.8/google_cloud_aiplatform-1.16.1-py2.py3-none-any.whl
   google-cloud-appengine-logging @ 
file:///usr/local/lib/airflow-pypi-dependencies-2.3.4/python3.8/google_cloud_appengine_logging-1.1.3-py2.py3-none-any.whl
   google-cloud-audit-log @ 
file:///usr/local/lib/airflow-pypi-dependencies-2.3.4/python3.8/google_cloud_audit_log-0.2.4-py2.py3-none-any.whl
   google-cloud-automl @ 
file:///usr/local/lib/airflow-pypi-dependencies-2.3.4/python3.8/google_cloud_automl-2.8.0-py2.py3-none-any.whl
   google-cloud-bigquery @ 
file:///usr/local/lib/airflow-pypi-dependencies-2.3.4/python3.8/google_cloud_bigquery-2.34.4-py2.py3-none-any.whl
   google-cloud-bigquery-datatransfer @ 
file:///usr/local/lib/airflow-pypi-dependencies-2.3.4/python3.8/google_cloud_bigquery_datatransfer-3.7.0-py2.py3-none-any.whl
   google-cloud-bigquery-storage @ 
file:///usr/local/lib/airflow-pypi-dependencies-2.3.4/python3.8/google_cloud_bigquery_storage-2.14.1-py2.py3-none-any.whl
   google-cloud-bigtable @ 
file:///usr/local/lib/airflow-pypi-dependencies-2.3.4/python3.8/google_cloud_bigtable-1.7.3-py2.py3-none-any.whl
   google-cloud-build @ 
file:///usr/local/lib/airflow-pypi-dependencies-2.3.4/python3.8/google_cloud_build-3.9.0-py2.py3-none-any.whl
   google-cloud-common @ 
file:///usr/local/lib/airflow-pypi-dependencies-2.3.4/python3.8/google_cloud_common-1.0.3-py2.py3-none-any.whl
   google-cloud-compute @ 
file:///usr/local/lib/airflow-pypi-dependencies-2.3.4/python3.8/google_cloud_compute-0.7.0-py2.py3-none-any.whl
   google-cloud-container @ 
file:///usr/local/lib/airflow-pypi-dependencies-2.3.4/python3.8/google_cloud_container-2.11.1-py2.py3-none-any.whl
   google-cloud-core @ 
file:///usr/local/lib/airflow-pypi-dependencies-2.3.4/python3.8/google_cloud_core-2.3.2-py2.py3-none-any.whl
   google-cloud-datacatalog @ 
file:///usr/local/lib/airflow-pypi-dependencies-2.3.4/python3.8/google_cloud_datacatalog-3.9.0-py2.py3-none-any.whl
   google-cloud-datacatalog-lineage @ 
file:///usr/local/lib/airflow-pypi-dependencies-2.3.4/python3.8/google_cloud_datacatalog_lineage-0.1.6-py3-none-any.whl
   google-cloud-datacatalog-lineage-producer-client @ 
file:///usr/local/lib/airflow-pypi-dependencies-2.3.4/python3.8/google_cloud_datacatalog_lineage_producer_client-0.0.9-py3-none-any.whl
   google-cloud-dataform @ 
file:///usr/local/lib/airflow-pypi-dependencies-2.3.4/python3.8/google_cloud_dataform-0.2.0-py2.py3-none-any.whl
   google-cloud-dataplex @ 
file:///usr/local/lib/airflow-pypi-dependencies-2.3.4/python3.8/google_cloud_dataplex-1.1.0-py2.py3-none-any.whl
   google-cloud-dataproc @ 
file:///usr/local/lib/airflow-pypi-dependencies-2.3.4/python3.8/google_cloud_dataproc-5.0.0-py2.py3-none-any.whl
   google-cloud-dataproc-metastore @ 
file:///usr/local/lib/airflow-pypi-dependencies-2.3.4/python3.8/google_cloud_dataproc_metastore-1.6.0-py2.py3-none-any.whl
   google-cloud-datastore @ 
file:///usr/local/lib/airflow-pypi-dependencies-2.3.4/python3.8/google_cloud_datastore-2.8.0-py2.py3-none-any.whl
   google-cloud-dlp @ 
file:///usr/local/lib/airflow-pypi-dependencies-2.3.4/python3.8/google_cloud_dlp-1.0.2-py2.py3-none-any.whl
   google-cloud-filestore @ 
file:///usr/local/lib/airflow-pypi-dependencies-2.3.4/python3.8/google_cloud_filestore-1.2.0-py2.py3-none-any.whl
   google-cloud-firestore @ 
file:///usr/local/lib/airflow-pypi-dependencies-2.3.4/python3.8/google_cloud_firestore-2.5.0-py2.py3-none-any.whl
   google-cloud-kms @ 
file:///usr/local/lib/airflow-pypi-dependencies-2.3.4/python3.8/google_cloud_kms-2.12.0-py2.py3-none-any.whl
   google-cloud-language @ 
file:///usr/local/lib/airflow-pypi-dependencies-2.3.4/python3.8/google_cloud_language-1.3.2-py2.py3-none-any.whl
   google-cloud-logging @ 
file:///usr/local/lib/airflow-pypi-dependencies-2.3.4/python3.8/google_cloud_logging-3.2.1-py2.py3-none-any.whl
   google-cloud-memcache @ 
file:///usr/local/lib/airflow-pypi-dependencies-2.3.4/python3.8/google_cloud_memcache-1.4.1-py2.py3-none-any.whl
   google-cloud-monitoring @ 
file:///usr/local/lib/airflow-pypi-dependencies-2.3.4/python3.8/google_cloud_monitoring-2.11.0-py2.py3-none-any.whl
   google-cloud-orchestration-airflow @ 
file:///usr/local/lib/airflow-pypi-dependencies-2.3.4/python3.8/google_cloud_orchestration_airflow-1.4.1-py2.py3-none-any.whl
   google-cloud-os-login @ 
file:///usr/local/lib/airflow-pypi-dependencies-2.3.4/python3.8/google_cloud_os_login-2.7.1-py2.py3-none-any.whl
   google-cloud-pubsub @ 
file:///usr/local/lib/airflow-pypi-dependencies-2.3.4/python3.8/google_cloud_pubsub-2.13.4-py2.py3-none-any.whl
   google-cloud-pubsublite @ 
file:///usr/local/lib/a

[GitHub] [airflow] xinbinhuang opened a new pull request, #28512: Port https://github.com/apache/airflow/pull/26664

2022-12-20 Thread GitBox


xinbinhuang opened a new pull request, #28512:
URL: https://github.com/apache/airflow/pull/28512

   
   
   ---
   **^ Add meaningful description above**
   
   Read the **[Pull Request 
Guidelines](https://github.com/apache/airflow/blob/main/CONTRIBUTING.rst#pull-request-guidelines)**
 for more information.
   In case of fundamental code changes, an Airflow Improvement Proposal 
([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvement+Proposals))
 is needed.
   In case of a new dependency, check compliance with the [ASF 3rd Party 
License Policy](https://www.apache.org/legal/resolved.html#category-x).
   In case of backwards incompatible changes please leave a note in a 
newsfragment file, named `{pr_number}.significant.rst` or 
`{issue_number}.significant.rst`, in 
[newsfragments](https://github.com/apache/airflow/tree/main/newsfragments).
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] github-actions[bot] commented on pull request #26162: DockerHook: obtain credentials and login to Amazon ECR

2022-12-20 Thread GitBox


github-actions[bot] commented on PR #26162:
URL: https://github.com/apache/airflow/pull/26162#issuecomment-1360536339

   This pull request has been automatically marked as stale because it has not 
had recent activity. It will be closed in 5 days if no further activity occurs. 
Thank you for your contributions.


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] potiuk commented on issue #28483: Issues with Custom Menu Items on Smaller Windows

2022-12-20 Thread GitBox


potiuk commented on issue #28483:
URL: https://github.com/apache/airflow/issues/28483#issuecomment-1360495019

   The --dev-mode of breeze should start the webserver in debug mode (with the -d 
option); the webserver should print out that it is running in dev mode, and you 
should see assets recompiling when they are changed.
   
   But you can do it manually too - you can always Ctrl-C in the webserver 
terminal and start it with -d. Generally speaking, breeze makes it easier to 
automate some things, but it also leaves those who know what they are doing free 
to do more.
   
   If you clearly see that the webserver runs in debug mode and your assets are 
being recompiled, then there could be another reason - my best guess is that the 
style comes from FAB, not Airflow's CSS. But that's purely a guess.
   
   It's also best to just check it - if you made a change, you should be able to 
track whether you see the style in the browser (this is what I remember doing 
when I debugged similar things). Inspect in Chrome was always a good friend for 
understanding it :). I usually made some trivial changes just to see whether the 
"pipeline" is working - tracking some very straightforward elements and seeing 
the effect of those changes - and started debugging when I didn't see them 
(usually ending up finding out who overwrote my change and why). 
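   
   A hedged sketch of both routes (flag names as I remember them from breeze and 
the webserver CLI at the time):
   
   ```bash
   # Option 1: let breeze start the webserver with asset recompilation enabled.
   breeze start-airflow --dev-mode
   
   # Option 2: Ctrl-C the running webserver in its terminal and restart it in
   # debug mode manually.
   airflow webserver -d
   ```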
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] melugoyal commented on a diff in pull request #28511: allow a configurable prefix for pod names created by kubernetes executor

2022-12-20 Thread GitBox


melugoyal commented on code in PR #28511:
URL: https://github.com/apache/airflow/pull/28511#discussion_r1053849270


##
airflow/config_templates/config.yml:
##
@@ -2544,6 +2544,13 @@
   type: boolean
   example: ~
   default: "True"
+- name: worker_pods_name_prefix
+  description: |
+Prefix to be added to the beginning of worker pods' names
+  version_added: 2.6.0

Review Comment:
   I'm unsure if this is correct. @jedcunningham @dstandish 



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] melugoyal commented on a diff in pull request #28511: allow a configurable prefix for pod names created by kubernetes executor

2022-12-20 Thread GitBox


melugoyal commented on code in PR #28511:
URL: https://github.com/apache/airflow/pull/28511#discussion_r1053848453


##
airflow/providers/cncf/kubernetes/operators/kubernetes_pod.py:
##
@@ -91,6 +91,7 @@ def _add_pod_suffix(*, pod_name, rand_len=8, max_len=253):
 def _create_pod_id(
 dag_id: str | None = None,
 task_id: str | None = None,
+prefix: str | None = None,

Review Comment:
   had to copy the changes to this file since we run the same set of unit tests 
against both modules: 
https://github.com/apache/airflow/blob/b213f4fd2627bb2a2a4c96fe2845471db430aa5d/tests/kubernetes/test_kubernetes_helper_functions.py#L35



##
airflow/config_templates/config.yml:
##
@@ -2544,6 +2544,13 @@
   type: boolean
   example: ~
   default: "True"
+- name: worker_pods_name_prefix
+  description: |
+Prefix to be added to the beginning of worker pods' names
+  version_added: 2.6.0

Review Comment:
   I'm unsure if this is correct. cc @jedcunningham @dstandish 
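   
   For completeness, a hypothetical sketch of setting the proposed option via 
environment variables, assuming it ends up in the `[kubernetes_executor]` section 
as in this PR (the option does not exist in any released Airflow version):
   
   ```bash
   # Hypothetical: follows Airflow's AIRFLOW__{SECTION}__{KEY} convention for
   # the option proposed in this PR; the prefix value is made up.
   export AIRFLOW__KUBERNETES_EXECUTOR__WORKER_PODS_NAME_PREFIX="myteam-"
   ```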



##
airflow/kubernetes/kube_config.py:
##
@@ -26,7 +26,6 @@ class KubeConfig:
 
 core_section = "core"
 kubernetes_section = "kubernetes_executor"
-logging_section = "logging"

Review Comment:
   this was unused



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] boring-cyborg[bot] commented on pull request #28511: allow a configurable prefix to pod names created for kubernetes executor

2022-12-20 Thread GitBox


boring-cyborg[bot] commented on PR #28511:
URL: https://github.com/apache/airflow/pull/28511#issuecomment-1360476012

   Congratulations on your first Pull Request and welcome to the Apache Airflow 
community! If you have any issues or are unsure about anything, please check 
our Contribution Guide 
(https://github.com/apache/airflow/blob/main/CONTRIBUTING.rst)
   Here are some useful points:
   - Pay attention to the quality of your code (flake8, mypy and type 
annotations). Our [pre-commits]( 
https://github.com/apache/airflow/blob/main/STATIC_CODE_CHECKS.rst#prerequisites-for-pre-commit-hooks)
 will help you with that.
   - In case of a new feature add useful documentation (in docstrings or in 
`docs/` directory). Adding a new operator? Check this short 
[guide](https://github.com/apache/airflow/blob/main/docs/apache-airflow/howto/custom-operator.rst)
 Consider adding an example DAG that shows how users should use it.
   - Consider using [Breeze 
environment](https://github.com/apache/airflow/blob/main/BREEZE.rst) for 
testing locally, it's a heavy docker but it ships with a working Airflow and a 
lot of integrations.
   - Be patient and persistent. It might take some time to get a review or get 
the final approval from Committers.
   - Please follow [ASF Code of 
Conduct](https://www.apache.org/foundation/policies/conduct) for all 
communication including (but not limited to) comments on Pull Requests, Mailing 
list and Slack.
   - Be sure to read the [Airflow Coding style]( 
https://github.com/apache/airflow/blob/main/CONTRIBUTING.rst#coding-style-and-best-practices).
   Apache Airflow is a community-driven project and together we are making it 
better 🚀.
   In case of doubts contact the developers at:
   Mailing List: d...@airflow.apache.org
   Slack: https://s.apache.org/airflow-slack
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] melugoyal opened a new pull request, #28511: allow a configurable prefix to pod names created for kubernetes executor

2022-12-20 Thread GitBox


melugoyal opened a new pull request, #28511:
URL: https://github.com/apache/airflow/pull/28511

   
   
   Allow worker pods created by Kubernetes Executor to be prefixed by a 
configurable string. The prefix defaults to the empty string.


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] Taragolis commented on a diff in pull request #28509: Add cncf.kubernetes extra for Amazon and Google providers

2022-12-20 Thread GitBox


Taragolis commented on code in PR #28509:
URL: https://github.com/apache/airflow/pull/28509#discussion_r1053835725


##
airflow/providers/amazon/aws/operators/eks.py:
##
@@ -26,7 +26,13 @@
 from airflow import AirflowException
 from airflow.models import BaseOperator
 from airflow.providers.amazon.aws.hooks.eks import EksHook
-from airflow.providers.cncf.kubernetes.operators.kubernetes_pod import 
KubernetesPodOperator
+
+try:
+from airflow.providers.cncf.kubernetes.operators.kubernetes_pod import 
KubernetesPodOperator
+except ImportError as e:
+from airflow.exceptions import AirflowOptionalProviderFeatureException
+
+raise AirflowOptionalProviderFeatureException(e)

Review Comment:
   I am just thinking out loud; I really do not know whether the EKS operators 
provide something more than just being used with `EksPodOperator`.



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] potiuk commented on pull request #28505: Make pandas dependency optional for Amazon Provider

2022-12-20 Thread GitBox


potiuk commented on PR #28505:
URL: https://github.com/apache/airflow/pull/28505#issuecomment-1360464746

   Moved the description up to breaking changes.


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] potiuk commented on pull request #28505: Make pandas dependency optional for Amazon Provider

2022-12-20 Thread GitBox


potiuk commented on PR #28505:
URL: https://github.com/apache/airflow/pull/28505#issuecomment-1360462705

   > Yes, on upgrade everything looks fine because, as you said, the pandas lib is 
already there, but if users are using install scripts that install airflow and 
providers separately, then once workers restart and reinstall from scratch it 
will cause broken dags because pandas is no longer present. This requires a 
manual change to add the pandas dependency, thus it is not backward compatible.
   
   Not that it matters in this case (we are releasing a breaking version anyway), 
but let me venture into an academic discussion (just for the sake of discussion), 
and more as "food for thought". I do not want to argue vehemently on this; I am 
happy to move it to breaking now; I would just like to present a different 
view on it.
   
   It really depends on what WE consider a breaking change. It's never 0/1 
(breaking/not breaking). As I mentioned multiple times, and as [Hyrum's 
law](https://www.hyrumslaw.com/) nicely states: "With a sufficient number of 
users of an API, it does not matter what you promise in the contract: all 
observable behaviors of your system will be depended on by somebody".
   
   Every single change we make breaks someone's workflow; we should accept this. 
Whether we classify something as breaking or not depends purely on our assessment 
of how disruptive it is. We earlier agreed that a dependency change on its own is 
not breaking (and we have released a number of providers with dependency changes 
without increasing the major version).
   
   What you described is of course a possibility, but more likely than not pandas 
will already be installed as a dependency anyway, for multiple reasons. And even 
if not, it is just a matter of adding "[pandas]" in their scripts - no DAG 
modifications needed, no changes in the code, just an "environment" change. We 
are currently discussing what the "Airflow Public Interface" is - 
https://github.com/apache/airflow/pull/28300 - and I have not seen anyone asking 
to add the "list of dependencies to install automatically" to the public API. 
Once we merge that change, this kind of change is not going to be considered 
"breaking" on the list.
   
   So - a super easy way to recover, and the error message will be very clear. 
   
   Also, there are other things we should consider - do you think the users who 
install airflow automatically in this way have no test harness to verify that 
their new install is working? There might be many more problems than missing 
pandas if they rebuild it from scratch every time, so in your scenario they 
surely have a test harness in place. They have a staging system to test it.
   
   And even if they don't have a test harness - do we realistically think they 
will pin their installation and not install the 7.0.0 version of the provider, 
when it gets released, in the same automated way? Do you think they have a 
separate test harness against which they will test such a breaking release of 
every single provider they use? Will it make a difference for them whether that 
description is put under "breaking" or "feature" changes (because that will be 
the only difference in this case)? I am not sure; I think this is quite 
unrealistic.
   
   And of course we can be wrong in our assessment. It happens. We can always 
produce a bugfix if many of our users start complaining. We should of course try 
to avoid this, but ultimately we will break some people's workflows - this is 
unavoidable, and there is no way to protect against it 100%.
   
   And finally - yes, for me this is a borderline case; this is really a case 
where things aren't clearly 0/1, and we can either err on the side of caution or 
be a bit bolder.
   
   This is all purely an academic discussion of course - I have zero problem with 
putting it into the breaking section; it does not really matter in this case. I 
just wanted to explain that not everything that potentially changes some 
particular workflow is necessarily breaking. That is a very simplistic view IMHO; 
we should assess not only whether the change "actually" breaks something, but 
also how disruptive it really is and whether it is simple for the user to recover 
from at an early stage, because that is what determines how harmful it really is.
   
   
![image](https://user-images.githubusercontent.com/595491/208781135-c6a9538a-f5e7-43ae-a77d-f9a32f3e8a2e.png)
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] Taragolis commented on a diff in pull request #28509: Add cncf.kubernetes extra for Amazon and Google providers

2022-12-20 Thread GitBox


Taragolis commented on code in PR #28509:
URL: https://github.com/apache/airflow/pull/28509#discussion_r1053835725


##
airflow/providers/amazon/aws/operators/eks.py:
##
@@ -26,7 +26,13 @@
 from airflow import AirflowException
 from airflow.models import BaseOperator
 from airflow.providers.amazon.aws.hooks.eks import EksHook
-from airflow.providers.cncf.kubernetes.operators.kubernetes_pod import 
KubernetesPodOperator
+
+try:
+from airflow.providers.cncf.kubernetes.operators.kubernetes_pod import 
KubernetesPodOperator
+except ImportError as e:
+from airflow.exceptions import AirflowOptionalProviderFeatureException
+
+raise AirflowOptionalProviderFeatureException(e)

Review Comment:
   I am just thinking out loud; I really do not know whether the EKS operators 
provide something more than just being used with `EksPodOperator`.



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] Taragolis commented on a diff in pull request #28509: Add cncf.kubernetes extra for Amazon and Google providers

2022-12-20 Thread GitBox


Taragolis commented on code in PR #28509:
URL: https://github.com/apache/airflow/pull/28509#discussion_r1053835725


##
airflow/providers/amazon/aws/operators/eks.py:
##
@@ -26,7 +26,13 @@
 from airflow import AirflowException
 from airflow.models import BaseOperator
 from airflow.providers.amazon.aws.hooks.eks import EksHook
-from airflow.providers.cncf.kubernetes.operators.kubernetes_pod import 
KubernetesPodOperator
+
+try:
+from airflow.providers.cncf.kubernetes.operators.kubernetes_pod import 
KubernetesPodOperator
+except ImportError as e:
+from airflow.exceptions import AirflowOptionalProviderFeatureException
+
+raise AirflowOptionalProviderFeatureException(e)

Review Comment:
   I am just thinking out loud; I really do not know whether the EKS operators 
provide something more than just being used with `EksPodOperator`.



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] Taragolis commented on a diff in pull request #28509: Add cncf.kubernetes extra for Amazon and Google providers

2022-12-20 Thread GitBox


Taragolis commented on code in PR #28509:
URL: https://github.com/apache/airflow/pull/28509#discussion_r1053829747


##
airflow/providers/amazon/aws/operators/eks.py:
##
@@ -26,7 +26,13 @@
 from airflow import AirflowException
 from airflow.models import BaseOperator
 from airflow.providers.amazon.aws.hooks.eks import EksHook
-from airflow.providers.cncf.kubernetes.operators.kubernetes_pod import 
KubernetesPodOperator
+
+try:
+from airflow.providers.cncf.kubernetes.operators.kubernetes_pod import 
KubernetesPodOperator
+except ImportError as e:
+from airflow.exceptions import AirflowOptionalProviderFeatureException
+
+raise AirflowOptionalProviderFeatureException(e)

Review Comment:
   I'm not sure whether it is available in the provider, but EKS can also be used 
to run EMR jobs, as well as serve as a compute environment for running AWS Batch 
jobs (we definitely can't do that right now).
   



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] eladkal commented on a diff in pull request #28509: Add cncf.kubernetes extra for Amazon and Google providers

2022-12-20 Thread GitBox


eladkal commented on code in PR #28509:
URL: https://github.com/apache/airflow/pull/28509#discussion_r1053827477


##
airflow/providers/amazon/aws/operators/eks.py:
##
@@ -26,7 +26,13 @@
 from airflow import AirflowException
 from airflow.models import BaseOperator
 from airflow.providers.amazon.aws.hooks.eks import EksHook
-from airflow.providers.cncf.kubernetes.operators.kubernetes_pod import 
KubernetesPodOperator
+
+try:
+from airflow.providers.cncf.kubernetes.operators.kubernetes_pod import 
KubernetesPodOperator
+except ImportError as e:
+from airflow.exceptions import AirflowOptionalProviderFeatureException
+
+raise AirflowOptionalProviderFeatureException(e)

Review Comment:
   You can't really use the eks module without cncf, so most users probably have 
cncf already installed



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] Taragolis commented on a diff in pull request #28509: Add cncf.kubernetes extra for Amazon and Google providers

2022-12-20 Thread GitBox


Taragolis commented on code in PR #28509:
URL: https://github.com/apache/airflow/pull/28509#discussion_r1053824118


##
airflow/providers/amazon/aws/operators/eks.py:
##
@@ -26,7 +26,13 @@
 from airflow import AirflowException
 from airflow.models import BaseOperator
 from airflow.providers.amazon.aws.hooks.eks import EksHook
-from airflow.providers.cncf.kubernetes.operators.kubernetes_pod import 
KubernetesPodOperator
+
+try:
+from airflow.providers.cncf.kubernetes.operators.kubernetes_pod import 
KubernetesPodOperator
+except ImportError as e:
+from airflow.exceptions import AirflowOptionalProviderFeatureException
+
+raise AirflowOptionalProviderFeatureException(e)

Review Comment:
   Hmm... It appears that if the user did not install the `cncf.kubernetes` extra, 
then all other operators in this module become unavailable.
   
   All other operators, as far as I know, use the AWS API. So would it be better to 
move this operator into a separate module as a follow-up?



##
airflow/providers/amazon/provider.yaml:
##
@@ -576,3 +576,8 @@ secrets-backends:
 logging:
   - airflow.providers.amazon.aws.log.s3_task_handler.S3TaskHandler
   - 
airflow.providers.amazon.aws.log.cloudwatch_task_handler.CloudwatchTaskHandler
+
+additional-extras:
+  - name: cncf.kubernetes
+dependencies:
+  - apache-airflow-providers-cncf-kubernetes>=5.0.0

Review Comment:
   I thought this was automatically added as an extra dependency; at least it is 
listed in https://pypi.org/pypi/apache-airflow-providers-amazon/json. Or maybe we 
need a specific version of the provider?



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[airflow] branch main updated (39abd5e065 -> b213f4fd26)

2022-12-20 Thread eladkal
This is an automated email from the ASF dual-hosted git repository.

eladkal pushed a change to branch main
in repository https://gitbox.apache.org/repos/asf/airflow.git


from 39abd5e065 Add AWS SageMaker operator to register a model's version 
(#28024)
 add b213f4fd26 Fix minor typo (#28508)

No new revisions were added by this update.

Summary of changes:
 BREEZE.rst | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)



[GitHub] [airflow] eladkal merged pull request #28508: Fix minor typo in doc

2022-12-20 Thread GitBox


eladkal merged PR #28508:
URL: https://github.com/apache/airflow/pull/28508


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] michaelmicheal commented on issue #28483: Issues with Custom Menu Items on Smaller Windows

2022-12-20 Thread GitBox


michaelmicheal commented on issue #28483:
URL: https://github.com/apache/airflow/issues/28483#issuecomment-1360404531

   > You may need to run the webserver in dev mode `airflow webserver -d`
   
   That didn't seem to work. Is dev-mode just a [breeze command 
option](https://github.com/apache/airflow/blob/7d13b65f53bcd83c5766b21873d8dfb33e593a48/dev/breeze/src/airflow_breeze/commands/developer_commands.py#L223)?
 -d seems to be debug.
   
   ```bash
   Start a Airflow webserver instance
   
   optional arguments:
     -h, --help            show this help message and exit
     -A ACCESS_LOGFILE, --access-logfile ACCESS_LOGFILE
                           The logfile to store the webserver access log. Use '-' to print to stderr
     -L ACCESS_LOGFORMAT, --access-logformat ACCESS_LOGFORMAT
                           The access log format for gunicorn logs
     -D, --daemon          Daemonize instead of running in the foreground
     -d, --debug           Use the server that ships with Flask in debug mode
 ```


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] eladkal opened a new issue, #28510: Add pre-commit/test to verify extra links refer to existed classes

2022-12-20 Thread GitBox


eladkal opened a new issue, #28510:
URL: https://github.com/apache/airflow/issues/28510

   ### Body
   
   We had an issue where an extra link class (`AIPlatformConsoleLink`) was removed 
in a [PR](https://github.com/apache/airflow/pull/26836) without removing the 
class from the `provider.yaml` extra links. This resulted in a web server 
exception, as shown in https://github.com/apache/airflow/pull/28449
   
   
   **The Task:**
   Add validation that classes of extra-links in provider.yaml are importable
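   
   A minimal sketch of what such a check could look like (assuming each `provider.yaml` exposes a top-level `extra-links` list of dotted class paths; the function name is illustrative):
   
   ```python
   from __future__ import annotations
   
   import importlib
   
   import yaml
   
   
   def find_broken_extra_links(provider_yaml_path: str) -> list[str]:
       """Return the extra-link class paths from a provider.yaml that cannot be imported."""
       with open(provider_yaml_path) as f:
           provider = yaml.safe_load(f)
       broken = []
       for class_path in provider.get("extra-links", []):
           module_name, _, class_name = class_path.rpartition(".")
           try:
               module = importlib.import_module(module_name)
               getattr(module, class_name)
           except (ImportError, AttributeError):
               broken.append(class_path)
       return broken
   ```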
   
   ### Committer
   
   - [X] I acknowledge that I am a maintainer/committer of the Apache Airflow 
project.


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] chriscmorgan commented on issue #28452: TaskInstances do not succeed when using enable_logging=True option in DockerSwarmOperator

2022-12-20 Thread GitBox


chriscmorgan commented on issue #28452:
URL: https://github.com/apache/airflow/issues/28452#issuecomment-1360401766

   @maxim317, yes, disabling logging is a workaround. Ideally, though, we would 
like to have the logs in Airflow.


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[airflow] branch main updated (820f5a9374 -> 39abd5e065)

2022-12-20 Thread eladkal
This is an automated email from the ASF dual-hosted git repository.

eladkal pushed a change to branch main
in repository https://gitbox.apache.org/repos/asf/airflow.git


from 820f5a9374 Use object instead of array in config.yml for config 
template (#28417)
 add 39abd5e065 Add AWS SageMaker operator to register a model's version 
(#28024)

No new revisions were added by this update.

Summary of changes:
 airflow/providers/amazon/aws/hooks/sagemaker.py| 33 
 .../providers/amazon/aws/operators/sagemaker.py| 90 
 .../amazon/aws/utils/{rds.py => sagemaker.py}  | 10 ++-
 .../operators/sagemaker.rst| 18 
 tests/providers/amazon/aws/hooks/test_sagemaker.py | 31 ++-
 .../amazon/aws/operators/test_sagemaker_model.py   | 97 +++---
 .../providers/amazon/aws/example_sagemaker.py  | 29 ++-
 7 files changed, 291 insertions(+), 17 deletions(-)
 copy airflow/providers/amazon/aws/utils/{rds.py => sagemaker.py} (81%)



[GitHub] [airflow] eladkal merged pull request #28024: Add AWS SageMaker operator to register a model's version

2022-12-20 Thread GitBox


eladkal merged PR #28024:
URL: https://github.com/apache/airflow/pull/28024


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] eladkal opened a new pull request, #28509: Add cncf.kubernetes extra for Amazon and Google providers

2022-12-20 Thread GitBox


eladkal opened a new pull request, #28509:
URL: https://github.com/apache/airflow/pull/28509

   `EksPodOperator` and `GKEStartPodOperator` require the cncf.kubernetes provider, 
but it is not listed as a dependency
   
   
   
   ---
   **^ Add meaningful description above**
   
   Read the **[Pull Request 
Guidelines](https://github.com/apache/airflow/blob/main/CONTRIBUTING.rst#pull-request-guidelines)**
 for more information.
   In case of fundamental code changes, an Airflow Improvement Proposal 
([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvement+Proposals))
 is needed.
   In case of a new dependency, check compliance with the [ASF 3rd Party 
License Policy](https://www.apache.org/legal/resolved.html#category-x).
   In case of backwards incompatible changes please leave a note in a 
newsfragment file, named `{pr_number}.significant.rst` or 
`{issue_number}.significant.rst`, in 
[newsfragments](https://github.com/apache/airflow/tree/main/newsfragments).
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[airflow] branch amazon created (now 820f5a9374)

2022-12-20 Thread taragolis
This is an automated email from the ASF dual-hosted git repository.

taragolis pushed a change to branch amazon
in repository https://gitbox.apache.org/repos/asf/airflow.git


  at 820f5a9374 Use object instead of array in config.yml for config 
template (#28417)

No new revisions were added by this update.



[GitHub] [airflow] eladkal commented on pull request #28505: Make pandas dependency optional for Amazon Provider

2022-12-20 Thread GitBox


eladkal commented on PR #28505:
URL: https://github.com/apache/airflow/pull/28505#issuecomment-1360367756

   > Only people who install provider manually will have to install "pandas" 
extra either for Airflow core package or for the provider.
   
   This means it is a breaking change.
   Yes, on upgrade everything looks fine because, as you said, the pandas lib is 
already there, but if users are using install scripts that install Airflow and 
providers separately, then once workers restart and reinstall from scratch it 
will cause broken DAGs.
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] Taragolis commented on a diff in pull request #28485: [WIP] Add Azure functions operator and hook

2022-12-20 Thread GitBox


Taragolis commented on code in PR #28485:
URL: https://github.com/apache/airflow/pull/28485#discussion_r1053762498


##
airflow/providers/microsoft/azure/hooks/azure_functions.py:
##
@@ -0,0 +1,177 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+from __future__ import annotations
+
+from typing import Any
+
+import requests
+from azure.identity import ClientSecretCredential
+from requests import Response
+
+from airflow.exceptions import AirflowException
+from airflow.hooks.base import BaseHook
+
+
+class AzureFunctionsHook(BaseHook):

Review Comment:
   Maybe my question is dumb, but I am wondering whether it is possible to reuse AzureBaseHook?
   
   
https://github.com/apache/airflow/blob/2f552233f5c99b206c8f4c2088fcc0c05e7e26dc/airflow/providers/microsoft/azure/hooks/base_azure.py#L27
   
   And is it better to use some kind of SDK rather than building HTTP requests 
manually? For example 
[`azure-functions`](https://github.com/Azure/azure-functions-python-library)



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[airflow] branch main updated (9ac76ec526 -> 820f5a9374)

2022-12-20 Thread dstandish
This is an automated email from the ASF dual-hosted git repository.

dstandish pushed a change to branch main
in repository https://gitbox.apache.org/repos/asf/airflow.git


from 9ac76ec526 Remove outdated Optional Provider Feature outdated 
documentation (#28506)
 add 820f5a9374 Use object instead of array in config.yml for config 
template (#28417)

No new revisions were added by this update.

Summary of changes:
 airflow/config_templates/config.yml| 668 ++---
 airflow/config_templates/config.yml.schema.json| 130 ++--
 airflow/configuration.py   |   2 +-
 .../providers/google/config_templates/config.yml   |   4 +-
 .../airflow_breeze/commands/developer_commands.py  |   2 +-
 docs/apache-airflow/configurations-ref.rst |  30 +-
 docs/conf.py   |  10 +-
 scripts/ci/pre_commit/pre_commit_yaml_to_cfg.py|  23 +-
 8 files changed, 447 insertions(+), 422 deletions(-)



[GitHub] [airflow] dstandish merged pull request #28417: Use object instead of array in config.yml for config template

2022-12-20 Thread GitBox


dstandish merged PR #28417:
URL: https://github.com/apache/airflow/pull/28417


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] potiuk commented on a diff in pull request #28505: Make pandas dependency optional for Amazon Provider

2022-12-20 Thread GitBox


potiuk commented on code in PR #28505:
URL: https://github.com/apache/airflow/pull/28505#discussion_r1053742976


##
airflow/providers/amazon/CHANGELOG.rst:
##
@@ -24,6 +24,18 @@
 Changelog
 -
 
+6.3.0
+-

Review Comment:
   We are .. just fixed it.



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] Taragolis commented on a diff in pull request #28505: Make pandas dependency optional for Amazon Provider

2022-12-20 Thread GitBox


Taragolis commented on code in PR #28505:
URL: https://github.com/apache/airflow/pull/28505#discussion_r1053740864


##
airflow/providers/amazon/CHANGELOG.rst:
##
@@ -24,6 +24,18 @@
 Changelog
 -
 
+6.3.0
+-

Review Comment:
   I just realised, after seeing this header, that it has not been released yet



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] Taragolis commented on a diff in pull request #28505: Make pandas dependency optional for Amazon Provider

2022-12-20 Thread GitBox


Taragolis commented on code in PR #28505:
URL: https://github.com/apache/airflow/pull/28505#discussion_r1053739925


##
airflow/providers/amazon/CHANGELOG.rst:
##
@@ -24,6 +24,18 @@
 Changelog
 -
 
+6.3.0
+-

Review Comment:
   Actually I thought we had already released 7.0.0 😆 



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] potiuk commented on a diff in pull request #28505: Make pandas dependency optional for Amazon Provider

2022-12-20 Thread GitBox


potiuk commented on code in PR #28505:
URL: https://github.com/apache/airflow/pull/28505#discussion_r1053738869


##
airflow/providers/amazon/CHANGELOG.rst:
##
@@ -24,6 +24,18 @@
 Changelog
 -
 
+6.3.0
+-

Review Comment:
   Fixed. 
   
   @dwreeves - could you please help with adding a paragraph or two of 
description from https://github.com/apache/airflow/pull/27920? The context 
description there was pretty long, but we need to distill it into a single 
paragraph or so of text (maybe with one code example?)



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] Taragolis commented on a diff in pull request #28503: Initial support for SAS community provider

2022-12-20 Thread GitBox


Taragolis commented on code in PR #28503:
URL: https://github.com/apache/airflow/pull/28503#discussion_r1053732522


##
airflow/providers/sas/_utils/logon.py:
##
@@ -0,0 +1,95 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+from __future__ import annotations
+
+import base64
+import urllib.parse
+
+import requests
+import urllib3
+from urllib3.exceptions import InsecureRequestWarning
+
+from airflow.hooks.base import BaseHook
+
+
+def create_session_for_connection(connection_name: str):
+print(f"Creating session for connection named {connection_name}")

Review Comment:
   We use Hooks for this purpose: 
https://airflow.apache.org/docs/apache-airflow/stable/concepts/connections.html#hooks, 
which also provides a nice way to create Connections in the UI: 
https://airflow.apache.org/docs/apache-airflow/stable/howto/connection.html#custom-connection-types
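   
   For illustration, a minimal hook built on `BaseHook` could look roughly like this (the class name, connection id and the `verify=False` choice are hypothetical, not taken from this PR):
   
   ```python
   from __future__ import annotations
   
   import requests
   
   from airflow.hooks.base import BaseHook
   
   
   class SasHook(BaseHook):  # hypothetical name, for illustration only
       """Builds an authenticated requests session from an Airflow connection."""
   
       conn_name_attr = "sas_conn_id"
       default_conn_name = "sas_default"
       conn_type = "http"
       hook_name = "SAS Viya"
   
       def __init__(self, sas_conn_id: str = default_conn_name) -> None:
           super().__init__()
           self.sas_conn_id = sas_conn_id
   
       def get_conn(self) -> requests.Session:
           conn = self.get_connection(self.sas_conn_id)  # connection defined via UI/env/secrets backend
           session = requests.Session()
           session.verify = False  # mirrors the insecure setting in the current helper; should be configurable
           # token retrieval against conn.host using conn.login / conn.password would go here
           return session
   ```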



##
airflow/providers/sas/_utils/logon.py:
##
@@ -0,0 +1,95 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+from __future__ import annotations
+
+import base64
+import urllib.parse
+
+import requests
+import urllib3
+from urllib3.exceptions import InsecureRequestWarning
+
+from airflow.hooks.base import BaseHook
+
+
+def create_session_for_connection(connection_name: str):
+print(f"Creating session for connection named {connection_name}")
+
+# disable insecure HTTP requests warnings
+urllib3.disable_warnings(InsecureRequestWarning)

Review Comment:
   Hmm... does this globally disable all `InsecureRequestWarning`s? 
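   
   If the intent is only to silence the warning for the hook's own requests, a scoped suppression might be preferable to the process-wide call (a sketch, not part of this PR):
   
   ```python
   import warnings
   
   import requests
   from urllib3.exceptions import InsecureRequestWarning
   
   
   def insecure_get(url: str) -> requests.Response:
       # ignore InsecureRequestWarning only for this call instead of globally
       with warnings.catch_warnings():
           warnings.simplefilter("ignore", InsecureRequestWarning)
           return requests.get(url, verify=False)
   ```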



##
airflow/providers/sas/operators/sas_jobexecution.py:
##
@@ -0,0 +1,76 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+
+from __future__ import annotations
+
+import urllib.parse
+
+from airflow.exceptions import AirflowFailException
+from airflow.models import BaseOperator
+from airflow.providers.sas._utils.logon import create_session_for_connection
+
+JES_URI = "/jobExecution"
+JOB_URI = f"{JES_URI}/jobs"
+
+
+class SASJobExecutionOperator(BaseOperator):
+"""
+Executes a SAS Job
+
+.. seealso::
+For more information on how to use this operator, take a look at the 
guide:
+:ref:`howto/operator:SASJobExecutionOperator`
+
+:param connection_name: Name of the SAS Viya connection stored as an 
Airflow HTTP connection
+:param job_name: Name of the SAS Job to be run
+:param parameters Dictionary of all the parameters that should be passed 
to the
+SAS Job as SAS Macro variables
+"""
+
+def __init__(self, connection_name: str, job_name: str, parameters: dict, 
**kwargs) -> None:
+super().__init__(**kwargs)
+self.connection_name = connection_name
+self.job_name = job_name

[GitHub] [airflow] potiuk commented on a diff in pull request #28505: Make pandas dependency optional for Amazon Provider

2022-12-20 Thread GitBox


potiuk commented on code in PR #28505:
URL: https://github.com/apache/airflow/pull/28505#discussion_r1053734472


##
airflow/providers/amazon/CHANGELOG.rst:
##
@@ -24,6 +24,18 @@
 Changelog
 -
 
+6.3.0
+-
+
+Features
+
+
+Pandas is now an optional dependency of the provider. The ``SqlToS3Operator`` 
and ``HiveToDynamoDBOperator``
+require Pandas to be installed (you can install it automatically by adding 
``[pandas]`` extra when installing
+the provider. Those who installed previous version of the provider, should 
already have pandas installed, so
+this is not a breaking change.

Review Comment:
   That's a good point - those are local imports there and they `raise Exception(...)`, 
but now we can change them to the optional provider feature 
exception. I will do that as a follow-up.
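   
   A sketch of what that follow-up could look like for such a local import (the helper name is illustrative; `AirflowOptionalProviderFeatureException` already exists in `airflow.exceptions`):
   
   ```python
   from airflow.exceptions import AirflowOptionalProviderFeatureException
   
   
   def _import_pandas():
       """Import pandas lazily, surfacing a missing optional dependency as the dedicated exception."""
       try:
           import pandas as pd
       except ImportError as e:
           raise AirflowOptionalProviderFeatureException(e)
       return pd
   ```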



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] potiuk commented on a diff in pull request #28505: Make pandas dependency optional for Amazon Provider

2022-12-20 Thread GitBox


potiuk commented on code in PR #28505:
URL: https://github.com/apache/airflow/pull/28505#discussion_r1053732628


##
airflow/providers/amazon/CHANGELOG.rst:
##
@@ -24,6 +24,18 @@
 Changelog
 -
 
+6.3.0
+-

Review Comment:
   That's what reviews are for :)
   



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[airflow] branch main updated (1f75e9ffcf -> 9ac76ec526)

2022-12-20 Thread potiuk
This is an automated email from the ASF dual-hosted git repository.

potiuk pushed a change to branch main
in repository https://gitbox.apache.org/repos/asf/airflow.git


from 1f75e9ffcf Move MyPY plugins of ours to dev folder (#28498)
 add 9ac76ec526 Remove outdated Optional Provider Feature outdated 
documentation (#28506)

No new revisions were added by this update.

Summary of changes:
 .../howto/create-update-providers.rst  | 33 +-
 1 file changed, 1 insertion(+), 32 deletions(-)



[GitHub] [airflow] potiuk merged pull request #28506: Remove outdated Optional Provider Feature outdated documentation

2022-12-20 Thread GitBox


potiuk merged PR #28506:
URL: https://github.com/apache/airflow/pull/28506


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] Taragolis commented on a diff in pull request #28505: Make pandas dependency optional for Amazon Provider

2022-12-20 Thread GitBox


Taragolis commented on code in PR #28505:
URL: https://github.com/apache/airflow/pull/28505#discussion_r1053726699


##
airflow/providers/amazon/CHANGELOG.rst:
##
@@ -24,6 +24,18 @@
 Changelog
 -
 
+6.3.0
+-

Review Comment:
   But we forgot to add the breaking changes to the CHANGELOG `¯\_(ツ)_/¯`



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] Taragolis commented on a diff in pull request #28505: Make pandas dependency optional for Amazon Provider

2022-12-20 Thread GitBox


Taragolis commented on code in PR #28505:
URL: https://github.com/apache/airflow/pull/28505#discussion_r1053725692


##
airflow/providers/amazon/CHANGELOG.rst:
##
@@ -24,6 +24,18 @@
 Changelog
 -
 
+6.3.0
+-

Review Comment:
   I bet the next version should be 7.0.0 due to 
https://github.com/apache/airflow/pull/27920



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] Taragolis commented on a diff in pull request #28505: Make pandas dependency optional for Amazon Provider

2022-12-20 Thread GitBox


Taragolis commented on code in PR #28505:
URL: https://github.com/apache/airflow/pull/28505#discussion_r1053711582


##
airflow/providers/amazon/CHANGELOG.rst:
##
@@ -24,6 +24,18 @@
 Changelog
 -
 
+6.3.0
+-
+
+Features
+
+
+Pandas is now an optional dependency of the provider. The ``SqlToS3Operator`` 
and ``HiveToDynamoDBOperator``
+require Pandas to be installed (you can install it automatically by adding 
``[pandas]`` extra when installing
+the provider. Those who installed previous version of the provider, should 
already have pandas installed, so
+this is not a breaking change.

Review Comment:
   Just wondering, should we also mention the `RedshiftSQLHook` methods 
`get_pandas_df` and `get_pandas_df_by_chunks`? However, these methods are provided by 
`DbApiHook`, so maybe not.
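   
   For reference, a minimal illustration of the call in question (connection setup omitted; this assumes the default Redshift connection is configured):
   
   ```python
   from airflow.providers.amazon.aws.hooks.redshift_sql import RedshiftSQLHook
   
   hook = RedshiftSQLHook()  # uses the default redshift connection
   # get_pandas_df returns a pandas DataFrame, so pandas must be installed for this call
   df = hook.get_pandas_df("SELECT 1 AS one")
   ```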



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] snjypl opened a new pull request, #28508: Fix minor typo in doc

2022-12-20 Thread GitBox


snjypl opened a new pull request, #28508:
URL: https://github.com/apache/airflow/pull/28508

   
   
   ---
   **^ Add meaningful description above**
   
   Read the **[Pull Request 
Guidelines](https://github.com/apache/airflow/blob/main/CONTRIBUTING.rst#pull-request-guidelines)**
 for more information.
   In case of fundamental code changes, an Airflow Improvement Proposal 
([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvement+Proposals))
 is needed.
   In case of a new dependency, check compliance with the [ASF 3rd Party 
License Policy](https://www.apache.org/legal/resolved.html#category-x).
   In case of backwards incompatible changes please leave a note in a 
newsfragment file, named `{pr_number}.significant.rst` or 
`{issue_number}.significant.rst`, in 
[newsfragments](https://github.com/apache/airflow/tree/main/newsfragments).
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] Taragolis opened a new pull request, #28507: Remove outdated compat imports/code from providers

2022-12-20 Thread GitBox


Taragolis opened a new pull request, #28507:
URL: https://github.com/apache/airflow/pull/28507

   
   
   ---
   **^ Add meaningful description above**
   
   Read the **[Pull Request 
Guidelines](https://github.com/apache/airflow/blob/main/CONTRIBUTING.rst#pull-request-guidelines)**
 for more information.
   In case of fundamental code changes, an Airflow Improvement Proposal 
([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvement+Proposals))
 is needed.
   In case of a new dependency, check compliance with the [ASF 3rd Party 
License Policy](https://www.apache.org/legal/resolved.html#category-x).
   In case of backwards incompatible changes please leave a note in a 
newsfragment file, named `{pr_number}.significant.rst` or 
`{issue_number}.significant.rst`, in 
[newsfragments](https://github.com/apache/airflow/tree/main/newsfragments).
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] potiuk opened a new pull request, #28506: Remove outdated Optional Provider Feature outdated documentation

2022-12-20 Thread GitBox


potiuk opened a new pull request, #28506:
URL: https://github.com/apache/airflow/pull/28506

   After bumping min_airflow_version to 2.3 the section about optional provider 
feature and the way to add it for pre-2.3 compatible providers is outdated and 
should be removed.
   
   
   
   ---
   **^ Add meaningful description above**
   
   Read the **[Pull Request 
Guidelines](https://github.com/apache/airflow/blob/main/CONTRIBUTING.rst#pull-request-guidelines)**
 for more information.
   In case of fundamental code changes, an Airflow Improvement Proposal 
([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvement+Proposals))
 is needed.
   In case of a new dependency, check compliance with the [ASF 3rd Party 
License Policy](https://www.apache.org/legal/resolved.html#category-x).
   In case of backwards incompatible changes please leave a note in a 
newsfragment file, named `{pr_number}.significant.rst` or 
`{issue_number}.significant.rst`, in 
[newsfragments](https://github.com/apache/airflow/tree/main/newsfragments).
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] potiuk commented on issue #28468: Make pandas an optional dependency for amazon provider

2022-12-20 Thread GitBox


potiuk commented on issue #28468:
URL: https://github.com/apache/airflow/issues/28468#issuecomment-1360076239

   I told you it should be simple :) 


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] potiuk commented on issue #28468: Make pandas an optional dependency for amazon provider

2022-12-20 Thread GitBox


potiuk commented on issue #28468:
URL: https://github.com/apache/airflow/issues/28468#issuecomment-1360075806

   PR in #28505


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] potiuk opened a new pull request, #28505: Make pandas dependency optional for Amazon Provider

2022-12-20 Thread GitBox


potiuk opened a new pull request, #28505:
URL: https://github.com/apache/airflow/pull/28505

   The pandas dependency is only used for a few features in the Amazon provider and it 
is a huge "drag" on the package in general - we use the existing Airflow 2.3+ 
Optional Provider Feature mechanism to make pandas an optional dependency 
(with the "[pandas]" extra).
   
   This is not a breaking change, since those who already installed a previous 
version of the provider have pandas installed already. Only people who install 
the provider manually will have to install the "pandas" extra, either for the 
Airflow core package or for the provider.
   
   Fixes: #28468
   
   
   
   ---
   **^ Add meaningful description above**
   
   Read the **[Pull Request 
Guidelines](https://github.com/apache/airflow/blob/main/CONTRIBUTING.rst#pull-request-guidelines)**
 for more information.
   In case of fundamental code changes, an Airflow Improvement Proposal 
([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvement+Proposals))
 is needed.
   In case of a new dependency, check compliance with the [ASF 3rd Party 
License Policy](https://www.apache.org/legal/resolved.html#category-x).
   In case of backwards incompatible changes please leave a note in a 
newsfragment file, named `{pr_number}.significant.rst` or 
`{issue_number}.significant.rst`, in 
[newsfragments](https://github.com/apache/airflow/tree/main/newsfragments).
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] RachitSharma2001 commented on pull request #28318: Add FTPSFileTransmitOperator

2022-12-20 Thread GitBox


RachitSharma2001 commented on PR #28318:
URL: https://github.com/apache/airflow/pull/28318#issuecomment-1360074064

   Hi everyone, I was wondering if I could get some additional reviews on this 
PR? No rush, but was just wondering if these changes look good or if there is 
anything else that should be added.


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] Taragolis commented on issue #28504: When PythonVirtualenvOperator is called from within Custom Operator DAG must be assigned and task_id cannot be reused

2022-12-20 Thread GitBox


Taragolis commented on issue #28504:
URL: https://github.com/apache/airflow/issues/28504#issuecomment-1360061675

   Simple Operator
   
   ```python
   from airflow.operators.python import PythonVirtualenvOperator
   
   
   class SimplePythonVirtualenvOperator(PythonVirtualenvOperator):
       def __init__(self, task_id: str, **kwargs):
           # my_func as defined in the DAG file
           kwargs["python_callable"] = my_func
           kwargs["python_version"] = "3.9"
           kwargs["requirements"] = ["requests"]
           super().__init__(task_id=task_id, **kwargs)
   ```
   
   Or simple "factory"
   
   ```python
   def simple_task_factory(
       task_id: str, python_callable=None, python_version=None, requirements=None, **kwargs
   ):
       return PythonVirtualenvOperator(
           task_id=task_id,
           python_callable=python_callable or my_func,
           python_version=python_version or "3.9",
           requirements=requirements or ["requests"],
           **kwargs,
       )
   ```
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] potiuk commented on issue #28468: Make pandas an optional dependency for amazon provider

2022-12-20 Thread GitBox


potiuk commented on issue #28468:
URL: https://github.com/apache/airflow/issues/28468#issuecomment-1360055481

   BTW. this is one of the 2.3+ features so we are again benefiting from 
bumping min airflow version


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] Taragolis closed issue #28504: When PythonVirtualenvOperator is called from within Custom Operator DAG must be assigned and task_id cannot be reused

2022-12-20 Thread GitBox


Taragolis closed issue #28504: When PythonVirtualenvOperator is called from 
within Custom Operator DAG must be assigned and task_id cannot be reused
URL: https://github.com/apache/airflow/issues/28504


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] Taragolis commented on issue #28504: When PythonVirtualenvOperator is called from within Custom Operator DAG must be assigned and task_id cannot be reused

2022-12-20 Thread GitBox


Taragolis commented on issue #28504:
URL: https://github.com/apache/airflow/issues/28504#issuecomment-1360053174

   ```python
   self.operator.execute(context)  # Don't do this
   ```
   
   You should not call the `BaseOperator.execute(context: Context)` method directly.
   If you need a custom `PythonVirtualenvOperator`, you need to create your own class 
based on it.
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] potiuk commented on issue #28468: Make pandas an optional dependency for amazon provider

2022-12-20 Thread GitBox


potiuk commented on issue #28468:
URL: https://github.com/apache/airflow/issues/28468#issuecomment-1360051845

   > Might it be better to make it optional for all providers where it is listed 
as a core dependency?
   
   I can do an example PR for amazon and we can split the job for the others :)


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] frankbreetz opened a new issue, #28504: When PythonVirtualenvOperator is called from within Custom Operator DAG must be assigned and task_id cannot be reused

2022-12-20 Thread GitBox


frankbreetz opened a new issue, #28504:
URL: https://github.com/apache/airflow/issues/28504

   ### Apache Airflow version
   
   2.5.0
   
   ### What happened
   
   When instantiating the PythonVirtualenvOperator as part of the execute 
function of a custom Operator, you must pass the DAG to it and you cannot reuse 
the task name. This is different from how it worked in Airflow version 2.0.
   
   
   ### What you think should happen instead
   
   The following DAG would parse and run:
   
   
   ```
   from datetime import datetime
   from airflow.decorators import dag, task
   from airflow.models import BaseOperator
   from airflow.operators.bash import BashOperator
   from airflow.operators.python import PythonVirtualenvOperator
   
   
   def my_func():
       print("this is a function")
   
   
   class MyOperator(BaseOperator):
       def __init__(
           self,
           **kwargs,
       ) -> None:
           super().__init__(**kwargs)
   
       def execute(self, context):
           self.operator = PythonVirtualenvOperator(
               task_id=f"{self.task_id}",
               python_callable=my_func,
               python_version="3.9",
               requirements=[f"requests"],
           )
           self.operator.execute(context)
   
   
   @dag(
       schedule=None,
       start_date=datetime(2021, 1, 1),
       catchup=False,
       default_args={
           "retries": 0,
       },
   )
   def my_dag():
       my_operator = MyOperator(task_id="my_task")
   ```
   
   ### How to reproduce
   
   Upload this DAG to Airflow and you will get the following error when you try 
to run it.
   
   ```
   from datetime import datetime
   from airflow.decorators import dag, task
   from airflow.models import BaseOperator
   from airflow.operators.bash import BashOperator
   from airflow.operators.python import PythonVirtualenvOperator
   
   
   def my_func():
       print("this is a function")
   
   
   class MyOperator(BaseOperator):
       def __init__(
           self,
           **kwargs,
       ) -> None:
           super().__init__(**kwargs)
   
       def execute(self, context):
           self.operator = PythonVirtualenvOperator(
               task_id=f"{self.task_id}_1",
               python_callable=my_func,
               python_version="3.9",
               requirements=[f"requests"],
               # uncomment this line to get the dag working
               # dag=context["dag"],
           )
           # This operator works without passing the DAG in as an argument or updating the task name
           # self.operator = BashOperator(
           #     task_id=f"{self.task_id}",
           #     bash_command='echo Hello World!',
           # )
           self.operator.execute(context)
   
   
   @dag(
       schedule=None,
       start_date=datetime(2021, 1, 1),
       catchup=False,
       default_args={
           "retries": 0,
       },
   )
   def my_dag():
       my_operator = MyOperator(task_id="my_task")
   
   
   my_dag = my_dag()
   
   ```
   
   ### Operating System
   
   Debian GNU/Linux 11 (bullseye)
   
   ### Versions of Apache Airflow Providers
   
   _No response_
   
   ### Deployment
   
   Astronomer
   
   ### Deployment details
   
   reproduce in Astro CLI and in an Astro Deployment
   
   ### Anything else
   
   _No response_
   
   ### Are you willing to submit PR?
   
   - [ ] Yes I am willing to submit a PR!
   
   ### Code of Conduct
   
   - [X] I agree to follow this project's [Code of 
Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] AndrewShakinovsky-SAS commented on pull request #28503: Initial support for SAS community provider

2022-12-20 Thread GitBox


AndrewShakinovsky-SAS commented on PR #28503:
URL: https://github.com/apache/airflow/pull/28503#issuecomment-1360041558

   @potiuk @XD-DENG 


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] boring-cyborg[bot] commented on pull request #28503: Initial support for SAS community provider

2022-12-20 Thread GitBox


boring-cyborg[bot] commented on PR #28503:
URL: https://github.com/apache/airflow/pull/28503#issuecomment-1360039716

   Congratulations on your first Pull Request and welcome to the Apache Airflow 
community! If you have any issues or are unsure about anything please check 
our Contribution Guide 
(https://github.com/apache/airflow/blob/main/CONTRIBUTING.rst)
   Here are some useful points:
   - Pay attention to the quality of your code (flake8, mypy and type 
annotations). Our [pre-commits]( 
https://github.com/apache/airflow/blob/main/STATIC_CODE_CHECKS.rst#prerequisites-for-pre-commit-hooks)
 will help you with that.
   - In case of a new feature add useful documentation (in docstrings or in 
`docs/` directory). Adding a new operator? Check this short 
[guide](https://github.com/apache/airflow/blob/main/docs/apache-airflow/howto/custom-operator.rst)
 Consider adding an example DAG that shows how users should use it.
   - Consider using [Breeze 
environment](https://github.com/apache/airflow/blob/main/BREEZE.rst) for 
testing locally, it's a heavy docker but it ships with a working Airflow and a 
lot of integrations.
   - Be patient and persistent. It might take some time to get a review or get 
the final approval from Committers.
   - Please follow [ASF Code of 
Conduct](https://www.apache.org/foundation/policies/conduct) for all 
communication including (but not limited to) comments on Pull Requests, Mailing 
list and Slack.
   - Be sure to read the [Airflow Coding style]( 
https://github.com/apache/airflow/blob/main/CONTRIBUTING.rst#coding-style-and-best-practices).
   Apache Airflow is a community-driven project and together we are making it 
better 🚀.
   In case of doubts contact the developers at:
   Mailing List: d...@airflow.apache.org
   Slack: https://s.apache.org/airflow-slack
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] AndrewShakinovsky-SAS opened a new pull request, #28503: Initial support for SAS community provider

2022-12-20 Thread GitBox


AndrewShakinovsky-SAS opened a new pull request, #28503:
URL: https://github.com/apache/airflow/pull/28503

   Initial support for SAS community provider
   
   Adds:
   
   SAS Studio Flow Operator
   SAS Job Execution Operator
   
   ---
   **^ Add meaningful description above**
   
   Read the **[Pull Request 
Guidelines](https://github.com/apache/airflow/blob/main/CONTRIBUTING.rst#pull-request-guidelines)**
 for more information.
   In case of fundamental code changes, an Airflow Improvement Proposal 
([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvement+Proposals))
 is needed.
   In case of a new dependency, check compliance with the [ASF 3rd Party 
License Policy](https://www.apache.org/legal/resolved.html#category-x).
   In case of backwards incompatible changes please leave a note in a 
newsfragment file, named `{pr_number}.significant.rst` or 
`{issue_number}.significant.rst`, in 
[newsfragments](https://github.com/apache/airflow/tree/main/newsfragments).
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] Taragolis commented on issue #28468: Make pandas an optional dependency for amazon provider

2022-12-20 Thread GitBox


Taragolis commented on issue #28468:
URL: https://github.com/apache/airflow/issues/28468#issuecomment-1360016607

   Or where it is applicable


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] Taragolis commented on issue #28468: Make pandas an optional dependency for amazon provider

2022-12-20 Thread GitBox


Taragolis commented on issue #28468:
URL: https://github.com/apache/airflow/issues/28468#issuecomment-1360012551

   Might it be better to make it optional for all providers where it is listed as 
a core dependency?
   
   - amazon
   - apache.hive
   - exasol
   - google
   - presto
   - salesforce
   - trino


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] vincbeck commented on issue #28468: Make pandas an optional dependency for amazon provider

2022-12-20 Thread GitBox


vincbeck commented on issue #28468:
URL: https://github.com/apache/airflow/issues/28468#issuecomment-1360007187

   I don't have much context on this one, so if creating a PR for it is easy for 
you @potiuk, be my guest :) Happy to review it if needed!


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] vincbeck commented on pull request #28502: Migrate DagFileProcessor.manage_slas to Internal API

2022-12-20 Thread GitBox


vincbeck commented on PR #28502:
URL: https://github.com/apache/airflow/pull/28502#issuecomment-1359993143

   cc @potiuk, @mhenc 


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] vincbeck opened a new pull request, #28502: Migrate DagFileProcessor.manage_slas to Internal API

2022-12-20 Thread GitBox


vincbeck opened a new pull request, #28502:
URL: https://github.com/apache/airflow/pull/28502

   Migrate DagFileProcessor.manage_slas to Internal API
   
   Addresses #28268 


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] dstandish commented on pull request #28417: Use object instead of array in config.yml for config template

2022-12-20 Thread GitBox


dstandish commented on PR #28417:
URL: https://github.com/apache/airflow/pull/28417#issuecomment-1359980647

   hi @potiuk, this is a small quality-of-life improvement making the yaml 
more easily navigable with tooling


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] potiuk commented on issue #28468: Make pandas an optional dependency for amazon provider

2022-12-20 Thread GitBox


potiuk commented on issue #28468:
URL: https://github.com/apache/airflow/issues/28468#issuecomment-1359981860

   I can even make a draft PR for that - we have a few prior-art cases (plyvel 
in the google provider was pretty similar and we made it optional); we even have the 
`AirflowOptionalProviderFeatureException` foreseen for the case where people 
would use existing code in the provider and the needed dependency is not 
installed. It's actually pretty easy to do.


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] potiuk commented on issue #28468: Make pandas an optional dependency for amazon provider

2022-12-20 Thread GitBox


potiuk commented on issue #28468:
URL: https://github.com/apache/airflow/issues/28468#issuecomment-1359974090

   Yeah. Pandas is quite a "drag" as a dependency - regardless of whether it is a 
binary wheel or not - so making it optional for the Amazon provider would be a good idea. 


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] eladkal commented on issue #28468: Make pandas an optional dependency for amazon provider

2022-12-20 Thread GitBox


eladkal commented on issue #28468:
URL: https://github.com/apache/airflow/issues/28468#issuecomment-1359972491

   These are probably leftovers from when pandas was a core dependency.
   
   @o-nikolas @ferruzzi @vincbeck probably something worth looking into.


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] eladkal commented on issue #28500: Wrong skipping operator with NONE_FAILED_MIN_ONE_SUCCESS trigger rule after ShortCircuitOperator

2022-12-20 Thread GitBox


eladkal commented on issue #28500:
URL: https://github.com/apache/airflow/issues/28500#issuecomment-1359968082

   It's not a bug. It's by design.
   Please read the ShortCircuitOperator documentation.
   You will find an explanation of how to get the functionality you want :)
   
   
https://airflow.apache.org/docs/apache-airflow/stable/_api/airflow/operators/python/index.html#airflow.operators.python.ShortCircuitOperator
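   
   A minimal sketch of what those docs describe (assuming an Airflow version where 
`ignore_downstream_trigger_rules` is available):
   
   ```
   # Hedged sketch: with ignore_downstream_trigger_rules=False the short circuit
   # only skips its direct downstream tasks, so later tasks still have their own
   # trigger rules (e.g. NONE_FAILED_MIN_ONE_SUCCESS) evaluated.
   from airflow.operators.python import ShortCircuitOperator

   task_switcher = ShortCircuitOperator(
       task_id="switcher",
       python_callable=lambda: False,
       ignore_downstream_trigger_rules=False,
   )
   ```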


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] eladkal closed issue #28500: Wrong skipping operator with NONE_FAILED_MIN_ONE_SUCCESS trigger rule after ShortCircuitOperator

2022-12-20 Thread GitBox


eladkal closed issue #28500: Wrong skipping operator with 
NONE_FAILED_MIN_ONE_SUCCESS trigger rule after ShortCircuitOperator
URL: https://github.com/apache/airflow/issues/28500


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[airflow] branch main updated: Move MyPY plugins of ours to dev folder (#28498)

2022-12-20 Thread jedcunningham
This is an automated email from the ASF dual-hosted git repository.

jedcunningham pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/airflow.git


The following commit(s) were added to refs/heads/main by this push:
 new 1f75e9ffcf Move MyPY plugins of ours to dev folder (#28498)
1f75e9ffcf is described below

commit 1f75e9ffcf0e61115ea141bc1c5de5002ef8f2c0
Author: Jarek Potiuk 
AuthorDate: Tue Dec 20 19:33:46 2022 +0100

Move MyPY plugins of ours to dev folder (#28498)

The Plugins are only used in the static check phase. The problem with
having them in the "airflow" package is that mypy imports "airflow" during
loading of the plugins and it means that it has to have a fully working
Airflow configuration to work - otherwise this import fails while
reading the configuration values.

Moving the whole mypy plugins to dev solves the problem entirely.
---
 .github/boring-cyborg.yml  | 1 -
 {airflow => dev}/mypy/__init__.py  | 0
 {airflow => dev}/mypy/plugin/__init__.py   | 0
 {airflow => dev}/mypy/plugin/decorators.py | 0
 {airflow => dev}/mypy/plugin/outputs.py| 0
 setup.cfg  | 4 ++--
 6 files changed, 2 insertions(+), 3 deletions(-)

diff --git a/.github/boring-cyborg.yml b/.github/boring-cyborg.yml
index 0d72848862..7a444cf253 100644
--- a/.github/boring-cyborg.yml
+++ b/.github/boring-cyborg.yml
@@ -83,7 +83,6 @@ labelPRBasedOnFilePath:
 - tests/www/api/**/*
 
   area:dev-tools:
-- airflow/mypy/**/*
 - scripts/**/*
 - dev/**/*
 - .github/**/*
diff --git a/airflow/mypy/__init__.py b/dev/mypy/__init__.py
similarity index 100%
rename from airflow/mypy/__init__.py
rename to dev/mypy/__init__.py
diff --git a/airflow/mypy/plugin/__init__.py b/dev/mypy/plugin/__init__.py
similarity index 100%
rename from airflow/mypy/plugin/__init__.py
rename to dev/mypy/plugin/__init__.py
diff --git a/airflow/mypy/plugin/decorators.py b/dev/mypy/plugin/decorators.py
similarity index 100%
rename from airflow/mypy/plugin/decorators.py
rename to dev/mypy/plugin/decorators.py
diff --git a/airflow/mypy/plugin/outputs.py b/dev/mypy/plugin/outputs.py
similarity index 100%
rename from airflow/mypy/plugin/outputs.py
rename to dev/mypy/plugin/outputs.py
diff --git a/setup.cfg b/setup.cfg
index 2274d98e3a..5cc2a1342b 100644
--- a/setup.cfg
+++ b/setup.cfg
@@ -174,8 +174,8 @@ no_implicit_optional = True
 warn_redundant_casts = True
 warn_unused_ignores = False
 plugins =
-  airflow.mypy.plugin.decorators,
-  airflow.mypy.plugin.outputs
+  dev.mypy.plugin.decorators,
+  dev.mypy.plugin.outputs
 pretty = True
 show_error_codes = True
 



[GitHub] [airflow] jedcunningham merged pull request #28498: Move MyPY plugins of ours to dev folder

2022-12-20 Thread GitBox


jedcunningham merged PR #28498:
URL: https://github.com/apache/airflow/pull/28498


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] vchiapaikeo commented on pull request #28444: Fix GCSToBigQueryOperator not respecting schema_obj

2022-12-20 Thread GitBox


vchiapaikeo commented on PR #28444:
URL: https://github.com/apache/airflow/pull/28444#issuecomment-1359965360

   My pleasure @eladkal! Sure I can try to take a look at it this weekend.


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] morooshka opened a new issue, #28500: Wrong skipping operator with NONE_FAILED_MIN_ONE_SUCCESS trigger rule after ShortCircuitOperator

2022-12-20 Thread GitBox


morooshka opened a new issue, #28500:
URL: https://github.com/apache/airflow/issues/28500

   ### Apache Airflow version
   
   2.5.0
   
   ### What happened
   
   An operator with the NONE_FAILED_MIN_ONE_SUCCESS trigger rule is wrongly skipped 
after the first parent operator has been skipped by its parent 
ShortCircuitOperator, even though other parent operators are still running. 
The same reproduces with the ONE_SUCCESS trigger rule.
   
   ### What you think should happen instead
   
   The operator with the NONE_FAILED_MIN_ONE_SUCCESS (or ONE_SUCCESS) trigger rule 
should wait for all parent operators (or for at least one successful parent).
   
   ### How to reproduce
   
   Please find the DAG example:
   
   ```
   from datetime import datetime
   from time import sleep

   from airflow import DAG
   from airflow.operators.dummy import DummyOperator
   from airflow.operators.python import PythonOperator, ShortCircuitOperator
   from airflow.utils.trigger_rule import TriggerRule


   def do_sleep() -> bool:
       sleep(1)
       return True


   with DAG(
       dag_id="wrong_skip",
       description='Wrong skip',
       schedule_interval=None,
       start_date=datetime(2022, 11, 29),
       default_args={},
       max_active_runs=1,
       catchup=False,
       params={},
       is_paused_upon_creation=True,
       tags=['sandbox']
   ) as dag:
       task_switcher = ShortCircuitOperator(task_id="switcher", python_callable=lambda: False)
       task_skipped = PythonOperator(task_id="skipped", python_callable=lambda: True)
       task_executed = PythonOperator(task_id="executed", python_callable=do_sleep)
       task_end = DummyOperator(task_id='problem', trigger_rule=TriggerRule.NONE_FAILED_MIN_ONE_SUCCESS)

       task_switcher >> task_skipped >> task_end
       task_executed >> task_end
   ```
   
   The result of execution:
   
![image](https://user-images.githubusercontent.com/72786791/208738725-8c8f1b58-7020-4938-a272-c53a33d5efb4.png)
   
   
   ### Operating System
   
   CentOS Linux 7 (Core)
   
   ### Versions of Apache Airflow Providers
   
   _No response_
   
   ### Deployment
   
   Virtualenv installation
   
   ### Deployment details
   
   _No response_
   
   ### Anything else
   
   The problem reproduces every time
   
   ### Are you willing to submit PR?
   
   - [X] Yes I am willing to submit a PR!
   
   ### Code of Conduct
   
   - [X] I agree to follow this project's [Code of 
Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] Taragolis commented on issue #23824: Race condition between Triggerer and Scheduler

2022-12-20 Thread GitBox


Taragolis commented on issue #23824:
URL: https://github.com/apache/airflow/issues/23824#issuecomment-1359954109

   @RNHTTR better to open a new issue with additional details and how to 
reproduce it.
   
   However, it could also be a problem with `ExternalTaskSensorAsync`, which is not 
part of Airflow or the community providers; in that case it is better to open an issue 
in 
[astronomer/astronomer-providers](https://github.com/astronomer/astronomer-providers/issues)
   
   As far as I know, many deferrable operators use 
[`asgiref.sync.sync_to_async`](https://docs.djangoproject.com/en/4.1/topics/async/#asgiref.sync.sync_to_async)
 to communicate with the synchronous parts of Airflow (most of Airflow's code 
is synchronous), such as Connections, Variables, Configuration, and reads/writes to 
the Airflow metadata database, and this approach could have some side effects. But 
that is just my assumption.
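   
   For illustration, a rough sketch of that pattern (the function and variable 
names here are made up):
   
   ```
   # Sketch of the sync-to-async bridge used by many deferrable operators/triggers:
   # a synchronous Airflow call (here Variable.get, which reads the metadata DB)
   # is wrapped so it can be awaited from async trigger code.
   from asgiref.sync import sync_to_async

   from airflow.models import Variable


   async def fetch_threshold() -> str:
       return await sync_to_async(Variable.get)("my_threshold", default_var="10")
   ```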


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] potiuk commented on issue #28280: Inconsistent handling of newlines in tasks

2022-12-20 Thread GitBox


potiuk commented on issue #28280:
URL: https://github.com/apache/airflow/issues/28280#issuecomment-1359916847

   Good catch.
   
   
   It is because this is what Jinja2 does by default - it replaces any newline 
it detects in the template (not in the values) with the value of 
newline_sequence passed as a parameter of the Template (which is `\n` by 
default). This is nothing specific to Airflow - it's just how Jinja2 works.
   
   You can test it yourself:
   
   When you run this one in a regular terminal (the same happens with no 
newline_sequence given):
   ```
   import jinja2
   
   jinjatemplate = jinja2.Template('Hello \r{{ name }}!', newline_sequence="\n")
   print(jinjatemplate.render(name="World"))
   ```
   
   You will get:
   
   ```
   Hello 
   World!
   ```
   
   Where this one:
   
   ```
   import jinja2
   
   jinjatemplate = jinja2.Template('Hello \r{{ name }}!', newline_sequence="\r")
   print(jinjatemplate.render(name="World"))
   ```
   
   Will print just:
   
   ```
   World!
   ```
   
   The thing is that Jinja2 treats the template as a multi-line text file. It 
reads the template line by line and later joins the lines using "newline_sequence" 
as the separator. I think this has a lot to do with the way formatting of 
Jinja templates and whitespace is handled. For example, it would be rather difficult 
to achieve the whitespace manipulation that Jinja allows you to do with whitespace 
control: 
https://jinja.palletsprojects.com/en/3.1.x/templates/#whitespace-control - so 
the Jinja authors opted to define the "canonical newline" they 
always use in the rendered template. Which makes a lot of sense - imagine the 
complexity if you had a template that mixed both \r and \n and Jinja 
added/removed those with some level of consistency.
   
   Additionally, the `test_bash_operator_heredoc_contains_newlines` test does not 
complete successfully for you because it has a bug (in the task definition). 
   
   Proper command should be:
   
   ```
  bash_command="""
   diff <(
   cat <&2 "Bash heredoc contains newlines incorrectly converted to 
Windows CRLF"
   exit 1
   }
   """,
   ```
   
   The problem was that the heredoc was passed as input to the dos2unix command, not to 
the `cat` command, and the cat command simply read from stdin. Apparently the 
astro SDK somehow replaces stdin when it runs Airflow tasks, so cat simply 
gets an empty string or something like that.
   
   Solution:
   
   I am not sure if we want to do anything with it. This seems to be standard 
Jinja2 behaviour, and even if it is kind of unexpected, I think this is the 
behaviour our users already likely rely on, and changing it would be 
backwards-incompatible. Maybe we could specify newline_sequence somewhere in the 
DAG definition (same as render_template_as_native_object?)
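   
   For what it's worth, a rough sketch of how a user could already tweak this per 
DAG today (assuming `jinja_environment_kwargs` is forwarded to the Jinja2 
`Environment` as documented):
   
   ```
   # Hedged sketch: override the newline normalisation for one DAG's templates.
   # Jinja2 only accepts "\n", "\r\n" or "\r" for newline_sequence.
   from datetime import datetime

   from airflow import DAG

   with DAG(
       dag_id="crlf_templates",
       start_date=datetime(2022, 12, 1),
       schedule_interval=None,
       jinja_environment_kwargs={"newline_sequence": "\r\n"},
   ) as dag:
       ...
   ```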
   
   In any case, it would be great if this behaviour were documented.
   
   WDYT?
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] jedcunningham commented on a diff in pull request #27829: Improving the release process

2022-12-20 Thread GitBox


jedcunningham commented on code in PR #27829:
URL: https://github.com/apache/airflow/pull/27829#discussion_r1053594395


##
.github/workflows/ci.yml:
##
@@ -746,6 +746,44 @@ jobs:
 run: breeze ci fix-ownership
 if: always()
 
+  test-airflow-release-commands:
+timeout-minutes: 80
+name: "Test Airflow release commands"
+runs-on: "${{needs.build-info.outputs.runs-on}}"
+needs: [build-info, wait-for-ci-images]
+if: needs.build-info.outputs.runs-on == 'self-hosted'
+env:
+  RUNS_ON: "${{needs.build-info.outputs.runs-on}}"
+  PYTHON_MAJOR_MINOR_VERSION: 
"${{needs.build-info.outputs.default-python-version}}"
+steps:
+  - name: Cleanup repo
+run: docker run -v "${GITHUB_WORKSPACE}:/workspace" -u 0:0 bash -c "rm 
-rf /workspace/*"
+  - name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
+uses: actions/checkout@v3
+with:
+  persist-credentials: false
+  - name: >
+  Prepare breeze & CI image: 
${{needs.build-info.outputs.default-python-version}}:${{env.IMAGE_TAG}}
+uses: ./.github/actions/prepare_breeze_and_image
+  - name: "Cleanup dist files"
+run: rm -fv ./dist/*
+  - name: "Check Airflow RC process command"
+run: |
+  pip install twine rich_click jinja2 gitpython
+  breeze release-management start-rc-process --version 2.4.3rc1 
--previous-version 2.4.2  --answer y
+  - name: "Check Airflow release command"
+run: |
+  pip install twine

Review Comment:
   Do we need to do this repetitively? Can we have a single step that just 
installs what we need for the rest?



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] manugarri commented on issue #28468: Make pandas an optional dependency for amazon provider

2022-12-20 Thread GitBox


manugarri commented on issue #28468:
URL: https://github.com/apache/airflow/issues/28468#issuecomment-1359877907

   @Taragolis we use AWS MWAA, so we are stuck with python 3.7 unfortunately :/


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] bbovenzi commented on issue #28483: Issues with Custom Menu Items on Smaller Windows

2022-12-20 Thread GitBox


bbovenzi commented on issue #28483:
URL: https://github.com/apache/airflow/issues/28483#issuecomment-1359726425

   The `padding-top` should work. You may need to run the webserver in dev mode: 
`airflow webserver -d`


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] michaelmicheal commented on issue #28483: Issues with Custom Menu Items on Smaller Windows

2022-12-20 Thread GitBox


michaelmicheal commented on issue #28483:
URL: https://github.com/apache/airflow/issues/28483#issuecomment-1359718366

   I was playing around with a patch like this in our Airflow instance, but it 
didn't seem to change the CSS in the browser. Is there some sort of style 
rendering done in CI, or something else, that prevents me from just changing 
main.css in our Airflow image to make this change?
   ```diff
   --- /usr/local/lib/python3.9/site-packages/airflow/www/static/css/main.css
   +++ /usr/local/lib/python3.9/site-packages/airflow/www/static/css/main.css
   @@ -50,6 +50,7 @@ div.container {
  width: 98%;
  padding-left: 15px;
  padding-right: 15px;
   +  padding-top: 50px;
}
   
.navbar a {
   @@ -61,6 +62,11 @@ div.container {
  color: #e2d2e2;
}
   
   +.navbar li.dropdown .dropdown-menu {
   +  overflow-y: auto;
   +}
   +
   +
.navbar-nav li.dropdown:hover > .dropdown-menu,
.navbar-nav li.dropdown:focus-within > .dropdown-menu {
  display: block;
   
   ```


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] bbovenzi commented on issue #28483: Issues with Custom Menu Items on Smaller Windows

2022-12-20 Thread GitBox


bbovenzi commented on issue #28483:
URL: https://github.com/apache/airflow/issues/28483#issuecomment-1359707933

   1. We'd need some sort of `overflow-y: auto;` on `.dropdown-menu`, with a 
max-height that is never more than the window height 
   2. We'd need to set a `max-width` and `overflow-x: auto` on `.navbar-nav`


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] sdebruyn opened a new issue, #28499: "Dependency already registered for DAG" warnings with tasks with multiple outputs

2022-12-20 Thread GitBox


sdebruyn opened a new issue, #28499:
URL: https://github.com/apache/airflow/issues/28499

   ### Apache Airflow version
   
   Other Airflow 2 version (please specify below)
   
   ### What happened
   
   Airflow 2.4.3
   
   Just like in issue #26599, I get a lot of warnings saying I'm registering 
dependencies multiple times. With a little bit of debugging, I found the 
culprit:
   
   Some of my tasks have multiple outputs, which are then used by other tasks. 
These are TaskFlow-based tasks. The task dependency is added when it detects 
the first XComArg and then it's added again with every following XComArg, 
generating these warnings.
   
   Call stack:
   
   https://user-images.githubusercontent.com/963413/208720058-bf66a57c-8849-4fda-9bd4-b570eac4ce93.png
   
   
   ### What you think should happen instead
   
   You should not see the warnings if the cause is that you're using multiple 
outputs from the same upstream task.
   
   ### How to reproduce
   
   Create a function that returns a `dict[str, str]` and make it a task with 
`@task(multiple_outputs=True)`. Then create another task that takes multiple of 
these outputs as input.
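   
   A minimal sketch of such a DAG (names are illustrative):
   
   ```
   # Hedged reproduction sketch: one TaskFlow task with multiple_outputs=True,
   # and a second task consuming more than one of those outputs.
   from datetime import datetime

   from airflow import DAG
   from airflow.decorators import task

   with DAG(dag_id="multi_output_warning", start_date=datetime(2022, 12, 1), schedule_interval=None) as dag:

       @task(multiple_outputs=True)
       def produce() -> dict[str, str]:
           return {"first": "a", "second": "b"}

       @task
       def consume(first: str, second: str) -> None:
           print(first, second)

       outputs = produce()
       consume(first=outputs["first"], second=outputs["second"])
   ```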
   
   ### Operating System
   
   Debian GNU/Linux 11 (bullseye)
   
   ### Versions of Apache Airflow Providers
   
   not relevant
   
   ### Deployment
   
   Official Apache Airflow Helm Chart
   
   ### Deployment details
   
   You can get this with unit tests just by creating a `DagBag` instance with 
your DAGs.
   
   ### Anything else
   
   _No response_
   
   ### Are you willing to submit PR?
   
   - [ ] Yes I am willing to submit a PR!
   
   ### Code of Conduct
   
   - [X] I agree to follow this project's [Code of 
Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] potiuk commented on pull request #28498: Move MyPY plugins of ours to dev folder

2022-12-20 Thread GitBox


potiuk commented on PR #28498:
URL: https://github.com/apache/airflow/pull/28498#issuecomment-1359701473

   It will also make "mypy" a bit faster to start because it will not have to 
import airflow as a "regular" package. I believe that when we had it in airflow, 
it actually imported "airflow" twice - once to initialize the plugin and a second 
time to import everything with "TYPE_CHECKING". Now only the "TYPE_CHECKING" 
import remains. It does not perform all the initialization that normally is 
performed when the "airflow" package gets imported, so it should make mypy quite a 
bit faster (especially when running with few files).


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] potiuk commented on pull request #28498: Move MyPY plugins of ours to dev folder

2022-12-20 Thread GitBox


potiuk commented on PR #28498:
URL: https://github.com/apache/airflow/pull/28498#issuecomment-1359696663

   I noticed it while implementing #28495 and realized we should separate it 
out.


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] potiuk opened a new pull request, #28498: Move MyPY plugins of ours to dev folder

2022-12-20 Thread GitBox


potiuk opened a new pull request, #28498:
URL: https://github.com/apache/airflow/pull/28498

   The Plugins are only used in the static check phase. The problem with having 
them in the "airflow" package is that mypy imports "airflow" during loading of the 
plugins and it means that it has to have a fully working Airflow configuration to 
work - otherwise this import fails while reading the configuration values.
   
   Moving the whole mypy plugins to dev solves the problem entirely.
   
   
   
   ---
   **^ Add meaningful description above**
   
   Read the **[Pull Request 
Guidelines](https://github.com/apache/airflow/blob/main/CONTRIBUTING.rst#pull-request-guidelines)**
 for more information.
   In case of fundamental code changes, an Airflow Improvement Proposal 
([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvement+Proposals))
 is needed.
   In case of a new dependency, check compliance with the [ASF 3rd Party 
License Policy](https://www.apache.org/legal/resolved.html#category-x).
   In case of backwards incompatible changes please leave a note in a 
newsfragment file, named `{pr_number}.significant.rst` or 
`{issue_number}.significant.rst`, in 
[newsfragments](https://github.com/apache/airflow/tree/main/newsfragments).
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] bbovenzi commented on a diff in pull request #28411: Fix calendar view for CronTriggerTimeTable dags

2022-12-20 Thread GitBox


bbovenzi commented on code in PR #28411:
URL: https://github.com/apache/airflow/pull/28411#discussion_r1053517558


##
airflow/www/views.py:
##
@@ -2828,7 +2828,7 @@ def _convert_to_date(session, column):
 restriction = TimeRestriction(dag.start_date, dag.end_date, False)
 dates = collections.Counter()
 
-if isinstance(dag.timetable, CronDataIntervalTimetable):
+if isinstance(dag.timetable, CronTriggerTimetable):

Review Comment:
   1. Included both cron timetables in the if statement
   
   2. `next_dagrun_info` was always the same 
[here](https://github.com/apache/airflow/blob/main/airflow/www/views.py#L2844). 
So we never reached either of the `break` statements. 



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


