[GitHub] [airflow] sahilthapar commented on pull request #8625: [AIRFLOW-4734] Upsert functionality for PostgresHook.insert_rows()

2020-09-02 Thread GitBox


sahilthapar commented on pull request #8625:
URL: https://github.com/apache/airflow/pull/8625#issuecomment-686296002


   The upsert functionality doesn't support "Overriding system value" for 
identity columns; is there any planned support?
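   (For context: a column declared GENERATED ALWAYS AS IDENTITY only accepts 
explicit values when the INSERT carries PostgreSQL's OVERRIDING SYSTEM VALUE 
clause, which insert_rows() does not emit. A hedged workaround sketch, where 
`my_table`, its columns, and the connection id are purely illustrative and not 
part of PR #8625:)

    # Sketch only: run the upsert SQL directly so the OVERRIDING SYSTEM VALUE
    # clause can be included for the identity column.
    from airflow.providers.postgres.hooks.postgres import PostgresHook

    hook = PostgresHook(postgres_conn_id="postgres_default")
    sql = """
        INSERT INTO my_table (id, name)
        OVERRIDING SYSTEM VALUE
        VALUES (%s, %s)
        ON CONFLICT (id) DO UPDATE SET name = EXCLUDED.name
    """
    hook.run(sql, parameters=(42, "example"))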



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[airflow] branch master updated: Update INTHEWILD.md (#10703)

2020-09-02 Thread turbaszek
This is an automated email from the ASF dual-hosted git repository.

turbaszek pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/airflow.git


The following commit(s) were added to refs/heads/master by this push:
 new 2f5bf8b  Update INTHEWILD.md (#10703)
2f5bf8b is described below

commit 2f5bf8bc48b0ec8e3f7377fc9d627c832f0fb1d0
Author: Diego Lopes 
AuthorDate: Thu Sep 3 03:43:47 2020 -0300

Update INTHEWILD.md (#10703)
---
 INTHEWILD.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/INTHEWILD.md b/INTHEWILD.md
index d6e349f..f875d14 100644
--- a/INTHEWILD.md
+++ b/INTHEWILD.md
@@ -125,7 +125,7 @@ Currently **officially** using Airflow:
 1. [Data Reply](https://www.datareply.co.uk/) 
[[@kaxil](https://github.com/kaxil)]
 1. [DataCamp](https://datacamp.com/) [[@dgrtwo](https://github.com/dgrtwo)]
 1. [DataFox](https://www.datafox.com/) 
[[@sudowork](https://github.com/sudowork)]
-1. [DataSprints](https://datasprints.com/) 
[[@lopesdiego12](https://github.com/lopesdiego12)]
+1. [DataSprints](https://datasprints.com/) 
[[@lopesdiego12](https://github.com/lopesdiego12) & 
[@rafaelsantanaep](https://github.com/rafaelsantanaep)]
 1. [Dentsu Inc.](http://www.dentsu.com/) 
[[@bryan831](https://github.com/bryan831) & 
[@loozhengyuan](https://github.com/loozhengyuan)]
 1. [Deseret Digital Media](http://deseretdigital.com/) 
[[@formigone](https://github.com/formigone)
 1. [Digital First Media](http://www.digitalfirstmedia.com/) 
[[@duffn](https://github.com/duffn) & [@mschmo](https://github.com/mschmo) & 
[@seanmuth](https://github.com/seanmuth)]



[GitHub] [airflow] turbaszek merged pull request #10703: Update INTHEWILD.md

2020-09-02 Thread GitBox


turbaszek merged pull request #10703:
URL: https://github.com/apache/airflow/pull/10703


   



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] msumit commented on issue #10704: Unable to Stop / Kill the currently running Airflow DAG programmatically.

2020-09-02 Thread GitBox


msumit commented on issue #10704:
URL: https://github.com/apache/airflow/issues/10704#issuecomment-686280441


   It feels like a use case of DAG 
[versioning](https://cwiki.apache.org/confluence/display/AIRFLOW/AIP-36+DAG+Versioning),
 with a flag to let the scheduler/worker kill the running tasks if they are not 
at the latest version. 
   
   @kaxil should be able to shed more light here. 



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] msumit commented on pull request #10681: Add Stacktrace when DagFileProcessorManager gets killed

2020-09-02 Thread GitBox


msumit commented on pull request #10681:
URL: https://github.com/apache/airflow/pull/10681#issuecomment-686250837


   @potiuk @XD-DENG can you guys approve it, if it looks fine?



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[airflow] tag nightly-master updated (f40ac9b -> 4e09cb5)

2020-09-02 Thread github-bot
This is an automated email from the ASF dual-hosted git repository.

github-bot pushed a change to tag nightly-master
in repository https://gitbox.apache.org/repos/asf/airflow.git.


*** WARNING: tag nightly-master was modified! ***

from f40ac9b  (commit)
  to 4e09cb5  (commit)
from f40ac9b  Add placement_strategy option (#9444)
 add 804548d  Add Dataprep operators (#10304)
 add cc551ba  Add packages to function names in bash (#10670)
 add 70f05ac  Add `log_id` field to log lines on ES handler (#10411)
 add 8ac6f29  Fix format of install commands (#10676)
 add 4c4a7a8  Improve getting started section (#10680)
 add 50c9411  Remove airflow-pr tool  (#10675)
 add 805781b  Update INTHEWILD.md (#10683)
 add 0d76b59  Remove redundant section from dev/README.md toc (#10689)
 add 72b2be7  [AIRFLOW-XXX] Add task execution process on Celery Execution 
diagram (#6961)
 add 9108cb5  docs: They added support for celltags to Jupyter Lab (#9141)
 add 0d9e421  Unify command names in CLI (#10669)
 add 338b412  Add on_kill support for the KubernetesPodOperator (#10666)
 add 9a10f83  Revert recent breeze changes (#10651 & #10670) (#10694)
 add 02b853b  Fix failing black test (#10697)
 add 48ce4bd  Fix missing dash in flag for statsd container (#10691)
 add e5785d4  Chart: Flower deployment should use Flower image (#10701)
 add 649ce4b  Implement Google Shell Conventions for breeze script (#10695)
 add 4e09cb5  Add packages to function names in bash (#10670) (#10696)

No new revisions were added by this update.

Summary of changes:
 .pre-commit-config.yaml|6 +-
 BREEZE.rst |2 +-
 IMAGES.rst |4 +-
 INTHEWILD.md   |1 +
 README.md  |8 +-
 UPDATING.md|   75 +-
 airflow/cli/cli_parser.py  |   16 +-
 airflow/cli/commands/legacy_commands.py|4 +-
 airflow/cli/commands/task_command.py   |2 +-
 airflow/models/connection.py   |2 +-
 .../cncf/kubernetes/operators/kubernetes_pod.py|   10 +
 .../providers/elasticsearch/log/es_task_handler.py |3 +
 .../google/cloud/example_dags/example_dataprep.py  |   45 +-
 airflow/providers/google/cloud/hooks/dataprep.py   |   76 +-
 .../providers/google/cloud/operators/dataprep.py   |   76 +-
 breeze |  426 
 chart/templates/_helpers.yaml  |2 +-
 chart/templates/statsd/statsd-deployment.yaml  |2 +-
 dev/README.md  |   85 +-
 dev/airflow-pr | 1036 
 dev/requirements.txt   |4 +-
 docs/executor/celery.rst   |   35 +
 docs/howto/operator/google/cloud/dataprep.rst  |   76 +-
 docs/howto/operator/papermill.rst  |3 -
 docs/img/run_task_on_celery_executor.png   |  Bin 0 -> 55939 bytes
 docs/img/run_task_on_celery_executor.puml  |   77 ++
 docs/security/secrets/fernet.rst   |2 +-
 kubernetes_tests/test_kubernetes_pod_operator.py   |   28 +
 .../ci_prepare_backport_packages.sh|6 +-
 .../ci_prepare_backport_readme.sh  |6 +-
 ...ci_test_backport_packages_import_all_classes.sh |4 +-
 ...ci_test_backport_packages_install_separately.sh |4 +-
 scripts/ci/constraints/ci_generate_constraints.sh  |6 +-
 scripts/ci/docs/ci_docs.sh |6 +-
 scripts/ci/images/ci_build_dockerhub.sh|   12 +-
 scripts/ci/images/ci_prepare_ci_image_on_ci.sh |   12 +-
 scripts/ci/images/ci_prepare_prod_image_on_ci.sh   |8 +-
 scripts/ci/images/ci_push_ci_images.sh |4 +-
 scripts/ci/images/ci_push_production_images.sh |4 +-
 scripts/ci/images/ci_wait_for_all_ci_images.sh |4 +-
 scripts/ci/images/ci_wait_for_all_prod_images.sh   |6 +-
 .../ci/kubernetes/ci_deploy_app_to_kubernetes.sh   |   20 +-
 scripts/ci/kubernetes/ci_run_kubernetes_tests.sh   |6 +-
 scripts/ci/libraries/_build_images.sh  |  190 ++--
 scripts/ci/libraries/_initialization.sh|  102 +-
 scripts/ci/libraries/_kind.sh  |   48 +-
 scripts/ci/libraries/_local_mounts.sh  |6 +-
 scripts/ci/libraries/_md5sum.sh|   44 +-
 scripts/ci/libraries/_parameters.sh|8 +-
 scripts/ci/libraries/_permissions.sh   |   22 +-
 scripts/ci/libraries/_push_pull_remove_images.sh   |   52 +-
 scripts/ci/libraries/_pylint.sh|2 +-
 scripts/ci/libraries/_runs.sh  |8 +-
 scripts/ci/libraries/_sanity_checks.sh |   

[airflow] branch 1-10-yaml-generator created (now ebe7149)

2020-09-02 Thread dimberman
This is an automated email from the ASF dual-hosted git repository.

dimberman pushed a change to branch 1-10-yaml-generator
in repository https://gitbox.apache.org/repos/asf/airflow.git.


  at ebe7149  add generate_yaml to 1.10

This branch includes the following new commits:

 new ca37685  YAML Generation function
 new ebe7149  add generate_yaml to 1.10

The 2 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.




[airflow] 02/02: add generate_yaml to 1.10

2020-09-02 Thread dimberman
This is an automated email from the ASF dual-hosted git repository.

dimberman pushed a commit to branch 1-10-yaml-generator
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit ebe71491fc1c8e6489d4417e543ec5de23736943
Author: Daniel Imberman 
AuthorDate: Wed Sep 2 18:37:59 2020 -0700

add generate_yaml to 1.10
---
 airflow/bin/cli.py |  33 -
 chart/charts/postgresql-6.3.12.tgz | Bin 0 -> 22754 bytes
 2 files changed, 20 insertions(+), 13 deletions(-)

diff --git a/airflow/bin/cli.py b/airflow/bin/cli.py
index 619e305..1e01ab2 100644
--- a/airflow/bin/cli.py
+++ b/airflow/bin/cli.py
@@ -1494,40 +1494,47 @@ def list_users(args):
 
 @cli_utils.action_logging
 def generate_kubernetes_pod_yaml(args):
-from airflow.executors.kubernetes_executor import 
AirflowKubernetesScheduler, KubeConfig
-from airflow.kubernetes.pod_generator import PodGenerator
-from airflow.kubernetes.worker_configuration import WorkerConfiguration
+from airflow.contrib.executors.kubernetes_executor import 
AirflowKubernetesScheduler, KubeConfig, KubernetesExecutorConfig
+from airflow.contrib.kubernetes.worker_configuration import 
WorkerConfiguration
+from airflow.settings import pod_mutation_hook
+from airflow.contrib.kubernetes.kubernetes_request_factory import \
+pod_request_factory as pod_factory
+
+kube_request_factory = pod_factory.SimplePodRequestFactory()
 execution_date = datetime.datetime(2020, 11, 3)
 dag = get_dag(args)
 yaml_output_path = args.output_path or "/tmp/airflow_generated_yaml/"
 kube_config = KubeConfig()
 for task in dag.tasks:
 ti = TaskInstance(task, execution_date)
-pod = PodGenerator.construct_pod(
+pod_id = AirflowKubernetesScheduler._create_pod_id(  # pylint: 
disable=W0212
+args.dag_id, ti.task_id)
+print("pod id " + str(pod_id))
+worker_configuration = WorkerConfiguration(kube_config=kube_config)
+pod = worker_configuration.make_pod(
 dag_id=args.dag_id,
 task_id=ti.task_id,
-pod_id=AirflowKubernetesScheduler._create_pod_id(  # pylint: 
disable=W0212
-args.dag_id, ti.task_id),
+pod_id=pod_id,
 try_number=ti.try_number,
-date=ti.execution_date,
-command=ti.command_as_list(),
-kube_executor_config=PodGenerator.from_obj(ti.executor_config),
+airflow_command=ti.command_as_list(),
+execution_date=execution_date,
+
kube_executor_config=KubernetesExecutorConfig.from_dict(ti.executor_config),
 worker_uuid="worker-config",
 namespace=kube_config.executor_namespace,
-worker_config=WorkerConfiguration(kube_config=kube_config).as_pod()
 )
+pod_mutation_hook(pod)
+request = kube_request_factory.create(pod)
+
 import os
 
 import yaml
 from kubernetes.client.api_client import ApiClient
-api_client = ApiClient()
 date_string = 
AirflowKubernetesScheduler._datetime_to_label_safe_datestring( # pylint: 
disable=W0212
 execution_date)
 yaml_file_name = f"{args.dag_id}_{ti.task_id}_{date_string}.yml"
 os.makedirs(os.path.dirname(yaml_output_path), exist_ok=True)
 with open(yaml_output_path + yaml_file_name, "w") as output:
-sanitized_pod = api_client.sanitize_for_serialization(pod)
-output.write(yaml.dump(sanitized_pod))
+output.write(yaml.dump(request))
 print(f"YAML output can be found at {yaml_output_path}")
 
 @cli_utils.action_logging
diff --git a/chart/charts/postgresql-6.3.12.tgz 
b/chart/charts/postgresql-6.3.12.tgz
new file mode 100644
index 000..51751d7
Binary files /dev/null and b/chart/charts/postgresql-6.3.12.tgz differ



[airflow] 01/02: YAML Generation function

2020-09-02 Thread dimberman
This is an automated email from the ASF dual-hosted git repository.

dimberman pushed a commit to branch 1-10-yaml-generator
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit ca376853360c6fcbed033122e1ca11ad1d19d429
Author: Daniel Imberman 
AuthorDate: Wed Sep 2 18:07:26 2020 -0700

YAML Generation function

(cherry picked from commit 1d49f62589f981e64fd58ddc39ad78717590bc26)
---
 airflow/bin/cli.py | 53 +
 1 file changed, 53 insertions(+)

diff --git a/airflow/bin/cli.py b/airflow/bin/cli.py
index 82162f2..619e305 100644
--- a/airflow/bin/cli.py
+++ b/airflow/bin/cli.py
@@ -19,6 +19,7 @@
 # under the License.
 
 from __future__ import print_function
+import datetime
 import errno
 import importlib
 import logging
@@ -1491,6 +1492,43 @@ def list_users(args):
 msg = msg.encode('utf-8')
 print(msg)
 
+@cli_utils.action_logging
+def generate_kubernetes_pod_yaml(args):
+from airflow.executors.kubernetes_executor import 
AirflowKubernetesScheduler, KubeConfig
+from airflow.kubernetes.pod_generator import PodGenerator
+from airflow.kubernetes.worker_configuration import WorkerConfiguration
+execution_date = datetime.datetime(2020, 11, 3)
+dag = get_dag(args)
+yaml_output_path = args.output_path or "/tmp/airflow_generated_yaml/"
+kube_config = KubeConfig()
+for task in dag.tasks:
+ti = TaskInstance(task, execution_date)
+pod = PodGenerator.construct_pod(
+dag_id=args.dag_id,
+task_id=ti.task_id,
+pod_id=AirflowKubernetesScheduler._create_pod_id(  # pylint: 
disable=W0212
+args.dag_id, ti.task_id),
+try_number=ti.try_number,
+date=ti.execution_date,
+command=ti.command_as_list(),
+kube_executor_config=PodGenerator.from_obj(ti.executor_config),
+worker_uuid="worker-config",
+namespace=kube_config.executor_namespace,
+worker_config=WorkerConfiguration(kube_config=kube_config).as_pod()
+)
+import os
+
+import yaml
+from kubernetes.client.api_client import ApiClient
+api_client = ApiClient()
+date_string = 
AirflowKubernetesScheduler._datetime_to_label_safe_datestring( # pylint: 
disable=W0212
+execution_date)
+yaml_file_name = f"{args.dag_id}_{ti.task_id}_{date_string}.yml"
+os.makedirs(os.path.dirname(yaml_output_path), exist_ok=True)
+with open(yaml_output_path + yaml_file_name, "w") as output:
+sanitized_pod = api_client.sanitize_for_serialization(pod)
+output.write(yaml.dump(sanitized_pod))
+print(f"YAML output can be found at {yaml_output_path}")
 
 @cli_utils.action_logging
 def list_dag_runs(args, dag=None):
@@ -1581,6 +1619,11 @@ class CLIFactory(object):
 'execution_date': Arg(
 ("execution_date",), help="The execution date of the DAG",
 type=parsedate),
+'output_path': Arg(
+('-o', '--output-path'),
+help="output path for yaml file",
+default="/tmp/airflow_yaml_output/"
+),
 'task_regex': Arg(
 ("-t", "--task_regex"),
 "The regex to filter specific task_ids to backfill (optional)"),
@@ -2076,6 +2119,16 @@ class CLIFactory(object):
 'dag_id', 'no_backfill', 'state'
 )
 }, {
+'func': generate_kubernetes_pod_yaml,
+'help': "List dag runs given a DAG id. If state option is given, 
it will only"
+"search for all the dagruns with the given state. "
+"If no_backfill option is given, it will filter out"
+"all backfill dagruns for given dag id.",
+'args': (
+'dag_id', 'output_path', 'subdir'
+)
+
+}, {
 'func': list_tasks,
 'help': "List the tasks within a DAG",
 'args': ('dag_id', 'tree', 'subdir'),



[GitHub] [airflow] boring-cyborg[bot] commented on issue #10704: Unable to Stop / Kill the currently running Airflow DAG programmatically.

2020-09-02 Thread GitBox


boring-cyborg[bot] commented on issue #10704:
URL: https://github.com/apache/airflow/issues/10704#issuecomment-686174687


   Thanks for opening your first issue here! Be sure to follow the issue 
template!
   



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] jsaradhy opened a new issue #10704: Unable to Stop / Kill the currently running Airflow DAG programmatically.

2020-09-02 Thread GitBox


jsaradhy opened a new issue #10704:
URL: https://github.com/apache/airflow/issues/10704


   I have a use case: we are implementing automation with Gitlab, AWS S3 & 
Codepipeline, Code & Airflow. In a nutshell, when a new version of a DAG 
becomes available in the DagBag, I need to programmatically stop the currently 
running Airflow DAG.
   
   Gurus, could you please shed some light on this: how can I programmatically 
stop/kill the currently running DAG in Airflow? 
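   (Not an authoritative answer, but one common pattern is to flip the running 
task instances' state in the metadata database; the supervising LocalTaskJob 
heartbeat then notices the external state change and terminates the task. A 
rough sketch against the Airflow 1.10 models, with `my_dag` being a 
hypothetical DAG id:)

    # hedged sketch: mark running task instances of a DAG as failed so their
    # supervising jobs terminate them on the next heartbeat
    from airflow import settings
    from airflow.models import TaskInstance
    from airflow.utils.state import State

    session = settings.Session()
    running = (
        session.query(TaskInstance)
        .filter(TaskInstance.dag_id == "my_dag",
                TaskInstance.state == State.RUNNING)
        .all()
    )
    for ti in running:
        ti.state = State.FAILED
        session.merge(ti)
    session.commit()
    session.close()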



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] atsalolikhin-spokeo commented on issue #523: SQL Alchemy Connection String - How to store password?

2020-09-02 Thread GitBox


atsalolikhin-spokeo commented on issue #523:
URL: https://github.com/apache/airflow/issues/523#issuecomment-686163715


   I don't see this in the 
[FAQ](https://airflow.apache.org/docs/stable/faq.html)  
   
   I am trying to figure out how to store a password in the Airflow variable 
table. I see from 
https://github.com/apache/airflow/blob/v1-10-stable/docs/ui.rst#variable-view 
that the Airflow UI won't show the password if the key name contains e.g. 
`api_key`, but I also see that the password is stored unencrypted in the 
Airflow `variable` table.  
   
   I'm also a bit confused because Admin -> Variables list has a column 
`is_encrypted`, but when I go to add a variable, I am prompted for name and 
value -- there is nothing about encryption. 
   
   So how do encrypted variables work?
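   (A hedged aside, not an authoritative answer: Airflow encrypts Variable 
values transparently whenever a Fernet key is configured; the `is_encrypted` 
flag is set by the model itself, not by anything in the add-variable form. A 
minimal sketch:)

    # generate a key once and put it in airflow.cfg under [core] fernet_key
    # (or export AIRFLOW__CORE__FERNET_KEY)
    from cryptography.fernet import Fernet
    print(Fernet.generate_key().decode())

    # with the key in place, values are encrypted at rest automatically
    from airflow.models import Variable
    Variable.set("api_key", "s3cr3t")   # stored encrypted, is_encrypted = True
    print(Variable.get("api_key"))      # decrypted transparently on read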
   
   



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] lopesdiego12 opened a new pull request #10703: Update INTHEWILD.md

2020-09-02 Thread GitBox


lopesdiego12 opened a new pull request #10703:
URL: https://github.com/apache/airflow/pull/10703


   
   
   ---
   **^ Add meaningful description above**
   
   Read the **[Pull Request 
Guidelines](https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst#pull-request-guidelines)**
 for more information.
   In case of fundamental code change, Airflow Improvement Proposal 
([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals))
 is needed.
   In case of a new dependency, check compliance with the [ASF 3rd Party 
License Policy](https://www.apache.org/legal/resolved.html#category-x).
   In case of backwards incompatible changes please leave a note in 
[UPDATING.md](https://github.com/apache/airflow/blob/master/UPDATING.md).
   



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] kaxil commented on a change in pull request #10677: Add generate_yaml command to easily test KubernetesExecutor before deploying pods

2020-09-02 Thread GitBox


kaxil commented on a change in pull request #10677:
URL: https://github.com/apache/airflow/pull/10677#discussion_r482607325



##
File path: tests/cli/commands/test_dag_command.py
##
@@ -139,6 +140,19 @@ def test_show_dag_print(self):
 self.assertIn("graph [label=example_bash_operator labelloc=t 
rankdir=LR]", out)
 self.assertIn("runme_2 -> run_after_loop", out)
 
+def test_generate_dag_yaml(self):
+
+directory = "/tmp/airflow_dry_run_test/"
+file_name = 
"example_bash_operator_run_after_loop_2020-11-03T00_00_00.yml"
+if os.path.exists(directory):
+shutil.rmtree(directory)
+dag_command.generate_pod_yaml(self.parser.parse_args([
+'dags', 'generate_kubernetes_pod_yaml',
+"--output-path", directory, 'example_bash_operator']))
+self.assertEqual(len(os.listdir(directory)), 6)
+self.assertTrue(os.path.isfile(directory + file_name))
+self.assertNotEqual(os.stat(directory + file_name).st_size, 0)

Review comment:
   you can use `assertGreater` here 
https://docs.python.org/3/library/unittest.html#unittest.TestCase.assertGreater 
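   (Concretely, the suggested change to the final assertion above would be, 
roughly:)

    # sketch of the suggestion: replaces the assertNotEqual on the file size
    self.assertGreater(os.stat(directory + file_name).st_size, 0)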





This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] kaxil commented on a change in pull request #10677: Add generate_yaml command to easily test KubernetesExecutor before deploying pods

2020-09-02 Thread GitBox


kaxil commented on a change in pull request #10677:
URL: https://github.com/apache/airflow/pull/10677#discussion_r482606828



##
File path: airflow/cli/commands/dag_command.py
##
@@ -378,6 +379,47 @@ def dag_list_dag_runs(args, dag=None):
 print(table)
 
 
+@cli_utils.action_logging
+def generate_pod_yaml(args):
+"""Generates yaml files for each task in the DAG. Used for testing output 
of KubernetesExecutor"""
+
+from airflow.executors.kubernetes_executor import 
AirflowKubernetesScheduler, KubeConfig
+from airflow.kubernetes import pod_generator
+from airflow.kubernetes.pod_generator import PodGenerator
+from airflow.kubernetes.worker_configuration import WorkerConfiguration
+execution_date = datetime.datetime(2020, 11, 3)
+dag = get_dag(subdir=args.subdir, dag_id=args.dag_id)
+yaml_output_path = args.output_path or "/tmp/airflow_generated_yaml/"
+kube_config = KubeConfig()
+for task in dag.tasks:
+ti = TaskInstance(task, execution_date)
+pod = PodGenerator.construct_pod(
+dag_id=args.dag_id,
+task_id=ti.task_id,
+pod_id=AirflowKubernetesScheduler._create_pod_id(  # pylint: 
disable=W0212
+args.dag_id, ti.task_id),
+try_number=ti.try_number,
+date=ti.execution_date,
+command=ti.command_as_list(),
+kube_executor_config=PodGenerator.from_obj(ti.executor_config),
+worker_uuid="worker-config",
+namespace=kube_config.executor_namespace,
+worker_config=WorkerConfiguration(kube_config=kube_config).as_pod()
+)
+import os
+
+import yaml
+from kubernetes.client.api_client import ApiClient

Review comment:
   Why are these here ??? let's properly format them (os and yaml can go at 
the top of the file)
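   (i.e. the imports used inside the loop would move to module level, 
something like the sketch below:)

    # sketch only: these would sit at the top of dag_command.py with the
    # existing imports, instead of inside generate_pod_yaml()
    import os

    import yaml
    from kubernetes.client.api_client import ApiClient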





This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] kaxil commented on a change in pull request #10677: Add generate_yaml command to easily test KubernetesExecutor before deploying pods

2020-09-02 Thread GitBox


kaxil commented on a change in pull request #10677:
URL: https://github.com/apache/airflow/pull/10677#discussion_r482606477



##
File path: airflow/cli/commands/dag_command.py
##
@@ -378,6 +379,47 @@ def dag_list_dag_runs(args, dag=None):
 print(table)
 
 
+@cli_utils.action_logging
+def generate_pod_yaml(args):
+"""Generates yaml files for each task in the DAG. Used for testing output 
of KubernetesExecutor"""
+
+from airflow.executors.kubernetes_executor import 
AirflowKubernetesScheduler, KubeConfig
+from airflow.kubernetes import pod_generator
+from airflow.kubernetes.pod_generator import PodGenerator
+from airflow.kubernetes.worker_configuration import WorkerConfiguration
+execution_date = datetime.datetime(2020, 11, 3)

Review comment:
   Why is the execution_date hardcoded ?
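   (A hedged alternative, if a fixed placeholder date is not intended, would be 
to default to the current time, e.g.:)

    # sketch: fall back to "now" rather than the hard-coded 2020-11-03
    from airflow.utils import timezone

    execution_date = timezone.utcnow()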





This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] kaxil commented on a change in pull request #10677: Add generate_yaml command to easily test KubernetesExecutor before deploying pods

2020-09-02 Thread GitBox


kaxil commented on a change in pull request #10677:
URL: https://github.com/apache/airflow/pull/10677#discussion_r482606210



##
File path: airflow/cli/cli_parser.py
##
@@ -142,6 +142,10 @@ def positive_int(value):
 ("-e", "--end-date"),
 help="Override end_date -MM-DD",
 type=parsedate)
+ARG_OUTPUT_PATH = Arg(
+("-o", "--output-path",),
+help="The output for generated yaml files",

Review comment:
   This hasn't been fixed @dimberman 





This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] kaxil commented on a change in pull request #10690: Change the name of Static Check without pylint

2020-09-02 Thread GitBox


kaxil commented on a change in pull request #10690:
URL: https://github.com/apache/airflow/pull/10690#discussion_r482602474



##
File path: .github/workflows/ci.yml
##
@@ -155,7 +155,7 @@ jobs:
 
   static-checks:
 timeout-minutes: 30
-name: "Static checks: no pylint"
+name: "Static checks: except pylint"

Review comment:
   Updated





This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] alexbegg edited a comment on issue #10279: Tooltips not show up in Safari

2020-09-02 Thread GitBox


alexbegg edited a comment on issue #10279:
URL: https://github.com/apache/airflow/issues/10279#issuecomment-686075093


   I was able to partially reproduce the problem, but only when the RBAC UI was 
not enabled. I noticed that in Safari the only tooltips that are not working 
are the ones for the circles for the recent tasks or DAG runs. The (i)s and the 
"Links" icons on the right all show a tooltip. With the RBAC UI on there is no 
problem with the tooltips in Safari. On Chrome all the tooltips work with 
either UI.



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] paolaperaza commented on issue #7940: DagRuns are marked as failed as soon as one task fails

2020-09-02 Thread GitBox


paolaperaza commented on issue #7940:
URL: https://github.com/apache/airflow/issues/7940#issuecomment-686090091


   @dimberman FYI looks like this is a duplicate of: 
https://github.com/apache/airflow/issues/7939



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] paolaperaza commented on issue #7926: Kill task instances that haven't been able to heartbeat for a while

2020-09-02 Thread GitBox


paolaperaza commented on issue #7926:
URL: https://github.com/apache/airflow/issues/7926#issuecomment-686089514


   @dimberman FYI Looks like this is a duplicate of: 
https://github.com/apache/airflow/issues/7925



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] jhtimmins commented on a change in pull request #10594: WIP: Add permissions for stable API

2020-09-02 Thread GitBox


jhtimmins commented on a change in pull request #10594:
URL: https://github.com/apache/airflow/pull/10594#discussion_r482584110



##
File path: airflow/api_connexion/security.py
##
@@ -37,3 +37,32 @@ def decorated(*args, **kwargs):
 return function(*args, **kwargs)
 
 return cast(T, decorated)
+
+
+def requires_access(permissions: Sequence[Tuple[str, str]]) -> Callable[[T], 
T]:

Review comment:
   Yeah, that isn't particularly difficult, but I think it complicates the 
interface somewhat. I think I'll just create resources that aren't based on a 
specific model, e.g. 'Health' or 'Configuration', just to keep the format 
consistent.
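   (For illustration, based on the signature shown in the diff above, a 
non-model resource would then be declared the same way as a model-backed one; 
the permission strings and endpoint name here are hypothetical:)

    # hypothetical usage sketch of the decorator shown above
    from airflow.api_connexion.security import requires_access

    @requires_access([("can_read", "Health")])
    def get_health():
        ...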
   





This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] alexbegg commented on issue #10279: Tooltips not show up in Safari

2020-09-02 Thread GitBox


alexbegg commented on issue #10279:
URL: https://github.com/apache/airflow/issues/10279#issuecomment-686075093


   I was able to partially reproduce the problem, but only when the RBAC UI was 
not enabled. I noticed that in Safari the only tooltips that are not working 
are the ones for the circles for the recent tasks or DAG runs. The (i)s and the 
"Links" icons on the right all show a tooltip. On Chrome all the tooltips work.



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] vitaly-krugl opened a new issue #10702: Allow custom remote logger plugins

2020-09-02 Thread GitBox


vitaly-krugl opened a new issue #10702:
URL: https://github.com/apache/airflow/issues/10702


   **Description**
   
   Allow custom remote loggers to be added (e.g., via Airflow's plugin 
mechanism)
   
   **Use case / motivation**
   
   Presently, Airflow has hard-wired support for archiving logs in S3, Google 
Cloud, and Elasticsearch (enabled via the config file).
   
   However, this does not facilitate integration of a custom remote logger for 
an unsupported backend. In particular, there is no way for a user of Airflow to 
install and enable a remote logger that uses a proprietary storage backend, or 
a storage backend that is not yet hard-wired into Airflow. 
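   (A hedged aside on the existing extension point: a custom task log handler 
can already be wired in through `[core] logging_config_class`, even though it 
is not a first-class plugin mechanism. A minimal sketch, where 
`MyRemoteTaskHandler` and `my_pkg` are a hypothetical user-supplied handler and 
package:)

    # my_log_config.py -- referenced from airflow.cfg:
    #   [core]
    #   logging_config_class = my_log_config.LOGGING_CONFIG
    from copy import deepcopy

    from airflow.config_templates.airflow_local_settings import DEFAULT_LOGGING_CONFIG

    LOGGING_CONFIG = deepcopy(DEFAULT_LOGGING_CONFIG)
    LOGGING_CONFIG["handlers"]["task"] = {
        # hypothetical handler that uploads task logs to a proprietary backend
        "class": "my_pkg.log.MyRemoteTaskHandler",
        "formatter": "airflow",
        "base_log_folder": "/tmp/airflow/logs",
    }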
   
   



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] alexbegg commented on issue #10279: Tooltips not show up in Safari

2020-09-02 Thread GitBox


alexbegg commented on issue #10279:
URL: https://github.com/apache/airflow/issues/10279#issuecomment-686059151


   Also, are you using the RBAC UI or not?



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] pcandoalmeida edited a comment on issue #9538: Add config variable for UI page title

2020-09-02 Thread GitBox


pcandoalmeida edited a comment on issue #9538:
URL: https://github.com/apache/airflow/issues/9538#issuecomment-686053566


   Hi @joshkadis, we could apply `page-title` to the DAG homepage and 
`site-title` to the various places that "Airflow" appears in a view, as 
suggested by @mik-laj (I could be getting this the other way round!). I found a 
way to update some templates, but could not seem to find the base template that 
renders the other views (details ^ I believe), so I am working on that at the 
moment. It feels like it would be good to get some feedback from project 
members: we have some good ideas, but we could use some clarity on how to 
proceed with this issue, as there are various paths we could take: implementing 
your original request, updating _all_ views with the single config option, or 
splitting the config options between `site-title` and `page-title`. If the 
latter, I could update the code so it updates only the `site-title`, and I can 
open a new issue to update all the templates/views for `page-title`. We could 
get multiple contributors working on this, so we could perhaps get it done 
faster!



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] pcandoalmeida commented on issue #9538: Add config variable for UI page title

2020-09-02 Thread GitBox


pcandoalmeida commented on issue #9538:
URL: https://github.com/apache/airflow/issues/9538#issuecomment-686053566


   Hi @joshkadis, we could apply `page-title` to the DAG homepage and 
`site-title` to the various places that "Airflow" appears in a view, as 
suggested by @mik-laj (I could be getting this the other way round!). I found a 
way to update some templates, but could not seem to find the base template that 
renders the other views (details ^ I believe), so I am working on that at the 
moment. It feels like it would be good to get some feedback from project 
members: we have some good ideas, but we could use some clarity on how to 
proceed with this issue, as there are various paths we could take: implementing 
your original request, updating _all_ views with the single config option, or 
splitting the config options between `site-title` and `page-title`. If the 
latter, I could update the code so it updates only the `site-title`, and I can 
open a new issue to update all the templates/views for `page-title`. We could 
get multiple contributors working on this, so we could perhaps get it done 
faster!



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] pcandoalmeida edited a comment on issue #9538: Add config variable for UI page title

2020-09-02 Thread GitBox


pcandoalmeida edited a comment on issue #9538:
URL: https://github.com/apache/airflow/issues/9538#issuecomment-685492510


   Hi @bhavaniravi I've got the open issue functionality working in the open 
PR. I am trying to work on getting the changes updated in all possible views. I 
think splitting `page-title` and `site-title` could be a good idea. That way, 
we could work on `site-title` changes incrementally and then merge in once all 
done, but offer the open issue feature as a first step (possibly). Would be 
keen to know thoughts.



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] alexbegg edited a comment on issue #10279: Tooltips not show up in Safari

2020-09-02 Thread GitBox


alexbegg edited a comment on issue #10279:
URL: https://github.com/apache/airflow/issues/10279#issuecomment-686047299


   It shows up fine for me in 1.10.11 in Safari (Version 13.1 
(15609.1.20.111.8)). I was going to check with 1.10.10 using Breeze but I am 
having an issue building 1.10.10, so maybe someone else can try and reproduce 
this in 1.10.10.



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] alexbegg commented on issue #10279: Tooltips not show up in Safari

2020-09-02 Thread GitBox


alexbegg commented on issue #10279:
URL: https://github.com/apache/airflow/issues/10279#issuecomment-686047299


   It shows up fine for me in 1.10.11 in Safari. I was going to check with 
1.10.10 using Breeze but I am having an issue building 1.10.10, so maybe 
someone else can try and reproduce this in 1.10.10.



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] jonstacks commented on issue #10292: Kubernetes Executor - invalid hostname in the database

2020-09-02 Thread GitBox


jonstacks commented on issue #10292:
URL: https://github.com/apache/airflow/issues/10292#issuecomment-686044139


   I did some messing around and was able to get the logs working by adding 
this: 
https://github.com/apache/airflow/commit/58eb534f1902a561c151d0bb796692a3df6e241f
   
   It looks up the correct pod name based on the partial so it can grab the 
logs. It seems to be working for us, but I feel like there might be a better 
way.



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow-site] aijamalnk commented on pull request #268: adding a logo for sift use case

2020-09-02 Thread GitBox


aijamalnk commented on pull request #268:
URL: https://github.com/apache/airflow-site/pull/268#issuecomment-685990906


   Sorry I missed this! @mik-laj @mschickensoup is there anything else I need 
to do? Thank you both!



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[airflow-site] branch aijamalnk-patch-1 updated (c02020a -> b97a5dc)

2020-09-02 Thread aizhamal
This is an automated email from the ASF dual-hosted git repository.

aizhamal pushed a change to branch aijamalnk-patch-1
in repository https://gitbox.apache.org/repos/asf/airflow-site.git.


from c02020a  Adding the md file for sift use case
 add b97a5dc  Update landing-pages/site/content/en/use-cases/sift.md

No new revisions were added by this update.

Summary of changes:
 landing-pages/site/content/en/use-cases/sift.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)



[GitHub] [airflow] ryw commented on pull request #10678: Add section for official source code

2020-09-02 Thread GitBox


ryw commented on pull request #10678:
URL: https://github.com/apache/airflow/pull/10678#issuecomment-685978388


   @ashb @mistercrunch @bolkedebruin would one or two of you have a quick look 
at this -- basically just stating where to find the official source code for 
the project. We'll likely restructure/slim down the README next, and maybe 
refine this down some in the next iteration.



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[airflow] branch master updated (649ce4b -> 4e09cb5)

2020-09-02 Thread potiuk
This is an automated email from the ASF dual-hosted git repository.

potiuk pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/airflow.git.


from 649ce4b  Implement Google Shell Conventions for breeze script (#10695)
 add 4e09cb5  Add packages to function names in bash (#10670) (#10696)

No new revisions were added by this update.

Summary of changes:
 BREEZE.rst |   2 +-
 breeze | 426 ++---
 .../ci_prepare_backport_packages.sh|   6 +-
 .../ci_prepare_backport_readme.sh  |   6 +-
 ...ci_test_backport_packages_import_all_classes.sh |   4 +-
 ...ci_test_backport_packages_install_separately.sh |   4 +-
 scripts/ci/constraints/ci_generate_constraints.sh  |   6 +-
 scripts/ci/docs/ci_docs.sh |   6 +-
 scripts/ci/images/ci_build_dockerhub.sh|  12 +-
 scripts/ci/images/ci_prepare_ci_image_on_ci.sh |  12 +-
 scripts/ci/images/ci_prepare_prod_image_on_ci.sh   |   8 +-
 scripts/ci/images/ci_push_ci_images.sh |   4 +-
 scripts/ci/images/ci_push_production_images.sh |   4 +-
 scripts/ci/images/ci_wait_for_all_ci_images.sh |   4 +-
 scripts/ci/images/ci_wait_for_all_prod_images.sh   |   6 +-
 .../ci/kubernetes/ci_deploy_app_to_kubernetes.sh   |  20 +-
 scripts/ci/kubernetes/ci_run_kubernetes_tests.sh   |   6 +-
 scripts/ci/libraries/_build_images.sh  | 190 -
 scripts/ci/libraries/_initialization.sh| 102 ++---
 scripts/ci/libraries/_kind.sh  |  48 +--
 scripts/ci/libraries/_local_mounts.sh  |   6 +-
 scripts/ci/libraries/_md5sum.sh|  44 +--
 scripts/ci/libraries/_parameters.sh|   8 +-
 scripts/ci/libraries/_permissions.sh   |  22 +-
 scripts/ci/libraries/_push_pull_remove_images.sh   |  52 +--
 scripts/ci/libraries/_pylint.sh|   2 +-
 scripts/ci/libraries/_runs.sh  |   8 +-
 scripts/ci/libraries/_sanity_checks.sh |  40 +-
 scripts/ci/libraries/_script_init.sh   |  16 +-
 scripts/ci/libraries/_spinner.sh   |   2 +-
 scripts/ci/libraries/_start_end.sh |  44 +--
 scripts/ci/libraries/_traps.sh |   2 +-
 scripts/ci/libraries/_verbosity.sh |  13 +-
 scripts/ci/pre_commit/pre_commit_ci_build.sh   |   6 +-
 .../ci/pre_commit/pre_commit_local_yml_mounts.sh   |   6 +-
 scripts/ci/pre_commit/pre_commit_mermaid.sh|   2 +-
 scripts/ci/static_checks/flake8.sh |   4 +-
 scripts/ci/static_checks/mypy.sh   |   4 +-
 scripts/ci/static_checks/pylint.sh |   6 +-
 scripts/ci/static_checks/refresh_pylint_todo.sh|   4 +-
 scripts/ci/static_checks/run_static_checks.sh  |   4 +-
 scripts/ci/testing/ci_run_airflow_testing.sh   |   4 +-
 tests/bats/bats_utils.bash |   4 +-
 tests/bats/test_breeze_params.bats |  14 +-
 tests/bats/test_local_mounts.bats  |   2 +-
 45 files changed, 594 insertions(+), 601 deletions(-)



[airflow] branch master updated (e5785d4 -> 649ce4b)

2020-09-02 Thread potiuk
This is an automated email from the ASF dual-hosted git repository.

potiuk pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/airflow.git.


from e5785d4  Chart: Flower deployment should use Flower image (#10701)
 add 649ce4b  Implement Google Shell Conventions for breeze script (#10695)

No new revisions were added by this update.

Summary of changes:
 BREEZE.rst |8 +-
 breeze | 2567 
 breeze-complete|   20 +-
 confirm|2 +-
 .../ci_prepare_and_test_backport_packages.sh   |2 -
 .../ci_prepare_backport_packages.sh|4 -
 .../ci_prepare_backport_readme.sh  |4 -
 ...ci_test_backport_packages_import_all_classes.sh |4 -
 ...ci_test_backport_packages_install_separately.sh |4 -
 scripts/ci/constraints/ci_branch_constraints.sh|4 -
 scripts/ci/constraints/ci_commit_constraints.sh|4 -
 scripts/ci/constraints/ci_generate_constraints.sh  |4 -
 scripts/ci/docs/ci_docs.sh |4 -
 scripts/ci/images/ci_prepare_ci_image_on_ci.sh |1 -
 scripts/ci/images/ci_prepare_prod_image_on_ci.sh   |1 -
 scripts/ci/images/ci_push_ci_images.sh |2 -
 scripts/ci/images/ci_push_production_images.sh |2 -
 scripts/ci/images/ci_wait_for_all_ci_images.sh |   26 +-
 scripts/ci/images/ci_wait_for_all_prod_images.sh   |   29 +-
 .../ci/kubernetes/ci_deploy_app_to_kubernetes.sh   |   12 +-
 scripts/ci/kubernetes/ci_run_kubernetes_tests.sh   |   11 +-
 scripts/ci/libraries/_all_libs.sh  |   10 +-
 scripts/ci/libraries/_build_images.sh  |  210 +-
 scripts/ci/libraries/_initialization.sh|  251 +-
 scripts/ci/libraries/_kind.sh  |   56 +-
 scripts/ci/libraries/_md5sum.sh|8 +-
 scripts/ci/libraries/_push_pull_remove_images.sh   |   37 +-
 scripts/ci/libraries/_sanity_checks.sh |   25 +-
 scripts/ci/libraries/_script_init.sh   |   23 +-
 scripts/ci/libraries/_start_end.sh |1 -
 scripts/ci/libraries/{_pylint.sh => _traps.sh} |   35 +-
 scripts/ci/libraries/_verbosity.sh |   48 +-
 scripts/ci/openapi/client_codegen_diff.sh  |1 -
 .../ci/pre_commit/pre_commit_breeze_cmd_line.sh|   16 +
 .../ci/pre_commit/pre_commit_check_integrations.sh |5 +-
 scripts/ci/pre_commit/pre_commit_ci_build.sh   |1 -
 .../ci/pre_commit/pre_commit_local_yml_mounts.sh   |4 +-
 scripts/ci/pre_commit/pre_commit_mermaid.sh|9 +-
 scripts/ci/static_checks/bat_tests.sh  |4 -
 scripts/ci/static_checks/flake8.sh |4 -
 scripts/ci/static_checks/mypy.sh   |4 -
 scripts/ci/static_checks/pylint.sh |4 -
 scripts/ci/static_checks/refresh_pylint_todo.sh|4 +-
 scripts/ci/static_checks/run_static_checks.sh  |4 -
 scripts/ci/testing/ci_run_airflow_testing.sh   |7 +-
 scripts/ci/tools/ci_clear_tmp.sh   |   16 -
 scripts/ci/tools/ci_count_changed_files.sh |3 -
 scripts/ci/tools/ci_fix_ownership.sh   |   16 -
 scripts/in_container/_in_container_script_init.sh  |2 +-
 scripts/in_container/_in_container_utils.sh|   25 +-
 scripts/in_container/check_environment.sh  |4 +-
 scripts/in_container/entrypoint_ci.sh  |6 +-
 scripts/in_container/prod/entrypoint_prod.sh   |4 +-
 scripts/in_container/run_cli_tool.sh   |   10 +-
 scripts/in_container/run_docs_build.sh |5 +-
 scripts/in_container/run_generate_constraints.sh   |5 +-
 .../in_container/run_prepare_backport_packages.sh  |8 +-
 .../in_container/run_prepare_backport_readme.sh|5 +-
 tests/bats/bats_utils.bash |   17 +-
 tests/bats/test_breeze_params.bats |   26 -
 tests/bats/test_empty_test.bats|2 -
 tests/bats/test_local_mounts.bats  |2 -
 62 files changed, 2161 insertions(+), 1481 deletions(-)
 copy scripts/ci/libraries/{_pylint.sh => _traps.sh} (56%)



 scripts/ci/images/ci_wait_for_all_ci_images.sh |   26 +-
 scripts/ci/images/ci_wait_for_all_prod_images.sh   |   29 +-
 .../ci/kubernetes/ci_deploy_app_to_kubernetes.sh   |   12 +-
 scripts/ci/kubernetes/ci_run_kubernetes_tests.sh   |   11 +-
 scripts/ci/libraries/_all_libs.sh  |   10 +-
 scripts/ci/libraries/_build_images.sh  |  210 +-
 scripts/ci/libraries/_initialization.sh|  251 +-
 scripts/ci/libraries/_kind.sh  |   56 +-
 scripts/ci/libraries/_md5sum.sh|8 +-
 scripts/ci/libraries/_push_pull_remove_images.sh   |   37 +-
 scripts/ci/libraries/_sanity_checks.sh |   25 +-
 scripts/ci/libraries/_script_init.sh   |   23 +-
 scripts/ci/libraries/_start_end.sh |1 -
 scripts/ci/libraries/{_pylint.sh => _traps.sh} |   35 +-
 scripts/ci/libraries/_verbosity.sh |   48 +-
 scripts/ci/openapi/client_codegen_diff.sh  |1 -
 .../ci/pre_commit/pre_commit_breeze_cmd_line.sh|   16 +
 .../ci/pre_commit/pre_commit_check_integrations.sh |5 +-
 scripts/ci/pre_commit/pre_commit_ci_build.sh   |1 -
 .../ci/pre_commit/pre_commit_local_yml_mounts.sh   |4 +-
 scripts/ci/pre_commit/pre_commit_mermaid.sh|9 +-
 scripts/ci/static_checks/bat_tests.sh  |4 -
 scripts/ci/static_checks/flake8.sh |4 -
 scripts/ci/static_checks/mypy.sh   |4 -
 scripts/ci/static_checks/pylint.sh |4 -
 scripts/ci/static_checks/refresh_pylint_todo.sh|4 +-
 scripts/ci/static_checks/run_static_checks.sh  |4 -
 scripts/ci/testing/ci_run_airflow_testing.sh   |7 +-
 scripts/ci/tools/ci_clear_tmp.sh   |   16 -
 scripts/ci/tools/ci_count_changed_files.sh |3 -
 scripts/ci/tools/ci_fix_ownership.sh   |   16 -
 scripts/in_container/_in_container_script_init.sh  |2 +-
 scripts/in_container/_in_container_utils.sh|   25 +-
 scripts/in_container/check_environment.sh  |4 +-
 scripts/in_container/entrypoint_ci.sh  |6 +-
 scripts/in_container/prod/entrypoint_prod.sh   |4 +-
 scripts/in_container/run_cli_tool.sh   |   10 +-
 scripts/in_container/run_docs_build.sh |5 +-
 scripts/in_container/run_generate_constraints.sh   |5 +-
 .../in_container/run_prepare_backport_packages.sh  |8 +-
 .../in_container/run_prepare_backport_readme.sh|5 +-
 tests/bats/bats_utils.bash |   17 +-
 tests/bats/test_breeze_params.bats |   26 -
 tests/bats/test_empty_test.bats|2 -
 tests/bats/test_local_mounts.bats  |2 -
 62 files changed, 2161 insertions(+), 1481 deletions(-)
 copy scripts/ci/libraries/{_pylint.sh => _traps.sh} (56%)



[GitHub] [airflow] potiuk merged pull request #10696: Add packages to function names in bash

2020-09-02 Thread GitBox


potiuk merged pull request #10696:
URL: https://github.com/apache/airflow/pull/10696


   



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] potiuk merged pull request #10695: Implement Google Shell Conventions for breeze script

2020-09-02 Thread GitBox


potiuk merged pull request #10695:
URL: https://github.com/apache/airflow/pull/10695


   



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[airflow] branch master updated: Chart: Flower deployment should use Flower image (#10701)

2020-09-02 Thread kaxilnaik
This is an automated email from the ASF dual-hosted git repository.

kaxilnaik pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/airflow.git


The following commit(s) were added to refs/heads/master by this push:
 new e5785d4  Chart: Flower deployment should use Flower image (#10701)
e5785d4 is described below

commit e5785d4720d94ed3183bd46769ffdb54830b0d22
Author: Kaxil Naik 
AuthorDate: Wed Sep 2 20:34:03 2020 +0100

Chart: Flower deployment should use Flower image (#10701)

Co-authored-by: Steven Miller 
---
 chart/templates/_helpers.yaml | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/chart/templates/_helpers.yaml b/chart/templates/_helpers.yaml
index 833d6e2..5e56b62 100644
--- a/chart/templates/_helpers.yaml
+++ b/chart/templates/_helpers.yaml
@@ -178,7 +178,7 @@
 {{- end }}
 
 {{ define "flower_image" -}}
-{{ printf "%s:%s" (.Values.images.airflow.repository | default .Values.defaultAirflowRepository) (.Values.images.airflow.tag | default .Values.defaultAirflowTag) }}
+{{ printf "%s:%s" (.Values.images.flower.repository | default .Values.defaultAirflowRepository) (.Values.images.flower.tag | default .Values.defaultAirflowTag) }}
 {{- end }}
 
 {{ define "statsd_image" -}}



[GitHub] [airflow] kaxil merged pull request #10701: Chart: Flower deployment should use Flower image

2020-09-02 Thread GitBox


kaxil merged pull request #10701:
URL: https://github.com/apache/airflow/pull/10701


   



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] potiuk commented on pull request #10671: Add more type annotations to AWS hooks

2020-09-02 Thread GitBox


potiuk commented on pull request #10671:
URL: https://github.com/apache/airflow/pull/10671#issuecomment-685950489


   There was an issue with the latest release of black (released a few days ago) 
that stopped respecting our configuration when running pre-commits. Please 
rebase to the latest master @coopergillan - we just merged a workaround where 
we pinned black to a working version, and opened an issue in black to get it 
fixed: https://github.com/psf/black/issues/1667.



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] XD-DENG commented on pull request #10681: Add Stacktrace when DagFileProcessorManager gets killed

2020-09-02 Thread GitBox


XD-DENG commented on pull request #10681:
URL: https://github.com/apache/airflow/pull/10681#issuecomment-685949264


   > > Given we still officially support Python 3.5, maybe not the best time 
yet to introduce f-string (which is only supported since 3.6)?
   > 
   > 
   > 
   > But we are already using a lot of f-strings. 3.5 is only supported in 
1.10.* line. In master we are 3.6+
   
   Cool, makes sense to me.



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] potiuk commented on pull request #10681: Add Stacktrace when DagFileProcessorManager gets killed

2020-09-02 Thread GitBox


potiuk commented on pull request #10681:
URL: https://github.com/apache/airflow/pull/10681#issuecomment-685948712


   > Given we still officially support Python 3.5, maybe not the best time yet 
to introduce f-string (which is only supported since 3.6)?
   
   But we are already using a lot of f-strings. 3.5 is only supported in 1.10.* 
line. In master we are 3.6+



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] houqp commented on a change in pull request #10655: [AIRFLOW-10645] Add AWS Secrets Manager Hook

2020-09-02 Thread GitBox


houqp commented on a change in pull request #10655:
URL: https://github.com/apache/airflow/pull/10655#discussion_r482313845



##
File path: airflow/providers/amazon/aws/hooks/secrets_manager.py
##
@@ -0,0 +1,54 @@
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+#
+
+import base64
+import json
+from airflow.providers.amazon.aws.hooks.base_aws import AwsBaseHook
+
+
+class SecretsManagerHook(AwsBaseHook):
+"""
+Interact with Amazon SecretsManager Service.
+
+Additional arguments (such as ``aws_conn_id``) may be specified and
+are passed down to the underlying AwsBaseHook.
+
+.. see also::
+:class:`~airflow.providers.amazon.aws.hooks.base_aws.AwsBaseHook`
+"""
+
+def __init__(self, *args, **kwargs):
+super().__init__(client_type='secretsmanager', *args, **kwargs)
+
+def get_secrets(self, secret_name: str) -> dict:

Review comment:
   we store a lot of non-JSON secret values in secret managers, for 
example, simple database passwords without quotes, private certificate keys, 
etc. You can store raw binary secrets as well, which by definition won't be in 
JSON format.
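
   For illustration, a minimal sketch of that distinction using boto3 directly 
(not the hook proposed in this PR; the helper name, region and secret layout 
are assumptions):

```python
import base64
import json

import boto3


def read_secret(secret_id: str, region_name: str = "us-east-1"):
    """Return a secret as dict (JSON), str (plain value) or bytes (binary)."""
    client = boto3.client("secretsmanager", region_name=region_name)
    response = client.get_secret_value(SecretId=secret_id)
    if "SecretString" in response:
        raw = response["SecretString"]
        try:
            return json.loads(raw)  # key/value style secret
        except json.JSONDecodeError:
            return raw  # plain password, certificate key, etc.
    # Binary secrets come back base64-encoded in the API response
    return base64.b64decode(response["SecretBinary"])
```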





This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] marcusianlevine commented on issue #8421: Hide sensitive data in UI

2020-09-02 Thread GitBox


marcusianlevine commented on issue #8421:
URL: https://github.com/apache/airflow/issues/8421#issuecomment-685934343


   I should have some time to work on a proposal for this in the next few 
weeks, but I'm still not sure of the best way to approach the design.
   
   The key question I'm stuck on is at what point the templates get filled 
in... obviously the executors need to have access to the unmasked values, but 
if they get injected at DAG compile time then they will be visible in the UI, 
so I'm guessing the unmasked injection needs to happen at task execution time



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] kaxil opened a new pull request #10701: Chart: Flower deployment should use Flower image

2020-09-02 Thread GitBox


kaxil opened a new pull request #10701:
URL: https://github.com/apache/airflow/pull/10701


   Flower deployment should use Flower image if provided
   
   
   
   ---
   **^ Add meaningful description above**
   
   Read the **[Pull Request 
Guidelines](https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst#pull-request-guidelines)**
 for more information.
   In case of fundamental code change, Airflow Improvement Proposal 
([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals))
 is needed.
   In case of a new dependency, check compliance with the [ASF 3rd Party 
License Policy](https://www.apache.org/legal/resolved.html#category-x).
   In case of backwards incompatible changes please leave a note in 
[UPDATING.md](https://github.com/apache/airflow/blob/master/UPDATING.md).
   



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[airflow] branch master updated: Fix missing dash in flag for statsd container (#10691)

2020-09-02 Thread kaxilnaik
This is an automated email from the ASF dual-hosted git repository.

kaxilnaik pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/airflow.git


The following commit(s) were added to refs/heads/master by this push:
 new 48ce4bd  Fix missing dash in flag for statsd container (#10691)
48ce4bd is described below

commit 48ce4bdac42af6a0bbd33148c14ac0bfc3568ce2
Author: Kamil Olszewski <34898234+olc...@users.noreply.github.com>
AuthorDate: Wed Sep 2 20:43:00 2020 +0200

Fix missing dash in flag for statsd container (#10691)

Co-authored-by: Kamil Olszewski 
---
 chart/templates/statsd/statsd-deployment.yaml | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/chart/templates/statsd/statsd-deployment.yaml b/chart/templates/statsd/statsd-deployment.yaml
index d7f5464..12257ea 100644
--- a/chart/templates/statsd/statsd-deployment.yaml
+++ b/chart/templates/statsd/statsd-deployment.yaml
@@ -61,7 +61,7 @@ spec:
   image: {{- include "statsd_image" . | indent 1 }}
   imagePullPolicy: {{ .Values.images.statsd.pullPolicy }}
   args:
-- "-statsd.mapping-config=/etc/statsd-exporter/mappings.yml"
+- "--statsd.mapping-config=/etc/statsd-exporter/mappings.yml"
   resources:
 {{ toYaml .Values.statsd.resources | indent 12 }}
   ports:



[GitHub] [airflow] kaxil merged pull request #10691: Fix missing dash in flag for statsd container

2020-09-02 Thread GitBox


kaxil merged pull request #10691:
URL: https://github.com/apache/airflow/pull/10691


   



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] agrubb86 opened a new issue #10700: CeleryExecutor + MySQL Airflow DB with 'wait_timeout' + BashOperator that runs longer than 'wait_timeout' = Airflow uses a stale connection

2020-09-02 Thread GitBox


agrubb86 opened a new issue #10700:
URL: https://github.com/apache/airflow/issues/10700


   
   
   
   
   **Apache Airflow version**: 1.10.11
   
   **Environment**: Linux aarch64
   
   - **Cloud provider or hardware configuration**: AWS
   - **OS** (e.g. from /etc/os-release): 
   NAME="Ubuntu"
   VERSION="18.04.4 LTS (Bionic Beaver)"
   ID=ubuntu
   ID_LIKE=debian
   PRETTY_NAME="Ubuntu 18.04.4 LTS"
   VERSION_ID="18.04"
   
   - **Kernel** (e.g. `uname -a`):
   Linux 4.15.0-1063-aws #67-Ubuntu SMP Mon Mar 2 07:25:24 UTC 2020 aarch64 
aarch64 aarch64 GNU/Linux
   - **Install tools**: pip3
   - **Others**:
   Config options for SQLAlchemy:
   
   ```
   sql_alchemy_conn = mysql+mysqldb://X:X@X/airflow
   sql_engine_encoding = utf-8
   sql_alchemy_pool_enabled = True
   sql_alchemy_pool_size = 10
   sql_alchemy_max_overflow = 10
   sql_alchemy_pool_recycle = 270
   sql_alchemy_pool_pre_ping = True
   ```
   
   Config options for MySQL server:
   `wait_timeout = 300`
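
   As a rough sketch of what those sql_alchemy_* options map to at the 
SQLAlchemy level (this is not Airflow's actual engine-creation code; the DSN 
is a placeholder):

```python
from sqlalchemy import create_engine

engine = create_engine(
    "mysql+mysqldb://user:password@host/airflow",  # placeholder DSN
    pool_size=10,        # sql_alchemy_pool_size
    max_overflow=10,     # sql_alchemy_max_overflow
    pool_recycle=270,    # recycle connections before MySQL's wait_timeout=300
    pool_pre_ping=True,  # ping pooled connections before reusing them
)
```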
   
   **What happened**:
   
   From the BashOperator logs:
   
   ```
   [2020-09-02 03:38:04,363] {bash_operator.py:161} INFO - Command exited with 
return code 0
   [2020-09-02 03:38:04,371] {taskinstance.py:1150} ERROR - 
(_mysql_exceptions.OperationalError) (2006, 'MySQL server has gone away')
   [SQL: SELECT airflow.task_instance.try_number AS 
airflow_task_instance_try_number, airflow.task_instance.task_id AS 
airflow_task_instance_task_id, airflow.task_instance.dag_id AS 
airflow_task_instance_dag_id, airflow.task_instance.execution_date AS 
airflow_task_instance_execution_date, airflow.task_instance.start_date AS 
airflow_task_instance_start_date, airflow.task_instance.end_date AS 
airflow_task_instance_end_date, airflow.task_instance.duration AS 
airflow_task_instance_duration, airflow.task_instance.state AS 
airflow_task_instance_state, airflow.task_instance.max_tries AS 
airflow_task_instance_max_tries, airflow.task_instance.hostname AS 
airflow_task_instance_hostname, airflow.task_instance.unixname AS 
airflow_task_instance_unixname, airflow.task_instance.job_id AS 
airflow_task_instance_job_id, airflow.task_instance.pool AS 
airflow_task_instance_pool, airflow.task_instance.pool_slots AS 
airflow_task_instance_pool_slots, airflow.task_instance.queue AS 
airflow_task_instance_que
 ue, airflow.task_instance.priority_weight AS 
airflow_task_instance_priority_weight, airflow.task_instance.operator AS 
airflow_task_instance_operator, airflow.task_instance.queued_dttm AS 
airflow_task_instance_queued_dttm, airflow.task_instance.pid AS 
airflow_task_instance_pid, airflow.task_instance.executor_config AS 
airflow_task_instance_executor_config 
   FROM airflow.task_instance 
   WHERE airflow.task_instance.dag_id = %s AND airflow.task_instance.task_id = 
%s AND airflow.task_instance.execution_date = %s 
LIMIT %s FOR UPDATE]
   [parameters: ('X', 'X', datetime.datetime(2020, 9, 1, 21, 10), 1)]
   (Background on this error at: http://sqlalche.me/e/13/e3q8)
   Traceback (most recent call last):
 File 
"/opt/airflow/pyfiles/lib/python3.6/site-packages/sqlalchemy/engine/base.py", 
line 1278, in _execute_context
   cursor, statement, parameters, context
 File 
"/opt/airflow/pyfiles/lib/python3.6/site-packages/sqlalchemy/engine/default.py",
 line 593, in do_execute
   cursor.execute(statement, parameters)
 File 
"/opt/airflow/pyfiles/lib/python3.6/site-packages/MySQLdb/cursors.py", line 
255, in execute
   self.errorhandler(self, exc, value)
 File 
"/opt/airflow/pyfiles/lib/python3.6/site-packages/MySQLdb/connections.py", line 
50, in defaulterrorhandler
   raise errorvalue
 File 
"/opt/airflow/pyfiles/lib/python3.6/site-packages/MySQLdb/cursors.py", line 
252, in execute
   res = self._query(query)
 File 
"/opt/airflow/pyfiles/lib/python3.6/site-packages/MySQLdb/cursors.py", line 
378, in _query
   db.query(q)
 File 
"/opt/airflow/pyfiles/lib/python3.6/site-packages/MySQLdb/connections.py", line 
280, in query
   _mysql.connection.query(self, query)
   _mysql_exceptions.OperationalError: (2006, 'MySQL server has gone away')
   
   The above exception was the direct cause of the following exception:
   
   Traceback (most recent call last):
 File 
"/opt/airflow/pyfiles/lib/python3.6/site-packages/airflow/models/taskinstance.py",
 line 1003, in _run_raw_task
   self.refresh_from_db(lock_for_update=True)
 File 
"/opt/airflow/pyfiles/lib/python3.6/site-packages/airflow/utils/db.py", line 
74, in wrapper
   return func(*args, **kwargs)
 File 
"/opt/airflow/pyfiles/lib/python3.6/site-packages/airflow/models/taskinstance.py",
 line 473, in refresh_from_db
   ti = qry.with_for_update().first()
 File 
"/opt/airflow/pyfiles/lib/python3.6/site-packages/sqlalchemy/orm/query.py", 
line 3397, in first
   ret = list(self[0:1])
 File 
"/opt/airflow/pyfiles/lib/python3.6/site-packages/sqlalchemy/orm/query.py", 
line 3171, in __getitem__

[GitHub] [airflow] boring-cyborg[bot] commented on issue #10700: CeleryExecutor + MySQL Airflow DB with 'wait_timeout' + BashOperator that runs longer than 'wait_timeout' = Airflow uses a stale connection

2020-09-02 Thread GitBox


boring-cyborg[bot] commented on issue #10700:
URL: https://github.com/apache/airflow/issues/10700#issuecomment-685925727


   Thanks for opening your first issue here! Be sure to follow the issue 
template!
   



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] ausiddiqui commented on issue #8421: Hide sensitive data in UI

2020-09-02 Thread GitBox


ausiddiqui commented on issue #8421:
URL: https://github.com/apache/airflow/issues/8421#issuecomment-685923962


   I have a similar use case: I pass the password from a connection object I 
created in the UI to an environment variable in the KubernetesPodOperator, and 
it appears in plain text in the Rendered Template part of the UI for the task. 
There should be a way to avoid this being printed and visible.
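
   A sketch of one workaround (the secret name, key and import paths are 
assumptions and may differ between Airflow versions): reference a Kubernetes 
Secret instead of passing the password through env_vars, so the value is 
resolved inside the pod and never appears in the rendered template.

```python
from airflow.kubernetes.secret import Secret
from airflow.providers.cncf.kubernetes.operators.kubernetes_pod import (
    KubernetesPodOperator,
)

db_password = Secret(
    deploy_type="env",            # expose as an environment variable
    deploy_target="DB_PASSWORD",  # env var name inside the container
    secret="my-db-credentials",   # name of the Kubernetes Secret (assumed)
    key="password",               # key within that Secret
)

task = KubernetesPodOperator(
    task_id="run_job",
    name="run-job",
    namespace="default",
    image="my-image:latest",
    secrets=[db_password],
)
```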



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] sdzharkov edited a comment on issue #7935: scheduler gets stuck without a trace

2020-09-02 Thread GitBox


sdzharkov edited a comment on issue #7935:
URL: https://github.com/apache/airflow/issues/7935#issuecomment-685920512


   We've experienced this issue twice now, with the CPU spiking to 100% and 
failing to schedule any tasks after. Our config is `Airflow 1.10.6 - Celery - 
Postgres` running on AWS ECS. I went back into our Cloudwatch logs and noticed 
the following collection of logs at the time the bug occurred: 
   
   ```
 | 2020-07-20T07:21:21.346Z | Process DagFileProcessor4357938-Process:
     | 2020-07-20T07:21:21.346Z | Traceback (most recent call last):
     | 2020-07-20T07:21:21.346Z | File 
"/usr/local/lib/python3.7/logging/__init__.py", line 1029, in emit
     | 2020-07-20T07:21:21.346Z | self.flush()
     | 2020-07-20T07:21:21.346Z | File 
"/usr/local/lib/python3.7/logging/__init__.py", line 1009, in flush
     | 2020-07-20T07:21:21.346Z | self.stream.flush()
     | 2020-07-20T07:21:21.346Z | OSError: [Errno 28] No space left on device
     | 2020-07-20T07:21:21.346Z | During handling of the above exception, 
another exception occurred:
   ```
   
   That would point to the scheduler running out of disk space, likely due to 
log buildup (I added log cleanup tasks retroactively). I'm not sure if this is 
related to the scheduler getting stuck, though.
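
   A sketch of the kind of retroactive log-cleanup task mentioned above (the 
log path, retention window and schedule are assumptions, not part of the 
original report):

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.bash_operator import BashOperator  # Airflow 1.10.x path

with DAG(
    dag_id="airflow_log_cleanup",
    start_date=datetime(2020, 7, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    cleanup = BashOperator(
        task_id="delete_old_logs",
        # remove task/scheduler log files older than 7 days
        bash_command="find /usr/local/airflow/logs -type f -mtime +7 -delete",
    )
```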



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] sdzharkov commented on issue #7935: scheduler gets stuck without a trace

2020-09-02 Thread GitBox


sdzharkov commented on issue #7935:
URL: https://github.com/apache/airflow/issues/7935#issuecomment-685920512


   We experienced this issue twice now, with the CPU spiking to 100% and 
failing to schedule any tasks after. Our config is `Airflow 1.10.6 - Celery - 
Postgres` running on AWS ECS. I went back into our Cloudwatch logs and noticed 
the following collection of logs at the time the bug occurred: 
   
   ```
 | 2020-07-20T07:21:21.346Z | Process DagFileProcessor4357938-Process:
     | 2020-07-20T07:21:21.346Z | Traceback (most recent call last):
     | 2020-07-20T07:21:21.346Z | File 
"/usr/local/lib/python3.7/logging/__init__.py", line 1029, in emit
     | 2020-07-20T07:21:21.346Z | self.flush()
     | 2020-07-20T07:21:21.346Z | File 
"/usr/local/lib/python3.7/logging/__init__.py", line 1009, in flush
     | 2020-07-20T07:21:21.346Z | self.stream.flush()
     | 2020-07-20T07:21:21.346Z | OSError: [Errno 28] No space left on device
     | 2020-07-20T07:21:21.346Z | During handling of the above exception, 
another exception occurred:
   ```
   
   That would point to the scheduler running out of disk space, likely due to 
log buildup (I added log cleanup tasks retroactively). I'm not sure if this is 
related to the scheduler getting stuck, though.



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[airflow] branch master updated (9a10f83 -> 02b853b)

2020-09-02 Thread potiuk
This is an automated email from the ASF dual-hosted git repository.

potiuk pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/airflow.git.


from 9a10f83  Revert recent breeze changes (#10651 & #10670) (#10694)
 add 02b853b  Fix failing black test (#10697)

No new revisions were added by this update.

Summary of changes:
 .pre-commit-config.yaml | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)



[GitHub] [airflow] potiuk merged pull request #10697: Fix failing black test

2020-09-02 Thread GitBox


potiuk merged pull request #10697:
URL: https://github.com/apache/airflow/pull/10697


   



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] potiuk commented on pull request #10699: Latest stable version of black does not use .pyproject.toml

2020-09-02 Thread GitBox


potiuk commented on pull request #10699:
URL: https://github.com/apache/airflow/pull/10699#issuecomment-685916891


   Implemented in #10697



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] potiuk closed pull request #10699: Latest stable version of black does not use .pyproject.toml

2020-09-02 Thread GitBox


potiuk closed pull request #10699:
URL: https://github.com/apache/airflow/pull/10699


   



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] potiuk opened a new pull request #10699: Latest stable version of black does not use .pyproject.toml

2020-09-02 Thread GitBox


potiuk opened a new pull request #10699:
URL: https://github.com/apache/airflow/pull/10699


   We switch to 19.10b0 for now, until they fix it.
   
   
   ---
   **^ Add meaningful description above**
   
   Read the **[Pull Request 
Guidelines](https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst#pull-request-guidelines)**
 for more information.
   In case of fundamental code change, Airflow Improvement Proposal 
([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals))
 is needed.
   In case of a new dependency, check compliance with the [ASF 3rd Party 
License Policy](https://www.apache.org/legal/resolved.html#category-x).
   In case of backwards incompatible changes please leave a note in 
[UPDATING.md](https://github.com/apache/airflow/blob/master/UPDATING.md).
   



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] kaxil closed pull request #10698: Fix failing black test

2020-09-02 Thread GitBox


kaxil closed pull request #10698:
URL: https://github.com/apache/airflow/pull/10698


   



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] alexbegg commented on a change in pull request #10637: Fix Start Date tooltip on DAGs page not showing actual start_date

2020-09-02 Thread GitBox


alexbegg commented on a change in pull request #10637:
URL: https://github.com/apache/airflow/pull/10637#discussion_r482270661



##
File path: airflow/www/templates/airflow/dags.html
##
@@ -369,14 +369,15 @@ DAGs
 function lastDagRunsHandler(error, json) {

Review comment:
   Correction, the function is named after the `dag.get_last_dagrun` method 
which was used in the pre-asynchronous code for the value of `last_run`, which 
then was used for the values `last_run.execution_date` and 
`last_run.start_date`, but was incorrectly changed to be last_run = 
execution_date when it went asynchronous in commit 
https://github.com/apache/airflow/pull/4005/commits/6607e486d219a5ecbeb46ef8ea2869367e634bab#diff-f38558559ea1b4c30ddf132b7f223cf9L122.
 This PR just puts the values for the asynchronous code back to what they were.





This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] alexbegg commented on a change in pull request #10637: Fix Start Date tooltip on DAGs page not showing actual start_date

2020-09-02 Thread GitBox


alexbegg commented on a change in pull request #10637:
URL: https://github.com/apache/airflow/pull/10637#discussion_r482270661



##
File path: airflow/www/templates/airflow/dags.html
##
@@ -369,14 +369,15 @@ DAGs
 function lastDagRunsHandler(error, json) {

Review comment:
   Correction, the function is named after the `dag.get_last_dagrun` method 
which was used in the pre-asynchronous code for the value of `last_run`, which 
then was used for the values `last_run.execution_date` and 
`last_run.start_date`, but was incorrectly changed to be last_run = 
execution_date when it went asynchronous in commit 
https://github.com/apache/airflow/pull/4005/commits/6607e486d219a5ecbeb46ef8ea2869367e634bab#diff-f38558559ea1b4c30ddf132b7f223cf9L122.
 This PR just puts the values back to what they were for the asynchronous code.





This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] kaxil opened a new pull request #10698: Fix failing black test

2020-09-02 Thread GitBox


kaxil opened a new pull request #10698:
URL: https://github.com/apache/airflow/pull/10698


   
   
   ---
   **^ Add meaningful description above**
   
   Read the **[Pull Request 
Guidelines](https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst#pull-request-guidelines)**
 for more information.
   In case of fundamental code change, Airflow Improvement Proposal 
([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals))
 is needed.
   In case of a new dependency, check compliance with the [ASF 3rd Party 
License Policy](https://www.apache.org/legal/resolved.html#category-x).
   In case of backwards incompatible changes please leave a note in 
[UPDATING.md](https://github.com/apache/airflow/blob/master/UPDATING.md).
   



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] alexbegg commented on a change in pull request #10637: Fix Start Date tooltip on DAGs page not showing actual start_date

2020-09-02 Thread GitBox


alexbegg commented on a change in pull request #10637:
URL: https://github.com/apache/airflow/pull/10637#discussion_r482270661



##
File path: airflow/www/templates/airflow/dags.html
##
@@ -369,14 +369,15 @@ DAGs
 function lastDagRunsHandler(error, json) {

Review comment:
   Correction, the function is named after the `dag.get_last_dagrun` method 
which was used in the pre-asynchronous code for the value of `last_run`, which 
then was used for the values `last_run.execution_date` and 
`last_run.start_date`, but was incorrectly changed to be last_run = 
execution_date when it went asynchronous. This PR just puts the values back to 
what they were for the asynchronous code.





This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] kaxil opened a new pull request #10697: Fix failing black test

2020-09-02 Thread GitBox


kaxil opened a new pull request #10697:
URL: https://github.com/apache/airflow/pull/10697


   
   
   ---
   **^ Add meaningful description above**
   
   Read the **[Pull Request 
Guidelines](https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst#pull-request-guidelines)**
 for more information.
   In case of fundamental code change, Airflow Improvement Proposal 
([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals))
 is needed.
   In case of a new dependency, check compliance with the [ASF 3rd Party 
License Policy](https://www.apache.org/legal/resolved.html#category-x).
   In case of backwards incompatible changes please leave a note in 
[UPDATING.md](https://github.com/apache/airflow/blob/master/UPDATING.md).
   



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] alexbegg commented on pull request #10637: Fix Start Date tooltip on DAGs page not showing actual start_date

2020-09-02 Thread GitBox


alexbegg commented on pull request #10637:
URL: https://github.com/apache/airflow/pull/10637#issuecomment-685905666


   > Great! BTW tooltip seems not to work on all browsers #10279
   
   @JeffryMAC thanks for the heads up, I'll try to look into that



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] alexbegg commented on a change in pull request #10637: Fix Start Date tooltip on DAGs page not showing actual start_date

2020-09-02 Thread GitBox


alexbegg commented on a change in pull request #10637:
URL: https://github.com/apache/airflow/pull/10637#discussion_r482263524



##
File path: airflow/www/templates/airflow/dags.html
##
@@ -369,14 +369,15 @@ DAGs
 function lastDagRunsHandler(error, json) {

Review comment:
   Well, I am not adding new functionality, just fixing a bug/regression. I 
am just putting back a value lost when the values were changed to load 
asynchronously. But it is a valid point, as that function was added at the 
point it was changed to be asynchronous.
   
   This function appears to be named the same as the `/last_dagruns` endpoint 
that returns JSON for these dates, which in turn seems to be named after the 
"Last Run" column this tooltip is in. So the column is showing the last run's 
execution date and the last run's start date. What name should it be instead?





This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] alexbegg commented on a change in pull request #10637: Fix Start Date tooltip on DAGs page not showing actual start_date

2020-09-02 Thread GitBox


alexbegg commented on a change in pull request #10637:
URL: https://github.com/apache/airflow/pull/10637#discussion_r482263524



##
File path: airflow/www/templates/airflow/dags.html
##
@@ -369,14 +369,15 @@ DAGs
 function lastDagRunsHandler(error, json) {

Review comment:
   Well, I am not adding new functionality, just fixing a bug/regression. I 
am just putting back a value lost when the values were changed to load 
asynchronously. But it is a valid point, as that function was added at the 
point it was changed to be asynchronous.
   
   This function appears to be named the same as the `/last_dagruns` endpoint 
that returns JSON for these dates, which in turn seems to be named after the 
"Last Run" column this tooltip is in. So the column is showing the last run's 
execution date and the last run's start date. What name should it be instead?





This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] alexbegg commented on a change in pull request #10637: Fix Start Date tooltip on DAGs page not showing actual start_date

2020-09-02 Thread GitBox


alexbegg commented on a change in pull request #10637:
URL: https://github.com/apache/airflow/pull/10637#discussion_r482263524



##
File path: airflow/www/templates/airflow/dags.html
##
@@ -369,14 +369,15 @@ DAGs
 function lastDagRunsHandler(error, json) {

Review comment:
   Well, I am not adding new functionality, just fixing a bug/regression. I 
am just putting back a value lost when the values were changed to load 
asynchronously. But it is a valid point, as that function was added at the 
point it was changed to be asynchronous.
   
   This function appears to be named the same as the `/last_dagruns` endpoint 
that returns JSON for these dates, which in turn seems to be named after the 
"Last Run" column this tooltip is in. So the column is showing the last run's 
execution date and the last run's start date. What name should it be instead?





This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] alexbegg commented on a change in pull request #10637: Fix Start Date tooltip on DAGs page not showing actual start_date

2020-09-02 Thread GitBox


alexbegg commented on a change in pull request #10637:
URL: https://github.com/apache/airflow/pull/10637#discussion_r482263524



##
File path: airflow/www/templates/airflow/dags.html
##
@@ -369,14 +369,15 @@ DAGs
 function lastDagRunsHandler(error, json) {

Review comment:
   Well, I am not adding new functionality, just fixing a bug/regression. I 
am just putting back a value lost when the values were changed to load 
asynchronously. But it is a valid point, as that function was added at the 
point it was changed to be asynchronous.
   
   This function appears to be named the same as the `/last_dagruns` endpoint 
that returns JSON for these dates, which in turn seems to be named after the 
"Last Run" column this tooltip is in (I think). So the column is showing the 
last run's execution date and the last run's start date. What name should it be 
instead?





This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] alexbegg commented on a change in pull request #10637: Fix Start Date tooltip on DAGs page not showing actual start_date

2020-09-02 Thread GitBox


alexbegg commented on a change in pull request #10637:
URL: https://github.com/apache/airflow/pull/10637#discussion_r482263524



##
File path: airflow/www/templates/airflow/dags.html
##
@@ -369,14 +369,15 @@ DAGs
 function lastDagRunsHandler(error, json) {

Review comment:
   Well, I am not adding new functionality, just fixing a bug/regression. I 
am just putting back a value lost when the value was changed to load 
asynchronously. But it is a valid point, as that function was added at the 
point it was changed to be asynchronous.
   
   This function appears to be named the same as the `/last_dagruns` endpoint 
that returns JSON for these dates, which in turn seems to be named after the 
"Last Run" column this tooltip is in (I think). So the column is showing the 
last run's execution date and the last run's start date. What name should it be 
instead?





This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] potiuk commented on pull request #10695: Implement Google Shell Conventions for breeze script

2020-09-02 Thread GitBox


potiuk commented on pull request #10695:
URL: https://github.com/apache/airflow/pull/10695#issuecomment-685901279


   It was an interesting one. I fixed trap handling in this commit so that 
traps were "added" rather than "replaced". But they were added in the wrong 
order. Rather than putting them on a "stack", the add_trap method was appending 
them to a "list". Traps should follow the stack pattern, so the last one added 
should be executed first, but it was the reverse.
   
   One of the other changes I implemented a few days earlier was making sure 
that the static checks output only contains the pylint/mypy/etc output and 
nothing else when pre-commit is run (the output before was cluttered with 
informational messages) - that involved redirecting the output from pylint to 
the OUTPUT_LOG file. And the bad sequence of traps meant that the file was 
deleted by the cleanup trap before it was printed out by the script_end trap.
   
   That's why there was no output - the output file was deleted before it was 
printed.
   
   This is the actual fix:
   
https://github.com/apache/airflow/pull/10695/commits/98756b64918c7dead2d2a0b6a49045db2740d18c
   
   
   I reproduced it with a bogus change:
   
   Before the fix:
   
   https://user-images.githubusercontent.com/595491/92019256-9a4e1400-ed56-11ea-9b77-58159ec781d3.png
   
   After the fix:
   
   https://user-images.githubusercontent.com/595491/92019272-9f12c800-ed56-11ea-9ff9-25684775e3d2.png
   
   



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] kaxil commented on pull request #10695: Implement Google Shell Conventions for breeze script

2020-09-02 Thread GitBox


kaxil commented on pull request #10695:
URL: https://github.com/apache/airflow/pull/10695#issuecomment-685897780


   Thanks for the PR even though you are on holiday, I will test this out in a 
few mins



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] potiuk opened a new pull request #10696: Add packages to function names in bash (#10670)

2020-09-02 Thread GitBox


potiuk opened a new pull request #10696:
URL: https://github.com/apache/airflow/pull/10696


   [Re-added after reverting it]
   
   Inspired by the Google Shell Guide, where they mention
   separating package names with ::, I realized that this was
   one of the missing pieces in our bash scripts.
   
   While we already had packages (in the libraries folder),
   it has been difficult to tell which function lives where.
   
   By introducing packages - named after the library file -
   we are *almost* at the level of a structured language, and
   it is easier to find the functions when you are looking for them.
   
   Way easier in fact.
   
   Part of #10576
   
   (cherry picked from commit cc551ba)
   ---
   **^ Add meaningful description above**
   
   Read the **[Pull Request 
Guidelines](https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst#pull-request-guidelines)**
 for more information.
   In case of fundamental code change, Airflow Improvement Proposal 
([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals))
 is needed.
   In case of a new dependency, check compliance with the [ASF 3rd Party 
License Policy](https://www.apache.org/legal/resolved.html#category-x).
   In case of backwards incompatible changes please leave a note in 
[UPDATING.md](https://github.com/apache/airflow/blob/master/UPDATING.md).
   



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] joshkadis commented on issue #9538: Add config variable for UI page title

2020-09-02 Thread GitBox


joshkadis commented on issue #9538:
URL: https://github.com/apache/airflow/issues/9538#issuecomment-685894959


   @pcandoalmeida Sorry... Where would `page-title` vs `site-title` be rendered? 
(thanks!)



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] potiuk opened a new pull request #10695: Fix missing output from docker

2020-09-02 Thread GitBox


potiuk opened a new pull request #10695:
URL: https://github.com/apache/airflow/pull/10695


   [This is a re-applied commit from the reverted one - the sequence of trap
   handlers was wrong, so after add_trap was fixed the folder
   was deleted before the output was printed out]
   
   First (and the biggest) of the series of commits to introduce
   Google Shell Conventions in our bash scripts.
   
   This concerns the biggest and most complex script - breeze -
   so it is rather huge, but it is difficult to split it into
   smaller pieces.
   
   The rules implemented (from the conventions):
   
* constants and exported variables are CAPITALIZED, where
  local/temporary variables are lowercase
   
* following the shell guide, once all the variables are set to their
  final values (either from exported variables, calculation or --switches
  ) I have a single function that makes all the variables read-only. That
  helped to clean up a lot of places where the same function was called
  several times, or where variables were defined in a few places. Now the
  behavior should be rather consistent and we should easily catch some
  duplications
   
* function headers (following the guide) explaining arguments,
  variables expected, variables modified in the functions used.
   
* setting the variables as read-only also helped to clean-up the "ifs"
  where we often had ":=}" in variables and != "" or == "". Those are
  replaced with `=}` and tests are replaced with `-n` and `-z` - also
  following the shell guide (readonly helped to detect and clean all
  such cases). This also should be much more robust in the future.
   
* reorganized initialization of those constants and variables - simplified
  a few places where initialization was overlapping. It should be much more
  straightforward and clean now
   
* a number of internal breeze function variables are "local" - this
  helps prevent accidental variable overwriting and keeps things localized
   
* trap_add function is separated out to help in cases where we had
  several traps handling the same signals.
   
   (cherry picked from commit 46c8d67)
   (cherry picked from commit c822fd7b4bf2a9c5a9bb3c6e783cbea9dac37246)
   
   ---
   **^ Add meaningful description above**
   
   Read the **[Pull Request 
Guidelines](https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst#pull-request-guidelines)**
 for more information.
   In case of fundamental code change, Airflow Improvement Proposal 
([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals))
 is needed.
   In case of a new dependency, check compliance with the [ASF 3rd Party 
License Policy](https://www.apache.org/legal/resolved.html#category-x).
   In case of backwards incompatible changes please leave a note in 
[UPDATING.md](https://github.com/apache/airflow/blob/master/UPDATING.md).
   



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] coopergillan commented on pull request #10671: Add more type annotations to AWS hooks

2020-09-02 Thread GitBox


coopergillan commented on pull request #10671:
URL: https://github.com/apache/airflow/pull/10671#issuecomment-685882664


   I need some help understanding the "no pylint" failure. It looks like it has 
to do with `black` formatting:
   
   ```bash
   
black.Failed
   - hook id: black
   - files were modified by this hook
   
   reformatted 
/home/runner/work/airflow/airflow/airflow/providers/amazon/aws/hooks/aws_dynamodb.py
   reformatted 
/home/runner/work/airflow/airflow/airflow/providers/amazon/aws/hooks/base_aws.py
   reformatted 
/home/runner/work/airflow/airflow/airflow/providers/amazon/aws/hooks/kinesis.py
   All done! ✨ 🍰 ✨
   3 files reformatted, 1440 files left unchanged.
   ```
   
   Does this require that the `black` formatting changes be included?
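   
   A hedged sketch of what re-running the hook locally could look like,
   assuming the repository's pre-commit hooks are installed (these are
   standard pre-commit invocations, not commands introduced by this PR):
   
   ```bash
   # Re-run only the black hook over the whole tree and pick up the reformatted files.
   pre-commit run black --all-files
   git diff   # review what black changed before committing
   ```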



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] dimberman commented on a change in pull request #10677: Add generate_yaml command to easily test KubernetesExecutor before deploying pods

2020-09-02 Thread GitBox


dimberman commented on a change in pull request #10677:
URL: https://github.com/apache/airflow/pull/10677#discussion_r482236372



##
File path: airflow/cli/cli_parser.py
##
@@ -778,6 +783,12 @@ class GroupCommand(NamedTuple):
 
func=lazy_load_command('airflow.cli.commands.dag_command.dag_list_dags'),
 args=(ARG_SUBDIR, ARG_OUTPUT),
 ),
+ActionCommand(
+name='generate_yaml',

Review comment:
   @mik-laj so create a section for one command? Seems like a bit of 
overkill...





This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] varundhussa commented on a change in pull request #10673: Dataproc job sensor

2020-09-02 Thread GitBox


varundhussa commented on a change in pull request #10673:
URL: https://github.com/apache/airflow/pull/10673#discussion_r482235954



##
File path: airflow/providers/google/cloud/sensors/dataproc.py
##
@@ -0,0 +1,83 @@
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+"""
+This module contains a Dataproc Job sensor.
+"""
+from typing import Optional
+
+from airflow.providers.google.cloud.hooks.dataproc import DataprocHook
+from airflow.sensors.base_sensor_operator import BaseSensorOperator
+from airflow.utils.decorators import apply_defaults
+from google.cloud.dataproc_v1beta2.types import JobStatus
+
+
+class DataprocJobSensor(BaseSensorOperator):
+"""
+Check for the state of a job submitted to a Dataproc cluster.
+
+:param gcp_conn_id: The connection ID to use connecting to Google Cloud 
Platform.
+:type gcp_conn_id: str
+:param dataproc_job_id: The Dataproc job ID to poll
+:type dataproc_job_id: str
+:param delegate_to: The account to impersonate, if any.

Review comment:
   Thanks @jaketf I'm removing delegate_to





This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] dimberman commented on a change in pull request #10677: Add generate_yaml command to easily test KubernetesExecutor before deploying pods

2020-09-02 Thread GitBox


dimberman commented on a change in pull request #10677:
URL: https://github.com/apache/airflow/pull/10677#discussion_r482230994



##
File path: tests/cli/commands/test_dag_command.py
##
@@ -139,6 +139,12 @@ def test_show_dag_print(self):
 self.assertIn("graph [label=example_bash_operator labelloc=t 
rankdir=LR]", out)
 self.assertIn("runme_2 -> run_after_loop", out)
 
+def test_generate_dag_yaml(self):
+with contextlib.redirect_stdout(io.StringIO()) as temp_stdout:
+dag_command.generate_pod_yaml(self.parser.parse_args([
+'dags', 'generate_yaml', '--dag-id', 'example_bash_operator']))
+

Review comment:
   @mik-laj PTAL added assertions





This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] dimberman commented on a change in pull request #10677: Add generate_yaml command to easily test KubernetesExecutor before deploying pods

2020-09-02 Thread GitBox


dimberman commented on a change in pull request #10677:
URL: https://github.com/apache/airflow/pull/10677#discussion_r482230802



##
File path: airflow/cli/commands/dag_command.py
##
@@ -378,6 +379,47 @@ def dag_list_dag_runs(args, dag=None):
 print(table)
 
 
+@provide_session
+@cli_utils.action_logging
+def generate_pod_yaml(args, session=None):
+"""Generates yaml files for each task in the DAG. Used for testing output 
of KubernetesExecutor"""
+
+from airflow.executors.kubernetes_executor import 
AirflowKubernetesScheduler, KubeConfig
+from airflow.kubernetes import pod_generator
+from airflow.kubernetes.pod_generator import PodGenerator
+from airflow.kubernetes.worker_configuration import WorkerConfiguration
+execution_date = datetime.datetime(2020,11,3)
+dag = get_dag(subdir=args.subdir, dag_id=args.dag_id)
+yaml_output_path = args.output_path or "/tmp/airflow_generated_yaml/"

Review comment:
   @mik-laj because usually if I'm debugging I'm debugging a specific task. 
We can add a "join_in_one_file" option if there is interest, but that can be a 
separate commit.
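   
   A hedged sketch of the intended usage, based on the test and the diff in
   this PR (the `--output-path` flag name is assumed from `args.output_path`;
   the default directory comes from the diff above):
   
   ```bash
   # Render one pod YAML file per task for a single DAG, then inspect the
   # output directory (defaults to /tmp/airflow_generated_yaml/ per the diff).
   airflow dags generate_yaml --dag-id example_bash_operator
   ls /tmp/airflow_generated_yaml/
   
   # Flag name assumed - point the output at another directory instead.
   airflow dags generate_yaml --dag-id example_bash_operator --output-path /tmp/k8s_yaml/
   ```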





This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] mik-laj commented on a change in pull request #10677: Add generate_yaml command to easily test KubernetesExecutor before deploying pods

2020-09-02 Thread GitBox


mik-laj commented on a change in pull request #10677:
URL: https://github.com/apache/airflow/pull/10677#discussion_r482205398



##
File path: airflow/cli/commands/dag_command.py
##
@@ -378,6 +379,47 @@ def dag_list_dag_runs(args, dag=None):
 print(table)
 
 
+@provide_session
+@cli_utils.action_logging
+def generate_pod_yaml(args, session=None):
+"""Generates yaml files for each task in the DAG. Used for testing output 
of KubernetesExecutor"""
+
+from airflow.executors.kubernetes_executor import 
AirflowKubernetesScheduler, KubeConfig
+from airflow.kubernetes import pod_generator
+from airflow.kubernetes.pod_generator import PodGenerator
+from airflow.kubernetes.worker_configuration import WorkerConfiguration
+execution_date = datetime.datetime(2020,11,3)
+dag = get_dag(subdir=args.subdir, dag_id=args.dag_id)
+yaml_output_path = args.output_path or "/tmp/airflow_generated_yaml/"

Review comment:
   Why is each task in a separate file? Wouldn't it be more convenient if
there was one file per DAG? That would also let you skip writing the data
to a file and display it on the screen instead. Ideally, the YAML displayed
on the screen should also have colors. Take a look: `airflow config list`.





This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[airflow] branch master updated (338b412 -> 9a10f83)

2020-09-02 Thread kaxilnaik
This is an automated email from the ASF dual-hosted git repository.

kaxilnaik pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/airflow.git.


from 338b412  Add on_kill support for the KubernetesPodOperator (#10666)
 add 9a10f83  Revert recent breeze changes (#10651 & #10670) (#10694)

No new revisions were added by this update.

Summary of changes:
 BREEZE.rst |   10 +-
 breeze | 2821 
 breeze-complete|   20 +-
 confirm|2 +-
 .../ci_prepare_and_test_backport_packages.sh   |2 +
 .../ci_prepare_backport_packages.sh|   10 +-
 .../ci_prepare_backport_readme.sh  |   10 +-
 ...ci_test_backport_packages_import_all_classes.sh |8 +-
 ...ci_test_backport_packages_install_separately.sh |8 +-
 scripts/ci/constraints/ci_branch_constraints.sh|4 +
 scripts/ci/constraints/ci_commit_constraints.sh|4 +
 scripts/ci/constraints/ci_generate_constraints.sh  |   10 +-
 scripts/ci/docs/ci_docs.sh |   10 +-
 scripts/ci/images/ci_build_dockerhub.sh|   12 +-
 scripts/ci/images/ci_prepare_ci_image_on_ci.sh |   13 +-
 scripts/ci/images/ci_prepare_prod_image_on_ci.sh   |9 +-
 scripts/ci/images/ci_push_ci_images.sh |6 +-
 scripts/ci/images/ci_push_production_images.sh |6 +-
 scripts/ci/images/ci_wait_for_all_ci_images.sh |   28 +-
 scripts/ci/images/ci_wait_for_all_prod_images.sh   |   33 +-
 .../ci/kubernetes/ci_deploy_app_to_kubernetes.sh   |   28 +-
 scripts/ci/kubernetes/ci_run_kubernetes_tests.sh   |   11 +-
 scripts/ci/libraries/_all_libs.sh  |   10 +-
 scripts/ci/libraries/_build_images.sh  |  384 ++-
 scripts/ci/libraries/_initialization.sh|  341 +--
 scripts/ci/libraries/_kind.sh  |  102 +-
 scripts/ci/libraries/_local_mounts.sh  |6 +-
 scripts/ci/libraries/_md5sum.sh|   52 +-
 scripts/ci/libraries/_parameters.sh|8 +-
 scripts/ci/libraries/_permissions.sh   |   22 +-
 scripts/ci/libraries/_push_pull_remove_images.sh   |   89 +-
 scripts/ci/libraries/_pylint.sh|2 +-
 scripts/ci/libraries/_runs.sh  |8 +-
 scripts/ci/libraries/_sanity_checks.sh |   63 +-
 scripts/ci/libraries/_script_init.sh   |   25 +-
 scripts/ci/libraries/_spinner.sh   |2 +-
 scripts/ci/libraries/_start_end.sh |   45 +-
 scripts/ci/libraries/_traps.sh |   38 -
 scripts/ci/libraries/_verbosity.sh |   61 +-
 scripts/ci/openapi/client_codegen_diff.sh  |1 +
 .../ci/pre_commit/pre_commit_breeze_cmd_line.sh|   16 -
 .../ci/pre_commit/pre_commit_check_integrations.sh |5 +-
 scripts/ci/pre_commit/pre_commit_ci_build.sh   |7 +-
 .../ci/pre_commit/pre_commit_local_yml_mounts.sh   |8 +-
 scripts/ci/pre_commit/pre_commit_mermaid.sh|9 +-
 scripts/ci/static_checks/bat_tests.sh  |4 +
 scripts/ci/static_checks/flake8.sh |8 +-
 scripts/ci/static_checks/mypy.sh   |8 +-
 scripts/ci/static_checks/pylint.sh |   10 +-
 scripts/ci/static_checks/refresh_pylint_todo.sh|8 +-
 scripts/ci/static_checks/run_static_checks.sh  |8 +-
 scripts/ci/testing/ci_run_airflow_testing.sh   |   11 +-
 scripts/ci/tools/ci_clear_tmp.sh   |   16 +
 scripts/ci/tools/ci_count_changed_files.sh |3 +
 scripts/ci/tools/ci_fix_ownership.sh   |   16 +
 scripts/in_container/_in_container_script_init.sh  |2 +-
 scripts/in_container/_in_container_utils.sh|   25 +-
 scripts/in_container/check_environment.sh  |4 +-
 scripts/in_container/entrypoint_ci.sh  |6 +-
 scripts/in_container/prod/entrypoint_prod.sh   |4 +-
 scripts/in_container/run_cli_tool.sh   |   10 +-
 scripts/in_container/run_docs_build.sh |5 +-
 scripts/in_container/run_generate_constraints.sh   |5 +-
 .../in_container/run_prepare_backport_packages.sh  |8 +-
 .../in_container/run_prepare_backport_readme.sh|5 +-
 tests/bats/bats_utils.bash |   17 +-
 tests/bats/test_breeze_params.bats |   40 +-
 tests/bats/test_empty_test.bats|2 +
 tests/bats/test_local_mounts.bats  |4 +-
 69 files changed, 1945 insertions(+), 2653 deletions(-)
 delete mode 100644 scripts/ci/libraries/_traps.sh



[GitHub] [airflow] mik-laj commented on a change in pull request #10677: Add generate_yaml command to easily test KubernetesExecutor before deploying pods

2020-09-02 Thread GitBox


mik-laj commented on a change in pull request #10677:
URL: https://github.com/apache/airflow/pull/10677#discussion_r482203423



##
File path: airflow/cli/cli_parser.py
##
@@ -778,6 +783,12 @@ class GroupCommand(NamedTuple):
 
func=lazy_load_command('airflow.cli.commands.dag_command.dag_list_dags'),
 args=(ARG_SUBDIR, ARG_OUTPUT),
 ),
+ActionCommand(
+name='generate_yaml',
+help="Generate YAML files for all tasks in DAG",
+
func=lazy_load_command('airflow.cli.commands.dag_command.generate_pod_yaml'),
+args=(ARG_SUBDIR, ARG_DAG_ID_OPT,ARG_OUTPUT_PATH),

Review comment:
   ```suggestion
   args=(ARG_SUBDIR, ARG_DAG_ID, ARG_OUTPUT_PATH),
   ```





This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] kaxil merged pull request #10694: Revert recent breeze changes (#10651 & #10670)

2020-09-02 Thread GitBox


kaxil merged pull request #10694:
URL: https://github.com/apache/airflow/pull/10694


   



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] turbaszek edited a comment on pull request #10694: Revert recent breeze changes (#10651 & #10670)

2020-09-02 Thread GitBox


turbaszek edited a comment on pull request #10694:
URL: https://github.com/apache/airflow/pull/10694#issuecomment-685844024


   The first problem can be fixed simply by (thanks to @michalslowikowski00 who 
figured this out):
   ```
   ./breeze build-image --force-pull-images
   ```
   but in case of the pylint/mypy error I don't know how to fix it



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] turbaszek commented on pull request #10694: Revert recent breeze changes (#10651 & #10670)

2020-09-02 Thread GitBox


turbaszek commented on pull request #10694:
URL: https://github.com/apache/airflow/pull/10694#issuecomment-685844024


   The first problem can be fixed simply by:
   ```
   ./breeze build-image --force-pull-images
   ```
   but in case of the pylint/mypy error I don't know how to fix it



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] mik-laj commented on a change in pull request #10677: Add generate_yaml command to easily test KubernetesExecutor before deploying pods

2020-09-02 Thread GitBox


mik-laj commented on a change in pull request #10677:
URL: https://github.com/apache/airflow/pull/10677#discussion_r482193145



##
File path: airflow/cli/cli_parser.py
##
@@ -778,6 +783,12 @@ class GroupCommand(NamedTuple):
 
func=lazy_load_command('airflow.cli.commands.dag_command.dag_list_dags'),
 args=(ARG_SUBDIR, ARG_OUTPUT),
 ),
+ActionCommand(
+name='generate_yaml',

Review comment:
   What do you think to add this to the new ``kubernetes`` group? We 
currently have sections for celery executor.
   
   ```
   Celery components
   
 airflow celery flower - Start a Celery Flower
 airflow celery stop - Stop the Celery worker gracefully
 airflow celery worker - Start a Celery worker node
   ```





This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] dimberman commented on a change in pull request #10677: Add generate_yaml command to easily test KubernetesExecutor before deploying pods

2020-09-02 Thread GitBox


dimberman commented on a change in pull request #10677:
URL: https://github.com/apache/airflow/pull/10677#discussion_r482192545



##
File path: airflow/cli/cli_parser.py
##
@@ -778,6 +782,12 @@ class GroupCommand(NamedTuple):
 
func=lazy_load_command('airflow.cli.commands.dag_command.dag_list_dags'),
 args=(ARG_SUBDIR, ARG_OUTPUT),
 ),
+ActionCommand(
+name='generate_yaml',
+help="Generate YAML files for all tasks in DAG",
+
func=lazy_load_command('airflow.cli.commands.dag_command.generate_pod_yaml'),
+args=(ARG_SUBDIR, ARG_DAG_ID_OPT,ARG_OUTPUT_PATH),

Review comment:
   How do I make this not optional?





This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] dimberman commented on a change in pull request #10677: Add generate_yaml command to easily test KubernetesExecutor before deploying pods

2020-09-02 Thread GitBox


dimberman commented on a change in pull request #10677:
URL: https://github.com/apache/airflow/pull/10677#discussion_r482191652



##
File path: tests/cli/commands/test_dag_command.py
##
@@ -139,6 +139,12 @@ def test_show_dag_print(self):
 self.assertIn("graph [label=example_bash_operator labelloc=t 
rankdir=LR]", out)
 self.assertIn("runme_2 -> run_after_loop", out)
 
+def test_generate_dag_yaml(self):
+with contextlib.redirect_stdout(io.StringIO()) as temp_stdout:
+dag_command.generate_pod_yaml(self.parser.parse_args([
+'dags', 'generate_yaml', '--dag-id', 'example_bash_operator']))
+

Review comment:
   @mik-laj yes I plan to add assertions that a yaml file is created and 
that it's a valid pod.





This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] dimberman commented on a change in pull request #10677: Add generate_yaml command to easily test KubernetesExecutor before deploying pods

2020-09-02 Thread GitBox


dimberman commented on a change in pull request #10677:
URL: https://github.com/apache/airflow/pull/10677#discussion_r482191227



##
File path: airflow/cli/commands/dag_command.py
##
@@ -378,6 +379,47 @@ def dag_list_dag_runs(args, dag=None):
 print(table)
 
 
+@provide_session
+@cli_utils.action_logging
+def generate_pod_yaml(args, session=None):
+"""Generates yaml files for each task in the DAG. Used for testing output 
of KubernetesExecutor"""
+
+from airflow.executors.kubernetes_executor import 
AirflowKubernetesScheduler, KubeConfig
+from airflow.kubernetes import pod_generator
+from airflow.kubernetes.pod_generator import PodGenerator
+from airflow.kubernetes.worker_configuration import WorkerConfiguration
+execution_date = datetime.datetime(2020,11,3)
+dag = get_dag(subdir=args.subdir, dag_id=args.dag_id)
+yaml_output_path = args.output_path or "/tmp/airflow_generated_yaml/"

Review comment:
   @mik-laj would people want that as stdout? The idea is that it would 
create a file per-task.





This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] mik-laj commented on a change in pull request #10677: Add generate_yaml command to easily test KubernetesExecutor before deploying pods

2020-09-02 Thread GitBox


mik-laj commented on a change in pull request #10677:
URL: https://github.com/apache/airflow/pull/10677#discussion_r482191020



##
File path: tests/cli/commands/test_dag_command.py
##
@@ -139,6 +139,12 @@ def test_show_dag_print(self):
 self.assertIn("graph [label=example_bash_operator labelloc=t 
rankdir=LR]", out)
 self.assertIn("runme_2 -> run_after_loop", out)
 
+def test_generate_dag_yaml(self):
+with contextlib.redirect_stdout(io.StringIO()) as temp_stdout:
+dag_command.generate_pod_yaml(self.parser.parse_args([
+'dags', 'generate_yaml', '--dag-id', 'example_bash_operator']))
+

Review comment:
   Can you add at least one assertion?





This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] mik-laj commented on a change in pull request #10677: Add generate_yaml command to easily test KubernetesExecutor before deploying pods

2020-09-02 Thread GitBox


mik-laj commented on a change in pull request #10677:
URL: https://github.com/apache/airflow/pull/10677#discussion_r482190501



##
File path: airflow/cli/commands/dag_command.py
##
@@ -378,6 +379,47 @@ def dag_list_dag_runs(args, dag=None):
 print(table)
 
 
+@provide_session
+@cli_utils.action_logging
+def generate_pod_yaml(args, session=None):
+"""Generates yaml files for each task in the DAG. Used for testing output 
of KubernetesExecutor"""
+
+from airflow.executors.kubernetes_executor import 
AirflowKubernetesScheduler, KubeConfig
+from airflow.kubernetes import pod_generator
+from airflow.kubernetes.pod_generator import PodGenerator
+from airflow.kubernetes.worker_configuration import WorkerConfiguration
+execution_date = datetime.datetime(2020,11,3)
+dag = get_dag(subdir=args.subdir, dag_id=args.dag_id)
+yaml_output_path = args.output_path or "/tmp/airflow_generated_yaml/"

Review comment:
   Should be `sys.stdout` by default?





This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



