[GitHub] [airflow] ecerulm commented on pull request #13073: Script to generate integrations.json

2020-12-14 Thread GitBox


ecerulm commented on pull request #13073:
URL: https://github.com/apache/airflow/pull/13073#issuecomment-745115624


   @mik-laj just mentioning you since I can't "request a review" from you.
   
   You can invoke the script as:
   ```
   python scripts/tools/generate-integrations-json.py > ../airflow-site/landing-pages/site/data/integrations.json
   ```
   
   and the output looks like:
   
   ```
    [
        {
            "name": "Discord",
            "url": "/docs/apache-airflow-providers-discord/stable/index.html"
        },
        {
            "name": "Amazon Athena",
            "url": "/docs/apache-airflow-providers-amazon/stable/index.html",
            "logo": "/integration-logos/aws/amazon-athena_light...@4x.png"
        },
        {
            "name": "Amazon CloudFormation",
            "url": "/docs/apache-airflow-providers-amazon/stable/index.html"
        },
        {
            "name": "Amazon CloudWatch Logs",
            "url": "/docs/apache-airflow-providers-amazon/stable/index.html",
            "logo": "/integration-logos/aws/amazon-cloudwatch_light...@4x.png"
        },
        ...
    ]
   ```
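For orientation, here is a minimal sketch of what such a generator might do. It is not the actual script; the `package-name`, `integrations`, `integration-name`, and `logo` keys are assumptions inferred from the output shown above.

```python
import json

def build_integrations(provider_yamls):
    """Build integrations.json entries from parsed provider.yaml dicts.

    The key names used here are assumptions inferred from the sample
    output above, not taken from the real script.
    """
    entries = []
    for provider in provider_yamls:
        docs_url = "/docs/{}/stable/index.html".format(provider["package-name"])
        for integration in provider.get("integrations", []):
            entry = {"name": integration["integration-name"], "url": docs_url}
            if integration.get("logo"):
                entry["logo"] = integration["logo"]
            entries.append(entry)
    # Sort by display name for a stable, readable output file.
    return sorted(entries, key=lambda e: e["name"])

sample = [{
    "package-name": "apache-airflow-providers-discord",
    "integrations": [{"integration-name": "Discord"}],
}]
print(json.dumps(build_integrations(sample), indent=4))
```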



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] triptec commented on pull request #12944: Airflow 7044 - add ability to specify RSA public key in extra field of SSH connection.

2020-12-14 Thread GitBox


triptec commented on pull request #12944:
URL: https://github.com/apache/airflow/pull/12944#issuecomment-745113465


   @turbaszek any thoughts about the PR?







[GitHub] [airflow] XD-DENG opened a new pull request #13082: Fix failing static check in master

2020-12-14 Thread GitBox


XD-DENG opened a new pull request #13082:
URL: https://github.com/apache/airflow/pull/13082


   This new addition didn't follow the correct order.
   
   
   
   ---
   **^ Add meaningful description above**
   
   Read the **[Pull Request 
Guidelines](https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst#pull-request-guidelines)**
 for more information.
   In case of fundamental code change, Airflow Improvement Proposal 
([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals))
 is needed.
   In case of a new dependency, check compliance with the [ASF 3rd Party 
License Policy](https://www.apache.org/legal/resolved.html#category-x).
   In case of backwards incompatible changes please leave a note in 
[UPDATING.md](https://github.com/apache/airflow/blob/master/UPDATING.md).
   







[GitHub] [airflow] pindge commented on issue #13081: OAuth2 login process is not stateless

2020-12-14 Thread GitBox


pindge commented on issue #13081:
URL: https://github.com/apache/airflow/issues/13081#issuecomment-745107292


   Note: this is running v1.10.14 with Flask-AppBuilder 3.1.1 in the web instance
   ```
    extraPipPackages:
      - "authlib"
      - "Flask-AppBuilder==3.1.1"
   ```
   
   When the secret key is not specified and is randomly generated, each web 
instance has a different secret key, which breaks the login process when the 
serving pod is switched.
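A sketch of the fix implied here: read the key from the environment so every replica shares it. The `FLASK_SECRET_KEY` variable name follows the `webserver_config.py` quoted in issue #13081; the random fallback is illustrative, not the actual Airflow behavior.

```python
import os
import secrets

def get_secret_key():
    """Return a Flask SECRET_KEY that is stable across webserver replicas.

    FLASK_SECRET_KEY is an assumed environment variable name; if it is
    unset, each pod would fall back to its own random key, which is fine
    for a single replica but breaks cross-pod session validation.
    """
    key = os.environ.get("FLASK_SECRET_KEY")
    if key:
        return key
    # Random fallback: every process gets a different key.
    return secrets.token_hex(32)

SECRET_KEY = get_secret_key()
```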







[GitHub] [airflow] github-actions[bot] commented on pull request #13080: KubernetesExecutor overrides should only append to lists

2020-12-14 Thread GitBox


github-actions[bot] commented on pull request #13080:
URL: https://github.com/apache/airflow/pull/13080#issuecomment-745095464


   [The Workflow run](https://github.com/apache/airflow/actions/runs/422462347) 
is cancelling this PR. Building images for the PR has failed. Follow the 
workflow link to check the reason.







[GitHub] [airflow] whatnick opened a new issue #13081: OAuth2 login process is not stateless

2020-12-14 Thread GitBox


whatnick opened a new issue #13081:
URL: https://github.com/apache/airflow/issues/13081


   **Apache Airflow version**: 1.10.14
   
   
   **Kubernetes version (if you are using kubernetes)** (use `kubectl 
version`): Server Version: version.Info{Major:"1", Minor:"16+", 
GitVersion:"v1.16.15-eks-ad4801", 
GitCommit:"ad4801fd44fe0f125c8d13f1b1d4827e8884476d", GitTreeState:"clean", 
BuildDate:"2020-10-20T23:27:12Z", GoVersion:"go1.13.15", Compiler:"gc", 
Platform:"linux/amd64"}
   
   **Environment**:
   
   - **Cloud provider or hardware configuration**: AWS / EKS
   - **OS** (e.g. from /etc/os-release): N/A
   - **Kernel** (e.g. `uname -a`): N/A
   - **Install tools**: N/A
   - **Others**: N/A
   
   **What happened**:
   
   Cognito login does not work if the second request is not handled by the 
first pod that received the access_token headers.
   
   **What you expected to happen**:
   
   Logging in via Cognito OAuth2 mode / Code should work via any pod.
   
   **How to reproduce it**:
   
   Override `webserver_config.py` with the following code:
   
   ```
   """Default configuration for the Airflow webserver"""
 import logging
 import os
 import json
 from airflow.configuration import conf
 from airflow.www_rbac.security import AirflowSecurityManager
 from flask_appbuilder.security.manager import AUTH_OAUTH
   
 log = logging.getLogger(__name__)
 basedir = os.path.abspath(os.path.dirname(__file__))
   
 # The SQLAlchemy connection string.
 SQLALCHEMY_DATABASE_URI = conf.get('core', 'SQL_ALCHEMY_CONN')
   
 # Flask-WTF flag for CSRF
 WTF_CSRF_ENABLED = True
   
 CSRF_ENABLED = True
 # 
 # AUTHENTICATION CONFIG
 # 
 # For details on how to set up each of the following 
authentication, see
 # http://flask-appbuilder.readthedocs.io/en/latest/security.html# 
authentication-methods
 # for details.
   
 # The authentication type
 AUTH_TYPE = AUTH_OAUTH
   
 SECRET_KEY = os.environ.get("FLASK_SECRET_KEY")
   
 OAUTH_PROVIDERS = [{
 'name': 'aws_cognito',
 'whitelist': ['@ga.gov.au'], 
 'token_key': 'access_token',
 'icon': 'fa-amazon',
 'remote_app': {
 'api_base_url': os.environ.get("OAUTH2_BASE_URL") + "/",
 'client_kwargs': {
 'scope': 'openid email aws.cognito.signin.user.admin'
 },
 'authorize_url': os.environ.get("OAUTH2_BASE_URL") + 
"/authorize",
 'access_token_url': os.environ.get("OAUTH2_BASE_URL") + 
"/token",
 'request_token_url': None,
 'client_id': os.environ.get("COGNITO_CLIENT_ID"),
 'client_secret': os.environ.get("COGNITO_CLIENT_SECRET"),
 }
 }]
   
   
 class CognitoAirflowSecurityManager(AirflowSecurityManager):
 def oauth_user_info(self, provider, resp):
 # log.info("Requesting user info from AWS Cognito: 
{0}".format(resp))
 assert provider == "aws_cognito"
 # log.info("Requesting user info from AWS Cognito: 
{0}".format(resp))
 me = 
self.appbuilder.sm.oauth_remotes[provider].get("userInfo")
 return {
 "username": me.json().get("username"),
 "email": me.json().get("email"),
 "first_name": me.json().get("given_name", ""),
 "last_name": me.json().get("family_name", ""),
 "id": me.json().get("sub", ""),
 }
   
   
 SECURITY_MANAGER_CLASS = CognitoAirflowSecurityManager
   ```
   
   - Set up an airflow-app linked to a Cognito user pool and run multiple 
replicas of the airflow-web pod.
   - Login will start failing, succeeding in perhaps 1 in 9 attempts.
   
   **Anything else we need to know**:
   
   There are two possible workarounds using infrastructure changes instead of 
airflow-web code changes:
   
   - Use a single pod for airflow-web to avoid session issues
   - Make ALB sticky via ingress to give users the same pod consistently
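For the sticky-ALB option, a sketch of the ingress change (the annotation names assume the AWS ALB ingress controller; verify them against your controller's documentation and version):

```yaml
apiVersion: networking.k8s.io/v1beta1
kind: Ingress
metadata:
  name: airflow-web          # hypothetical ingress name
  annotations:
    kubernetes.io/ingress.class: alb
    # Enable load-balancer cookie stickiness so a user keeps hitting
    # the same airflow-web pod during the OAuth2 login flow.
    alb.ingress.kubernetes.io/target-group-attributes: >-
      stickiness.enabled=true,stickiness.lb_cookie.duration_seconds=3600
```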
   
   
   







[GitHub] [airflow] boring-cyborg[bot] commented on issue #13081: OAuth2 login process is not stateless

2020-12-14 Thread GitBox


boring-cyborg[bot] commented on issue #13081:
URL: https://github.com/apache/airflow/issues/13081#issuecomment-745088379


   Thanks for opening your first issue here! Be sure to follow the issue 
template!
   







[airflow] branch master updated: Fix failing pylint check on Master (#13078)

2020-12-14 Thread xddeng
This is an automated email from the ASF dual-hosted git repository.

xddeng pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/airflow.git


The following commit(s) were added to refs/heads/master by this push:
 new 0655d51  Fix failing pylint check on Master (#13078)
0655d51 is described below

commit 0655d51a39d763ca6c00abf081fa4916661bc3b8
Author: Kaxil Naik 
AuthorDate: Tue Dec 15 06:21:21 2020 +

Fix failing pylint check on Master (#13078)
---
 docs/exts/docs_build/fetch_inventories.py | 4 +++-
 docs/exts/docs_build/lint_checks.py   | 2 +-
 2 files changed, 4 insertions(+), 2 deletions(-)

diff --git a/docs/exts/docs_build/fetch_inventories.py 
b/docs/exts/docs_build/fetch_inventories.py
index 147d9c2..e9da264 100644
--- a/docs/exts/docs_build/fetch_inventories.py
+++ b/docs/exts/docs_build/fetch_inventories.py
@@ -27,7 +27,9 @@ from requests.adapters import DEFAULT_POOLSIZE
 from docs.exts.docs_build.docs_builder import (  # pylint: 
disable=no-name-in-module
 get_available_providers_packages,
 )
-from docs.exts.docs_build.third_party_inventories import THIRD_PARTY_INDEXES
+from docs.exts.docs_build.third_party_inventories import (  # pylint: 
disable=no-name-in-module
+THIRD_PARTY_INDEXES,
+)
 
 CURRENT_DIR = os.path.dirname(__file__)
 ROOT_DIR = os.path.abspath(os.path.join(CURRENT_DIR, os.pardir, os.pardir, 
os.pardir))
diff --git a/docs/exts/docs_build/lint_checks.py 
b/docs/exts/docs_build/lint_checks.py
index d9177c8..b611b1e 100644
--- a/docs/exts/docs_build/lint_checks.py
+++ b/docs/exts/docs_build/lint_checks.py
@@ -22,7 +22,7 @@ from glob import glob
 from itertools import chain
 from typing import Iterable, List, Optional, Set
 
-from docs.exts.docs_build.docs_builder import ALL_PROVIDER_YAMLS
+from docs.exts.docs_build.docs_builder import ALL_PROVIDER_YAMLS  # pylint: 
disable=no-name-in-module
 from docs.exts.docs_build.errors import DocBuildError  # pylint: 
disable=no-name-in-module
 
 ROOT_PROJECT_DIR = os.path.abspath(



[GitHub] [airflow] XD-DENG merged pull request #13078: Fix failing pylint check on Master

2020-12-14 Thread GitBox


XD-DENG merged pull request #13078:
URL: https://github.com/apache/airflow/pull/13078


   







[airflow] branch master updated (317858a -> ada1c63)

2020-12-14 Thread msumit
This is an automated email from the ASF dual-hosted git repository.

msumit pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/airflow.git.


from 317858a  Remove unneeded parentheses from Python file (#12968)
 add ada1c63  Update INTHEWILD.md - Add Altafino (#13079)

No new revisions were added by this update.

Summary of changes:
 INTHEWILD.md | 1 +
 1 file changed, 1 insertion(+)



[GitHub] [airflow] github-actions[bot] commented on pull request #13078: Fix failing pylint check on Master

2020-12-14 Thread GitBox


github-actions[bot] commented on pull request #13078:
URL: https://github.com/apache/airflow/pull/13078#issuecomment-745080193


   The PR is likely ready to be merged. No tests are needed as no important 
environment files, nor python files were modified by it. However, committers 
might decide that full test matrix is needed and add the 'full tests needed' 
label. Then you should rebase it to the latest master or amend the last commit 
of the PR, and push it with --force-with-lease.







[GitHub] [airflow] msumit merged pull request #13079: Update INTHEWILD.md

2020-12-14 Thread GitBox


msumit merged pull request #13079:
URL: https://github.com/apache/airflow/pull/13079


   







[GitHub] [airflow] boring-cyborg[bot] commented on pull request #13079: Update INTHEWILD.md

2020-12-14 Thread GitBox


boring-cyborg[bot] commented on pull request #13079:
URL: https://github.com/apache/airflow/pull/13079#issuecomment-745078772


   Awesome work, congrats on your first merged pull request!
   







[airflow] branch fix-env-from-1-10 updated (3a5663e -> b7df326)

2020-12-14 Thread dimberman
This is an automated email from the ASF dual-hosted git repository.

dimberman pushed a change to branch fix-env-from-1-10
in repository https://gitbox.apache.org/repos/asf/airflow.git.


 discard 3a5663e  KubernetesExecutor overrides should only add to lists
omit e0d3aad  KubernetesExecutor should accept images from executor_config
 add b7df326  KubernetesExecutor overrides should only append to lists

This update added new revisions after undoing existing revisions.
That is to say, some revisions that were in the old version of the
branch are not in the new version.  This situation occurs
when a user --force pushes a change and generates a repository
containing something like this:

 * -- * -- B -- O -- O -- O   (3a5663e)
\
 N -- N -- N   refs/heads/fix-env-from-1-10 (b7df326)

You should already have received notification emails for all of the O
revisions, and so the following emails describe only the N revisions
from the common base, B.

Any revisions marked "omit" are not gone; other references still
refer to them.  Any revisions marked "discard" are gone forever.

No new revisions were added by this update.

Summary of changes:
 airflow/kubernetes/pod_generator.py | 8 +---
 1 file changed, 1 insertion(+), 7 deletions(-)



[airflow] branch accept-executor-config-image updated (3a5663e -> e0d3aad)

2020-12-14 Thread dimberman
This is an automated email from the ASF dual-hosted git repository.

dimberman pushed a change to branch accept-executor-config-image
in repository https://gitbox.apache.org/repos/asf/airflow.git.


 discard 3a5663e  KubernetesExecutor overrides should only add to lists

This update removed existing revisions from the reference, leaving the
reference pointing at a previous point in the repository history.

 * -- * -- N   refs/heads/accept-executor-config-image (e0d3aad)
\
 O -- O -- O   (3a5663e)

Any revisions marked "omit" are not gone; other references still
refer to them.  Any revisions marked "discard" are gone forever.

No new revisions were added by this update.

Summary of changes:
 airflow/kubernetes/pod_generator.py|  225 ++---
 tests/kubernetes/test_pod_generator.py | 1453 
 2 files changed, 830 insertions(+), 848 deletions(-)



[airflow] 01/01: KubernetesExecutor overrides should only add to lists

2020-12-14 Thread dimberman
This is an automated email from the ASF dual-hosted git repository.

dimberman pushed a commit to branch fix-env-from-1-10
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 3a5663e43931903cee02abbea5193a6a8cb9e9e1
Author: Daniel Imberman 
AuthorDate: Mon Dec 14 21:22:17 2020 -0800

KubernetesExecutor overrides should only add to lists

This PR makes 1.10 interaction more similar to that of Airflow 2.0.
Essentially users are able to override values that are in maps, but when
it comes to lists in k8s objects, it is too complicated to consistently
override.
---
 airflow/kubernetes/pod_generator.py|  225 +++--
 tests/kubernetes/test_pod_generator.py | 1453 
 2 files changed, 848 insertions(+), 830 deletions(-)

diff --git a/airflow/kubernetes/pod_generator.py 
b/airflow/kubernetes/pod_generator.py
index 4df3198..8f0d141 100644
--- a/airflow/kubernetes/pod_generator.py
+++ b/airflow/kubernetes/pod_generator.py
@@ -34,7 +34,6 @@ from dateutil import parser
 from kubernetes.client.api_client import ApiClient
 from airflow.contrib.kubernetes.pod import _extract_volume_mounts
 
-from airflow.exceptions import AirflowConfigException
 from airflow.version import version as airflow_version
 
 MAX_LABEL_LEN = 63
@@ -50,21 +49,15 @@ class PodDefaults(object):
 def __init__(self):
 pass
 
-XCOM_MOUNT_PATH = '/airflow/xcom'
-SIDECAR_CONTAINER_NAME = 'airflow-xcom-sidecar'
+XCOM_MOUNT_PATH = "/airflow/xcom"
+SIDECAR_CONTAINER_NAME = "airflow-xcom-sidecar"
 XCOM_CMD = 'trap "exit 0" INT; while true; do sleep 30; done;'
-VOLUME_MOUNT = k8s.V1VolumeMount(
-name='xcom',
-mount_path=XCOM_MOUNT_PATH
-)
-VOLUME = k8s.V1Volume(
-name='xcom',
-empty_dir=k8s.V1EmptyDirVolumeSource()
-)
+VOLUME_MOUNT = k8s.V1VolumeMount(name="xcom", mount_path=XCOM_MOUNT_PATH)
+VOLUME = k8s.V1Volume(name="xcom", empty_dir=k8s.V1EmptyDirVolumeSource())
 SIDECAR_CONTAINER = k8s.V1Container(
 name=SIDECAR_CONTAINER_NAME,
-command=['sh', '-c', XCOM_CMD],
-image='alpine',
+command=["sh", "-c", XCOM_CMD],
+image="alpine",
 volume_mounts=[VOLUME_MOUNT],
 resources=k8s.V1ResourceRequirements(
 requests={
@@ -88,7 +81,7 @@ def make_safe_label_value(string):
 
 if len(safe_label) > MAX_LABEL_LEN or string != safe_label:
 safe_hash = hashlib.md5(string.encode()).hexdigest()[:9]
-safe_label = safe_label[:MAX_LABEL_LEN - len(safe_hash) - 1] + "-" + 
safe_hash
+safe_label = safe_label[: MAX_LABEL_LEN - len(safe_hash) - 1] + "-" + 
safe_hash
 
 return safe_label
 
@@ -102,7 +95,7 @@ def datetime_to_label_safe_datestring(datetime_obj):
 :param datetime_obj: datetime.datetime object
 :return: ISO-like string representing the datetime
 """
-return datetime_obj.isoformat().replace(":", "_").replace('+', '_plus_')
+return datetime_obj.isoformat().replace(":", "_").replace("+", "_plus_")
 
 
 def label_safe_datestring_to_datetime(string):
@@ -114,7 +107,7 @@ def label_safe_datestring_to_datetime(string):
 :param string: str
 :return: datetime.datetime object
 """
-return parser.parse(string.replace('_plus_', '+').replace("_", ":"))
+return parser.parse(string.replace("_plus_", "+").replace("_", ":"))
 
 
 class PodGenerator(object):
@@ -230,8 +223,8 @@ class PodGenerator(object):
 self.ud_pod = pod
 
 self.pod = k8s.V1Pod()
-self.pod.api_version = 'v1'
-self.pod.kind = 'Pod'
+self.pod.api_version = "v1"
+self.pod.kind = "Pod"
 
 # Pod Metadata
 self.metadata = k8s.V1ObjectMeta()
@@ -241,35 +234,34 @@ class PodGenerator(object):
 self.metadata.annotations = annotations
 
 # Pod Container
-self.container = k8s.V1Container(name='base')
+self.container = k8s.V1Container(name="base")
 self.container.image = image
 self.container.env = []
 
 if envs:
 if isinstance(envs, dict):
 for key, val in envs.items():
-self.container.env.append(k8s.V1EnvVar(
-name=key,
-value=val
-))
+self.container.env.append(k8s.V1EnvVar(name=key, 
value=val))
 elif isinstance(envs, list):
 self.container.env.extend(envs)
 
 configmaps = configmaps or []
 self.container.env_from = []
 for configmap in configmaps:
-self.container.env_from.append(k8s.V1EnvFromSource(
-config_map_ref=k8s.V1ConfigMapEnvSource(
-name=configmap
+self.container.env_from.append(
+k8s.V1EnvFromSource(
+config_map_ref=k8s.V1ConfigMapEnvSource(name=configmap)
 )
-))
+)
 
 sel

[airflow] branch fix-env-from-1-10 created (now 3a5663e)

2020-12-14 Thread dimberman
This is an automated email from the ASF dual-hosted git repository.

dimberman pushed a change to branch fix-env-from-1-10
in repository https://gitbox.apache.org/repos/asf/airflow.git.


  at 3a5663e  KubernetesExecutor overrides should only add to lists

This branch includes the following new commits:

 new 3a5663e  KubernetesExecutor overrides should only add to lists

The 1 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.




[GitHub] [airflow] dimberman opened a new pull request #13080: KubernetesExecutor overrides should only append to lists

2020-12-14 Thread GitBox


dimberman opened a new pull request #13080:
URL: https://github.com/apache/airflow/pull/13080


   This PR makes 1.10 interaction more similar to that of Airflow 2.0.
   Essentially users are able to override values that are in maps, but when
   it comes to lists in k8s objects, it is too complicated to consistently
   override.
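The rule described above can be illustrated with a simplified merge function. This is not the actual `pod_generator.py` code, only a sketch of the semantics: map values are overridden key by key, while list values are appended to rather than replaced.

```python
def merge_override(base, override):
    """Merge an executor_config override into a base structure.

    Simplified illustration of the semantics described in the PR, not
    the real implementation: dicts merge recursively (override wins per
    key), lists are appended, scalars are replaced.
    """
    if isinstance(base, dict) and isinstance(override, dict):
        merged = dict(base)
        for key, value in override.items():
            merged[key] = merge_override(base[key], value) if key in base else value
        return merged
    if isinstance(base, list) and isinstance(override, list):
        return base + override  # append, never replace
    return override if override is not None else base

base = {"labels": {"team": "a"}, "env": [{"name": "X"}]}
override = {"labels": {"team": "b"}, "env": [{"name": "Y"}]}
result = merge_override(base, override)
```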
   
   
   
   







[airflow] branch accept-executor-config-image updated (e0d3aad -> 3a5663e)

2020-12-14 Thread dimberman
This is an automated email from the ASF dual-hosted git repository.

dimberman pushed a change to branch accept-executor-config-image
in repository https://gitbox.apache.org/repos/asf/airflow.git.


from e0d3aad  KubernetesExecutor should accept images from executor_config
 add 3a5663e  KubernetesExecutor overrides should only add to lists

No new revisions were added by this update.

Summary of changes:
 airflow/kubernetes/pod_generator.py|  225 +++--
 tests/kubernetes/test_pod_generator.py | 1453 
 2 files changed, 848 insertions(+), 830 deletions(-)



[GitHub] [airflow] notatallshaw commented on issue #12838: ❗ PIP 20.3 breaks Airflow installation❗

2020-12-14 Thread GitBox


notatallshaw commented on issue #12838:
URL: https://github.com/apache/airflow/issues/12838#issuecomment-745049883


   @pradyunsg I'm not on the Airflow team and I don't have as deep an 
understanding as @potiuk, but I tried installing Airflow 1.10.14 with all 
dependencies using the new resolver in pip 20.3.2.
   
   I'm not sure how much is due to Airflow fixes and how much to pip 20.3.2's 
improvements, but I am able to successfully run `pip install apache-airflow[all]` 
with no errors 😄. Thanks to both teams!
   







[GitHub] [airflow] debodirno commented on issue #13069: Rewrite handwritten argument parser in prepare_provider_packages.py

2020-12-14 Thread GitBox


debodirno commented on issue #13069:
URL: https://github.com/apache/airflow/issues/13069#issuecomment-745042158


   Can I take this up @potiuk  and @mik-laj?







[GitHub] [airflow] boring-cyborg[bot] commented on pull request #13079: Update INTHEWILD.md

2020-12-14 Thread GitBox


boring-cyborg[bot] commented on pull request #13079:
URL: https://github.com/apache/airflow/pull/13079#issuecomment-745041127


   Congratulations on your first Pull Request and welcome to the Apache Airflow 
community! If you have any issues or are unsure about anything, please check 
our Contribution Guide 
(https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst)
   Here are some useful points:
   - Pay attention to the quality of your code (flake8, pylint and type 
annotations). Our [pre-commits]( 
https://github.com/apache/airflow/blob/master/STATIC_CODE_CHECKS.rst#prerequisites-for-pre-commit-hooks)
 will help you with that.
   - In case of a new feature add useful documentation (in docstrings or in 
`docs/` directory). Adding a new operator? Check this short 
[guide](https://github.com/apache/airflow/blob/master/docs/howto/custom-operator.rst)
 Consider adding an example DAG that shows how users should use it.
   - Consider using [Breeze 
environment](https://github.com/apache/airflow/blob/master/BREEZE.rst) for 
testing locally, it’s a heavy docker but it ships with a working Airflow and a 
lot of integrations.
   - Be patient and persistent. It might take some time to get a review or get 
the final approval from Committers.
   - Please follow [ASF Code of 
Conduct](https://www.apache.org/foundation/policies/conduct) for all 
communication including (but not limited to) comments on Pull Requests, Mailing 
list and Slack.
   - Be sure to read the [Airflow Coding style]( 
https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst#coding-style-and-best-practices).
   Apache Airflow is a community-driven project and together we are making it 
better 🚀.
   In case of doubts contact the developers at:
   Mailing List: d...@airflow.apache.org
   Slack: https://s.apache.org/airflow-slack
   







[GitHub] [airflow] stefanwuthrich opened a new pull request #13079: Update INTHEWILD.md

2020-12-14 Thread GitBox


stefanwuthrich opened a new pull request #13079:
URL: https://github.com/apache/airflow/pull/13079


   Add Altafino
   
   
   
   







[GitHub] [airflow] pradyunsg commented on issue #12838: ❗ PIP 20.3 breaks Airflow installation❗

2020-12-14 Thread GitBox


pradyunsg commented on issue #12838:
URL: https://github.com/apache/airflow/issues/12838#issuecomment-745003277


   Well, pip's master branch is now 20.3.2, so... test against that! :)







[GitHub] [airflow] otourzan commented on a change in pull request #12907: Add regional support to the workflow template methods.

2020-12-14 Thread GitBox


otourzan commented on a change in pull request #12907:
URL: https://github.com/apache/airflow/pull/12907#discussion_r540973336



##
File path: airflow/providers/google/cloud/hooks/dataproc.py
##
@@ -218,11 +217,14 @@ def get_cluster_client(self, location: Optional[str] = 
None) -> ClusterControlle
 credentials=self._get_credentials(), client_info=self.client_info, 
client_options=client_options
 )
 
-@cached_property
-def get_template_client(self) -> WorkflowTemplateServiceClient:
+def get_template_client(self, location: Optional[str] = None) -> 
WorkflowTemplateServiceClient:
 """Returns WorkflowTemplateServiceClient."""
+client_options = None
+if location and location != 'global':

Review comment:
   Yes, the user should provide proper configuration. If they pass 'global' as 
a valid location, the endpoint needs to be the global one (without a region 
prefix: 'dataproc.googleapis.com'), as there is *no* endpoint like 
'global-dataproc.googleapis.com'. Not sure if I got your point here.
   
   I joined the Airflow slack and will be happy to help.
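The endpoint selection discussed in this review can be sketched as follows. The endpoint format is taken from the comment above; this is not the actual hook code, and the port suffix is an assumption.

```python
def dataproc_endpoint(location=None):
    """Pick the Dataproc API endpoint for a given location.

    Regional locations get a region-prefixed endpoint; 'global' (or no
    location) uses the default endpoint, since there is no
    'global-dataproc.googleapis.com'.
    """
    if location and location != "global":
        return "{}-dataproc.googleapis.com:443".format(location)
    return "dataproc.googleapis.com:443"
```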
   









[GitHub] [airflow] Coqueiro closed issue #13076: Getting "Invalid kube-config file" when trying to run SparkKubernetesOperator

2020-12-14 Thread GitBox


Coqueiro closed issue #13076:
URL: https://github.com/apache/airflow/issues/13076


   







[GitHub] [airflow] Coqueiro commented on issue #13076: Getting "Invalid kube-config file" when trying to run SparkKubernetesOperator

2020-12-14 Thread GitBox


Coqueiro commented on issue #13076:
URL: https://github.com/apache/airflow/issues/13076#issuecomment-744895585


   It worked once I changed the conn to the following extra:
   ```
   {"extra__kubernetes__in_cluster":"True"}
   ```
   It seems that the `True` in the JSON I originally used isn't deserialized 
correctly.
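A quick standard-library illustration of why the two spellings behave differently (this shows only `json` behavior, not Airflow's handling of connection extras):

```python
import json

# A JSON boolean deserializes to a Python bool, while the quoted form
# deserializes to the string "True" - code comparing against the string
# will not match the bool.
as_bool = json.loads('{"extra__kubernetes__in_cluster": true}')["extra__kubernetes__in_cluster"]
as_str = json.loads('{"extra__kubernetes__in_cluster": "True"}')["extra__kubernetes__in_cluster"]
```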







[GitHub] [airflow] kaxil opened a new pull request #13078: Fix failing pylint check on Master

2020-12-14 Thread GitBox


kaxil opened a new pull request #13078:
URL: https://github.com/apache/airflow/pull/13078


   Master is failing on Pylint check
   
   
   
   ---
   **^ Add meaningful description above**
   
   Read the **[Pull Request 
Guidelines](https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst#pull-request-guidelines)**
 for more information.
   In case of fundamental code change, Airflow Improvement Proposal 
([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals))
 is needed.
   In case of a new dependency, check compliance with the [ASF 3rd Party 
License Policy](https://www.apache.org/legal/resolved.html#category-x).
   In case of backwards incompatible changes please leave a note in 
[UPDATING.md](https://github.com/apache/airflow/blob/master/UPDATING.md).
   







[GitHub] [airflow] github-actions[bot] commented on pull request #13073: Script to generate integrations.json

2020-12-14 Thread GitBox


github-actions[bot] commented on pull request #13073:
URL: https://github.com/apache/airflow/pull/13073#issuecomment-744832136


   [The Workflow run](https://github.com/apache/airflow/actions/runs/422002272) 
is cancelling this PR. It has some failed jobs matching ^Pylint$,^Static 
checks,^Build docs$,^Spell check docs$,^Backport packages$,^Provider 
packages,^Checks: Helm tests$,^Test OpenAPI*.







[GitHub] [airflow] shib-nmg opened a new issue #13077: Apache Airflow Triggering DAG for Future Dates (Airflow Version 1.8.1)

2020-12-14 Thread GitBox


shib-nmg opened a new issue #13077:
URL: https://github.com/apache/airflow/issues/13077


   One of our DAGs, the "**Global Polling DAG**", which I am using for one of my 
customers, is getting triggered for future dates intermittently.
   
   For example: **for the 12/12 9:30 PM execution, the Global Polling DAG got 
triggered for 12/13 08:10 PM**
   
   This is one of the unique occurrences that we are facing; the same 
Airflow domain was running without any issues over the last 9+ months.
   
   Point to note: the same "Global Polling DAG" is not behaving this way in 
other Airflow domains (we have a total of 4 Airflow domains in production).
   
   Could this be an issue with Airflow backfill (which is disabled now)? 
Any quick help is really appreciated
   
   **Environment: Airflow Version:** 1.8.1
   **Cloud Platform :** AWS
   
   







[GitHub] [airflow] boring-cyborg[bot] commented on issue #13077: Apache Airflow Triggering DAG for Future Dates (Airflow Version 1.8.1)

2020-12-14 Thread GitBox


boring-cyborg[bot] commented on issue #13077:
URL: https://github.com/apache/airflow/issues/13077#issuecomment-744765892


   Thanks for opening your first issue here! Be sure to follow the issue 
template!
   







[GitHub] [airflow] ashb commented on issue #13066: Lazy load Task in TaskGroup

2020-12-14 Thread GitBox


ashb commented on issue #13066:
URL: https://github.com/apache/airflow/issues/13066#issuecomment-744765416


   Yeah, that would be a nice improvement!







[GitHub] [airflow] Coqueiro opened a new issue #13076: Getting "Invalid kube-config file" when trying to run SparkKubernetesOperator

2020-12-14 Thread GitBox


Coqueiro opened a new issue #13076:
URL: https://github.com/apache/airflow/issues/13076


   **Apache Airflow version**: 1.10.12 with Python3.7
   **Kubernetes version**:
   ```
   Client Version: version.Info{Major:"1", Minor:"18", GitVersion:"v1.18.5", GitCommit:"e6503f8d8f769ace2f338794c914a96fc335df0f", GitTreeState:"clean", BuildDate:"2020-06-27T00:38:11Z", GoVersion:"go1.14.4", Compiler:"gc", Platform:"darwin/amd64"}
   Server Version: version.Info{Major:"1", Minor:"18", GitVersion:"v1.18.8", GitCommit:"9f2892aab98fe339f3bd70e3c470144299398ace", GitTreeState:"clean", BuildDate:"2020-08-13T16:04:18Z", GoVersion:"go1.13.15", Compiler:"gc", Platform:"linux/amd64"}
   ```
   
   I'm running an Airflow cluster using CeleryExecutor inside a Kubernetes 
cluster, after installing `cncf.kubernetes` backport package. I already did 
some testing with the Spark Operator inside the cluster and I'm able to run 
`SparkApplications` smoothly by applying them with `kubectl`. I'm trying now to 
make an Airflow DAG execute them. When doing so, I ran into this message:
   
   ```
   [2020-12-14 22:18:10,609] {taskinstance.py:1150} ERROR - Invalid kube-config file. No configuration found.
   Traceback (most recent call last):
     File "/home/airflow/.local/lib/python3.7/site-packages/airflow/models/taskinstance.py", line 984, in _run_raw_task
       result = task_copy.execute(context=context)
     File "/home/airflow/.local/lib/python3.7/site-packages/airflow/providers/cncf/kubernetes/operators/spark_kubernetes.py", line 67, in execute
       namespace=self.namespace,
     File "/home/airflow/.local/lib/python3.7/site-packages/airflow/providers/cncf/kubernetes/hooks/kubernetes.py", line 127, in create_custom_object
       api = client.CustomObjectsApi(self.api_client)
     File "/home/airflow/.local/lib/python3.7/site-packages/cached_property.py", line 35, in __get__
       value = obj.__dict__[self.func.__name__] = self.func(obj)
     File "/home/airflow/.local/lib/python3.7/site-packages/airflow/providers/cncf/kubernetes/hooks/kubernetes.py", line 108, in api_client
       return self.get_conn()
     File "/home/airflow/.local/lib/python3.7/site-packages/airflow/providers/cncf/kubernetes/hooks/kubernetes.py", line 102, in get_conn
       config.load_kube_config(client_configuration=self.client_configuration)
     File "/home/airflow/.local/lib/python3.7/site-packages/kubernetes/config/kube_config.py", line 739, in load_kube_config
       persist_config=persist_config)
     File "/home/airflow/.local/lib/python3.7/site-packages/kubernetes/config/kube_config.py", line 701, in _get_kube_config_loader_for_yaml_file
       'Invalid kube-config file. '
   kubernetes.config.config_exception.ConfigException: Invalid kube-config file. No configuration found.
   ```
   
   Here's my DAG code:
   ```
   import os

   from datetime import datetime, timedelta

   from airflow import DAG
   from airflow.providers.cncf.kubernetes.operators.spark_kubernetes import SparkKubernetesOperator
   from airflow.operators.dummy_operator import DummyOperator

   DAG_ID = "spark_operator_test"
   DESCRIPTION = ""
   SCHEDULE_INTERVAL = "0 10 * * *"

   default_args = {
       "owner": "airflow",
       "depends_on_past": False,
       "start_date": datetime(2020, 11, 18),
       "retries": 1,
       "retry_delay": timedelta(minutes=5),
       "provide_context": True
   }

   with DAG(
       dag_id=DAG_ID,
       schedule_interval=SCHEDULE_INTERVAL,
       description=DESCRIPTION,
       default_args=default_args,
       catchup=False,
       concurrency=1
   ) as dag:
       end_dag = DummyOperator(task_id='end_dag')

       t1 = SparkKubernetesOperator(
           task_id='spark_operator_execute',
           namespace="analytics-airflow",
           application_file="resources/spark/spark-operator-test.yaml",
           kubernetes_conn_id="kubernetes_default",
           do_xcom_push=True
       )

       t1 >> end_dag
   ```
   
   I tried creating a connection with the following extra:
   ```
   {"extra__kubernetes__in_cluster":True}
   ```
   
   But it didn't work either. How can I correctly configure my Airflow 
deployment so that a task can use the `SparkKubernetesOperator` to create a 
`SparkApplication` within the same cluster?







[GitHub] [airflow] boring-cyborg[bot] commented on issue #13076: Getting "Invalid kube-config file" when trying to run SparkKubernetesOperator

2020-12-14 Thread GitBox


boring-cyborg[bot] commented on issue #13076:
URL: https://github.com/apache/airflow/issues/13076#issuecomment-744761189


   Thanks for opening your first issue here! Be sure to follow the issue 
template!
   







[airflow] branch constraints-2-0 updated: Updating constraints. Build id:420457387

2020-12-14 Thread kaxilnaik
This is an automated email from the ASF dual-hosted git repository.

kaxilnaik pushed a commit to branch constraints-2-0
in repository https://gitbox.apache.org/repos/asf/airflow.git


The following commit(s) were added to refs/heads/constraints-2-0 by this push:
 new 7604560  Updating constraints. Build id:420457387
7604560 is described below

commit 7604560345c48cae6d4d573e46aa283d75f00952
Author: Automated GitHub Actions commit 
AuthorDate: Mon Dec 14 12:05:42 2020 +

Updating constraints. Build id:420457387

This update in constraints is automatically committed by the CI 
'constraints-push' step based on
HEAD of 'refs/heads/master' in 'apache/airflow'
with commit sha 1c1ef7ee693fead93e269dfd9774a72b6eed2e85.

All tests passed in this build so we determined we can push the updated 
constraints.

See 
https://github.com/apache/airflow/blob/master/README.md#installing-from-pypi 
for details.
---
 constraints-3.6.txt | 3 ++-
 constraints-3.7.txt | 3 ++-
 constraints-3.8.txt | 3 ++-
 3 files changed, 6 insertions(+), 3 deletions(-)

diff --git a/constraints-3.6.txt b/constraints-3.6.txt
index a3a0ae7..bdfcad8 100644
--- a/constraints-3.6.txt
+++ b/constraints-3.6.txt
@@ -1,4 +1,4 @@
-# Editable install with no version control (apache-airflow==2.0.0b3)
+# Editable install with no version control (apache-airflow==2.0.0)
 APScheduler==3.6.3
 Authlib==0.15.2
 Babel==2.9.0
@@ -366,6 +366,7 @@ snowflake-connector-python==2.3.6
 snowflake-sqlalchemy==1.2.4
 sortedcontainers==2.3.0
 soupsieve==2.0.1
+sphinx-airflow-theme==0.0.2
 sphinx-argparse==0.2.5
 sphinx-autoapi==1.0.0
 sphinx-copybutton==0.3.1
diff --git a/constraints-3.7.txt b/constraints-3.7.txt
index 65c834b..3e21df8 100644
--- a/constraints-3.7.txt
+++ b/constraints-3.7.txt
@@ -1,4 +1,4 @@
-# Editable install with no version control (apache-airflow==2.0.0b3)
+# Editable install with no version control (apache-airflow==2.0.0)
 APScheduler==3.6.3
 Authlib==0.15.2
 Babel==2.9.0
@@ -361,6 +361,7 @@ snowflake-connector-python==2.3.6
 snowflake-sqlalchemy==1.2.4
 sortedcontainers==2.3.0
 soupsieve==2.0.1
+sphinx-airflow-theme==0.0.2
 sphinx-argparse==0.2.5
 sphinx-autoapi==1.0.0
 sphinx-copybutton==0.3.1
diff --git a/constraints-3.8.txt b/constraints-3.8.txt
index 65c834b..3e21df8 100644
--- a/constraints-3.8.txt
+++ b/constraints-3.8.txt
@@ -1,4 +1,4 @@
-# Editable install with no version control (apache-airflow==2.0.0b3)
+# Editable install with no version control (apache-airflow==2.0.0)
 APScheduler==3.6.3
 Authlib==0.15.2
 Babel==2.9.0
@@ -361,6 +361,7 @@ snowflake-connector-python==2.3.6
 snowflake-sqlalchemy==1.2.4
 sortedcontainers==2.3.0
 soupsieve==2.0.1
+sphinx-airflow-theme==0.0.2
 sphinx-argparse==0.2.5
 sphinx-autoapi==1.0.0
 sphinx-copybutton==0.3.1



[GitHub] [airflow] github-actions[bot] commented on pull request #13071: Remove inapplicable arg 'output' for CLI pools import/export

2020-12-14 Thread GitBox


github-actions[bot] commented on pull request #13071:
URL: https://github.com/apache/airflow/pull/13071#issuecomment-744755826


   [The Workflow run](https://github.com/apache/airflow/actions/runs/421885469) 
is cancelling this PR. It has some failed jobs matching ^Pylint$,^Static 
checks,^Build docs$,^Spell check docs$,^Backport packages$,^Provider 
packages,^Checks: Helm tests$,^Test OpenAPI*.







[airflow] branch accept-executor-config-image created (now e0d3aad)

2020-12-14 Thread dimberman
This is an automated email from the ASF dual-hosted git repository.

dimberman pushed a change to branch accept-executor-config-image
in repository https://gitbox.apache.org/repos/asf/airflow.git.


  at e0d3aad  KubernetesExecutor should accept images from executor_config

No new revisions were added by this update.



[GitHub] [airflow] dimberman opened a new pull request #13074: KubernetesExecutor should accept images from executor_config

2020-12-14 Thread GitBox


dimberman opened a new pull request #13074:
URL: https://github.com/apache/airflow/pull/13074


   Addresses:
   https://github.com/apache/airflow/issues/13003#issuecomment-743733799a
   
   Users should be able to specify custom images in the executor_config. Not 
being
   able to is a regression.
   
   
   
   ---
   **^ Add meaningful description above**
   
   Read the **[Pull Request 
Guidelines](https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst#pull-request-guidelines)**
 for more information.
   In case of fundamental code change, Airflow Improvement Proposal 
([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals))
 is needed.
   In case of a new dependency, check compliance with the [ASF 3rd Party 
License Policy](https://www.apache.org/legal/resolved.html#category-x).
   In case of backwards incompatible changes please leave a note in 
[UPDATING.md](https://github.com/apache/airflow/blob/master/UPDATING.md).
   







[GitHub] [airflow] ecerulm opened a new pull request #13073: Script to generate integrations.json

2020-12-14 Thread GitBox


ecerulm opened a new pull request #13073:
URL: https://github.com/apache/airflow/pull/13073


   Closes #12613







[airflow] branch master updated (b4b9cf5 -> 317858a)

2020-12-14 Thread kamilbregula
This is an automated email from the ASF dual-hosted git repository.

kamilbregula pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/airflow.git.


from b4b9cf5  Check for missing references to operator guides (#13059)
 add 317858a  Remove unneeded parentheses from Python file (#12968)

No new revisions were added by this update.

Summary of changes:
 airflow/cli/cli_parser.py | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)



[airflow] branch master updated (fa9c6b4 -> b4b9cf5)

2020-12-14 Thread kamilbregula
This is an automated email from the ASF dual-hosted git repository.

kamilbregula pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/airflow.git.


from fa9c6b4  Fetch inventories for third-party services only once (#13068)
 add b4b9cf5  Check for missing references to operator guides (#13059)

No new revisions were added by this update.

Summary of changes:
 .../providers/google/cloud/operators/dataprep.py   |   4 +
 .../google/cloud/transfers/mysql_to_gcs.py |   4 +
 .../providers/google/cloud/transfers/s3_to_gcs.py  |   4 +
 .../operators/transfer/s3_to_gcs.rst   |   3 +-
 docs/exts/docs_build/lint_checks.py| 109 +++--
 5 files changed, 72 insertions(+), 52 deletions(-)



[GitHub] [airflow] mik-laj merged pull request #12968: Remove unneeded parentheses from Python files

2020-12-14 Thread GitBox


mik-laj merged pull request #12968:
URL: https://github.com/apache/airflow/pull/12968


   







[GitHub] [airflow] mik-laj merged pull request #13059: Check for missing references to operator guides

2020-12-14 Thread GitBox


mik-laj merged pull request #13059:
URL: https://github.com/apache/airflow/pull/13059


   







[airflow] branch master updated (5503951 -> fa9c6b4)

2020-12-14 Thread kamilbregula
This is an automated email from the ASF dual-hosted git repository.

kamilbregula pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/airflow.git.


from 5503951  Update stable version for published docs (#13052)
 add fa9c6b4  Fetch inventories for third-party services only once (#13068)

No new revisions were added by this update.

Summary of changes:
 docs/conf.py| 90 ++---
 docs/exts/docs_build/fetch_inventories.py   | 25 +--
 docs/exts/docs_build/third_party_inventories.py | 52 ++
 3 files changed, 119 insertions(+), 48 deletions(-)
 create mode 100644 docs/exts/docs_build/third_party_inventories.py



[GitHub] [airflow] mik-laj merged pull request #13068: Fetch inventories for third-party services only once

2020-12-14 Thread GitBox


mik-laj merged pull request #13068:
URL: https://github.com/apache/airflow/pull/13068


   







[GitHub] [airflow] marcusianlevine commented on issue #13003: KubernetesExecutor - image from executor_config is ignored

2020-12-14 Thread GitBox


marcusianlevine commented on issue #13003:
URL: https://github.com/apache/airflow/issues/13003#issuecomment-744724098


   Thanks for reporting this Cris, I'm seeing the same issue
   
   AFAIK this only affects 1.10.13+
   
   I'm currently running Airflow 1.10.12 and the `executor_config` image 
override works, but when I try to upgrade to 1.10.14 it always uses the image 
specified in the base config







[GitHub] [airflow] pceric commented on issue #12111: Kubernetes operator retry regression

2020-12-14 Thread GitBox


pceric commented on issue #12111:
URL: https://github.com/apache/airflow/issues/12111#issuecomment-744723729


   I installed 1.10.14 today and, while the behavior is a bit odd, it does work. 
If I have retries set to 5, Airflow will run 10 retries, with every odd retry 
being a "dummy retry" that gathers the output from the previous failure. But 
since everything works as expected, I'm calling this fixed.







[GitHub] [airflow] pceric closed issue #12111: Kubernetes operator retry regression

2020-12-14 Thread GitBox


pceric closed issue #12111:
URL: https://github.com/apache/airflow/issues/12111


   







svn commit: r44966 - /dev/airflow/upgrade-check/1.1.0rc1/ /release/airflow/upgrade-check/1.1.0/

2020-12-14 Thread ash
Author: ash
Date: Mon Dec 14 21:28:59 2020
New Revision: 44966

Log:
Release apache-airflow-upgrade-check 1.1.0 from 1.1.0rc1

Added:
release/airflow/upgrade-check/1.1.0/

release/airflow/upgrade-check/1.1.0/apache-airflow-upgrade-check-1.1.0-source.tar.gz
  - copied unchanged from r44965, 
dev/airflow/upgrade-check/1.1.0rc1/apache-airflow-upgrade-check-1.1.0rc1-source.tar.gz

release/airflow/upgrade-check/1.1.0/apache-airflow-upgrade-check-1.1.0-source.tar.gz.asc
  - copied unchanged from r44965, 
dev/airflow/upgrade-check/1.1.0rc1/apache-airflow-upgrade-check-1.1.0rc1-source.tar.gz.asc

release/airflow/upgrade-check/1.1.0/apache-airflow-upgrade-check-1.1.0-source.tar.gz.sha512
  - copied unchanged from r44965, 
dev/airflow/upgrade-check/1.1.0rc1/apache-airflow-upgrade-check-1.1.0rc1-source.tar.gz.sha512

release/airflow/upgrade-check/1.1.0/apache-airflow-upgrade-check-1.1.0.tar.gz
  - copied unchanged from r44965, 
dev/airflow/upgrade-check/1.1.0rc1/apache-airflow-upgrade-check-1.1.0rc1.tar.gz

release/airflow/upgrade-check/1.1.0/apache-airflow-upgrade-check-1.1.0.tar.gz.asc
  - copied unchanged from r44965, 
dev/airflow/upgrade-check/1.1.0rc1/apache-airflow-upgrade-check-1.1.0rc1.tar.gz.asc

release/airflow/upgrade-check/1.1.0/apache-airflow-upgrade-check-1.1.0.tar.gz.sha512
  - copied unchanged from r44965, 
dev/airflow/upgrade-check/1.1.0rc1/apache-airflow-upgrade-check-1.1.0rc1.tar.gz.sha512

release/airflow/upgrade-check/1.1.0/apache_airflow_upgrade_check-1.1.0-py2.py3-none-any.whl
  - copied unchanged from r44965, 
dev/airflow/upgrade-check/1.1.0rc1/apache_airflow_upgrade_check-1.1.0rc1-py2.py3-none-any.whl

release/airflow/upgrade-check/1.1.0/apache_airflow_upgrade_check-1.1.0-py2.py3-none-any.whl.asc
  - copied unchanged from r44965, 
dev/airflow/upgrade-check/1.1.0rc1/apache_airflow_upgrade_check-1.1.0rc1-py2.py3-none-any.whl.asc

release/airflow/upgrade-check/1.1.0/apache_airflow_upgrade_check-1.1.0-py2.py3-none-any.whl.sha512
  - copied unchanged from r44965, 
dev/airflow/upgrade-check/1.1.0rc1/apache_airflow_upgrade_check-1.1.0rc1-py2.py3-none-any.whl.sha512
Removed:

dev/airflow/upgrade-check/1.1.0rc1/apache-airflow-upgrade-check-1.1.0rc1-source.tar.gz

dev/airflow/upgrade-check/1.1.0rc1/apache-airflow-upgrade-check-1.1.0rc1-source.tar.gz.asc

dev/airflow/upgrade-check/1.1.0rc1/apache-airflow-upgrade-check-1.1.0rc1-source.tar.gz.sha512

dev/airflow/upgrade-check/1.1.0rc1/apache-airflow-upgrade-check-1.1.0rc1.tar.gz

dev/airflow/upgrade-check/1.1.0rc1/apache-airflow-upgrade-check-1.1.0rc1.tar.gz.asc

dev/airflow/upgrade-check/1.1.0rc1/apache-airflow-upgrade-check-1.1.0rc1.tar.gz.sha512

dev/airflow/upgrade-check/1.1.0rc1/apache_airflow_upgrade_check-1.1.0rc1-py2.py3-none-any.whl

dev/airflow/upgrade-check/1.1.0rc1/apache_airflow_upgrade_check-1.1.0rc1-py2.py3-none-any.whl.asc

dev/airflow/upgrade-check/1.1.0rc1/apache_airflow_upgrade_check-1.1.0rc1-py2.py3-none-any.whl.sha512



[GitHub] [airflow] potiuk commented on issue #12838: ❗ PIP 20.3 breaks Airflow installation❗

2020-12-14 Thread GitBox


potiuk commented on issue #12838:
URL: https://github.com/apache/airflow/issues/12838#issuecomment-744717680


   Sorry - I've been busy identifying and testing an issue with Airflow 
2.0RC2 (which led to RC3 being sent today). I will take a closer look tomorrow! 







svn commit: r44965 [2/2] - /dev/airflow/providers/1.0.0rc1/ /release/airflow/providers/ /release/airflow/upgrade-check/1.0.0/

2020-12-14 Thread ash


Modified: 
release/airflow/upgrade-check/1.0.0/apache_airflow_upgrade_check-1.0.0-py2.py3-none-any.whl.asc
==
Binary files - no diff available.




svn commit: r44965 [1/2] - /dev/airflow/providers/1.0.0rc1/ /release/airflow/providers/ /release/airflow/upgrade-check/1.0.0/

2020-12-14 Thread ash
Author: ash
Date: Mon Dec 14 21:20:54 2020
New Revision: 44965

Log:
Bulk release airflow providers 1.0.0

Added:
release/airflow/providers/
release/airflow/providers/apache-airflow-providers-1.0.0-source.tar.gz
  - copied unchanged from r44964, 
dev/airflow/providers/1.0.0rc1/apache-airflow-providers-1.0.0rc1-source.tar.gz
release/airflow/providers/apache-airflow-providers-1.0.0-source.tar.gz.asc
  - copied unchanged from r44964, 
dev/airflow/providers/1.0.0rc1/apache-airflow-providers-1.0.0rc1-source.tar.gz.asc

release/airflow/providers/apache-airflow-providers-1.0.0-source.tar.gz.sha512
  - copied unchanged from r44964, 
dev/airflow/providers/1.0.0rc1/apache-airflow-providers-1.0.0rc1-source.tar.gz.sha512
release/airflow/providers/apache-airflow-providers-amazon-1.0.0-bin.tar.gz
  - copied unchanged from r44964, 
dev/airflow/providers/1.0.0rc1/apache-airflow-providers-amazon-1.0.0rc1-bin.tar.gz

release/airflow/providers/apache-airflow-providers-amazon-1.0.0-bin.tar.gz.asc
  - copied unchanged from r44964, 
dev/airflow/providers/1.0.0rc1/apache-airflow-providers-amazon-1.0.0rc1-bin.tar.gz.asc

release/airflow/providers/apache-airflow-providers-amazon-1.0.0-bin.tar.gz.sha512
  - copied unchanged from r44964, 
dev/airflow/providers/1.0.0rc1/apache-airflow-providers-amazon-1.0.0rc1-bin.tar.gz.sha512

release/airflow/providers/apache-airflow-providers-apache-cassandra-1.0.0-bin.tar.gz
  - copied unchanged from r44964, 
dev/airflow/providers/1.0.0rc1/apache-airflow-providers-apache-cassandra-1.0.0rc1-bin.tar.gz

release/airflow/providers/apache-airflow-providers-apache-cassandra-1.0.0-bin.tar.gz.asc
  - copied unchanged from r44964, 
dev/airflow/providers/1.0.0rc1/apache-airflow-providers-apache-cassandra-1.0.0rc1-bin.tar.gz.asc

release/airflow/providers/apache-airflow-providers-apache-cassandra-1.0.0-bin.tar.gz.sha512
  - copied unchanged from r44964, 
dev/airflow/providers/1.0.0rc1/apache-airflow-providers-apache-cassandra-1.0.0rc1-bin.tar.gz.sha512

release/airflow/providers/apache-airflow-providers-apache-druid-1.0.0-bin.tar.gz
  - copied unchanged from r44964, 
dev/airflow/providers/1.0.0rc1/apache-airflow-providers-apache-druid-1.0.0rc1-bin.tar.gz

release/airflow/providers/apache-airflow-providers-apache-druid-1.0.0-bin.tar.gz.asc
  - copied unchanged from r44964, 
dev/airflow/providers/1.0.0rc1/apache-airflow-providers-apache-druid-1.0.0rc1-bin.tar.gz.asc

release/airflow/providers/apache-airflow-providers-apache-druid-1.0.0-bin.tar.gz.sha512
  - copied unchanged from r44964, 
dev/airflow/providers/1.0.0rc1/apache-airflow-providers-apache-druid-1.0.0rc1-bin.tar.gz.sha512

release/airflow/providers/apache-airflow-providers-apache-hdfs-1.0.0-bin.tar.gz
  - copied unchanged from r44964, 
dev/airflow/providers/1.0.0rc1/apache-airflow-providers-apache-hdfs-1.0.0rc1-bin.tar.gz

release/airflow/providers/apache-airflow-providers-apache-hdfs-1.0.0-bin.tar.gz.asc
  - copied unchanged from r44964, 
dev/airflow/providers/1.0.0rc1/apache-airflow-providers-apache-hdfs-1.0.0rc1-bin.tar.gz.asc

release/airflow/providers/apache-airflow-providers-apache-hdfs-1.0.0-bin.tar.gz.sha512
  - copied unchanged from r44964, 
dev/airflow/providers/1.0.0rc1/apache-airflow-providers-apache-hdfs-1.0.0rc1-bin.tar.gz.sha512

release/airflow/providers/apache-airflow-providers-apache-hive-1.0.0-bin.tar.gz
  - copied unchanged from r44964, 
dev/airflow/providers/1.0.0rc1/apache-airflow-providers-apache-hive-1.0.0rc1-bin.tar.gz

release/airflow/providers/apache-airflow-providers-apache-hive-1.0.0-bin.tar.gz.asc
  - copied unchanged from r44964, 
dev/airflow/providers/1.0.0rc1/apache-airflow-providers-apache-hive-1.0.0rc1-bin.tar.gz.asc

release/airflow/providers/apache-airflow-providers-apache-hive-1.0.0-bin.tar.gz.sha512
  - copied unchanged from r44964, 
dev/airflow/providers/1.0.0rc1/apache-airflow-providers-apache-hive-1.0.0rc1-bin.tar.gz.sha512

release/airflow/providers/apache-airflow-providers-apache-kylin-1.0.0-bin.tar.gz
  - copied unchanged from r44964, 
dev/airflow/providers/1.0.0rc1/apache-airflow-providers-apache-kylin-1.0.0rc1-bin.tar.gz

release/airflow/providers/apache-airflow-providers-apache-kylin-1.0.0-bin.tar.gz.asc
  - copied unchanged from r44964, 
dev/airflow/providers/1.0.0rc1/apache-airflow-providers-apache-kylin-1.0.0rc1-bin.tar.gz.asc

release/airflow/providers/apache-airflow-providers-apache-kylin-1.0.0-bin.tar.gz.sha512
  - copied unchanged from r44964, 
dev/airflow/providers/1.0.0rc1/apache-airflow-providers-apache-kylin-1.0.0rc1-bin.tar.gz.sha512

release/airflow/providers/apache-airflow-providers-apache-livy-1.0.0-bin.tar.gz
  - copied unchanged from r44964, 
dev/airflow/providers/1.0.0rc1/apache-airflow-providers-apache-livy-1.0.0rc1-bin.tar.gz

release/airflow/providers/apache-airflow-providers-apache-livy-1.0.0-bin.tar

[GitHub] [airflow] github-actions[bot] commented on pull request #13059: Check for missing references to operator guides

2020-12-14 Thread GitBox


github-actions[bot] commented on pull request #13059:
URL: https://github.com/apache/airflow/pull/13059#issuecomment-744712563


   The PR is likely OK to be merged with just subset of tests for default 
Python and Database versions without running the full matrix of tests, because 
it does not modify the core of Airflow. If the committers decide that the full 
tests matrix is needed, they will add the label 'full tests needed'. Then you 
should rebase to the latest master or amend the last commit of the PR, and push 
it with --force-with-lease.







[GitHub] [airflow] potiuk commented on issue #12881: Snowflake python connector monkeypatches urllib and makes many services unusable.

2020-12-14 Thread GitBox


potiuk commented on issue #12881:
URL: https://github.com/apache/airflow/issues/12881#issuecomment-744711310


   Thanks. I think we are just trying to push Snowflake (and we have some 
heavyweights helping us) to fix it, so let's wait and see :).







[GitHub] [airflow] potiuk commented on issue #13069: Handwritten argument parser in prepare_provider_packages.py

2020-12-14 Thread GitBox


potiuk commented on issue #13069:
URL: https://github.com/apache/airflow/issues/13069#issuecomment-744709498


   It simply grew from a simple script back when there were fewer parameters. But my 
goal now is for more people to take part in writing those scripts, not only me. I 
marked it as a good first issue, and if you feel like you would like to rewrite 
it - go ahead. I am happy to review it and guide you (or anyone else who would 
like to take on that task).
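   For anyone picking this up, here is a minimal `argparse` sketch of what replacing 
the handwritten parser could look like. The subcommand and option names below are 
hypothetical illustrations, not the script's actual interface:

   ```python
   import argparse


   def build_parser() -> argparse.ArgumentParser:
       # Subcommand/option names are hypothetical; the real script
       # would define its own set of commands and flags.
       parser = argparse.ArgumentParser(prog="prepare_provider_packages.py")
       sub = parser.add_subparsers(dest="command", required=True)

       gen = sub.add_parser("generate-setup-files",
                            help="generate setup files for providers")
       gen.add_argument("providers", nargs="*",
                        help="provider ids (default: all)")

       build = sub.add_parser("build-provider-packages",
                              help="build sdist/wheel packages")
       build.add_argument("--version-suffix", default="", help="e.g. rc1")
       build.add_argument("providers", nargs="*")
       return parser


   args = build_parser().parse_args(
       ["build-provider-packages", "--version-suffix", "rc1", "amazon"]
   )
   ```

   With subparsers, `--help` text, validation, and error messages come for free, 
which is most of what a handwritten parser ends up reimplementing.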







[GitHub] [airflow] potiuk commented on issue #13069: Handwritten argument parser in prepare_provider_packages.py

2020-12-14 Thread GitBox


potiuk commented on issue #13069:
URL: https://github.com/apache/airflow/issues/13069#issuecomment-744708333


   Feel free to rewrite it :). No problem with that.







[GitHub] [airflow] mik-laj commented on a change in pull request #13058: Set timeout for child processes

2020-12-14 Thread GitBox


mik-laj commented on a change in pull request #13058:
URL: https://github.com/apache/airflow/pull/13058#discussion_r542777395



##
File path: airflow/configuration.py
##
@@ -69,7 +69,9 @@ def run_command(command):
 process = subprocess.Popen(
 shlex.split(command), stdout=subprocess.PIPE, stderr=subprocess.PIPE, 
close_fds=True
 )
-output, stderr = [stream.decode(sys.getdefaultencoding(), 'ignore') for 
stream in process.communicate()]
+output, stderr = [
+stream.decode(sys.getdefaultencoding(), 'ignore') for stream in 
process.communicate(timeout=60)

Review comment:
   I added more docs now :-) 









[GitHub] [airflow] github-actions[bot] commented on pull request #13071: Remove inapplicable arg 'output' for CLI pools import/export

2020-12-14 Thread GitBox


github-actions[bot] commented on pull request #13071:
URL: https://github.com/apache/airflow/pull/13071#issuecomment-744705084


   The PR is likely OK to be merged with just a subset of tests for the default 
Python and database versions, without running the full matrix of tests, because 
it does not modify the core of Airflow. If the committers decide that the full 
test matrix is needed, they will add the label 'full tests needed'. Then you 
should rebase to the latest master or amend the last commit of the PR, and push 
it with --force-with-lease.







[GitHub] [airflow] github-actions[bot] commented on pull request #13068: Fetch inventories for third-party services only once

2020-12-14 Thread GitBox


github-actions[bot] commented on pull request #13068:
URL: https://github.com/apache/airflow/pull/13068#issuecomment-744704045


   The PR is likely ready to be merged. No tests are needed, as no important 
environment files nor Python files were modified by it. However, committers 
might decide that the full test matrix is needed and add the 'full tests needed' 
label. Then you should rebase it to the latest master or amend the last commit 
of the PR, and push it with --force-with-lease.







[GitHub] [airflow] potiuk commented on issue #13063: Importing entry_point plugins fails on Airflow v1.10.14 and Python 3.8

2020-12-14 Thread GitBox


potiuk commented on issue #13063:
URL: https://github.com/apache/airflow/issues/13063#issuecomment-744703207


   Be sure to watch the devlist. It might well be that 1.10.15 will be released 
at some point in time :)







[GitHub] [airflow] github-actions[bot] commented on pull request #13049: Add S3KeySizeSensor

2020-12-14 Thread GitBox


github-actions[bot] commented on pull request #13049:
URL: https://github.com/apache/airflow/pull/13049#issuecomment-744695535


   [The Workflow run](https://github.com/apache/airflow/actions/runs/421679253) 
is cancelling this PR. It has some failed jobs matching ^Pylint$,^Static 
checks,^Build docs$,^Spell check docs$,^Backport packages$,^Provider 
packages,^Checks: Helm tests$,^Test OpenAPI*.







[GitHub] [airflow] marshall7m opened a new pull request #13072: AWS Glue Crawler Integration

2020-12-14 Thread GitBox


marshall7m opened a new pull request #13072:
URL: https://github.com/apache/airflow/pull/13072


   
   ## Description
   
   This PR adds an AWS Glue crawler operator and hook that can be used to 
trigger Glue crawlers from Airflow.
   
   ---
   **^ Add meaningful description above**
   
   Read the **[Pull Request 
Guidelines](https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst#pull-request-guidelines)**
 for more information.
   In case of fundamental code change, Airflow Improvement Proposal 
([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals))
 is needed.
   In case of a new dependency, check compliance with the [ASF 3rd Party 
License Policy](https://www.apache.org/legal/resolved.html#category-x).
   In case of backwards incompatible changes please leave a note in 
[UPDATING.md](https://github.com/apache/airflow/blob/master/UPDATING.md).
   







[GitHub] [airflow] boring-cyborg[bot] commented on pull request #13072: AWS Glue Crawler Integration

2020-12-14 Thread GitBox


boring-cyborg[bot] commented on pull request #13072:
URL: https://github.com/apache/airflow/pull/13072#issuecomment-744694482


   Congratulations on your first Pull Request and welcome to the Apache Airflow 
community! If you have any issues or are unsure about anything, please check 
our Contribution Guide 
(https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst)
   Here are some useful points:
   - Pay attention to the quality of your code (flake8, pylint and type 
annotations). Our [pre-commits]( 
https://github.com/apache/airflow/blob/master/STATIC_CODE_CHECKS.rst#prerequisites-for-pre-commit-hooks)
 will help you with that.
   - In case of a new feature add useful documentation (in docstrings or in 
`docs/` directory). Adding a new operator? Check this short 
[guide](https://github.com/apache/airflow/blob/master/docs/howto/custom-operator.rst)
 Consider adding an example DAG that shows how users should use it.
   - Consider using [Breeze 
environment](https://github.com/apache/airflow/blob/master/BREEZE.rst) for 
testing locally, it’s a heavy docker but it ships with a working Airflow and a 
lot of integrations.
   - Be patient and persistent. It might take some time to get a review or get 
the final approval from Committers.
   - Please follow [ASF Code of 
Conduct](https://www.apache.org/foundation/policies/conduct) for all 
communication including (but not limited to) comments on Pull Requests, Mailing 
list and Slack.
   - Be sure to read the [Airflow Coding style]( 
https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst#coding-style-and-best-practices).
   Apache Airflow is a community-driven project and together we are making it 
better 🚀.
   In case of doubts contact the developers at:
   Mailing List: d...@airflow.apache.org
   Slack: https://s.apache.org/airflow-slack
   







svn commit: r44964 - in /dev/airflow/providers/1.0.0rc1: apache-airflow-providers-1.0.0rc1-source.tar.gz apache-airflow-providers-1.0.0rc1-source.tar.gz.asc apache-airflow-providers-1.0.0rc1-source.ta

2020-12-14 Thread ash
Author: ash
Date: Mon Dec 14 20:20:37 2020
New Revision: 44964

Log:
Add missing source bundle

Added:

dev/airflow/providers/1.0.0rc1/apache-airflow-providers-1.0.0rc1-source.tar.gz  
 (with props)

dev/airflow/providers/1.0.0rc1/apache-airflow-providers-1.0.0rc1-source.tar.gz.asc
   (with props)

dev/airflow/providers/1.0.0rc1/apache-airflow-providers-1.0.0rc1-source.tar.gz.sha512

Added: 
dev/airflow/providers/1.0.0rc1/apache-airflow-providers-1.0.0rc1-source.tar.gz
==
Binary file - no diff available.

Propchange: 
dev/airflow/providers/1.0.0rc1/apache-airflow-providers-1.0.0rc1-source.tar.gz
--
svn:mime-type = application/gzip

Added: 
dev/airflow/providers/1.0.0rc1/apache-airflow-providers-1.0.0rc1-source.tar.gz.asc
==
Binary file - no diff available.

Propchange: 
dev/airflow/providers/1.0.0rc1/apache-airflow-providers-1.0.0rc1-source.tar.gz.asc
--
svn:mime-type = application/pgp-signature

Added: 
dev/airflow/providers/1.0.0rc1/apache-airflow-providers-1.0.0rc1-source.tar.gz.sha512
==
--- 
dev/airflow/providers/1.0.0rc1/apache-airflow-providers-1.0.0rc1-source.tar.gz.sha512
 (added)
+++ 
dev/airflow/providers/1.0.0rc1/apache-airflow-providers-1.0.0rc1-source.tar.gz.sha512
 Mon Dec 14 20:20:37 2020
@@ -0,0 +1 @@
+2167836ea4e7015419413d43167939d2e1fef782d7f06fa677be071f4965dca8dd3ede616d559953c1b1971c86301472f02ee7e4315ed6b5aa92749e72002ddd
  apache-airflow-providers-1.0.0rc1-source.tar.gz




[GitHub] [airflow] XD-DENG opened a new pull request #13071: Remove inapplicable arg 'output' for CLI pools import/export

2020-12-14 Thread GitBox


XD-DENG opened a new pull request #13071:
URL: https://github.com/apache/airflow/pull/13071


   `-o`/`--output` is not applicable for CLI `airflow pools export` or `airflow 
pools import`.
   It should be removed.
   
   
   
   ---
   **^ Add meaningful description above**
   
   Read the **[Pull Request 
Guidelines](https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst#pull-request-guidelines)**
 for more information.
   In case of fundamental code change, Airflow Improvement Proposal 
([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals))
 is needed.
   In case of a new dependency, check compliance with the [ASF 3rd Party 
License Policy](https://www.apache.org/legal/resolved.html#category-x).
   In case of backwards incompatible changes please leave a note in 
[UPDATING.md](https://github.com/apache/airflow/blob/master/UPDATING.md).
   







[GitHub] [airflow] Squigilum edited a comment on issue #12995: Worker never running tasks or failing them with no explanation for many simultaneous tasks

2020-12-14 Thread GitBox


Squigilum edited a comment on issue #12995:
URL: https://github.com/apache/airflow/issues/12995#issuecomment-744655785


   I tried both increasing the allowed connections (from 100 to 250) and 
enabling pgbouncer, and both still had similar errors. For pgbouncer, I tried 
both enabling it with just the default parameters and increasing the 
`maxClientConn` parameter in the chart. I think I tried 250 and 1000 for 
`maxClientConn`. 
   
   I'm attaching the worker logs for my two celery workers. For this run, 
tasks 23, 25, and 26 stayed in the queued state and never ran. 23 is active on 
worker 1, and 25 and 26 are on worker 0. I've also reduced the number of 
concurrent tasks considerably (from 100 to 40) compared to the DAG I initially shared. 
   
   The only thing I noticed that looked abnormal to me is the following warning 
in the logs, but it does not seem to occur near the tasks in question:
   ```
   [2020-12-14 18:52:40,054: WARNING/ForkPoolWorker-6] Failed to log action 
with (psycopg2.DatabaseError) error with status PGRES_TUPLES_OK and no message 
from the libpq
   (Background on this error at: http://sqlalche.me/e/13/4xp6)
   ```
   
   I'm not sure if this snippet includes all the relevant log messages, but the 
worker logs for the tasks generally look like this:
   ```
   [2020-12-14 18:48:37,531: INFO/MainProcess] Received task: 
airflow.executors.celery_executor.execute_command[5a07e65a-8ad0-4fbe-83c8-8ea952f3a55d]
   [2020-12-14 18:48:37,532: DEBUG/MainProcess] TaskPool: Apply  
(args:('airflow.executors.celery_executor.execute_command', 
'5a07e65a-8ad0-4fbe-83c8-8ea952f3a55d', {'lang': 'py', 'task': 
'airflow.executors.celery_executor.execute_command', 'id': 
'5a07e65a-8ad0-4fbe-83c8-8ea952f3a55d', 'shadow': None, 'eta': None, 'expires': 
None, 'group': None, 'group_index': None, 'retries': 0, 'timelimit': [None, 
None], 'root_id': '5a07e65a-8ad0-4fbe-83c8-8ea952f3a55d', 'parent_id': None, 
'argsrepr': "[['airflow', 'tasks', 'run', 'run_100_concurrent', '25', 
'2020-12-14T18:47:13.448236+00:00', '--local', '--pool', 'default_pool', 
'--subdir', '/opt/airflow/dags/concurrent_workflow.py']]", 'kwargsrepr': '{}', 
'origin': 'gen148@airflow-scheduler-686f8b7b4-2vlrd', 'reply_to': 
'7a69ddc6-70f1-3417-ae71-92a691da626b', 'correlation_id': 
'5a07e65a-8ad0-4fbe-83c8-8ea952f3a55d', 'hostname': 'celery@airflow-worker-1', 
'delivery_info': {'exchange': '', 'routing_key':
  'celery', 'priority': 0, 'redelivered': None}, 'args': [['airflow', 'tasks', 
'run', 'run_100_concurrent', '25',... kwargs:{})
   [2020-12-14 18:48:37,552: DEBUG/MainProcess] Task accepted: 
airflow.executors.celery_executor.execute_command[5a07e65a-8ad0-4fbe-83c8-8ea952f3a55d]
 pid:23
   [2020-12-14 18:48:37,750: INFO/ForkPoolWorker-5] Executing command in 
Celery: ['airflow', 'tasks', 'run', 'run_100_concurrent', '25', 
'2020-12-14T18:47:13.448236+00:00', '--local', '--pool', 'default_pool', 
'--subdir', '/opt/airflow/dags/concurrent_workflow.py']
   [2020-12-14 18:48:41,337: DEBUG/ForkPoolWorker-5] Calling callbacks: 
[]
   ```
   [worker-0.log](https://github.com/apache/airflow/files/5691087/worker-0.log)
   [worker-1.log](https://github.com/apache/airflow/files/5691089/worker-1.log)
   
[scheduler.log](https://github.com/apache/airflow/files/5691090/scheduler.log)
   







[GitHub] [airflow] Squigilum edited a comment on issue #12995: Worker never running tasks or failing them with no explanation for many simultaneous tasks

2020-12-14 Thread GitBox


Squigilum edited a comment on issue #12995:
URL: https://github.com/apache/airflow/issues/12995#issuecomment-744655785


   I tried both increasing the allowed connections (from 100 to 250) and 
enabling pgbouncer, and both still had similar errors. For pgbouncer, I tried 
both enabling it with just the default parameters and increasing the 
`maxClientConn` parameter in the chart. I think I tried 250 and 1000 for 
`maxClientConn`. 
   
   I'm attaching the worker logs for my two celery workers. For this run, 
tasks 23, 25, and 26 stayed in the queued state and never ran. 23 is active on 
worker 1, and 25 and 26 are on worker 0. I've also reduced the number of 
concurrent tasks considerably (from 100 to 40) compared to the DAG I initially shared. 
   
   The only thing I noticed that looked abnormal to me is the following warning 
in the logs, but it does not seem to occur near the tasks in question:
   ```
   [2020-12-14 18:52:40,054: WARNING/ForkPoolWorker-6] Failed to log action 
with (psycopg2.DatabaseError) error with status PGRES_TUPLES_OK and no message 
from the libpq
   (Background on this error at: http://sqlalche.me/e/13/4xp6)
   ```
   
   I'm not sure if this snippet includes all the relevant log messages, but the 
worker logs for the tasks generally look like this:
   ```
   [worker-0.log](https://github.com/apache/airflow/files/5691079/worker-0.log)
   [2020-12-14 18:48:37,531: INFO/MainProcess] Received task: 
airflow.executors.celery_executor.execute_command[5a07e65a-8ad0-4fbe-83c8-8ea952f3a55d]
   [2020-12-14 18:48:37,532: DEBUG/MainProcess] TaskPool: Apply  
(args:('airflow.executors.celery_executor.execute_command', 
'5a07e65a-8ad0-4fbe-83c8-8ea952f3a55d', {'lang': 'py', 'task': 
'airflow.executors.celery_executor.execute_command', 'id': 
'5a07e65a-8ad0-4fbe-83c8-8ea952f3a55d', 'shadow': None, 'eta': None, 'expires': 
None, 'group': None, 'group_index': None, 'retries': 0, 'timelimit': [None, 
None], 'root_id': '5a07e65a-8ad0-4fbe-83c8-8ea952f3a55d', 'parent_id': None, 
'argsrepr': "[['airflow', 'tasks', 'run', 'run_100_concurrent', '25', 
'2020-12-14T18:47:13.448236+00:00', '--local', '--pool', 'default_pool', 
'--subdir', '/opt/airflow/dags/concurrent_workflow.py']]", 'kwargsrepr': '{}', 
'origin': 'gen148@airflow-scheduler-686f8b7b4-2vlrd', 'reply_to': 
'7a69ddc6-70f1-3417-ae71-92a691da626b', 'correlation_id': 
'5a07e65a-8ad0-4fbe-83c8-8ea952f3a55d', 'hostname': 'celery@airflow-worker-1', 
'delivery_info': {'exchange': '', 'routing_key':
  'celery', 'priority': 0, 'redelivered': None}, 'args': [['airflow', 'tasks', 
'run', 'run_100_concurrent', '25',... kwargs:{})
   [2020-12-14 18:48:37,552: DEBUG/MainProcess] Task accepted: 
airflow.executors.celery_executor.execute_command[5a07e65a-8ad0-4fbe-83c8-8ea952f3a55d]
 pid:23
   [2020-12-14 18:48:37,750: INFO/ForkPoolWorker-5] Executing command in 
Celery: ['airflow', 'tasks', 'run', 'run_100_concurrent', '25', 
'2020-12-14T18:47:13.448236+00:00', '--local', '--pool', 'default_pool', 
'--subdir', '/opt/airflow/dags/concurrent_workflow.py']
   [2020-12-14 18:48:41,337: DEBUG/ForkPoolWorker-5] Calling callbacks: 
[]
   ```
   [worker-0.log](https://github.com/apache/airflow/files/5691087/worker-0.log)
   [worker-1.log](https://github.com/apache/airflow/files/5691089/worker-1.log)
   
[scheduler.log](https://github.com/apache/airflow/files/5691090/scheduler.log)
   







[GitHub] [airflow] lsowen opened a new issue #13070: KubernetesPodOperator duplicate code

2020-12-14 Thread GitBox


lsowen opened a new issue #13070:
URL: https://github.com/apache/airflow/issues/13070


   Does anyone know why there are two calls to `create_pod_request_obj()` and 
`create_labels_for_pod()` in 
https://github.com/apache/airflow/blob/c743b95a02ba1ec04013635a56ad042ce98823d2/airflow/contrib/operators/kubernetes_pod_operator.py#L278-L287
   
   It looks like maybe a merge issue between two different PRs that got merged 
at approximately the same time?  (#11162 and #10963)
   







[GitHub] [airflow] boring-cyborg[bot] commented on issue #13070: KubernetesPodOperator duplicate code

2020-12-14 Thread GitBox


boring-cyborg[bot] commented on issue #13070:
URL: https://github.com/apache/airflow/issues/13070#issuecomment-744669631


   Thanks for opening your first issue here! Be sure to follow the issue 
template!
   







[GitHub] [airflow] RikHeijdens commented on issue #13063: Importing entry_point plugins fails on Airflow v1.10.14 and Python 3.8

2020-12-14 Thread GitBox


RikHeijdens commented on issue #13063:
URL: https://github.com/apache/airflow/issues/13063#issuecomment-744664901


   Ah, excellent, I hadn't noticed that the issue was fixed on the `v1-10-test` 
branch. I had only looked at the `v1-10-stable` branch before opening this 
issue. I'll make sure to check that branch before opening an issue the next 
time.
   
   Either way, thanks for the quick response to this bug report :-)
   
   







[GitHub] [airflow] mik-laj commented on issue #12116: Update google-cloud deps to v2

2020-12-14 Thread GitBox


mik-laj commented on issue #12116:
URL: https://github.com/apache/airflow/issues/12116#issuecomment-744660908


   I added a new column "To update?", which specifies whether we can update 
this library or whether we should rather wait until we drop Airflow 1.10 support.







[GitHub] [airflow] ashb commented on a change in pull request #13058: Set timeout for child processes

2020-12-14 Thread GitBox


ashb commented on a change in pull request #13058:
URL: https://github.com/apache/airflow/pull/13058#discussion_r542683198



##
File path: airflow/configuration.py
##
@@ -69,7 +69,9 @@ def run_command(command):
 process = subprocess.Popen(
 shlex.split(command), stdout=subprocess.PIPE, stderr=subprocess.PIPE, 
close_fds=True
 )
-output, stderr = [stream.decode(sys.getdefaultencoding(), 'ignore') for 
stream in process.communicate()]
+output, stderr = [
+stream.decode(sys.getdefaultencoding(), 'ignore') for stream in 
process.communicate(timeout=60)

Review comment:
   This case is for running the command from the `*_CMD` vars, for instance 
the `sql_alchemy_conn_cmd` setting.
   
   60s is fine for most sensible cases (probably too long), but we just can't 
know what users might put in these configs -- it might take a long time in some 
instances.
   
   Perhaps we just document this as a limitation for now, and we can make it 
configurable when someone complains.
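   As an illustration, here is a hedged sketch of `run_command` with the timeout 
made a parameter. The `timeout` argument, the kill-on-timeout handling, and the 
`RuntimeError` are hypothetical additions for this example, not the code under review:

   ```python
   import shlex
   import subprocess
   import sys


   def run_command(command, timeout=60):
       """Run a ``*_CMD``-style config command and return its stdout.

       ``timeout`` (seconds) is a hypothetical parameter showing how the
       hard-coded 60-second limit could be made configurable.
       """
       process = subprocess.Popen(
           shlex.split(command),
           stdout=subprocess.PIPE,
           stderr=subprocess.PIPE,
           close_fds=True,
       )
       try:
           output, stderr = (
               stream.decode(sys.getdefaultencoding(), "ignore")
               for stream in process.communicate(timeout=timeout)
           )
       except subprocess.TimeoutExpired:
           # The child keeps running after TimeoutExpired; kill it and
           # drain the pipes before propagating the error.
           process.kill()
           process.communicate()
           raise
       if process.returncode != 0:
           raise RuntimeError(f"Command failed: {command!r}\n{stderr}")
       return output
   ```

   Note that `communicate(timeout=...)` does not terminate the child by itself; 
the `except` branch has to kill it explicitly, otherwise the process is leaked.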









[GitHub] [airflow] Squigilum edited a comment on issue #12995: Worker never running tasks or failing them with no explanation for many simultaneous tasks

2020-12-14 Thread GitBox


Squigilum edited a comment on issue #12995:
URL: https://github.com/apache/airflow/issues/12995#issuecomment-744655785


   I tried both increasing the allowed connections (from 100 to 250) and 
enabling pgbouncer, and both still had similar errors. For pgbouncer, I tried 
both enabling it with just the default parameters and increasing the 
`maxClientConn` parameter in the chart. I think I tried 250 and 1000 for 
`maxClientConn`. 
   
   I'm attaching the worker logs for my two celery workers. For this run, 
tasks 23, 25, and 26 stayed in the queued state and never ran. 23 is active on 
worker 1, and 25 and 26 are on worker 0. I've also reduced the number of 
concurrent tasks considerably (from 100 to 40) compared to the DAG I initially shared. 
   
   The only thing I noticed that looked abnormal to me is the following warning 
in the logs, but it does not seem to occur near the tasks in question:
   ```
   [2020-12-14 18:52:40,054: WARNING/ForkPoolWorker-6] Failed to log action 
with (psycopg2.DatabaseError) error with status PGRES_TUPLES_OK and no message 
from the libpq
   (Background on this error at: http://sqlalche.me/e/13/4xp6)
   ```
   
   I'm not sure if I grabbed all the relevant log messages, but the worker logs 
for the tasks generally look like this:
   ```
   [worker-0.log](https://github.com/apache/airflow/files/5691079/worker-0.log)
   [2020-12-14 18:48:37,531: INFO/MainProcess] Received task: 
airflow.executors.celery_executor.execute_command[5a07e65a-8ad0-4fbe-83c8-8ea952f3a55d]
   [2020-12-14 18:48:37,532: DEBUG/MainProcess] TaskPool: Apply  
(args:('airflow.executors.celery_executor.execute_command', 
'5a07e65a-8ad0-4fbe-83c8-8ea952f3a55d', {'lang': 'py', 'task': 
'airflow.executors.celery_executor.execute_command', 'id': 
'5a07e65a-8ad0-4fbe-83c8-8ea952f3a55d', 'shadow': None, 'eta': None, 'expires': 
None, 'group': None, 'group_index': None, 'retries': 0, 'timelimit': [None, 
None], 'root_id': '5a07e65a-8ad0-4fbe-83c8-8ea952f3a55d', 'parent_id': None, 
'argsrepr': "[['airflow', 'tasks', 'run', 'run_100_concurrent', '25', 
'2020-12-14T18:47:13.448236+00:00', '--local', '--pool', 'default_pool', 
'--subdir', '/opt/airflow/dags/concurrent_workflow.py']]", 'kwargsrepr': '{}', 
'origin': 'gen148@airflow-scheduler-686f8b7b4-2vlrd', 'reply_to': 
'7a69ddc6-70f1-3417-ae71-92a691da626b', 'correlation_id': 
'5a07e65a-8ad0-4fbe-83c8-8ea952f3a55d', 'hostname': 'celery@airflow-worker-1', 
'delivery_info': {'exchange': '', 'routing_key':
  'celery', 'priority': 0, 'redelivered': None}, 'args': [['airflow', 'tasks', 
'run', 'run_100_concurrent', '25',... kwargs:{})
   [2020-12-14 18:48:37,552: DEBUG/MainProcess] Task accepted: 
airflow.executors.celery_executor.execute_command[5a07e65a-8ad0-4fbe-83c8-8ea952f3a55d]
 pid:23
   [2020-12-14 18:48:37,750: INFO/ForkPoolWorker-5] Executing command in 
Celery: ['airflow', 'tasks', 'run', 'run_100_concurrent', '25', 
'2020-12-14T18:47:13.448236+00:00', '--local', '--pool', 'default_pool', 
'--subdir', '/opt/airflow/dags/concurrent_workflow.py']
   [2020-12-14 18:48:41,337: DEBUG/ForkPoolWorker-5] Calling callbacks: 
[]
   ```
   [worker-0.log](https://github.com/apache/airflow/files/5691087/worker-0.log)
   [worker-1.log](https://github.com/apache/airflow/files/5691089/worker-1.log)
   
[scheduler.log](https://github.com/apache/airflow/files/5691090/scheduler.log)
   







[GitHub] [airflow] Squigilum commented on issue #12995: Worker never running tasks or failing them with no explanation for many simultaneous tasks

2020-12-14 Thread GitBox


Squigilum commented on issue #12995:
URL: https://github.com/apache/airflow/issues/12995#issuecomment-744655785


   I tried both increasing the allowed connections (from 100 to 250) and 
enabling pgbouncer, and both still had similar errors. For pgbouncer, I tried 
both enabling it with just the default parameters and increasing the 
`maxClientConn` parameter in the chart. I think I tried 250 and 1000 for 
`maxClientConn`. 
   
   I'm attaching the worker logs for my two celery workers. For this run, 
tasks 23, 25, and 26 stayed in the queued state and never ran. 23 is active on 
worker 1, and 25 and 26 are on worker 0. I've also reduced the number of 
concurrent tasks considerably (from 100 to 40) compared to the DAG I initially shared. 
   
   The only thing I noticed that looked abnormal is the following warning in 
the logs, but it does not seem to occur near the tasks in question:
   ```
   [2020-12-14 18:52:40,054: WARNING/ForkPoolWorker-6] Failed to log action 
with (psycopg2.DatabaseError) error with status PGRES_TUPLES_OK and no message 
from the libpq
   (Background on this error at: http://sqlalche.me/e/13/4xp6)
   ```
   
   I'm not sure if I grabbed all the relevant log messages, but the worker logs 
for the tasks generally look like this:
   ```
   [worker-0.log](https://github.com/apache/airflow/files/5691079/worker-0.log)
   [2020-12-14 18:48:37,531: INFO/MainProcess] Received task: 
airflow.executors.celery_executor.execute_command[5a07e65a-8ad0-4fbe-83c8-8ea952f3a55d]
   [2020-12-14 18:48:37,532: DEBUG/MainProcess] TaskPool: Apply  
(args:('airflow.executors.celery_executor.execute_command', 
'5a07e65a-8ad0-4fbe-83c8-8ea952f3a55d', {'lang': 'py', 'task': 
'airflow.executors.celery_executor.execute_command', 'id': 
'5a07e65a-8ad0-4fbe-83c8-8ea952f3a55d', 'shadow': None, 'eta': None, 'expires': 
None, 'group': None, 'group_index': None, 'retries': 0, 'timelimit': [None, 
None], 'root_id': '5a07e65a-8ad0-4fbe-83c8-8ea952f3a55d', 'parent_id': None, 
'argsrepr': "[['airflow', 'tasks', 'run', 'run_100_concurrent', '25', 
'2020-12-14T18:47:13.448236+00:00', '--local', '--pool', 'default_pool', 
'--subdir', '/opt/airflow/dags/concurrent_workflow.py']]", 'kwargsrepr': '{}', 
'origin': 'gen148@airflow-scheduler-686f8b7b4-2vlrd', 'reply_to': 
'7a69ddc6-70f1-3417-ae71-92a691da626b', 'correlation_id': 
'5a07e65a-8ad0-4fbe-83c8-8ea952f3a55d', 'hostname': 'celery@airflow-worker-1', 
'delivery_info': {'exchange': '', 'routing_key':
  'celery', 'priority': 0, 'redelivered': None}, 'args': [['airflow', 'tasks', 
'run', 'run_100_concurrent', '25',... kwargs:{})
   [2020-12-14 18:48:37,552: DEBUG/MainProcess] Task accepted: 
airflow.executors.celery_executor.execute_command[5a07e65a-8ad0-4fbe-83c8-8ea952f3a55d]
 pid:23
   [2020-12-14 18:48:37,750: INFO/ForkPoolWorker-5] Executing command in 
Celery: ['airflow', 'tasks', 'run', 'run_100_concurrent', '25', 
'2020-12-14T18:47:13.448236+00:00', '--local', '--pool', 'default_pool', 
'--subdir', '/opt/airflow/dags/concurrent_workflow.py']
   [2020-12-14 18:48:41,337: DEBUG/ForkPoolWorker-5] Calling callbacks: 
[]
   ```
   [worker-0.log](https://github.com/apache/airflow/files/5691087/worker-0.log)
   [worker-1.log](https://github.com/apache/airflow/files/5691089/worker-1.log)
   
[scheduler.log](https://github.com/apache/airflow/files/5691090/scheduler.log)
   







[GitHub] [airflow] manugarri commented on issue #12881: Snowflake python connector monkeypatches urllib and makes many services unusable.

2020-12-14 Thread GitBox


manugarri commented on issue #12881:
URL: https://github.com/apache/airflow/issues/12881#issuecomment-744641325


   FYI, we have been trying to get Snowflake to fix this for months.
   
   As an alternative, I made a [lite version of the snowflake-connector 
package](https://github.com/manugarri/snowflake-connector-python-lite) that I 
use in my Airflow environments, in case someone wants to check it out.







[GitHub] [airflow] kaxil commented on issue #13063: Importing entry_point plugins fails on Airflow v1.10.14 and Python 3.8

2020-12-14 Thread GitBox


kaxil commented on issue #13063:
URL: https://github.com/apache/airflow/issues/13063#issuecomment-744637470


   > I can indeed confirm that this issue does not occur when 
`importlib-metadata` is installed. One thing that is not quite clear to me 
though is why this package did not get installed as part of one of Airflow's 
dependencies by pip when running on Python 3.8. I probably wouldn't have 
stumbled into this issue if that had been the case.
   > 
   > i.e. this statement here should have been updated: 
https://github.com/apache/airflow/blob/1.10.14/setup.py#L624, similar to L442: 
https://github.com/apache/airflow/blob/1.10.14/setup.py#L442
   
   Yeah, the fix I included only updated `importlib-metadata` in the `devel` 
requirements. Hence it updated Breeze and CI, where I tested that change. 
However, I missed the entry in `install_requires`, so it is not installed for 
normal users who do not install `devel`.
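
   For illustration, this kind of fix is usually a PEP 508 environment marker in 
`install_requires`, so pip installs the backport only on Pythons that lack the 
stdlib module. A minimal sketch (the version pin and Python bound here are 
illustrative, not Airflow's actual ones):

```python
# Sketch of a setup.py fragment guarding a backport dependency with a
# PEP 508 environment marker. Version pin and bound are illustrative.
install_requires = [
    "importlib-metadata~=2.0; python_version < '3.9'",
]

# pip evaluates the marker at install time; on newer Pythons the
# requirement is skipped and stdlib importlib.metadata is used instead.
print(install_requires[0].split(";")[0].strip())  # importlib-metadata~=2.0
```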







[GitHub] [airflow] mik-laj opened a new issue #13069: Handwritten argument parser in prepare_provider_packages.py

2020-12-14 Thread GitBox


mik-laj opened a new issue #13069:
URL: https://github.com/apache/airflow/issues/13069


   Hello @potiuk 
   
   I believe you wrote this script - 
[`prepare_provider_packages.py`](https://github.com/apache/airflow/blob/master/dev/provider_packages/prepare_provider_packages.py).
   
   I am wondering if there is any reason why you did not use the standard 
library's 
[`argparse.ArgumentParser`](https://docs.python.org/3/library/argparse.html) in 
this script?
   
https://github.com/apache/airflow/blob/master/dev/provider_packages/prepare_provider_packages.py#L1518-L1552
   
   This makes the code overly complex and, by the way, more prone to errors. 
For example, the `usage()` method still describes the `--version-suffix-for-pypi` 
parameter, but we changed its name to `--version-suffix`.
   
   What do you think about rewriting this script to use the standard library? 
Do we have a requirement that prevents this? I am thinking of something similar 
to [my gist](https://gist.github.com/mik-laj/ff008718fc6cec9fe929731b8c62d6f8).
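
   For illustration, a minimal `argparse`-based sketch of such a CLI; the 
option and subcommand names below are hypothetical, not the script's real 
interface:

```python
import argparse


def build_parser() -> argparse.ArgumentParser:
    # Hypothetical CLI layout: one global option plus subcommands,
    # mirroring what the handwritten parser does by hand.
    parser = argparse.ArgumentParser(description="Prepare provider packages.")
    parser.add_argument(
        "--version-suffix",
        default="",
        help="Suffix appended to the package version for PyPI pre-releases.",
    )
    subparsers = parser.add_subparsers(dest="command", required=True)
    subparsers.add_parser("list-providers", help="List provider packages.")
    build = subparsers.add_parser("build", help="Build the given providers.")
    build.add_argument("providers", nargs="+")
    return parser


args = build_parser().parse_args(["--version-suffix", "rc1", "build", "amazon"])
print(args.command, args.version_suffix, args.providers)  # build rc1 ['amazon']
```

   With this, `--help` and error messages come for free, so the `usage()` text 
can never drift out of sync with the real option names.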
   
   
   
   
   







[airflow] branch v2-0-test updated (e491f62 -> cc87caa)

2020-12-14 Thread kaxilnaik
This is an automated email from the ASF dual-hosted git repository.

kaxilnaik pushed a change to branch v2-0-test
in repository https://gitbox.apache.org/repos/asf/airflow.git.


omit e491f62  Update default versions v2-0-test in the 2.0 branch (#12962)
 add 825e9cb  Fix gpg verification command (#13035)
 add abf2a42  Install airflow and providers from dist and verifies them  
(#13033)
 add 37263d6  Add Vidora to INTHEWILD.md (#13038)
 add 0d49a47  Fixes image building in DockerHub (#13039)
 add ed4926f  🔒 Fix missing HTTPS on airflow site links (#13043)
 add 4d3300c  Refactor plugins command output using AirflowConsole (#13036)
 add 1c1ef7e  Add project_id to client inside BigQuery hook update_table 
method (#13018)
 add ea3d42a  Make AirflowJsonEncoder uses Flask's JSONEncoder as a base  
(#13050)
 add 26c6854  Allows to install Airflow in Breeze from PIP with 
configurable extras (#13055)
 add 6bf9acb  Fix import from core to mysql provider in mysql example DAG 
(#13060)
 add ab5f770  Explicitly shutdown logging in tasks so concurrent.futures 
can be used (#13057)
 new cc87caa  Update default versions v2-0-test in the 2.0 branch (#12962)

This update added new revisions after undoing existing revisions.
That is to say, some revisions that were in the old version of the
branch are not in the new version.  This situation occurs
when a user --force pushes a change and generates a repository
containing something like this:

 * -- * -- B -- O -- O -- O   (e491f62)
\
 N -- N -- N   refs/heads/v2-0-test (cc87caa)

You should already have received notification emails for all of the O
revisions, and so the following emails describe only the N revisions
from the common base, B.

Any revisions marked "omit" are not gone; other references still
refer to them.  Any revisions marked "discard" are gone forever.

The 1 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.


Summary of changes:
 .github/workflows/ci.yml   | 57 +--
 BREEZE.rst |  4 +-
 CONTRIBUTING.rst   |  2 +-
 Dockerfile |  2 +-
 INTHEWILD.md   |  1 +
 UPDATING.md|  6 +-
 airflow/api_connexion/openapi/v1.yaml  |  2 +-
 airflow/cli/cli_parser.py  | 18 +-
 airflow/cli/commands/plugins_command.py| 65 ++
 airflow/cli/commands/task_command.py   |  7 ++-
 airflow/cli/simple_table.py|  8 ++-
 airflow/executors/celery_executor.py   |  2 +
 airflow/executors/local_executor.py|  3 +
 .../dingding/example_dags/example_dingding.py  | 20 +++
 airflow/providers/google/cloud/hooks/bigquery.py   |  2 +-
 .../providers/mysql/example_dags/example_mysql.py  |  2 +-
 airflow/task/task_runner/standard_task_runner.py   |  2 +
 airflow/utils/json.py  |  4 +-
 airflow/www/package.json   |  2 +-
 breeze | 10 ++--
 breeze-complete|  1 +
 dev/README_RELEASE_AIRFLOW.md  |  2 +-
 dev/README_RELEASE_PROVIDER_PACKAGES.md|  2 +-
 dev/provider_packages/SETUP_TEMPLATE.py.jinja2 |  2 +-
 docs/apache-airflow/production-deployment.rst  |  1 +
 docs/apache-airflow/project.rst|  2 +-
 .../ci/{libraries => docker-compose}/_docker.env   | 53 --
 scripts/ci/docker-compose/base.yml | 45 +--
 scripts/ci/docker-compose/local-prod.yml   |  7 ---
 scripts/ci/images/ci_build_dockerhub.sh|  2 +-
 scripts/ci/images/ci_verify_ci_image.sh|  2 +
 scripts/ci/images/ci_verify_prod_image.sh  |  2 +
 scripts/ci/libraries/_build_images.sh  |  7 +--
 scripts/ci/libraries/_initialization.sh|  5 +-
 scripts/ci/static_checks/check_license.sh  |  2 +-
 scripts/ci/tools/ci_clear_tmp.sh   |  2 +-
 scripts/ci/tools/ci_fix_ownership.sh   |  2 +-
 scripts/in_container/_in_container_utils.sh| 26 ++---
 scripts/in_container/entrypoint_ci.sh  | 17 --
 .../run_install_and_test_provider_packages.sh  | 19 +--
 setup.cfg  |  2 +-
 tests/cli/commands/test_plugins_command.py | 25 ++---
 tests/providers/dingding/hooks/test_dingding.py| 22 
 .../providers/google/cloud/hooks/test_bigquery.py  | 52 +
 44 files changed, 289 insertions(+), 232 deletions(-)
 rename scripts/ci/{librar

[GitHub] [airflow] github-actions[bot] commented on pull request #13065: Add retryer to SFTP hook connection

2020-12-14 Thread GitBox


github-actions[bot] commented on pull request #13065:
URL: https://github.com/apache/airflow/pull/13065#issuecomment-744629985


   The PR is likely OK to be merged with just subset of tests for default 
Python and Database versions without running the full matrix of tests, because 
it does not modify the core of Airflow. If the committers decide that the full 
tests matrix is needed, they will add the label 'full tests needed'. Then you 
should rebase to the latest master or amend the last commit of the PR, and push 
it with --force-with-lease.







[airflow] branch v2-0-stable updated (e491f62 -> cc87caa)

2020-12-14 Thread kaxilnaik
This is an automated email from the ASF dual-hosted git repository.

kaxilnaik pushed a change to branch v2-0-stable
in repository https://gitbox.apache.org/repos/asf/airflow.git.


 discard e491f62  Update default versions v2-0-test in the 2.0 branch (#12962)
 add 825e9cb  Fix gpg verification command (#13035)
 add abf2a42  Install airflow and providers from dist and verifies them  
(#13033)
 add 37263d6  Add Vidora to INTHEWILD.md (#13038)
 add 0d49a47  Fixes image building in DockerHub (#13039)
 add ed4926f  🔒 Fix missing HTTPS on airflow site links (#13043)
 add 4d3300c  Refactor plugins command output using AirflowConsole (#13036)
 add 1c1ef7e  Add project_id to client inside BigQuery hook update_table 
method (#13018)
 add ea3d42a  Make AirflowJsonEncoder uses Flask's JSONEncoder as a base  
(#13050)
 add 26c6854  Allows to install Airflow in Breeze from PIP with 
configurable extras (#13055)
 add 6bf9acb  Fix import from core to mysql provider in mysql example DAG 
(#13060)
 add ab5f770  Explicitly shutdown logging in tasks so concurrent.futures 
can be used (#13057)
 add cc87caa  Update default versions v2-0-test in the 2.0 branch (#12962)

This update added new revisions after undoing existing revisions.
That is to say, some revisions that were in the old version of the
branch are not in the new version.  This situation occurs
when a user --force pushes a change and generates a repository
containing something like this:

 * -- * -- B -- O -- O -- O   (e491f62)
\
 N -- N -- N   refs/heads/v2-0-stable (cc87caa)

You should already have received notification emails for all of the O
revisions, and so the following emails describe only the N revisions
from the common base, B.

Any revisions marked "omit" are not gone; other references still
refer to them.  Any revisions marked "discard" are gone forever.

No new revisions were added by this update.

Summary of changes:
 .github/workflows/ci.yml   | 57 +--
 BREEZE.rst |  4 +-
 CONTRIBUTING.rst   |  2 +-
 Dockerfile |  2 +-
 INTHEWILD.md   |  1 +
 UPDATING.md|  6 +-
 airflow/api_connexion/openapi/v1.yaml  |  2 +-
 airflow/cli/cli_parser.py  | 18 +-
 airflow/cli/commands/plugins_command.py| 65 ++
 airflow/cli/commands/task_command.py   |  7 ++-
 airflow/cli/simple_table.py|  8 ++-
 airflow/executors/celery_executor.py   |  2 +
 airflow/executors/local_executor.py|  3 +
 .../dingding/example_dags/example_dingding.py  | 20 +++
 airflow/providers/google/cloud/hooks/bigquery.py   |  2 +-
 .../providers/mysql/example_dags/example_mysql.py  |  2 +-
 airflow/task/task_runner/standard_task_runner.py   |  2 +
 airflow/utils/json.py  |  4 +-
 airflow/www/package.json   |  2 +-
 breeze | 10 ++--
 breeze-complete|  1 +
 dev/README_RELEASE_AIRFLOW.md  |  2 +-
 dev/README_RELEASE_PROVIDER_PACKAGES.md|  2 +-
 dev/provider_packages/SETUP_TEMPLATE.py.jinja2 |  2 +-
 docs/apache-airflow/production-deployment.rst  |  1 +
 docs/apache-airflow/project.rst|  2 +-
 .../ci/{libraries => docker-compose}/_docker.env   | 53 --
 scripts/ci/docker-compose/base.yml | 45 +--
 scripts/ci/docker-compose/local-prod.yml   |  7 ---
 scripts/ci/images/ci_build_dockerhub.sh|  2 +-
 scripts/ci/images/ci_verify_ci_image.sh|  2 +
 scripts/ci/images/ci_verify_prod_image.sh  |  2 +
 scripts/ci/libraries/_build_images.sh  |  7 +--
 scripts/ci/libraries/_initialization.sh|  5 +-
 scripts/ci/static_checks/check_license.sh  |  2 +-
 scripts/ci/tools/ci_clear_tmp.sh   |  2 +-
 scripts/ci/tools/ci_fix_ownership.sh   |  2 +-
 scripts/in_container/_in_container_utils.sh| 26 ++---
 scripts/in_container/entrypoint_ci.sh  | 17 --
 .../run_install_and_test_provider_packages.sh  | 19 +--
 setup.cfg  |  2 +-
 tests/cli/commands/test_plugins_command.py | 25 ++---
 tests/providers/dingding/hooks/test_dingding.py| 22 
 .../providers/google/cloud/hooks/test_bigquery.py  | 52 +
 44 files changed, 289 insertions(+), 232 deletions(-)
 rename scripts/ci/{libraries => docker-compose}/_docker.env (80%)



[airflow] 01/01: Update default versions v2-0-test in the 2.0 branch (#12962)

2020-12-14 Thread kaxilnaik
This is an automated email from the ASF dual-hosted git repository.

kaxilnaik pushed a commit to branch v2-0-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit cc87caa0ce0b31aa29df7bbe90bdcc2426d80ff1
Author: Jarek Potiuk 
AuthorDate: Wed Dec 9 20:25:02 2020 +0100

Update default versions v2-0-test in the 2.0 branch (#12962)
---
 Dockerfile  | 2 +-
 scripts/ci/libraries/_initialization.sh | 4 ++--
 2 files changed, 3 insertions(+), 3 deletions(-)

diff --git a/Dockerfile b/Dockerfile
index 968c657..d22d081 100644
--- a/Dockerfile
+++ b/Dockerfile
@@ -160,7 +160,7 @@ ARG AIRFLOW_EXTRAS
 ARG ADDITIONAL_AIRFLOW_EXTRAS=""
 ENV 
AIRFLOW_EXTRAS=${AIRFLOW_EXTRAS}${ADDITIONAL_AIRFLOW_EXTRAS:+,}${ADDITIONAL_AIRFLOW_EXTRAS}
 
-ARG AIRFLOW_CONSTRAINTS_REFERENCE="constraints-master"
+ARG AIRFLOW_CONSTRAINTS_REFERENCE="constraints-2-0"
 ARG 
AIRFLOW_CONSTRAINTS_LOCATION="https://raw.githubusercontent.com/apache/airflow/${AIRFLOW_CONSTRAINTS_REFERENCE}/constraints-${PYTHON_MAJOR_MINOR_VERSION}.txt";
 ENV AIRFLOW_CONSTRAINTS_LOCATION=${AIRFLOW_CONSTRAINTS_LOCATION}
 
diff --git a/scripts/ci/libraries/_initialization.sh 
b/scripts/ci/libraries/_initialization.sh
index 0775706..51ada03 100644
--- a/scripts/ci/libraries/_initialization.sh
+++ b/scripts/ci/libraries/_initialization.sh
@@ -199,8 +199,8 @@ function initialization::initialize_base_variables() {
 # Determine current branch
 function initialization::initialize_branch_variables() {
 # Default branch used - this will be different in different branches
-export DEFAULT_BRANCH=${DEFAULT_BRANCH="master"}
-export 
DEFAULT_CONSTRAINTS_BRANCH=${DEFAULT_CONSTRAINTS_BRANCH="constraints-master"}
+export DEFAULT_BRANCH=${DEFAULT_BRANCH="v2-0-test"}
+export 
DEFAULT_CONSTRAINTS_BRANCH=${DEFAULT_CONSTRAINTS_BRANCH="constraints-2-0"}
 readonly DEFAULT_BRANCH
 readonly DEFAULT_CONSTRAINTS_BRANCH
 



[GitHub] [airflow] mik-laj edited a comment on pull request #13065: Add retryer to SFTP hook connection

2020-12-14 Thread GitBox


mik-laj edited a comment on pull request #13065:
URL: https://github.com/apache/airflow/pull/13065#issuecomment-744626474


   @TobKed LGTM. Is it ready for merge?







[GitHub] [airflow] mik-laj commented on pull request #13065: Add retryer to SFTP hook connection

2020-12-14 Thread GitBox


mik-laj commented on pull request #13065:
URL: https://github.com/apache/airflow/pull/13065#issuecomment-744626474


   @TobKed LGTM. Is it ready for review?







[airflow] branch master updated (74dc6fb -> 5503951)

2020-12-14 Thread kamilbregula
This is an automated email from the ASF dual-hosted git repository.

kamilbregula pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/airflow.git.


from 74dc6fb  Display version selector for production docs (#13051)
 add 5503951  Update stable version for published docs (#13052)

No new revisions were added by this update.

Summary of changes:
 docs/exts/docs_build/docs_builder.py | 3 +++
 1 file changed, 3 insertions(+)



[airflow] branch master updated (169aa01 -> 74dc6fb)

2020-12-14 Thread kamilbregula
This is an automated email from the ASF dual-hosted git repository.

kamilbregula pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/airflow.git.


from 169aa01  Skip discovering snowflake provider in development mode 
(#13062)
 add 74dc6fb  Display version selector for production docs (#13051)

No new revisions were added by this update.

Summary of changes:
 docs/build_docs.py |  2 +-
 docs/conf.py   | 11 ---
 2 files changed, 9 insertions(+), 4 deletions(-)



[GitHub] [airflow] mik-laj merged pull request #13052: Update stable version for published docs

2020-12-14 Thread GitBox


mik-laj merged pull request #13052:
URL: https://github.com/apache/airflow/pull/13052


   







[GitHub] [airflow] mik-laj merged pull request #13051: Display version selector for production docs

2020-12-14 Thread GitBox


mik-laj merged pull request #13051:
URL: https://github.com/apache/airflow/pull/13051


   







[GitHub] [airflow] mik-laj commented on issue #12982: Rename master branch to main

2020-12-14 Thread GitBox


mik-laj commented on issue #12982:
URL: https://github.com/apache/airflow/issues/12982#issuecomment-744620091


   Similar: https://github.com/apache/airflow/issues/9351







[GitHub] [airflow] mik-laj opened a new pull request #13068: Fetch inventories for third-party services only once

2020-12-14 Thread GitBox


mik-laj opened a new pull request #13068:
URL: https://github.com/apache/airflow/pull/13068


   Part of: https://github.com/apache/airflow/pull/12989/files
   There are similar problems but with third-party services.
   
   







[airflow] branch master updated (387af82 -> 169aa01)

2020-12-14 Thread potiuk
This is an automated email from the ASF dual-hosted git repository.

potiuk pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/airflow.git.


from 387af82  Bump version of sphinx-airflow-theme (#13054)
 add 169aa01  Skip discovering snowflake provider in development mode 
(#13062)

No new revisions were added by this update.

Summary of changes:
 airflow/providers_manager.py | 12 ++--
 tests/core/test_providers_manager.py |  6 --
 2 files changed, 14 insertions(+), 4 deletions(-)



[GitHub] [airflow] potiuk merged pull request #13062: Skip discovering snowflake provider in development mode

2020-12-14 Thread GitBox


potiuk merged pull request #13062:
URL: https://github.com/apache/airflow/pull/13062


   







[GitHub] [airflow] potiuk edited a comment on issue #13063: Importing entry_point plugins fails on Airflow v1.10.14 and Python 3.8

2020-12-14 Thread GitBox


potiuk edited a comment on issue #13063:
URL: https://github.com/apache/airflow/issues/13063#issuecomment-744611311


   This was a pure mistake that we only noticed after the release. It is 
already fixed in the v1-10-test branch, so it will be included if we release 
1.10.15. See https://github.com/apache/airflow/pull/12859 for the history of it 
and 
https://github.com/apache/airflow/commit/e75deee11ab8ed626979f1fe3927049a200ab676
 for the commit fixing it, merged to v1-10-test.







[GitHub] [airflow] potiuk commented on issue #13063: Importing entry_point plugins fails on Airflow v1.10.14 and Python 3.8

2020-12-14 Thread GitBox


potiuk commented on issue #13063:
URL: https://github.com/apache/airflow/issues/13063#issuecomment-744611311


   This was a pure mistake that we only noticed after the release. It is 
already fixed in the v1-10-test branch, so it will be included if we release 
1.10.15. See https://github.com/apache/airflow/pull/12859.







[airflow] branch master updated (ab5f770 -> 387af82)

2020-12-14 Thread xddeng
This is an automated email from the ASF dual-hosted git repository.

xddeng pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/airflow.git.


from ab5f770  Explicitly shutdown logging in tasks so concurrent.futures 
can be used (#13057)
 add 387af82  Bump version of sphinx-airflow-theme (#13054)

No new revisions were added by this update.

Summary of changes:
 setup.py | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)



[GitHub] [airflow] XD-DENG merged pull request #13054: Bump version of sphinx-airflow-theme

2020-12-14 Thread GitBox


XD-DENG merged pull request #13054:
URL: https://github.com/apache/airflow/pull/13054


   







[GitHub] [airflow] RikHeijdens edited a comment on issue #13063: Importing entry_point plugins fails on Airflow v1.10.14 and Python 3.8

2020-12-14 Thread GitBox


RikHeijdens edited a comment on issue #13063:
URL: https://github.com/apache/airflow/issues/13063#issuecomment-744598980


   I can indeed confirm that this issue does not occur when 
`importlib-metadata` is installed. One thing that is not quite clear to me 
though is why this package did not get installed as part of one of Airflow's 
dependencies by pip when running on Python 3.8. I probably wouldn't have 
stumbled into this issue if that had been the case.
   
   i.e. this statement here should have been updated: 
https://github.com/apache/airflow/blob/1.10.14/setup.py#L624, similar to L442: 
https://github.com/apache/airflow/blob/1.10.14/setup.py#L442







[GitHub] [airflow] RikHeijdens commented on issue #13063: Importing entry_point plugins fails on Airflow v1.10.14 and Python 3.8

2020-12-14 Thread GitBox


RikHeijdens commented on issue #13063:
URL: https://github.com/apache/airflow/issues/13063#issuecomment-744598980


   I can indeed confirm that this issue does not occur when 
`importlib-metadata` is installed. One thing that is not quite clear to me 
though is why this package did not get installed as part of one of Airflow's 
dependencies when running on Python 3.8. I probably wouldn't have stumbled into 
this issue if that had been the case.
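
   The usual pattern for code that must run both with and without the backport 
is a fallback import. A sketch under that assumption (the entry-point group 
name here is hypothetical, and Airflow's actual import logic may differ):

```python
try:
    from importlib import metadata  # stdlib since Python 3.8
except ImportError:  # older Pythons fall back to the backport package
    import importlib_metadata as metadata  # type: ignore

# Collect entry points for a (hypothetical) plugin group. The stdlib API
# returns different container types across versions, so normalize to a list.
eps = metadata.entry_points()
if hasattr(eps, "select"):  # Python 3.10+
    plugins = list(eps.select(group="airflow.plugins"))
else:  # Python 3.8 / 3.9: a mapping of group -> entry points
    plugins = list(eps.get("airflow.plugins", ()))
print(type(plugins).__name__)  # list
```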







[GitHub] [airflow] potiuk commented on issue #13045: Airflow constraints file constraints-1.10.7/constraints-3.7 incorrect constraints preventing install

2020-12-14 Thread GitBox


potiuk commented on issue #13045:
URL: https://github.com/apache/airflow/issues/13045#issuecomment-744591339


   > While I don't doubt that there are several problems with pip 20.3, as you 
documented in the attached issues, I don't think this one falls under them.
   > The constraints file for 1.10.7 is plainly and simply wrong, which is not 
surprising since it was created for 1.10.10.
   > As I wrote above it includes a constraint for `future==0.18.2` while the 
setup.py for that version includes the requirement for `future>=0.16.0, <0.17`, 
so any good resolver would indicate that there is a mismatch.
   
   The problem is that before 20.3, pip was happily accepting and resolving 
this anyway.
   
   But I think there is nothing to fix now anyway. 1.10.7 is prehistory and we 
are recommending everyone to migrate to 1.10.14, which has good constraints 
(those constraints BTW do not work with pip 20.3 anyway). So, as with any other 
problems in previous releases, 1.10.14 + pip 20.2.4 (for now) solves the 
problems we know about (including installation problems). I don't think any 
other action is needed at this stage.
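
   The mismatch is easy to check by hand: a strict resolver simply compares 
the pinned version against the declared range. A small pure-Python sketch, 
using the versions quoted above:

```python
# setup.py for 1.10.7 declares future>=0.16.0,<0.17, while the constraints
# file pins future==0.18.2; a strict resolver must reject this combination.
pinned = (0, 18, 2)
lower, upper = (0, 16, 0), (0, 17, 0)

# Tuple comparison stands in for real version comparison here.
satisfies = lower <= pinned < upper
print(satisfies)  # False: pip >= 20.3 errors out where older pip was lenient
```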






