[GitHub] [airflow] Sangarshanan commented on issue #11048: Create FernetEnabledRule to ease upgrade to Airflow 2.0

2020-09-22 Thread GitBox


Sangarshanan commented on issue #11048:
URL: https://github.com/apache/airflow/issues/11048#issuecomment-696545319


   Can pick this up : )



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] Sangarshanan commented on issue #11046: Create LoggingConfigurationRule to ease upgrade to Airflow 2.0

2020-09-22 Thread GitBox


Sangarshanan commented on issue #11046:
URL: https://github.com/apache/airflow/issues/11046#issuecomment-696549576


   Would love to take a crack at this 😄  







[GitHub] [airflow] kaxil merged pull request #11024: Get Airflow configs with sensitive data from CloudSecretManagerBackend

2020-09-22 Thread GitBox


kaxil merged pull request #11024:
URL: https://github.com/apache/airflow/pull/11024


   







[airflow] branch master updated: Get Airflow configs with sensitive data from CloudSecretManagerBackend (#11024)

2020-09-22 Thread kaxilnaik
This is an automated email from the ASF dual-hosted git repository.

kaxilnaik pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/airflow.git


The following commit(s) were added to refs/heads/master by this push:
 new cb979f9  Get Airflow configs with sensitive data from 
CloudSecretManagerBackend (#11024)
cb979f9 is described below

commit cb979f9f213bb3c9835a3dc924f84a07f5387378
Author: Kaxil Naik 
AuthorDate: Tue Sep 22 08:17:58 2020 +0100

Get Airflow configs with sensitive data from CloudSecretManagerBackend 
(#11024)
---
 .../providers/google/cloud/secrets/secret_manager.py | 14 ++
 .../google-cloud-secret-manager-backend.rst  |  2 ++
 docs/security/secrets/secrets-backend/index.rst  |  3 +++
 .../google/cloud/secrets/test_secret_manager.py  | 20 
 4 files changed, 39 insertions(+)

diff --git a/airflow/providers/google/cloud/secrets/secret_manager.py 
b/airflow/providers/google/cloud/secrets/secret_manager.py
index fed5e0e..ffc7b4d 100644
--- a/airflow/providers/google/cloud/secrets/secret_manager.py
+++ b/airflow/providers/google/cloud/secrets/secret_manager.py
@@ -57,6 +57,9 @@ class CloudSecretManagerBackend(BaseSecretsBackend, 
LoggingMixin):
 :type connections_prefix: str
 :param variables_prefix: Specifies the prefix of the secret to read to get 
Variables.
 :type variables_prefix: str
+:param config_prefix: Specifies the prefix of the secret to read to get 
Airflow Configurations
+containing secrets.
+:type config_prefix: str
 :param gcp_key_path: Path to Google Cloud Service Account key file (JSON). 
Mutually exclusive with
 gcp_keyfile_dict. use default credentials in the current environment 
if not provided.
 :type gcp_key_path: str
@@ -75,6 +78,7 @@ class CloudSecretManagerBackend(BaseSecretsBackend, 
LoggingMixin):
 self,
 connections_prefix: str = "airflow-connections",
 variables_prefix: str = "airflow-variables",
+config_prefix: str = "airflow-config",
 gcp_keyfile_dict: Optional[dict] = None,
 gcp_key_path: Optional[str] = None,
 gcp_scopes: Optional[str] = None,
@@ -85,6 +89,7 @@ class CloudSecretManagerBackend(BaseSecretsBackend, 
LoggingMixin):
 super().__init__(**kwargs)
 self.connections_prefix = connections_prefix
 self.variables_prefix = variables_prefix
+self.config_prefix = config_prefix
 self.sep = sep
 if not self._is_valid_prefix_and_sep():
 raise AirflowException(
@@ -129,6 +134,15 @@ class CloudSecretManagerBackend(BaseSecretsBackend, 
LoggingMixin):
 """
 return self._get_secret(self.variables_prefix, key)
 
+def get_config(self, key: str) -> Optional[str]:
+"""
+Get Airflow Configuration
+
+:param key: Configuration Option Key
+:return: Configuration Option Value
+"""
+return self._get_secret(self.config_prefix, key)
+
 def _get_secret(self, path_prefix: str, secret_id: str) -> Optional[str]:
 """
 Get secret value from the SecretManager based on prefix.
diff --git 
a/docs/security/secrets/secrets-backend/google-cloud-secret-manager-backend.rst 
b/docs/security/secrets/secrets-backend/google-cloud-secret-manager-backend.rst
index bbca274..1435d73 100644
--- 
a/docs/security/secrets/secrets-backend/google-cloud-secret-manager-backend.rst
+++ 
b/docs/security/secrets/secrets-backend/google-cloud-secret-manager-backend.rst
@@ -116,11 +116,13 @@ The name of the secret must fit the following formats:
 
 * for connection: ``[connections_prefix][sep][connection_name]``
 * for variable: ``[variables_prefix][sep][variable_name]``
+ * for Airflow config: ``[config_prefix][sep][config_name]``
 
 where:
 
  * ``connections_prefix`` - fixed value defined in the ``connections_prefix`` 
parameter in backend configuration. Default: ``airflow-connections``.
 * ``variables_prefix`` - fixed value defined in the ``variables_prefix`` parameter in backend configuration. Default: ``airflow-variables``.
+ * ``config_prefix`` - fixed value defined in the ``config_prefix`` parameter 
in backend configuration. Default: ``airflow-config``.
  * ``sep`` - fixed value defined in the ``sep`` parameter in backend 
configuration. Default: ``-``.
 
 The Cloud Secrets Manager secret name should follow the pattern 
``^[a-zA-Z0-9-_]*$``.
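The naming scheme above composes each secret id from a prefix, a separator, and the option name. A minimal sketch of that composition plus the validation pattern follows; the helper name `build_secret_id` is illustrative and not part of `CloudSecretManagerBackend`:

```python
import re

# Secret Manager only accepts names matching the pattern quoted above;
# `-` is placed last in the character class so it stays a literal, not a range.
SECRET_NAME_RE = re.compile(r"^[a-zA-Z0-9_-]*$")

def build_secret_id(prefix: str, sep: str, name: str) -> str:
    """Compose `[prefix][sep][name]` and validate it against the allowed pattern."""
    secret_id = f"{prefix}{sep}{name}"
    if not SECRET_NAME_RE.match(secret_id):
        raise ValueError(f"invalid secret name: {secret_id!r}")
    return secret_id

# With the defaults from the diff (config_prefix="airflow-config", sep="-"),
# a config option stored as `sql-alchemy-conn` resolves to:
print(build_secret_id("airflow-config", "-", "sql-alchemy-conn"))
# airflow-config-sql-alchemy-conn
```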
diff --git a/docs/security/secrets/secrets-backend/index.rst 
b/docs/security/secrets/secrets-backend/index.rst
index 81526d7..0c03708 100644
--- a/docs/security/secrets/secrets-backend/index.rst
+++ b/docs/security/secrets/secrets-backend/index.rst
@@ -31,6 +31,9 @@ such as :ref:`Google Cloud Secret 
Manager`,
 The Airflow UI only shows connections and variables stored in the Metadata 
DB and not via any other method.
If you use an alternative secrets backend, check inside your backend to view the values of your variables and connections.


[GitHub] [airflow] olchas commented on a change in pull request #11061: Add template fields renderers for better UI rendering

2020-09-22 Thread GitBox


olchas commented on a change in pull request #11061:
URL: https://github.com/apache/airflow/pull/11061#discussion_r492527472



##
File path: airflow/models/baseoperator.py
##
@@ -271,6 +271,9 @@ class derived from this one results in the creation of a 
task object,
 template_fields: Iterable[str] = ()
 # Defines which files extensions to look for in the templated fields
 template_ext: Iterable[str] = ()
+# Template filed renderers indicating type of the filed for example sql, 
json

Review comment:
   ```suggestion
   # Template field renderers indicating type of the field, for example 
sql, json
   ```









[GitHub] [airflow] baolsen opened a new issue #11076: Add bulk "clear" to Browse -> DAG Runs UI

2020-09-22 Thread GitBox


baolsen opened a new issue #11076:
URL: https://github.com/apache/airflow/issues/11076


   **Description**
   
   Airflow DAG Run UI (Browse -> DAG Runs) is really useful for managing many 
DAG Runs, creating specific DAG Runs and so on.
   It would be great to have an additional option to "With Selected DAG Runs" 
-> "Clear" to reset all task instances under those DAG Runs.
   
   **Use case / motivation**
   
   When rerunning DAGs, especially during development cycles, it is tedious to go 
into the Tree view and clear many DAG Runs manually. It would be great to have 
some way to do this in bulk / over a range of DAG Runs.
   
   The DAG Run UI seems like a good place to add this since it is already a 
kind of administrative area over all the DAG Runs and has "with selected" 
options like "delete", "mark failed", etc. Discussion welcome :)
   
   **Related Issues**
   None
   







[GitHub] [airflow] mik-laj commented on issue #11077: Autogenerated reference docs for Helm Chart

2020-09-22 Thread GitBox


mik-laj commented on issue #11077:
URL: https://github.com/apache/airflow/issues/11077#issuecomment-696578609


   @flvndh Can you help with it?







[GitHub] [airflow] mik-laj opened a new issue #11077: Autogenerated reference docs for Helm Chart

2020-09-22 Thread GitBox


mik-laj opened a new issue #11077:
URL: https://github.com/apache/airflow/issues/11077


   Hello,
   
   We have a JSON Schema definition for values.yaml - 
[values.schema.json](https://github.com/apache/airflow/blob/master/chart/values.schema.json). 
This could help us generate user-friendly reference documentation. Currently [the 
documentation](https://github.com/apache/airflow/tree/master/chart#parameters) 
is not updated very often and does not describe all fields. I don't know of a tool 
that can do this, but I imagine it could be displayed as a tree. There may be a 
ready-made tool for generating documentation from Helm charts, or a generic 
documentation generator for JSON Schema, but I haven't checked yet.
   
   This will allow us to keep the reference documentation up-to-date and to 
make it more user-friendly.
   
   Best regards,
   Kamil Breguła
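The idea in the issue - walking a JSON Schema and rendering its properties as an indented tree - can be sketched in a few lines. The schema literal below is a made-up fragment for illustration, not the real Airflow chart schema:

```python
import json

def walk(schema, name="", depth=0):
    """Recursively collect `name: description` lines, indenting by nesting depth."""
    lines = []
    if name:
        desc = schema.get("description", "")
        lines.append("  " * depth + (f"{name}: {desc}" if desc else name))
    for prop, sub in schema.get("properties", {}).items():
        # children of the (nameless) root stay at depth 0
        lines.extend(walk(sub, prop, depth + (1 if name else 0)))
    return lines

schema = json.loads("""
{
  "properties": {
    "images": {
      "description": "Docker images used by the chart",
      "properties": {
        "airflow": {"description": "Airflow image"}
      }
    }
  }
}
""")
for line in walk(schema):
    print(line)
# images: Docker images used by the chart
#   airflow: Airflow image
```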
   
   







[GitHub] [airflow] michalslowikowski00 commented on a change in pull request #10814: Add LocalToAzureDataLakeStorageOperator

2020-09-22 Thread GitBox


michalslowikowski00 commented on a change in pull request #10814:
URL: https://github.com/apache/airflow/pull/10814#discussion_r492557117



##
File path: airflow/providers/microsoft/azure/transfers/local_to_adls.py
##
@@ -0,0 +1,96 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+from typing import Dict, Any, Optional
+from airflow import AirflowException
+from airflow.models import BaseOperator
+from airflow.providers.microsoft.azure.hooks.azure_data_lake import 
AzureDataLakeHook
+from airflow.utils.decorators import apply_defaults
+
+
+class AzureDataLakeStorageUploadOperator(BaseOperator):
+"""
+Upload file(s) to Azure Data Lake
+
+:param local_path: local path. Can be single file, directory (in which 
case,
+upload recursively) or glob pattern. Recursive glob patterns using 
`**`
+are not supported.
+:type local_path: str
+:param remote_path: Remote path to upload to; if multiple files, this is 
the
+directory root to write within.
+:type remote_path: str
+:param nthreads: Number of threads to use. If None, uses the number of 
cores.
+:type nthreads: int
+:param overwrite: Whether to forcibly overwrite existing files/directories.
+If False and remote path is a directory, will quit regardless if 
any files
+would be overwritten or not. If True, only matching filenames are 
actually
+overwritten.
+:type overwrite: bool
+:param buffersize: int [2**22]
+Number of bytes for internal buffer. This block cannot be bigger 
than
+a chunk and cannot be smaller than a block.
+:type buffersize: int
+:param blocksize: int [2**22]
+Number of bytes for a block. Within each chunk, we write a smaller
+block for each API call. This block cannot be bigger than a chunk.
+:type blocksize: int
+:param extra_upload_options: Extra upload options to add to the hook 
upload method
+:type extra_upload_options: dict
+:param azure_data_lake_conn_id: Reference to the Azure Data Lake 
connection.
+:type azure_data_lake_conn_id: str
+"""
+
+@apply_defaults
+def __init__(
+self,
+*,
+local_path: str,
+remote_path: str,
+overwrite: bool = True,
+nthreads: int = 64,
+buffersize: int = 4194304,
+blocksize: int = 4194304,
+extra_upload_options: Optional[Dict[str, Any]] = None,
+azure_data_lake_conn_id: str = 'azure_data_lake_default',
+**kwargs,
+):
+super().__init__(**kwargs)
+self.local_path = local_path
+self.remote_path = remote_path
+self.overwrite = overwrite
+self.nthreads = nthreads
+self.buffersize = buffersize
+self.blocksize = blocksize
+self.extra_upload_options = extra_upload_options
+self.azure_data_lake_conn_id = azure_data_lake_conn_id
+
+def execute(self, context):
+if '**' in self.local_path:
+raise AirflowException("Recursive glob patterns using `**` are not 
supported")
+if not self.extra_upload_options:

Review comment:
   Do you need to pass empty dict in execute?
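The pattern the reviewer is questioning - normalizing an optional options dict inside `execute()` - is usually handled with `or {}` at the call site instead of mutating the attribute. A sketch, where `upload` is a stand-in for the hook call and not the real `AzureDataLakeHook` API:

```python
from typing import Any, Dict, Optional

def upload(local_path: str, remote_path: str, **options: Any) -> Dict[str, Any]:
    # stand-in that just echoes what it was called with
    return {"local_path": local_path, "remote_path": remote_path, **options}

def execute(local_path: str, remote_path: str,
            extra_upload_options: Optional[Dict[str, Any]] = None) -> Dict[str, Any]:
    # `or {}` avoids the `if not self.extra_upload_options: ... = {}` dance
    # and never mutates the stored attribute
    return upload(local_path, remote_path, **(extra_upload_options or {}))

print(execute("/tmp/a.csv", "remote/a.csv"))
# {'local_path': '/tmp/a.csv', 'remote_path': 'remote/a.csv'}
```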









[GitHub] [airflow] michalslowikowski00 commented on a change in pull request #10814: Add LocalToAzureDataLakeStorageOperator

2020-09-22 Thread GitBox


michalslowikowski00 commented on a change in pull request #10814:
URL: https://github.com/apache/airflow/pull/10814#discussion_r492558461



##
File path: airflow/providers/microsoft/azure/transfers/local_to_adls.py
##
@@ -0,0 +1,96 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+from typing import Dict, Any, Optional
+from airflow import AirflowException
+from airflow.models import BaseOperator
+from airflow.providers.microsoft.azure.hooks.azure_data_lake import 
AzureDataLakeHook
+from airflow.utils.decorators import apply_defaults
+
+
+class AzureDataLakeStorageUploadOperator(BaseOperator):
+"""
+Upload file(s) to Azure Data Lake
+
+:param local_path: local path. Can be single file, directory (in which 
case,
+upload recursively) or glob pattern. Recursive glob patterns using 
`**`
+are not supported.

Review comment:
   We are trying to avoid full stops in documentation.
   ```suggestion
   are not supported
   ```









[jira] [Commented] (AIRFLOW-4470) RBAC Github Enterprise OAuth provider callback URL?

2020-09-22 Thread Nidhi Chourasia (Jira)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-4470?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17199922#comment-17199922
 ] 

Nidhi Chourasia commented on AIRFLOW-4470:
--

[~coopergillan] I think it would be a good contribution to the documentation if 
you add the values under the OAUTH_PROVIDERS section in webserver_config.py, as many 
might not stumble across this to find a solution.
The link for the same is 
[https://flask-appbuilder.readthedocs.io/en/latest/security.html]

 

> RBAC Github Enterprise OAuth provider callback URL?
> ---
>
> Key: AIRFLOW-4470
> URL: https://issues.apache.org/jira/browse/AIRFLOW-4470
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: authentication, webserver
>Affects Versions: 1.10.2
>Reporter: Geez
>Assignee: Massipssa Kerrache
>Priority: Blocker
>  Labels: usability
> Attachments: airflow_ss0_2.PNG, airflow_sso3.PNG, airflow_sso4.PNG, 
> image-2019-10-30-16-25-14-436.png, image-2019-10-31-11-47-04-041.png
>
>
> Hi all,
> Quick question, when using RBAC with OAuth providers (1.10.2):
>  * we are not specifying the {{authenticate}} or {{auth_backend}} in the 
> [webserver] section of \{{airflow.cfg}}anymore
>  * Instead, we set the OAuth provider config in the flask-appbuilder's 
> {{webserver_config.py}}:
> {code:java}
>  
> # Adapting Google OAuth example to Github:
> OAUTH_PROVIDERS = [
> {'name':'github', 'icon':'fa-github', 'token_key':'access_token',
>  'remote_app': {
> 'base_url':'https://github.corporate-domain.com/login',
> 
> 'access_token_url':'https://github.corporate-domain.com/login/oauth/access_token',
> 
> 'authorize_url':'https://github.corporate-domain.com/login/oauth/authorize',
> 'request_token_url': None,
> 'consumer_key': '',
> 'consumer_secret': 'X',
>  }
> }
> ]
>  
> {code}
>  _Question:_
>  * so what callback URL do we specify in the app? 
> {{http:/webapp/ghe_oauth/callback}} would not work right? (example with 
> github entreprise)
> No matter what I specify for the callback url (/ghe_oauth/callback or 
> [http://webapp.com|http://webapp.com/]), I get an error message about 
> {{redirect_uri}} mismatch:
> {code:java}
> {{error=redirect_uri_mismatch&error_description=The+redirect_uri+MUST+match+the+registered+callback+URL+for+this+application
>  }}{code}
> _Docs ref:_
>  Here is how you setup OAuth with Github Entreprise on Airflow _*without*_ 
> RBAC: 
> [https://airflow.apache.org/security.html#github-enterprise-ghe-authentication]
> And here is how you setup OAuth via the {{webserver_config.py}} of 
> flask_appbuilder used by airflow _*with*_RBAC:
>  
> [https://flask-appbuilder.readthedocs.io/en/latest/security.html#authentication-oauth]
> What's the *callback url* when using RBAC and OAuth with Airflow?
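For context, Flask-AppBuilder's OAuth view serves its callback at `/oauth-authorized/<provider>`, so the URL to register with the OAuth app is derived from the webserver base URL. A sketch, with a placeholder host:

```python
def oauth_callback_url(webserver_base: str, provider: str) -> str:
    """Build the callback URL Flask-AppBuilder expects for an OAuth provider.

    The /oauth-authorized/<provider> route comes from flask-appbuilder's
    security views; `webserver_base` is a placeholder for your deployment.
    """
    return f"{webserver_base.rstrip('/')}/oauth-authorized/{provider}"

print(oauth_callback_url("https://airflow.example.com", "github"))
# https://airflow.example.com/oauth-authorized/github
```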



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[GitHub] [airflow] mik-laj commented on pull request #11055: Fix fail to convert Pendulum to MySQL datetime

2020-09-22 Thread GitBox


mik-laj commented on pull request #11055:
URL: https://github.com/apache/airflow/pull/11055#issuecomment-696585151


   Can you add unit tests to avoid regression, please?
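The PR's underlying problem - MySQL drivers generally reject timezone-aware datetimes - is typically fixed by normalizing to naive UTC before binding the value. A stdlib sketch of that conversion (Pendulum's `DateTime` subclasses `datetime`, so the same approach applies; this is an illustration, not the fix the PR actually uses):

```python
from datetime import datetime, timezone, timedelta

def to_naive_utc(dt: datetime) -> datetime:
    """Convert an aware datetime to naive UTC; pass naive values through."""
    if dt.tzinfo is None:
        return dt  # already naive; assume it is UTC
    return dt.astimezone(timezone.utc).replace(tzinfo=None)

aware = datetime(2020, 9, 22, 8, 17, 58, tzinfo=timezone(timedelta(hours=1)))
print(to_naive_utc(aware))
# 2020-09-22 07:17:58
```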







[GitHub] [airflow] turbaszek commented on issue #11048: Create FernetEnabledRule to ease upgrade to Airflow 2.0

2020-09-22 Thread GitBox


turbaszek commented on issue #11048:
URL: https://github.com/apache/airflow/issues/11048#issuecomment-696587130


   @Sangarshanan I assigned you 👌 







[GitHub] [airflow] turbaszek commented on issue #11046: Create LoggingConfigurationRule to ease upgrade to Airflow 2.0

2020-09-22 Thread GitBox


turbaszek commented on issue #11046:
URL: https://github.com/apache/airflow/issues/11046#issuecomment-696587165


   @Sangarshanan I assigned you 👌 







[GitHub] [airflow] mik-laj commented on issue #11059: Classifiers seem inaccurate

2020-09-22 Thread GitBox


mik-laj commented on issue #11059:
URL: https://github.com/apache/airflow/issues/11059#issuecomment-696588122


   @pgagnon It makes sense. Can I assign you to this task so that you open a PR?
   
   I always have a problem with this classifier, but I checked and we have 
too strict restrictions.
   https://user-images.githubusercontent.com/12058428/93861040-3321e080-fcc0-11ea-8f07-2694c87ae15d.png
   
   
   







[GitHub] [airflow] turbaszek commented on issue #11060: Webserver error: 'NoneType' object has no attribute 'last_loaded'

2020-09-22 Thread GitBox


turbaszek commented on issue #11060:
URL: https://github.com/apache/airflow/issues/11060#issuecomment-696589322


   @kaxil have you waited until the task is executed? In my case, the error 
appeared once the task was executed 







[GitHub] [airflow] turbaszek merged pull request #11072: Add some tasks using BashOperator in TaskGroup example dag

2020-09-22 Thread GitBox


turbaszek merged pull request #11072:
URL: https://github.com/apache/airflow/pull/11072


   







[GitHub] [airflow] turbaszek merged pull request #11071: Replace Airflow Slack Invite old link to short link

2020-09-22 Thread GitBox


turbaszek merged pull request #11071:
URL: https://github.com/apache/airflow/pull/11071


   







[airflow] branch master updated (cb979f9 -> 45639cd)

2020-09-22 Thread turbaszek
This is an automated email from the ASF dual-hosted git repository.

turbaszek pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/airflow.git.


from cb979f9  Get Airflow configs with sensitive data from 
CloudSecretManagerBackend (#11024)
 add 45639cd  Add some tasks using BashOperator in TaskGroup example dag 
(#11072)

No new revisions were added by this update.

Summary of changes:
 airflow/example_dags/example_task_group.py | 5 +++--
 1 file changed, 3 insertions(+), 2 deletions(-)



[GitHub] [airflow] flvndh commented on issue #11077: Autogenerated reference docs for Helm Chart

2020-09-22 Thread GitBox


flvndh commented on issue #11077:
URL: https://github.com/apache/airflow/issues/11077#issuecomment-696590838


   It would be a pleasure 😁







[GitHub] [airflow] michalslowikowski00 commented on pull request #10814: Add LocalToAzureDataLakeStorageOperator

2020-09-22 Thread GitBox


michalslowikowski00 commented on pull request #10814:
URL: https://github.com/apache/airflow/pull/10814#issuecomment-696590747


   Looks good to me. 
   Personally I would love to see:
   - example DAG 
(https://github.com/apache/airflow/tree/master/airflow/providers/microsoft/azure/example_dags)
   - system test 
(https://github.com/apache/airflow/tree/master/tests/providers/microsoft/azure/operators)
   - documentation 
(https://github.com/apache/airflow/tree/master/docs/howto/operator)
   
   There are no docs about Azure and Microsoft operators at all. You could be a 
pioneer and add docs and make a good example for future contributors. :)







[GitHub] [airflow] michalslowikowski00 edited a comment on pull request #10814: Add LocalToAzureDataLakeStorageOperator

2020-09-22 Thread GitBox


michalslowikowski00 edited a comment on pull request #10814:
URL: https://github.com/apache/airflow/pull/10814#issuecomment-696590747


   Looks good to me. 
   Personally I would love to see:
   - example DAG 
(https://github.com/apache/airflow/tree/master/airflow/providers/microsoft/azure/example_dags)
   - system test 
(https://github.com/apache/airflow/tree/master/tests/providers/microsoft/azure/operators)
   - documentation 
(https://github.com/apache/airflow/tree/master/docs/howto/operator)
   
   There are no docs about Azure and Microsoft operators at all. You could be a 
pioneer and add docs and make a good example for future contributors. :)
   
   I am sorry for such a late review. :(








[airflow] branch master updated (45639cd -> e3a5900)

2020-09-22 Thread turbaszek
This is an automated email from the ASF dual-hosted git repository.

turbaszek pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/airflow.git.


from 45639cd  Add some tasks using BashOperator in TaskGroup example dag 
(#11072)
 add e3a5900  Replace Airflow Slack Invite old link to short link (#11071)

No new revisions were added by this update.

Summary of changes:
 .github/boring-cyborg.yml | 2 +-
 BREEZE.rst                | 2 +-
 CONTRIBUTING.rst          | 4 ++--
 docs/build_docs.py        | 2 +-
 docs/project.rst          | 2 +-
 5 files changed, 6 insertions(+), 6 deletions(-)






[GitHub] [airflow] ephraimbuddy commented on a change in pull request #10814: Add LocalToAzureDataLakeStorageOperator

2020-09-22 Thread GitBox


ephraimbuddy commented on a change in pull request #10814:
URL: https://github.com/apache/airflow/pull/10814#discussion_r492573833



##
File path: airflow/providers/microsoft/azure/transfers/local_to_adls.py
##
@@ -0,0 +1,96 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+from typing import Dict, Any, Optional
+from airflow import AirflowException
+from airflow.models import BaseOperator
+from airflow.providers.microsoft.azure.hooks.azure_data_lake import AzureDataLakeHook
+from airflow.utils.decorators import apply_defaults
+
+
+class AzureDataLakeStorageUploadOperator(BaseOperator):
+    """
+    Upload file(s) to Azure Data Lake
+
+    :param local_path: local path. Can be single file, directory (in which case,
+        upload recursively) or glob pattern. Recursive glob patterns using `**`
+        are not supported.
+    :type local_path: str
+    :param remote_path: Remote path to upload to; if multiple files, this is the
+        directory root to write within.
+    :type remote_path: str
+    :param nthreads: Number of threads to use. If None, uses the number of cores.
+    :type nthreads: int
+    :param overwrite: Whether to forcibly overwrite existing files/directories.
+        If False and remote path is a directory, will quit regardless if any files
+        would be overwritten or not. If True, only matching filenames are actually
+        overwritten.
+    :type overwrite: bool
+    :param buffersize: int [2**22]
+        Number of bytes for internal buffer. This block cannot be bigger than
+        a chunk and cannot be smaller than a block.
+    :type buffersize: int
+    :param blocksize: int [2**22]
+        Number of bytes for a block. Within each chunk, we write a smaller
+        block for each API call. This block cannot be bigger than a chunk.
+    :type blocksize: int
+    :param extra_upload_options: Extra upload options to add to the hook upload method
+    :type extra_upload_options: dict
+    :param azure_data_lake_conn_id: Reference to the Azure Data Lake connection.
+    :type azure_data_lake_conn_id: str
+    """
+
+    @apply_defaults
+    def __init__(
+        self,
+        *,
+        local_path: str,
+        remote_path: str,
+        overwrite: bool = True,
+        nthreads: int = 64,
+        buffersize: int = 4194304,
+        blocksize: int = 4194304,
+        extra_upload_options: Optional[Dict[str, Any]] = None,
+        azure_data_lake_conn_id: str = 'azure_data_lake_default',
+        **kwargs,
+    ):
+        super().__init__(**kwargs)
+        self.local_path = local_path
+        self.remote_path = remote_path
+        self.overwrite = overwrite
+        self.nthreads = nthreads
+        self.buffersize = buffersize
+        self.blocksize = blocksize
+        self.extra_upload_options = extra_upload_options
+        self.azure_data_lake_conn_id = azure_data_lake_conn_id
+
+    def execute(self, context):
+        if '**' in self.local_path:
+            raise AirflowException("Recursive glob patterns using `**` are not supported")
+        if not self.extra_upload_options:

Review comment:
   Hi, sorry, I don't really get the question. If it's about
extra_upload_options, I think an empty dictionary is OK instead of None, or
what do you advise?
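A minimal sketch of the idiom under discussion, in plain Python with a hypothetical class name: accept `None` in the signature and normalize to an empty dict in the body, which avoids the shared-mutable-default pitfall of writing `= {}` directly in the signature.

```python
from typing import Any, Dict, Optional


class UploadConfig:
    """Hypothetical stand-in for the operator's option handling."""

    def __init__(self, extra_upload_options: Optional[Dict[str, Any]] = None):
        # `x or {}` turns None (and {}) into a fresh empty dict, so every
        # instance gets its own dictionary rather than a shared default.
        self.extra_upload_options = extra_upload_options or {}


print(UploadConfig().extra_upload_options)                  # {}
print(UploadConfig({"nthreads": 8}).extra_upload_options)   # {'nthreads': 8}
```

Either convention works for the operator; the normalization just spares every caller of the hook from handling `None`.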




























[GitHub] [airflow] olchas commented on a change in pull request #11061: Add template fields renderers for better UI rendering

2020-09-22 Thread GitBox


olchas commented on a change in pull request #11061:
URL: https://github.com/apache/airflow/pull/11061#discussion_r492583493



##
File path: docs/howto/custom-operator.rst
##
@@ -200,6 +200,35 @@ with actual value. Note that Jinja substitutes the 
operator attributes and not t
 
 In the example, the ``template_fields`` should be ``['guest_name']`` and not  
``['name']``
 
+Additionally you may provide ``template_fields_renderers`` dictionary which 
defines in what style the value
+from template field renders in Web UI. For example:
+
+.. code-block:: python
+
+class MyRequestOperator(BaseOperator):
+    template_fields = ['request_body']
+    template_fields_renderers = {'request_body': 'json'}
+
+    @apply_defaults
+    def __init__(
+        self,
+        request_body: str,
+        **kwargs) -> None:
+        super().__init__(**kwargs)
+        self.request_body = name

Review comment:
   ```suggestion
   self.request_body = request_body
   ```









[GitHub] [airflow] turbaszek opened a new pull request #11078: Fix s.apache.org Slack link

2020-09-22 Thread GitBox


turbaszek opened a new pull request #11078:
URL: https://github.com/apache/airflow/pull/11078


   Fixes #11071 
   
   ---
   **^ Add meaningful description above**
   
   Read the **[Pull Request 
Guidelines](https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst#pull-request-guidelines)**
 for more information.
   In case of fundamental code change, Airflow Improvement Proposal 
([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals))
 is needed.
   In case of a new dependency, check compliance with the [ASF 3rd Party 
License Policy](https://www.apache.org/legal/resolved.html#category-x).
   In case of backwards incompatible changes please leave a note in 
[UPDATING.md](https://github.com/apache/airflow/blob/master/UPDATING.md).
   







[GitHub] [airflow] olchas commented on a change in pull request #11061: Add template fields renderers for better UI rendering

2020-09-22 Thread GitBox


olchas commented on a change in pull request #11061:
URL: https://github.com/apache/airflow/pull/11061#discussion_r492584078



##
File path: docs/howto/custom-operator.rst
##
@@ -200,6 +200,35 @@ with actual value. Note that Jinja substitutes the 
operator attributes and not t
 
 In the example, the ``template_fields`` should be ``['guest_name']`` and not  
``['name']``
 
+Additionally you may provide ``template_fields_renderers`` dictionary which 
defines in what style the value
+from template field renders in Web UI. For example:
+
+.. code-block:: python
+
+class MyRequestOperator(BaseOperator):
+    template_fields = ['request_body']
+    template_fields_renderers = {'request_body': 'json'}
+
+    @apply_defaults
+    def __init__(
+        self,
+        request_body: str,
+        **kwargs) -> None:
+        super().__init__(**kwargs)
+        self.request_body = name
+
+Currently available lexers:
+
+  - bash
+  - doc
+  - json
+  - md
+  - py
+  - rst
+  - sql
+  - yaml
+
+If you will use non existing lexer the value of the template filed be rendered 
as pretty printed object.

Review comment:
   ```suggestion
   If you use a non existing lexer then the value of the template field will be 
rendered as a pretty printed object.
   ```
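The fallback described in that suggestion can be sketched in plain Python, outside of Airflow. The lexer names are the list from the doc under review; the rendering itself is simulated here, since the real highlighting lives in the web UI.

```python
# Sketch of the described behavior: render with the named lexer when it is
# known, otherwise fall back to pretty-printing the value. Illustrative only.
from pprint import pformat

KNOWN_LEXERS = {"bash", "doc", "json", "md", "py", "rst", "sql", "yaml"}


def render_template_field(value, renderer):
    if renderer in KNOWN_LEXERS:
        return f"[{renderer}] {value}"   # stand-in for syntax highlighting
    return pformat(value)                # unknown lexer: pretty-printed object


print(render_template_field('{"a": 1}', "json"))  # [json] {"a": 1}
print(render_template_field({"a": 1}, "nope"))    # {'a': 1}
```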









[GitHub] [airflow] ephraimbuddy commented on pull request #10814: Add LocalToAzureDataLakeStorageOperator

2020-09-22 Thread GitBox


ephraimbuddy commented on pull request #10814:
URL: https://github.com/apache/airflow/pull/10814#issuecomment-696601230


   > Looks good to me.
   > Personally I would love to see:
   > 
   > * example DAG 
(https://github.com/apache/airflow/tree/master/airflow/providers/microsoft/azure/example_dags)
   > * system test 
(https://github.com/apache/airflow/tree/master/tests/providers/microsoft/azure/operators)
   > * documentation 
(https://github.com/apache/airflow/tree/master/docs/howto/operator)
   > 
   > There are no docs about Azure and Microsoft operators at all. You could be 
a pioneer and add docs and make a good example for future contributors. :)
   > 
   > I am sorry for such a late review. :(
   
   Yeah, I plan to actively work on example DAGs for this provider and some 
others, but I'm thinking of starting that in a separate PR. Currently, there's 
no SystemTest class for this provider. Do you suggest I add everything in this 
PR?







[GitHub] [airflow] turbaszek commented on pull request #11066: Fix rules auto-detection for upgrade check

2020-09-22 Thread GitBox


turbaszek commented on pull request #11066:
URL: https://github.com/apache/airflow/pull/11066#issuecomment-696601649


   @kaxil @dimberman it seems that docs are failing:
   ```
   /opt/airflow/docs/executor/kubernetes.rst:72: WARNING: image file not readable: executor/../img/k8s-happy-path.png
   /opt/airflow/docs/executor/kubernetes.rst:98: WARNING: image file not readable: executor/../img/k8s-failed-pod.png
   ```







[GitHub] [airflow] turbaszek commented on issue #11051: Create TaskHandlersMovedRule to ease upgrade to Airflow 2.0

2020-09-22 Thread GitBox


turbaszek commented on issue #11051:
URL: https://github.com/apache/airflow/issues/11051#issuecomment-696603530


   @ephraimbuddy this is more about checking whether the user's configuration in 
airflow.cfg uses old task handlers in `task_log_reader`. Similar to #11067
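A hedged sketch of such a configuration check, mirroring the shape of the rule in #11067. The legacy handler names below are illustrative assumptions, not taken from the Airflow codebase.

```python
# Illustrative sketch: flag a task_log_reader value that still points at a
# pre-2.0 task handler. The handler names here are assumptions for the example.
LEGACY_TASK_LOG_READERS = {"file.task", "s3.task", "gcs.task", "wasb.task"}


def uses_legacy_task_handler(task_log_reader):
    """Return True when the configured reader matches a known legacy name."""
    return task_log_reader in LEGACY_TASK_LOG_READERS


print(uses_legacy_task_handler("s3.task"))  # True
print(uses_legacy_task_handler("task"))     # False
```

A real rule would read the value via `conf.get` and report a migration message instead of returning a bool.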







[GitHub] [airflow] turbaszek commented on a change in pull request #11067: Add SendGrid emailer rule

2020-09-22 Thread GitBox


turbaszek commented on a change in pull request #11067:
URL: https://github.com/apache/airflow/pull/11067#discussion_r492588547



##
File path: airflow/upgrade/rules/send_grid_moved.py
##
@@ -0,0 +1,38 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+from __future__ import absolute_import
+
+from airflow.configuration import conf
+from airflow.upgrade.rules.base_rule import BaseRule
+
+
+class SendGridEmailerMovedRule(BaseRule):
+    title = "SendGrid email uses old airflow.contrib module"
+
+    description = """
+    Removes the need for SendGrid email code to be in contrib package. The SendGrid \
+    email function has been moved to airflow.providers.
+    """
+
+    def check(self):
+        email_conf = conf.get(section="email", key="email_backend")
+        if email_conf.startswith("airflow.contrib"):

Review comment:
   I think it would be better to compare it to the old SendGrid path
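A sketch of the suggested comparison, assuming the legacy dotted path below; the exact pre-2.0 config value should be verified against the old contrib module before relying on it.

```python
# Assumed legacy email_backend value for the pre-2.0 SendGrid emailer.
OLD_SENDGRID_BACKEND = "airflow.contrib.utils.sendgrid.send_email"


def is_old_sendgrid_backend(email_backend):
    # An exact match avoids flagging every airflow.contrib backend,
    # only the SendGrid one this rule is about.
    return email_backend == OLD_SENDGRID_BACKEND


print(is_old_sendgrid_backend("airflow.contrib.utils.sendgrid.send_email"))  # True
print(is_old_sendgrid_backend("airflow.contrib.utils.other.send_email"))     # False
```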









[GitHub] [airflow] ephraimbuddy edited a comment on pull request #10814: Add LocalToAzureDataLakeStorageOperator

2020-09-22 Thread GitBox


ephraimbuddy edited a comment on pull request #10814:
URL: https://github.com/apache/airflow/pull/10814#issuecomment-696601230


   > Looks good to me.
   > Personally I would love to see:
   > 
   > * example DAG 
(https://github.com/apache/airflow/tree/master/airflow/providers/microsoft/azure/example_dags)
   > * system test 
(https://github.com/apache/airflow/tree/master/tests/providers/microsoft/azure/operators)
   > * documentation 
(https://github.com/apache/airflow/tree/master/docs/howto/operator)
   > 
   > There are no docs about Azure and Microsoft operators at all. You could be 
a pioneer and add docs and make a good example for future contributors. :)
   > 
   > I am sorry for such a late review. :(
   
   Yeah, I plan to actively work on example DAGs for this provider and some 
others, but I'm thinking of starting that in a separate PR. Currently, there's 
no SystemTest class for this provider. Do you suggest I start SystemTests with 
this PR, or create another one and do example DAGs with tests in separate PRs? 







[GitHub] [airflow] kaxil commented on pull request #11078: Fix s.apache.org Slack link

2020-09-22 Thread GitBox


kaxil commented on pull request #11078:
URL: https://github.com/apache/airflow/pull/11078#issuecomment-696608139


   Good catch







[GitHub] [airflow] michalslowikowski00 commented on pull request #10814: Add LocalToAzureDataLakeStorageOperator

2020-09-22 Thread GitBox


michalslowikowski00 commented on pull request #10814:
URL: https://github.com/apache/airflow/pull/10814#issuecomment-696609039


   > > Looks good to me.
   > > Personally I would love to see:
   > > 
   > > * example DAG 
(https://github.com/apache/airflow/tree/master/airflow/providers/microsoft/azure/example_dags)
   > > * system test 
(https://github.com/apache/airflow/tree/master/tests/providers/microsoft/azure/operators)
   > > * documentation 
(https://github.com/apache/airflow/tree/master/docs/howto/operator)
   > > 
   > > There are no docs about Azure and Microsoft operators at all. You could 
be a pioneer and add docs and make a good example for future contributors. :)
   > > I am sorry for such a late review. :(
   > 
   > Yeah, I plan to actively work on example dags on this provider and some 
others but I'm thinking of starting it up on a separate PR. Currently, there's 
no SystemTest class for this provider. Do you suggest I start SystemTests with 
this PR, or create another one and do example DAGs with tests in separate PRs?
   
   IMHO the SystemTest could be in a separate PR, but the example DAG should be in this PR.







[GitHub] [airflow] michalslowikowski00 edited a comment on pull request #10814: Add LocalToAzureDataLakeStorageOperator

2020-09-22 Thread GitBox


michalslowikowski00 edited a comment on pull request #10814:
URL: https://github.com/apache/airflow/pull/10814#issuecomment-696609039


   > > Looks good to me.
   > > Personally I would love to see:
   > > 
   > > * example DAG 
(https://github.com/apache/airflow/tree/master/airflow/providers/microsoft/azure/example_dags)
   > > * system test 
(https://github.com/apache/airflow/tree/master/tests/providers/microsoft/azure/operators)
   > > * documentation 
(https://github.com/apache/airflow/tree/master/docs/howto/operator)
   > > 
   > > There are no docs about Azure and Microsoft operators at all. You could 
be a pioneer and add docs and make a good example for future contributors. :)
   > > I am sorry for such a late review. :(
   > 
   > Yeah, I plan to actively work on example dags on this provider and some 
others but I'm thinking of starting it up on a separate PR. Currently, there's 
no SystemTest class for this provider. Do you suggest I start SystemTests with 
this PR, or create another one and do example DAGs with tests in separate PRs?
   
   IMHO the SystemTest could be in a separate PR, but the example DAG should be 
in this PR. For a SystemTest you need an example DAG anyway.







[GitHub] [airflow] ephraimbuddy commented on issue #11051: Create TaskHandlersMovedRule to ease upgrade to Airflow 2.0

2020-09-22 Thread GitBox


ephraimbuddy commented on issue #11051:
URL: https://github.com/apache/airflow/issues/11051#issuecomment-696610662


   Got it







[GitHub] [airflow] turbaszek merged pull request #11078: Fix s.apache.org Slack link

2020-09-22 Thread GitBox


turbaszek merged pull request #11078:
URL: https://github.com/apache/airflow/pull/11078


   







[airflow] branch master updated (e3a5900 -> 29d6297)

2020-09-22 Thread turbaszek
This is an automated email from the ASF dual-hosted git repository.

turbaszek pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/airflow.git.


from e3a5900  Replace Airflow Slack Invite old link to short link (#11071)
 add 29d6297  Fix s.apache.org Slack link (#11078)

No new revisions were added by this update.

Summary of changes:
 .github/boring-cyborg.yml | 2 +-
 BREEZE.rst                | 2 +-
 CONTRIBUTING.rst          | 2 +-
 docs/build_docs.py        | 2 +-
 docs/project.rst          | 2 +-
 5 files changed, 5 insertions(+), 5 deletions(-)



[GitHub] [airflow] ephraimbuddy commented on pull request #10814: Add LocalToAzureDataLakeStorageOperator

2020-09-22 Thread GitBox


ephraimbuddy commented on pull request #10814:
URL: https://github.com/apache/airflow/pull/10814#issuecomment-696613659


   Alright. Thanks










[GitHub] [airflow] michalslowikowski00 commented on pull request #10814: Add LocalToAzureDataLakeStorageOperator

2020-09-22 Thread GitBox


michalslowikowski00 commented on pull request #10814:
URL: https://github.com/apache/airflow/pull/10814#issuecomment-696615944


   if you need help, let me know.







[GitHub] [airflow] pcandoalmeida commented on a change in pull request #11067: Add SendGrid emailer rule

2020-09-22 Thread GitBox


pcandoalmeida commented on a change in pull request #11067:
URL: https://github.com/apache/airflow/pull/11067#discussion_r492613065



##
File path: airflow/upgrade/rules/send_grid_moved.py
##
@@ -0,0 +1,38 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+from __future__ import absolute_import
+
+from airflow.configuration import conf
+from airflow.upgrade.rules.base_rule import BaseRule
+
+
+class SendGridEmailerMovedRule(BaseRule):
+title = "SendGrid email uses old airflow.contrib module"
+
+description = """
+Removes the need for SendGrid email code to be in contrib package. The 
SendGrid email function \
+has been moved to airflow.providers.
+"""
+
+def check(self):
+email_conf = conf.get(section="email", key="email_backend")
+if email_conf.startswith("airflow.contrib"):

Review comment:
   Hi @turbaszek! Do you mean `airflow.contrib.utils.sendgrid`? I figured 
if we pick up config with `airflow.contrib` it will be the older package as 
opposed to `airflow.providers`.









[GitHub] [airflow] turbaszek commented on a change in pull request #11067: Add SendGrid emailer rule

2020-09-22 Thread GitBox


turbaszek commented on a change in pull request #11067:
URL: https://github.com/apache/airflow/pull/11067#discussion_r492637570



##
File path: airflow/upgrade/rules/send_grid_moved.py
##
@@ -0,0 +1,38 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+from __future__ import absolute_import
+
+from airflow.configuration import conf
+from airflow.upgrade.rules.base_rule import BaseRule
+
+
+class SendGridEmailerMovedRule(BaseRule):
+title = "SendGrid email uses old airflow.contrib module"
+
+description = """
+Removes the need for SendGrid email code to be in contrib package. The 
SendGrid email function \
+has been moved to airflow.providers.
+"""
+
+def check(self):
+email_conf = conf.get(section="email", key="email_backend")
+if email_conf.startswith("airflow.contrib"):

Review comment:
   > Do you mean airflow.contrib.utils.sendgrid
   
   Yes, this gives better precision than just `airflow.contrib`, and this rule 
focuses on SendGrid, not the whole contrib package 😉 
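
   The more precise prefix check being discussed can be sketched as a small predicate. This is a minimal sketch, not the PR's actual `SendGridEmailerMovedRule.check` implementation, and the helper name `uses_old_sendgrid_backend` is hypothetical:

```python
def uses_old_sendgrid_backend(email_backend):
    """Flag a backend that still points at the pre-2.0 contrib
    location of the SendGrid emailer, instead of matching any
    airflow.contrib module."""
    return email_backend.startswith("airflow.contrib.utils.sendgrid")

# The old contrib path is flagged, the new providers path is not,
# and unrelated contrib modules are left alone.
assert uses_old_sendgrid_backend("airflow.contrib.utils.sendgrid.send_email")
assert not uses_old_sendgrid_backend("airflow.providers.sendgrid.utils.emailer.send_email")
assert not uses_old_sendgrid_backend("airflow.contrib.hooks.ftp_hook")
```

   Matching the full `airflow.contrib.utils.sendgrid` prefix keeps this rule from firing on other contrib modules, which have their own upgrade rules.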









[GitHub] [airflow] pcandoalmeida commented on a change in pull request #11067: Add SendGrid emailer rule

2020-09-22 Thread GitBox


pcandoalmeida commented on a change in pull request #11067:
URL: https://github.com/apache/airflow/pull/11067#discussion_r492648824



##
File path: airflow/upgrade/rules/send_grid_moved.py
##
@@ -0,0 +1,38 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+from __future__ import absolute_import
+
+from airflow.configuration import conf
+from airflow.upgrade.rules.base_rule import BaseRule
+
+
+class SendGridEmailerMovedRule(BaseRule):
+title = "SendGrid email uses old airflow.contrib module"
+
+description = """
+Removes the need for SendGrid email code to be in contrib package. The 
SendGrid email function \
+has been moved to airflow.providers.
+"""
+
+def check(self):
+email_conf = conf.get(section="email", key="email_backend")
+if email_conf.startswith("airflow.contrib"):

Review comment:
   Sure, makes sense 👌🏼









[GitHub] [airflow] yuqian90 commented on issue #11060: Webserver error: 'NoneType' object has no attribute 'last_loaded'

2020-09-22 Thread GitBox


yuqian90 commented on issue #11060:
URL: https://github.com/apache/airflow/issues/11060#issuecomment-696660085


   I did the same steps as described, triggered a DagRun and waited until some 
tasks executed. I could not reproduce the webserver error.
   
   I did get an error in the `create_cluster` TaskInstance log, but that's 
probably because of my setup:
   ```
   google.auth.exceptions.DefaultCredentialsError: Could not automatically 
determine credentials. Please set GOOGLE_APPLICATION_CREDENTIALS or explicitly 
create credentials and re-run the application. For more information, please see 
https://cloud.google.com/docs/authentication/getting-started
   ```







[GitHub] [airflow] potiuk opened a new pull request #11079: Improves deletion of old artifacts.

2020-09-22 Thread GitBox


potiuk opened a new pull request #11079:
URL: https://github.com/apache/airflow/pull/11079


   We introduced deletion of old artifacts because this was a
   suspected culprit of Kubernetes Job failures. It eventually turned
   out that those failures were caused by the #11017 change, but it's
   good to do housekeeping of the artifacts anyway.
   
   The delete workflow action, introduced in a hurry, had three problems:
   
   * it runs for every fork whenever they sync master. This is a bit
     too invasive
   
   * it fails continuously after 10-30 minutes every time,
     as we have too many old artifacts to delete (GitHub has a
     90-day retention policy, so we likely have tens of
     thousands of artifacts to delete)
   
   * it runs every hour and causes occasional API rate-limit
     exhaustion (because we have too many artifacts to loop through)
   
   This PR introduces filtering by repo, changes the deletion frequency
   to 4 times a day, and adds a script that we are running manually to
   delete the excessive artifacts now. Eventually, when the number of
   artifacts goes down, the regular job should delete maybe a few
   hundred artifacts appearing within each 6-hour window,
   and it should stop failing.
   
   
   ---
   **^ Add meaningful description above**
   
   Read the **[Pull Request 
Guidelines](https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst#pull-request-guidelines)**
 for more information.
   In case of fundamental code change, Airflow Improvement Proposal 
([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals))
 is needed.
   In case of a new dependency, check compliance with the [ASF 3rd Party 
License Policy](https://www.apache.org/legal/resolved.html#category-x).
   In case of backwards incompatible changes please leave a note in 
[UPDATING.md](https://github.com/apache/airflow/blob/master/UPDATING.md).
   







[GitHub] [airflow] kaxil commented on a change in pull request #11079: Improves deletion of old artifacts.

2020-09-22 Thread GitBox


kaxil commented on a change in pull request #11079:
URL: https://github.com/apache/airflow/pull/11079#discussion_r492667776



##
File path: .github/workflows/delete_old_artifacts.yml
##
@@ -1,11 +1,12 @@
 name: 'Delete old artifacts'
 on:
   schedule:
-- cron: '0 * * * *' # every hour
+- cron: '27 */6 * * *' # run every 6 hours

Review comment:
   I would say "Every day" would be good enough, WDYT?









[GitHub] [airflow] potiuk commented on a change in pull request #11079: Improves deletion of old artifacts.

2020-09-22 Thread GitBox


potiuk commented on a change in pull request #11079:
URL: https://github.com/apache/airflow/pull/11079#discussion_r492667994



##
File path: dev/remove_artifacts.sh
##
@@ -0,0 +1,84 @@
+#!/usr/bin/env bash

Review comment:
   I know we could have written it in Python/octokit, but I found a 
bash script that worked out of the box. I improved it slightly and refactored it 
according to the Google Shell Style Guide, and it seems to do its job (running it now in 
the background).









[GitHub] [airflow] kaxil commented on issue #11060: Webserver error: 'NoneType' object has no attribute 'last_loaded'

2020-09-22 Thread GitBox


kaxil commented on issue #11060:
URL: https://github.com/apache/airflow/issues/11060#issuecomment-696669114


   > @kaxil have you waited until the task was executed? In my case, the error 
appeared once the task was executed
   
   Like @yuqian90, my task failed because I didn't have the creds set up, 
but even after the failure I was able to browse across the views.







[GitHub] [airflow] potiuk commented on a change in pull request #11079: Improves deletion of old artifacts.

2020-09-22 Thread GitBox


potiuk commented on a change in pull request #11079:
URL: https://github.com/apache/airflow/pull/11079#discussion_r492672151



##
File path: .github/workflows/delete_old_artifacts.yml
##
@@ -1,11 +1,12 @@
 name: 'Delete old artifacts'
 on:
   schedule:
-- cron: '0 * * * *' # every hour
+- cron: '27 */6 * * *' # run every 6 hours

Review comment:
   There is a bit of a problem: we cannot rate-limit the action, and we have 
occasionally exhausted the limit already.
   I did some back-of-the-envelope calculations:
   
   The limit is 5,000 API calls/hr. So if we have > 3,000 artifacts to delete, 
we are getting dangerously close to exhausting our API calls within a 
single run.
   
   We have (tops) ~200 builds a day with (tops) ~50 artifacts each (assuming an 
intensive period and an increasing number of artifacts) => ~10,000 artifacts to 
delete a day. Running it 4 times/day means ~2,500 artifacts to delete on each 
run.
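
   The back-of-the-envelope numbers above can be checked mechanically. Every input below is one of the comment's own upper-bound assumptions, not a measured value:

```python
# Assumptions taken from the comment above.
API_LIMIT_PER_HOUR = 5000    # GitHub REST API rate limit per token
builds_per_day = 200         # upper-bound estimate
artifacts_per_build = 50     # upper-bound estimate
runs_per_day = 4             # cron '27 */6 * * *' fires every 6 hours

artifacts_per_day = builds_per_day * artifacts_per_build
artifacts_per_run = artifacts_per_day // runs_per_day

assert artifacts_per_day == 10_000
assert artifacts_per_run == 2_500
# One deletion costs at least one API call, so each run consumes
# roughly half the hourly budget under these worst-case assumptions.
assert artifacts_per_run < API_LIMIT_PER_HOUR
```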
   









[GitHub] [airflow] pcandoalmeida commented on pull request #11032: Add D202 pydocstyle check

2020-09-22 Thread GitBox


pcandoalmeida commented on pull request #11032:
URL: https://github.com/apache/airflow/pull/11032#issuecomment-696673880


   OK sure @kaxil 😃 







[GitHub] [airflow] feluelle commented on a change in pull request #11020: Fix AWS DataSync tests failing (#10985)

2020-09-22 Thread GitBox


feluelle commented on a change in pull request #11020:
URL: https://github.com/apache/airflow/pull/11020#discussion_r492674454



##
File path: setup.py
##
@@ -455,8 +455,7 @@ def write_version(filename: str = os.path.join(*[my_dir, 
"airflow", "git_version
 'ipdb',
 'jira',
 'mongomock',
-'moto==1.3.14',  # TODO - fix Datasync issues to get higher version of 
moto:
- #See: 
https://github.com/apache/airflow/issues/10985
+'moto=>1.13.16',

Review comment:
   You mean `1.3.16` not `1.13.16` ?!









[GitHub] [airflow] potiuk commented on a change in pull request #11079: Improves deletion of old artifacts.

2020-09-22 Thread GitBox


potiuk commented on a change in pull request #11079:
URL: https://github.com/apache/airflow/pull/11079#discussion_r492675211



##
File path: .github/workflows/delete_old_artifacts.yml
##
@@ -1,11 +1,12 @@
 name: 'Delete old artifacts'
 on:
   schedule:
-- cron: '0 * * * *' # every hour
+- cron: '27 */6 * * *' # run every 6 hours

Review comment:
   BTW. I am running the deletion now using the script, at ~1,000 
deletions/hour (I also hit the rate limit with my personal token). With those 
assumptions and the 90-day retention period, it will take ~10 days to delete all the 
old artifacts (assuming 30% of the 10,000/day = 3,000/day * 90 days, deleted at 
1,000/hour, that's 270 hours / 24 ≈ 10-11 days :). 
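
   Spelling out the arithmetic behind the ~10-day estimate (every input is an assumption stated in the comment above):

```python
deletion_rate_per_hour = 1_000           # observed script throughput
new_artifacts_per_day = 10_000 * 0.30    # the 30% assumption => 3,000/day
retention_days = 90                      # GitHub artifact retention period

backlog = new_artifacts_per_day * retention_days    # 270,000 artifacts
hours_needed = backlog / deletion_rate_per_hour     # 270 hours
days_needed = hours_needed / 24                     # ~11 days

assert backlog == 270_000
assert hours_needed == 270
assert 10 <= days_needed <= 12
```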









[GitHub] [airflow] feluelle commented on a change in pull request #11020: Fix AWS DataSync tests failing (#10985)

2020-09-22 Thread GitBox


feluelle commented on a change in pull request #11020:
URL: https://github.com/apache/airflow/pull/11020#discussion_r492675290



##
File path: setup.py
##
@@ -455,8 +455,7 @@ def write_version(filename: str = os.path.join(*[my_dir, 
"airflow", "git_version
 'ipdb',
 'jira',
 'mongomock',
-'moto==1.3.14',  # TODO - fix Datasync issues to get higher version of 
moto:
- #See: 
https://github.com/apache/airflow/issues/10985
+'moto>=1.13.16',

Review comment:
   ```suggestion
   'moto>=1.3.16',
   ```











[GitHub] [airflow] turbaszek commented on issue #11060: Webserver error: 'NoneType' object has no attribute 'last_loaded'

2020-09-22 Thread GitBox


turbaszek commented on issue #11060:
URL: https://github.com/apache/airflow/issues/11060#issuecomment-696676392


   Hm, interesting I will clean up my environment and will do some tests







[GitHub] [airflow] turbaszek merged pull request #11066: Fix rules auto-detection for upgrade check

2020-09-22 Thread GitBox


turbaszek merged pull request #11066:
URL: https://github.com/apache/airflow/pull/11066


   







[GitHub] [airflow] turbaszek commented on pull request #11067: Add SendGrid emailer rule

2020-09-22 Thread GitBox


turbaszek commented on pull request #11067:
URL: https://github.com/apache/airflow/pull/11067#issuecomment-696677294


   @pcandoalmeida can you rebase please? I just merged #11066 







[airflow] branch v1-10-test updated: Fix rules auto-detection for upgrade check (#11066)

2020-09-22 Thread turbaszek
This is an automated email from the ASF dual-hosted git repository.

turbaszek pushed a commit to branch v1-10-test
in repository https://gitbox.apache.org/repos/asf/airflow.git


The following commit(s) were added to refs/heads/v1-10-test by this push:
 new 33dd68b  Fix rules auto-detection for upgrade check (#11066)
33dd68b is described below

commit 33dd68ba23f596850beeb6e7e63d559a8fc0d34b
Author: Tomek Urbaszek 
AuthorDate: Tue Sep 22 14:03:40 2020 +0200

Fix rules auto-detection for upgrade check (#11066)

The rules were auto-registered using metaclass but first
we would need to load modules that include the classes.
So it's simpler to just load all rule classes and avoid
metaclass.
---
 airflow/upgrade/checker.py                             |  5 +++--
 airflow/upgrade/problem.py                             |  2 +-
 airflow/upgrade/rules/__init__.py                      | 18 ++
 airflow/upgrade/rules/base_rule.py                     | 16 +---
 tests/upgrade/rules/test_base_rule.py                  | 13 +++--
 tests/upgrade/rules/test_conn_type_is_not_nullable.py  |  3 ++-
 tests/upgrade/test_problem.py                          |  4 ++--
 7 files changed, 34 insertions(+), 27 deletions(-)

diff --git a/airflow/upgrade/checker.py b/airflow/upgrade/checker.py
index e8c6837..af01413 100644
--- a/airflow/upgrade/checker.py
+++ b/airflow/upgrade/checker.py
@@ -20,9 +20,10 @@ from typing import List
 
 from airflow.upgrade.formatters import BaseFormatter
 from airflow.upgrade.problem import RuleStatus
-from airflow.upgrade.rules.base_rule import BaseRule, RULES
+from airflow.upgrade.rules import get_rules
+from airflow.upgrade.rules.base_rule import BaseRule
 
-ALL_RULES = [cls() for cls in RULES]  # type: List[BaseRule]
+ALL_RULES = [cls() for cls in get_rules()]  # type: List[BaseRule]
 
 
 def check_upgrade(formatter):
diff --git a/airflow/upgrade/problem.py b/airflow/upgrade/problem.py
index d5960e6..e3de490 100644
--- a/airflow/upgrade/problem.py
+++ b/airflow/upgrade/problem.py
@@ -31,7 +31,7 @@ class RuleStatus(NamedTuple(
 
 @property
 def is_success(self):
-return bool(self.messages)
+return len(self.messages) == 0
 
 @classmethod
 def from_rule(cls, rule):
diff --git a/airflow/upgrade/rules/__init__.py 
b/airflow/upgrade/rules/__init__.py
index 13a8339..4735c7f 100644
--- a/airflow/upgrade/rules/__init__.py
+++ b/airflow/upgrade/rules/__init__.py
@@ -14,3 +14,21 @@
 # KIND, either express or implied.  See the License for the
 # specific language governing permissions and limitations
 # under the License.
+import os
+
+
+def get_rules():
+"""Automatically discover all rules"""
+rule_classes = []
+path = os.path.dirname(os.path.abspath(__file__))
+for file in os.listdir(path):
+if not file.endswith(".py") or file in ("__init__.py", "base_rule.py"):
+continue
+py_file = file[:-3]
+mod = __import__(".".join([__name__, py_file]), fromlist=[py_file])
+classes = [getattr(mod, x) for x in dir(mod) if 
isinstance(getattr(mod, x), type)]
+for cls in classes:
+bases = [b.__name__ for b in cls.__bases__]
+if cls.__name__ != "BaseRule" and "BaseRule" in bases:
+rule_classes.append(cls)
+return rule_classes
diff --git a/airflow/upgrade/rules/base_rule.py 
b/airflow/upgrade/rules/base_rule.py
index 75ebe2f..c80ec77 100644
--- a/airflow/upgrade/rules/base_rule.py
+++ b/airflow/upgrade/rules/base_rule.py
@@ -15,24 +15,10 @@
 # specific language governing permissions and limitations
 # under the License.
 
-from abc import ABCMeta, abstractmethod
+from abc import abstractmethod
 
-from six import add_metaclass
 
-RULES = []
-
-
-class BaseRuleMeta(ABCMeta):
-def __new__(cls, clsname, bases, attrs):
-clazz = super(BaseRuleMeta, cls).__new__(cls, clsname, bases, attrs)
-if clsname != "BaseRule":
-RULES.append(clazz)
-return clazz
-
-
-@add_metaclass(BaseRuleMeta)
 class BaseRule(object):
-
 @property
 @abstractmethod
 def title(self):
diff --git a/tests/upgrade/rules/test_base_rule.py 
b/tests/upgrade/rules/test_base_rule.py
index dbf47e3..5338b91 100644
--- a/tests/upgrade/rules/test_base_rule.py
+++ b/tests/upgrade/rules/test_base_rule.py
@@ -15,12 +15,13 @@
 # specific language governing permissions and limitations
 # under the License.
 
-from airflow.upgrade.rules.base_rule import BaseRule, RULES
+from airflow.upgrade.rules import get_rules
+from airflow.upgrade.rules.conn_type_is_not_nullable import 
ConnTypeIsNotNullableRule
+from airflow.upgrade.rules.base_rule import BaseRule
 
 
 class TestBaseRule:
-def test_if_custom_rule_is_registered(self):
-class CustomRule(BaseRule):
-pass
-
-assert CustomRule in list(RULES)
+def test_rules_are_registered(self):
+rule_classes = get_rules()
+assert BaseRule not in rule_c
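
The commit above scans the rules directory so that every rule module is imported before the subclasses are collected. As a standalone sketch of the same registration contract (the rule class names here are illustrative, and `__subclasses__()` only sees classes whose defining modules have already been imported, which is exactly why the real `get_rules` walks the directory first):

```python
class BaseRule:
    """Base class for upgrade-check rules."""

class FernetEnabledRule(BaseRule):
    pass

class SendGridEmailerMovedRule(BaseRule):
    pass

def get_rules():
    # Once the defining modules are imported, the direct subclasses
    # of BaseRule are exactly the registered rules; BaseRule itself
    # is never included.
    return list(BaseRule.__subclasses__())

rules = get_rules()
assert BaseRule not in rules
assert FernetEnabledRule in rules
assert SendGridEmailerMovedRule in rules
```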

[GitHub] [airflow] feluelle commented on a change in pull request #11004: Pandas behaviour for None changed in 1.1.2

2020-09-22 Thread GitBox


feluelle commented on a change in pull request #11004:
URL: https://github.com/apache/airflow/pull/11004#discussion_r492679715



##
File path: setup.py
##
@@ -720,7 +720,7 @@ def is_package_excluded(package: str, exclusion_list: 
List[str]):
 'markdown>=2.5.2, <3.0',
 'markupsafe>=1.1.1, <2.0',
 'marshmallow-oneofschema>=2.0.1',
-'pandas>=0.17.1, <2.0',
+'pandas>=1.1.2, <2.0',

Review comment:
   Sounds good.









[airflow] branch v1-10-test updated: Fix rules auto-detection for upgrade check (#11066)

2020-09-22 Thread turbaszek
This is an automated email from the ASF dual-hosted git repository.

turbaszek pushed a commit to branch v1-10-test
in repository https://gitbox.apache.org/repos/asf/airflow.git


The following commit(s) were added to refs/heads/v1-10-test by this push:
 new 33dd68b  Fix rules auto-detection for upgrade check (#11066)
33dd68b is described below

commit 33dd68ba23f596850beeb6e7e63d559a8fc0d34b
Author: Tomek Urbaszek 
AuthorDate: Tue Sep 22 14:03:40 2020 +0200

Fix rules auto-detection for upgrade check (#11066)

The rules were auto-registered using metaclass but first
we would need to load modules that include the classes.
So it's simpler to just load all rule classes and avoid
metaclass.
---
 airflow/upgrade/checker.py|  5 +++--
 airflow/upgrade/problem.py|  2 +-
 airflow/upgrade/rules/__init__.py | 18 ++
 airflow/upgrade/rules/base_rule.py| 16 +---
 tests/upgrade/rules/test_base_rule.py | 13 +++--
 tests/upgrade/rules/test_conn_type_is_not_nullable.py |  3 ++-
 tests/upgrade/test_problem.py |  4 ++--
 7 files changed, 34 insertions(+), 27 deletions(-)

diff --git a/airflow/upgrade/checker.py b/airflow/upgrade/checker.py
index e8c6837..af01413 100644
--- a/airflow/upgrade/checker.py
+++ b/airflow/upgrade/checker.py
@@ -20,9 +20,10 @@ from typing import List
 
 from airflow.upgrade.formatters import BaseFormatter
 from airflow.upgrade.problem import RuleStatus
-from airflow.upgrade.rules.base_rule import BaseRule, RULES
+from airflow.upgrade.rules import get_rules
+from airflow.upgrade.rules.base_rule import BaseRule
 
-ALL_RULES = [cls() for cls in RULES]  # type: List[BaseRule]
+ALL_RULES = [cls() for cls in get_rules()]  # type: List[BaseRule]
 
 
 def check_upgrade(formatter):
diff --git a/airflow/upgrade/problem.py b/airflow/upgrade/problem.py
index d5960e6..e3de490 100644
--- a/airflow/upgrade/problem.py
+++ b/airflow/upgrade/problem.py
@@ -31,7 +31,7 @@ class RuleStatus(NamedTuple(
 
 @property
 def is_success(self):
-return bool(self.messages)
+return len(self.messages) == 0
 
 @classmethod
 def from_rule(cls, rule):
diff --git a/airflow/upgrade/rules/__init__.py 
b/airflow/upgrade/rules/__init__.py
index 13a8339..4735c7f 100644
--- a/airflow/upgrade/rules/__init__.py
+++ b/airflow/upgrade/rules/__init__.py
@@ -14,3 +14,21 @@
 # KIND, either express or implied.  See the License for the
 # specific language governing permissions and limitations
 # under the License.
+import os
+
+
+def get_rules():
+"""Automatically discover all rules"""
+rule_classes = []
+path = os.path.dirname(os.path.abspath(__file__))
+for file in os.listdir(path):
+if not file.endswith(".py") or file in ("__init__.py", "base_rule.py"):
+continue
+py_file = file[:-3]
+mod = __import__(".".join([__name__, py_file]), fromlist=[py_file])
+classes = [getattr(mod, x) for x in dir(mod) if 
isinstance(getattr(mod, x), type)]
+for cls in classes:
+bases = [b.__name__ for b in cls.__bases__]
+if cls.__name__ != "BaseRule" and "BaseRule" in bases:
+rule_classes.append(cls)
+return rule_classes
diff --git a/airflow/upgrade/rules/base_rule.py 
b/airflow/upgrade/rules/base_rule.py
index 75ebe2f..c80ec77 100644
--- a/airflow/upgrade/rules/base_rule.py
+++ b/airflow/upgrade/rules/base_rule.py
@@ -15,24 +15,10 @@
 # specific language governing permissions and limitations
 # under the License.
 
-from abc import ABCMeta, abstractmethod
+from abc import abstractmethod
 
-from six import add_metaclass
 
-RULES = []
-
-
-class BaseRuleMeta(ABCMeta):
-def __new__(cls, clsname, bases, attrs):
-clazz = super(BaseRuleMeta, cls).__new__(cls, clsname, bases, attrs)
-if clsname != "BaseRule":
-RULES.append(clazz)
-return clazz
-
-
-@add_metaclass(BaseRuleMeta)
 class BaseRule(object):
-
 @property
 @abstractmethod
 def title(self):
diff --git a/tests/upgrade/rules/test_base_rule.py 
b/tests/upgrade/rules/test_base_rule.py
index dbf47e3..5338b91 100644
--- a/tests/upgrade/rules/test_base_rule.py
+++ b/tests/upgrade/rules/test_base_rule.py
@@ -15,12 +15,13 @@
 # specific language governing permissions and limitations
 # under the License.
 
-from airflow.upgrade.rules.base_rule import BaseRule, RULES
+from airflow.upgrade.rules import get_rules
+from airflow.upgrade.rules.conn_type_is_not_nullable import 
ConnTypeIsNotNullableRule
+from airflow.upgrade.rules.base_rule import BaseRule
 
 
 class TestBaseRule:
-def test_if_custom_rule_is_registered(self):
-class CustomRule(BaseRule):
-pass
-
-assert CustomRule in list(RULES)
+def test_rules_are_registered(self):
+rule_classes = get_rules()
+assert BaseRule not in rule_c

[airflow] branch v1-10-test updated: Fix rules auto-detection for upgrade check (#11066)

2020-09-22 Thread turbaszek
This is an automated email from the ASF dual-hosted git repository.

turbaszek pushed a commit to branch v1-10-test
in repository https://gitbox.apache.org/repos/asf/airflow.git


The following commit(s) were added to refs/heads/v1-10-test by this push:
 new 33dd68b  Fix rules auto-detection for upgrade check (#11066)
33dd68b is described below

commit 33dd68ba23f596850beeb6e7e63d559a8fc0d34b
Author: Tomek Urbaszek 
AuthorDate: Tue Sep 22 14:03:40 2020 +0200

Fix rules auto-detection for upgrade check (#11066)

The rules were auto-registered using metaclass but first
we would need to load modules that include the classes.
So it's simpler to just load all rule classes and avoid
metaclass.
---
 airflow/upgrade/checker.py|  5 +++--
 airflow/upgrade/problem.py|  2 +-
 airflow/upgrade/rules/__init__.py | 18 ++
 airflow/upgrade/rules/base_rule.py| 16 +---
 tests/upgrade/rules/test_base_rule.py | 13 +++--
 tests/upgrade/rules/test_conn_type_is_not_nullable.py |  3 ++-
 tests/upgrade/test_problem.py |  4 ++--
 7 files changed, 34 insertions(+), 27 deletions(-)

diff --git a/airflow/upgrade/checker.py b/airflow/upgrade/checker.py
index e8c6837..af01413 100644
--- a/airflow/upgrade/checker.py
+++ b/airflow/upgrade/checker.py
@@ -20,9 +20,10 @@ from typing import List
 
 from airflow.upgrade.formatters import BaseFormatter
 from airflow.upgrade.problem import RuleStatus
-from airflow.upgrade.rules.base_rule import BaseRule, RULES
+from airflow.upgrade.rules import get_rules
+from airflow.upgrade.rules.base_rule import BaseRule
 
-ALL_RULES = [cls() for cls in RULES]  # type: List[BaseRule]
+ALL_RULES = [cls() for cls in get_rules()]  # type: List[BaseRule]
 
 
 def check_upgrade(formatter):
diff --git a/airflow/upgrade/problem.py b/airflow/upgrade/problem.py
index d5960e6..e3de490 100644
--- a/airflow/upgrade/problem.py
+++ b/airflow/upgrade/problem.py
@@ -31,7 +31,7 @@ class RuleStatus(NamedTuple(
 
 @property
 def is_success(self):
-return bool(self.messages)
+return len(self.messages) == 0
 
 @classmethod
 def from_rule(cls, rule):
diff --git a/airflow/upgrade/rules/__init__.py 
b/airflow/upgrade/rules/__init__.py
index 13a8339..4735c7f 100644
--- a/airflow/upgrade/rules/__init__.py
+++ b/airflow/upgrade/rules/__init__.py
@@ -14,3 +14,21 @@
 # KIND, either express or implied.  See the License for the
 # specific language governing permissions and limitations
 # under the License.
+import os
+
+
+def get_rules():
+"""Automatically discover all rules"""
+rule_classes = []
+path = os.path.dirname(os.path.abspath(__file__))
+for file in os.listdir(path):
+if not file.endswith(".py") or file in ("__init__.py", "base_rule.py"):
+continue
+py_file = file[:-3]
+mod = __import__(".".join([__name__, py_file]), fromlist=[py_file])
+classes = [getattr(mod, x) for x in dir(mod) if isinstance(getattr(mod, x), type)]
+for cls in classes:
+bases = [b.__name__ for b in cls.__bases__]
+if cls.__name__ != "BaseRule" and "BaseRule" in bases:
+rule_classes.append(cls)
+return rule_classes
diff --git a/airflow/upgrade/rules/base_rule.py b/airflow/upgrade/rules/base_rule.py
index 75ebe2f..c80ec77 100644
--- a/airflow/upgrade/rules/base_rule.py
+++ b/airflow/upgrade/rules/base_rule.py
@@ -15,24 +15,10 @@
 # specific language governing permissions and limitations
 # under the License.
 
-from abc import ABCMeta, abstractmethod
+from abc import abstractmethod
 
-from six import add_metaclass
 
-RULES = []
-
-
-class BaseRuleMeta(ABCMeta):
-def __new__(cls, clsname, bases, attrs):
-clazz = super(BaseRuleMeta, cls).__new__(cls, clsname, bases, attrs)
-if clsname != "BaseRule":
-RULES.append(clazz)
-return clazz
-
-
-@add_metaclass(BaseRuleMeta)
 class BaseRule(object):
-
 @property
 @abstractmethod
 def title(self):
diff --git a/tests/upgrade/rules/test_base_rule.py b/tests/upgrade/rules/test_base_rule.py
index dbf47e3..5338b91 100644
--- a/tests/upgrade/rules/test_base_rule.py
+++ b/tests/upgrade/rules/test_base_rule.py
@@ -15,12 +15,13 @@
 # specific language governing permissions and limitations
 # under the License.
 
-from airflow.upgrade.rules.base_rule import BaseRule, RULES
+from airflow.upgrade.rules import get_rules
+from airflow.upgrade.rules.conn_type_is_not_nullable import ConnTypeIsNotNullableRule
+from airflow.upgrade.rules.base_rule import BaseRule
 
 
 class TestBaseRule:
-def test_if_custom_rule_is_registered(self):
-class CustomRule(BaseRule):
-pass
-
-assert CustomRule in list(RULES)
+def test_rules_are_registered(self):
+rule_classes = get_rules()
+assert BaseRule not in rule_c
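The discovery pattern the patch above switches to can be illustrated with a self-contained sketch. Note this is a simplified stand-in: it uses `__subclasses__()` instead of scanning module files, because everything here lives in one module, and the rule names are illustrative.

```python
class BaseRule(object):
    """Minimal stand-in for airflow.upgrade.rules.base_rule.BaseRule."""

    def check(self):
        raise NotImplementedError


class FernetEnabledRule(BaseRule):
    def check(self):
        return []  # empty list means the rule passes


class ConnTypeIsNotNullableRule(BaseRule):
    def check(self):
        return []


def get_rules():
    # Mirror the filter in rules/__init__.py: keep every class that derives
    # from BaseRule, excluding BaseRule itself.
    return [cls for cls in BaseRule.__subclasses__() if cls.__name__ != "BaseRule"]
```

The real `get_rules()` walks the `rules/` directory with `os.listdir` and `__import__` so that rule modules actually get loaded first; `__subclasses__()` only sees classes that have already been imported, which is exactly why the file scan (or, previously, the metaclass) is needed in Airflow.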

[GitHub] [airflow] potiuk commented on pull request #11020: Fix AWS DataSync tests failing (#10985)

2020-09-22 Thread GitBox


potiuk commented on pull request #11020:
URL: https://github.com/apache/airflow/pull/11020#issuecomment-696683855


   I see now that we do want to get >=1.13.16 :). Good. :)



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] potiuk merged pull request #11004: Pandas behaviour for None changed in 1.1.2

2020-09-22 Thread GitBox


potiuk merged pull request #11004:
URL: https://github.com/apache/airflow/pull/11004


   







[GitHub] [airflow] potiuk commented on pull request #11004: Pandas behaviour for None changed in 1.1.2

2020-09-22 Thread GitBox


potiuk commented on pull request #11004:
URL: https://github.com/apache/airflow/pull/11004#issuecomment-696686343


   Those errors were just the K8S tests (just fixed) and a killed worker. Merging







[GitHub] [airflow] potiuk edited a comment on pull request #11004: Pandas behaviour for None changed in 1.1.2

2020-09-22 Thread GitBox


potiuk edited a comment on pull request #11004:
URL: https://github.com/apache/airflow/pull/11004#issuecomment-696686343


   Those errors were just the K8S tests (just fixed) and a killed worker. Merging







[airflow] branch master updated (29d6297 -> 1ebd3a6)

2020-09-22 Thread potiuk
This is an automated email from the ASF dual-hosted git repository.

potiuk pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/airflow.git.


from 29d6297  Fix s.apache.org Slack link (#11078)
 add 1ebd3a6  Pandas behaviour for None changed in 1.1.2 (#11004)

No new revisions were added by this update.

Summary of changes:
 tests/providers/salesforce/hooks/test_salesforce.py | 8 ++--
 1 file changed, 6 insertions(+), 2 deletions(-)



[GitHub] [airflow] potiuk commented on pull request #11079: Improves deletion of old artifacts.

2020-09-22 Thread GitBox


potiuk commented on pull request #11079:
URL: https://github.com/apache/airflow/pull/11079#issuecomment-696687591


   Raw logs show "2020-09-22T12:02:38.4767818Z ##[error]Process completed with exit code 137." (seems a worker was killed, or OOM)







[GitHub] [airflow] potiuk edited a comment on pull request #11079: Improves deletion of old artifacts.

2020-09-22 Thread GitBox


potiuk edited a comment on pull request #11079:
URL: https://github.com/apache/airflow/pull/11079#issuecomment-696687591


   Raw logs show "2020-09-22T12:02:38.4767818Z ##[error]Process completed with exit code 137." (seems a worker was killed, or OOM)
   
   Merging.







[GitHub] [airflow] potiuk merged pull request #11079: Improves deletion of old artifacts.

2020-09-22 Thread GitBox


potiuk merged pull request #11079:
URL: https://github.com/apache/airflow/pull/11079


   







[airflow] branch master updated: Improves deletion of old artifacts. (#11079)

2020-09-22 Thread potiuk
This is an automated email from the ASF dual-hosted git repository.

potiuk pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/airflow.git


The following commit(s) were added to refs/heads/master by this push:
 new cea9e82  Improves deletion of old artifacts. (#11079)
cea9e82 is described below

commit cea9e829b302931d170e64ba5b08e9642c8bc82e
Author: Jarek Potiuk 
AuthorDate: Tue Sep 22 14:31:14 2020 +0200

Improves deletion of old artifacts. (#11079)

We introduced deletion of the old artifacts as this was
the suspected culprit of Kubernetes Job failures. It turned out
eventually that those Kubernetes Job failures were caused by
the #11017 change, but it's good to do housekeeping of the
artifacts anyway.

The delete workflow action introduced in a hurry had two problems:

* it runs for every fork if they sync master. This is a bit
  too invasive

* it fails continuously after 10 - 30 minutes every time
  as we have too many old artifacts to delete (GitHub has
  90 days retention policy so we have likely tens of
  thousands of artifacts to delete)

* it runs every hour and it causes occasional API rate limit
  exhaustion (because we have too many artifacts to loop trough)

This PR introduces filtering with the repo, changes the frequency
of deletion to be 4 times a day. Back of the envelope calculation
tops 4/day at 2500 artifacts to delete at every run so we have low risk
of reaching 5000 API calls/hr rate limit. and adds script that we are
running manually to delete those excessive artifacts now. Eventually
when the number of artifacts goes down the regular job should delete
maybe a few hundreds of artifacts appearing within the 6 hours window
in normal circumstances and it should stop failing then.
---
 .github/workflows/delete_old_artifacts.yml |  3 +-
 CI.rst | 10 
 dev/remove_artifacts.sh| 84 ++
 3 files changed, 96 insertions(+), 1 deletion(-)

diff --git a/.github/workflows/delete_old_artifacts.yml b/.github/workflows/delete_old_artifacts.yml
index c5c9da0..53d43b0 100644
--- a/.github/workflows/delete_old_artifacts.yml
+++ b/.github/workflows/delete_old_artifacts.yml
@@ -1,11 +1,12 @@
 name: 'Delete old artifacts'
 on:
   schedule:
-- cron: '0 * * * *' # every hour
+- cron: '27 */6 * * *' # run every 6 hours
 
 jobs:
   delete-artifacts:
 runs-on: ubuntu-latest
+if: github.repository == 'apache/airflow'
 steps:
   - uses: kolpav/purge-artifacts-action@v1
 with:
diff --git a/CI.rst b/CI.rst
index 8cdd62e..4f518a7 100644
--- a/CI.rst
+++ b/CI.rst
@@ -615,6 +615,16 @@ This is manually triggered workflow (via GitHub UI manual run) that should only
 When triggered, it will force-push the "apache/airflow" master to the fork's master. It's the easiest
 way to sync your fork master to the Apache Airflow's one.
 
+Delete old artifacts
+--------------------
+
+This workflow is introduced, to delete old artifacts from the Github Actions build. We set it to
+delete old artifacts that are > 7 days old. It only runs for the 'apache/airflow' repository.
+
+We also have a script that can help to clean-up the old artifacts:
+`remove_artifacts.sh `_
+
+
 Naming conventions for stored images
 
 
diff --git a/dev/remove_artifacts.sh b/dev/remove_artifacts.sh
new file mode 100755
index 000..d387eb6
--- /dev/null
+++ b/dev/remove_artifacts.sh
@@ -0,0 +1,84 @@
+#!/usr/bin/env bash
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+set -euo pipefail
+
+# Parameters:
+#
+# GITHUB_REPO - repository to delete the artifacts
+# GITHUB_USER - your personal user name
+# GITHUB_TOKEN - your personal token with `repo` scope
+#
+GITHUB_REPO=https://api.github.com/repos/apache/airflow
+readonly GITHUB_REPO
+
+if [[ -z ${GITHUB_USER} ]]; then
+echo 2>&1
+echo 2>&1 "Set GITHUB_USER variable to your user"
+echo 2>&1
+exit 1
+fi
+readonly GITHUB_USER
+
+if [[ -z ${GITHUB_TOKEN} ]]; then
+echo 2>&1
+echo 2>&1 "Set GITHUB_TOKEN variable to a token with 'repo' scope"

[GitHub] [airflow] potiuk commented on pull request #10784: Requirements might get upgraded without setup.py change

2020-09-22 Thread GitBox


potiuk commented on pull request #10784:
URL: https://github.com/apache/airflow/pull/10784#issuecomment-696691198


   Pushed it on top of the latest setup.py merges.







[GitHub] [airflow] FelixReuthlingerBMW opened a new issue #11080: Dynamic queueing via REST API trigger

2020-09-22 Thread GitBox


FelixReuthlingerBMW opened a new issue #11080:
URL: https://github.com/apache/airflow/issues/11080


   **Description**
   
   I want to run Airflow DAGs triggered via the REST API in different queues, and I want to set the queue name when calling the REST API.
   
   **Use case / motivation**
   
   Currently Airflow only supports setting queues at the per-DAG deployment level, but in a case where I have a DAG that I want to run once with high priority and thousands of times with a lower priority, I have no way to make this happen using the REST API (which we need to use for programmatic integration with other software / services).
   
   A workaround is to deploy the DAG twice with different hard-coded values for the queue, but this causes a huge amount of code and/or deployment duplication, and probably naming conflicts as well. These are workarounds for a missing feature: being able to set the queue for a DAG run using the REST API's data model.
   
   I would want to just send JSON like this to Airflow and have it figure out how to queue the DAG run:
   
   {
       "conf": {
           ... my application config for the DAG run ...
       },
       "queue": ""
   }
   
   **Related Issues**
   
   Not aware of, yet.
   







[GitHub] [airflow] potiuk commented on pull request #11069: Attempt to seperate JobWatcher from multiprocessing

2020-09-22 Thread GitBox


potiuk commented on pull request #11069:
URL: https://github.com/apache/airflow/pull/11069#issuecomment-696696014


   Looking at the errors/timeouts. I think we still drag too many things with the run method. I am still figuring out how exactly multiprocessing.Process pickles its `run` parameter. Seems that making the method static did not help. @ashb - I'd love to understand how it works. Maybe you can explain what happens here? This is very interesting, but it looks like we are trying to send half of Airflow's objects between the processes with that single run= method.
   
   BTW. Did I tell you too many times that I hate multiprocessing? 
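The pickling behaviour discussed above can be demonstrated without any Airflow objects. With the `spawn` start method, `multiprocessing` pickles the `Process` target, and a bound method drags its whole instance (and everything that instance references) along with it. A self-contained sketch using plain `pickle`; the class and attribute names are illustrative, not Airflow's actual JobWatcher:

```python
import pickle


class Job:
    """Stand-in for an object holding lots of state (think 'half of Airflow')."""

    def __init__(self):
        # This attribute rides along whenever the bound method is pickled.
        self.big_state = list(range(1000))

    def run(self):
        return len(self.big_state)


def free_run():
    """A module-level function pickles as a small qualified-name reference."""
    return 0


job = Job()
bound_size = len(pickle.dumps(job.run))   # includes the whole Job instance
free_size = len(pickle.dumps(free_run))   # just a reference, no state attached
```

The bound method's pickle is far larger because it embeds the instance's `__dict__`; making the method static or module-level only helps if nothing the target references still points back at the heavyweight object.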







[GitHub] [airflow] turbaszek commented on pull request #11067: Add SendGrid emailer rule

2020-09-22 Thread GitBox


turbaszek commented on pull request #11067:
URL: https://github.com/apache/airflow/pull/11067#issuecomment-696702406


   Hey @pcandoalmeida it seems that you did merge instead of rebase. I can sort it out if you wish. Here is a reference for the future:
   https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst#id49







[GitHub] [airflow] turbaszek edited a comment on pull request #11067: Add SendGrid emailer rule

2020-09-22 Thread GitBox


turbaszek edited a comment on pull request #11067:
URL: https://github.com/apache/airflow/pull/11067#issuecomment-696702406


   Hey @pcandoalmeida it seems that you did merge instead of rebase. I can sort it out if you wish. Here is a reference for the future:
   https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst#id9







[GitHub] [airflow] pcandoalmeida commented on pull request #11067: Add SendGrid emailer rule

2020-09-22 Thread GitBox


pcandoalmeida commented on pull request #11067:
URL: https://github.com/apache/airflow/pull/11067#issuecomment-696703989


   Hiya @turbaszek I did `git rebase airflow/v1-10-test` -- maybe that was the wrong way to go? I was reading the guide over the weekend, but I'll try and make sure to do better next time. Thank you for sharing the guide!







[GitHub] [airflow] turbaszek commented on pull request #11067: Add SendGrid emailer rule

2020-09-22 Thread GitBox


turbaszek commented on pull request #11067:
URL: https://github.com/apache/airflow/pull/11067#issuecomment-696705865


   > Hiya @turbaszek I did `git rebase airflow/v1-10-test` -- maybe that was the wrong way to go? 
   
   That was the right thing to do, after that you should have done `git push --force-with-lease`
   







[GitHub] [airflow] turbaszek commented on a change in pull request #11067: Add SendGrid emailer rule

2020-09-22 Thread GitBox


turbaszek commented on a change in pull request #11067:
URL: https://github.com/apache/airflow/pull/11067#discussion_r492716337



##
File path: airflow/upgrade/rules/send_grid_moved.py
##
@@ -0,0 +1,39 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+from __future__ import absolute_import
+
+from airflow.configuration import conf
+from airflow.upgrade.rules.base_rule import BaseRule
+
+
+class SendGridEmailerMovedRule(BaseRule):
+title = "SendGrid email uses old airflow.contrib module"
+
+description = """
+Removes the need for SendGrid email code to be in contrib package. The 
SendGrid email function \
+has been moved to airflow.providers.
+"""
+
+def check(self):
+email_conf = conf.get(section="email", key="email_backend")
+email_contrib_path = "airflow.contrib.utils.sendgrid"
+if email_contrib_path in email_conf:
+email_provider_path = "airflow.providers.sendgrid.utils.emailer"
+msg = "Email backend option uses airflow.contrib module. Please use new module: {}".format(
+email_provider_path)
+return [msg]

Review comment:
   There should be an else clause returning `[]`, otherwise this rule won't work. Also, can you please add a unit test for that?
   Using `@conf_vars({("email", "email_backend"): "airflow.contrib.utils.sendgrid"})` should help :)
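A sketch of what the review is asking for. The configuration lookup is factored out into a plain function so it runs standalone; `check_email_backend` and its argument are illustrative stand-ins for a rule's `check()` reading `conf.get("email", "email_backend")`:

```python
def check_email_backend(email_backend):
    """Return a list of problem messages; an empty list means the rule passes."""
    email_contrib_path = "airflow.contrib.utils.sendgrid"
    email_provider_path = "airflow.providers.sendgrid.utils.emailer"
    if email_contrib_path in email_backend:
        msg = (
            "Email backend option uses airflow.contrib module. "
            "Please use new module: {}".format(email_provider_path)
        )
        return [msg]
    # The branch the review asks for: RuleStatus treats an empty list as
    # success, so check() must return [] rather than None on the happy path.
    return []
```

With this shape, a passing configuration yields `[]` and `RuleStatus.is_success` evaluates to true.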









[GitHub] [airflow] potiuk commented on a change in pull request #11079: Improves deletion of old artifacts.

2020-09-22 Thread GitBox


potiuk commented on a change in pull request #11079:
URL: https://github.com/apache/airflow/pull/11079#discussion_r492745847



##
File path: .github/workflows/delete_old_artifacts.yml
##
@@ -1,11 +1,12 @@
 name: 'Delete old artifacts'
 on:
   schedule:
-- cron: '0 * * * *' # every hour
+- cron: '27 */6 * * *' # run every 6 hours

Review comment:
   BTW. I think my calculations were a bit too pessimistic. Seems that 24 hours were enough to delete the artifacts. The job started to succeed and it takes ~5 minutes when it runs. Also my script is now often finishing after some 30-50 artifacts, so it seems we got to a "clean state" much faster than I thought. 
   
   









[GitHub] [airflow] potiuk commented on a change in pull request #11079: Improves deletion of old artifacts.

2020-09-22 Thread GitBox


potiuk commented on a change in pull request #11079:
URL: https://github.com/apache/airflow/pull/11079#discussion_r492746570



##
File path: .github/workflows/delete_old_artifacts.yml
##
@@ -1,11 +1,12 @@
 name: 'Delete old artifacts'
 on:
   schedule:
-- cron: '0 * * * *' # every hour
+- cron: '27 */6 * * *' # run every 6 hours

Review comment:
   Yep. Now when I run "delete" and no jobs finish in the meantime, there are no artifacts to delete :) @dimberman -> artifacts are cleaned up now..









[GitHub] [airflow] pcandoalmeida commented on a change in pull request #11067: Add SendGrid emailer rule

2020-09-22 Thread GitBox


pcandoalmeida commented on a change in pull request #11067:
URL: https://github.com/apache/airflow/pull/11067#discussion_r492749953



##
File path: airflow/upgrade/rules/send_grid_moved.py
##
@@ -0,0 +1,39 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+from __future__ import absolute_import
+
+from airflow.configuration import conf
+from airflow.upgrade.rules.base_rule import BaseRule
+
+
+class SendGridEmailerMovedRule(BaseRule):
+title = "SendGrid email uses old airflow.contrib module"
+
+description = """
+Removes the need for SendGrid email code to be in contrib package. The 
SendGrid email function \
+has been moved to airflow.providers.
+"""
+
+def check(self):
+email_conf = conf.get(section="email", key="email_backend")
+email_contrib_path = "airflow.contrib.utils.sendgrid"
+if email_contrib_path in email_conf:
+email_provider_path = "airflow.providers.sendgrid.utils.emailer"
+msg = "Email backend option uses airflow.contrib module. Please use new module: {}".format(
+email_provider_path)
+return [msg]

Review comment:
   Yes, of course... thank you for the tip 👍🏼 









[GitHub] [airflow] mik-laj commented on issue #11080: Dynamic queueing via REST API trigger

2020-09-22 Thread GitBox


mik-laj commented on issue #11080:
URL: https://github.com/apache/airflow/issues/11080#issuecomment-696746646


   This is not officially supported, but you can try to use a cluster policy to configure the queue based on DAGRun.conf. See:
   https://airflow.readthedocs.io/en/latest/concepts.html#mutate-task-instances-before-task-execution
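A hedged sketch of that approach. The queue-picking logic is factored into a plain function so it runs standalone; `pick_queue` is an illustrative name, and the wiring into `airflow_local_settings.py` via the task-instance mutation hook is shown only in a comment, untested against a running cluster:

```python
def pick_queue(dag_run_conf, default_queue="default"):
    """Return the queue name carried in the DAG run's conf, or a default."""
    conf = dag_run_conf or {}
    return conf.get("queue", default_queue)


# In airflow_local_settings.py you would then do something like (untested sketch):
#
# def task_instance_mutation_hook(task_instance):
#     dag_run = task_instance.get_dagrun()
#     task_instance.queue = pick_queue(getattr(dag_run, "conf", None))
```

With this, a caller triggering a run with `{"conf": {...}, "queue": "high_prio"}`-style payload could have its tasks routed to the `high_prio` queue, while runs without a `queue` key fall back to the default.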







[GitHub] [airflow] FelixReuthlingerBMW commented on issue #11080: Dynamic queueing via REST API trigger

2020-09-22 Thread GitBox


FelixReuthlingerBMW commented on issue #11080:
URL: https://github.com/apache/airflow/issues/11080#issuecomment-696749540


   Thanks for the hint. If I understand correctly, this would just apply different settings to single tasks, but not to the whole DAG run, right?
   
   The problem then would be: if I first start 1000 DAG runs, the high-prio DAG run queued as number 1001 would still need to wait until all the other DAG runs have finished, independently of which queue the tasks are executed in, right?







[GitHub] [airflow] mik-laj commented on issue #11080: Dynamic queueing via REST API trigger

2020-09-22 Thread GitBox


mik-laj commented on issue #11080:
URL: https://github.com/apache/airflow/issues/11080#issuecomment-696752099


   @FelixReuthlingerBMW  Yes. But you can assign all tasks from a given DAG to one queue. In Airflow, tasks are assigned to a queue, not a DAG Run.
   
   No. The tasks will run according to the queues to which they have been assigned. 







[GitHub] [airflow] appunni-dishq commented on issue #9860: Kubernetes Executor Config Volumes Break Airflow UI

2020-09-22 Thread GitBox


appunni-dishq commented on issue #9860:
URL: https://github.com/apache/airflow/issues/9860#issuecomment-696752137


   Hey I am using ```kubernetes.client.models.V1VolumeMount``` 
   
   I am using Airflow 1.10.12. This was completely broken in the scheduler in 1.10.10, and that is fixed, but:
   
   My DAG is now failing with this error
   ```
   import pandas
  File "<frozen importlib._bootstrap>", line 971, in _find_and_load
  File "<frozen importlib._bootstrap>", line 955, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 665, in _load_unlocked
  File "<frozen importlib._bootstrap_external>", line 674, in exec_module
  File "<frozen importlib._bootstrap_external>", line 786, in get_code
  File "<frozen importlib._bootstrap_external>", line 503, in _code_to_bytecode
  File "/home/airflow/.local/lib/python3.6/site-packages/airflow/utils/timeout.py", line 43, in handle_timeout
    raise AirflowTaskTimeout(self.error_message)
airflow.exceptions.AirflowTaskTimeout: Timeout, PID: 1
Traceback (most recent call last):
  File "/home/airflow/.local/bin/airflow", line 37, in <module>
    args.func(args)
  File "/home/airflow/.local/lib/python3.6/site-packages/airflow/utils/cli.py", line 76, in wrapper
    return f(*args, **kwargs)
  File "/home/airflow/.local/lib/python3.6/site-packages/airflow/bin/cli.py", line 538, in run
    dag = get_dag(args)
  File "/home/airflow/.local/lib/python3.6/site-packages/airflow/bin/cli.py", line 164, in get_dag
    'parse.'.format(args.dag_id))
airflow.exceptions.AirflowException: dag_id could not be found: 
   ```
   And when I goto Graph views I get this error as well
   
   ```
   Traceback (most recent call last):
  File "/home/airflow/.local/lib/python3.6/site-packages/flask/app.py", line 2447, in wsgi_app
    response = self.full_dispatch_request()
  File "/home/airflow/.local/lib/python3.6/site-packages/flask/app.py", line 1952, in full_dispatch_request
    rv = self.handle_user_exception(e)
  File "/home/airflow/.local/lib/python3.6/site-packages/flask/app.py", line 1821, in handle_user_exception
    reraise(exc_type, exc_value, tb)
  File "/home/airflow/.local/lib/python3.6/site-packages/flask/_compat.py", line 39, in reraise
    raise value
  File "/home/airflow/.local/lib/python3.6/site-packages/flask/app.py", line 1950, in full_dispatch_request
    rv = self.dispatch_request()
  File "/home/airflow/.local/lib/python3.6/site-packages/flask/app.py", line 1936, in dispatch_request
    return self.view_functions[rule.endpoint](**req.view_args)
  File "/home/airflow/.local/lib/python3.6/site-packages/airflow/www_rbac/decorators.py", line 121, in wrapper
    return f(self, *args, **kwargs)
  File "/home/airflow/.local/lib/python3.6/site-packages/flask_appbuilder/security/decorators.py", line 109, in wraps
    return f(self, *args, **kwargs)
  File "/home/airflow/.local/lib/python3.6/site-packages/airflow/www_rbac/decorators.py", line 92, in view_func
    return f(*args, **kwargs)
  File "/home/airflow/.local/lib/python3.6/site-packages/airflow/www_rbac/decorators.py", line 56, in wrapper
    return f(*args, **kwargs)
  File "/home/airflow/.local/lib/python3.6/site-packages/airflow/utils/db.py", line 74, in wrapper
    return func(*args, **kwargs)
  File "/home/airflow/.local/lib/python3.6/site-packages/airflow/www_rbac/views.py", line 1666, in graph
    show_external_logs=bool(external_logs))
  File "/home/airflow/.local/lib/python3.6/site-packages/airflow/www_rbac/views.py", line 202, in render_template
    **kwargs
  File "/home/airflow/.local/lib/python3.6/site-packages/flask_appbuilder/baseviews.py", line 281, in render_template
    template, **dict(list(kwargs.items()) + list(self.extra_args.items()))
  File "/home/airflow/.local/lib/python3.6/site-packages/flask/templating.py", line 140, in render_template
    ctx.app,
  File "/home/airflow/.local/lib/python3.6/site-packages/flask/templating.py", line 120, in _render
    rv = template.render(context)
  File "/home/airflow/.local/lib/python3.6/site-packages/jinja2/environment.py", line 1090, in render
    self.environment.handle_exception()
  File "/home/airflow/.local/lib/python3.6/site-packages/jinja2/environment.py", line 832, in handle_exception
    reraise(*rewrite_traceback_stack(source=source))
  File "/home/airflow/.local/lib/python3.6/site-packages/jinja2/_compat.py", line 28, in reraise
    raise value.with_traceback(tb)
  File "/home/airflow/.local/lib/python3.6/site-packages/airflow/www_rbac/templates/airflow/graph.html", line 18, in top-level template code
    {% extends "airflow/dag.html" %}
  File "/home/airflow/.local/lib/python3.6/site-packages/airflow/www_rbac/templates/airflow/dag.html", line 19, in top-level template code
    {% extends base_template %}
  File "/home/airflow/.local/lib/python3.6/site-packages/airflow/www_rbac/templates/airflow/master.html", line 19, in top-level template code
    {% extends 'appbuilder/baselayout.html' %}
  

[GitHub] [airflow] mik-laj edited a comment on issue #11080: Dynamic queueing via REST API trigger

2020-09-22 Thread GitBox


mik-laj edited a comment on issue #11080:
URL: https://github.com/apache/airflow/issues/11080#issuecomment-696752099


   @FelixReuthlingerBMW  Yes. But you can assign all tasks from a given DAG to one queue. In Airflow, tasks are assigned to a queue, not a DAG Run.
   
   No. The tasks will run according to the queues to which they have been assigned. You can create two queues to run some tasks faster.







[GitHub] [airflow] potiuk commented on pull request #10784: Requirements might get upgraded without setup.py change

2020-09-22 Thread GitBox


potiuk commented on pull request #10784:
URL: https://github.com/apache/airflow/pull/10784#issuecomment-696755122


   Also a 137 memory problem. We need to do something about it..






