[GitHub] [airflow] ericpp opened a new pull request #14869: Fixed autocommit calls for mysql-connector-python

2021-03-17 Thread GitBox


ericpp opened a new pull request #14869:
URL: https://github.com/apache/airflow/pull/14869


   The MySQLdb and mysql-connector-python clients use different methods for 
getting and setting the autocommit mode. This code checks the `conn` param 
passed into get/set_autocommit() to see if it's a MySQLdb instance (containing 
the get_autocommit method) or a mysql-connector-python instance (missing 
get_autocommit) and makes the appropriate autocommit calls for the client.
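   
   A minimal sketch of that dispatch (assuming only that MySQLdb connections expose a `get_autocommit()` method while mysql-connector-python connections expose `autocommit` as a plain property; the actual diff in this PR may differ):
   
   ```python
   def set_autocommit(conn, autocommit):
       if hasattr(conn, "get_autocommit"):
           # MySQLdb: autocommit is toggled through a method call.
           conn.autocommit(autocommit)
       else:
           # mysql-connector-python: autocommit is a plain property.
           conn.autocommit = autocommit
   
   
   def get_autocommit(conn):
       if hasattr(conn, "get_autocommit"):
           # MySQLdb client.
           return conn.get_autocommit()
       # mysql-connector-python client.
       return conn.autocommit
   ```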
   
   Fixes #14857
   
   
   
   ---
   **^ Add meaningful description above**
   
   Read the **[Pull Request 
Guidelines](https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst#pull-request-guidelines)**
 for more information.
   In case of fundamental code change, Airflow Improvement Proposal 
([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals))
 is needed.
   In case of a new dependency, check compliance with the [ASF 3rd Party 
License Policy](https://www.apache.org/legal/resolved.html#category-x).
   In case of backwards incompatible changes please leave a note in 
[UPDATING.md](https://github.com/apache/airflow/blob/master/UPDATING.md).
   



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] boring-cyborg[bot] commented on pull request #14869: Fixed autocommit calls for mysql-connector-python

2021-03-17 Thread GitBox


boring-cyborg[bot] commented on pull request #14869:
URL: https://github.com/apache/airflow/pull/14869#issuecomment-801652352


   Congratulations on your first Pull Request and welcome to the Apache Airflow 
community! If you have any issues or are unsure about anything, please check 
our Contribution Guide 
(https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst)
   Here are some useful points:
   - Pay attention to the quality of your code (flake8, pylint and type 
annotations). Our [pre-commits]( 
https://github.com/apache/airflow/blob/master/STATIC_CODE_CHECKS.rst#prerequisites-for-pre-commit-hooks)
 will help you with that.
   - In case of a new feature add useful documentation (in docstrings or in 
`docs/` directory). Adding a new operator? Check this short 
[guide](https://github.com/apache/airflow/blob/master/docs/apache-airflow/howto/custom-operator.rst)
 Consider adding an example DAG that shows how users should use it.
   - Consider using [Breeze 
environment](https://github.com/apache/airflow/blob/master/BREEZE.rst) for 
testing locally, it’s a heavy docker but it ships with a working Airflow and a 
lot of integrations.
   - Be patient and persistent. It might take some time to get a review or get 
the final approval from Committers.
   - Please follow [ASF Code of 
Conduct](https://www.apache.org/foundation/policies/conduct) for all 
communication including (but not limited to) comments on Pull Requests, Mailing 
list and Slack.
   - Be sure to read the [Airflow Coding style]( 
https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst#coding-style-and-best-practices).
   Apache Airflow is a community-driven project and together we are making it 
better 🚀.
   In case of doubts contact the developers at:
   Mailing List: d...@airflow.apache.org
   Slack: https://s.apache.org/airflow-slack
   







[GitHub] [airflow] github-actions[bot] commented on pull request #14868: Update license check to include TypeScript file extensions

2021-03-17 Thread GitBox


github-actions[bot] commented on pull request #14868:
URL: https://github.com/apache/airflow/pull/14868#issuecomment-801627790


   The PR most likely needs to run full matrix of tests because it modifies 
parts of the core of Airflow. However, committers might decide to merge it 
quickly and take the risk. If they don't merge it quickly - please rebase it to 
the latest master at your convenience, or amend the last commit of the PR, and 
push it with --force-with-lease.







[GitHub] [airflow] github-actions[bot] commented on pull request #14739: Add files to generate Airflow's Python SDK

2021-03-17 Thread GitBox


github-actions[bot] commented on pull request #14739:
URL: https://github.com/apache/airflow/pull/14739#issuecomment-801606787


   The PR is likely ready to be merged. No tests are needed as no important 
environment files, nor python files were modified by it. However, committers 
might decide that full test matrix is needed and add the 'full tests needed' 
label. Then you should rebase it to the latest master or amend the last commit 
of the PR, and push it with --force-with-lease.







[airflow] branch master updated: Revert "Create a documentation package for Docker image (#14765)" (#14867)

2021-03-17 Thread potiuk
This is an automated email from the ASF dual-hosted git repository.

potiuk pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/airflow.git


The following commit(s) were added to refs/heads/master by this push:
 new 6405382  Revert "Create a documentation package for Docker image 
(#14765)" (#14867)
6405382 is described below

commit 6405382556603c80130f8c7c0a8f65e6c46fa33a
Author: Jarek Potiuk 
AuthorDate: Thu Mar 18 03:43:27 2021 +0100

Revert "Create a documentation package for Docker image (#14765)" (#14867)

This reverts commit 03d3c7db931df34a724a3f254bf13197a9063576.
---
 .../docker-images-recipes/gcloud.Dockerfile|   0
 .../docker-images-recipes/hadoop.Dockerfile|   0
 docs/apache-airflow/installation.rst   |   2 +-
 docs/apache-airflow/production-deployment.rst  | 847 -
 docs/apache-airflow/start/docker.rst   |   2 +-
 docs/conf.py   |   2 +-
 docs/docker-stack/build-arg-ref.rst| 212 --
 docs/docker-stack/build.rst| 380 -
 docs/docker-stack/entrypoint.rst   | 201 -
 docs/docker-stack/img/docker-logo.png  | Bin 50112 -> 0 bytes
 docs/docker-stack/index.rst|  54 --
 docs/docker-stack/recipes.rst  |  70 --
 docs/exts/airflow_intersphinx.py   |  11 +-
 .../exts/docs_build/dev_index_template.html.jinja2 |  12 +-
 docs/exts/docs_build/docs_builder.py   |  12 +-
 docs/exts/docs_build/fetch_inventories.py  |  11 +-
 16 files changed, 863 insertions(+), 953 deletions(-)

diff --git a/docs/docker-stack/docker-images-recipes/gcloud.Dockerfile 
b/docs/apache-airflow/docker-images-recipes/gcloud.Dockerfile
similarity index 100%
rename from docs/docker-stack/docker-images-recipes/gcloud.Dockerfile
rename to docs/apache-airflow/docker-images-recipes/gcloud.Dockerfile
diff --git a/docs/docker-stack/docker-images-recipes/hadoop.Dockerfile 
b/docs/apache-airflow/docker-images-recipes/hadoop.Dockerfile
similarity index 100%
rename from docs/docker-stack/docker-images-recipes/hadoop.Dockerfile
rename to docs/apache-airflow/docker-images-recipes/hadoop.Dockerfile
diff --git a/docs/apache-airflow/installation.rst 
b/docs/apache-airflow/installation.rst
index 0184216..eac6894 100644
--- a/docs/apache-airflow/installation.rst
+++ b/docs/apache-airflow/installation.rst
@@ -27,7 +27,7 @@ installation with other tools as well.
 
 .. note::
 
-Airflow is also distributed as a Docker image (OCI Image). Consider using 
it to guarantee that software will always run the same no matter where it is 
deployed. For more information, see: :doc:`docker-stack:index`.
+Airflow is also distributed as a Docker image (OCI Image). For more 
information, see: :ref:`docker_image`
 
 Prerequisites
 '
diff --git a/docs/apache-airflow/production-deployment.rst 
b/docs/apache-airflow/production-deployment.rst
index ecc6077..0f4dfaa 100644
--- a/docs/apache-airflow/production-deployment.rst
+++ b/docs/apache-airflow/production-deployment.rst
@@ -118,7 +118,852 @@ To mitigate these issues, make sure you have a 
:doc:`health check ` 
for use in a containerized environment. Consider using it to guarantee that 
software will always run the same no matter where it’s deployed.
+Production-ready reference Image
+
+
+For the ease of deployment in production, the community releases a 
production-ready reference container
+image.
+
+The docker image provided (as convenience binary package) in the
+`Apache Airflow DockerHub `_ is a 
bare image
+that has a few external dependencies and extras installed.
+
+The Apache Airflow image provided as convenience package is optimized for 
size, so
+it provides just a bare minimal set of the extras and dependencies installed 
and in most cases
+you want to either extend or customize the image. You can see all possible 
extras in
+:doc:`extra-packages-ref`. The set of extras used in Airflow Production image 
are available in the
+`Dockerfile 
`_.
+
+The production images are built in DockerHub from released versions and release 
candidates. There
+are also images published from branches but they are used mainly for 
development and testing purposes.
+See `Airflow Git Branching 
`_
+for details.
+
+
+Customizing or extending the Production Image
+-
+
+Before you dive deeply into how the Airflow image is built and named, and 
why we are doing it the
+way we do, you might want to know very quickly how you can extend or customize 
the existing image
+for Apache Airflow. This chapter gives you a short answ

[GitHub] [airflow] potiuk merged pull request #14867: Revert "Create a documentation package for Docker image (#14765)"

2021-03-17 Thread GitBox


potiuk merged pull request #14867:
URL: https://github.com/apache/airflow/pull/14867


   







[GitHub] [airflow] ryanahamilton opened a new pull request #14868: Update license check to include TypeScript file extensions

2021-03-17 Thread GitBox


ryanahamilton opened a new pull request #14868:
URL: https://github.com/apache/airflow/pull/14868


   Adds `.ts` and `.tsx` files to the `insert-license` pre-commit hook.







[GitHub] [airflow] potiuk commented on pull request #14867: Revert "Create a documentation package for Docker image (#14765)"

2021-03-17 Thread GitBox


potiuk commented on pull request #14867:
URL: https://github.com/apache/airflow/pull/14867#issuecomment-801573329


   I'd love to merge that one quickly to unblock master 







[GitHub] [airflow] potiuk commented on pull request #14867: Revert "Create a documentation package for Docker image (#14765)"

2021-03-17 Thread GitBox


potiuk commented on pull request #14867:
URL: https://github.com/apache/airflow/pull/14867#issuecomment-801553076


   Not sure if this was the change or not (I could not find an obvious reason), 
but maybe it will fix the docs build.
   Currently in master, the docs take 2x longer to build and fail due to the time limit 
(because they are built twice for some reason).







[GitHub] [airflow] potiuk opened a new pull request #14867: Revert "Create a documentation package for Docker image (#14765)"

2021-03-17 Thread GitBox


potiuk opened a new pull request #14867:
URL: https://github.com/apache/airflow/pull/14867


   This reverts commit 03d3c7db931df34a724a3f254bf13197a9063576.
   
   
   
   ---
   **^ Add meaningful description above**
   
   Read the **[Pull Request 
Guidelines](https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst#pull-request-guidelines)**
 for more information.
   In case of fundamental code change, Airflow Improvement Proposal 
([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals))
 is needed.
   In case of a new dependency, check compliance with the [ASF 3rd Party 
License Policy](https://www.apache.org/legal/resolved.html#category-x).
   In case of backwards incompatible changes please leave a note in 
[UPDATING.md](https://github.com/apache/airflow/blob/master/UPDATING.md).
   







[airflow] branch master updated: Update the docs to release Providers (#14842)

2021-03-17 Thread potiuk
This is an automated email from the ASF dual-hosted git repository.

potiuk pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/airflow.git


The following commit(s) were added to refs/heads/master by this push:
 new 34008b5  Update the docs to release Providers (#14842)
34008b5 is described below

commit 34008b593cb32ca6696bdc0e5609059bb25736ed
Author: Kaxil Naik 
AuthorDate: Thu Mar 18 01:15:53 2021 +

Update the docs to release Providers (#14842)

While releasing the ES Provider 1.0.3, I updated the docs
to release providers.

One section was duplicated and there were other minor fixes.
---
 dev/README_RELEASE_PROVIDER_PACKAGES.md | 62 +
 1 file changed, 9 insertions(+), 53 deletions(-)

diff --git a/dev/README_RELEASE_PROVIDER_PACKAGES.md 
b/dev/README_RELEASE_PROVIDER_PACKAGES.md
index 3842bc3..0584a01 100644
--- a/dev/README_RELEASE_PROVIDER_PACKAGES.md
+++ b/dev/README_RELEASE_PROVIDER_PACKAGES.md
@@ -36,6 +36,7 @@
   - [Verify by Contributors](#verify-by-contributors)
 - [Publish release](#publish-release)
   - [Summarize the voting for the Apache Airflow 
release](#summarize-the-voting-for-the-apache-airflow-release)
+  - [Publish release to SVN](#publish-release-to-svn)
   - [Publish the Regular convenience package to 
PyPI](#publish-the-regular-convenience-package-to-pypi-1)
   - [Publish documentation prepared 
before](#publish-documentation-prepared-before)
   - [Add tags in git](#add-tags-in-git-1)
@@ -255,7 +256,7 @@ export AIRFLOW_SITE_DIRECTORY="$(pwd)"
 ```shell script
 cd "${AIRFLOW_REPO_ROOT}"
 ./breeze build-docs -- \
-  --for-production
+  --for-production \
   --package-filter apache-airflow-providers \
   --package-filter 'apache-airflow-providers-*'
 ```
@@ -642,7 +643,7 @@ Cheers,
 
 
 
-### Publish release to SVN
+## Publish release to SVN
 
 The best way of doing this is to svn cp  between the two repos (this avoids 
having to upload the binaries
 again, and gives a clearer history in the svn commit logs.
@@ -699,61 +700,16 @@ svn commit -m "Release Airflow Providers on $(date)"
 ```
 
 Verify that the packages appear in
-[backport-providers](https://dist.apache.org/repos/dist/release/airflow/providers)
-
-### Publish the final version convenience package to PyPI
-
-Checkout the RC Version for the RC Version released (there is a batch of 
providers - one of them is enough):
-
-```shell script
-git checkout providers-/
-```
-
-In order to publish to PyPI you just need to build and release packages.
-
-* Generate the packages.
-
-```shell script
-./breeze --backports prepare-provider-packages both
-```
-
-if you ony build few packages, run:
-
-```shell script
-./breeze prepare-provider-packages  ...
-```
-
-In case you decided to remove some of the packages. remove them from dist 
folder now:
-
-```shell script
-ls dist/**
-rm dist/**
-```
+[providers](https://dist.apache.org/repos/dist/release/airflow/providers)
 
 
-* Verify the artifacts that would be uploaded:
-
-```shell script
-twine check dist/*
-```
-
-* Upload the package to PyPi's test environment:
-
-```shell script
-twine upload -r pypitest dist/*
-```
-
-* Verify that the test packages look good by downloading it and installing 
them into a virtual environment.
-  Twine prints the package links as output - separately for each package.
-
-* Upload the package to PyPi's production environment:
-
-```shell script
-twine upload -r pypi dist/*
-```
+## Publish the Regular convenience package to PyPI
 
+* Checkout the RC Version for the RC Version released (there is a batch of 
providers - one of them is enough):
 
-## Publish the Regular convenience package to PyPI
+```shell script
+git checkout providers-/
+```
 
 * Generate the packages with final version. Note that
   this will clean up dist folder before generating the packages, so you will 
only have the right packages there.



[GitHub] [airflow] potiuk merged pull request #14842: Update the docs to release Providers

2021-03-17 Thread GitBox


potiuk merged pull request #14842:
URL: https://github.com/apache/airflow/pull/14842


   







[GitHub] [airflow-site] potiuk merged pull request #393: Add docs for Apache Airflow 1.10.15

2021-03-17 Thread GitBox


potiuk merged pull request #393:
URL: https://github.com/apache/airflow-site/pull/393


   







[GitHub] [airflow] potiuk edited a comment on issue #14673: Some backport provider packages are broken

2021-03-17 Thread GitBox


potiuk edited a comment on issue #14673:
URL: https://github.com/apache/airflow/issues/14673#issuecomment-801537688


   Thanks @odracci for the report - I would love if you verify if all works on 
your side with the latest versions of the packages.







[GitHub] [airflow] potiuk commented on issue #14673: Some backport provider packages are broken

2021-03-17 Thread GitBox


potiuk commented on issue #14673:
URL: https://github.com/apache/airflow/issues/14673#issuecomment-801537688


   Thanks @odracci for the report - I would love if you verify if all works on 
your side with the latest versions of the packages.







[GitHub] [airflow] potiuk closed issue #14673: Some backport provider packages are broken

2021-03-17 Thread GitBox


potiuk closed issue #14673:
URL: https://github.com/apache/airflow/issues/14673


   







[GitHub] [airflow] potiuk commented on issue #14673: Some backport provider packages are broken

2021-03-17 Thread GitBox


potiuk commented on issue #14673:
URL: https://github.com/apache/airflow/issues/14673#issuecomment-801537488


   New packages released. All packages with version 2020.10.29 should be yanked now 
(i.e. skipped by pip, and only installed if specifically requested with 
`==2020.10.29`).







[GitHub] [airflow] potiuk opened a new pull request #14866: Replaces 1.10.14 with 1.10.15 where needed

2021-03-17 Thread GitBox


potiuk opened a new pull request #14866:
URL: https://github.com/apache/airflow/pull/14866


   
   
   ---
   **^ Add meaningful description above**
   
   Read the **[Pull Request 
Guidelines](https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst#pull-request-guidelines)**
 for more information.
   In case of fundamental code change, Airflow Improvement Proposal 
([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals))
 is needed.
   In case of a new dependency, check compliance with the [ASF 3rd Party 
License Policy](https://www.apache.org/legal/resolved.html#category-x).
   In case of backwards incompatible changes please leave a note in 
[UPDATING.md](https://github.com/apache/airflow/blob/master/UPDATING.md).
   







[airflow] tag backport-providers-2021.3.17 created (now eb884cd)

2021-03-17 Thread potiuk
This is an automated email from the ASF dual-hosted git repository.

potiuk pushed a change to tag backport-providers-2021.3.17
in repository https://gitbox.apache.org/repos/asf/airflow.git.


  at eb884cd  (commit)
No new revisions were added by this update.



svn commit: r46676 - /release/airflow/backport-providers/

2021-03-17 Thread potiuk
Author: potiuk
Date: Wed Mar 17 23:55:11 2021
New Revision: 46676

Log:
Release Airflow Backport Providers 2021.3.17 from 2021.3.17rc1

Added:

release/airflow/backport-providers/apache-airflow-backport-providers-2021.3.17-source.tar.gz
  - copied unchanged from r46675, 
dev/airflow/backport-providers/2021.3.17rc1/apache-airflow-backport-providers-2021.3.17rc1-source.tar.gz

release/airflow/backport-providers/apache-airflow-backport-providers-2021.3.17-source.tar.gz.asc
  - copied unchanged from r46675, 
dev/airflow/backport-providers/2021.3.17rc1/apache-airflow-backport-providers-2021.3.17rc1-source.tar.gz.asc

release/airflow/backport-providers/apache-airflow-backport-providers-2021.3.17-source.tar.gz.sha512
  - copied unchanged from r46675, 
dev/airflow/backport-providers/2021.3.17rc1/apache-airflow-backport-providers-2021.3.17rc1-source.tar.gz.sha512

release/airflow/backport-providers/apache-airflow-backport-providers-apache-cassandra-2021.3.17.tar.gz
  - copied unchanged from r46675, 
dev/airflow/backport-providers/2021.3.17rc1/apache-airflow-backport-providers-apache-cassandra-2021.3.17rc1.tar.gz

release/airflow/backport-providers/apache-airflow-backport-providers-apache-cassandra-2021.3.17.tar.gz.asc
  - copied unchanged from r46675, 
dev/airflow/backport-providers/2021.3.17rc1/apache-airflow-backport-providers-apache-cassandra-2021.3.17rc1.tar.gz.asc

release/airflow/backport-providers/apache-airflow-backport-providers-apache-cassandra-2021.3.17.tar.gz.sha512
  - copied unchanged from r46675, 
dev/airflow/backport-providers/2021.3.17rc1/apache-airflow-backport-providers-apache-cassandra-2021.3.17rc1.tar.gz.sha512

release/airflow/backport-providers/apache-airflow-backport-providers-apache-hdfs-2021.3.17.tar.gz
  - copied unchanged from r46675, 
dev/airflow/backport-providers/2021.3.17rc1/apache-airflow-backport-providers-apache-hdfs-2021.3.17rc1.tar.gz

release/airflow/backport-providers/apache-airflow-backport-providers-apache-hdfs-2021.3.17.tar.gz.asc
  - copied unchanged from r46675, 
dev/airflow/backport-providers/2021.3.17rc1/apache-airflow-backport-providers-apache-hdfs-2021.3.17rc1.tar.gz.asc

release/airflow/backport-providers/apache-airflow-backport-providers-apache-hdfs-2021.3.17.tar.gz.sha512
  - copied unchanged from r46675, 
dev/airflow/backport-providers/2021.3.17rc1/apache-airflow-backport-providers-apache-hdfs-2021.3.17rc1.tar.gz.sha512

release/airflow/backport-providers/apache-airflow-backport-providers-apache-kylin-2021.3.17.tar.gz
  - copied unchanged from r46675, 
dev/airflow/backport-providers/2021.3.17rc1/apache-airflow-backport-providers-apache-kylin-2021.3.17rc1.tar.gz

release/airflow/backport-providers/apache-airflow-backport-providers-apache-kylin-2021.3.17.tar.gz.asc
  - copied unchanged from r46675, 
dev/airflow/backport-providers/2021.3.17rc1/apache-airflow-backport-providers-apache-kylin-2021.3.17rc1.tar.gz.asc

release/airflow/backport-providers/apache-airflow-backport-providers-apache-kylin-2021.3.17.tar.gz.sha512
  - copied unchanged from r46675, 
dev/airflow/backport-providers/2021.3.17rc1/apache-airflow-backport-providers-apache-kylin-2021.3.17rc1.tar.gz.sha512

release/airflow/backport-providers/apache-airflow-backport-providers-apache-livy-2021.3.17.tar.gz
  - copied unchanged from r46675, 
dev/airflow/backport-providers/2021.3.17rc1/apache-airflow-backport-providers-apache-livy-2021.3.17rc1.tar.gz

release/airflow/backport-providers/apache-airflow-backport-providers-apache-livy-2021.3.17.tar.gz.asc
  - copied unchanged from r46675, 
dev/airflow/backport-providers/2021.3.17rc1/apache-airflow-backport-providers-apache-livy-2021.3.17rc1.tar.gz.asc

release/airflow/backport-providers/apache-airflow-backport-providers-apache-livy-2021.3.17.tar.gz.sha512
  - copied unchanged from r46675, 
dev/airflow/backport-providers/2021.3.17rc1/apache-airflow-backport-providers-apache-livy-2021.3.17rc1.tar.gz.sha512

release/airflow/backport-providers/apache-airflow-backport-providers-apache-pig-2021.3.17.tar.gz
  - copied unchanged from r46675, 
dev/airflow/backport-providers/2021.3.17rc1/apache-airflow-backport-providers-apache-pig-2021.3.17rc1.tar.gz

release/airflow/backport-providers/apache-airflow-backport-providers-apache-pig-2021.3.17.tar.gz.asc
  - copied unchanged from r46675, 
dev/airflow/backport-providers/2021.3.17rc1/apache-airflow-backport-providers-apache-pig-2021.3.17rc1.tar.gz.asc

release/airflow/backport-providers/apache-airflow-backport-providers-apache-pig-2021.3.17.tar.gz.sha512
  - copied unchanged from r46675, 
dev/airflow/backport-providers/2021.3.17rc1/apache-airflow-backport-providers-apache-pig-2021.3.17rc1.tar.gz.sha512

release/airflow/backport-providers/apache-airflow-backport-providers-apache-sqoop-2021.3.17.tar.gz
  - copied unchanged from r46675, 
dev/airflow/backport-providers/2021.3.17rc1/a

[GitHub] [airflow] github-actions[bot] commented on pull request #14863: Convert tests/www/test_views.py to use Pytest fixtures

2021-03-17 Thread GitBox


github-actions[bot] commented on pull request #14863:
URL: https://github.com/apache/airflow/pull/14863#issuecomment-801509444


   [The Workflow run](https://github.com/apache/airflow/actions/runs/662776699) 
is cancelling this PR. It has some failed jobs matching ^Pylint$,^Static 
checks,^Build docs$,^Spell check docs$,^Backport packages$,^Provider 
packages,^Checks: Helm tests$,^Test OpenAPI*.







[airflow] branch v1-10-test updated: Pin SQLAlchemy to <1.4 due to breakage of sqlalchemy-utils (#14812)

2021-03-17 Thread kaxilnaik
This is an automated email from the ASF dual-hosted git repository.

kaxilnaik pushed a commit to branch v1-10-test
in repository https://gitbox.apache.org/repos/asf/airflow.git


The following commit(s) were added to refs/heads/v1-10-test by this push:
 new 594f5de  Pin SQLAlchemy to <1.4 due to breakage of sqlalchemy-utils 
(#14812)
594f5de is described below

commit 594f5de1763615f61ecf3d6c6f95a00309e74f1f
Author: Jarek Potiuk 
AuthorDate: Mon Mar 15 21:28:06 2021 +0100

Pin SQLAlchemy to <1.4 due to breakage of sqlalchemy-utils (#14812)

The 1.4 release of SQLAlchemy breaks sqlalchemy-utils.

This change pins it to < 1.4

Fixes #14811
---
 setup.py | 3 ++-
 1 file changed, 2 insertions(+), 1 deletion(-)

diff --git a/setup.py b/setup.py
index 84f4b4e..83da0ff 100644
--- a/setup.py
+++ b/setup.py
@@ -645,7 +645,8 @@ INSTALL_REQUIREMENTS = [
 'requests>=2.20.0, <2.23.0;python_version<"3.0"',  # Required to keep 
snowflake happy
 'requests>=2.20.0, <2.24.0;python_version>="3.0"',  # Required to keep 
snowflake happy
 'setproctitle>=1.1.8, <2',
-'sqlalchemy~=1.3',
+# SQLAlchemy 1.4 breaks sqlalchemy-utils 
https://github.com/kvesteri/sqlalchemy-utils/issues/505
+'sqlalchemy>=1.3.18, <1.4',
 'sqlalchemy_jsonfield==0.8.0;python_version<"3.5"',
 'sqlalchemy_jsonfield~=0.9;python_version>="3.5"',
 'tabulate>=0.7.5, <0.9',



[GitHub] [airflow-site] mik-laj opened a new pull request #393: Add docs for Apache Airflow 1.10.15

2021-03-17 Thread GitBox


mik-laj opened a new pull request #393:
URL: https://github.com/apache/airflow-site/pull/393


   







[GitHub] [airflow] github-actions[bot] commented on pull request #14865: Speed up TestFlaskCli test

2021-03-17 Thread GitBox


github-actions[bot] commented on pull request #14865:
URL: https://github.com/apache/airflow/pull/14865#issuecomment-801506619


   The PR is likely OK to be merged with just subset of tests for default 
Python and Database versions without running the full matrix of tests, because 
it does not modify the core of Airflow. If the committers decide that the full 
tests matrix is needed, they will add the label 'full tests needed'. Then you 
should rebase to the latest master or amend the last commit of the PR, and push 
it with --force-with-lease.







[GitHub] [airflow] bbovenzi commented on pull request #13199: Create dag dependencies view

2021-03-17 Thread GitBox


bbovenzi commented on pull request #13199:
URL: https://github.com/apache/airflow/pull/13199#issuecomment-801500851


   @ms32035 sounds good. I'll hold off the js migrations until this is ready.







[GitHub] [airflow] kaxil commented on a change in pull request #14865: Speed up TestFlaskCli test

2021-03-17 Thread GitBox


kaxil commented on a change in pull request #14865:
URL: https://github.com/apache/airflow/pull/14865#discussion_r596443307



##
File path: tests/www/test_app.py
##
@@ -241,9 +242,13 @@ def test_correct_default_is_set_for_cookie_samesite(self):
 assert app.config['SESSION_COOKIE_SAMESITE'] == 'Lax'
 
 
-class TestFlaskCli(unittest.TestCase):
-@dont_initialize_flask_app_submodules
-def test_flask_cli_should_display_routes(self):
-with mock.patch.dict("os.environ", 
FLASK_APP="airflow.www.app:create_app"):
-output = subprocess.check_output(["flask", "routes"])
-assert "/api/v1/version" in output.decode()
+class TestFlaskCli:
+@dont_initialize_flask_app_submodules(skip_all_except=['init_appbuilder'])
+def test_flask_cli_should_display_routes(self, capsys):
+with mock.patch.dict("os.environ", 
FLASK_APP="airflow.www.app:cached_app"), mock.patch.object(
+sys, 'argv', ['flask', 'routes']
+), pytest.raises(SystemExit):
+runpy.run_module('flask', run_name='__main__')
+
+output = capsys.readouterr()
+assert "/login/" in output.out

Review comment:
   TIL about `runpy`









[GitHub] [airflow] ashb commented on a change in pull request #14865: Speed up TestFlaskCli test

2021-03-17 Thread GitBox


ashb commented on a change in pull request #14865:
URL: https://github.com/apache/airflow/pull/14865#discussion_r596441189



##
File path: tests/www/test_app.py
##
@@ -241,9 +242,13 @@ def test_correct_default_is_set_for_cookie_samesite(self):
 assert app.config['SESSION_COOKIE_SAMESITE'] == 'Lax'
 
 
-class TestFlaskCli(unittest.TestCase):
-@dont_initialize_flask_app_submodules
-def test_flask_cli_should_display_routes(self):
-with mock.patch.dict("os.environ", 
FLASK_APP="airflow.www.app:create_app"):
-output = subprocess.check_output(["flask", "routes"])
-assert "/api/v1/version" in output.decode()
+class TestFlaskCli:
+@dont_initialize_flask_app_submodules(skip_all_except=['init_appbuilder'])
+def test_flask_cli_should_display_routes(self, capsys):
+with mock.patch.dict("os.environ", 
FLASK_APP="airflow.www.app:cached_app"), mock.patch.object(
+sys, 'argv', ['flask', 'routes']
+), pytest.raises(SystemExit):
+runpy.run_module('flask', run_name='__main__')
+
+output = capsys.readouterr()
+assert "/login/" in output.out

Review comment:
   For speed I haven't loaded the Connexion API, so this test is changed to 
one of the built-in FAB routes instead.









[GitHub] [airflow] github-actions[bot] commented on pull request #14861: Pass queue in BaseExecutor.execute_async like in airflow 1.10

2021-03-17 Thread GitBox


github-actions[bot] commented on pull request #14861:
URL: https://github.com/apache/airflow/pull/14861#issuecomment-801497457


   [The Workflow run](https://github.com/apache/airflow/actions/runs/662713522) 
is cancelling this PR. It has some failed jobs matching ^Pylint$,^Static 
checks,^Build docs$,^Spell check docs$,^Backport packages$,^Provider 
packages,^Checks: Helm tests$,^Test OpenAPI*.







[GitHub] [airflow] txr150430 commented on issue #13142: Error while attempting to disable login (setting AUTH_ROLE_PUBLIC = 'Admin')

2021-03-17 Thread GitBox


txr150430 commented on issue #13142:
URL: https://github.com/apache/airflow/issues/13142#issuecomment-801497161


   Thanks @LamaAni! Upgrading to the latest release worked.







[GitHub] [airflow] ashb opened a new pull request #14865: Speed up TestFlaskCli test

2021-03-17 Thread GitBox


ashb opened a new pull request #14865:
URL: https://github.com/apache/airflow/pull/14865


   Creating a new process and reloading all of the app is slow, so instead
   use the built-in `runpy` module to run the `flask` module in the same
   interpreter, taking advantage of the already-cached app.
   
   This one test was taking about 20s, now down to 3s (still slow, but much
   much faster)
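   
   A generic illustration of the technique (not the PR's actual test code; the module name and argv values are just an example, and the real test also points `FLASK_APP` at the cached Airflow app):
   
   ```python
   import runpy
   import sys
   from unittest import mock
   
   # Run the equivalent of `python -m flask routes` in the current interpreter
   # instead of spawning a new process.
   with mock.patch.object(sys, "argv", ["flask", "routes"]):
       try:
           runpy.run_module("flask", run_name="__main__")
       except SystemExit:
           pass  # CLI entry points usually finish by calling sys.exit().
   ```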
   







[GitHub] [airflow] houqp commented on pull request #14739: Add files to generate Airflow's Python SDK

2021-03-17 Thread GitBox


houqp commented on pull request #14739:
URL: https://github.com/apache/airflow/pull/14739#issuecomment-801493476


   @msumit sorry that I have been busy lately; I would need some more time to 
test the change on my end. It looks good overall, but it's weird to see that 
content type assert failure prior to you reverting the change in openapi spec. 
It looks like a server error to me and I remember it used to behave correctly 
when I was testing the go client. In fact the go client code gen fix I sent 
upstream was for fixing this problem specifically.







[GitHub] [airflow] ms32035 commented on pull request #13199: Create dag dependencies view

2021-03-17 Thread GitBox


ms32035 commented on pull request #13199:
URL: https://github.com/apache/airflow/pull/13199#issuecomment-801491972


   @bbovenzi I'd appreciate it if this is merged before any other changes to the 
affected files. I already had to resolve the conflicts a couple of times, and 
since then I've seen a few new PRs pop up, including 
https://github.com/apache/airflow/pull/14661, which breaks this for now. I'll 
try to fix it over the weekend.







[GitHub] [airflow] ashb commented on a change in pull request #14863: Convert tests/www/test_views.py to use Pytest fixtures

2021-03-17 Thread GitBox


ashb commented on a change in pull request #14863:
URL: https://github.com/apache/airflow/pull/14863#discussion_r596430659



##
File path: tests/www/test_views.py
##
@@ -122,31 +122,39 @@ def local_context(self):
 return result
 
 
-class TestBase(unittest.TestCase):
-@classmethod
-@dont_initialize_flask_app_submodules(
-skip_all_except=[
-"init_appbuilder",
-"init_appbuilder_views",
-"init_flash_views",
-"init_jinja_globals",
-]
-)
-def setUpClass(cls):
+class TestBase:
+@pytest.fixture(scope="class")

Review comment:
   And if you make this a fixture in `tests/www/conftest.py` then it can be 
used by `tests/www/test_app.py` etc too (I have a change where I'd like to use 
this in, for instance.)
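   
   A small sketch of that suggestion (hypothetical fixture name; it assumes only that `airflow.www.app.create_app` accepts `testing=True`):
   
   ```python
   # tests/www/conftest.py
   import pytest
   
   from airflow.www import app as application
   
   
   @pytest.fixture(scope="session")
   def minimal_app():
       # Built once per test session and shared by test_views.py, test_app.py, etc.
       return application.create_app(testing=True)
   ```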









[GitHub] [airflow] ashb commented on a change in pull request #14863: Convert tests/www/test_views.py to use Pytest fixtures

2021-03-17 Thread GitBox


ashb commented on a change in pull request #14863:
URL: https://github.com/apache/airflow/pull/14863#discussion_r596428652



##
File path: tests/www/test_views.py
##
@@ -122,31 +122,39 @@ def local_context(self):
 return result
 
 
-class TestBase(unittest.TestCase):
-@classmethod
-@dont_initialize_flask_app_submodules(
-skip_all_except=[
-"init_appbuilder",
-"init_appbuilder_views",
-"init_flash_views",
-"init_jinja_globals",
-]
-)
-def setUpClass(cls):
+class TestBase:
+@pytest.fixture(scope="class")

Review comment:
   Making this session scoped will I think make a big difference.









[GitHub] [airflow] wolfier opened a new issue #14864: Using TaskGroup without context manager (Graph view visual bug)

2021-03-17 Thread GitBox


wolfier opened a new issue #14864:
URL: https://github.com/apache/airflow/issues/14864


   **Apache Airflow version**: 2.0.0
   
   **Kubernetes version (if you are using kubernetes)** (use `kubectl 
version`): n/a
   
   **What happened**:
   When I do not use the context manager for the task group and instead call 
the add function to add the tasks, those tasks show up on the Graph view.
   ![Screen Shot 2021-03-17 at 2 06 17 
PM](https://user-images.githubusercontent.com/5952735/111544849-5939b200-8732-11eb-80dc-89c013aeb083.png)
 
   
   However, when I click on the task group item on the Graph UI, it will fix 
the issue. When I close the task group item, the tasks will not be displayed as 
expected.
   ![Screen Shot 2021-03-17 at 2 06 21 
PM](https://user-images.githubusercontent.com/5952735/111544848-58a11b80-8732-11eb-928b-3c76207a0107.png)
   
   **What you expected to happen**:
   I expected the tasks inside the task group to not display on the Graph view.
   ![Screen Shot 2021-03-17 at 3 17 34 
PM](https://user-images.githubusercontent.com/5952735/111545824-eaf5ef00-8733-11eb-99c2-75b051bfefe1.png)
   
   **How to reproduce it**:
   Render this DAG in Airflow
   
   ```python
   from airflow.models import DAG
   from airflow.operators.bash import BashOperator
   from airflow.operators.dummy import DummyOperator
   from airflow.utils.task_group import TaskGroup
   from datetime import datetime

   with DAG(dag_id="example_task_group", start_date=datetime(2021, 1, 1), tags=["example"], catchup=False) as dag:
       start = BashOperator(task_id="start", bash_command='echo 1; sleep 10; echo 2;')
       tg = TaskGroup("section_1", tooltip="Tasks for section_1")
       task_1 = DummyOperator(task_id="task_1")
       task_2 = BashOperator(task_id="task_2", bash_command='echo 1')
       task_3 = DummyOperator(task_id="task_3")
       tg.add(task_1)
       tg.add(task_2)
       tg.add(task_3)
   ```
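   
   For contrast, a sketch (not from the issue report) of the context-manager form, where tasks created inside the `with TaskGroup(...)` block are attached to the group automatically:
   
   ```python
   from airflow.models import DAG
   from airflow.operators.bash import BashOperator
   from airflow.operators.dummy import DummyOperator
   from airflow.utils.task_group import TaskGroup
   from datetime import datetime
   
   with DAG(dag_id="example_task_group_ctx", start_date=datetime(2021, 1, 1), catchup=False) as dag:
       with TaskGroup("section_1", tooltip="Tasks for section_1"):
           task_1 = DummyOperator(task_id="task_1")
           task_2 = BashOperator(task_id="task_2", bash_command='echo 1')
           task_3 = DummyOperator(task_id="task_3")
   ```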







[GitHub] [airflow] kaxil commented on pull request #14816: Extend HTTP extra_options to LivyHook and operator

2021-03-17 Thread GitBox


kaxil commented on pull request #14816:
URL: https://github.com/apache/airflow/pull/14816#issuecomment-801478618


   Thanks @dsynkov 🎉 







[GitHub] [airflow] kaxil merged pull request #14816: Extend HTTP extra_options to LivyHook and operator

2021-03-17 Thread GitBox


kaxil merged pull request #14816:
URL: https://github.com/apache/airflow/pull/14816


   







[airflow] branch master updated: Extend HTTP extra_options to LivyHook and operator (#14816)

2021-03-17 Thread kaxilnaik
This is an automated email from the ASF dual-hosted git repository.

kaxilnaik pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/airflow.git


The following commit(s) were added to refs/heads/master by this push:
 new 7073107  Extend HTTP extra_options to LivyHook and operator (#14816)
7073107 is described below

commit 70731073d0509ac44777624c03cd9eeae71e6fea
Author: Dmitriy Synkov <30638538+dsyn...@users.noreply.github.com>
AuthorDate: Wed Mar 17 18:17:57 2021 -0400

Extend HTTP extra_options to LivyHook and operator (#14816)

The LivyHook used by the LivyOperator has extra_options in its main 
run_method but there's no way to use them from the actual operator itself.
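
    A hedged usage sketch after this change (parameter names are taken from the diff below; the task id, file path, and option keys are illustrative only):

    ```python
    from airflow.providers.apache.livy.operators.livy import LivyOperator

    submit_job = LivyOperator(
        task_id="submit_spark_job",
        file="local:///opt/spark/examples/src/main/python/pi.py",
        livy_conn_id="livy_default",
        polling_interval=30,
        # Forwarded to the LivyHook and, from there, to HttpHook.run().
        extra_options={"timeout": 60, "verify": False},
    )
    ```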
---
 airflow/providers/apache/livy/hooks/livy.py| 14 +++---
 airflow/providers/apache/livy/operators/livy.py|  6 +-
 airflow/providers/apache/livy/sensors/livy.py  | 16 
 tests/providers/apache/livy/operators/test_livy.py | 11 +++
 4 files changed, 35 insertions(+), 12 deletions(-)

diff --git a/airflow/providers/apache/livy/hooks/livy.py 
b/airflow/providers/apache/livy/hooks/livy.py
index 4d1d8dd..75d08af 100644
--- a/airflow/providers/apache/livy/hooks/livy.py
+++ b/airflow/providers/apache/livy/hooks/livy.py
@@ -69,8 +69,11 @@ class LivyHook(HttpHook, LoggingMixin):
 conn_type = 'livy'
 hook_name = 'Apache Livy'
 
-def __init__(self, livy_conn_id: str = default_conn_name) -> None:
+def __init__(
+self, livy_conn_id: str = default_conn_name, extra_options: 
Optional[Dict[str, Any]] = None
+) -> None:
 super().__init__(http_conn_id=livy_conn_id)
+self.extra_options = extra_options or {}
 
 def get_conn(self, headers: Optional[Dict[str, Any]] = None) -> Any:
 """
@@ -92,7 +95,6 @@ class LivyHook(HttpHook, LoggingMixin):
 method: str = 'GET',
 data: Optional[Any] = None,
 headers: Optional[Dict[str, Any]] = None,
-extra_options: Optional[Dict[Any, Any]] = None,
 ) -> Any:
 """
 Wrapper for HttpHook, allows to change method on the same HttpHook
@@ -105,20 +107,18 @@ class LivyHook(HttpHook, LoggingMixin):
 :type data: dict
 :param headers: headers
 :type headers: dict
-:param extra_options: extra options
-:type extra_options: dict
 :return: http response
 :rtype: requests.Response
 """
 if method not in ('GET', 'POST', 'PUT', 'DELETE', 'HEAD'):
 raise ValueError(f"Invalid http method '{method}'")
-if extra_options is None:
-extra_options = {'check_response': False}
+if not self.extra_options:
+self.extra_options = {'check_response': False}
 
 back_method = self.method
 self.method = method
 try:
-result = self.run(endpoint, data, headers, extra_options)
+result = self.run(endpoint, data, headers, self.extra_options)
 finally:
 self.method = back_method
 return result
diff --git a/airflow/providers/apache/livy/operators/livy.py 
b/airflow/providers/apache/livy/operators/livy.py
index 4f302a8..d135194 100644
--- a/airflow/providers/apache/livy/operators/livy.py
+++ b/airflow/providers/apache/livy/operators/livy.py
@@ -66,6 +66,8 @@ class LivyOperator(BaseOperator):
 :type livy_conn_id: str
 :param polling_interval: time in seconds between polling for job 
completion. Don't poll for values >=0
 :type polling_interval: int
+:type extra_options: A dictionary of options, where key is string and value
+depends on the option that's being modified.
 """
 
 template_fields = ('spark_params',)
@@ -92,6 +94,7 @@ class LivyOperator(BaseOperator):
 proxy_user: Optional[str] = None,
 livy_conn_id: str = 'livy_default',
 polling_interval: int = 0,
+extra_options: Optional[Dict[str, Any]] = None,
 **kwargs: Any,
 ) -> None:
 # pylint: disable-msg=too-many-arguments
@@ -119,6 +122,7 @@ class LivyOperator(BaseOperator):
 
 self._livy_conn_id = livy_conn_id
 self._polling_interval = polling_interval
+self._extra_options = extra_options or {}
 
 self._livy_hook: Optional[LivyHook] = None
 self._batch_id: Union[int, str]
@@ -131,7 +135,7 @@ class LivyOperator(BaseOperator):
 :rtype: LivyHook
 """
 if self._livy_hook is None or not isinstance(self._livy_hook, 
LivyHook):
-self._livy_hook = LivyHook(livy_conn_id=self._livy_conn_id)
+self._livy_hook = LivyHook(livy_conn_id=self._livy_conn_id, 
extra_options=self._extra_options)
 return self._livy_hook
 
 def execute(self, context: Dict[Any, Any]) -> Any:
diff --git a/airflow/providers/apache/livy/sensors/livy.py 
b/airflow/providers/apache/livy/sensors/livy.py
index ae470c9..782ae08 100644
--- a/airflow/providers/apache/livy/sensors/l

[GitHub] [airflow] uranusjr opened a new pull request #14863: Convert tests/www/test_views.py to use Pytest fixtures

2021-03-17 Thread GitBox


uranusjr opened a new pull request #14863:
URL: https://github.com/apache/airflow/pull/14863


   A follow-up to #14746, this applies a similar approach to convert tests to 
use Pytest fixtures.
   
   I managed to make all but one test work. The one failure (marked as xfail 
for now) is `TestAirflowBaseViews::test_index`; somehow my conversion makes the 
index page use only 8 queries instead of the previous 43. I couldn’t figure 
out what caused the change and would be happy for any insights.
   
   The change does not produce any measurable time difference in my 
benchmarks, which is kind of expected since I mostly performed a one-to-one 
conversion from `setUpClass` to `scope="class"` and `setUp` to 
`scope="function"`. I intend to go through the changes and try to locate any 
possible optimisations, but it seems the tests did not re-create expensive 
resources repeatedly to begin with.
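   
   A made-up illustration of that one-to-one mapping (not the actual test code from this PR):
   
   ```python
   import pytest
   
   
   class TestExampleViews:
       @pytest.fixture(scope="class")
       def expensive_resource(self):
           # Created once per class, like unittest's setUpClass.
           yield {"created": True}
   
       @pytest.fixture(autouse=True)
       def per_test_setup(self):
           # Wraps every test, like unittest's setUp/tearDown.
           yield
   
       def test_uses_resource(self, expensive_resource):
           assert expensive_resource["created"]
   ```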







[GitHub] [airflow] boring-cyborg[bot] commented on pull request #14863: Convert tests/www/test_views.py to use Pytest fixtures

2021-03-17 Thread GitBox


boring-cyborg[bot] commented on pull request #14863:
URL: https://github.com/apache/airflow/pull/14863#issuecomment-801474406


   Congratulations on your first Pull Request and welcome to the Apache Airflow 
community! If you have any issues or are unsure about anything, please check 
our Contribution Guide 
(https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst)
   Here are some useful points:
   - Pay attention to the quality of your code (flake8, pylint and type 
annotations). Our [pre-commits]( 
https://github.com/apache/airflow/blob/master/STATIC_CODE_CHECKS.rst#prerequisites-for-pre-commit-hooks)
 will help you with that.
   - In case of a new feature add useful documentation (in docstrings or in 
`docs/` directory). Adding a new operator? Check this short 
[guide](https://github.com/apache/airflow/blob/master/docs/apache-airflow/howto/custom-operator.rst)
 Consider adding an example DAG that shows how users should use it.
   - Consider using [Breeze 
environment](https://github.com/apache/airflow/blob/master/BREEZE.rst) for 
testing locally, it’s a heavy docker but it ships with a working Airflow and a 
lot of integrations.
   - Be patient and persistent. It might take some time to get a review or get 
the final approval from Committers.
   - Please follow [ASF Code of 
Conduct](https://www.apache.org/foundation/policies/conduct) for all 
communication including (but not limited to) comments on Pull Requests, Mailing 
list and Slack.
   - Be sure to read the [Airflow Coding style]( 
https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst#coding-style-and-best-practices).
   Apache Airflow is a community-driven project and together we are making it 
better 🚀.
   In case of doubts contact the developers at:
   Mailing List: d...@airflow.apache.org
   Slack: https://s.apache.org/airflow-slack
   







[GitHub] [airflow] ferruzzi edited a comment on issue #8544: Create EKS operators

2021-03-17 Thread GitBox


ferruzzi edited a comment on issue #8544:
URL: https://github.com/apache/airflow/issues/8544#issuecomment-800643763


   I've started putting down the boilerplate code, and I think this is how I am 
going to tackle it. I'm not sure how formal you folks like it yet, but 
please let me know what you think of the plan.
   
   ### Goals
   
   Add a collection of Apache Airflow Operators which interact with Amazon 
Elastic Kubernetes Service (EKS) and abstract all Amazon logic from the Airflow 
user.  This will allow Airflow to be a general purpose Kubernetes orchestrator 
able to do multi-cluster orchestration across multiple clouds.
   
   ### Proposal
   
   The proposed solution is a collection of Operators, and their underlying 
Hooks, which will be added to the Amazon AWS provider package.  These Operators 
will handle creating and deleting clusters, as well as executing tasks using 
EKS Managed Node Groups.  
   
   ### Assumptions and Prerequisites
   
   * The account running the DAGs will need eks:DescribeCluster IAM permissions 
to retrieve the information currently provided by the manual kubeconfig file.
   
   ### Definitions
   
   *Pod* - A Kubernetes *pod* is the way that Kubernetes runs containers on a 
compute instance and includes containers and specifications for how they should 
run, networking, and storage. A *pod* can be a single container or multiple 
containers that always run together.
   
   *Cluster* - An Amazon EKS *cluster* consists of the Amazon EKS control 
plane, which runs the Kubernetes software and API server, and the *pod* that is 
registered with the control plane.
   
   *Operator* - An *operator* defines a single task within the workflow.
   
   *kubectl* - The Kubernetes command-line tool which allows users to run 
commands against Kubernetes clusters. Uses include deploying applications, 
inspecting and managing cluster resources, and viewing logs.
   
   *eksctl* - An open source CLI tool created by the community to create 
clusters on EKS using CloudFormation.
   
   *aws eks (cli tool)* - A CLI tool which, among other things, is used to 
generate the kubeconfig file.
   
   *kubeconfig* - A config file containing required information about clusters, 
users, namespaces, and authentication mechanisms. *kubectl* uses *kubeconfig* 
files to find the information it needs to choose a cluster and communicate with 
the API server of a cluster.
   
   *EKS Managed Node Groups* (nodegroup) - Infrastructure as a Service - *EKS 
Managed Node Groups* create and manage Amazon Elastic Compute Cloud (EC2) 
instances which host a Kubernetes cluster.  This is the default underlying 
compute platform for EKS clusters. 
   
   *Task* - The process or command being run in a pod.
   
   
   ### Context and User Experience
   
   While the basic functions of creating and running pods on EKS can be handled 
through the existing Cloud Native Computing Foundation (CNCF) Kubernetes Pod 
Operator, running the pods on EKS introduces pain points to the users, some of 
which are detailed below, and requires some specific EKS knowledge.  By 
abstracting away some of this Amazon-specific logic, we can automate and 
streamline the configuration and deployment of new pods.
   
   Currently, in order to deploy a new pod on EKS, the user needs to leverage 
the kubectl, eksctl, and aws command-line tools and generate config files to 
manually pass data to the Kubernetes Pod Operator.  The current manual process 
is:
   
   1. Create a cluster - uses the eksctl CLI tool
   2. Create a namespace - uses the kubectl CLI tool
   3. Create and attach an IAM Role for permission to log into the cluster - 
uses eksctl CLI tool
   4. Create or modify the Airflow requirements.txt file to ensure it contains 
two required packages: awscli and kubernetes==12.0.1
   5. Create and possibly edit the kubeconfig file - uses aws eks CLI tool
   6. Copy the edited kubeconfig file to the dags directory
   
   
   Using the BOTO3 python API, new Operators can automate most or all of those 
steps and create a more seamless experience for the user.
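   
   A minimal sketch of the kind of call involved (hypothetical helper function; it assumes only the boto3 EKS `describe_cluster` API):
   
   ```python
   import boto3
   
   
   def describe_eks_cluster(cluster_name: str, region_name: str = "us-east-1") -> dict:
       """Fetch the cluster endpoint and certificate data that an operator
       would otherwise need from a hand-written kubeconfig file."""
       eks = boto3.client("eks", region_name=region_name)
       return eks.describe_cluster(name=cluster_name)["cluster"]
   ```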
   
   ### Use Cases
   Use Case # | Short Description | Priority | Supporting Operator
   -- | -- | -- | --
   1 | As a user, I want to create a new cluster using existing pods. | 0 | Create Cluster
   2 | As a user, I want to be able to delete a cluster I have created. | 0 | Delete Cluster
   3 | As a user, I want to execute a new task on my existing pod. | 0 | Start Pod
   4 | As a user, I want to delete a pod that I created on a nodegroup. | 0 | Delete Nodegroup
   5 | As a user, I want to create a new pod using managed nodegroups. | 0 | Create Nodegroup
   
   ### Benchmarks
   
   At a minimum, this solution should offer feature parity with the Google 
Kubernetes Engine (GKE) Pod Operator functionality.
   
   **Create Cluster** - Create a Google Kubernetes Engine Cluster of specified 
dimensions
   
   ```
   operator = GKEClusterCreateOperator(
       task_id='c
   ```

[GitHub] [airflow] potiuk commented on pull request #14531: Running tests in parallel

2021-03-17 Thread GitBox


potiuk commented on pull request #14531:
URL: https://github.com/apache/airflow/pull/14531#issuecomment-801448923


   FYI. I vastly simplified the parallelisation bash code here. I am leaning 
much more on `GNU parallel` functionality rather than manually managing PIDs. 
It is much simpler and more straightforward now.



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] potiuk commented on pull request #14792: Fixes some of the flaky tests in test_scheduler_job

2021-03-17 Thread GitBox


potiuk commented on pull request #14792:
URL: https://github.com/apache/airflow/pull/14792#issuecomment-801443712


   Hey @ashb @kaxil. I think I finally nailed it. The last time I ran it in 
parallel, everything succeeded. I have improved the parallel run further and am 
running it now, but I would love to merge this now.
   
   Summarising the fixes:
   
   * We cannot use `__del__`, as SchedulerJob is an SQLAlchemy-managed object - 
and those two do not work well together.
   
   * I updated all tests' setUp and tearDown to "end" SchedulerJob processes. 
That seems to help in most cases, except the one count test with 195 updates.
   
   * Rather than setting the "min_update/min_fetch" intervals to 0, I set them 
to 100 in the count test, and that finally fixed that test's stability, I 
think. The problem was that the test, run together with others, ran long enough 
that there was an extra "get" from serialized DAGs - something that 
"min_fetch_interval" would normally prevent, but it seems the interval was 
actually too low.



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] leonsmith commented on a change in pull request #14861: Pass queue in BaseExecutor.execute_async like in airflow 1.10

2021-03-17 Thread GitBox


leonsmith commented on a change in pull request #14861:
URL: https://github.com/apache/airflow/pull/14861#discussion_r596377787



##
File path: airflow/executors/base_executor.py
##
@@ -182,10 +182,10 @@ def trigger_tasks(self, open_slots: int) -> None:
         sorted_queue = self.order_queued_tasks_by_priority()
 
         for _ in range(min((open_slots, len(self.queued_tasks)))):
-            key, (command, _, _, ti) = sorted_queue.pop(0)
+            key, (command, _, queue, ti) = sorted_queue.pop(0)
             self.queued_tasks.pop(key)
             self.running.add(key)
-            self.execute_async(key=key, command=command, queue=None, executor_config=ti.executor_config)
+            self.execute_async(key=key, command=command, queue=queue, executor_config=ti.executor_config)

Review comment:
   Yep, celery is not broken, as it does override `trigger_tasks` to pass 
the queue.
   
   The BaseExecutor should still expose all the functionality (like it used to) 
without executors having to override this function, though - otherwise the 
`queue` argument to `execute_async` is redundant?





This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] kaxil commented on a change in pull request #14861: Pass queue in BaseExecutor.execute_async like in airflow 1.10

2021-03-17 Thread GitBox


kaxil commented on a change in pull request #14861:
URL: https://github.com/apache/airflow/pull/14861#discussion_r596371734



##
File path: airflow/executors/base_executor.py
##
@@ -182,10 +182,10 @@ def trigger_tasks(self, open_slots: int) -> None:
         sorted_queue = self.order_queued_tasks_by_priority()
 
         for _ in range(min((open_slots, len(self.queued_tasks)))):
-            key, (command, _, _, ti) = sorted_queue.pop(0)
+            key, (command, _, queue, ti) = sorted_queue.pop(0)
             self.queued_tasks.pop(key)
             self.running.add(key)
-            self.execute_async(key=key, command=command, queue=None, executor_config=ti.executor_config)
+            self.execute_async(key=key, command=command, queue=queue, executor_config=ti.executor_config)

Review comment:
   Celery Executor already overrides this method:
   
   
https://github.com/apache/airflow/blob/2a2adb3f94cc165014d746102e12f9620f271391/airflow/executors/celery_executor.py#L244-L263





This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] github-actions[bot] commented on pull request #14843: Improvements for Docker Image docs

2021-03-17 Thread GitBox


github-actions[bot] commented on pull request #14843:
URL: https://github.com/apache/airflow/pull/14843#issuecomment-801433078


   The PR is likely ready to be merged. No tests are needed as no important 
environment files, nor python files were modified by it. However, committers 
might decide that full test matrix is needed and add the 'full tests needed' 
label. Then you should rebase it to the latest master or amend the last commit 
of the PR, and push it with --force-with-lease.



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] bbovenzi edited a comment on pull request #14862: Adds initial router, routes, and placeholder views

2021-03-17 Thread GitBox


bbovenzi edited a comment on pull request #14862:
URL: https://github.com/apache/airflow/pull/14862#issuecomment-801425554


   Looks good! But I think the test will break. Could we update it to test the 
router (i.e. pipelines will render on `/` and the 404 page will show up on 
`/invalid-path`)?
   
   



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] bbovenzi commented on pull request #14862: Adds initial router, routes, and placeholder views

2021-03-17 Thread GitBox


bbovenzi commented on pull request #14862:
URL: https://github.com/apache/airflow/pull/14862#issuecomment-801425554


   Looks good! But I think the test will break. Could we update it to test the 
router (i.e. our main page will render on `/` and the 404 page will show up on 
`/invalid-path`)?
   
   



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] github-actions[bot] commented on pull request #14846: Optimize docs build for new packages

2021-03-17 Thread GitBox


github-actions[bot] commented on pull request #14846:
URL: https://github.com/apache/airflow/pull/14846#issuecomment-801423053


   [The Workflow run](https://github.com/apache/airflow/actions/runs/662366525) 
is cancelling this PR. It has some failed jobs matching ^Pylint$,^Static 
checks,^Build docs$,^Spell check docs$,^Backport packages$,^Provider 
packages,^Checks: Helm tests$,^Test OpenAPI*.



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] ryanahamilton opened a new pull request #14862: Adds initial router, routes, and placeholder views

2021-03-17 Thread GitBox


ryanahamilton opened a new pull request #14862:
URL: https://github.com/apache/airflow/pull/14862


   Resolves #14802.
   
   This adds `react-router-dom` with an initial collection of Routes and 
corresponding placeholder view components.



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] Goodkat commented on pull request #14860: Better "dependency already registered" warning message #14613

2021-03-17 Thread GitBox


Goodkat commented on pull request #14860:
URL: https://github.com/apache/airflow/pull/14860#issuecomment-801416674


   It is related to issue #14613 and, as stated there, it would be nice if the 
warning message also included the DAG information, so we would have a better 
idea of where (in which DAG) it is happening.



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] bbovenzi edited a comment on issue #14803: Add API Query Sort and Order Params

2021-03-17 Thread GitBox


bbovenzi edited a comment on issue #14803:
URL: https://github.com/apache/airflow/issues/14803#issuecomment-799495476


   A lot of our data will be displayed in some sort of list or table view. A 
necessary part of that will be the ability to sort and order the lists by any 
field.
   
   Particularly, data with datetimes should have this and default to return the 
latest results instead of the earliest:
   GET `/dags/{dag_id}/dagRuns`
   GET `/dags/{dag_id}/dagRuns/{dag_run_id}/taskInstances`
   GET `/dags/{dag_id}/tasks`
   GET `/eventLogs`
   GET `/dags/{dag_id}/dagRuns/{dag_run_id}/taskInstances/{task_id}/xcomEntries`
   GET `/importErrors`
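
   For example (illustrative only; the exact parameter name is not settled here), something like `GET /dags/{dag_id}/dagRuns?order_by=-start_date` could return the newest runs first.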
   



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] kaxil commented on pull request #14860: Better "dependency already registered" warning message #14613

2021-03-17 Thread GitBox


kaxil commented on pull request #14860:
URL: https://github.com/apache/airflow/pull/14860#issuecomment-801400154


   Can you add a description explaining what this PR does, why it is required, 
etc.?



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] leonsmith opened a new pull request #14861: Pass queue in BaseExecutor.execute_async like in airflow 1.10

2021-03-17 Thread GitBox


leonsmith opened a new pull request #14861:
URL: https://github.com/apache/airflow/pull/14861


   
[1.10](https://github.com/apache/airflow/blob/1.10.15/airflow/executors/base_executor.py#L153)
 passes the Task Instance queue, but the refactor in 
[2.0](https://github.com/apache/airflow/blob/2.0.1/airflow/executors/base_executor.py#L188)
 looks to have missed this.
   
   Any schedulers depending on the queue functionality that haven't overridden 
`trigger_tasks` will see queue functionality break when upgrading to 2.0



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] boring-cyborg[bot] commented on pull request #14860: Better "dependency already registered" warning message #14613

2021-03-17 Thread GitBox


boring-cyborg[bot] commented on pull request #14860:
URL: https://github.com/apache/airflow/pull/14860#issuecomment-801373609


   Congratulations on your first Pull Request and welcome to the Apache Airflow 
community! If you have any issues or are unsure about anything, please check 
our Contribution Guide 
(https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst)
   Here are some useful points:
   - Pay attention to the quality of your code (flake8, pylint and type 
annotations). Our [pre-commits]( 
https://github.com/apache/airflow/blob/master/STATIC_CODE_CHECKS.rst#prerequisites-for-pre-commit-hooks)
 will help you with that.
   - In case of a new feature add useful documentation (in docstrings or in 
`docs/` directory). Adding a new operator? Check this short 
[guide](https://github.com/apache/airflow/blob/master/docs/apache-airflow/howto/custom-operator.rst)
 Consider adding an example DAG that shows how users should use it.
   - Consider using [Breeze 
environment](https://github.com/apache/airflow/blob/master/BREEZE.rst) for 
testing locally, it’s a heavy docker but it ships with a working Airflow and a 
lot of integrations.
   - Be patient and persistent. It might take some time to get a review or get 
the final approval from Committers.
   - Please follow [ASF Code of 
Conduct](https://www.apache.org/foundation/policies/conduct) for all 
communication including (but not limited to) comments on Pull Requests, Mailing 
list and Slack.
   - Be sure to read the [Airflow Coding style]( 
https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst#coding-style-and-best-practices).
   Apache Airflow is a community-driven project and together we are making it 
better 🚀.
   In case of doubts contact the developers at:
   Mailing List: d...@airflow.apache.org
   Slack: https://s.apache.org/airflow-slack
   



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] Goodkat opened a new pull request #14860: Better "dependency already registered" warning message #14613

2021-03-17 Thread GitBox


Goodkat opened a new pull request #14860:
URL: https://github.com/apache/airflow/pull/14860


   
   
   ---
   **^ Add meaningful description above**
   
   Read the **[Pull Request 
Guidelines](https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst#pull-request-guidelines)**
 for more information.
   In case of fundamental code change, Airflow Improvement Proposal 
([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals))
 is needed.
   In case of a new dependency, check compliance with the [ASF 3rd Party 
License Policy](https://www.apache.org/legal/resolved.html#category-x).
   In case of backwards incompatible changes please leave a note in 
[UPDATING.md](https://github.com/apache/airflow/blob/master/UPDATING.md).
   



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] ericpp edited a comment on issue #14857: MySQL hook uses wrong autocommit calls for mysql-connector-python

2021-03-17 Thread GitBox


ericpp edited a comment on issue #14857:
URL: https://github.com/apache/airflow/issues/14857#issuecomment-801362326







This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] ericpp edited a comment on issue #14857: MySQL hook uses wrong autocommit calls for mysql-connector-python

2021-03-17 Thread GitBox


ericpp edited a comment on issue #14857:
URL: https://github.com/apache/airflow/issues/14857#issuecomment-801362326


   @eladkal I did try that, but `conn` seems to point to a 
mysql-connector-python instance rather than the Airflow Connection model in 
`set_autocommit` and `get_autocommit`



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] ericpp commented on issue #14857: MySQL hook uses wrong autocommit calls for mysql-connector-python

2021-03-17 Thread GitBox


ericpp commented on issue #14857:
URL: https://github.com/apache/airflow/issues/14857#issuecomment-801362326


   @eladkal I did try that, but `conn` seems to point to a 
mysql-connector-python instance rather than the Airflow Connection model



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] rob2244 commented on issue #13755: Elasticsearch log retrieval fails when "host" field is not a string

2021-03-17 Thread GitBox


rob2244 commented on issue #13755:
URL: https://github.com/apache/airflow/issues/13755#issuecomment-801356482


   Having the same issue



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] eladkal edited a comment on issue #14857: MySQL hook uses wrong autocommit calls for mysql-connector-python

2021-03-17 Thread GitBox


eladkal edited a comment on issue #14857:
URL: https://github.com/apache/airflow/issues/14857#issuecomment-801354085


   You can know which is used by using `client_name`:
   
https://github.com/apache/airflow/blob/2a2adb3f94cc165014d746102e12f9620f271391/airflow/providers/mysql/hooks/mysql.py#L140
   
   
   
https://github.com/apache/airflow/blob/2a2adb3f94cc165014d746102e12f9620f271391/airflow/providers/mysql/hooks/mysql.py#L142
   
   
https://github.com/apache/airflow/blob/2a2adb3f94cc165014d746102e12f9620f271391/airflow/providers/mysql/hooks/mysql.py#L148
   
   Will you submit a PR to fix the issue?
   
   FYI @feluelle I believe you added the library?
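
   A rough sketch of branching on it (illustrative only, not the hook's actual code; assumes the extra field is named `client` as in the linked lines):
   
   ```python
   def set_autocommit(conn, connection, autocommit: bool) -> None:
       # conn: the Airflow Connection (for extras); connection: the DB-API connection.
       client_name = conn.extra_dejson.get('client', 'mysqlclient')
       if client_name == 'mysql-connector-python':
           connection.autocommit = autocommit   # property on mysql-connector-python
       else:
           connection.autocommit(autocommit)    # method on MySQLdb / mysqlclient
   ```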



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] eladkal commented on issue #14857: MySQL hook uses wrong autocommit calls for mysql-connector-python

2021-03-17 Thread GitBox


eladkal commented on issue #14857:
URL: https://github.com/apache/airflow/issues/14857#issuecomment-801354085


   You can know which is used by using `client_name`:
   
https://github.com/apache/airflow/blob/2a2adb3f94cc165014d746102e12f9620f271391/airflow/providers/mysql/hooks/mysql.py#L140
   
   
   
https://github.com/apache/airflow/blob/2a2adb3f94cc165014d746102e12f9620f271391/airflow/providers/mysql/hooks/mysql.py#L142
   
   
https://github.com/apache/airflow/blob/2a2adb3f94cc165014d746102e12f9620f271391/airflow/providers/mysql/hooks/mysql.py#L148
   
   Will you submit a PR to fix the issue?
   
   



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] jagusztinl commented on issue #14856: ModuleNotFoundError: No module named 'airflow' when docker-compose up airflow-init

2021-03-17 Thread GitBox


jagusztinl commented on issue #14856:
URL: https://github.com/apache/airflow/issues/14856#issuecomment-801336330


   Bitnami created a working version; the stable version is unusable:
   ```yaml
   version: '2'
   
   services:
     postgresql:
       image: 'docker.io/bitnami/postgresql:10'
       volumes:
         - 'postgresql_data:/bitnami/postgresql'
       environment:
         - POSTGRESQL_DATABASE=bitnami_airflow
         - POSTGRESQL_USERNAME=bn_airflow
         - POSTGRESQL_PASSWORD=bitnami1
         - ALLOW_EMPTY_PASSWORD=yes
     redis:
       image: docker.io/bitnami/redis:6.0
       volumes:
         - 'redis_data:/bitnami'
       environment:
         - ALLOW_EMPTY_PASSWORD=yes
     airflow-scheduler:
       # TODO: to be reverted to use proper registry/distro on T39501
       # image: docker.io/bitnami/airflow-scheduler:2-debian-10
       image: bitnami/airflow-scheduler:2
       environment:
         - AIRFLOW_DATABASE_NAME=bitnami_airflow
         - AIRFLOW_DATABASE_USERNAME=bn_airflow
         - AIRFLOW_DATABASE_PASSWORD=bitnami1
         - AIRFLOW_EXECUTOR=CeleryExecutor
       volumes:
         - airflow_scheduler_data:/bitnami
     airflow-worker:
       # TODO: to be reverted to use proper registry/distro on T39501
       # image: docker.io/bitnami/airflow-worker:2-debian-10
       image: bitnami/airflow-worker:2
       environment:
         - AIRFLOW_DATABASE_NAME=bitnami_airflow
         - AIRFLOW_DATABASE_USERNAME=bn_airflow
         - AIRFLOW_DATABASE_PASSWORD=bitnami1
         - AIRFLOW_EXECUTOR=CeleryExecutor
       volumes:
         - airflow_worker_data:/bitnami
     airflow:
       image: docker.io/bitnami/airflow:2-debian-10
       environment:
         - AIRFLOW_DATABASE_NAME=bitnami_airflow
         - AIRFLOW_DATABASE_USERNAME=bn_airflow
         - AIRFLOW_DATABASE_PASSWORD=bitnami1
         - AIRFLOW_EXECUTOR=CeleryExecutor
       ports:
         - '8080:8080'
       volumes:
         - airflow_data:/bitnami
   
   volumes:
     airflow_scheduler_data:
       driver: local
     airflow_worker_data:
       driver: local
     airflow_data:
       driver: local
     postgresql_data:
       driver: local
     redis_data:
       driver: local
   ```
   



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] jagusztinl commented on issue #14855: /2.0.1/docker-compose.yaml not working

2021-03-17 Thread GitBox


jagusztinl commented on issue #14855:
URL: https://github.com/apache/airflow/issues/14855#issuecomment-801336052


   Bitnami created a working version; the stable version is unusable:
   
   ```yaml
   version: '2'
   
   services:
     postgresql:
       image: 'docker.io/bitnami/postgresql:10'
       volumes:
         - 'postgresql_data:/bitnami/postgresql'
       environment:
         - POSTGRESQL_DATABASE=bitnami_airflow
         - POSTGRESQL_USERNAME=bn_airflow
         - POSTGRESQL_PASSWORD=bitnami1
         - ALLOW_EMPTY_PASSWORD=yes
     redis:
       image: docker.io/bitnami/redis:6.0
       volumes:
         - 'redis_data:/bitnami'
       environment:
         - ALLOW_EMPTY_PASSWORD=yes
     airflow-scheduler:
       # TODO: to be reverted to use proper registry/distro on T39501
       # image: docker.io/bitnami/airflow-scheduler:2-debian-10
       image: bitnami/airflow-scheduler:2
       environment:
         - AIRFLOW_DATABASE_NAME=bitnami_airflow
         - AIRFLOW_DATABASE_USERNAME=bn_airflow
         - AIRFLOW_DATABASE_PASSWORD=bitnami1
         - AIRFLOW_EXECUTOR=CeleryExecutor
       volumes:
         - airflow_scheduler_data:/bitnami
     airflow-worker:
       # TODO: to be reverted to use proper registry/distro on T39501
       # image: docker.io/bitnami/airflow-worker:2-debian-10
       image: bitnami/airflow-worker:2
       environment:
         - AIRFLOW_DATABASE_NAME=bitnami_airflow
         - AIRFLOW_DATABASE_USERNAME=bn_airflow
         - AIRFLOW_DATABASE_PASSWORD=bitnami1
         - AIRFLOW_EXECUTOR=CeleryExecutor
       volumes:
         - airflow_worker_data:/bitnami
     airflow:
       image: docker.io/bitnami/airflow:2-debian-10
       environment:
         - AIRFLOW_DATABASE_NAME=bitnami_airflow
         - AIRFLOW_DATABASE_USERNAME=bn_airflow
         - AIRFLOW_DATABASE_PASSWORD=bitnami1
         - AIRFLOW_EXECUTOR=CeleryExecutor
       ports:
         - '8080:8080'
       volumes:
         - airflow_data:/bitnami
   
   volumes:
     airflow_scheduler_data:
       driver: local
     airflow_worker_data:
       driver: local
     airflow_data:
       driver: local
     postgresql_data:
       driver: local
     redis_data:
       driver: local
   ```
   



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[jira] [Commented] (AIRFLOW-1156) Using a timedelta object as a Schedule Interval with catchup=False causes the start_date to no longer be honored.

2021-03-17 Thread ASF GitHub Bot (Jira)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-1156?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17303662#comment-17303662
 ] 

ASF GitHub Bot commented on AIRFLOW-1156:
-

kaxil commented on pull request #8776:
URL: https://github.com/apache/airflow/pull/8776#issuecomment-801332546


   @mpeteuil yeah like Ash suggested please create a new issue with steps to 
reproduce. We fixed it in 1.10.11 



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


> Using a timedelta object as a Schedule Interval with catchup=False causes the 
> start_date to no longer be honored.
> -
>
> Key: AIRFLOW-1156
> URL: https://issues.apache.org/jira/browse/AIRFLOW-1156
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: scheduler
>Affects Versions: 1.8.0
>Reporter: Zachary Lawson
>Assignee: Kaxil Naik
>Priority: Minor
> Fix For: 1.10.11
>
>
> Currently, in Airflow v1.8, if you set your schedule_interval to a timedelta 
> object and set catchup=False, the start_date is no longer honored and the DAG 
> is scheduled immediately upon unpausing the DAG. It is then scheduled on the 
> schedule interval from that point onward. Example below:
> {code}
> from airflow import DAG
> from datetime import datetime, timedelta
> import logging
> from airflow.operators.python_operator import PythonOperator
> default_args = {
> 'owner': 'airflow',
> 'depends_on_past': False,
> 'start_date': datetime(2015, 6, 1),
> }
> dag = DAG('test', default_args=default_args, 
> schedule_interval=timedelta(seconds=5), catchup=False)
> def context_test(ds, **context):
> logging.info('testing')
> test_context = PythonOperator(
> task_id='test_context',
> provide_context=True,
> python_callable=context_test,
> dag=dag
> )
> {code}
> If you switch the above over to a CRON expression, the behavior of the 
> scheduling is returned to the expected.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[GitHub] [airflow] kaxil commented on pull request #8776: [AIRFLOW-1156] BugFix: Unpausing a DAG with catchup=False creates an extra DAG run

2021-03-17 Thread GitBox


kaxil commented on pull request #8776:
URL: https://github.com/apache/airflow/pull/8776#issuecomment-801332546


   @mpeteuil yeah like Ash suggested please create a new issue with steps to 
reproduce. We fixed it in 1.10.11 



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] boring-cyborg[bot] commented on issue #14859: Smart Sensors - Inconsistent Logging

2021-03-17 Thread GitBox


boring-cyborg[bot] commented on issue #14859:
URL: https://github.com/apache/airflow/issues/14859#issuecomment-801324808


   Thanks for opening your first issue here! Be sure to follow the issue 
template!
   



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] anthonyp97 opened a new issue #14859: Smart Sensors - Inconsistent Logging

2021-03-17 Thread GitBox


anthonyp97 opened a new issue #14859:
URL: https://github.com/apache/airflow/issues/14859


   
   
   
   
   **Apache Airflow version**: 2.0.1
   
   
   **Kubernetes version (if you are using kubernetes)** (use `kubectl 
version`): N/A
   
   **Environment**:
   
   - **Cloud provider or hardware configuration**: AWS
   - **OS** (e.g. from /etc/os-release): Ubuntu 18.04.2 LTS
   - **Kernel** (e.g. `uname -a`): GNU/Linux 4.15.0-1043-aws x86_64
   - **Others**: `LocalExecutor` with a `PARALLELISM` of 32, smart sensors 
enabled using 2 shards.
   
   **What happened**:
   
   The logs for the 2 smart sensor tasks that we run always show: `[2021-03-16 
21:07:45,415] {smart_sensor.py:373} INFO - Loaded 0 sensor_works`. However, I 
can confirm that our FTP sensors are getting registered properly in the smart 
sensors in the FTP sensor logs. Strangely, the logs for a certain FTP sensor 
will occasionally show `{smart_sensor.py:373} INFO - 4 tasks detected.`, and 
then the poke information for these 4 sensors appears in this FTP sensor's log, 
so it looks like the logs are going to the wrong location.
   
   
   
   **What you expected to happen**:
   
   The logs in the smart sensor tasks themselves should show the number of 
sensor_works loaded in each smart sensor shard (it should not always be 0), 
and this information should not be in a random FTP sensor's logs. Also the logs 
for a specific FTP sensor should not include logs from different sensors.
   
   
   
   **How to reproduce it**:
   
   Run a `LocalExecutor` with several sensors running in parallel, and 2+ 
shards enabled for the smart sensors. The logs in `smart_sensor_group_shard_0`, 
`smart_sensor_group_shard_1` etc. should always show `Loaded 0 sensor_works` 
when that is not what is expected. You may be able to find the sensor_works 
loaded information in the logs of one of the sensors in your DAG.
   
   
   
   



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] jagusztinl commented on issue #14520: docker-compose install error

2021-03-17 Thread GitBox


jagusztinl commented on issue #14520:
URL: https://github.com/apache/airflow/issues/14520#issuecomment-801321620


   > ```yaml
   > PYTHONPATH: "/home/airflow/.local/lib/python3.6/site-packages"
   > ```
   
   Please explain in detail what the yaml file should look like.
   



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[jira] [Commented] (AIRFLOW-1156) Using a timedelta object as a Schedule Interval with catchup=False causes the start_date to no longer be honored.

2021-03-17 Thread ASF GitHub Bot (Jira)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-1156?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17303633#comment-17303633
 ] 

ASF GitHub Bot commented on AIRFLOW-1156:
-

ashb commented on pull request #8776:
URL: https://github.com/apache/airflow/pull/8776#issuecomment-801304079


   @mpeteuil could you create a new issue for that please?



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


> Using a timedelta object as a Schedule Interval with catchup=False causes the 
> start_date to no longer be honored.
> -
>
> Key: AIRFLOW-1156
> URL: https://issues.apache.org/jira/browse/AIRFLOW-1156
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: scheduler
>Affects Versions: 1.8.0
>Reporter: Zachary Lawson
>Assignee: Kaxil Naik
>Priority: Minor
> Fix For: 1.10.11
>
>
> Currently, in Airflow v1.8, if you set your schedule_interval to a timedelta 
> object and set catchup=False, the start_date is no longer honored and the DAG 
> is scheduled immediately upon unpausing the DAG. It is then scheduled on the 
> schedule interval from that point onward. Example below:
> {code}
> from airflow import DAG
> from datetime import datetime, timedelta
> import logging
> from airflow.operators.python_operator import PythonOperator
> default_args = {
> 'owner': 'airflow',
> 'depends_on_past': False,
> 'start_date': datetime(2015, 6, 1),
> }
> dag = DAG('test', default_args=default_args, 
> schedule_interval=timedelta(seconds=5), catchup=False)
> def context_test(ds, **context):
> logging.info('testing')
> test_context = PythonOperator(
> task_id='test_context',
> provide_context=True,
> python_callable=context_test,
> dag=dag
> )
> {code}
> If you switch the above over to a CRON expression, the behavior of the 
> scheduling is returned to the expected.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[GitHub] [airflow] ashb commented on pull request #8776: [AIRFLOW-1156] BugFix: Unpausing a DAG with catchup=False creates an extra DAG run

2021-03-17 Thread GitBox


ashb commented on pull request #8776:
URL: https://github.com/apache/airflow/pull/8776#issuecomment-801304079


   @mpeteuil could you create a new issue for that please?



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] pgillet commented on a change in pull request #14028: Depreciate private_key_pass extra param and rename to private_key_passphrase

2021-03-17 Thread GitBox


pgillet commented on a change in pull request #14028:
URL: https://github.com/apache/airflow/pull/14028#discussion_r596253056



##
File path: airflow/providers/ssh/hooks/ssh.py
##
@@ -131,9 +132,11 @@ def __init__(  # pylint: disable=too-many-statements
 self.key_file = extra_options.get("key_file")
 
 private_key = extra_options.get('private_key')
-private_key_passphrase = 
extra_options.get('private_key_passphrase')
+self.private_key_passphrase = 
extra_options.get('private_key_passphrase')

Review comment:
   The idea was to reuse it in SFTPHook subclass, not to change it. The 
private key passphrase is consumed by Paramiko in SSHhook, while it is consumed 
by Pysftp in SFTPHook. Anyway, I do not inherit this field anymore.





This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] ashb commented on a change in pull request #14028: Depreciate private_key_pass extra param and rename to private_key_passphrase

2021-03-17 Thread GitBox


ashb commented on a change in pull request #14028:
URL: https://github.com/apache/airflow/pull/14028#discussion_r596271017



##
File path: airflow/providers/ssh/hooks/ssh.py
##
@@ -131,9 +132,11 @@ def __init__(  # pylint: disable=too-many-statements
 self.key_file = extra_options.get("key_file")
 
 private_key = extra_options.get('private_key')
-private_key_passphrase = 
extra_options.get('private_key_passphrase')
+self.private_key_passphrase = 
extra_options.get('private_key_passphrase')

Review comment:
   Ah gotcha





This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] pgillet commented on a change in pull request #14028: Depreciate private_key_pass extra param and rename to private_key_passphrase

2021-03-17 Thread GitBox


pgillet commented on a change in pull request #14028:
URL: https://github.com/apache/airflow/pull/14028#discussion_r596253056



##
File path: airflow/providers/ssh/hooks/ssh.py
##
@@ -131,9 +132,11 @@ def __init__(  # pylint: disable=too-many-statements
 self.key_file = extra_options.get("key_file")
 
 private_key = extra_options.get('private_key')
-private_key_passphrase = 
extra_options.get('private_key_passphrase')
+self.private_key_passphrase = 
extra_options.get('private_key_passphrase')

Review comment:
   The idea was to reuse it in SFTPHook subclass, not to change it. The 
private key passphrase is consumed by Paramiko in SSHhook, while it is consumed 
by Pysftp in SFTPHook.





This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] pgillet commented on a change in pull request #14028: Reuse private_key_passphrase from SSHHook super class

2021-03-17 Thread GitBox


pgillet commented on a change in pull request #14028:
URL: https://github.com/apache/airflow/pull/14028#discussion_r596253056



##
File path: airflow/providers/ssh/hooks/ssh.py
##
@@ -131,9 +132,11 @@ def __init__(  # pylint: disable=too-many-statements
 self.key_file = extra_options.get("key_file")
 
 private_key = extra_options.get('private_key')
-private_key_passphrase = 
extra_options.get('private_key_passphrase')
+self.private_key_passphrase = 
extra_options.get('private_key_passphrase')

Review comment:
   The idea was to use it in SFTPHook subclass, not to change it.





This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] pgillet commented on a change in pull request #14028: Reuse private_key_passphrase from SSHHook super class

2021-03-17 Thread GitBox


pgillet commented on a change in pull request #14028:
URL: https://github.com/apache/airflow/pull/14028#discussion_r596250372



##
File path: airflow/providers/sftp/hooks/sftp.py
##
@@ -76,8 +75,6 @@ def __init__(self, ftp_conn_id: str = 'sftp_default', *args, 
**kwargs) -> None:
 conn = self.get_connection(self.ssh_conn_id)
 if conn.extra is not None:
 extra_options = conn.extra_dejson
-if 'private_key_pass' in extra_options:
-self.private_key_pass = 
extra_options.get('private_key_pass')

Review comment:
   :heavy_check_mark:  Done





This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] boring-cyborg[bot] commented on issue #14858: upgrade_check finds no problem with ignore but exit with 1

2021-03-17 Thread GitBox


boring-cyborg[bot] commented on issue #14858:
URL: https://github.com/apache/airflow/issues/14858#issuecomment-801281270


   Thanks for opening your first issue here! Be sure to follow the issue 
template!
   



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] davidshao opened a new issue #14858: upgrade_check finds no problem with ignore but exit with 1

2021-03-17 Thread GitBox


davidshao opened a new issue #14858:
URL: https://github.com/apache/airflow/issues/14858


   
   
   
   
   **Apache Airflow version**:
   1.10.14
   
   **Kubernetes version (if you are using kubernetes)** (use `kubectl version`):
   n/a
   
   **Environment**:
   
   - **Cloud provider or hardware configuration**:
   - **OS** (e.g. from /etc/os-release):
   ```
   NAME="CentOS Linux"
   VERSION="7 (Core)"
   ID="centos"
   ID_LIKE="rhel fedora"
   VERSION_ID="7"
   PRETTY_NAME="CentOS Linux 7 (Core)"
   ANSI_COLOR="0;31"
   CPE_NAME="cpe:/o:centos:centos:7"
   HOME_URL="https://www.centos.org/";
   BUG_REPORT_URL="https://bugs.centos.org/";
   
   CENTOS_MANTISBT_PROJECT="CentOS-7"
   CENTOS_MANTISBT_PROJECT_VERSION="7"
   REDHAT_SUPPORT_PRODUCT="centos"
   REDHAT_SUPPORT_PRODUCT_VERSION="7"
   ```
   - **Kernel** (e.g. `uname -a`):
   - `Linux iebdev22 3.10.0-1160.11.1.el7.x86_64 #1 SMP Fri Dec 18 16:34:56 UTC 
2020 x86_64 x86_64 x86_64 GNU/Linux`
   - **Install tools**:
   - **Others**:
   
   **What happened**:
   
   
   
   **What you expected to happen**:
   
   
   
   **How to reproduce it**:
   
   When running `upgrade_check` (v1.3.0) with `-I` (ignore) option, it exits 
with `1` on success.
   
   ```
   (ve) $ airflow upgrade_check --version
   1.3.0
   ```
   
   ```
(ve)$ airflow upgrade_check -i VersionCheckRule -i PodTemplateFileRule
   
   === STATUS =========================================================
   
   Remove airflow.AirflowMacroPlugin class ............................ SUCCESS
   Ensure users are not using custom metaclasses in custom operators .. SUCCESS
   Chain between DAG and operator not allowed ......................... SUCCESS
   Connection.conn_type is not nullable ............................... SUCCESS
   Custom Executors now require full path ............................. SUCCESS
   Check versions of PostgreSQL, MySQL, and SQLite to ease upgrade to Airflow 2.0 .. SUCCESS
   Hooks that run DB functions must inherit from DBApiHook ............ SUCCESS
   Fernet is enabled by default ....................................... SUCCESS
   GCP service account key deprecation ................................ SUCCESS
   Unify hostname_callable option in core section ..................... SUCCESS
   Changes in import paths of hooks, operators, sensors and others .... SUCCESS
   Legacy UI is deprecated by default ................................. SUCCESS
   Logging configuration has been moved to new section ................ SUCCESS
   Removal of Mesos Executor .......................................... SUCCESS
   No additional argument allowed in BaseOperator ..................... SUCCESS
   SendGrid email uses old airflow.contrib module ..................... SUCCESS
   Check Spark JDBC Operator default connection name .................. SUCCESS
   Changes in import path of remote task handlers ..................... SUCCESS
   Connection.conn_id is not unique ................................... SUCCESS
   Use CustomSQLAInterface instead of SQLAInterface for custom data models .. SUCCESS
   Found 0 problems.
   
   Not found any problems. World is beautiful.
   You can safely update Airflow to the new version.
   
   (ve)$ echo $?
   1
   ```
   
   
   **Anything else we need to know**:
   
   
   



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] boring-cyborg[bot] commented on issue #14857: MySQL hook uses wrong autocommit calls for mysql-connector-python

2021-03-17 Thread GitBox


boring-cyborg[bot] commented on issue #14857:
URL: https://github.com/apache/airflow/issues/14857#issuecomment-801278015


   Thanks for opening your first issue here! Be sure to follow the issue 
template!
   



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] ericpp opened a new issue #14857: MySQL hook uses wrong autocommit calls for mysql-connector-python

2021-03-17 Thread GitBox


ericpp opened a new issue #14857:
URL: https://github.com/apache/airflow/issues/14857


   The MySQL hook seems to be using `conn.get_autocommit()` and 
`conn.autocommit()` to get/set the autocommit flag for both mysqlclient and 
mysql-connector-python. These methods don't actually exist in 
mysql-connector-python, as it uses autocommit as a property rather than a method.
   
   I was able to work around it by adding an `if not callable(conn.autocommit)` 
condition to detect when mysql-connector-python is being used, but I'm sure 
there's probably a more elegant way of detecting which client is being used.
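
   For illustration, a minimal sketch of that workaround (hypothetical helper functions, not the hook's actual code):
   
   ```python
   def set_autocommit(conn, autocommit: bool) -> None:
       # MySQLdb/mysqlclient exposes autocommit() as a method;
       # mysql-connector-python exposes autocommit as a property.
       if callable(getattr(conn, "autocommit", None)):
           conn.autocommit(autocommit)
       else:
           conn.autocommit = autocommit
   
   def get_autocommit(conn) -> bool:
       if hasattr(conn, "get_autocommit"):
           return conn.get_autocommit()
       return conn.autocommit
   ```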
   
   mysql-connector-python documentation:
   
https://dev.mysql.com/doc/connector-python/en/connector-python-api-mysqlconnection-autocommit.html
   
   Autocommit calls:
   
https://github.com/apache/airflow/blob/2a2adb3f94cc165014d746102e12f9620f271391/airflow/providers/mysql/hooks/mysql.py#L55
   
https://github.com/apache/airflow/blob/2a2adb3f94cc165014d746102e12f9620f271391/airflow/providers/mysql/hooks/mysql.py#L66



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] mik-laj commented on issue #14851: Airflow workers causing 100% CPU on PostgreSQL Database

2021-03-17 Thread GitBox


mik-laj commented on issue #14851:
URL: https://github.com/apache/airflow/issues/14851#issuecomment-801274271


   Have you considered using PGBouncer? See: 
https://github.com/apache/airflow/issues/13941



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] mik-laj edited a comment on issue #14856: ModuleNotFoundError: No module named 'airflow' when docker-compose up airflow-init

2021-03-17 Thread GitBox


mik-laj edited a comment on issue #14856:
URL: https://github.com/apache/airflow/issues/14856#issuecomment-801272403


   https://github.com/apache/airflow/issues/14616 duplicate



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] mik-laj commented on issue #14856: ModuleNotFoundError: No module named 'airflow' when docker-compose up airflow-init

2021-03-17 Thread GitBox


mik-laj commented on issue #14856:
URL: https://github.com/apache/airflow/issues/14856#issuecomment-801272403


   https://github.com/apache/airflow/issues/14616



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] mik-laj closed issue #14856: ModuleNotFoundError: No module named 'airflow' when docker-compose up airflow-init

2021-03-17 Thread GitBox


mik-laj closed issue #14856:
URL: https://github.com/apache/airflow/issues/14856


   



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] jagusztinl opened a new issue #14856: ModuleNotFoundError: No module named 'airflow' when docker-compose up airflow-init

2021-03-17 Thread GitBox


jagusztinl opened a new issue #14856:
URL: https://github.com/apache/airflow/issues/14856


   
   [root@localhost Airflow]# ./docker-compose-Linux-x86_64  up airflow-init
   Creating network "airflow_default" with the default driver
   Creating airflow_postgres_1 ... done
   Creating airflow_redis_1... done
   Creating airflow_airflow-init_1 ... done
   Attaching to airflow_airflow-init_1
   airflow-init_1   | BACKEND=postgresql+psycopg2
   airflow-init_1   | DB_HOST=postgres
   airflow-init_1   | DB_PORT=5432
   airflow-init_1   |
   airflow-init_1   | Traceback (most recent call last):
   airflow-init_1   |   File "/home/airflow/.local/bin/airflow", line 5, in 

   airflow-init_1   | from airflow.__main__ import main
   airflow-init_1   | ModuleNotFoundError: No module named 'airflow'
   airflow-init_1   | Traceback (most recent call last):
   airflow-init_1   |   File "/home/airflow/.local/bin/airflow", line 5, in 

   airflow-init_1   | from airflow.__main__ import main
   airflow-init_1   | ModuleNotFoundError: No module named 'airflow'
   airflow-init_1   | Traceback (most recent call last):
   airflow-init_1   |   File "/home/airflow/.local/bin/airflow", line 5, in 

   airflow-init_1   | from airflow.__main__ import main
   airflow-init_1   | ModuleNotFoundError: No module named 'airflow'
   airflow-init_1   | Traceback (most recent call last):
   airflow-init_1   |   File "/home/airflow/.local/bin/airflow", line 5, in 

   airflow-init_1   | from airflow.__main__ import main
   airflow-init_1   | ModuleNotFoundError: No module named 'airflow'
   airflow-init_1   | Traceback (most recent call last):
   airflow-init_1   |   File "/home/airflow/.local/bin/airflow", line 5, in 

   airflow-init_1   | from airflow.__main__ import main
   airflow-init_1   | ModuleNotFoundError: No module named 'airflow'
   airflow_airflow-init_1 exited with code 1
   



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] bbovenzi commented on pull request #13199: Create dag dependencies view

2021-03-17 Thread GitBox


bbovenzi commented on pull request #13199:
URL: https://github.com/apache/airflow/pull/13199#issuecomment-801269882


   From a JS perspective, this looks fine. There is some linting that could be 
improved but that's not a big deal. After this PR, we would move all of the 
inline js in `graph.html` to its own file, so we'll have to rename `graph.js` 
or something.
   
   With a new React UI coming, it may be worth exploring new ways to 
incorporate this feature but I don't think that's a blocker for now.



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[airflow] annotated tag 1.10.15 updated (5786dcd -> 62ed80f)

2021-03-17 Thread kaxilnaik
This is an automated email from the ASF dual-hosted git repository.

kaxilnaik pushed a change to annotated tag 1.10.15
in repository https://gitbox.apache.org/repos/asf/airflow.git.


*** WARNING: tag 1.10.15 was modified! ***

from 5786dcd  (commit)
  to 62ed80f  (tag)
 tagging 5786dcdc392f7a2649f398353a0beebef01c428e (commit)
 replaces upgrade-check/1.3.0
  by Kaxil Naik
  on Wed Mar 17 17:24:10 2021 +

- Log -
Apache Airflow 1.10.15

- Fix `airflow db upgrade` to upgrade db as intended (#13267)
- Moved boto3 limitation to snowflake (#13286)
- `KubernetesExecutor` should accept images from `executor_config` (#13074)
- Scheduler should acknowledge active runs properly (#13803)
- Bugfix: Unable to import Airflow plugins on Python 3.8 (#12859)
- Include `airflow/contrib/executors` in the dist package
- Pin Click version for Python 2.7 users
- Ensure all statsd timers use millisecond values. (#10633)
- [`kubernetes_generate_dag_yaml`] - Fix dag yaml generate function (#13816)
- Fix `airflow tasks clear` cli command wirh `--yes` (#14188)
- Fix permission error on non-POSIX filesystem (#13121) (#14383)
- Fixed deprecation message for "variables" command (#14457)
- BugFix: fix the `delete_dag` function of json_client (#14441)
- Fix merging of secrets and configmaps for `KubernetesExecutor` (#14090)
- Fix webserver exiting when gunicorn master crashes (#13470)
- Bump ini from 1.3.5 to 1.3.8 in `airflow/www_rbac`
- Bump datatables.net from 1.10.21 to 1.10.23 in `airflow/www_rbac`
- Webserver: Sanitize string passed to origin param (#14738)
- Make `rbac_app`'s `db.session` use the same timezone with `@provide_session` 
(#14025)

- Adds airflow as viable docker command in official image (#12878)
- `StreamLogWriter`: Provide (no-op) close method (#10885)
- Add 'airflow variables list' command for 1.10.x transition version (#14462)

- Update URL for Airflow docs (#13561)
- Clarifies version args for installing 1.10 in Docker (#12875)
-BEGIN PGP SIGNATURE-

iQEzBAABCAAdFiEEEnF1VgQO7y7q8bnCdfzNCiX6DksFAmBSO5cACgkQdfzNCiX6
DkuLCgf/Wjqb191s6n8dZjvF1hEXDlcVKTVa1IOPVeSq7/TioxpoUA5oi5zzHyaw
/LJN3bTpmAH1jjF6yYxr3KJ5OyxNuRKjF/cvIr//3yUUzlgtRXF1lJwMEmUoSRgp
r8yHwuSJ9LTFccPhICiWrLFNly5L888OtDtk3OQIvHfun1s1c9QNlI0LzbcMuOuE
qA/WRJUF0cGQuFhAHIez52WG9WNn0dic/DGEASfk3sVOH78ifKmsxPBFntG8+Dis
bJGH5ZPg3SUdJNsiC5MH5YVk3PgwyptS/55AY0YU2vJqdmsFmQwwwCkmFz2bPzE6
Xd71W+au53o1wuBotekuU+u9fKlzuA==
=STiL
-END PGP SIGNATURE-
---


No new revisions were added by this update.

Summary of changes:



[GitHub] [airflow] jagusztinl commented on issue #14855: /2.0.1/docker-compose.yaml not working

2021-03-17 Thread GitBox


jagusztinl commented on issue #14855:
URL: https://github.com/apache/airflow/issues/14855#issuecomment-801268402


   Thanks, working!
   
   Another error :-(
   
   Creating airflow_postgres_1 ... done
   Creating airflow_redis_1... done
   Creating airflow_airflow-init_1 ... done
   Attaching to airflow_airflow-init_1
   airflow-init_1   | BACKEND=postgresql+psycopg2
   airflow-init_1   | DB_HOST=postgres
   airflow-init_1   | DB_PORT=5432
   airflow-init_1   |
   airflow-init_1   | Traceback (most recent call last):
   airflow-init_1   |   File "/home/airflow/.local/bin/airflow", line 5, in 

   airflow-init_1   | from airflow.__main__ import main
   airflow-init_1   | ModuleNotFoundError: No module named 'airflow'
   airflow-init_1   | Traceback (most recent call last):
   airflow-init_1   |   File "/home/airflow/.local/bin/airflow", line 5, in 

   airflow-init_1   | from airflow.__main__ import main
   airflow-init_1   | ModuleNotFoundError: No module named 'airflow'
   airflow-init_1   | Traceback (most recent call last):
   airflow-init_1   |   File "/home/airflow/.local/bin/airflow", line 5, in 

   airflow-init_1   | from airflow.__main__ import main
   airflow-init_1   | ModuleNotFoundError: No module named 'airflow'
   airflow-init_1   | Traceback (most recent call last):
   airflow-init_1   |   File "/home/airflow/.local/bin/airflow", line 5, in 

   airflow-init_1   | from airflow.__main__ import main
   airflow-init_1   | ModuleNotFoundError: No module named 'airflow'
   airflow-init_1   | Traceback (most recent call last):
   airflow-init_1   |   File "/home/airflow/.local/bin/airflow", line 5, in 

   airflow-init_1   | from airflow.__main__ import main
   airflow-init_1   | ModuleNotFoundError: No module named 'airflow'
   airflow_airflow-init_1 exited with code 1
   



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[airflow] annotated tag constraints-1.10.15 updated (fa5cf0d -> 5d6602f)

2021-03-17 Thread kaxilnaik
This is an automated email from the ASF dual-hosted git repository.

kaxilnaik pushed a change to annotated tag constraints-1.10.15
in repository https://gitbox.apache.org/repos/asf/airflow.git.


*** WARNING: tag constraints-1.10.15 was modified! ***

from fa5cf0d  (commit)
  to 5d6602f  (tag)
 tagging fa5cf0d8d721edca55f63ce73f1ead6f202e1e46 (commit)
  by Kaxil Naik
  on Wed Mar 17 17:22:29 2021 +

- Log -
Apache Airflow Constraints 1.10.15
-BEGIN PGP SIGNATURE-

iQEzBAABCAAdFiEEEnF1VgQO7y7q8bnCdfzNCiX6DksFAmBSOuAACgkQdfzNCiX6
DkvmjQgAtFk4Srq2HUQ/UqHCWlRUZtSF9uoPVTfZPeWqq9fA+xEyR5PUahyjcAxG
5tfv3jQdwZsiAipIk2pwNnLj5DV+u+m75yRAmyokzt2o/r8A7r74k3+54sBIman6
KhVVXexl9tPtK8/lVoLrDrXrAsjcwDu5IlpQrIwuORi3ta/zi9CpojmY4Vmh3kle
ITpIPwNx7Ly606txd0Q5QZN7JhhljlFZ657jbrluYpJDyKkuzDQTDmjDzbuEOCjh
E9A9O7qtFSM0aWJ3s1IdqJLmOiwcNKN9JN25ka0Wr9kKMqYcKAhSDRLz6qYmrQ98
Pjinp7st2P7EYWSxhew0bIAxfkyg7Q==
=Nm6T
-----END PGP SIGNATURE-----
---


No new revisions were added by this update.

Summary of changes:



[GitHub] [airflow] mik-laj edited a comment on issue #14855: /2.0.1/docker-compose.yaml not working

2021-03-17 Thread GitBox


mik-laj edited a comment on issue #14855:
URL: https://github.com/apache/airflow/issues/14855#issuecomment-801252587


   Please update docker-compose to v1.27.0 or newer.
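   
   As a rough illustration, a minimal Python sketch of checking the installed 
docker-compose version before bringing the stack up could look like this; it 
assumes `docker-compose` is on PATH, and the helper name is made up:
   
   ```python
   import re
   import subprocess
   
   MIN_VERSION = (1, 27, 0)  # this compose file needs docker-compose 1.27.0 or newer
   
   
   def compose_version():
       """Return the installed docker-compose version as a tuple, e.g. (1, 29, 2)."""
       out = subprocess.run(
           ["docker-compose", "--version"], capture_output=True, text=True, check=True
       ).stdout
       match = re.search(r"(\d+)\.(\d+)\.(\d+)", out)
       if not match:
           raise RuntimeError(f"Could not parse docker-compose version from: {out!r}")
       return tuple(int(part) for part in match.groups())
   
   
   if __name__ == "__main__":
       version = compose_version()
       if version < MIN_VERSION:
           raise SystemExit(
               "docker-compose %s is too old; please upgrade to 1.27.0 or newer "
               "before running `docker-compose up`." % ".".join(map(str, version))
           )
       print("docker-compose version OK:", ".".join(map(str, version)))
   ```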



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] mik-laj closed issue #14855: /2.0.1/docker-compose.yaml not working

2021-03-17 Thread GitBox


mik-laj closed issue #14855:
URL: https://github.com/apache/airflow/issues/14855


   



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] mik-laj commented on issue #14855: /2.0.1/docker-compose.yaml not working

2021-03-17 Thread GitBox


mik-laj commented on issue #14855:
URL: https://github.com/apache/airflow/issues/14855#issuecomment-801252587


   Please update docker-compose. 



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] itroulli edited a comment on issue #12413: Description Field for Variables

2021-03-17 Thread GitBox


itroulli edited a comment on issue #12413:
URL: https://github.com/apache/airflow/issues/12413#issuecomment-801194346


   @abhilash1in @eladkal I'm willing to dive into this as my first attempt to 
contribute if there is still interest. I'll probably need some help to start 
though. I think it's a bit more tricky than the Connection description as the 
Variables are mostly handled as a key-value pair.
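   
   One possible shape, sketched as a hypothetical SQLAlchemy model (SQLAlchemy 
1.4+ assumed; this is not Airflow's real Variable model), would keep the 
key-value pair as it is and add an optional description column next to it:
   
   ```python
   from sqlalchemy import Column, Integer, String, Text
   from sqlalchemy.orm import declarative_base
   
   Base = declarative_base()
   
   
   class VariableWithDescription(Base):
       """Illustration only: a key-value row that also carries a description."""
   
       __tablename__ = "variable_example"
   
       id = Column(Integer, primary_key=True)
       key = Column(String(250), unique=True)      # the key of the key-value pair
       val = Column(Text)                          # the stored value
       description = Column(Text, nullable=True)   # new, optional free-text description
   ```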



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] ashb commented on pull request #14795: Ensure we use ti.queued_by_job_id

2021-03-17 Thread GitBox


ashb commented on pull request #14795:
URL: https://github.com/apache/airflow/pull/14795#issuecomment-801233225


   The usage in Celery is correct: `ti.external_executor_id` there is the 
Celery task ID (a UUID), and it's how we keep track of which Celery task was 
queued for the task instance.
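   
   To illustrate the idea (a simplified sketch, not Airflow's actual executor 
code; the class and function names are hypothetical), recording the Celery task 
UUID on the task instance record is what lets us look the Celery task up later:
   
   ```python
   import uuid
   from dataclasses import dataclass
   from typing import Optional
   
   
   @dataclass
   class TaskInstanceRecord:
       # Hypothetical stand-in for a task instance row; not Airflow's real model.
       dag_id: str
       task_id: str
       external_executor_id: Optional[str] = None  # the Celery task UUID, once known
   
   
   def submit_to_celery(ti: TaskInstanceRecord) -> str:
       """Pretend to enqueue the task; a real executor would hand it to Celery here."""
       celery_task_id = str(uuid.uuid4())  # Celery assigns a UUID to every task it queues
       ti.external_executor_id = celery_task_id  # remember it so the task can be tracked
       return celery_task_id
   
   
   ti = TaskInstanceRecord(dag_id="example_dag", task_id="extract")
   queued_id = submit_to_celery(ti)
   assert ti.external_executor_id == queued_id  # the key used to check the Celery task later
   ```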



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




svn commit: r46672 - /release/airflow/1.10.14/

2021-03-17 Thread kaxilnaik
Author: kaxilnaik
Date: Wed Mar 17 16:25:24 2021
New Revision: 46672

Log:
Remove old release: 1.10.14

Removed:
release/airflow/1.10.14/



svn commit: r46671 - /release/airflow/1.10.15/

2021-03-17 Thread kaxilnaik
Author: kaxilnaik
Date: Wed Mar 17 16:24:50 2021
New Revision: 46671

Log:
Release Airflow 1.10.15 from 1.10.15rc1

Added:
release/airflow/1.10.15/
release/airflow/1.10.15/apache-airflow-1.10.15-bin.tar.gz
  - copied unchanged from r46664, 
dev/airflow/1.10.15rc1/apache-airflow-1.10.15rc1-bin.tar.gz
release/airflow/1.10.15/apache-airflow-1.10.15-bin.tar.gz.asc
  - copied unchanged from r46664, 
dev/airflow/1.10.15rc1/apache-airflow-1.10.15rc1-bin.tar.gz.asc
release/airflow/1.10.15/apache-airflow-1.10.15-bin.tar.gz.sha512
  - copied unchanged from r46664, 
dev/airflow/1.10.15rc1/apache-airflow-1.10.15rc1-bin.tar.gz.sha512
release/airflow/1.10.15/apache-airflow-1.10.15-source.tar.gz
  - copied unchanged from r46664, 
dev/airflow/1.10.15rc1/apache-airflow-1.10.15rc1-source.tar.gz
release/airflow/1.10.15/apache-airflow-1.10.15-source.tar.gz.asc
  - copied unchanged from r46664, 
dev/airflow/1.10.15rc1/apache-airflow-1.10.15rc1-source.tar.gz.asc
release/airflow/1.10.15/apache-airflow-1.10.15-source.tar.gz.sha512
  - copied unchanged from r46664, 
dev/airflow/1.10.15rc1/apache-airflow-1.10.15rc1-source.tar.gz.sha512
release/airflow/1.10.15/apache_airflow-1.10.15-py2.py3-none-any.whl
  - copied unchanged from r46664, 
dev/airflow/1.10.15rc1/apache_airflow-1.10.15rc1-py2.py3-none-any.whl
release/airflow/1.10.15/apache_airflow-1.10.15-py2.py3-none-any.whl.asc
  - copied unchanged from r46664, 
dev/airflow/1.10.15rc1/apache_airflow-1.10.15rc1-py2.py3-none-any.whl.asc
release/airflow/1.10.15/apache_airflow-1.10.15-py2.py3-none-any.whl.sha512
  - copied unchanged from r46664, 
dev/airflow/1.10.15rc1/apache_airflow-1.10.15rc1-py2.py3-none-any.whl.sha512



[GitHub] [airflow] github-actions[bot] commented on pull request #14735: Add readonly REST API endpoints for users

2021-03-17 Thread GitBox


github-actions[bot] commented on pull request #14735:
URL: https://github.com/apache/airflow/pull/14735#issuecomment-801221195


   The PR is likely OK to be merged with just a subset of tests for the default 
Python and Database versions, without running the full matrix of tests, because 
it does not modify the core of Airflow. If the committers decide that the full 
test matrix is needed, they will add the label 'full tests needed'. Then you 
should rebase to the latest master or amend the last commit of the PR, and push 
it with --force-with-lease.
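   
   For reference, a minimal sketch of that rebase-and-push step, driven from 
Python; the remote and branch names are assumptions, not taken from this 
thread:
   
   ```python
   import subprocess
   
   
   def rebase_and_push(remote: str = "apache", branch: str = "my-feature-branch") -> None:
       """Rebase the PR branch on the latest master, then push it safely.
   
       `--force-with-lease` refuses to overwrite the remote branch if someone
       else pushed to it in the meantime, unlike a plain `--force`.
       """
       subprocess.run(["git", "fetch", remote, "master"], check=True)
       subprocess.run(["git", "rebase", f"{remote}/master"], check=True)
       subprocess.run(["git", "push", "--force-with-lease", "origin", branch], check=True)
   
   
   if __name__ == "__main__":
       rebase_and_push()
   ```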



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] ConstantinoSchillebeeckx closed issue #14591: trigger specific task in an existing DAG run

2021-03-17 Thread GitBox


ConstantinoSchillebeeckx closed issue #14591:
URL: https://github.com/apache/airflow/issues/14591


   



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] ConstantinoSchillebeeckx commented on issue #14591: trigger specific task in an existing DAG run

2021-03-17 Thread GitBox


ConstantinoSchillebeeckx commented on issue #14591:
URL: https://github.com/apache/airflow/issues/14591#issuecomment-801219735


   @vemikhaylov it sure does solve the problem!



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] JavierLopezT commented on pull request #14853: Improvements on "DAGs and Tasks documentation"

2021-03-17 Thread GitBox


JavierLopezT commented on pull request #14853:
URL: https://github.com/apache/airflow/pull/14853#issuecomment-801212303


   I have "broken" the tutorial DAG file. It was written as one actual DAG 
split into different sections, but I find that confusing because each section 
showed only one part of the DAG, and in some cases, like the one I modified, I 
think it is useful to have the whole picture. So before fixing the static 
checks, I would like some feedback on this, in case I have to make other 
changes.
   
   Also, I don't know if the SEO can be improved with this code, because if you 
search for "DAG documentation" or similar in Google it doesn't appear. It 
doesn't even appear in the search of the Airflow docs.



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



