[GitHub] [airflow] feng-tao merged pull request #5409: [AIRFLOW-5409] Added name under Who uses Apache Airflow for tracking purpose.

2019-06-12 Thread GitBox
feng-tao merged pull request #5409: [AIRFLOW-5409] Added name under Who uses 
Apache Airflow for tracking purpose.
URL: https://github.com/apache/airflow/pull/5409
 
 
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] milton0825 commented on issue #5349: [AIRFLOW-4591] Make default_pool a real pool

2019-06-12 Thread GitBox
milton0825 commented on issue #5349: [AIRFLOW-4591] Make default_pool a real 
pool
URL: https://github.com/apache/airflow/pull/5349#issuecomment-501536629
 
 
   PTAL @ashb @feng-tao @potiuk 




[GitHub] [airflow] brentstrong commented on issue #5183: [AIRFLOW-4410]Add Non-ssl ldap server support

2019-06-12 Thread GitBox
brentstrong commented on issue #5183: [AIRFLOW-4410]Add Non-ssl ldap server 
support
URL: https://github.com/apache/airflow/pull/5183#issuecomment-501502435
 
 
   @Jerevia could we get this updated? It would be very nice to have this 
functionality back.
   
   On a related note, is 1.10.0 the last version that supports non-SSL LDAP? 




[GitHub] [airflow] codecov-io commented on issue #5413: [AIRFLOW-4690] Make tests/api Pylint compatible

2019-06-12 Thread GitBox
codecov-io commented on issue #5413: [AIRFLOW-4690] Make tests/api Pylint 
compatible
URL: https://github.com/apache/airflow/pull/5413#issuecomment-501480502
 
 
   # [Codecov](https://codecov.io/gh/apache/airflow/pull/5413?src=pr=h1) 
Report
   > Merging 
[#5413](https://codecov.io/gh/apache/airflow/pull/5413?src=pr=desc) into 
[master](https://codecov.io/gh/apache/airflow/commit/34056f8fd22b5d16822051a8f7d890090b684762?src=pr=desc)
 will **increase** coverage by `<.01%`.
   > The diff coverage is `100%`.
   
   [![Impacted file tree 
graph](https://codecov.io/gh/apache/airflow/pull/5413/graphs/tree.svg?width=650=WdLKlKHOAU=150=pr)](https://codecov.io/gh/apache/airflow/pull/5413?src=pr=tree)
   
   ```diff
    @@            Coverage Diff             @@
    ##           master    #5413      +/-   ##
    ==========================================
    + Coverage   79.08%   79.08%    +<.01%
    ==========================================
      Files         483      483
      Lines       30284    30293       +9
    ==========================================
    + Hits        23949    23957       +8
    - Misses       6335     6336       +1
   ```
   
   
   | [Impacted 
Files](https://codecov.io/gh/apache/airflow/pull/5413?src=pr=tree) | 
Coverage Δ | |
   |---|---|---|
   | 
[airflow/models/dagrun.py](https://codecov.io/gh/apache/airflow/pull/5413/diff?src=pr=tree#diff-YWlyZmxvdy9tb2RlbHMvZGFncnVuLnB5)
 | `96.63% <100%> (+0.15%)` | :arrow_up: |
   | 
[airflow/models/taskinstance.py](https://codecov.io/gh/apache/airflow/pull/5413/diff?src=pr=tree#diff-YWlyZmxvdy9tb2RlbHMvdGFza2luc3RhbmNlLnB5)
 | `93.02% <0%> (-0.17%)` | :arrow_down: |
   
   --
   
   [Continue to review full report at 
Codecov](https://codecov.io/gh/apache/airflow/pull/5413?src=pr=continue).
   > **Legend** - [Click here to learn 
more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute  (impact)`, `ø = not affected`, `? = missing data`
   > Powered by 
[Codecov](https://codecov.io/gh/apache/airflow/pull/5413?src=pr=footer). 
Last update 
[34056f8...f3a0244](https://codecov.io/gh/apache/airflow/pull/5413?src=pr=lastupdated).
 Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   




[GitHub] [airflow] potiuk commented on a change in pull request #4938: [AIRFLOW-4117] Multi-staging Image - Travis CI tests [Step 3/3]

2019-06-12 Thread GitBox
potiuk commented on a change in pull request #4938: [AIRFLOW-4117] 
Multi-staging Image - Travis CI tests [Step 3/3]
URL: https://github.com/apache/airflow/pull/4938#discussion_r293128733
 
 

 ##
 File path: scripts/ci/ci_before_install.sh
 ##
 @@ -0,0 +1,34 @@
+#!/usr/bin/env bash
+
+#
+#  Licensed to the Apache Software Foundation (ASF) under one
+#  or more contributor license agreements.  See the NOTICE file
+#  distributed with this work for additional information
+#  regarding copyright ownership.  The ASF licenses this file
+#  to you under the Apache License, Version 2.0 (the
+#  "License"); you may not use this file except in compliance
+#  with the License.  You may obtain a copy of the License at
+#
+#http://www.apache.org/licenses/LICENSE-2.0
+#
+#  Unless required by applicable law or agreed to in writing,
+#  software distributed under the License is distributed on an
+#  "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+#  KIND, either express or implied.  See the License for the
+#  specific language governing permissions and limitations
+#  under the License.
+
+set -xeuo pipefail
+
+MY_DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )"
+
+export KUBERNETES_VERSION=${KUBERNETES_VERSION:=}
+# Required for K8s v1.10.x. See
+# https://github.com/kubernetes/kubernetes/issues/61058#issuecomment-372764783
+if [[ ! -z "${KUBERNETES_VERSION}" ]]; then
+sudo mount --make-shared /
 
 Review comment:
   Whoa! @gerardo  - that would be fantastic if we could have it and merge it 
together!




[GitHub] [airflow] gerardo commented on a change in pull request #4938: [AIRFLOW-4117] Multi-staging Image - Travis CI tests [Step 3/3]

2019-06-12 Thread GitBox
gerardo commented on a change in pull request #4938: [AIRFLOW-4117] 
Multi-staging Image - Travis CI tests [Step 3/3]
URL: https://github.com/apache/airflow/pull/4938#discussion_r293127227
 
 

 ##
 File path: scripts/ci/ci_before_install.sh
 ##
 @@ -0,0 +1,34 @@
+#!/usr/bin/env bash
+
+#
+#  Licensed to the Apache Software Foundation (ASF) under one
+#  or more contributor license agreements.  See the NOTICE file
+#  distributed with this work for additional information
+#  regarding copyright ownership.  The ASF licenses this file
+#  to you under the Apache License, Version 2.0 (the
+#  "License"); you may not use this file except in compliance
+#  with the License.  You may obtain a copy of the License at
+#
+#http://www.apache.org/licenses/LICENSE-2.0
+#
+#  Unless required by applicable law or agreed to in writing,
+#  software distributed under the License is distributed on an
+#  "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+#  KIND, either express or implied.  See the License for the
+#  specific language governing permissions and limitations
+#  under the License.
+
+set -xeuo pipefail
+
+MY_DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )"
+
+export KUBERNETES_VERSION=${KUBERNETES_VERSION:=}
+# Required for K8s v1.10.x. See
+# https://github.com/kubernetes/kubernetes/issues/61058#issuecomment-372764783
+if [[ ! -z "${KUBERNETES_VERSION}" ]]; then
+sudo mount --make-shared /
 
 Review comment:
   BTW, I have a PR almost ready for running the k8s tests inside 
docker-compose. I need to fix a couple of things and then I'll publish it. 




[GitHub] [airflow] ryanyuan commented on issue #5402: [AIRFLOW-4746] Implement GCP Cloud Tasks' Hook and Operators

2019-06-12 Thread GitBox
ryanyuan commented on issue #5402: [AIRFLOW-4746] Implement GCP Cloud Tasks' 
Hook and Operators
URL: https://github.com/apache/airflow/pull/5402#issuecomment-501453708
 
 
   @mik-laj Travis is good now.




[jira] [Commented] (AIRFLOW-4690) Make tests/api Pylint compatible

2019-06-12 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-4690?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16862454#comment-16862454
 ] 

ASF GitHub Bot commented on AIRFLOW-4690:
-

BasPH commented on pull request #5413: [AIRFLOW-4690] Make tests/api Pylint 
compatible
URL: https://github.com/apache/airflow/pull/5413
 
 
   Make tests/api compatible with Pylint. Biggest change is adding an 
`__init__` to DagRun.
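
   For readers unfamiliar with the check involved: Pylint's 
`attribute-defined-outside-init` warning (W0201) fires when instance 
attributes are first assigned outside `__init__`. A toy sketch (hypothetical 
fields, not Airflow's actual `DagRun` signature) of the shape of such a fix:

```python
# Toy illustration only - not Airflow's real DagRun. Pylint's W0201
# (attribute-defined-outside-init) complains when attributes are first
# assigned outside __init__, e.g. in a helper used only by tests.
class DagRun:
    def __init__(self, dag_id=None, execution_date=None, state=None):
        # Declaring all attributes here satisfies Pylint and documents
        # the instance's shape in one place.
        self.dag_id = dag_id
        self.execution_date = execution_date
        self.state = state


run = DagRun(dag_id="example_dag", state="running")
print(run.dag_id, run.state)
```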
   
   -
   
   Make sure you have checked _all_ steps below.
   
   ### Jira
   
   - [ ] My PR addresses the following [Airflow 
Jira](https://issues.apache.org/jira/browse/AIRFLOW/) issues and references 
them in the PR title. For example, "\[AIRFLOW-XXX\] My Airflow PR"
 - https://issues.apache.org/jira/browse/AIRFLOW-4690
 - In case you are fixing a typo in the documentation, you can prepend your 
commit with \[AIRFLOW-XXX\]; code changes always need a Jira issue.
 - In case you are proposing a fundamental code change, you need to create 
an Airflow Improvement Proposal 
([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals)).
 - In case you are adding a dependency, check if the license complies with 
the [ASF 3rd Party License 
Policy](https://www.apache.org/legal/resolved.html#category-x).
   
   ### Description
   
   - [ ] Here are some details about my PR, including screenshots of any UI 
changes:
   
   ### Tests
   
   - [ ] My PR adds the following unit tests __OR__ does not need testing for 
this extremely good reason:
   
   ### Commits
   
   - [ ] My commits all reference Jira issues in their subject lines, and I 
have squashed multiple commits if they address the same issue. In addition, my 
commits follow the guidelines from "[How to write a good git commit 
message](http://chris.beams.io/posts/git-commit/)":
 1. Subject is separated from body by a blank line
 1. Subject is limited to 50 characters (not including Jira issue reference)
 1. Subject does not end with a period
 1. Subject uses the imperative mood ("add", not "adding")
 1. Body wraps at 72 characters
 1. Body explains "what" and "why", not "how"
   
   ### Documentation
   
   - [ ] In case of new functionality, my PR adds documentation that describes 
how to use it.
 - All the public functions and classes in the PR contain docstrings that 
explain what they do
 - If you implement backwards incompatible changes, please leave a note in 
the [Updating.md](https://github.com/apache/airflow/blob/master/UPDATING.md) so 
we can assign it to an appropriate release
   
   ### Code Quality
   
   - [ ] Passes `flake8`
   
 



> Make tests/api Pylint compatible
> 
>
> Key: AIRFLOW-4690
> URL: https://issues.apache.org/jira/browse/AIRFLOW-4690
> Project: Apache Airflow
>  Issue Type: Sub-task
>  Components: ci
>Affects Versions: 2.0.0
>Reporter: Bas Harenslak
>Priority: Major
>
> Fix all Pylint messages in tests/api. To start: running 
> scripts/ci/ci_pylint.sh on master should produce no messages. (1) Remove the 
> files mentioned in your issue from the blacklist. (2) Run 
> scripts/ci/ci_pylint.sh to see all messages on the no-longer-blacklisted 
> files. (3) Fix all messages and create a PR.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] [airflow] BasPH opened a new pull request #5413: [AIRFLOW-4690] Make tests/api Pylint compatible

2019-06-12 Thread GitBox
BasPH opened a new pull request #5413: [AIRFLOW-4690] Make tests/api Pylint 
compatible
URL: https://github.com/apache/airflow/pull/5413
 
 
   Make tests/api compatible with Pylint. Biggest change is adding an 
`__init__` to DagRun.
   
   -
   
   Make sure you have checked _all_ steps below.
   
   ### Jira
   
   - [ ] My PR addresses the following [Airflow 
Jira](https://issues.apache.org/jira/browse/AIRFLOW/) issues and references 
them in the PR title. For example, "\[AIRFLOW-XXX\] My Airflow PR"
 - https://issues.apache.org/jira/browse/AIRFLOW-4690
 - In case you are fixing a typo in the documentation, you can prepend your 
commit with \[AIRFLOW-XXX\]; code changes always need a Jira issue.
 - In case you are proposing a fundamental code change, you need to create 
an Airflow Improvement Proposal 
([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals)).
 - In case you are adding a dependency, check if the license complies with 
the [ASF 3rd Party License 
Policy](https://www.apache.org/legal/resolved.html#category-x).
   
   ### Description
   
   - [ ] Here are some details about my PR, including screenshots of any UI 
changes:
   
   ### Tests
   
   - [ ] My PR adds the following unit tests __OR__ does not need testing for 
this extremely good reason:
   
   ### Commits
   
   - [ ] My commits all reference Jira issues in their subject lines, and I 
have squashed multiple commits if they address the same issue. In addition, my 
commits follow the guidelines from "[How to write a good git commit 
message](http://chris.beams.io/posts/git-commit/)":
 1. Subject is separated from body by a blank line
 1. Subject is limited to 50 characters (not including Jira issue reference)
 1. Subject does not end with a period
 1. Subject uses the imperative mood ("add", not "adding")
 1. Body wraps at 72 characters
 1. Body explains "what" and "why", not "how"
   
   ### Documentation
   
   - [ ] In case of new functionality, my PR adds documentation that describes 
how to use it.
 - All the public functions and classes in the PR contain docstrings that 
explain what they do
 - If you implement backwards incompatible changes, please leave a note in 
the [Updating.md](https://github.com/apache/airflow/blob/master/UPDATING.md) so 
we can assign it to an appropriate release
   
   ### Code Quality
   
   - [ ] Passes `flake8`
   




[GitHub] [airflow] potiuk commented on a change in pull request #4938: [AIRFLOW-4117] Multi-staging Image - Travis CI tests [Step 3/3]

2019-06-12 Thread GitBox
potiuk commented on a change in pull request #4938: [AIRFLOW-4117] 
Multi-staging Image - Travis CI tests [Step 3/3]
URL: https://github.com/apache/airflow/pull/4938#discussion_r293107546
 
 

 ##
 File path: scripts/ci/ci_before_install.sh
 ##
 @@ -0,0 +1,34 @@
+#!/usr/bin/env bash
+
+#
+#  Licensed to the Apache Software Foundation (ASF) under one
+#  or more contributor license agreements.  See the NOTICE file
+#  distributed with this work for additional information
+#  regarding copyright ownership.  The ASF licenses this file
+#  to you under the Apache License, Version 2.0 (the
+#  "License"); you may not use this file except in compliance
+#  with the License.  You may obtain a copy of the License at
+#
+#http://www.apache.org/licenses/LICENSE-2.0
+#
+#  Unless required by applicable law or agreed to in writing,
+#  software distributed under the License is distributed on an
+#  "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+#  KIND, either express or implied.  See the License for the
+#  specific language governing permissions and limitations
+#  under the License.
+
+set -xeuo pipefail
+
+MY_DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )"
+
+export KUBERNETES_VERSION=${KUBERNETES_VERSION:=}
+# Required for K8s v1.10.x. See
+# https://github.com/kubernetes/kubernetes/issues/61058#issuecomment-372764783
+if [[ ! -z "${KUBERNETES_VERSION}" ]]; then
+sudo mount --make-shared /
+sudo service docker restart
+fi
+pip install --upgrade pip
 
 Review comment:
   @fokko - The ci_before_install.sh script runs in the host environment before 
docker-compose starts, so this pip install happens on the host. It is used to 
run the pylint/doc checks and other initial linters rather than the tests; they 
run on the host to speed them up. This was a workaround for the long startup 
time of the original airflow-ci docker image - it always installed all the 
dependencies before doing anything, so even running pylint or the doc checks 
took a long time.
   
   That said, with the new Dockerfile approach, running all the pylint/doc 
checks in Docker should be as fast as running them with a local Python 
installation (unless setup.py has changed), and then we could indeed remove any 
Python installation from the host. It should be fairly easy and I might try it 
now. WDYT @fokko?






[GitHub] [airflow] ashb commented on a change in pull request #5389: [AIRFLOW-4750] Log identified zombie task instances

2019-06-12 Thread GitBox
ashb commented on a change in pull request #5389: [AIRFLOW-4750] Log identified 
zombie task instances
URL: https://github.com/apache/airflow/pull/5389#discussion_r293104748
 
 

 ##
 File path: airflow/utils/dag_processing.py
 ##
 @@ -1268,7 +1268,10 @@ def _find_zombies(self, session):
 )
 self._last_zombie_query_time = timezone.utcnow()
 for ti in tis:
-zombies.append(SimpleTaskInstance(ti))
+sti = SimpleTaskInstance(ti)
+self.log.info("=> Failing job with dag_id %s, task_id %s, and 
execution date %s",
 
 Review comment:
   ```suggestion
   self.log.info("=> Zombie job with dag_id %s, task_id %s, and 
execution date %s detected",
   ```




[GitHub] [airflow] ashb commented on a change in pull request #5389: [AIRFLOW-4750] Log identified zombie task instances

2019-06-12 Thread GitBox
ashb commented on a change in pull request #5389: [AIRFLOW-4750] Log identified 
zombie task instances
URL: https://github.com/apache/airflow/pull/5389#discussion_r293104802
 
 

 ##
 File path: airflow/utils/dag_processing.py
 ##
 @@ -1268,7 +1268,10 @@ def _find_zombies(self, session):
 )
 self._last_zombie_query_time = timezone.utcnow()
 for ti in tis:
-zombies.append(SimpleTaskInstance(ti))
+sti = SimpleTaskInstance(ti)
+self.log.info("=> Failing job with dag_id %s, task_id %s, and 
execution date %s",
+  sti.dag_id, sti.task_id, sti.execution_date)
 
 Review comment:
   ```suggestion
 sti.dag_id, sti.task_id, 
sti.execution_date.isoformat())
   ```




[GitHub] [airflow] potiuk commented on a change in pull request #4938: [AIRFLOW-4117] Multi-staging Image - Travis CI tests [Step 3/3]

2019-06-12 Thread GitBox
potiuk commented on a change in pull request #4938: [AIRFLOW-4117] 
Multi-staging Image - Travis CI tests [Step 3/3]
URL: https://github.com/apache/airflow/pull/4938#discussion_r293103221
 
 

 ##
 File path: .travis.yml
 ##
 @@ -16,27 +16,34 @@
 # specific language governing permissions and limitations
 # under the License.
 #
+sudo: true
 dist: xenial
 language: python
-python:
-  - "3.6"
 env:
   global:
-- TRAVIS_CACHE=$HOME/.travis_cache/
+- BUILD_ID=${TRAVIS_BUILD_ID}
+- BRANCH_NAME=${TRAVIS_BRANCH}
   matrix:
-- TOX_ENV=py27-backend_mysql-env_docker
-- TOX_ENV=py27-backend_sqlite-env_docker
-- TOX_ENV=py27-backend_postgres-env_docker
-- TOX_ENV=py35-backend_mysql-env_docker PYTHON_VERSION=3
-- TOX_ENV=py35-backend_sqlite-env_docker PYTHON_VERSION=3
-- TOX_ENV=py35-backend_postgres-env_docker PYTHON_VERSION=3
-- TOX_ENV=py27-backend_postgres-env_kubernetes KUBERNETES_VERSION=v1.9.0
-- TOX_ENV=py35-backend_postgres-env_kubernetes KUBERNETES_VERSION=v1.13.0 
PYTHON_VERSION=3
-
+- BACKEND=mysql ENV=docker
+- BACKEND=postgres ENV=docker
+- BACKEND=sqlite ENV=docker
+- BACKEND=postgres ENV=kubernetes KUBERNETES_VERSION=v1.9.0
+- BACKEND=postgres ENV=kubernetes KUBERNETES_VERSION=v1.13.0
+python:
+  - '3.6'
+  - '3.5'
 
 Review comment:
   Correct. Tests run within Docker. Each python version has a separate image.




[jira] [Work started] (AIRFLOW-4734) Upsert functionality for PostgresHook.insert_rows()

2019-06-12 Thread William Tran (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-4734?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Work on AIRFLOW-4734 started by William Tran.
-
> Upsert functionality for PostgresHook.insert_rows()
> ---
>
> Key: AIRFLOW-4734
> URL: https://issues.apache.org/jira/browse/AIRFLOW-4734
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: hooks
>Affects Versions: 1.10.3
>Reporter: William Tran
>Assignee: William Tran
>Priority: Minor
>  Labels: features
> Fix For: 1.10.4
>
>   Original Estimate: 48h
>  Remaining Estimate: 48h
>
> PostgresHook's parent class, DbApiHook, implements upsert in its 
> insert_rows() method with the replace=True flag. However, the underlying 
> generated SQL is specific to MySQL's "REPLACE INTO" syntax and is not 
> applicable to Postgres.
> I'd like to override this method in PostgresHook to implement the "INSERT ... 
> ON CONFLICT DO UPDATE" syntax (new since Postgres 9.5)
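A minimal sketch of the SQL such an override could generate (stdlib-only; it only builds the statement string, and the helper name plus the table/column names are illustrative - the real hook would execute the statement through its DB-API connection):

```python
def build_upsert_sql(table, columns, conflict_columns):
    """Build a Postgres "INSERT ... ON CONFLICT DO UPDATE" statement.

    Illustrative sketch only; the syntax requires Postgres 9.5+.
    """
    placeholders = ", ".join(["%s"] * len(columns))
    # EXCLUDED refers to the row that was proposed for insertion.
    updates = ", ".join(
        f"{c} = EXCLUDED.{c}" for c in columns if c not in conflict_columns
    )
    return (
        f"INSERT INTO {table} ({', '.join(columns)}) "
        f"VALUES ({placeholders}) "
        f"ON CONFLICT ({', '.join(conflict_columns)}) "
        f"DO UPDATE SET {updates}"
    )
```

In the actual override, the statement would presumably then be executed with something like `cursor.executemany(sql, rows)` against the Postgres connection.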



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Updated] (AIRFLOW-4788) prev_execution_date is not always pendulum.datetime class

2019-06-12 Thread Daniel Standish (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-4788?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Daniel Standish updated AIRFLOW-4788:
-
Description: 
Despite the documentation on the macros page, previous execution dates are in 
general not of pendulum type.

For one, when reading from database, UtcDateTime returns native datetime type.

Also dag.previous_schedule returns datetime type.

So, in general, `prev_execution_date` and `ti.previous_ti.execution_date` may 
be non-pendulum. 

(there are edge cases when the context var prev_* is pendulum e.g. when there 
is no DR or no schedule interval or manually triggered, but in general, no.)

The problem is, this leads to errors and confusion when using these fields in 
templating, when you expect it to be pendulum but it isn't.

There are a few things to consider:
 # make UtcDateTime sqlalchemy type return pendulum
 # make execution date a property of TaskInstance with appropriate getter 
returning pendulum.
 # Change dag.previous_schedule to return pendulum

 

  was:
In certain circumstances previous execution dates may not be pendulum type.

For one, when reading from database, UtcDateTime returns native datetime type.

Also dag.previous_schedule returns datetime type.

So, depending on circumstances, `prev_execution_date` and 
`ti.previous_ti.execution_date` may be non-pendulum. 

The problem is, this leads to errors and confusion when using these fields in 
templating, when you expect it to be pendulum but it isn't.

There are a few things to consider:
 # make UtcDateTime sqlalchemy type return pendulum
 # make execution date a property of TaskInstance with appropriate getter 
returning pendulum.
 # Change dag.previous_schedule to return pendulum

 


> prev_execution_date is not always pendulum.datetime class
> -
>
> Key: AIRFLOW-4788
> URL: https://issues.apache.org/jira/browse/AIRFLOW-4788
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: core
>Affects Versions: 1.10.3
>Reporter: Daniel Standish
>Priority: Major
>
> Despite the documentation on the macros page, previous execution dates are in 
> general not of pendulum type.
> For one, when reading from database, UtcDateTime returns native datetime type.
> Also dag.previous_schedule returns datetime type.
> So, in general, `prev_execution_date` and `ti.previous_ti.execution_date` may 
> be non-pendulum. 
> (there are edge cases when the context var prev_* is pendulum e.g. when there 
> is no DR or no schedule interval or manually triggered, but in general, no.)
> The problem is, this leads to errors and confusion when using these fields in 
> templating, when you expect it to be pendulum but it isn't.
> There are a few things to consider:
>  # make UtcDateTime sqlalchemy type return pendulum
>  # make execution date a property of TaskInstance with appropriate getter 
> returning pendulum.
>  # Change dag.previous_schedule to return pendulum
>  





[GitHub] [airflow] OmerJog commented on issue #5229: [AIRFLOW-XXX] Links to Pendulum in macros.rst

2019-06-12 Thread GitBox
OmerJog commented on issue #5229: [AIRFLOW-XXX] Links to Pendulum in macros.rst
URL: https://github.com/apache/airflow/pull/5229#issuecomment-501420918
 
 
   @mrshu can you rebase? there is a conflict




[jira] [Created] (AIRFLOW-4788) prev_execution_date is not always pendulum.datetime class

2019-06-12 Thread Daniel Standish (JIRA)
Daniel Standish created AIRFLOW-4788:


 Summary: prev_execution_date is not always pendulum.datetime class
 Key: AIRFLOW-4788
 URL: https://issues.apache.org/jira/browse/AIRFLOW-4788
 Project: Apache Airflow
  Issue Type: Bug
  Components: core
Affects Versions: 1.10.3
Reporter: Daniel Standish


In certain circumstances previous execution dates may not be pendulum type.

For one, when reading from database, UtcDateTime returns native datetime type.

Also dag.previous_schedule returns datetime type.

So, depending on circumstances, `prev_execution_date` and 
`ti.previous_ti.execution_date` may be non-pendulum. 

The problem is, this leads to errors and confusion when using these fields in 
templating, when you expect it to be pendulum but it isn't.

There are a few things to consider:
 # make UtcDateTime sqlalchemy type return pendulum
 # make execution date a property of TaskInstance with appropriate getter 
returning pendulum.
 # Change dag.previous_schedule to return pendulum
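Option 2 above (making execution date a property with a coercing getter) could look roughly like this; a stdlib-only sketch where a timezone-aware datetime stands in for the pendulum coercion (the real getter would call something like `pendulum.instance(...)`, and the class name here is hypothetical):

```python
from datetime import datetime, timezone

class TaskInstanceSketch:
    """Hypothetical sketch of an execution_date property with a coercing getter."""

    def __init__(self, execution_date):
        # Value as it might come back from the database: a naive datetime.
        self._execution_date = execution_date

    @property
    def execution_date(self):
        # Coerce on access so callers always see a timezone-aware value.
        dt = self._execution_date
        if dt.tzinfo is None:
            dt = dt.replace(tzinfo=timezone.utc)
        return dt
```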

 





[GitHub] [airflow] OmerJog commented on issue #4590: [AIRFLOW-3727] Change references of is_localized to is_aware

2019-06-12 Thread GitBox
OmerJog commented on issue #4590: [AIRFLOW-3727] Change references of 
is_localized to is_aware
URL: https://github.com/apache/airflow/pull/4590#issuecomment-501416978
 
 
   What's the status of this PR?




[jira] [Commented] (AIRFLOW-3333) New features enable transferring of files or data from GCS to a SFTP remote path and SFTP to GCS path.

2019-06-12 Thread jack (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-3333?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16862378#comment-16862378
 ] 

jack commented on AIRFLOW-3333:
---

 

This can be very similar to S3ToSFTPOperator & SFTPToS3Operator

[https://github.com/apache/airflow/blob/master/airflow/contrib/operators/s3_to_sftp_operator.py]

[https://github.com/apache/airflow/blob/master/airflow/contrib/operators/sftp_to_s3_operator.py]

 

> New features enable transferring of files or data from GCS to a SFTP remote 
> path and SFTP to GCS path. 
> ---
>
> Key: AIRFLOW-3333
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3333
> Project: Apache Airflow
>  Issue Type: New Feature
>  Components: contrib, gcp
>Reporter: Pulin Pathneja
>Assignee: Pulin Pathneja
>Priority: Major
>
> New features enable transferring of files or data from GCS(Google Cloud 
> Storage) to a SFTP remote path and SFTP to GCS(Google Cloud Storage) path. 
>   





[jira] [Updated] (AIRFLOW-3503) GoogleCloudStorageHook delete return success when nothing was done

2019-06-12 Thread Aizhamal Nurmamat kyzy (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-3503?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Aizhamal Nurmamat kyzy updated AIRFLOW-3503:

Labels: gcp gcs hooks  (was: bigquery gcp hooks)

> GoogleCloudStorageHook  delete return success when nothing was done
> ---
>
> Key: AIRFLOW-3503
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3503
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: gcp
>Affects Versions: 1.10.1
>Reporter: lot
>Assignee: Yohei Onishi
>Priority: Major
>  Labels: gcp, gcs, hooks
>
> I'm loading files to BigQuery from Storage using:
>  
> {{gcs_export_uri = BQ_TABLE_NAME + '/' + EXEC_TIMESTAMP_PATH + '/*' 
> gcs_to_bigquery_op = GoogleCloudStorageToBigQueryOperator( dag=dag, 
> task_id='load_products_to_BigQuery', bucket=GCS_BUCKET_ID, 
> destination_project_dataset_table=table_name_template, 
> source_format='NEWLINE_DELIMITED_JSON', source_objects=[gcs_export_uri], 
> src_fmt_configs=\{'ignoreUnknownValues': True}, 
> create_disposition='CREATE_IF_NEEDED', write_disposition='WRITE_TRUNCATE', 
> skip_leading_rows = 1, google_cloud_storage_conn_id=CONNECTION_ID, 
> bigquery_conn_id=CONNECTION_ID)}}
>  
> After that I want to delete the files so I do:
> {{def delete_folder():}}
> {{    """}}
> {{    Delete files Google cloud storage}}
> {{    """}}
> {{    hook = GoogleCloudStorageHook(}}
> {{    google_cloud_storage_conn_id=CONNECTION_ID)}}
> {{    hook.delete(}}
> {{    bucket=GCS_BUCKET_ID,}}
> {{    object=gcs_export_uri)}}
>  
>  
> {{This runs with PythonOperator.}}
> {{The task marked as Success even though nothing was deleted.}}
> {{Log:}}
> [2018-12-12 11:31:29,247] \{base_task_runner.py:98} INFO - Subtask: 
> [2018-12-12 11:31:29,247] \{transport.py:151} INFO - Attempting refresh to 
> obtain initial access_token [2018-12-12 11:31:29,249] 
> \{base_task_runner.py:98} INFO - Subtask: [2018-12-12 11:31:29,249] 
> \{client.py:795} INFO - Refreshing access_token [2018-12-12 11:31:29,584] 
> \{base_task_runner.py:98} INFO - Subtask: [2018-12-12 11:31:29,583] 
> \{python_operator.py:90} INFO - Done. Returned value was: None
>  
>  
> I expect the function to fail and return something like "file was not found" 
> if there is nothing to delete, or let the user decide with a specific flag 
> whether he wants the function to fail or succeed if files were not found.
>  
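The behaviour the reporter asks for could follow a pattern like this (a generic sketch against an in-memory bucket; the `ignore_if_missing` flag and the class are assumptions for illustration, not GoogleCloudStorageHook's actual API):

```python
class InMemoryBucket:
    """Illustrative stand-in for a storage bucket."""

    def __init__(self, objects):
        self._objects = set(objects)

    def delete(self, object_name, ignore_if_missing=False):
        # Fail loudly by default instead of silently reporting success.
        if object_name not in self._objects:
            if ignore_if_missing:
                return False
            raise FileNotFoundError("object not found: %s" % object_name)
        self._objects.remove(object_name)
        return True
```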





[jira] [Updated] (AIRFLOW-3978) Add missing types in MySqlToGoogleCloudStorageOperator

2019-06-12 Thread Aizhamal Nurmamat kyzy (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-3978?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Aizhamal Nurmamat kyzy updated AIRFLOW-3978:

Labels: gcs  (was: bigquery)

> Add missing types in MySqlToGoogleCloudStorageOperator
> --
>
> Key: AIRFLOW-3978
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3978
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: gcp
>Affects Versions: 1.10.2
>Reporter: Roster
>Assignee: Roster
>Priority: Minor
>  Labels: gcs
> Fix For: 1.10.4
>
>
> These fields are missing and cannot be mapped: 
> TIME, BINARY, VARBINARY





[jira] [Commented] (AIRFLOW-3978) Add missing types in MySqlToGoogleCloudStorageOperator

2019-06-12 Thread jack (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-3978?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16862371#comment-16862371
 ] 

jack commented on AIRFLOW-3978:
---

Partially fixed in [https://github.com/apache/airflow/pull/5196]

> Add missing types in MySqlToGoogleCloudStorageOperator
> --
>
> Key: AIRFLOW-3978
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3978
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: gcp
>Affects Versions: 1.10.2
>Reporter: Roster
>Assignee: Roster
>Priority: Minor
>  Labels: bigquery
> Fix For: 1.10.4
>
>
> These fields are missing and cannot be mapped: 
> TIME, BINARY, VARBINARY





[jira] [Commented] (AIRFLOW-4233) BigQueryToCloudStorageOperator with TemplateNotFound for filename ending with .sql

2019-06-12 Thread jack (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-4233?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16862369#comment-16862369
 ] 

jack commented on AIRFLOW-4233:
---

I'm not sure why there is 
{code:java}
 template_ext = ('.sql',){code}
in the operator.

The BigQueryToCloudStorageOperator doesn't have a sql parameter, as it doesn't 
work with queries at all. It takes only tables. If one wishes to export a query 
result to storage, one must first run BigQueryOperator to save the query result 
to a table and then export the table with BigQueryToCloudStorageOperator.

[~kaxilnaik] maybe you have some input on this template_ext ?

[https://github.com/apache/airflow/blob/master/airflow/contrib/operators/bigquery_to_gcs.py#L63]
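For reference, `template_ext` is what makes Airflow's templating treat a templated field value ending in one of those extensions as a path to a template file rather than as a literal string. A stripped-down sketch of that dispatch (a hypothetical helper, not Airflow's actual code) shows why an output path ending in .sql triggers a TemplateNotFound-style error:

```python
import os

TEMPLATE_EXT = (".sql",)

def resolve_templated_value(value, template_dir):
    """Load the value as a template file if its extension matches, else pass through."""
    if value.endswith(TEMPLATE_EXT):
        path = os.path.join(template_dir, value)
        if not os.path.exists(path):
            # Airflow's Jinja loader raises TemplateNotFound at this point.
            raise FileNotFoundError(f"template not found: {value}")
        with open(path) as handle:
            return handle.read()
    return value
```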

 

> BigQueryToCloudStorageOperator with TemplateNotFound for filename ending with 
> .sql
> --
>
> Key: AIRFLOW-4233
> URL: https://issues.apache.org/jira/browse/AIRFLOW-4233
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: gcp
>Reporter: Harris Chiu
>Priority: Major
>
> The issue happens when we have a destination URL ending with .sql. 
> It is related to the defined template extension template_ext = ('.sql',).
> The operator looks for a jinja template; however, it's an output path, so the 
> file is not found when looking for any jinja template syntax.





[jira] [Commented] (AIRFLOW-4758) Add GoogleCloudStorageToGoogleDrive Operator

2019-06-12 Thread jack (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-4758?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16862365#comment-16862365
 ] 

jack commented on AIRFLOW-4758:
---

Hi [~kamil.bregula] I'm not working on this. I submitted it as a feature 
request.

I was surprised that Google doesn't offer such basic functionality for its own 
product (Google Drive), so even Cloud Composer users still can't move files 
between Storage and Drive in their DAGs.

> Add GoogleCloudStorageToGoogleDrive Operator
> 
>
> Key: AIRFLOW-4758
> URL: https://issues.apache.org/jira/browse/AIRFLOW-4758
> Project: Apache Airflow
>  Issue Type: Wish
>  Components: gcp, operators
>Affects Versions: 1.10.3
>Reporter: jack
>Priority: Major
>
> Add Operators:
> GoogleCloudStorageToGoogleDrive
> GoogleDriveToGoogleCloudStorage
>  





[jira] [Commented] (AIRFLOW-3503) GoogleCloudStorageHook delete return success when nothing was done

2019-06-12 Thread jack (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-3503?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16862361#comment-16862361
 ] 

jack commented on AIRFLOW-3503:
---

I'm not sure why BigQuery label was added here. This is a Google Storage issue.

> GoogleCloudStorageHook  delete return success when nothing was done
> ---
>
> Key: AIRFLOW-3503
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3503
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: gcp
>Affects Versions: 1.10.1
>Reporter: lot
>Assignee: Yohei Onishi
>Priority: Major
>  Labels: bigquery, gcp, hooks
>
> I'm loading files to BigQuery from Storage using:
>  
> {{gcs_export_uri = BQ_TABLE_NAME + '/' + EXEC_TIMESTAMP_PATH + '/*' 
> gcs_to_bigquery_op = GoogleCloudStorageToBigQueryOperator( dag=dag, 
> task_id='load_products_to_BigQuery', bucket=GCS_BUCKET_ID, 
> destination_project_dataset_table=table_name_template, 
> source_format='NEWLINE_DELIMITED_JSON', source_objects=[gcs_export_uri], 
> src_fmt_configs=\{'ignoreUnknownValues': True}, 
> create_disposition='CREATE_IF_NEEDED', write_disposition='WRITE_TRUNCATE', 
> skip_leading_rows = 1, google_cloud_storage_conn_id=CONNECTION_ID, 
> bigquery_conn_id=CONNECTION_ID)}}
>  
> After that I want to delete the files so I do:
> {{def delete_folder():}}
> {{    """}}
> {{    Delete files Google cloud storage}}
> {{    """}}
> {{    hook = GoogleCloudStorageHook(}}
> {{    google_cloud_storage_conn_id=CONNECTION_ID)}}
> {{    hook.delete(}}
> {{    bucket=GCS_BUCKET_ID,}}
> {{    object=gcs_export_uri)}}
>  
>  
> {{This runs with PythonOperator.}}
> {{The task marked as Success even though nothing was deleted.}}
> {{Log:}}
> [2018-12-12 11:31:29,247] \{base_task_runner.py:98} INFO - Subtask: 
> [2018-12-12 11:31:29,247] \{transport.py:151} INFO - Attempting refresh to 
> obtain initial access_token [2018-12-12 11:31:29,249] 
> \{base_task_runner.py:98} INFO - Subtask: [2018-12-12 11:31:29,249] 
> \{client.py:795} INFO - Refreshing access_token [2018-12-12 11:31:29,584] 
> \{base_task_runner.py:98} INFO - Subtask: [2018-12-12 11:31:29,583] 
> \{python_operator.py:90} INFO - Done. Returned value was: None
>  
>  
> I expect the function to fail and return something like "file was not found" 
> if there is nothing to delete, or let the user decide with a specific flag 
> whether he wants the function to fail or succeed if files were not found.
>  





[jira] [Created] (AIRFLOW-4787) clicking task status in dag view should take you to TIs of last run only

2019-06-12 Thread Daniel Standish (JIRA)
Daniel Standish created AIRFLOW-4787:


 Summary: clicking task status in dag view should take you to TIs 
of last run only
 Key: AIRFLOW-4787
 URL: https://issues.apache.org/jira/browse/AIRFLOW-4787
 Project: Apache Airflow
  Issue Type: Improvement
  Components: ui
Affects Versions: 1.10.3
Reporter: Daniel Standish


In the dags view, when you click on the tasks from the last run (e.g. the red 
circle indicating failed tasks), it currently takes you to TIs but includes 
_all_ failed tasks for that dag.

If your dag has a lot of tasks this makes it harder to clear them.  You have to 
select them all individually and be careful not to select tasks from a prior 
run.

Following this change, you would initially only see the tasks from the last run 
(consistent with the number shown in the dag view that you clicked on).  And if 
you want to see all tasks you can just drop the execution date filter.





[GitHub] [airflow] codecov-io edited a comment on issue #5411: Upgrade alembic to latest release.

2019-06-12 Thread GitBox
codecov-io edited a comment on issue #5411: Upgrade alembic to latest release.
URL: https://github.com/apache/airflow/pull/5411#issuecomment-501388615
 
 
   # [Codecov](https://codecov.io/gh/apache/airflow/pull/5411?src=pr=h1) 
Report
   > Merging 
[#5411](https://codecov.io/gh/apache/airflow/pull/5411?src=pr=desc) into 
[master](https://codecov.io/gh/apache/airflow/commit/34056f8fd22b5d16822051a8f7d890090b684762?src=pr=desc)
 will **decrease** coverage by `<.01%`.
   > The diff coverage is `n/a`.
   
   [![Impacted file tree 
graph](https://codecov.io/gh/apache/airflow/pull/5411/graphs/tree.svg?width=650=WdLKlKHOAU=150=pr)](https://codecov.io/gh/apache/airflow/pull/5411?src=pr=tree)
   
   ```diff
   @@Coverage Diff @@
   ##   master#5411  +/-   ##
   ==
   - Coverage   79.08%   79.07%   -0.01% 
   ==
 Files 483  483  
 Lines   3028430284  
   ==
   - Hits2394923948   -1 
   - Misses   6335 6336   +1
   ```
   
   
   | [Impacted 
Files](https://codecov.io/gh/apache/airflow/pull/5411?src=pr=tree) | 
Coverage Δ | |
   |---|---|---|
   | 
[airflow/models/taskinstance.py](https://codecov.io/gh/apache/airflow/pull/5411/diff?src=pr=tree#diff-YWlyZmxvdy9tb2RlbHMvdGFza2luc3RhbmNlLnB5)
 | `93.02% <0%> (-0.17%)` | :arrow_down: |
   
   --
   
   [Continue to review full report at 
Codecov](https://codecov.io/gh/apache/airflow/pull/5411?src=pr=continue).
   > **Legend** - [Click here to learn 
more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute  (impact)`, `ø = not affected`, `? = missing data`
   > Powered by 
[Codecov](https://codecov.io/gh/apache/airflow/pull/5411?src=pr=footer). 
Last update 
[34056f8...871dc84](https://codecov.io/gh/apache/airflow/pull/5411?src=pr=lastupdated).
 Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   




[GitHub] [airflow] codecov-io commented on issue #5411: Upgrade alembic to latest release.

2019-06-12 Thread GitBox
codecov-io commented on issue #5411: Upgrade alembic to latest release.
URL: https://github.com/apache/airflow/pull/5411#issuecomment-501388615
 
 
   # [Codecov](https://codecov.io/gh/apache/airflow/pull/5411?src=pr=h1) 
Report
   > Merging 
[#5411](https://codecov.io/gh/apache/airflow/pull/5411?src=pr=desc) into 
[master](https://codecov.io/gh/apache/airflow/commit/34056f8fd22b5d16822051a8f7d890090b684762?src=pr=desc)
 will **decrease** coverage by `<.01%`.
   > The diff coverage is `n/a`.
   
   [![Impacted file tree 
graph](https://codecov.io/gh/apache/airflow/pull/5411/graphs/tree.svg?width=650=WdLKlKHOAU=150=pr)](https://codecov.io/gh/apache/airflow/pull/5411?src=pr=tree)
   
   ```diff
   @@Coverage Diff @@
   ##   master#5411  +/-   ##
   ==
   - Coverage   79.08%   79.07%   -0.01% 
   ==
 Files 483  483  
 Lines   3028430284  
   ==
   - Hits2394923948   -1 
   - Misses   6335 6336   +1
   ```
   
   
   | [Impacted 
Files](https://codecov.io/gh/apache/airflow/pull/5411?src=pr=tree) | 
Coverage Δ | |
   |---|---|---|
   | 
[airflow/models/taskinstance.py](https://codecov.io/gh/apache/airflow/pull/5411/diff?src=pr=tree#diff-YWlyZmxvdy9tb2RlbHMvdGFza2luc3RhbmNlLnB5)
 | `93.02% <0%> (-0.17%)` | :arrow_down: |
   
   --
   
   [Continue to review full report at 
Codecov](https://codecov.io/gh/apache/airflow/pull/5411?src=pr=continue).
   > **Legend** - [Click here to learn 
more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute  (impact)`, `ø = not affected`, `? = missing data`
   > Powered by 
[Codecov](https://codecov.io/gh/apache/airflow/pull/5411?src=pr=footer). 
Last update 
[34056f8...871dc84](https://codecov.io/gh/apache/airflow/pull/5411?src=pr=lastupdated).
 Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   




[GitHub] [airflow] andrewhharmon commented on issue #3997: [AIRFLOW-3153] send dag last_run to statsd

2019-06-12 Thread GitBox
andrewhharmon commented on issue #3997: [AIRFLOW-3153] send dag last_run to 
statsd
URL: https://github.com/apache/airflow/pull/3997#issuecomment-501384470
 
 
   curious why this was implemented as a gauge and not a timer. 




[GitHub] [airflow] codecov-io edited a comment on issue #5297: AIRFLOW-2143 - Fix TaskTries graph counts off-by-1

2019-06-12 Thread GitBox
codecov-io edited a comment on issue #5297: AIRFLOW-2143 - Fix TaskTries graph 
counts off-by-1
URL: https://github.com/apache/airflow/pull/5297#issuecomment-493643996
 
 
   # [Codecov](https://codecov.io/gh/apache/airflow/pull/5297?src=pr=h1) 
Report
   > Merging 
[#5297](https://codecov.io/gh/apache/airflow/pull/5297?src=pr=desc) into 
[master](https://codecov.io/gh/apache/airflow/commit/34056f8fd22b5d16822051a8f7d890090b684762?src=pr=desc)
 will **increase** coverage by `<.01%`.
   > The diff coverage is `100%`.
   
   [![Impacted file tree 
graph](https://codecov.io/gh/apache/airflow/pull/5297/graphs/tree.svg?width=650=WdLKlKHOAU=150=pr)](https://codecov.io/gh/apache/airflow/pull/5297?src=pr=tree)
   
   ```diff
   @@Coverage Diff @@
   ##   master#5297  +/-   ##
   ==
   + Coverage   79.08%   79.08%   +<.01% 
   ==
 Files 483  483  
 Lines   3028430286   +2 
   ==
   + Hits2394923951   +2 
 Misses   6335 6335
   ```
   
   
   | [Impacted 
Files](https://codecov.io/gh/apache/airflow/pull/5297?src=pr=tree) | 
Coverage Δ | |
   |---|---|---|
   | 
[airflow/www/views.py](https://codecov.io/gh/apache/airflow/pull/5297/diff?src=pr=tree#diff-YWlyZmxvdy93d3cvdmlld3MucHk=)
 | `75.91% <100%> (ø)` | :arrow_up: |
   | 
[airflow/models/taskinstance.py](https://codecov.io/gh/apache/airflow/pull/5297/diff?src=pr=tree#diff-YWlyZmxvdy9tb2RlbHMvdGFza2luc3RhbmNlLnB5)
 | `93.21% <100%> (+0.02%)` | :arrow_up: |
   
   --
   
   [Continue to review full report at 
Codecov](https://codecov.io/gh/apache/airflow/pull/5297?src=pr=continue).
   > **Legend** - [Click here to learn 
more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute  (impact)`, `ø = not affected`, `? = missing data`
   > Powered by 
[Codecov](https://codecov.io/gh/apache/airflow/pull/5297?src=pr=footer). 
Last update 
[34056f8...ce30e73](https://codecov.io/gh/apache/airflow/pull/5297?src=pr=lastupdated).
 Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   





[GitHub] [airflow] pgagnon edited a comment on issue #5407: [AIRFLOW-4741] Include Sentry into core Airflow

2019-06-12 Thread GitBox
pgagnon edited a comment on issue #5407: [AIRFLOW-4741] Include Sentry into 
core Airflow
URL: https://github.com/apache/airflow/pull/5407#issuecomment-501377152
 
 
   @tiopi CI is sad due to a couple of pylint errors: 
https://travis-ci.org/apache/airflow/jobs/544497697
   
   edit: Wrong mentions. Sorry! My github UI bugged out. 




[GitHub] [airflow] pgagnon edited a comment on issue #5407: [AIRFLOW-4741] Include Sentry into core Airflow

2019-06-12 Thread GitBox
pgagnon edited a comment on issue #5407: [AIRFLOW-4741] Include Sentry into 
core Airflow
URL: https://github.com/apache/airflow/pull/5407#issuecomment-501377152
 
 
   @tomchapin  CI is sad due to a couple of pylint errors: 
https://travis-ci.org/apache/airflow/jobs/544497697
   
   edit: Wrong mention. Sorry Omer! 




[GitHub] [airflow] pgagnon edited a comment on issue #5407: [AIRFLOW-4741] Include Sentry into core Airflow

2019-06-12 Thread GitBox
pgagnon edited a comment on issue #5407: [AIRFLOW-4741] Include Sentry into 
core Airflow
URL: https://github.com/apache/airflow/pull/5407#issuecomment-501377152
 
 
   @tiopi CI is sad due to a couple of pylint errors: 
https://travis-ci.org/apache/airflow/jobs/544497697
   
   Wrong mention. Sorry Omer! 




[GitHub] [airflow] pgagnon commented on issue #5407: [AIRFLOW-4741] Include Sentry into core Airflow

2019-06-12 Thread GitBox
pgagnon commented on issue #5407: [AIRFLOW-4741] Include Sentry into core 
Airflow
URL: https://github.com/apache/airflow/pull/5407#issuecomment-501377152
 
 
   @OmerJog CI is sad due to a couple of pylint errors: 
https://travis-ci.org/apache/airflow/jobs/544497697




[GitHub] [airflow] Fokko commented on a change in pull request #4938: [AIRFLOW-4117] Multi-staging Image - Travis CI tests [Step 3/3]

2019-06-12 Thread GitBox
Fokko commented on a change in pull request #4938: [AIRFLOW-4117] Multi-staging 
Image - Travis CI tests [Step 3/3]
URL: https://github.com/apache/airflow/pull/4938#discussion_r293024730
 
 

 ##
 File path: scripts/ci/ci_before_install.sh
 ##
 @@ -0,0 +1,34 @@
+#!/usr/bin/env bash
+
+#
+#  Licensed to the Apache Software Foundation (ASF) under one
+#  or more contributor license agreements.  See the NOTICE file
+#  distributed with this work for additional information
+#  regarding copyright ownership.  The ASF licenses this file
+#  to you under the Apache License, Version 2.0 (the
+#  "License"); you may not use this file except in compliance
+#  with the License.  You may obtain a copy of the License at
+#
+#http://www.apache.org/licenses/LICENSE-2.0
+#
+#  Unless required by applicable law or agreed to in writing,
+#  software distributed under the License is distributed on an
+#  "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+#  KIND, either express or implied.  See the License for the
+#  specific language governing permissions and limitations
+#  under the License.
+
+set -xeuo pipefail
+
+MY_DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )"
+
+export KUBERNETES_VERSION=${KUBERNETES_VERSION:=}
+# Required for K8s v1.10.x. See
+# https://github.com/kubernetes/kubernetes/issues/61058#issuecomment-372764783
+if [[ ! -z "${KUBERNETES_VERSION}" ]]; then
+sudo mount --make-shared /
+sudo service docker restart
+fi
+pip install --upgrade pip
 
 Review comment:
   My suggestion would be to remove the whole `pip install --upgrade pip` since 
this is already handled in the Docker image itself: 
https://github.com/docker-library/python/blob/master/3.6/stretch/Dockerfile#L68




[GitHub] [airflow] Fokko commented on a change in pull request #4938: [AIRFLOW-4117] Multi-staging Image - Travis CI tests [Step 3/3]

2019-06-12 Thread GitBox
Fokko commented on a change in pull request #4938: [AIRFLOW-4117] Multi-staging 
Image - Travis CI tests [Step 3/3]
URL: https://github.com/apache/airflow/pull/4938#discussion_r293023666
 
 

 ##
 File path: .travis.yml
 ##
 @@ -16,27 +16,34 @@
 # specific language governing permissions and limitations
 # under the License.
 #
+sudo: true
 dist: xenial
 language: python
-python:
-  - "3.6"
 env:
   global:
-- TRAVIS_CACHE=$HOME/.travis_cache/
+- BUILD_ID=${TRAVIS_BUILD_ID}
+- BRANCH_NAME=${TRAVIS_BRANCH}
   matrix:
-- TOX_ENV=py27-backend_mysql-env_docker
-- TOX_ENV=py27-backend_sqlite-env_docker
-- TOX_ENV=py27-backend_postgres-env_docker
-- TOX_ENV=py35-backend_mysql-env_docker PYTHON_VERSION=3
-- TOX_ENV=py35-backend_sqlite-env_docker PYTHON_VERSION=3
-- TOX_ENV=py35-backend_postgres-env_docker PYTHON_VERSION=3
-- TOX_ENV=py27-backend_postgres-env_kubernetes KUBERNETES_VERSION=v1.9.0
-- TOX_ENV=py35-backend_postgres-env_kubernetes KUBERNETES_VERSION=v1.13.0 
PYTHON_VERSION=3
-
+- BACKEND=mysql ENV=docker
+- BACKEND=postgres ENV=docker
+- BACKEND=sqlite ENV=docker
+- BACKEND=postgres ENV=kubernetes KUBERNETES_VERSION=v1.9.0
+- BACKEND=postgres ENV=kubernetes KUBERNETES_VERSION=v1.13.0
+python:
+  - '3.6'
+  - '3.5'
 
 Review comment:
   Again, I'm confused. The tests will run inside of Docker, right?




[jira] [Created] (AIRFLOW-4786) Task execution fails when Celery is used.

2019-06-12 Thread Piotr Pekala (JIRA)
Piotr Pekala created AIRFLOW-4786:
-

 Summary: Task execution fails when Celery is used.
 Key: AIRFLOW-4786
 URL: https://issues.apache.org/jira/browse/AIRFLOW-4786
 Project: Apache Airflow
  Issue Type: Bug
  Components: celery
Affects Versions: 1.10.3
Reporter: Piotr Pekala


I'm using airflow 1.10.3 with LocalExecutor and everything is working properly. 
I want to switch to CeleryExecutor but tasks are failing (or are not executed 
at all).

I've tried this in two separate clusters (2 machines each: airflow + worker), 
both with similar configuration (Redis as broker and MySQL as backend). In both 
clusters a similar exception appears (below). In the first cluster, tasks have 
status null and the exception appears in the scheduler, while in the second 
cluster tasks are started by the worker but fail with the same exception (on 
the worker side):

 
{code:java}
Traceback (most recent call last):
  File "/usr/local/bin/airflow", line 32, in <module>
    args.func(args)
  File "/usr/local/lib/python2.7/dist-packages/airflow/utils/cli.py", line 74, in wrapper
    return f(*args, **kwargs)
  File "/usr/local/lib/python2.7/dist-packages/airflow/bin/cli.py", line 498, in run
    _run(args, dag, ti)
  File "/usr/local/lib/python2.7/dist-packages/airflow/bin/cli.py", line 397, in _run
    run_job.run()
  File "/usr/local/lib/python2.7/dist-packages/airflow/jobs.py", line 202, in run
    self._execute()
  File "/usr/local/lib/python2.7/dist-packages/airflow/jobs.py", line 2598, in _execute
    pool=self.pool):
  File "/usr/local/lib/python2.7/dist-packages/airflow/utils/db.py", line 74, in wrapper
    return func(*args, **kwargs)
  File "/usr/local/lib/python2.7/dist-packages/airflow/models.py", line 1557, in _check_and_change_state_before_execution
    session.commit()
  File "/usr/local/lib/python2.7/dist-packages/sqlalchemy/orm/session.py", line 927, in commit
    self.transaction.commit()
  File "/usr/local/lib/python2.7/dist-packages/sqlalchemy/orm/session.py", line 467, in commit
    self._prepare_impl()
  File "/usr/local/lib/python2.7/dist-packages/sqlalchemy/orm/session.py", line 447, in _prepare_impl
    self.session.flush()
  File "/usr/local/lib/python2.7/dist-packages/sqlalchemy/orm/session.py", line 2209, in flush
    self._flush(objects)
  File "/usr/local/lib/python2.7/dist-packages/sqlalchemy/orm/session.py", line 2329, in _flush
    transaction.rollback(_capture_exception=True)
  File "/usr/local/lib/python2.7/dist-packages/sqlalchemy/util/langhelpers.py", line 66, in __exit__
    compat.reraise(exc_type, exc_value, exc_tb)
  File "/usr/local/lib/python2.7/dist-packages/sqlalchemy/orm/session.py", line 2293, in _flush
    flush_context.execute()
  File "/usr/local/lib/python2.7/dist-packages/sqlalchemy/orm/unitofwork.py", line 389, in execute
    rec.execute(self)
  File "/usr/local/lib/python2.7/dist-packages/sqlalchemy/orm/unitofwork.py", line 548, in execute
    uow
  File "/usr/local/lib/python2.7/dist-packages/sqlalchemy/orm/persistence.py", line 177, in save_obj
    mapper, table, update)
  File "/usr/local/lib/python2.7/dist-packages/sqlalchemy/orm/persistence.py", line 760, in _emit_update_statements
    (table.description, len(records), rows))
sqlalchemy.orm.exc.StaleDataError: UPDATE statement on table 'task_instance' expected to update 1 row(s); 0 were matched.
[2019-06-12 16:41:31,167: ERROR/ForkPoolWorker-4] execute_command encountered a CalledProcessError
Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/airflow/executors/celery_executor.py", line 60, in execute_command
    close_fds=True, env=env)
  File "/usr/lib/python2.7/subprocess.py", line 186, in check_call
    raise CalledProcessError(retcode, cmd)
CalledProcessError: Command 'airflow run hello_world dummy_task 2019-06-12T15:05:00+00:00 --local -sd /data/airflow/dags/hello_world.py' returned non-zero exit status 1
[2019-06-12 16:41:31,168: ERROR/ForkPoolWorker-4] None
[2019-06-12 16:41:31,172: ERROR/ForkPoolWorker-4] Task airflow.executors.celery_executor.execute_command[dc2d3451-1d86-4097-8ded-6fd1aacd1de1] raised unexpected: AirflowException('Celery command failed',)
Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/celery/app/trace.py", line 375, in trace_task
    R = retval = fun(*args, **kwargs)
  File "/usr/local/lib/python2.7/dist-packages/celery/app/trace.py", line 632, in __protected_call__
    return self.run(*args, **kwargs)
  File "/usr/local/lib/python2.7/dist-packages/airflow/executors/celery_executor.py", line 65, in execute_command
    raise AirflowException('Celery command failed')
AirflowException: Celery command failed{code}
 

I've tested different configuration variations with different celery and 
airflow versions. I was able to make it work only on airflow 1.8.2.

There is either a bug in airflow 1.10+ or some missing / 

[jira] [Created] (AIRFLOW-4785) Config flag to exclude dummyoperator from having Unknown 'Dependencies Blocking Task From Getting Scheduled'

2019-06-12 Thread t oo (JIRA)
t oo created AIRFLOW-4785:
-

 Summary: Config flag to exclude dummyoperator from having Unknown 
'Dependencies Blocking Task From Getting Scheduled'
 Key: AIRFLOW-4785
 URL: https://issues.apache.org/jira/browse/AIRFLOW-4785
 Project: Apache Airflow
  Issue Type: Improvement
  Components: dependencies, scheduler
Affects Versions: 1.10.3
Reporter: t oo


My DAG has tasks from 12 different types of operators. One of them is the 
DummyOperator (which is meant to do 'nothing'), but it can't run during busy 
times because the {{parallelism}}, {{dag_concurrency}}, 
{{max_active_dag_runs_per_dag}} and {{non_pooled_task_slot_count}} limits have 
been met (so it is stuck in the scheduled state). I would like a new config 
flag (dont_block_dummy=True) that lets DummyOperator tasks always run even 
when the parallelism etc. limits are met. Without this feature, the only 
workaround is to set a huge parallelism limit (above now) and then give pools 
to all the other operators in my DAG. But my idea is that DummyOperator should 
not be subject to these limits, as it is not a resource hog.

 
h4. Task Instance Details
h5. Dependencies Blocking Task From Getting Scheduled
||Dependency||Reason||
|Unknown|All dependencies are met but the task instance is not running. In most 
cases this just means that the task will probably be scheduled soon unless:
- The scheduler is down or under heavy load
- The following configuration values may be limiting the number of queueable 
processes: {{parallelism}}, {{dag_concurrency}}, 
{{max_active_dag_runs_per_dag}}, {{non_pooled_task_slot_count}}
 
If this task instance does not start soon please contact your Airflow 
administrator for assistance.|



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] [airflow] mcw0933 commented on a change in pull request #5297: AIRFLOW-2143 - Fix TaskTries graph counts off-by-1

2019-06-12 Thread GitBox
mcw0933 commented on a change in pull request #5297: AIRFLOW-2143 - Fix 
TaskTries graph counts off-by-1
URL: https://github.com/apache/airflow/pull/5297#discussion_r293016519
 
 

 ##
 File path: CONTRIBUTING.md
 ##
 @@ -158,10 +158,7 @@ There are three ways to setup an Apache Airflow 
development environment.
   Start a docker container through Compose for development to avoid installing 
the packages directly on your system. The following will give you a shell 
inside a container, run all required service containers (MySQL, PostgresSQL, 
krb5 and so on) and install all the dependencies:
 
   ```bash
-  docker-compose -f scripts/ci/docker-compose.yml run airflow-testing bash
-  # From the container
-  export TOX_ENV=py35-backend_mysql-env_docker
-  /app/scripts/ci/run-ci.sh
+  docker-compose -f scripts/ci/docker-compose.yml run -e 
TOX_ENV=py35-backend_mysql-env_docker airflow-testing /app/scripts/ci/run-ci.sh
 
 Review comment:
   Done: https://github.com/apache/airflow/pull/5412/files




[GitHub] [airflow] mcw0933 opened a new pull request #5412: Add TOX_ENV environment variable inline with test suite run script

2019-06-12 Thread GitBox
mcw0933 opened a new pull request #5412: Add TOX_ENV environment variable 
inline with test suite run script
URL: https://github.com/apache/airflow/pull/5412
 
 
   Make sure you have checked _all_ steps below.
   
   ### Jira
   
   - [x] My PR addresses the following [Airflow 
Jira](https://issues.apache.org/jira/browse/AIRFLOW/) issues and references 
them in the PR title. For example, "\[AIRFLOW-XXX\] My Airflow PR"
 - https://issues.apache.org/jira/browse/AIRFLOW-XXX
 - In case you are fixing a typo in the documentation you can prepend your 
commit with \[AIRFLOW-XXX\], code changes always need a Jira issue.
 - In case you are proposing a fundamental code change, you need to create 
an Airflow Improvement Proposal 
([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals)).
 - In case you are adding a dependency, check if the license complies with 
the [ASF 3rd Party License 
Policy](https://www.apache.org/legal/resolved.html#category-x).
   
   ### Description
   
   - [x] Here are some details about my PR, including screenshots of any UI 
changes:
  Adds the TOX_ENV environment variable inline with the test suite run 
script.  Without this var set, the suite fails to run.  
  Docs were inconsistent in the two places this script is mentioned, one 
had it, the other did not, so I made them consistent.
  
   
   ### Tests
   
   - [x] My PR adds the following unit tests __OR__ does not need testing for 
this extremely good reason:
   No tests, only doc changes
   
   ### Commits
   
   - [x] My commits all reference Jira issues in their subject lines, and I 
have squashed multiple commits if they address the same issue. In addition, my 
commits follow the guidelines from "[How to write a good git commit 
message](http://chris.beams.io/posts/git-commit/)":
 1. Subject is separated from body by a blank line
 1. Subject is limited to 50 characters (not including Jira issue reference)
 1. Subject does not end with a period
 1. Subject uses the imperative mood ("add", not "adding")
 1. Body wraps at 72 characters
 1. Body explains "what" and "why", not "how"
   
   ### Documentation
   
   - [x] In case of new functionality, my PR adds documentation that describes 
how to use it.
 - All the public functions and the classes in the PR contain docstrings 
that explain what it does
 - If you implement backwards incompatible changes, please leave a note in 
the [Updating.md](https://github.com/apache/airflow/blob/master/UPDATING.md) so 
we can assign it to an appropriate release
   
   ### Code Quality
   
   - [x] Passes `flake8`
   




[GitHub] [airflow] codecov-io edited a comment on issue #5251: [AIRFLOW-4135] Add Google Cloud Build operator and hook

2019-06-12 Thread GitBox
codecov-io edited a comment on issue #5251: [AIRFLOW-4135] Add Google Cloud 
Build operator and hook
URL: https://github.com/apache/airflow/pull/5251#issuecomment-499510667
 
 
   # [Codecov](https://codecov.io/gh/apache/airflow/pull/5251?src=pr=h1) 
Report
   > Merging 
[#5251](https://codecov.io/gh/apache/airflow/pull/5251?src=pr=desc) into 
[master](https://codecov.io/gh/apache/airflow/commit/34056f8fd22b5d16822051a8f7d890090b684762?src=pr=desc)
 will **decrease** coverage by `0.04%`.
   > The diff coverage is `81.34%`.
   
   [![Impacted file tree 
graph](https://codecov.io/gh/apache/airflow/pull/5251/graphs/tree.svg?width=650=WdLKlKHOAU=150=pr)](https://codecov.io/gh/apache/airflow/pull/5251?src=pr=tree)
   
   ```diff
   @@            Coverage Diff             @@
   ##           master    #5251      +/-   ##
   ==========================================
   - Coverage   79.08%   79.03%    -0.05%    
   ==========================================
     Files         483      484       +1     
     Lines       30284    30331      +47     
   ==========================================
   + Hits        23949    23973      +24     
   - Misses       6335     6358      +23
   ```
   
   
   | [Impacted 
Files](https://codecov.io/gh/apache/airflow/pull/5251?src=pr=tree) | 
Coverage Δ | |
   |---|---|---|
   | 
[airflow/contrib/hooks/gcp\_api\_base\_hook.py](https://codecov.io/gh/apache/airflow/pull/5251/diff?src=pr=tree#diff-YWlyZmxvdy9jb250cmliL2hvb2tzL2djcF9hcGlfYmFzZV9ob29rLnB5)
 | `84.76% <ø> (ø)` | :arrow_up: |
   | 
[...ow/contrib/example\_dags/example\_gcp\_cloud\_build.py](https://codecov.io/gh/apache/airflow/pull/5251/diff?src=pr=tree#diff-YWlyZmxvdy9jb250cmliL2V4YW1wbGVfZGFncy9leGFtcGxlX2djcF9jbG91ZF9idWlsZC5weQ==)
 | `0% <0%> (ø)` | |
   | 
[...flow/contrib/operators/gcp\_cloud\_build\_operator.py](https://codecov.io/gh/apache/airflow/pull/5251/diff?src=pr=tree#diff-YWlyZmxvdy9jb250cmliL29wZXJhdG9ycy9nY3BfY2xvdWRfYnVpbGRfb3BlcmF0b3IucHk=)
 | `100% <100%> (ø)` | |
   | 
[airflow/contrib/hooks/gcp\_cloud\_build\_hook.py](https://codecov.io/gh/apache/airflow/pull/5251/diff?src=pr=tree#diff-YWlyZmxvdy9jb250cmliL2hvb2tzL2djcF9jbG91ZF9idWlsZF9ob29rLnB5)
 | `89.18% <89.18%> (ø)` | |
   | 
[airflow/api/\_\_init\_\_.py](https://codecov.io/gh/apache/airflow/pull/5251/diff?src=pr=tree#diff-YWlyZmxvdy9hcGkvX19pbml0X18ucHk=)
 | `72.22% <0%> (-3.97%)` | :arrow_down: |
   | 
[airflow/api/client/api\_client.py](https://codecov.io/gh/apache/airflow/pull/5251/diff?src=pr=tree#diff-YWlyZmxvdy9hcGkvY2xpZW50L2FwaV9jbGllbnQucHk=)
 | `62.5% <0%> (-2.21%)` | :arrow_down: |
   | 
[...ample\_dags/example\_branch\_python\_dop\_operator\_3.py](https://codecov.io/gh/apache/airflow/pull/5251/diff?src=pr=tree#diff-YWlyZmxvdy9leGFtcGxlX2RhZ3MvZXhhbXBsZV9icmFuY2hfcHl0aG9uX2RvcF9vcGVyYXRvcl8zLnB5)
 | `73.33% <0%> (-1.67%)` | :arrow_down: |
   | 
[airflow/example\_dags/example\_xcom.py](https://codecov.io/gh/apache/airflow/pull/5251/diff?src=pr=tree#diff-YWlyZmxvdy9leGFtcGxlX2RhZ3MvZXhhbXBsZV94Y29tLnB5)
 | `60.86% <0%> (-1.64%)` | :arrow_down: |
   | 
[airflow/dag/base\_dag.py](https://codecov.io/gh/apache/airflow/pull/5251/diff?src=pr=tree#diff-YWlyZmxvdy9kYWcvYmFzZV9kYWcucHk=)
 | `66.66% <0%> (-1.34%)` | :arrow_down: |
   | 
[airflow/api/auth/backend/kerberos\_auth.py](https://codecov.io/gh/apache/airflow/pull/5251/diff?src=pr=tree#diff-YWlyZmxvdy9hcGkvYXV0aC9iYWNrZW5kL2tlcmJlcm9zX2F1dGgucHk=)
 | `83.09% <0%> (-0.91%)` | :arrow_down: |
   | ... and [53 
more](https://codecov.io/gh/apache/airflow/pull/5251/diff?src=pr=tree-more) 
| |
   
   --
   
   [Continue to review full report at 
Codecov](https://codecov.io/gh/apache/airflow/pull/5251?src=pr=continue).
   > **Legend** - [Click here to learn 
more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data`
   > Powered by 
[Codecov](https://codecov.io/gh/apache/airflow/pull/5251?src=pr=footer). 
Last update 
[34056f8...b5ace39](https://codecov.io/gh/apache/airflow/pull/5251?src=pr=lastupdated).
 Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   




[GitHub] [airflow] ashb commented on a change in pull request #4890: [AIRFLOW-4048] HttpSensor provide-context to response_check

2019-06-12 Thread GitBox
ashb commented on a change in pull request #4890: [AIRFLOW-4048] HttpSensor 
provide-context to response_check
URL: https://github.com/apache/airflow/pull/4890#discussion_r292992991
 
 

 ##
 File path: airflow/sensors/http_sensor.py
 ##
 @@ -81,8 +89,10 @@ def poke(self, context):
                                 headers=self.headers,
                                 extra_options=self.extra_options)
         if self.response_check:
 -            # run content check on response
 -            return self.response_check(response)
 +            if self.provide_context:
 +                return self.response_check(response, context)
 
 Review comment:
   That was what you had before:
   
   ```python
   response_check_kwargs["context"] = context
   # ...
   response_check(response, **response_check_kwargs)
   ```
   which would still have the check fn having a signature of `(response, 
context)`. I am proposing making it have a signature of `(response, **context)`.
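
A minimal, hypothetical sketch contrasting the two calling conventions being discussed (the names `check_positional` and `check_kwargs` are illustrative, not the actual HttpSensor code):

```python
def make_context():
    # A tiny stand-in for Airflow's template context (illustrative values).
    return {"ti": "<task-instance>", "ds": "2019-06-12"}

def check_positional(response, context):
    # Signature required if the sensor calls: response_check(response, context)
    return context["ds"] == "2019-06-12"

def check_kwargs(response, **context):
    # Signature required if the sensor calls: response_check(response, **context)
    return context["ds"] == "2019-06-12"

context = make_context()
response = object()  # placeholder for a requests.Response

assert check_positional(response, context)
assert check_kwargs(response, **context)
```

Either convention works; the `**context` form matches what `provide_context=True` does for `PythonOperator` callables.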




[GitHub] [airflow] ashb commented on issue #5386: AIRFLOW-4748 Print stacktrace for dagbag process exceptions

2019-06-12 Thread GitBox
ashb commented on issue #5386: AIRFLOW-4748 Print stacktrace for dagbag process 
exceptions
URL: https://github.com/apache/airflow/pull/5386#issuecomment-501337380
 
 
   Do you fancy a bit of a deeper change here? Rather than just making import 
errors a list of strings, see if we could make it a list of dicts of 
`{'error': str(e), 'traceback': traceback.format_exc() }` and then we can 
improve the rendering of this in the UI. This _may_ turn out to be a much 
bigger change though, but would I think produce a nicer result.
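
The dict-of-error-details idea can be sketched as follows; `import_errors` and `record_import_error` are illustrative names, not the actual DagBag API:

```python
import traceback

# Instead of a bare string per broken DAG file, keep a dict with the
# message and the full traceback, so the UI can render both.
import_errors = []

def record_import_error(filepath, exc):
    import_errors.append({
        "file": filepath,
        "error": str(exc),
        "traceback": traceback.format_exc(),  # must be called inside the except block's scope
    })

try:
    raise ImportError("No module named 'missing_dep'")
except ImportError as e:
    record_import_error("dags/broken_dag.py", e)

assert import_errors[0]["error"] == "No module named 'missing_dep'"
assert "ImportError" in import_errors[0]["traceback"]
```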




[GitHub] [airflow] raphaelauv commented on a change in pull request #4890: [AIRFLOW-4048] HttpSensor provide-context to response_check

2019-06-12 Thread GitBox
raphaelauv commented on a change in pull request #4890: [AIRFLOW-4048] 
HttpSensor provide-context to response_check
URL: https://github.com/apache/airflow/pull/4890#discussion_r292990375
 
 

 ##
 File path: airflow/sensors/http_sensor.py
 ##
 @@ -81,8 +89,10 @@ def poke(self, context):
                                 headers=self.headers,
                                 extra_options=self.extra_options)
         if self.response_check:
 -            # run content check on response
 -            return self.response_check(response)
 +            if self.provide_context:
 +                return self.response_check(response, context)
 
 Review comment:
   This was my first proposition, but @Fokko proposed another one (you can check)




[GitHub] [airflow] jmcarp opened a new pull request #5411: Upgrade alembic to latest release.

2019-06-12 Thread GitBox
jmcarp opened a new pull request #5411: Upgrade alembic to latest release.
URL: https://github.com/apache/airflow/pull/5411
 
 
   Make sure you have checked _all_ steps below.
   
   ### Jira
   
   - [ ] My PR addresses the following [Airflow 
Jira](https://issues.apache.org/jira/browse/AIRFLOW/) issues and references 
them in the PR title. For example, "\[AIRFLOW-XXX\] My Airflow PR"
 - https://issues.apache.org/jira/browse/AIRFLOW-XXX
 - In case you are fixing a typo in the documentation you can prepend your 
commit with \[AIRFLOW-XXX\], code changes always need a Jira issue.
 - In case you are proposing a fundamental code change, you need to create 
an Airflow Improvement Proposal 
([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals)).
 - In case you are adding a dependency, check if the license complies with 
the [ASF 3rd Party License 
Policy](https://www.apache.org/legal/resolved.html#category-x).
   
   ### Description
   
   - [x] Here are some details about my PR, including screenshots of any UI 
changes:
   
   Use recent versions of alembic to pick up some performance and deprecation 
fixes. See the 
[changelog](https://alembic.sqlalchemy.org/en/latest/changelog.html) for 
details.
   
   ### Tests
   
   - [ ] My PR adds the following unit tests __OR__ does not need testing for 
this extremely good reason:
   
   ### Commits
   
   - [ ] My commits all reference Jira issues in their subject lines, and I 
have squashed multiple commits if they address the same issue. In addition, my 
commits follow the guidelines from "[How to write a good git commit 
message](http://chris.beams.io/posts/git-commit/)":
 1. Subject is separated from body by a blank line
 1. Subject is limited to 50 characters (not including Jira issue reference)
 1. Subject does not end with a period
 1. Subject uses the imperative mood ("add", not "adding")
 1. Body wraps at 72 characters
 1. Body explains "what" and "why", not "how"
   
   ### Documentation
   
   - [ ] In case of new functionality, my PR adds documentation that describes 
how to use it.
 - All the public functions and the classes in the PR contain docstrings 
that explain what it does
 - If you implement backwards incompatible changes, please leave a note in 
the [Updating.md](https://github.com/apache/airflow/blob/master/UPDATING.md) so 
we can assign it to an appropriate release
   
   ### Code Quality
   
   - [ ] Passes `flake8`
   




[GitHub] [airflow] mb-m commented on a change in pull request #5289: [AIRFLOW-4517] allow for airflow config to influence flask configuration

2019-06-12 Thread GitBox
mb-m commented on a change in pull request #5289: [AIRFLOW-4517] allow for 
airflow config to influence flask configuration
URL: https://github.com/apache/airflow/pull/5289#discussion_r292986385
 
 

 ##
 File path: airflow/bin/cli.py
 ##
 @@ -857,21 +857,31 @@ def webserver(args):
     if ssl_cert and not ssl_key:
         raise AirflowException(
             'An SSL key must also be provided for use with ' + ssl_cert)
+    flask_web_config = conf.getsection('flask_web')
+    if flask_web_config:
+        flask_web_upper_update = {}
+        for k in flask_web_config:
+            K = k.upper()
+            if k != K:
+                flask_web_upper_update[k] = K
+        for k in flask_web_upper_update:
+            flask_web_config[flask_web_upper_update[k]] = flask_web_config[k]
+            del flask_web_config[k]
 
 Review comment:
   The flask config names are upper-cased 
(http://flask.pocoo.org/docs/1.0/config), but the airflow ones are lower-cased. 
When you get variables set by environment (same as many other airflow configs), 
you want these to operate in the correct way too. Unfortunately the handler for 
environment in config has already lower-cased them by this point.
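
A minimal sketch of that key-uppercasing step, assuming `conf.getsection()` hands back a plain dict of lower-cased keys (the function name here is illustrative):

```python
# Flask expects upper-cased config keys; Airflow's config parser returns
# lower-cased ones. Collect the renames first, then apply them, so the
# dict is never mutated while being iterated.
def uppercase_keys(flask_web_config):
    renames = {k: k.upper() for k in flask_web_config if k != k.upper()}
    for old, new in renames.items():
        flask_web_config[new] = flask_web_config.pop(old)
    return flask_web_config

cfg = uppercase_keys({"secret_key": "abc", "SESSION_COOKIE_NAME": "airflow"})
assert cfg == {"SECRET_KEY": "abc", "SESSION_COOKIE_NAME": "airflow"}
```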




[GitHub] [airflow] mb-m commented on issue #5289: [AIRFLOW-4517] allow for airflow config to influence flask configuration

2019-06-12 Thread GitBox
mb-m commented on issue #5289: [AIRFLOW-4517] allow for airflow config to 
influence flask configuration
URL: https://github.com/apache/airflow/pull/5289#issuecomment-501331695
 
 
   @ashb as discussed above, `master` doesn't appear to have the same structure 
or issues.




[GitHub] [airflow] mik-laj commented on issue #5251: [AIRFLOW-4135] Add Google Cloud Build operator and hook

2019-06-12 Thread GitBox
mik-laj commented on issue #5251: [AIRFLOW-4135] Add Google Cloud Build 
operator and hook
URL: https://github.com/apache/airflow/pull/5251#issuecomment-501331115
 
 
   @potiuk @kaxil PTAL




[GitHub] [airflow] mik-laj commented on issue #5402: [AIRFLOW-4746] Implement GCP Cloud Tasks' Hook and Operators

2019-06-12 Thread GitBox
mik-laj commented on issue #5402: [AIRFLOW-4746] Implement GCP Cloud Tasks' 
Hook and Operators
URL: https://github.com/apache/airflow/pull/5402#issuecomment-501326746
 
 
   @ryanyuan Is everything alright now with Travis? Can I help you with 
something else?




[GitHub] [airflow] ashb commented on a change in pull request #4890: [AIRFLOW-4048] HttpSensor provide-context to response_check

2019-06-12 Thread GitBox
ashb commented on a change in pull request #4890: [AIRFLOW-4048] HttpSensor 
provide-context to response_check
URL: https://github.com/apache/airflow/pull/4890#discussion_r292970603
 
 

 ##
 File path: airflow/sensors/http_sensor.py
 ##
 @@ -81,8 +89,10 @@ def poke(self, context):
                                 headers=self.headers,
                                 extra_options=self.extra_options)
         if self.response_check:
 -            # run content check on response
 -            return self.response_check(response)
 +            if self.provide_context:
 +                return self.response_check(response, context)
 
 Review comment:
   This behaviour doesn't match the description (keyword-args); for the description to be true, this line should be
   
   ```python
return self.response_check(response, **context)
   ```
   
   This diff here is probably the right thing as it resembles what the 
provide_context arg does for the PythonOperator.
   
   An example in the docs would help clarify this:
   
   ```rst
   Executes a HTTP GET statement and returns False on failure caused by
   404 Not Found or `response_check` returning False.
   
   HTTP Error codes other than 404 (like 403) or Connection Refused Error
   would fail the sensor itself directly (no more poking).
   
   
   The response check can access the template context by passing 
``provide_context=True`` to the operator::
   
   def response_check(response, **context):
   # Can look at context['ti'] etc.
   return True
   
   HttpSensor(task_id='my_http_sensor', ..., provide_context=True, 
response_check=response_check)
   
   
   :param http_conn_id: The connection to run the sensor against
   :type http_conn_id: str
   :param method: The HTTP request method to use
   :type method: str
   :param endpoint: The relative part of the full url
   :type endpoint: str
   :param request_params: The parameters to be added to the GET url
   :type request_params: a dictionary of string key/value pairs
   :param headers: The HTTP headers to be added to the GET request
   :type headers: a dictionary of string key/value pairs
   :param provide_context: if set to true, Airflow will pass a set of
   keyword arguments that can be used in your function. This set of
   kwargs correspond exactly to what you can use in your jinja
   templates.
   :type provide_context: bool
   :param response_check: A check against the 'requests' response object.
   Returns True for 'pass' and False otherwise.
   :type response_check: A lambda or defined function.
   :param extra_options: Extra options for the 'requests' library, see the
   'requests' documentation (options to modify timeout, ssl, etc.)
   :type extra_options: A dictionary of options, where key is string and 
value
   depends on the option that's being modified.
   ```
   
   Probably want to format the example a bit better (line wrap etc.)




[GitHub] [airflow] ashb commented on a change in pull request #4743: [AIRFLOW-3871] render Operators template fields recursively

2019-06-12 Thread GitBox
ashb commented on a change in pull request #4743: [AIRFLOW-3871] render 
Operators template fields recursively
URL: https://github.com/apache/airflow/pull/4743#discussion_r292963159
 
 

 ##
 File path: airflow/models/baseoperator.py
 ##
 @@ -644,9 +648,23 @@ def render_template_from_field(self, attr, content, context, jinja_env):
                 k: rt("{}[{}]".format(attr, k), v, context)
                 for k, v in list(content.items())}
         else:
-            result = content
+            result = self._render_nested_template_fields(content, context)
        return result
 
+    def _render_nested_template_fields(self, content, context):
 
 Review comment:
   I think this diff should work:
   
   
   ```diff
   diff --git a/airflow/models/baseoperator.py b/airflow/models/baseoperator.py
   index b71f3c4ca1..3e081a5e3d 100644
   --- a/airflow/models/baseoperator.py
   +++ b/airflow/models/baseoperator.py
   @@ -632,7 +633,10 @@ def render_template_from_field(self, attr, content, context, jinja_env):
            Renders a template from a field. If the field is a string, it will
            simply render the string and return the result. If it is a collection or
            nested set of collections, it will traverse the structure and render
   -        all elements in it. If the field has another type, it will return it as it is.
   +        all elements in it. For any other type, it will recursively render attributes
   +        listed in its 'template_fields' attribute (class or instance level attribute)
   +        when this 'template_fields' is defined only.
   +        Finally returns the rendered field.
            """
            rt = self.render_template
            if isinstance(content, six.string_types):
   @@ -644,9 +648,23 @@ def render_template_from_field(self, attr, content, context, jinja_env):
                    k: rt("{}[{}]".format(attr, k), v, context)
                    for k, v in list(content.items())}
            else:
   -            result = content
   +            result = self._render_nested_template_fields(content, context, seen_oids=set())
            return result
   
   +    def _render_nested_template_fields(self, content, context, seen_oids):
   +        if id(content) not in seen_oids:
   +            seen_oids.add(id(content))
   +            try:
   +                nested_template_fields = content.template_fields
   +            except AttributeError:
   +                # content has no inner template fields
   +                return content
   +
   +            for field in nested_template_fields:
   +                rendered = self.render_template(field, getattr(content, field), context, seen_oids=seen_oids)
   +                setattr(content, field, rendered)
   +        return content
   +
        def render_template(self, attr, content, context):
            """
            Renders a template either from a file or directly in a field, and returns
   ```
   
   But I may have missed something.
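
A toy illustration of the `seen_oids` cycle guard: objects are tracked by `id()` so self-referencing structures are visited only once. `render_nested` here just upper-cases strings instead of calling Jinja, and `Inner` is a made-up class, not Airflow code:

```python
def render_nested(content, seen_oids):
    # Strings are "rendered" (upper-cased here as a stand-in for templating).
    if isinstance(content, str):
        return content.upper()
    # Already visited this object: stop, breaking any reference cycle.
    if id(content) in seen_oids:
        return content
    seen_oids.add(id(content))
    # Recurse into whatever the object declares as templated fields.
    for field in getattr(content, "template_fields", ()):
        setattr(content, field, render_nested(getattr(content, field), seen_oids))
    return content

class Inner:
    template_fields = ("cmd", "self_ref")
    def __init__(self):
        self.cmd = "echo {{ ds }}"
        self.self_ref = self  # a deliberate cycle, handled by seen_oids

obj = Inner()
rendered = render_nested(obj, seen_oids=set())
assert rendered.cmd == "ECHO {{ DS }}"
```

Without the `seen_oids` set, the `self_ref` attribute would recurse forever.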




[jira] [Created] (AIRFLOW-4784) Make GCP sensors Pylint compatible

2019-06-12 Thread Tomasz Urbaszek (JIRA)
Tomasz Urbaszek created AIRFLOW-4784:


 Summary: Make GCP sensors Pylint compatible
 Key: AIRFLOW-4784
 URL: https://issues.apache.org/jira/browse/AIRFLOW-4784
 Project: Apache Airflow
  Issue Type: Improvement
  Components: gcp
Affects Versions: 2.0.0
Reporter: Tomasz Urbaszek






--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Created] (AIRFLOW-4783) Make GCP operators Pylint compatible

2019-06-12 Thread Tomasz Urbaszek (JIRA)
Tomasz Urbaszek created AIRFLOW-4783:


 Summary: Make GCP operators Pylint compatible
 Key: AIRFLOW-4783
 URL: https://issues.apache.org/jira/browse/AIRFLOW-4783
 Project: Apache Airflow
  Issue Type: Improvement
  Components: gcp
Affects Versions: 2.0.0
Reporter: Tomasz Urbaszek








[jira] [Created] (AIRFLOW-4782) Make GCP hooks Pylint compatible

2019-06-12 Thread Tomasz Urbaszek (JIRA)
Tomasz Urbaszek created AIRFLOW-4782:


 Summary: Make GCP hooks Pylint compatible
 Key: AIRFLOW-4782
 URL: https://issues.apache.org/jira/browse/AIRFLOW-4782
 Project: Apache Airflow
  Issue Type: Improvement
  Components: gcp
Affects Versions: 2.0.0
Reporter: Tomasz Urbaszek
Assignee: Tomasz Urbaszek








[GitHub] [airflow] potiuk commented on issue #5406: [AIRFLOW-4681] Fix pylint problems for sensors module

2019-06-12 Thread GitBox
potiuk commented on issue #5406: [AIRFLOW-4681] Fix pylint problems for sensors 
module
URL: https://github.com/apache/airflow/pull/5406#issuecomment-501306993
 
 
   @mik-laj - there are some mypy/flake failures, and the name of the commit is 
wrong :)




[GitHub] [airflow] ashb commented on a change in pull request #4938: [AIRFLOW-4117] Multi-staging Image - Travis CI tests [Step 3/3]

2019-06-12 Thread GitBox
ashb commented on a change in pull request #4938: [AIRFLOW-4117] Multi-staging 
Image - Travis CI tests [Step 3/3]
URL: https://github.com/apache/airflow/pull/4938#discussion_r292953898
 
 

 ##
 File path: scripts/ci/ci_before_install.sh
 ##
 @@ -0,0 +1,34 @@
+#!/usr/bin/env bash
+
+#
+#  Licensed to the Apache Software Foundation (ASF) under one
+#  or more contributor license agreements.  See the NOTICE file
+#  distributed with this work for additional information
+#  regarding copyright ownership.  The ASF licenses this file
+#  to you under the Apache License, Version 2.0 (the
+#  "License"); you may not use this file except in compliance
+#  with the License.  You may obtain a copy of the License at
+#
+#http://www.apache.org/licenses/LICENSE-2.0
+#
+#  Unless required by applicable law or agreed to in writing,
+#  software distributed under the License is distributed on an
+#  "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+#  KIND, either express or implied.  See the License for the
+#  specific language governing permissions and limitations
+#  under the License.
+
+set -xeuo pipefail
+
+MY_DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )"
+
+export KUBERNETES_VERSION=${KUBERNETES_VERSION:=}
+# Required for K8s v1.10.x. See
+# https://github.com/kubernetes/kubernetes/issues/61058#issuecomment-372764783
+if [[ ! -z "${KUBERNETES_VERSION}" ]]; then
+sudo mount --make-shared /
+sudo service docker restart
+fi
+pip install --upgrade pip
 
 Review comment:
   Adding in a _single_ build (i.e. not all three DB backends) for extra python 
versions is probably worth it - I'm not sure it's worth doing both 3.6 and 3.7. I 
guess it depends on what versions distros ship.
   
   We should probably test against the versions shipped by Debian, Ubuntu, 
Centos/Amazon Linux?




[GitHub] [airflow] potiuk commented on a change in pull request #4938: [AIRFLOW-4117] Multi-staging Image - Travis CI tests [Step 3/3]

2019-06-12 Thread GitBox
potiuk commented on a change in pull request #4938: [AIRFLOW-4117] 
Multi-staging Image - Travis CI tests [Step 3/3]
URL: https://github.com/apache/airflow/pull/4938#discussion_r292951352
 
 

 ##
 File path: scripts/ci/ci_before_install.sh
 ##
 @@ -0,0 +1,34 @@
+#!/usr/bin/env bash
+
+#
+#  Licensed to the Apache Software Foundation (ASF) under one
+#  or more contributor license agreements.  See the NOTICE file
+#  distributed with this work for additional information
+#  regarding copyright ownership.  The ASF licenses this file
+#  to you under the Apache License, Version 2.0 (the
+#  "License"); you may not use this file except in compliance
+#  with the License.  You may obtain a copy of the License at
+#
+#http://www.apache.org/licenses/LICENSE-2.0
+#
+#  Unless required by applicable law or agreed to in writing,
+#  software distributed under the License is distributed on an
+#  "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+#  KIND, either express or implied.  See the License for the
+#  specific language governing permissions and limitations
+#  under the License.
+
+set -xeuo pipefail
+
+MY_DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )"
+
+export KUBERNETES_VERSION=${KUBERNETES_VERSION:=}
+# Required for K8s v1.10.x. See
+# https://github.com/kubernetes/kubernetes/issues/61058#issuecomment-372764783
+if [[ ! -z "${KUBERNETES_VERSION}" ]]; then
+sudo mount --make-shared /
+sudo service docker restart
+fi
+pip install --upgrade pip
 
 Review comment:
   For the 3.5/3.6 - I think mixing might be an option: "Maybe we can 
mix a bit, so do Postgres with 3.5 and Mysql with 3.6". There is also a 
discussion I started on the dev list about also running 3.7 (and possibly 3.8b1). 
What do you think @Fokko @ashb?






[GitHub] [airflow] potiuk commented on a change in pull request #4938: [AIRFLOW-4117] Multi-staging Image - Travis CI tests [Step 3/3]

2019-06-12 Thread GitBox
potiuk commented on a change in pull request #4938: [AIRFLOW-4117] 
Multi-staging Image - Travis CI tests [Step 3/3]
URL: https://github.com/apache/airflow/pull/4938#discussion_r292950513
 
 

 ##
 File path: scripts/ci/ci_before_install.sh
 ##
 @@ -0,0 +1,34 @@
+#!/usr/bin/env bash
+
+#
+#  Licensed to the Apache Software Foundation (ASF) under one
+#  or more contributor license agreements.  See the NOTICE file
+#  distributed with this work for additional information
+#  regarding copyright ownership.  The ASF licenses this file
+#  to you under the Apache License, Version 2.0 (the
+#  "License"); you may not use this file except in compliance
+#  with the License.  You may obtain a copy of the License at
+#
+#http://www.apache.org/licenses/LICENSE-2.0
+#
+#  Unless required by applicable law or agreed to in writing,
+#  software distributed under the License is distributed on an
+#  "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+#  KIND, either express or implied.  See the License for the
+#  specific language governing permissions and limitations
+#  under the License.
+
+set -xeuo pipefail
+
+MY_DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )"
+
+export KUBERNETES_VERSION=${KUBERNETES_VERSION:=}
+# Required for K8s v1.10.x. See
+# https://github.com/kubernetes/kubernetes/issues/61058#issuecomment-372764783
+if [[ ! -z "${KUBERNETES_VERSION}" ]]; then
+sudo mount --make-shared /
 
 Review comment:
   I am still looking into the Kubernetes tests. I think the Kubernetes tests are 
fairly difficult to reproduce locally at this stage, until we have a 
docker-compose based solution. This is a known problem that we decided for now 
to defer to the next step, "make the kubernetes CI tests better". I am happy to 
discuss and implement it in a better way :)




[GitHub] [airflow] potiuk commented on a change in pull request #4938: [AIRFLOW-4117] Multi-staging Image - Travis CI tests [Step 3/3]

2019-06-12 Thread GitBox
potiuk commented on a change in pull request #4938: [AIRFLOW-4117] 
Multi-staging Image - Travis CI tests [Step 3/3]
URL: https://github.com/apache/airflow/pull/4938#discussion_r292949104
 
 

 ##
 File path: scripts/ci/ci_before_install.sh
 ##
 @@ -0,0 +1,34 @@
+#!/usr/bin/env bash
+
+#
+#  Licensed to the Apache Software Foundation (ASF) under one
+#  or more contributor license agreements.  See the NOTICE file
+#  distributed with this work for additional information
+#  regarding copyright ownership.  The ASF licenses this file
+#  to you under the Apache License, Version 2.0 (the
+#  "License"); you may not use this file except in compliance
+#  with the License.  You may obtain a copy of the License at
+#
+#http://www.apache.org/licenses/LICENSE-2.0
+#
+#  Unless required by applicable law or agreed to in writing,
+#  software distributed under the License is distributed on an
+#  "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+#  KIND, either express or implied.  See the License for the
+#  specific language governing permissions and limitations
+#  under the License.
+
+set -xeuo pipefail
+
+MY_DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )"
+
+export KUBERNETES_VERSION=${KUBERNETES_VERSION:=}
+# Required for K8s v1.10.x. See
+# https://github.com/kubernetes/kubernetes/issues/61058#issuecomment-372764783
+if [[ ! -z "${KUBERNETES_VERSION}" ]]; then
+sudo mount --make-shared /
+sudo service docker restart
+fi
+pip install --upgrade pip
 
 Review comment:
   To avoid depending on whatever pip version ships in the Travis base image 
(this script runs outside of Docker).




[GitHub] [airflow] potiuk commented on a change in pull request #4938: [AIRFLOW-4117] Multi-staging Image - Travis CI tests [Step 3/3]

2019-06-12 Thread GitBox
potiuk commented on a change in pull request #4938: [AIRFLOW-4117] 
Multi-staging Image - Travis CI tests [Step 3/3]
URL: https://github.com/apache/airflow/pull/4938#discussion_r292948491
 
 

 ##
 File path: scripts/ci/ci_before_install.sh
 ##
 @@ -0,0 +1,34 @@
+#!/usr/bin/env bash
+
+#
+#  Licensed to the Apache Software Foundation (ASF) under one
+#  or more contributor license agreements.  See the NOTICE file
+#  distributed with this work for additional information
+#  regarding copyright ownership.  The ASF licenses this file
+#  to you under the Apache License, Version 2.0 (the
+#  "License"); you may not use this file except in compliance
+#  with the License.  You may obtain a copy of the License at
+#
+#http://www.apache.org/licenses/LICENSE-2.0
+#
+#  Unless required by applicable law or agreed to in writing,
+#  software distributed under the License is distributed on an
+#  "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+#  KIND, either express or implied.  See the License for the
+#  specific language governing permissions and limitations
+#  under the License.
+
+set -xeuo pipefail
+
+MY_DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )"
+
+export KUBERNETES_VERSION=${KUBERNETES_VERSION:=}
+# Required for K8s v1.10.x. See
+# https://github.com/kubernetes/kubernetes/issues/61058#issuecomment-372764783
+if [[ ! -z "${KUBERNETES_VERSION}" ]]; then
+sudo mount --make-shared /
+sudo service docker restart
+fi
+pip install --upgrade pip
 
 Review comment:
   Yes. I think we can even pin version here.




[jira] [Updated] (AIRFLOW-3999) Remove inline style in HTML

2019-06-12 Thread Kamil Bregula (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-3999?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Kamil Bregula updated AIRFLOW-3999:
---
Affects Version/s: 1.10.3
  Component/s: ui
  Summary: Remove inline style in HTML  (was: Inline style in HTML)

> Remove inline style in HTML
> ---
>
> Key: AIRFLOW-3999
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3999
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: ui
>Affects Versions: 1.10.3
>Reporter: Kamil Bregula
>Priority: Major
>
> Inline styles in HTML code are not considered good programming practice.





[jira] [Closed] (AIRFLOW-4244) Enforce order in imports

2019-06-12 Thread Kamil Bregula (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-4244?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Kamil Bregula closed AIRFLOW-4244.
--
Resolution: Duplicate

> Enforce order in imports
> 
>
> Key: AIRFLOW-4244
> URL: https://issues.apache.org/jira/browse/AIRFLOW-4244
> Project: Apache Airflow
>  Issue Type: New Feature
>Reporter: Kamil Bregula
>Priority: Trivial
>






[jira] [Updated] (AIRFLOW-4473) Add papermill operator

2019-06-12 Thread Kamil Bregula (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-4473?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Kamil Bregula updated AIRFLOW-4473:
---
Component/s: operators

> Add papermill operator
> --
>
> Key: AIRFLOW-4473
> URL: https://issues.apache.org/jira/browse/AIRFLOW-4473
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: operators
>Reporter: Bolke de Bruin
>Priority: Major
>
> Papermill is a way to parameterize and productize Python notebooks.





[jira] [Commented] (AIRFLOW-4473) Add papermill operator

2019-06-12 Thread Kamil Bregula (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-4473?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16862147#comment-16862147
 ] 

Kamil Bregula commented on AIRFLOW-4473:


Why is it still open?

> Add papermill operator
> --
>
> Key: AIRFLOW-4473
> URL: https://issues.apache.org/jira/browse/AIRFLOW-4473
> Project: Apache Airflow
>  Issue Type: Improvement
>Reporter: Bolke de Bruin
>Priority: Major
>
> Papermill is a way to parameterize and productize Python notebooks.





[jira] [Updated] (AIRFLOW-4560) Tez queue parameter passed by mapred_queue is incorrect

2019-06-12 Thread Kamil Bregula (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-4560?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Kamil Bregula updated AIRFLOW-4560:
---
Component/s: hooks

> Tez queue parameter passed by mapred_queue is incorrect
> ---
>
> Key: AIRFLOW-4560
> URL: https://issues.apache.org/jira/browse/AIRFLOW-4560
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: hooks
>Reporter: Alice Berard
>Priority: Major
>
> The parameter is currently {{tez.job.queue.name}}, see code: 
> [https://github.com/apache/airflow/blob/355bd56282e6a684c5c060953e9948ba2260aa37/airflow/hooks/hive_hooks.py#L214]
> But it should be {{tez.queue.name}}, see here: 
> [https://tez.apache.org/releases/0.9.2/tez-api-javadocs/configs/TezConfiguration.html]
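The fix described above amounts to emitting the key Tez actually reads. A small
hypothetical helper (the function name and the accompanying MapReduce key are
illustrative, modeled on the hook code linked in the report):

```python
# Sketch of the corrected behavior: Tez reads tez.queue.name (per the Tez
# configuration reference), not tez.job.queue.name as the hook currently sets.
def queue_settings(mapred_queue):
    return {
        "mapreduce.job.queuename": mapred_queue,  # MapReduce execution engine
        "tez.queue.name": mapred_queue,           # Tez execution engine (fixed key)
    }
```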





[GitHub] [airflow] codecov-io commented on issue #5410: [AIRFLOW-4781] Added the ability to specify ports in kubernetesOperator

2019-06-12 Thread GitBox
codecov-io commented on issue #5410: [AIRFLOW-4781] Added the ability to 
specify ports in kubernetesOperator
URL: https://github.com/apache/airflow/pull/5410#issuecomment-501276490
 
 
   # [Codecov](https://codecov.io/gh/apache/airflow/pull/5410?src=pr=h1) 
Report
   > Merging 
[#5410](https://codecov.io/gh/apache/airflow/pull/5410?src=pr=desc) into 
[master](https://codecov.io/gh/apache/airflow/commit/34056f8fd22b5d16822051a8f7d890090b684762?src=pr=desc)
 will **increase** coverage by `<.01%`.
   > The diff coverage is `100%`.
   
   [![Impacted file tree 
graph](https://codecov.io/gh/apache/airflow/pull/5410/graphs/tree.svg?width=650=WdLKlKHOAU=150=pr)](https://codecov.io/gh/apache/airflow/pull/5410?src=pr=tree)
   
   ```diff
    @@            Coverage Diff            @@
    ##           master    #5410      +/-   ##
    ==========================================
    + Coverage   79.08%   79.08%   +<.01%
    ==========================================
      Files         483      483
      Lines       30284    30301      +17
    ==========================================
    + Hits        23949    23965      +16
    - Misses       6335     6336       +1
   ```
   
   
   | [Impacted 
Files](https://codecov.io/gh/apache/airflow/pull/5410?src=pr=tree) | 
Coverage Δ | |
   |---|---|---|
   | 
[airflow/kubernetes/pod.py](https://codecov.io/gh/apache/airflow/pull/5410/diff?src=pr=tree#diff-YWlyZmxvdy9rdWJlcm5ldGVzL3BvZC5weQ==)
 | `100% <100%> (ø)` | :arrow_up: |
   | 
[.../kubernetes\_request\_factory/pod\_request\_factory.py](https://codecov.io/gh/apache/airflow/pull/5410/diff?src=pr=tree#diff-YWlyZmxvdy9rdWJlcm5ldGVzL2t1YmVybmV0ZXNfcmVxdWVzdF9mYWN0b3J5L3BvZF9yZXF1ZXN0X2ZhY3RvcnkucHk=)
 | `100% <100%> (ø)` | :arrow_up: |
   | 
[...etes\_request\_factory/kubernetes\_request\_factory.py](https://codecov.io/gh/apache/airflow/pull/5410/diff?src=pr=tree#diff-YWlyZmxvdy9rdWJlcm5ldGVzL2t1YmVybmV0ZXNfcmVxdWVzdF9mYWN0b3J5L2t1YmVybmV0ZXNfcmVxdWVzdF9mYWN0b3J5LnB5)
 | `99.09% <100%> (+0.03%)` | :arrow_up: |
   | 
[...rflow/contrib/operators/kubernetes\_pod\_operator.py](https://codecov.io/gh/apache/airflow/pull/5410/diff?src=pr=tree#diff-YWlyZmxvdy9jb250cmliL29wZXJhdG9ycy9rdWJlcm5ldGVzX3BvZF9vcGVyYXRvci5weQ==)
 | `98.8% <100%> (+0.04%)` | :arrow_up: |
   | 
[airflow/kubernetes/pod\_generator.py](https://codecov.io/gh/apache/airflow/pull/5410/diff?src=pr=tree#diff-YWlyZmxvdy9rdWJlcm5ldGVzL3BvZF9nZW5lcmF0b3IucHk=)
 | `86.84% <100%> (+1.12%)` | :arrow_up: |
   | 
[airflow/models/taskinstance.py](https://codecov.io/gh/apache/airflow/pull/5410/diff?src=pr=tree#diff-YWlyZmxvdy9tb2RlbHMvdGFza2luc3RhbmNlLnB5)
 | `93.02% <0%> (-0.17%)` | :arrow_down: |
   
   --
   
   [Continue to review full report at 
Codecov](https://codecov.io/gh/apache/airflow/pull/5410?src=pr=continue).
   > **Legend** - [Click here to learn 
more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute  (impact)`, `ø = not affected`, `? = missing data`
   > Powered by 
[Codecov](https://codecov.io/gh/apache/airflow/pull/5410?src=pr=footer). 
Last update 
[34056f8...1cb108a](https://codecov.io/gh/apache/airflow/pull/5410?src=pr=lastupdated).
 Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   




[jira] [Commented] (AIRFLOW-3503) GoogleCloudStorageHook delete return success when nothing was done

2019-06-12 Thread Kamil Bregula (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-3503?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16862093#comment-16862093
 ] 

Kamil Bregula commented on AIRFLOW-3503:


[~yohei]

Hello

Do you plan to continue working on this change? I am organising tickets in JIRA 
related to GCP and I would like to know the current status of this change.

This functionality seems interesting, and the community would be happy to see 
this change completed.

Can I help with it?

Regards,

> GoogleCloudStorageHook  delete return success when nothing was done
> ---
>
> Key: AIRFLOW-3503
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3503
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: gcp
>Affects Versions: 1.10.1
>Reporter: lot
>Assignee: Yohei Onishi
>Priority: Major
>  Labels: bigquery, gcp, hooks
>
> I'm loading files to BigQuery from Storage using:
>  
> {{gcs_export_uri = BQ_TABLE_NAME + '/' + EXEC_TIMESTAMP_PATH + '/*' 
> gcs_to_bigquery_op = GoogleCloudStorageToBigQueryOperator( dag=dag, 
> task_id='load_products_to_BigQuery', bucket=GCS_BUCKET_ID, 
> destination_project_dataset_table=table_name_template, 
> source_format='NEWLINE_DELIMITED_JSON', source_objects=[gcs_export_uri], 
> src_fmt_configs=\{'ignoreUnknownValues': True}, 
> create_disposition='CREATE_IF_NEEDED', write_disposition='WRITE_TRUNCATE', 
> skip_leading_rows = 1, google_cloud_storage_conn_id=CONNECTION_ID, 
> bigquery_conn_id=CONNECTION_ID)}}
>  
> After that I want to delete the files so I do:
> {{def delete_folder():}}
> {{    """}}
> {{    Delete files Google cloud storage}}
> {{    """}}
> {{    hook = GoogleCloudStorageHook(}}
> {{    google_cloud_storage_conn_id=CONNECTION_ID)}}
> {{    hook.delete(}}
> {{    bucket=GCS_BUCKET_ID,}}
> {{    object=gcs_export_uri)}}
>  
>  
> {{This runs with PythonOperator.}}
> {{The task is marked as Success even though nothing was deleted.}}
> {{Log:}}
> [2018-12-12 11:31:29,247] \{base_task_runner.py:98} INFO - Subtask: 
> [2018-12-12 11:31:29,247] \{transport.py:151} INFO - Attempting refresh to 
> obtain initial access_token [2018-12-12 11:31:29,249] 
> \{base_task_runner.py:98} INFO - Subtask: [2018-12-12 11:31:29,249] 
> \{client.py:795} INFO - Refreshing access_token [2018-12-12 11:31:29,584] 
> \{base_task_runner.py:98} INFO - Subtask: [2018-12-12 11:31:29,583] 
> \{python_operator.py:90} INFO - Done. Returned value was: None
>  
>  
> I expect the function to fail and return something like "file was not found" 
> if there is nothing to delete, or let the user decide with a specific flag 
> whether the function should fail or succeed if files were not found.
>  
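One way to sketch the fail-or-succeed behavior the reporter asks for is to wrap
the hook call. This is a sketch under assumptions: `hook` stands for any object
whose `delete(bucket, object)` returns a count of removed objects, whereas the
real GoogleCloudStorageHook.delete returns None, which is the root of the
silent success.

```python
# Sketch only: wrap the delete call so the task can fail when nothing matched.
# `hook.delete` returning a count is an assumption for illustration; the real
# hook's return value would need to change (or be checked differently).
def delete_or_fail(hook, bucket, obj, fail_on_missing=True):
    deleted = hook.delete(bucket=bucket, object=obj) or 0
    if deleted == 0 and fail_on_missing:
        raise FileNotFoundError(
            "nothing deleted for gs://%s/%s" % (bucket, obj))
    return deleted
```

A `fail_on_missing=False` call keeps today's behavior for users who prefer a
silent success.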





[GitHub] [airflow] mik-laj commented on issue #4316: [AIRFLOW-3511] Create GCP Memorystore Redis Hook

2019-06-12 Thread GitBox
mik-laj commented on issue #4316:  [AIRFLOW-3511] Create GCP Memorystore Redis 
Hook
URL: https://github.com/apache/airflow/pull/4316#issuecomment-501269431
 
 
   It sounds good. I'll mark it as my task to do.




[GitHub] [airflow-ci] potiuk commented on issue #5: AIRFLOW-2193- Added r-base and rpy2 for ROperator

2019-06-12 Thread GitBox
potiuk commented on issue #5: AIRFLOW-2193- Added r-base and rpy2 for ROperator 
URL: https://github.com/apache/airflow-ci/pull/5#issuecomment-501268208
 
 
   I agree with @ashb and @dimberman. I also have a concrete proposal for how 
to address needs such as @matkalinowski's (if I understand them correctly).
   
   What I plan to look at after AIP-10 (Docker) and hopefully AIP-7 (Breeze) 
is to come back to the proposal of AIP-4 (support for system tests for 
external systems). There I wanted to add a way to run system tests via CI (but 
not necessarily in Airflow's default CI) that communicate with external 
systems, and maybe we could extend that so these R-related tests are treated 
as "external systems" (even though R is not an external system, it can be 
treated as such).
   
   The gist of AIP-4 is to let people add "system tests" for external systems 
(like GCP) that are easy to run automatically (likely using the same pytest 
approach and TestCase base class) but are skipped by default. This way, each 
team could have a custom CI setup that runs the tests automatically (if they 
choose to configure the external systems). So the idea I vaguely have in mind 
is to be able to a) add extra requirements to the image (in this case R) 
before the tests are run, and b) set up the environment so that tests for an 
external system can be run easily.
   
   This way we could keep the R tests, GCP tests, or AWS tests in the code 
without running them all the time (and each team working on such tests could 
easily build a CI setup that runs the tests for their work).
   
   Would that address your needs @matkalinowski (and your concerns @ashb 
@dimberman)?




[jira] [Closed] (AIRFLOW-4477) Add v2.models to BigQueryHook

2019-06-12 Thread Ryan Yuan (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-4477?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ryan Yuan closed AIRFLOW-4477.
--
Resolution: Won't Do

> Add v2.models to BigQueryHook
> -
>
> Key: AIRFLOW-4477
> URL: https://issues.apache.org/jira/browse/AIRFLOW-4477
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: gcp
>Reporter: Ryan Yuan
>Assignee: Ryan Yuan
>Priority: Major
>  Labels: bigquery
>
> Add v2.models to BigQueryHook
> Reference: [https://cloud.google.com/bigquery/docs/reference/rest/v2/models]





[jira] [Commented] (AIRFLOW-4477) Add v2.models to BigQueryHook

2019-06-12 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-4477?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16862075#comment-16862075
 ] 

ASF GitHub Bot commented on AIRFLOW-4477:
-

ryanyuan commented on pull request #5268: [AIRFLOW-4477] Add models methods to 
BQHook
URL: https://github.com/apache/airflow/pull/5268
 
 
   
 



> Add v2.models to BigQueryHook
> -
>
> Key: AIRFLOW-4477
> URL: https://issues.apache.org/jira/browse/AIRFLOW-4477
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: gcp
>Reporter: Ryan Yuan
>Assignee: Ryan Yuan
>Priority: Major
>  Labels: bigquery
>
> Add v2.models to BigQueryHook
> Reference: [https://cloud.google.com/bigquery/docs/reference/rest/v2/models]





[GitHub] [airflow] ryanyuan commented on issue #5268: [AIRFLOW-4477] Add models methods to BQHook

2019-06-12 Thread GitBox
ryanyuan commented on issue #5268: [AIRFLOW-4477] Add models methods to BQHook
URL: https://github.com/apache/airflow/pull/5268#issuecomment-501262116
 
 
   @mik-laj Agree. Closing this PR.




[GitHub] [airflow] mik-laj commented on issue #5268: [AIRFLOW-4477] Add models methods to BQHook

2019-06-12 Thread GitBox
mik-laj commented on issue #5268: [AIRFLOW-4477] Add models methods to BQHook
URL: https://github.com/apache/airflow/pull/5268#issuecomment-501260662
 
 
   Do you think it's worth closing the ticket now? It is now a ticket without 
a proper use case, and it can introduce noise in Jira.




[GitHub] [airflow] ryanyuan commented on issue #5268: [AIRFLOW-4477] Add models methods to BQHook

2019-06-12 Thread GitBox
ryanyuan commented on issue #5268: [AIRFLOW-4477] Add models methods to BQHook
URL: https://github.com/apache/airflow/pull/5268#issuecomment-501258806
 
 
   @mik-laj I am not going to work on this until more endpoints are exposed, 
because so far we've found we can run the query directly to achieve what the 
API can do.




[jira] [Commented] (AIRFLOW-4781) Added the ability to specify ports in kubernetesOperator

2019-06-12 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-4781?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16862067#comment-16862067
 ] 

ASF GitHub Bot commented on AIRFLOW-4781:
-

rweverwijk commented on pull request #5410: [AIRFLOW-4781] Added the ability to 
specify ports in kubernetesOperator
URL: https://github.com/apache/airflow/pull/5410
 
 
   Make sure you have checked _all_ steps below.
   
   ### Jira
   
   - [ ] My PR addresses the following [Airflow 
Jira](https://issues.apache.org/jira/browse/AIRFLOW/) issues and references 
them in the PR title. For example, "\[AIRFLOW-XXX\] My Airflow PR"
 - https://issues.apache.org/jira/browse/AIRFLOW-XXX
 - In case you are fixing a typo in the documentation you can prepend your 
commit with \[AIRFLOW-XXX\], code changes always need a Jira issue.
 - In case you are proposing a fundamental code change, you need to create 
an Airflow Improvement Proposal 
([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals)).
 - In case you are adding a dependency, check if the license complies with 
the [ASF 3rd Party License 
Policy](https://www.apache.org/legal/resolved.html#category-x).
   
   ### Description
   
   - [ ] Here are some details about my PR, including screenshots of any UI 
changes:
   
   ### Tests
   
   - [ ] My PR adds the following unit tests __OR__ does not need testing for 
this extremely good reason:
   
   ### Commits
   
   - [ ] My commits all reference Jira issues in their subject lines, and I 
have squashed multiple commits if they address the same issue. In addition, my 
commits follow the guidelines from "[How to write a good git commit 
message](http://chris.beams.io/posts/git-commit/)":
 1. Subject is separated from body by a blank line
 1. Subject is limited to 50 characters (not including Jira issue reference)
 1. Subject does not end with a period
 1. Subject uses the imperative mood ("add", not "adding")
 1. Body wraps at 72 characters
 1. Body explains "what" and "why", not "how"
   
   ### Documentation
   
   - [ ] In case of new functionality, my PR adds documentation that describes 
how to use it.
 - All the public functions and the classes in the PR contain docstrings 
that explain what it does
 - If you implement backwards incompatible changes, please leave a note in 
the [Updating.md](https://github.com/apache/airflow/blob/master/UPDATING.md) so 
we can assign it to an appropriate release
   
   ### Code Quality
   
   - [ ] Passes `flake8`
   
 



> Added the ability to specify ports in kubernetesOperator
> 
>
> Key: AIRFLOW-4781
> URL: https://issues.apache.org/jira/browse/AIRFLOW-4781
> Project: Apache Airflow
>  Issue Type: New Feature
>  Components: operators
>Affects Versions: 1.10.3
>Reporter: Ron van Weverwijk
>Assignee: Ron van Weverwijk
>Priority: Minor
> Fix For: 1.10.4, 2.0.0
>
>
> In kubernetes you have the ability to specify which ports to open to the 
> container.
>  
> {code:java}
> containers:
> - name: task-pv-container
>   image: nginx
>   ports:
> - containerPort: 80
>   name: "http-server"
> {code}
> In this issue we want to add that support to the kubernetesOperator and 
> `PodRequestFactory`
> With the support in the PodRequestFactory we can add functionality to build 
> an operator that starts a container in Kubernetes and interacts over the opened 
> port.
>  





[GitHub] [airflow] ryanyuan commented on issue #4316: [AIRFLOW-3511] Create GCP Memorystore Redis Hook

2019-06-12 Thread GitBox
ryanyuan commented on issue #4316:  [AIRFLOW-3511] Create GCP Memorystore Redis 
Hook
URL: https://github.com/apache/airflow/pull/4316#issuecomment-501257200
 
 
   I planned to rewrite this using the Python library instead of the 
discovery API. Do you want to take it, since you are also going to do the 
operators for it? @mik-laj 




[jira] [Work started] (AIRFLOW-4781) Added the ability to specify ports in kubernetesOperator

2019-06-12 Thread Ron van Weverwijk (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-4781?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Work on AIRFLOW-4781 started by Ron van Weverwijk.
--
> Added the ability to specify ports in kubernetesOperator
> 
>
> Key: AIRFLOW-4781
> URL: https://issues.apache.org/jira/browse/AIRFLOW-4781
> Project: Apache Airflow
>  Issue Type: New Feature
>  Components: operators
>Affects Versions: 1.10.3
>Reporter: Ron van Weverwijk
>Assignee: Ron van Weverwijk
>Priority: Minor
> Fix For: 1.10.4, 2.0.0
>
>
> In kubernetes you have the ability to specify which ports to open to the 
> container.
>  
> {code:java}
> containers:
> - name: task-pv-container
>   image: nginx
>   ports:
> - containerPort: 80
>   name: "http-server"
> {code}
> In this issue we want to add that support to the kubernetesOperator and 
> `PodRequestFactory`
> With the support in the PodRequestFactory we can add functionality to build 
> an operator that starts a container in Kubernetes and interacts over the opened 
> port.
>  





[GitHub] [airflow] Fokko commented on a change in pull request #4938: [AIRFLOW-4117] Multi-staging Image - Travis CI tests [Step 3/3]

2019-06-12 Thread GitBox
Fokko commented on a change in pull request #4938: [AIRFLOW-4117] Multi-staging 
Image - Travis CI tests [Step 3/3]
URL: https://github.com/apache/airflow/pull/4938#discussion_r292769821
 
 

 ##
 File path: .travis.yml
 ##
 @@ -16,27 +16,34 @@
 # specific language governing permissions and limitations
 # under the License.
 #
+sudo: true
 dist: xenial
 language: python
-python:
-  - "3.6"
 env:
   global:
-- TRAVIS_CACHE=$HOME/.travis_cache/
+- BUILD_ID=${TRAVIS_BUILD_ID}
+- BRANCH_NAME=${TRAVIS_BRANCH}
   matrix:
-- TOX_ENV=py27-backend_mysql-env_docker
-- TOX_ENV=py27-backend_sqlite-env_docker
-- TOX_ENV=py27-backend_postgres-env_docker
-- TOX_ENV=py35-backend_mysql-env_docker PYTHON_VERSION=3
-- TOX_ENV=py35-backend_sqlite-env_docker PYTHON_VERSION=3
-- TOX_ENV=py35-backend_postgres-env_docker PYTHON_VERSION=3
-- TOX_ENV=py27-backend_postgres-env_kubernetes KUBERNETES_VERSION=v1.9.0
-- TOX_ENV=py35-backend_postgres-env_kubernetes KUBERNETES_VERSION=v1.13.0 
PYTHON_VERSION=3
-
+- BACKEND=mysql ENV=docker
+- BACKEND=postgres ENV=docker
+- BACKEND=sqlite ENV=docker
+- BACKEND=postgres ENV=kubernetes KUBERNETES_VERSION=v1.9.0
+- BACKEND=postgres ENV=kubernetes KUBERNETES_VERSION=v1.13.0
+python:
+  - '3.6'
+  - '3.5'
 
 Review comment:
   3.5 is still maintained and will be for a while. Maybe we can mix a bit, so 
we do Postgres with 3.5 and MySQL with 3.6.




[GitHub] [airflow] Fokko commented on a change in pull request #4938: [AIRFLOW-4117] Multi-staging Image - Travis CI tests [Step 3/3]

2019-06-12 Thread GitBox
Fokko commented on a change in pull request #4938: [AIRFLOW-4117] Multi-staging 
Image - Travis CI tests [Step 3/3]
URL: https://github.com/apache/airflow/pull/4938#discussion_r292772594
 
 

 ##
 File path: scripts/ci/ci_before_install.sh
 ##
 @@ -0,0 +1,34 @@
+#!/usr/bin/env bash
+
+#
+#  Licensed to the Apache Software Foundation (ASF) under one
+#  or more contributor license agreements.  See the NOTICE file
+#  distributed with this work for additional information
+#  regarding copyright ownership.  The ASF licenses this file
+#  to you under the Apache License, Version 2.0 (the
+#  "License"); you may not use this file except in compliance
+#  with the License.  You may obtain a copy of the License at
+#
+#http://www.apache.org/licenses/LICENSE-2.0
+#
+#  Unless required by applicable law or agreed to in writing,
+#  software distributed under the License is distributed on an
+#  "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+#  KIND, either express or implied.  See the License for the
+#  specific language governing permissions and limitations
+#  under the License.
+
+set -xeuo pipefail
+
+MY_DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )"
+
+export KUBERNETES_VERSION=${KUBERNETES_VERSION:=}
+# Required for K8s v1.10.x. See
+# https://github.com/kubernetes/kubernetes/issues/61058#issuecomment-372764783
+if [[ ! -z "${KUBERNETES_VERSION}" ]]; then
+sudo mount --make-shared /
+sudo service docker restart
+fi
+pip install --upgrade pip
 
 Review comment:
   Not sure if we should do this. Maybe we can just use the pip version out of 
the box. The reason: in the past we've seen breaking pip versions, and it is 
not required to use the latest version of pip.




[GitHub] [airflow-ci] dimberman commented on issue #5: AIRFLOW-2193- Added r-base and rpy2 for ROperator

2019-06-12 Thread GitBox
dimberman commented on issue #5: AIRFLOW-2193- Added r-base and rpy2 for 
ROperator 
URL: https://github.com/apache/airflow-ci/pull/5#issuecomment-501232274
 
 
   I second @ash on this. Having to debug R tests would add an extra barrier 
for contributors since it's not a common language for system developers.




[GitHub] [airflow] ashb commented on issue #3115: [AIRFLOW-2193] Add ROperator for using R

2019-06-12 Thread GitBox
ashb commented on issue #3115: [AIRFLOW-2193] Add ROperator for using R
URL: https://github.com/apache/airflow/pull/3115#issuecomment-501230990
 
 
   As I mentioned in the other PR - I really want to avoid running R in the 
Airflow unit tests - we don't want to test R or the bindings (as that is the 
job of the other projects' test suites), just that we call it as expected.




[GitHub] [airflow] ashb commented on a change in pull request #5297: AIRFLOW-2143 - Fix TaskTries graph counts off-by-1

2019-06-12 Thread GitBox
ashb commented on a change in pull request #5297: AIRFLOW-2143 - Fix TaskTries 
graph counts off-by-1
URL: https://github.com/apache/airflow/pull/5297#discussion_r292861897
 
 

 ##
 File path: CONTRIBUTING.md
 ##
 @@ -158,10 +158,7 @@ There are three ways to setup an Apache Airflow 
development environment.
   Start a docker container through Compose for development to avoid installing 
the packages directly on your system. The following will give you a shell 
inside a container, run all required service containers (MySQL, PostgresSQL, 
krb5 and so on) and install all the dependencies:
 
   ```bash
-  docker-compose -f scripts/ci/docker-compose.yml run airflow-testing bash
-  # From the container
-  export TOX_ENV=py35-backend_mysql-env_docker
-  /app/scripts/ci/run-ci.sh
+  docker-compose -f scripts/ci/docker-compose.yml run -e 
TOX_ENV=py35-backend_mysql-env_docker airflow-testing /app/scripts/ci/run-ci.sh
 
 Review comment:
   Please make a separate PR




[GitHub] [airflow-ci] ashb commented on issue #5: AIRFLOW-2193- Added r-base and rpy2 for ROperator

2019-06-12 Thread GitBox
ashb commented on issue #5: AIRFLOW-2193- Added r-base and rpy2 for ROperator 
URL: https://github.com/apache/airflow-ci/pull/5#issuecomment-501228557
 
 
   I'd really rather we didn't run R in the airflow unit test suite - that 
isn't our job.




[jira] [Assigned] (AIRFLOW-4781) Added the ability to specify ports in kubernetesOperator

2019-06-12 Thread Fokko Driesprong (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-4781?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Fokko Driesprong reassigned AIRFLOW-4781:
-

Assignee: Ron van Weverwijk

> Added the ability to specify ports in kubernetesOperator
> 
>
> Key: AIRFLOW-4781
> URL: https://issues.apache.org/jira/browse/AIRFLOW-4781
> Project: Apache Airflow
>  Issue Type: New Feature
>  Components: operators
>Affects Versions: 1.10.3
>Reporter: Ron van Weverwijk
>Assignee: Ron van Weverwijk
>Priority: Minor
> Fix For: 1.10.4, 2.0.0
>
>
> In kubernetes you have the ability to specify which ports to open to the 
> container.
>  
> {code:java}
> containers:
> - name: task-pv-container
>   image: nginx
>   ports:
> - containerPort: 80
>   name: "http-server"
> {code}
> In this issue we want to add that support to the kubernetesOperator and 
> `PodRequestFactory`
> With the support in the PodRequestFactory we can add functionality to build 
> an operator that starts a container in Kubernetes and interacts over the opened 
> port.
>  





[jira] [Created] (AIRFLOW-4781) Added the ability to specify ports in kubernetesOperator

2019-06-12 Thread Ron van Weverwijk (JIRA)
Ron van Weverwijk created AIRFLOW-4781:
--

 Summary: Added the ability to specify ports in kubernetesOperator
 Key: AIRFLOW-4781
 URL: https://issues.apache.org/jira/browse/AIRFLOW-4781
 Project: Apache Airflow
  Issue Type: New Feature
  Components: operators
Affects Versions: 1.10.3
Reporter: Ron van Weverwijk


In Kubernetes you have the ability to specify which ports to open on the 
container.

 
{code:java}
containers:
- name: task-pv-container
  image: nginx
  ports:
- containerPort: 80
  name: "http-server"
{code}
In this issue we want to add that support to the kubernetesOperator and 
`PodRequestFactory`

With the support in the PodRequestFactory we can add functionality to build an 
operator that starts a container in Kubernetes and interacts over the opened port.
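How a port spec could be folded into the container definition above can be
sketched in plain Python; the `Port` class and key names here are illustrative
stand-ins, not necessarily the names the eventual implementation uses.

```python
# Illustrative sketch of merging port specs into a container definition the
# way a pod request factory might; class and key names are assumptions.
class Port:
    def __init__(self, name, container_port):
        self.name = name
        self.container_port = container_port

def attach_ports(container, ports):
    """Return a copy of the container spec with a `ports` section added."""
    spec = dict(container)
    spec["ports"] = [
        {"name": p.name, "containerPort": p.container_port} for p in ports
    ]
    return spec

container = {"name": "task-pv-container", "image": "nginx"}
spec = attach_ports(container, [Port("http-server", 80)])
```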

 





[jira] [Created] (AIRFLOW-4780) CLI connections add option -i for JSON file

2019-06-12 Thread raphael auv (JIRA)
raphael auv created AIRFLOW-4780:


 Summary: CLI connections add option -i for JSON file
 Key: AIRFLOW-4780
 URL: https://issues.apache.org/jira/browse/AIRFLOW-4780
 Project: Apache Airflow
  Issue Type: Improvement
  Components: cli
Affects Versions: 1.10.4
Reporter: raphael auv
Assignee: raphael auv


We have tons of connections and we would like to register them with a JSON 
file, like variables and pools.

Today we have to call the CLI for each connection, which takes more than 4 
minutes.

So it would be nice to have a -i option to import a JSON file.
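A minimal sketch of the requested bulk import: read a JSON map of connection
id to URI and emit the per-connection CLI calls that currently have to be
typed by hand. The JSON schema and flag names are assumptions; the ticket does
not define them.

```python
import json
import shlex

# Hypothetical JSON layout: {"my_conn": "postgres://user:pass@host:5432/db"}.
# Flag names mirror the 1.10-era `airflow connections` CLI but are not
# guaranteed to match any final implementation.
def cli_commands(json_text):
    conns = json.loads(json_text)
    return [
        "airflow connections --add --conn_id {} --conn_uri {}".format(
            shlex.quote(cid), shlex.quote(uri)
        )
        for cid, uri in sorted(conns.items())
    ]
```

Feeding the generated commands to a shell (or calling the CLI functions
directly) replaces the minutes of manual invocations with one import step.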





[GitHub] [airflow] mik-laj commented on issue #4316: [AIRFLOW-3511] Create GCP Memorystore Redis Hook

2019-06-12 Thread GitBox
mik-laj commented on issue #4316:  [AIRFLOW-3511] Create GCP Memorystore Redis 
Hook
URL: https://github.com/apache/airflow/pull/4316#issuecomment-501180801
 
 
   Hello
   
   Do you plan to continue working on this change? I am organising tickets in 
JIRA related to GCP and I would like to know the current status of this change.
   
   This functionality seems interesting. The community would be happy if you 
completed this change. Can I help you with this?
   
   Regards,


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Commented] (AIRFLOW-3333) New features enable transferring of files or data from GCS to a SFTP remote path and SFTP to GCS path.

2019-06-12 Thread Kamil Bregula (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16861897#comment-16861897
 ] 

Kamil Bregula commented on AIRFLOW-3333:


Hello

Do you plan to continue working on this change? I am organising tickets in JIRA 
related to GCP and I would like to know the current status of this change.

Regards,

> New features enable transferring of files or data from GCS to a SFTP remote 
> path and SFTP to GCS path. 
> ---
>
> Key: AIRFLOW-3333
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3333
> Project: Apache Airflow
>  Issue Type: New Feature
>  Components: contrib, gcp
>Reporter: Pulin Pathneja
>Assignee: Pulin Pathneja
>Priority: Major
>
> New features enable transferring of files or data from GCS(Google Cloud 
> Storage) to a SFTP remote path and SFTP to GCS(Google Cloud Storage) path. 
>   





[GitHub] [airflow] mik-laj commented on issue #5333: [AIRFLOW-4583] Fix writing GCP keyfile to tempfile in python3

2019-06-12 Thread GitBox
mik-laj commented on issue #5333: [AIRFLOW-4583] Fix writing GCP keyfile to 
tempfile in python3
URL: https://github.com/apache/airflow/pull/5333#issuecomment-501179770
 
 
   Hello
   
   Do you plan to continue working on this change? I am organising tickets in 
JIRA related to GCP  and I would like to know the current status of this change.
   
   Regards,
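
For context on this class of bug (the PR diff itself is not shown here, so 
the function below is an illustrative sketch, not the actual hook code): 
`tempfile.NamedTemporaryFile` opens in binary mode by default, so writing a 
JSON keyfile string raises `TypeError` under Python 3 unless text mode is 
requested.

{code:python}
import json
import tempfile


def write_keyfile(keyfile_dict):
    """Persist a credentials dict to a temp file. mode='w' requests
    text mode; the default 'w+b' would raise TypeError when a str is
    written under Python 3."""
    f = tempfile.NamedTemporaryFile(mode="w", suffix=".json", delete=False)
    json.dump(keyfile_dict, f)
    f.flush()
    return f.name


path = write_keyfile(
    {"type": "service_account", "project_id": "example-project"}
)
{code}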




[jira] [Commented] (AIRFLOW-4428) Externally triggered DagRun marked as 'success' state but all tasks for it are 'None' state

2019-06-12 Thread Leonardo (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-4428?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16861894#comment-16861894
 ] 

Leonardo commented on AIRFLOW-4428:
---

[~Chinta] I didn't find any workaround; for the moment I just downgraded to 
1.10.2.

> Externally triggered DagRun marked as 'success' state but all tasks for it 
> are 'None' state
> ---
>
> Key: AIRFLOW-4428
> URL: https://issues.apache.org/jira/browse/AIRFLOW-4428
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: DAG, DagRun, scheduler
>Affects Versions: 1.10.3
> Environment: LocalExecutor, 16cpu amazon linux, airflow 1.10.3, mysql 
> RDS meta
>Reporter: t oo
>Priority: Critical
> Fix For: 1.10.4
>
> Attachments: Screenshot from 2019-05-10 10-42-53.png, graph.png
>
>
> Dagrun status shows success for a given execdate, but:
> *  all tasks are white in the graph/tree view
> *  on the CLI, running airflow task_state for any of the tasks shown in the 
> graph/tree view returns None
> * the taskinstance view shows no tasks for the given dagid
> This dag has 800 tasks, non_pooled_task_slot_count is 2000, max_tis_per_query 
> is 2000, and no pools are created/assigned to the dag.
> This is an externally triggered dag; all my dags are purely externally 
> triggered.
> For execdates 20150101, 20160101, and 20170101 I get this issue, but 20180101 
> works. Maybe it has something to do with this line in my dag mentioning 2017 
> ---> default_args = {'owner': 'airflow','start_date': dt.datetime(2017, 6, 
> 1),'retries': 0,'retry_delay': dt.timedelta(minutes=5),}
> As per the comments below, this issue was introduced in the 1.10.3 release; I 
> suspect the JIRA below caused it:
> * https://github.com/apache/airflow/pull/4808/files  
> (https://issues.apache.org/jira/browse/AIRFLOW-3982)





[jira] [Commented] (AIRFLOW-3722) Improve BigQueryHook test coverage

2019-06-12 Thread Kamil Bregula (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-3722?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16861892#comment-16861892
 ] 

Kamil Bregula commented on AIRFLOW-3722:


Hello

Do you plan to continue working on this change? I am organising tickets in JIRA 
related to GCP  and I would like to know the current status of this change.

Regards,

> Improve BigQueryHook test coverage
> --
>
> Key: AIRFLOW-3722
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3722
> Project: Apache Airflow
>  Issue Type: Test
>  Components: gcp
>Reporter: Felix Uellendall
>Assignee: Felix Uellendall
>Priority: Major
>
> There are currently many lines not being tested.
>  
> This Ticket will improve the overall test coverage of this Hook.
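
As a generic illustration of how hook test coverage is usually raised (the 
hook class and method names below are hypothetical stand-ins, not the real 
BigQueryHook API), one common pattern is to inject a mock service client and 
assert on the calls the hook makes:

{code:python}
from unittest import mock


class ExampleHook:
    """Stand-in for a hook whose service client is injected so tests
    can exercise both the success and the failure path."""

    def __init__(self, service):
        self._service = service

    def table_exists(self, dataset_id, table_id):
        try:
            self._service.get_table(dataset_id=dataset_id,
                                    table_id=table_id)
            return True
        except KeyError:
            return False


service = mock.Mock()
hook = ExampleHook(service)
exists = hook.table_exists("my_dataset", "my_table")

# Simulate a lookup failure on the next call.
service.get_table.side_effect = KeyError("missing")
missing = hook.table_exists("my_dataset", "gone")
{code}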




