[GitHub] ron819 commented on issue #1964: [AIRFLOW-722] Add Celery queue sensor

2018-11-13 Thread GitBox
ron819 commented on issue #1964: [AIRFLOW-722] Add Celery queue sensor
URL: 
https://github.com/apache/incubator-airflow/pull/1964#issuecomment-438566820
 
 
   @duffn are you still working on this?


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] ron819 removed a comment on issue #2450: [Airflow-1413] Fix FTPSensor failing on error message with unexpected text.

2018-11-13 Thread GitBox
ron819 removed a comment on issue #2450: [Airflow-1413] Fix FTPSensor failing 
on error message with unexpected text.
URL: 
https://github.com/apache/incubator-airflow/pull/2450#issuecomment-429600699
 
 
   @Fokko 


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Created] (AIRFLOW-3338) Task runner fails to dump log if stdout of subtask contains Chinese characters

2018-11-13 Thread Cheng Yichao (JIRA)
Cheng Yichao created AIRFLOW-3338:
-

 Summary: Task runner fails to dump log if stdout of subtask 
contains Chinese characters
 Key: AIRFLOW-3338
 URL: https://issues.apache.org/jira/browse/AIRFLOW-3338
 Project: Apache Airflow
  Issue Type: Bug
  Components: core
Affects Versions: 1.8.2, 1.8.1
Reporter: Cheng Yichao


Code:
{code:python}
# At /airflow/task_runner/base_task_runner.py
def _read_task_logs(self, stream):
    while True:
        line = stream.readline().decode('utf-8')
        if len(line) == 0:
            break
        self.logger.info('Subtask: {}'.format(line.rstrip('\n')))
{code}
Error message:
{code:java}
Traceback (most recent call last):
File "", line 1, in 
UnicodeEncodeError: 'ascii' codec can't encode characters in position 0-2: 
ordinal not in range(128){code}
Behavior:

When a subtask tries to print Chinese characters, the above exception is thrown.

The problem is that if the argument of 'format' is a Unicode string, the 
format string also needs to be a Unicode string.
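A minimal sketch of the fix, using a hypothetical standalone helper (the real code is a method on the task runner class): formatting the decoded line with a u''-prefixed template avoids the implicit ASCII encode that raises UnicodeEncodeError under Python 2; on Python 3 both strings are already Unicode.

```python
# -*- coding: utf-8 -*-

def format_subtask_line(raw_bytes):
    """Hypothetical helper mirroring the formatting step in _read_task_logs."""
    # Decode the subtask's stdout bytes to a Unicode string first.
    line = raw_bytes.decode('utf-8')
    # The u'' prefix makes the template Unicode too, so Python 2 does not
    # try to re-encode the Chinese characters to ASCII while formatting.
    return u'Subtask: {}'.format(line.rstrip(u'\n'))

# UTF-8 bytes for the Chinese word "中文"
print(format_subtask_line(b'\xe4\xb8\xad\xe6\x96\x87\n'))  # Subtask: 中文
```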



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Updated] (AIRFLOW-3338) Task runner fails to dump log if stdout of subtask contains Chinese characters

2018-11-13 Thread Cheng Yichao (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-3338?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Cheng Yichao updated AIRFLOW-3338:
--
Description: 
Code:
{code:python}
# At /airflow/task_runner/base_task_runner.py
def _read_task_logs(self, stream):
    while True:
        line = stream.readline().decode('utf-8')
        if len(line) == 0:
            break
        self.logger.info('Subtask: {}'.format(line.rstrip('\n')))
{code}
Error message:
{code:java}
Traceback (most recent call last):
File "", line 1, in 
UnicodeEncodeError: 'ascii' codec can't encode characters in position 0-2: 
ordinal not in range(128){code}
Behavior:

When a subtask tries to print Chinese characters, the above exception is thrown.

The problem is that if the argument of 'format' is a Unicode string, the 
format string also needs to be a Unicode string.

The problem is fixed after version 1.9.0.

  was:
Code:
{code:python}
# At /airflow/task_runner/base_task_runner.py
def _read_task_logs(self, stream):
    while True:
        line = stream.readline().decode('utf-8')
        if len(line) == 0:
            break
        self.logger.info('Subtask: {}'.format(line.rstrip('\n')))
{code}
Error message:
{code:java}
Traceback (most recent call last):
File "", line 1, in 
UnicodeEncodeError: 'ascii' codec can't encode characters in position 0-2: 
ordinal not in range(128){code}
Behavior:

When a subtask tries to print Chinese characters, the above exception is thrown.

The problem is that if the argument of 'format' is a Unicode string, the 
format string also needs to be a Unicode string.


> Task runner fails to dump log if stdout of subtask contains Chinese characters
> --
>
> Key: AIRFLOW-3338
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3338
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: core
>Affects Versions: 1.8.1, 1.8.2
>Reporter: Cheng Yichao
>Priority: Major
>
> Code:
> {code:python}
> # At /airflow/task_runner/base_task_runner.py
> def _read_task_logs(self, stream):
>     while True:
>         line = stream.readline().decode('utf-8')
>         if len(line) == 0:
>             break
>         self.logger.info('Subtask: {}'.format(line.rstrip('\n')))
> {code}
> Error message:
> {code:java}
> Traceback (most recent call last):
> File "", line 1, in 
> UnicodeEncodeError: 'ascii' codec can't encode characters in position 0-2: 
> ordinal not in range(128){code}
> Behavior:
> When a subtask tries to print Chinese characters, the above exception is 
> thrown.
> The problem is that if the argument of 'format' is a Unicode string, the 
> format string also needs to be a Unicode string.
> The problem is fixed after version 1.9.0.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (AIRFLOW-3337) "About" page version info is not available

2018-11-13 Thread jack (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-3337?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16686141#comment-16686141
 ] 

jack commented on AIRFLOW-3337:
---

I think this was fixed by 
[https://github.com/apache/incubator-airflow/pull/4072]. Upgrading to 1.10.1 
once it is released should solve the issue.

[~kaxilnaik] can you check it?

> "About" page version info is not available
> --
>
> Key: AIRFLOW-3337
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3337
> Project: Apache Airflow
>  Issue Type: Bug
>Affects Versions: 1.10.0
>Reporter: Dmytro Kulyk
>Priority: Minor
> Attachments: image-2018-11-14-01-00-58-743.png
>
>
> From the Airflow 1.10.0 UI, clicking "About" shows the version and git 
> version as "Not available".
> The version was upgraded from 1.9 via 
> {code}
> pip install apache-airflow==1.10.0
> {code}
>   !image-2018-11-14-01-00-58-743.png!



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] codecov-io edited a comment on issue #4156: [AIRFLOW-3314] Changed auto inlets feature to work as described

2018-11-13 Thread GitBox
codecov-io edited a comment on issue #4156: [AIRFLOW-3314] Changed auto inlets 
feature to work as described
URL: 
https://github.com/apache/incubator-airflow/pull/4156#issuecomment-436867007
 
 
   # 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4156?src=pr&el=h1)
 Report
   > Merging 
[#4156](https://codecov.io/gh/apache/incubator-airflow/pull/4156?src=pr&el=desc)
 into 
[master](https://codecov.io/gh/apache/incubator-airflow/commit/e6291e8d50701b80d37850a63c555a42a6134775?src=pr&el=desc)
 will **increase** coverage by `0.05%`.
   > The diff coverage is `100%`.
   
   [![Impacted file tree 
graph](https://codecov.io/gh/apache/incubator-airflow/pull/4156/graphs/tree.svg?width=650&token=WdLKlKHOAU&height=150&src=pr)](https://codecov.io/gh/apache/incubator-airflow/pull/4156?src=pr&el=tree)
   
   ```diff
   @@            Coverage Diff             @@
   ##           master    #4156      +/-   ##
   ==========================================
   + Coverage   77.66%   77.71%   +0.05%
   ==========================================
     Files         199      199
     Lines       16290    16296       +6
   ==========================================
   + Hits        12652    12665      +13
   + Misses       3638     3631       -7
   ```
   
   
   | [Impacted 
Files](https://codecov.io/gh/apache/incubator-airflow/pull/4156?src=pr&el=tree) 
| Coverage Δ | |
   |---|---|---|
   | 
[airflow/lineage/\_\_init\_\_.py](https://codecov.io/gh/apache/incubator-airflow/pull/4156/diff?src=pr&el=tree#diff-YWlyZmxvdy9saW5lYWdlL19faW5pdF9fLnB5)
 | `96.92% <100%> (+0.31%)` | :arrow_up: |
   | 
[airflow/models.py](https://codecov.io/gh/apache/incubator-airflow/pull/4156/diff?src=pr&el=tree#diff-YWlyZmxvdy9tb2RlbHMucHk=)
 | `92.32% <0%> (+0.08%)` | :arrow_up: |
   | 
[airflow/jobs.py](https://codecov.io/gh/apache/incubator-airflow/pull/4156/diff?src=pr&el=tree#diff-YWlyZmxvdy9qb2JzLnB5)
 | `77.36% <0%> (+0.27%)` | :arrow_up: |
   | 
[airflow/lineage/datasets.py](https://codecov.io/gh/apache/incubator-airflow/pull/4156/diff?src=pr&el=tree#diff-YWlyZmxvdy9saW5lYWdlL2RhdGFzZXRzLnB5)
 | `87.32% <0%> (+2.81%)` | :arrow_up: |
   
   --
   
   [Continue to review full report at 
Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4156?src=pr&el=continue).
   > **Legend** - [Click here to learn 
more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute  (impact)`, `ø = not affected`, `? = missing data`
   > Powered by 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4156?src=pr&el=footer).
 Last update 
[e6291e8...78cabb0](https://codecov.io/gh/apache/incubator-airflow/pull/4156?src=pr&el=lastupdated).
 Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services




[GitHub] gerardo commented on issue #3815: [AIRFLOW-2973] Add Python 3.6 to Supported Prog Langs

2018-11-13 Thread GitBox
gerardo commented on issue #3815: [AIRFLOW-2973] Add Python 3.6 to Supported 
Prog Langs
URL: 
https://github.com/apache/incubator-airflow/pull/3815#issuecomment-438530066
 
 
   @Fokko @kaxil @tedmiston 
   I've done some more work on 
https://github.com/apache/incubator-airflow-ci/pull/4, but I've found a few 
issues with the Python3.6 upgrade, plus other issues with some assumptions that 
the test suite makes:
   
   - 
[`tests.operators.test_virtualenv_operator.test_python_3`](https://github.com/apache/incubator-airflow/blob/6a7f3887460c91bc4da90fe8d31836a225324a1a/tests/operators/test_virtualenv_operator.py#L124)
 fails when run with the `python2` image 
([logs](https://travis-ci.com/gerardo/incubator-airflow/jobs/157826745#L4443)). 
Using `skipIf` to check which Python version is present should be enough. And 
maybe we should do the same with [test_python_2 and 
test_python_2.7](https://github.com/apache/incubator-airflow/blob/6a7f3887460c91bc4da90fe8d31836a225324a1a/tests/operators/test_virtualenv_operator.py#L124)
   - 
[tests.operators.test_virtualenv_operator.test_string_args](https://github.com/apache/incubator-airflow/blob/6a7f3887460c91bc4da90fe8d31836a225324a1a/tests/operators/test_virtualenv_operator.py#L170)
 is a weird test. It uses a method called `_invert_python_major_version()` to 
use the opposite version to the one available, which is causing problems with 
[python2 
tests](https://travis-ci.com/gerardo/incubator-airflow/jobs/157826745#L4503). 
Do you guys understand this test?
   - There was a change between `python3.5` and `python3.6` in the error 
message emitted when an object is not serializable 
([logs](https://travis-ci.com/gerardo/incubator-airflow/jobs/157826750#L4568)). 
It's just a matter of changing the test to check for the new message format.
   - One of the tz tests is failing as well, more specifically, 
`test_following_previous_schedule_daily_dag_CEST_to_CET` 
([logs](https://travis-ci.com/gerardo/incubator-airflow/jobs/157826750#L4367)). 
I'm not sure where to start. It seems to happen only with `python3.6`.
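A sketch of the `skipIf` approach suggested for the first point, assuming that gating on the running interpreter's version is sufficient (class and test names here are illustrative, not the actual test suite's):

```python
import sys
import unittest

PY2 = sys.version_info[0] == 2

class TestVirtualenvPythonVersions(unittest.TestCase):
    # Skip the python3 test when the image only ships a Python 2 interpreter.
    @unittest.skipIf(PY2, "requires a Python 3 interpreter in the image")
    def test_python_3(self):
        self.assertGreaterEqual(sys.version_info[0], 3)

    # And symmetrically skip the python2-only test on Python 3 images.
    @unittest.skipIf(not PY2, "requires a Python 2 interpreter in the image")
    def test_python_2(self):
        self.assertEqual(sys.version_info[0], 2)
```

On either image exactly one of the two tests runs and the other is reported as skipped, instead of failing because the opposite interpreter is missing.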
   
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] codecov-io edited a comment on issue #4121: [AIRFLOW-2568] Azure Container Instances operator

2018-11-13 Thread GitBox
codecov-io edited a comment on issue #4121: [AIRFLOW-2568] Azure Container 
Instances operator
URL: 
https://github.com/apache/incubator-airflow/pull/4121#issuecomment-436818600
 
 
   # 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4121?src=pr&el=h1)
 Report
   > Merging 
[#4121](https://codecov.io/gh/apache/incubator-airflow/pull/4121?src=pr&el=desc)
 into 
[master](https://codecov.io/gh/apache/incubator-airflow/commit/e6291e8d50701b80d37850a63c555a42a6134775?src=pr&el=desc)
 will **decrease** coverage by `<.01%`.
   > The diff coverage is `50%`.
   
   [![Impacted file tree 
graph](https://codecov.io/gh/apache/incubator-airflow/pull/4121/graphs/tree.svg?width=650&token=WdLKlKHOAU&height=150&src=pr)](https://codecov.io/gh/apache/incubator-airflow/pull/4121?src=pr&el=tree)
   
   ```diff
   @@            Coverage Diff             @@
   ##           master    #4121      +/-   ##
   ==========================================
   - Coverage   77.66%   77.66%   -0.01%
   ==========================================
     Files         199      199
     Lines       16290    16294       +4
   ==========================================
   + Hits        12652    12655       +3
   - Misses       3638     3639       +1
   ```
   
   
   | [Impacted 
Files](https://codecov.io/gh/apache/incubator-airflow/pull/4121?src=pr&el=tree) 
| Coverage Δ | |
   |---|---|---|
   | 
[airflow/utils/db.py](https://codecov.io/gh/apache/incubator-airflow/pull/4121/diff?src=pr&el=tree#diff-YWlyZmxvdy91dGlscy9kYi5weQ==)
 | `33.33% <0%> (-0.27%)` | :arrow_down: |
   | 
[airflow/models.py](https://codecov.io/gh/apache/incubator-airflow/pull/4121/diff?src=pr&el=tree#diff-YWlyZmxvdy9tb2RlbHMucHk=)
 | `92.24% <66.66%> (ø)` | :arrow_up: |
   
   --
   
   [Continue to review full report at 
Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4121?src=pr&el=continue).
   > **Legend** - [Click here to learn 
more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute  (impact)`, `ø = not affected`, `? = missing data`
   > Powered by 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4121?src=pr&el=footer).
 Last update 
[e6291e8...c65dc8b](https://codecov.io/gh/apache/incubator-airflow/pull/4121?src=pr&el=lastupdated).
 Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services




[GitHub] codecov-io edited a comment on issue #4121: [AIRFLOW-2568] Azure Container Instances operator

2018-11-13 Thread GitBox
codecov-io edited a comment on issue #4121: [AIRFLOW-2568] Azure Container 
Instances operator
URL: 
https://github.com/apache/incubator-airflow/pull/4121#issuecomment-436818600
 
 
   # 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4121?src=pr&el=h1)
 Report
   > Merging 
[#4121](https://codecov.io/gh/apache/incubator-airflow/pull/4121?src=pr&el=desc)
 into 
[master](https://codecov.io/gh/apache/incubator-airflow/commit/e6291e8d50701b80d37850a63c555a42a6134775?src=pr&el=desc)
 will **decrease** coverage by `<.01%`.
   > The diff coverage is `75%`.
   
   [![Impacted file tree 
graph](https://codecov.io/gh/apache/incubator-airflow/pull/4121/graphs/tree.svg?width=650&token=WdLKlKHOAU&height=150&src=pr)](https://codecov.io/gh/apache/incubator-airflow/pull/4121?src=pr&el=tree)
   
   ```diff
   @@            Coverage Diff             @@
   ##           master    #4121      +/-   ##
   ==========================================
   - Coverage   77.66%   77.66%   -0.01%
   ==========================================
     Files         199      199
     Lines       16290    16294       +4
   ==========================================
   + Hits        12652    12654       +2
   - Misses       3638     3640       +2
   ```
   
   
   | [Impacted 
Files](https://codecov.io/gh/apache/incubator-airflow/pull/4121?src=pr&el=tree) 
| Coverage Δ | |
   |---|---|---|
   | 
[airflow/utils/db.py](https://codecov.io/gh/apache/incubator-airflow/pull/4121/diff?src=pr&el=tree#diff-YWlyZmxvdy91dGlscy9kYi5weQ==)
 | `33.33% <0%> (-0.27%)` | :arrow_down: |
   | 
[airflow/models.py](https://codecov.io/gh/apache/incubator-airflow/pull/4121/diff?src=pr&el=tree#diff-YWlyZmxvdy9tb2RlbHMucHk=)
 | `92.24% <100%> (ø)` | :arrow_up: |
   | 
[airflow/configuration.py](https://codecov.io/gh/apache/incubator-airflow/pull/4121/diff?src=pr&el=tree#diff-YWlyZmxvdy9jb25maWd1cmF0aW9uLnB5)
 | `88.68% <0%> (-0.37%)` | :arrow_down: |
   
   --
   
   [Continue to review full report at 
Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4121?src=pr&el=continue).
   > **Legend** - [Click here to learn 
more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute  (impact)`, `ø = not affected`, `? = missing data`
   > Powered by 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4121?src=pr&el=footer).
 Last update 
[e6291e8...c65dc8b](https://codecov.io/gh/apache/incubator-airflow/pull/4121?src=pr&el=lastupdated).
 Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] omusavi commented on issue #4121: [AIRFLOW-2568] Azure Container Instances operator

2018-11-13 Thread GitBox
omusavi commented on issue #4121: [AIRFLOW-2568] Azure Container Instances 
operator
URL: 
https://github.com/apache/incubator-airflow/pull/4121#issuecomment-438516567
 
 
   Removed extra line and rebased onto master so it is all up to date. Thanks 
@Fokko 


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] codecov-io edited a comment on issue #4179: [AIRFLOW-3332] Add insert_all to allow inserting rows into BigQuery table

2018-11-13 Thread GitBox
codecov-io edited a comment on issue #4179: [AIRFLOW-3332] Add insert_all to 
allow inserting rows into BigQuery table
URL: 
https://github.com/apache/incubator-airflow/pull/4179#issuecomment-438116748
 
 
   # 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4179?src=pr&el=h1)
 Report
   > Merging 
[#4179](https://codecov.io/gh/apache/incubator-airflow/pull/4179?src=pr&el=desc)
 into 
[master](https://codecov.io/gh/apache/incubator-airflow/commit/7ee30b6fac022dd56641a200710b6fc60eb5021d?src=pr&el=desc)
 will **increase** coverage by `1.27%`.
   > The diff coverage is `n/a`.
   
   [![Impacted file tree 
graph](https://codecov.io/gh/apache/incubator-airflow/pull/4179/graphs/tree.svg?width=650&token=WdLKlKHOAU&height=150&src=pr)](https://codecov.io/gh/apache/incubator-airflow/pull/4179?src=pr&el=tree)
   
   ```diff
   @@            Coverage Diff             @@
   ##           master    #4179      +/-   ##
   ==========================================
   + Coverage   76.39%   77.67%   +1.27%
   ==========================================
     Files         199      199
     Lines       16274    16290      +16
   ==========================================
   + Hits        12433    12653     +220
   + Misses       3841     3637     -204
   ```
   
   
   | [Impacted 
Files](https://codecov.io/gh/apache/incubator-airflow/pull/4179?src=pr&el=tree) 
| Coverage Δ | |
   |---|---|---|
   | 
[airflow/bin/cli.py](https://codecov.io/gh/apache/incubator-airflow/pull/4179/diff?src=pr&el=tree#diff-YWlyZmxvdy9iaW4vY2xpLnB5)
 | `64.59% <0%> (ø)` | :arrow_up: |
   | 
[airflow/models.py](https://codecov.io/gh/apache/incubator-airflow/pull/4179/diff?src=pr&el=tree#diff-YWlyZmxvdy9tb2RlbHMucHk=)
 | `92.27% <0%> (+0.12%)` | :arrow_up: |
   | 
[airflow/www\_rbac/views.py](https://codecov.io/gh/apache/incubator-airflow/pull/4179/diff?src=pr&el=tree#diff-YWlyZmxvdy93d3dfcmJhYy92aWV3cy5weQ==)
 | `72.32% <0%> (+0.14%)` | :arrow_up: |
   | 
[airflow/www/views.py](https://codecov.io/gh/apache/incubator-airflow/pull/4179/diff?src=pr&el=tree#diff-YWlyZmxvdy93d3cvdmlld3MucHk=)
 | `69.35% <0%> (+0.18%)` | :arrow_up: |
   | 
[airflow/jobs.py](https://codecov.io/gh/apache/incubator-airflow/pull/4179/diff?src=pr&el=tree#diff-YWlyZmxvdy9qb2JzLnB5)
 | `77.09% <0%> (+0.36%)` | :arrow_up: |
   | 
[airflow/www\_rbac/app.py](https://codecov.io/gh/apache/incubator-airflow/pull/4179/diff?src=pr&el=tree#diff-YWlyZmxvdy93d3dfcmJhYy9hcHAucHk=)
 | `97.08% <0%> (+0.97%)` | :arrow_up: |
   | 
[airflow/operators/docker\_operator.py](https://codecov.io/gh/apache/incubator-airflow/pull/4179/diff?src=pr&el=tree#diff-YWlyZmxvdy9vcGVyYXRvcnMvZG9ja2VyX29wZXJhdG9yLnB5)
 | `97.67% <0%> (+1.16%)` | :arrow_up: |
   | 
[airflow/task/task\_runner/base\_task\_runner.py](https://codecov.io/gh/apache/incubator-airflow/pull/4179/diff?src=pr&el=tree#diff-YWlyZmxvdy90YXNrL3Rhc2tfcnVubmVyL2Jhc2VfdGFza19ydW5uZXIucHk=)
 | `79.31% <0%> (+1.72%)` | :arrow_up: |
   | 
[airflow/hooks/hive\_hooks.py](https://codecov.io/gh/apache/incubator-airflow/pull/4179/diff?src=pr&el=tree#diff-YWlyZmxvdy9ob29rcy9oaXZlX2hvb2tzLnB5)
 | `75.26% <0%> (+1.84%)` | :arrow_up: |
   | 
[airflow/hooks/postgres\_hook.py](https://codecov.io/gh/apache/incubator-airflow/pull/4179/diff?src=pr&el=tree#diff-YWlyZmxvdy9ob29rcy9wb3N0Z3Jlc19ob29rLnB5)
 | `94.44% <0%> (+2.77%)` | :arrow_up: |
   | ... and [9 
more](https://codecov.io/gh/apache/incubator-airflow/pull/4179/diff?src=pr&el=tree-more)
 | |
   
   --
   
   [Continue to review full report at 
Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4179?src=pr&el=continue).
   > **Legend** - [Click here to learn 
more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute  (impact)`, `ø = not affected`, `? = missing data`
   > Powered by 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4179?src=pr&el=footer).
 Last update 
[7ee30b6...8a204e8](https://codecov.io/gh/apache/incubator-airflow/pull/4179?src=pr&el=lastupdated).
 Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] codecov-io commented on issue #4183: Fix typo in plugin docs.

2018-11-13 Thread GitBox
codecov-io commented on issue #4183: Fix typo in plugin docs.
URL: 
https://github.com/apache/incubator-airflow/pull/4183#issuecomment-438499370
 
 
   # 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4183?src=pr&el=h1)
 Report
   > Merging 
[#4183](https://codecov.io/gh/apache/incubator-airflow/pull/4183?src=pr&el=desc)
 into 
[master](https://codecov.io/gh/apache/incubator-airflow/commit/e6291e8d50701b80d37850a63c555a42a6134775?src=pr&el=desc)
 will **increase** coverage by `<.01%`.
   > The diff coverage is `n/a`.
   
   [![Impacted file tree 
graph](https://codecov.io/gh/apache/incubator-airflow/pull/4183/graphs/tree.svg?width=650&token=WdLKlKHOAU&height=150&src=pr)](https://codecov.io/gh/apache/incubator-airflow/pull/4183?src=pr&el=tree)
   
   ```diff
   @@            Coverage Diff             @@
   ##           master    #4183      +/-   ##
   ==========================================
   + Coverage   77.66%   77.67%   +<.01%
   ==========================================
     Files         199      199
     Lines       16290    16290
   ==========================================
   + Hits        12652    12653       +1
   + Misses       3638     3637       -1
   ```
   
   
   | [Impacted 
Files](https://codecov.io/gh/apache/incubator-airflow/pull/4183?src=pr&el=tree) 
| Coverage Δ | |
   |---|---|---|
   | 
[airflow/models.py](https://codecov.io/gh/apache/incubator-airflow/pull/4183/diff?src=pr&el=tree#diff-YWlyZmxvdy9tb2RlbHMucHk=)
 | `92.27% <0%> (+0.04%)` | :arrow_up: |
   
   --
   
   [Continue to review full report at 
Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4183?src=pr&el=continue).
   > **Legend** - [Click here to learn 
more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data`
   > Powered by 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4183?src=pr&el=footer).
 Last update 
[e6291e8...26c6a7b](https://codecov.io/gh/apache/incubator-airflow/pull/4183?src=pr&el=lastupdated).
 Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   




[GitHub] jmcarp opened a new pull request #4183: Fix typo in plugin docs.

2018-11-13 Thread GitBox
jmcarp opened a new pull request #4183: Fix typo in plugin docs.
URL: https://github.com/apache/incubator-airflow/pull/4183
 
 
   Just a one-character typo fix, shouldn't require a ticket!




[jira] [Commented] (AIRFLOW-1370) Scheduler is crashing because of IntegrityError

2018-11-13 Thread Abhishek Sinha (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-1370?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16685916#comment-16685916
 ] 

Abhishek Sinha commented on AIRFLOW-1370:
-

I am using Airflow 1.8.2 with Celery and Postgres

> Scheduler is crashing because of IntegrityError
> ---
>
> Key: AIRFLOW-1370
> URL: https://issues.apache.org/jira/browse/AIRFLOW-1370
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: celery, scheduler
>Affects Versions: 1.8.0
>Reporter: Maneesh Sharma
>Priority: Major
>
> Scheduler is crashing with multiple task running on Celery Executor. It is 
> throwing `{color:red}IntegrityError: (psycopg2.IntegrityError) duplicate key 
> value violates unique constraint "task_instance_pkey"{color}`. Below is the 
> complete stack trace of error --
> Process DagFileProcessor490-Process:
> Traceback (most recent call last):
>   File "/usr/lib/python2.7/multiprocessing/process.py", line 258, in 
> _bootstrap
> self.run()
>   File "/usr/lib/python2.7/multiprocessing/process.py", line 114, in run
> self._target(*self._args, **self._kwargs)
>   File "/home/ubuntu/.local/lib/python2.7/site-packages/airflow/jobs.py", 
> line 348, in helper
> pickle_dags)
>   File "/home/ubuntu/.local/lib/python2.7/site-packages/airflow/utils/db.py", 
> line 53, in wrapper
> result = func(*args, **kwargs)
>   File "/home/ubuntu/.local/lib/python2.7/site-packages/airflow/jobs.py", 
> line 1587, in process_file
> self._process_dags(dagbag, dags, ti_keys_to_schedule)
>   File "/home/ubuntu/.local/lib/python2.7/site-packages/airflow/jobs.py", 
> line 1176, in _process_dags
> self._process_task_instances(dag, tis_out)
>   File "/home/ubuntu/.local/lib/python2.7/site-packages/airflow/jobs.py", 
> line 880, in _process_task_instances
> run.verify_integrity(session=session)
>   File "/home/ubuntu/.local/lib/python2.7/site-packages/airflow/utils/db.py", 
> line 53, in wrapper
> result = func(*args, **kwargs)
>   File "/home/ubuntu/.local/lib/python2.7/site-packages/airflow/models.py", 
> line 4117, in verify_integrity
> session.commit()
>   File 
> "/home/ubuntu/.local/lib/python2.7/site-packages/sqlalchemy/orm/session.py", 
> line 906, in commit
> self.transaction.commit()
>   File 
> "/home/ubuntu/.local/lib/python2.7/site-packages/sqlalchemy/orm/session.py", 
> line 461, in commit
> self._prepare_impl()
>   File 
> "/home/ubuntu/.local/lib/python2.7/site-packages/sqlalchemy/orm/session.py", 
> line 441, in _prepare_impl
> self.session.flush()
>   File 
> "/home/ubuntu/.local/lib/python2.7/site-packages/sqlalchemy/orm/session.py", 
> line 2171, in flush
> self._flush(objects)
>   File 
> "/home/ubuntu/.local/lib/python2.7/site-packages/sqlalchemy/orm/session.py", 
> line 2291, in _flush
> transaction.rollback(_capture_exception=True)
>   File 
> "/home/ubuntu/.local/lib/python2.7/site-packages/sqlalchemy/util/langhelpers.py",
>  line 66, in __exit__
> compat.reraise(exc_type, exc_value, exc_tb)
>   File 
> "/home/ubuntu/.local/lib/python2.7/site-packages/sqlalchemy/orm/session.py", 
> line 2255, in _flush
> flush_context.execute()
>   File 
> "/home/ubuntu/.local/lib/python2.7/site-packages/sqlalchemy/orm/unitofwork.py",
>  line 389, in execute
> rec.execute(self)
>   File 
> "/home/ubuntu/.local/lib/python2.7/site-packages/sqlalchemy/orm/unitofwork.py",
>  line 548, in execute
> uow
>   File 
> "/home/ubuntu/.local/lib/python2.7/site-packages/sqlalchemy/orm/persistence.py",
>  line 181, in save_obj
> mapper, table, insert)
>   File 
> "/home/ubuntu/.local/lib/python2.7/site-packages/sqlalchemy/orm/persistence.py",
>  line 799, in _emit_insert_statements
> execute(statement, multiparams)
>   File 
> "/home/ubuntu/.local/lib/python2.7/site-packages/sqlalchemy/engine/base.py", 
> line 945, in execute
> return meth(self, multiparams, params)
>   File 
> "/home/ubuntu/.local/lib/python2.7/site-packages/sqlalchemy/sql/elements.py", 
> line 263, in _execute_on_connection
> return connection._execute_clauseelement(self, multiparams, params)
>   File 
> "/home/ubuntu/.local/lib/python2.7/site-packages/sqlalchemy/engine/base.py", 
> line 1053, in _execute_clauseelement
> compiled_sql, distilled_params
>   File 
> "/home/ubuntu/.local/lib/python2.7/site-packages/sqlalchemy/engine/base.py", 
> line 1189, in _execute_context
> context)
>   File 
> "/home/ubuntu/.local/lib/python2.7/site-packages/sqlalchemy/engine/base.py", 
> line 1402, in _handle_dbapi_exception
> exc_info
>   File 
> "/home/ubuntu/.local/lib/python2.7/site-packages/sqlalchemy/util/compat.py", 
> line 203, in raise_from_cause
> reraise(type(exception), exception, tb=exc_tb, cause=cause)
>   File 
> "/home/ubuntu/.local/lib/python2.7/site-pac

[jira] [Commented] (AIRFLOW-1039) Airflow is raising IntegrityError when during parallel DAG trigger

2018-11-13 Thread Abhishek Sinha (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-1039?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16685895#comment-16685895
 ] 

Abhishek Sinha commented on AIRFLOW-1039:
-

IntegrityError: (psycopg2.IntegrityError) duplicate key value violates unique 
constraint "task_instance_pkey"
DETAIL: Key (task_id, dag_id, execution_date)=(SI_PS3H, cdc, 2018-11-07 
01:41:24) already exists.
 [SQL: 'INSERT INTO task_instance (task_id, dag_id, execution_date, start_date, 
end_date, duration, state, try_number, hostname, unixname, job_id, pool, queue, 
priority_weight, operator, queued_dttm, pid) VALUES (%(task_id)s, %(dag_id)s, 
%(execution_date)s, %(start_date)s, %(end_date)s, %(duration)s, %(state)s, 
%(try_number)s, %(hostname)s, %(unixname)s, %(job_id)s, %(pool)s, %(queue)s, 
%(priority_weight)s, %(operator)s, %(queued_dttm)s, %(pid)s)'] [parameters: 
{'execution_date': datetime.datetime(2018, 11, 7, 1, 41, 24), 'end_date': 
None, 'job_id': None, 'task_id': 'SI_PS3H', 'pid': None, 'hostname': u'', 
'queued_dttm': None, 'try_number': 0, 'queue': 'default', 'operator': None, 
'state': None, 'dag_id': 'cdc', 'duration': None, 'priority_weight': 1, 
'start_date': None, 'pool': None, 'unixname': 'infoworksuser'}]
Process DagFileProcessor11274356-Process:
Traceback (most recent call last):
 File 
"/home/ec2-user/resources/python27/lib/python2.7/multiprocessing/process.py", 
line 258, in _bootstrap
 self.run()
 File 
"/home/ec2-user/resources/python27/lib/python2.7/multiprocessing/process.py", 
line 114, in run
 self._target(*self._args, **self._kwargs)
 File 
"/home/ec2-user/resources/python27/lib/python2.7/site-packages/airflow/jobs.py",
 line 347, in helper
 pickle_dags)
 File 
"/home/ec2-user/resources/python27/lib/python2.7/site-packages/airflow/utils/db.py",
 line 53, in wrapper
 result = func(*args, **kwargs)
 File 
"/home/ec2-user/resources/python27/lib/python2.7/site-packages/airflow/jobs.py",
 line 1584, in process_file
 self._process_dags(dagbag, dags, ti_keys_to_schedule)
 File 
"/home/ec2-user/resources/python27/lib/python2.7/site-packages/airflow/jobs.py",
 line 1176, in _process_dags
 self._process_task_instances(dag, tis_out)
 File 
"/home/ec2-user/resources/python27/lib/python2.7/site-packages/airflow/jobs.py",
 line 879, in _process_task_instances
 run.verify_integrity(session=session)
 File 
"/home/ec2-user/resources/python27/lib/python2.7/site-packages/airflow/utils/db.py",
 line 53, in wrapper
 result = func(*args, **kwargs)
 File 
"/home/ec2-user/resources/python27/lib/python2.7/site-packages/airflow/models.py",
 line 4298, in verify_integrity
 session.commit()
 File 
"/home/ec2-user/resources/python27/lib/python2.7/site-packages/sqlalchemy/orm/session.py",
 line 874, in commit
 self.transaction.commit()
 File 
"/home/ec2-user/resources/python27/lib/python2.7/site-packages/sqlalchemy/orm/session.py",
 line 461, in commit
 self._prepare_impl()
 File 
"/home/ec2-user/resources/python27/lib/python2.7/site-packages/sqlalchemy/orm/session.py",
 line 441, in _prepare_impl
 self.session.flush()
 File 
"/home/ec2-user/resources/python27/lib/python2.7/site-packages/sqlalchemy/orm/session.py",
 line 2137, in flush
 self._flush(objects)
 File 
"/home/ec2-user/resources/python27/lib/python2.7/site-packages/sqlalchemy/orm/session.py",
 line 2257, in _flush
 transaction.rollback(_capture_exception=True)
 File 
"/home/ec2-user/resources/python27/lib/python2.7/site-packages/sqlalchemy/util/langhelpers.py",
 line 60, in __exit__
 compat.reraise(exc_type, exc_value, exc_tb)
 File 
"/home/ec2-user/resources/python27/lib/python2.7/site-packages/sqlalchemy/orm/session.py",
 line 2221, in _flush
 flush_context.execute()
 File 
"/home/ec2-user/resources/python27/lib/python2.7/site-packages/sqlalchemy/orm/unitofwork.py",
 line 389, in execute
 rec.execute(self)
 File 
"/home/ec2-user/resources/python27/lib/python2.7/site-packages/sqlalchemy/orm/unitofwork.py",
 line 548, in execute
 uow
 File 
"/home/ec2-user/resources/python27/lib/python2.7/site-packages/sqlalchemy/orm/persistence.py",
 line 181, in save_obj
 mapper, table, insert)
 File 
"/home/ec2-user/resources/python27/lib/python2.7/site-packages/sqlalchemy/orm/persistence.py",
 line 799, in _emit_insert_statements
 execute(statement, multiparams)
 File 
"/home/ec2-user/resources/python27/lib/python2.7/site-packages/sqlalchemy/engine/base.py",
 line 945, in execute
 return meth(self, multiparams, params)
 File 
"/home/ec2-user/resources/python27/lib/python2.7/site-packages/sqlalchemy/sql/elements.py",
 line 263, in _execute_on_connection
 return connection._execute_clauseelement(self, multiparams, params)
 File 
"/home/ec2-user/resources/python27/lib/python2.7/site-packages/sqlalchemy/engine/base.py",
 line 1053, in _execute_clauseelement
 compiled_sql, distilled_params
 File 
"/home/ec2-user/resources/python27/lib/python2.7/site-packages/sqlalchemy/engine/base.py",
 line 1189, i

[jira] [Commented] (AIRFLOW-1039) Airflow is raising IntegrityError when during parallel DAG trigger

2018-11-13 Thread Abhishek Sinha (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-1039?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16685893#comment-16685893
 ] 

Abhishek Sinha commented on AIRFLOW-1039:
-

Hi,

 

I am using Airflow version 1.8.2 with Celery and Postgres.

 

I am facing a similar issue; the stack trace is attached. I am not sure if this 
has been fixed in a later version. Can someone please confirm?

> Airflow is raising IntegrityError when during parallel DAG trigger
> --
>
> Key: AIRFLOW-1039
> URL: https://issues.apache.org/jira/browse/AIRFLOW-1039
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: DagRun
>Affects Versions: 1.8.0
>Reporter: Matus Valo
>Priority: Minor
>
> When Two concurrent processes are trying to trigger the same dag with the 
> same execution date at the same time, the IntegrityError is thrown by 
> SQLAlchemy:
> uwsgi[15887]: [2017-03-24 12:51:38,074] {app.py:1587} ERROR - Exception on / 
> [POST]
> uwsgi[15887]: Traceback (most recent call last):
> uwsgi[15887]: File 
> "/home/matus/envs/airflow/lib/python2.7/site-packages/flask/app.py", line 
> 1988, in wsgi_app
> uwsgi[15887]: response = self.full_dispatch_request()
> uwsgi[15887]: File 
> "/home/matus/envs/airflow/lib/python2.7/site-packages/flask/app.py", line 
> 1641, in full_dispatch_request
> uwsgi[15887]: rv = self.handle_user_exception(e)
> uwsgi[15887]: File 
> "/home/matus/envs/airflow/lib/python2.7/site-packages/flask/app.py", line 
> 1544, in handle_user_exception
> uwsgi[15887]: reraise(exc_type, exc_value, tb)
> uwsgi[15887]: File 
> "/home/matus/envs/airflow/lib/python2.7/site-packages/flask/app.py", line 
> 1639, in full_dispatch_request
> uwsgi[15887]: rv = self.dispatch_request()
> uwsgi[15887]: File 
> "/home/matus/envs/airflow/lib/python2.7/site-packages/flask/app.py", line 
> 1625, in dispatch_request
> uwsgi[15887]: return self.view_functions[rule.endpoint](**req.view_args)
> uwsgi[15887]: File "./ws.py", line 21, in hello
> uwsgi[15887]: trigger_dag('poc_dag2', run_id=str(uuid1()), 
> conf=json.dumps({'input_files': input_files}), execution_date=datetime.now())
> uwsgi[15887]: File 
> "/home/matus/envs/airflow/lib/python2.7/site-packages/airflow/api/common/experimental/trigger_dag.py",
>  line 56, in trigger_dag
> uwsgi[15887]: external_trigger=True
> uwsgi[15887]: File 
> "/home/matus/envs/airflow/lib/python2.7/site-packages/airflow/utils/db.py", 
> line 53, in wrapper
> uwsgi[15887]: result = func(*args, **kwargs)
> uwsgi[15887]: File 
> "/home/matus/envs/airflow/lib/python2.7/site-packages/airflow/models.py", 
> line 3377, in create_dagrun
> uwsgi[15887]: session.commit()
> uwsgi[15887]: File 
> "/home/matus/envs/airflow/lib/python2.7/site-packages/sqlalchemy/orm/session.py",
>  line 874, in commit
> uwsgi[15887]: self.transaction.commit()
> uwsgi[15887]: File 
> "/home/matus/envs/airflow/lib/python2.7/site-packages/sqlalchemy/orm/session.py",
>  line 461, in commit
> uwsgi[15887]: self._prepare_impl()
> uwsgi[15887]: File 
> "/home/matus/envs/airflow/lib/python2.7/site-packages/sqlalchemy/orm/session.py",
>  line 441, in _prepare_impl
> uwsgi[15887]: self.session.flush()
> uwsgi[15887]: File 
> "/home/matus/envs/airflow/lib/python2.7/site-packages/sqlalchemy/orm/session.py",
>  line 2139, in flush
> uwsgi[15887]: self._flush(objects)
> uwsgi[15887]: File 
> "/home/matus/envs/airflow/lib/python2.7/site-packages/sqlalchemy/orm/session.py",
>  line 2259, in _flush
> uwsgi[15887]: transaction.rollback(_capture_exception=True)
> uwsgi[15887]: File 
> "/home/matus/envs/airflow/lib/python2.7/site-packages/sqlalchemy/util/langhelpers.py",
>  line 60, in __exit__
> uwsgi[15887]: compat.reraise(exc_type, exc_value, exc_tb)
> uwsgi[15887]: File 
> "/home/matus/envs/airflow/lib/python2.7/site-packages/sqlalchemy/orm/session.py",
>  line 2223, in _flush
> uwsgi[15887]: flush_context.execute()
> uwsgi[15887]: File 
> "/home/matus/envs/airflow/lib/python2.7/site-packages/sqlalchemy/orm/unitofwork.py",
>  line 389, in execute
> uwsgi[15887]: rec.execute(self)
> uwsgi[15887]: File 
> "/home/matus/envs/airflow/lib/python2.7/site-packages/sqlalchemy/orm/unitofwork.py",
>  line 548, in execute
> uwsgi[15887]: uow
> uwsgi[15887]: File 
> "/home/matus/envs/airflow/lib/python2.7/site-packages/sqlalchemy/orm/persistence.py",
>  line 181, in save_obj
> uwsgi[15887]: mapper, table, insert)
> uwsgi[15887]: File 
> "/home/matus/envs/airflow/lib/python2.7/site-packages/sqlalchemy/orm/persistence.py",
>  line 835, in _emit_insert_statements
> uwsgi[15887]: execute(statement, params)
> uwsgi[15887]: File 
> "/home/matus/envs/airflow/lib/python2.7/site-packages/sqlalchemy/engine/base.py",
>  line 945, in execute
> uwsgi[15887]: return meth(self, multiparams, params)
> uwsgi[15887]: File 
> "/home/matus/e

[jira] [Commented] (AIRFLOW-1370) Scheduler is crashing because of IntegrityError

2018-11-13 Thread Abhishek Sinha (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-1370?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16685885#comment-16685885
 ] 

Abhishek Sinha commented on AIRFLOW-1370:
-

Any update on this?

 

I am facing exactly the same issue (duplicate key error) in one of our 
production environments. Can someone please provide help on this? It is critical 
for our project. 
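
The failure can be reproduced in miniature without Postgres or Airflow: two 
writers insert a row with the same (task_id, dag_id, execution_date) key, and 
the second hits the unique constraint. The sketch below is illustrative only — 
it uses sqlite3 in place of Postgres, mirrors the `task_instance` column names 
from the SQL above, and shows one possible mitigation (tolerating the 
duplicate instead of crashing); it is not Airflow's actual code path.

```python
# Standalone sketch (sqlite3 stands in for Postgres) of the race behind
# 'duplicate key value violates unique constraint "task_instance_pkey"':
# two schedulers insert the same (task_id, dag_id, execution_date) row.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE task_instance (
        task_id TEXT, dag_id TEXT, execution_date TEXT,
        PRIMARY KEY (task_id, dag_id, execution_date)
    )
""")

def insert_tolerant(conn, row):
    """Insert a task instance, treating a duplicate key as already-inserted."""
    try:
        conn.execute("INSERT INTO task_instance VALUES (?, ?, ?)", row)
        return True
    except sqlite3.IntegrityError:
        # Another writer won the race; swallow instead of crashing.
        return False

row = ("SI_PS3H", "cdc", "2018-11-07 01:41:24")
print(insert_tolerant(conn, row))  # True: first insert succeeds
print(insert_tolerant(conn, row))  # False: duplicate handled, no crash
```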

> Scheduler is crashing because of IntegrityError
> ---
>
> Key: AIRFLOW-1370
> URL: https://issues.apache.org/jira/browse/AIRFLOW-1370
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: celery, scheduler
>Affects Versions: 1.8.0
>Reporter: Maneesh Sharma
>Priority: Major
>
> Scheduler is crashing with multiple task running on Celery Executor. It is 
> throwing `{color:red}IntegrityError: (psycopg2.IntegrityError) duplicate key 
> value violates unique constraint "task_instance_pkey"{color}`. Below is the 
> complete stack trace of error --
> Process DagFileProcessor490-Process:
> Traceback (most recent call last):
>   File "/usr/lib/python2.7/multiprocessing/process.py", line 258, in 
> _bootstrap
> self.run()
>   File "/usr/lib/python2.7/multiprocessing/process.py", line 114, in run
> self._target(*self._args, **self._kwargs)
>   File "/home/ubuntu/.local/lib/python2.7/site-packages/airflow/jobs.py", 
> line 348, in helper
> pickle_dags)
>   File "/home/ubuntu/.local/lib/python2.7/site-packages/airflow/utils/db.py", 
> line 53, in wrapper
> result = func(*args, **kwargs)
>   File "/home/ubuntu/.local/lib/python2.7/site-packages/airflow/jobs.py", 
> line 1587, in process_file
> self._process_dags(dagbag, dags, ti_keys_to_schedule)
>   File "/home/ubuntu/.local/lib/python2.7/site-packages/airflow/jobs.py", 
> line 1176, in _process_dags
> self._process_task_instances(dag, tis_out)
>   File "/home/ubuntu/.local/lib/python2.7/site-packages/airflow/jobs.py", 
> line 880, in _process_task_instances
> run.verify_integrity(session=session)
>   File "/home/ubuntu/.local/lib/python2.7/site-packages/airflow/utils/db.py", 
> line 53, in wrapper
> result = func(*args, **kwargs)
>   File "/home/ubuntu/.local/lib/python2.7/site-packages/airflow/models.py", 
> line 4117, in verify_integrity
> session.commit()
>   File 
> "/home/ubuntu/.local/lib/python2.7/site-packages/sqlalchemy/orm/session.py", 
> line 906, in commit
> self.transaction.commit()
>   File 
> "/home/ubuntu/.local/lib/python2.7/site-packages/sqlalchemy/orm/session.py", 
> line 461, in commit
> self._prepare_impl()
>   File 
> "/home/ubuntu/.local/lib/python2.7/site-packages/sqlalchemy/orm/session.py", 
> line 441, in _prepare_impl
> self.session.flush()
>   File 
> "/home/ubuntu/.local/lib/python2.7/site-packages/sqlalchemy/orm/session.py", 
> line 2171, in flush
> self._flush(objects)
>   File 
> "/home/ubuntu/.local/lib/python2.7/site-packages/sqlalchemy/orm/session.py", 
> line 2291, in _flush
> transaction.rollback(_capture_exception=True)
>   File 
> "/home/ubuntu/.local/lib/python2.7/site-packages/sqlalchemy/util/langhelpers.py",
>  line 66, in __exit__
> compat.reraise(exc_type, exc_value, exc_tb)
>   File 
> "/home/ubuntu/.local/lib/python2.7/site-packages/sqlalchemy/orm/session.py", 
> line 2255, in _flush
> flush_context.execute()
>   File 
> "/home/ubuntu/.local/lib/python2.7/site-packages/sqlalchemy/orm/unitofwork.py",
>  line 389, in execute
> rec.execute(self)
>   File 
> "/home/ubuntu/.local/lib/python2.7/site-packages/sqlalchemy/orm/unitofwork.py",
>  line 548, in execute
> uow
>   File 
> "/home/ubuntu/.local/lib/python2.7/site-packages/sqlalchemy/orm/persistence.py",
>  line 181, in save_obj
> mapper, table, insert)
>   File 
> "/home/ubuntu/.local/lib/python2.7/site-packages/sqlalchemy/orm/persistence.py",
>  line 799, in _emit_insert_statements
> execute(statement, multiparams)
>   File 
> "/home/ubuntu/.local/lib/python2.7/site-packages/sqlalchemy/engine/base.py", 
> line 945, in execute
> return meth(self, multiparams, params)
>   File 
> "/home/ubuntu/.local/lib/python2.7/site-packages/sqlalchemy/sql/elements.py", 
> line 263, in _execute_on_connection
> return connection._execute_clauseelement(self, multiparams, params)
>   File 
> "/home/ubuntu/.local/lib/python2.7/site-packages/sqlalchemy/engine/base.py", 
> line 1053, in _execute_clauseelement
> compiled_sql, distilled_params
>   File 
> "/home/ubuntu/.local/lib/python2.7/site-packages/sqlalchemy/engine/base.py", 
> line 1189, in _execute_context
> context)
>   File 
> "/home/ubuntu/.local/lib/python2.7/site-packages/sqlalchemy/engine/base.py", 
> line 1402, in _handle_dbapi_exception
> exc_info
>   File 
> "/home/ubuntu/.local/lib/python2.7/site-packages/sqlalchemy/util/compat.py", 
> line 2

[jira] [Updated] (AIRFLOW-3337) "About" page version info is not available

2018-11-13 Thread Dmytro Kulyk (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-3337?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dmytro Kulyk updated AIRFLOW-3337:
--
Description: 
From the Airflow 1.10.0 UI, click About and the resulting page shows version 
and git version as "Not available".
The version had been upgraded from 1.9 via
{code}
pip install apache-airflow==1.10.0
{code}
  !image-2018-11-14-01-00-58-743.png!

  was:
From the Airflow 1.10.0 ui, click about and the resulting page shows version 
and git version as "Not available"

 


> "About" page version info is not available
> --
>
> Key: AIRFLOW-3337
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3337
> Project: Apache Airflow
>  Issue Type: Bug
>Affects Versions: 1.10.0
>Reporter: Dmytro Kulyk
>Priority: Minor
> Attachments: image-2018-11-14-01-00-58-743.png
>
>
> From the Airflow 1.10.0 ui, click about and the resulting page shows version 
> and git version as "Not available"
> Version has been upgraded from 1.9 over 
> {code}
> pip install apache-airflow==1.10.0
> {code}
>   !image-2018-11-14-01-00-58-743.png!



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Updated] (AIRFLOW-3337) "About" page version info is not available

2018-11-13 Thread Dmytro Kulyk (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-3337?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dmytro Kulyk updated AIRFLOW-3337:
--
Attachment: image-2018-11-14-01-00-58-743.png

> "About" page version info is not available
> --
>
> Key: AIRFLOW-3337
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3337
> Project: Apache Airflow
>  Issue Type: Bug
>Affects Versions: 1.10.0
>Reporter: Dmytro Kulyk
>Priority: Minor
> Attachments: image-2018-11-14-01-00-58-743.png
>
>
> From the Airflow 1.10.0 ui, click about and the resulting page shows version 
> and git version as "Not available"
>  





[jira] [Created] (AIRFLOW-3337) "About" page version info is not available

2018-11-13 Thread Dmytro Kulyk (JIRA)
Dmytro Kulyk created AIRFLOW-3337:
-

 Summary: "About" page version info is not available
 Key: AIRFLOW-3337
 URL: https://issues.apache.org/jira/browse/AIRFLOW-3337
 Project: Apache Airflow
  Issue Type: Bug
Affects Versions: 1.10.0
Reporter: Dmytro Kulyk


From the Airflow 1.10.0 ui, click about and the resulting page shows version 
and git version as "Not available"

 





[jira] [Commented] (AIRFLOW-2987) "About" page version info is not available

2018-11-13 Thread Dmytro Kulyk (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-2987?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16685867#comment-16685867
 ] 

Dmytro Kulyk commented on AIRFLOW-2987:
---

I can see nothing about the version:
!image-2018-11-14-00-58-12-964.png!

> "About" page version info is not available
> --
>
> Key: AIRFLOW-2987
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2987
> Project: Apache Airflow
>  Issue Type: Bug
>Affects Versions: 1.10.0
>Reporter: Frank Maritato
>Priority: Minor
> Attachments: Screen Shot 2018-08-30 at 10.17.52 AM.png, 
> image-2018-09-06-14-44-28-246.png
>
>
> From the Airflow 1.10.0 ui, click about and the resulting page shows version 
> and git version as "Not available"
>  





[jira] [Commented] (AIRFLOW-3336) Add ability for "skipped" state to be considered success

2018-11-13 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-3336?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16685863#comment-16685863
 ] 

ASF GitHub Bot commented on AIRFLOW-3336:
-

rmn36 opened a new pull request #4182: [AIRFLOW-3336] Add new TriggerRule that 
will consider  ancestors as s…
URL: https://github.com/apache/incubator-airflow/pull/4182
 
 
   Make sure you have checked _all_ steps below.
   
   ### Jira
   
   - [X] My PR addresses the following [Airflow 
Jira](https://issues.apache.org/jira/browse/AIRFLOW-3336/) issues and 
references them in the PR title.
   
   ### Description
   
   - [X] Here are some details about my PR, including screenshots of any UI 
changes:
   
   Take the case where a task has 2 or more upstream parents and 1 or more of 
them can be skipped. If TriggerRule ALL_DONE is used, then the task will trigger 
even when upstream tasks fail. However, if TriggerRule ALL_SUCCESS is used, the 
task won't be triggered if any upstream tasks are skipped. This creates a gap in 
functionality where it is necessary for "skipped" to be treated as "success" so 
that the task only runs if all parents succeed or are skipped. Said another way, 
this allows tasks to be run if all ancestors do NOT fail.
   
   Therefore, a new trigger rule has been added that will count skipped as 
success to close this gap in functionality.
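   
   The rule evaluation described above can be sketched in plain Python. This 
is a standalone illustration, not Airflow's implementation: the state and rule 
names mirror Airflow's conventions, and the name `none_failed` for the 
proposed rule is an assumption, since the PR title is truncated here.

```python
# Hypothetical sketch of trigger-rule evaluation showing the gap the PR
# closes. State and rule names mirror Airflow's, but this is standalone code;
# "none_failed" is an assumed name for the proposed rule.
SUCCESS, FAILED, SKIPPED = "success", "failed", "skipped"

def should_trigger(rule, upstream_states):
    """Decide whether a task runs, given its parents' terminal states."""
    if rule == "all_success":
        # Any skipped parent blocks the task -- the gap described above.
        return all(s == SUCCESS for s in upstream_states)
    if rule == "all_done":
        # Fires once every parent finished, even if some of them failed.
        return all(s in (SUCCESS, FAILED, SKIPPED) for s in upstream_states)
    if rule == "none_failed":
        # Proposed behaviour: skipped counts as success, failures still block.
        return all(s in (SUCCESS, SKIPPED) for s in upstream_states)
    raise ValueError("unknown trigger rule: %s" % rule)

parents = [SUCCESS, SKIPPED]
print(should_trigger("all_success", parents))  # False: the skip blocks it
print(should_trigger("none_failed", parents))  # True: skipped counts as ok
```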
   
   ### Tests
   
   - [X] My PR adds the following unit tests __OR__ does not need testing for 
this extremely good reason:
   
   ### Commits
   
   - [X] My commits all reference Jira issues in their subject lines, and I 
have squashed multiple commits if they address the same issue. In addition, my 
commits follow the guidelines from "[How to write a good git commit 
message](http://chris.beams.io/posts/git-commit/)":
 1. Subject is separated from body by a blank line
 1. Subject is limited to 50 characters (not including Jira issue reference)
 1. Subject does not end with a period
 1. Subject uses the imperative mood ("add", not "adding")
 1. Body wraps at 72 characters
 1. Body explains "what" and "why", not "how"
   
   ### Documentation
   
   - [X] In case of new functionality, my PR adds documentation that describes 
how to use it.
 - When adding new operators/hooks/sensors, the autoclass documentation 
generation needs to be added.
   
   ### Code Quality
   
   - [X] Passes `flake8`
   




> Add ability for "skipped" state to be considered success
> 
>
> Key: AIRFLOW-3336
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3336
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: DAG
>Reporter: Ryan Nowacoski
>Assignee: Ryan Nowacoski
>Priority: Trivial
>  Labels: beginner, usability
>
> Take the case where a task has 2 or more upstream parents and 1 or more of 
> them can be skipped. If TriggerRule ALL_DONE is used, then the task will 
> trigger even when upstream tasks fail. However, if TriggerRule ALL_SUCCESS is 
> used, the task won't be triggered if any upstream tasks are skipped. This 
> creates a gap in functionality where it is necessary for "skipped" to be 
> treated as "success" so that the task only runs if all parents succeed or are 
> skipped. Said another way, this allows tasks to be run if all ancestors do 
> NOT fail.





[GitHub] rmn36 opened a new pull request #4182: [AIRFLOW-3336] Add new TriggerRule that will consider ancestors as s…

2018-11-13 Thread GitBox
rmn36 opened a new pull request #4182: [AIRFLOW-3336] Add new TriggerRule that 
will consider  ancestors as s…
URL: https://github.com/apache/incubator-airflow/pull/4182
 
 
   Make sure you have checked _all_ steps below.
   
   ### Jira
   
   - [X] My PR addresses the following [Airflow 
Jira](https://issues.apache.org/jira/browse/AIRFLOW-3336/) issues and 
references them in the PR title.
   
   ### Description
   
   - [X] Here are some details about my PR, including screenshots of any UI 
changes:
   
   Take the case where a task has 2 or more upstream parents and 1 or more of 
them can be skipped. If TriggerRule ALL_DONE is used, then the task will trigger 
even when upstream tasks fail. However, if TriggerRule ALL_SUCCESS is used, the 
task won't be triggered if any upstream tasks are skipped. This creates a gap in 
functionality where it is necessary for "skipped" to be treated as "success" so 
that the task only runs if all parents succeed or are skipped. Said another way, 
this allows tasks to be run if all ancestors do NOT fail.
   
   Therefore, a new trigger rule has been added that will count skipped as 
success to close this gap in functionality.
   
   ### Tests
   
   - [X] My PR adds the following unit tests __OR__ does not need testing for 
this extremely good reason:
   
   ### Commits
   
   - [X] My commits all reference Jira issues in their subject lines, and I 
have squashed multiple commits if they address the same issue. In addition, my 
commits follow the guidelines from "[How to write a good git commit 
message](http://chris.beams.io/posts/git-commit/)":
 1. Subject is separated from body by a blank line
 1. Subject is limited to 50 characters (not including Jira issue reference)
 1. Subject does not end with a period
 1. Subject uses the imperative mood ("add", not "adding")
 1. Body wraps at 72 characters
 1. Body explains "what" and "why", not "how"
   
   ### Documentation
   
   - [X] In case of new functionality, my PR adds documentation that describes 
how to use it.
 - When adding new operators/hooks/sensors, the autoclass documentation 
generation needs to be added.
   
   ### Code Quality
   
   - [X] Passes `flake8`
   




[GitHub] codecov-io edited a comment on issue #4101: [AIRFLOW-3272] Add base grpc hook

2018-11-13 Thread GitBox
codecov-io edited a comment on issue #4101: [AIRFLOW-3272] Add base grpc hook
URL: 
https://github.com/apache/incubator-airflow/pull/4101#issuecomment-433675239
 
 
   # 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4101?src=pr&el=h1)
 Report
   > Merging 
[#4101](https://codecov.io/gh/apache/incubator-airflow/pull/4101?src=pr&el=desc)
 into 
[master](https://codecov.io/gh/apache/incubator-airflow/commit/e6291e8d50701b80d37850a63c555a42a6134775?src=pr&el=desc)
 will **decrease** coverage by `<.01%`.
   > The diff coverage is `40%`.
   
   [![Impacted file tree 
graph](https://codecov.io/gh/apache/incubator-airflow/pull/4101/graphs/tree.svg?width=650&token=WdLKlKHOAU&height=150&src=pr)](https://codecov.io/gh/apache/incubator-airflow/pull/4101?src=pr&el=tree)
   
   ```diff
   @@            Coverage Diff             @@
   ##           master    #4101      +/-   ##
   ==========================================
   - Coverage   77.66%   77.66%    -0.01%
   ==========================================
     Files         199      199
     Lines       16290    16293        +3
   ==========================================
   + Hits        12652    12654        +2
   - Misses       3638     3639        +1
   ```
   
   
   | [Impacted 
Files](https://codecov.io/gh/apache/incubator-airflow/pull/4101?src=pr&el=tree) 
| Coverage Δ | |
   |---|---|---|
   | 
[airflow/www/views.py](https://codecov.io/gh/apache/incubator-airflow/pull/4101/diff?src=pr&el=tree#diff-YWlyZmxvdy93d3cvdmlld3MucHk=)
 | `69.35% <0%> (ø)` | :arrow_up: |
   | 
[airflow/www\_rbac/views.py](https://codecov.io/gh/apache/incubator-airflow/pull/4101/diff?src=pr&el=tree#diff-YWlyZmxvdy93d3dfcmJhYy92aWV3cy5weQ==)
 | `72.32% <100%> (ø)` | :arrow_up: |
   | 
[airflow/models.py](https://codecov.io/gh/apache/incubator-airflow/pull/4101/diff?src=pr&el=tree#diff-YWlyZmxvdy9tb2RlbHMucHk=)
 | `92.2% <33.33%> (-0.04%)` | :arrow_down: |
   
   --
   
   [Continue to review full report at 
Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4101?src=pr&el=continue).
   > **Legend** - [Click here to learn 
more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute  (impact)`, `ø = not affected`, `? = missing data`
   > Powered by 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4101?src=pr&el=footer).
 Last update 
[e6291e8...5d7347e](https://codecov.io/gh/apache/incubator-airflow/pull/4101?src=pr&el=lastupdated).
 Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   




[jira] [Commented] (AIRFLOW-3271) Airflow RBAC Permissions modification via UI do not persist

2018-11-13 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-3271?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16685625#comment-16685625
 ] 

ASF GitHub Bot commented on AIRFLOW-3271:
-

smithakoduri opened a new pull request #4118: [AIRFLOW-3271] Airflow RBAC 
Permissions modification via UI do not persist
URL: https://github.com/apache/incubator-airflow/pull/4118
 
 
   A fix is added so that the database is not reset whenever a new process comes up (i.e. whenever init_roles is called). Otherwise, Airflow RBAC permissions that have been modified do not persist for long.
   
   "test_update_and_verify_permission_role" test has been added.
   
   Make sure you have checked _all_ steps below.
   
   ### Jira
   - [x] My PR addresses the following AIRFLOW-3271 issues and references them 
in the PR title. For example, "\[AIRFLOW-XXX\] My Airflow PR"
 -https://issues.apache.org/jira/browse/AIRFLOW-3271
   
   ### Description
   
   - [x] Here are some details about my PR, including screenshots of any UI changes: My PR makes Airflow RBAC permission changes to the database persist. Role permissions are now seeded only on the very first call; when init_role is called again later, the existing permissions are kept as-is rather than reset. Previously, any change made to the RBAC permissions was reset whenever init_roles was called (on every new process creation), so RBAC permissions added via the UI would go missing from the database after some time.
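   The guard described above can be sketched as a minimal, standalone model. This is an illustration only: the `Role` class and `init_role` function below are simplified stand-ins, not Airflow's actual `SecurityManager` API.

```python
class Role:
    """Simplified stand-in for a role record with a permissions list."""
    def __init__(self, name):
        self.name = name
        self.permissions = []


def init_role(role, candidate_perms):
    # Seed permissions only when the role has none yet; later calls
    # leave modifications made via the UI untouched instead of resetting them.
    if len(role.permissions) == 0:
        role.permissions = list(candidate_perms)


role = Role('Test_Role')
init_role(role, ['can_read'])          # first process start: seed defaults
role.permissions.append('can_edit')    # an admin adds a permission via the UI
init_role(role, ['can_read'])          # a new process starts up later
print(role.permissions)                # ['can_read', 'can_edit'] - the UI edit persists
```

   With the pre-fix behavior (unconditionally re-seeding on every call), the second init_role call would have reset the list back to ['can_read'], which is the symptom reported in the issue.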
   
   ### Tests
   
   - [x] My PR adds the following unit tests:
   test_update_and_verify_permission_role
   
   ### Commits
   
   - [x] My commits all reference Jira issues in their subject lines, and I 
have squashed multiple commits if they address the same issue. In addition, my 
commits follow the guidelines from "[How to write a good git commit 
message](http://chris.beams.io/posts/git-commit/)":
 1. Subject is separated from body by a blank line
 1. Subject is limited to 50 characters (not including Jira issue reference)
 1. Subject does not end with a period
 1. Subject uses the imperative mood ("add", not "adding")
 1. Body wraps at 72 characters
 1. Body explains "what" and "why", not "how"
   
   ### Documentation
   
   - [x] In case of new functionality, my PR adds documentation that describes 
how to use it.
   No new functionality
   
   ### Code Quality
   
   - [x] Passes `flake8`
   




> Airflow RBAC Permissions modification via UI do not persist
> ---
>
> Key: AIRFLOW-3271
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3271
> Project: Apache Airflow
>  Issue Type: Bug
>Affects Versions: 1.10.0
>Reporter: Smitha Koduri
>Assignee: Smitha Koduri
>Priority: Major
> Fix For: 1.10.2
>
>
> After upgrading Airflow to 1.10, we have noticed that when attempting to add 
> a new permission-role mapping (via UI), initially it gets successfully added 
> to db. But later, the entry doesn't persist in the db. 



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] smithakoduri opened a new pull request #4118: [AIRFLOW-3271] Airflow RBAC Permissions modification via UI do not persist

2018-11-13 Thread GitBox
smithakoduri opened a new pull request #4118: [AIRFLOW-3271] Airflow RBAC 
Permissions modification via UI do not persist
URL: https://github.com/apache/incubator-airflow/pull/4118
 
 
   A fix is added so that the database is not reset whenever a new process comes up (i.e. whenever init_roles is called). Otherwise, Airflow RBAC permissions that have been modified do not persist for long.
   
   "test_update_and_verify_permission_role" test has been added.
   
   Make sure you have checked _all_ steps below.
   
   ### Jira
   - [x] My PR addresses the following AIRFLOW-3271 issues and references them 
in the PR title. For example, "\[AIRFLOW-XXX\] My Airflow PR"
 -https://issues.apache.org/jira/browse/AIRFLOW-3271
   
   ### Description
   
   - [x] Here are some details about my PR, including screenshots of any UI changes: My PR makes Airflow RBAC permission changes to the database persist. Role permissions are now seeded only on the very first call; when init_role is called again later, the existing permissions are kept as-is rather than reset. Previously, any change made to the RBAC permissions was reset whenever init_roles was called (on every new process creation), so RBAC permissions added via the UI would go missing from the database after some time.
   
   ### Tests
   
   - [x] My PR adds the following unit tests:
   test_update_and_verify_permission_role
   
   ### Commits
   
   - [x] My commits all reference Jira issues in their subject lines, and I 
have squashed multiple commits if they address the same issue. In addition, my 
commits follow the guidelines from "[How to write a good git commit 
message](http://chris.beams.io/posts/git-commit/)":
 1. Subject is separated from body by a blank line
 1. Subject is limited to 50 characters (not including Jira issue reference)
 1. Subject does not end with a period
 1. Subject uses the imperative mood ("add", not "adding")
 1. Body wraps at 72 characters
 1. Body explains "what" and "why", not "how"
   
   ### Documentation
   
   - [x] In case of new functionality, my PR adds documentation that describes 
how to use it.
   No new functionality
   
   ### Code Quality
   
   - [x] Passes `flake8`
   




[jira] [Commented] (AIRFLOW-3271) Airflow RBAC Permissions modification via UI do not persist

2018-11-13 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-3271?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16685623#comment-16685623
 ] 

ASF GitHub Bot commented on AIRFLOW-3271:
-

smithakoduri closed pull request #4118: [AIRFLOW-3271] Airflow RBAC Permissions 
modification via UI do not persist
URL: https://github.com/apache/incubator-airflow/pull/4118
 
 
   

This is a PR merged from a forked repository.
As GitHub hides the original diff on merge, it is displayed below for
the sake of provenance:

As this is a foreign pull request (from a fork), the diff is supplied
below (as it won't show otherwise due to GitHub magic):

diff --git a/airflow/www_rbac/security.py b/airflow/www_rbac/security.py
index 6bb67d4d83..8f9b6287ac 100644
--- a/airflow/www_rbac/security.py
+++ b/airflow/www_rbac/security.py
@@ -181,13 +181,17 @@ def init_role(self, role_name, role_vms, role_perms):
         if not role:
             role = self.add_role(role_name)
 
-        role_pvms = []
-        for pvm in pvms:
-            if pvm.view_menu.name in role_vms and pvm.permission.name in role_perms:
-                role_pvms.append(pvm)
-        role.permissions = list(set(role_pvms))
-        self.get_session.merge(role)
-        self.get_session.commit()
+        if len(role.permissions) == 0:
+            logging.info('Initializing permissions for role:%s in the database.', role_name)
+            role_pvms = []
+            for pvm in pvms:
+                if pvm.view_menu.name in role_vms and pvm.permission.name in role_perms:
+                    role_pvms.append(pvm)
+            role.permissions = list(set(role_pvms))
+            self.get_session.merge(role)
+            self.get_session.commit()
+        else:
+            logging.info('Existing permissions for the role:%s within the database will persist.', role_name)
 
     def get_user_roles(self, user=None):
         """
diff --git a/tests/www_rbac/test_security.py b/tests/www_rbac/test_security.py
index 6e0b572639..9b32a86c9c 100644
--- a/tests/www_rbac/test_security.py
+++ b/tests/www_rbac/test_security.py
@@ -107,6 +107,21 @@ def test_init_role_modelview(self):
         self.assertIsNotNone(role)
         self.assertEqual(len(role_perms), len(role.permissions))
 
+    def test_update_and_verify_permission_role(self):
+        role_name = 'Test_Role'
+        self.security_manager.init_role(role_name, [], [])
+        role = self.security_manager.find_role(role_name)
+
+        perm = self.security_manager.\
+            find_permission_view_menu('can_edit', 'RoleModelView')
+        self.security_manager.add_permission_role(role, perm)
+        role_perms_len = len(role.permissions)
+
+        self.security_manager.init_role(role_name, [], [])
+        new_role_perms_len = len(role.permissions)
+
+        self.assertEqual(role_perms_len, new_role_perms_len)
+
     def test_get_user_roles(self):
         user = mock.MagicMock()
         user.is_anonymous = False
diff --git a/tests/www_rbac/test_views.py b/tests/www_rbac/test_views.py
index 4b6d9d7d12..746f27abd4 100644
--- a/tests/www_rbac/test_views.py
+++ b/tests/www_rbac/test_views.py
@@ -962,6 +962,9 @@ def add_permission_for_role(self):
         all_dag_role = self.appbuilder.sm.find_role('all_dag_role')
         self.appbuilder.sm.add_permission_role(all_dag_role, perm_on_all_dag)
 
+        role_user = self.appbuilder.sm.find_role('User')
+        self.appbuilder.sm.add_permission_role(role_user, perm_on_all_dag)
+
         read_only_perm_on_dag = self.appbuilder.sm.\
             find_permission_view_menu('can_dag_read', 'example_bash_operator')
         dag_read_only_role = self.appbuilder.sm.find_role('dag_acl_read_only')


 




> Airflow RBAC Permissions modification via UI do not persist
> ---
>
> Key: AIRFLOW-3271
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3271
> Project: Apache Airflow
>  Issue Type: Bug
>Affects Versions: 1.10.0
>Reporter: Smitha Koduri
>Assignee: Smitha Koduri
>Priority: Major
> Fix For: 1.10.2
>
>
> After upgrading Airflow to 1.10, we have noticed that when attempting to add 
> a new permission-role mapping (via UI), initially it gets successfully added 
> to db. But later, the entry doesn't persist in the db. 





[GitHub] smithakoduri closed pull request #4118: [AIRFLOW-3271] Airflow RBAC Permissions modification via UI do not persist

2018-11-13 Thread GitBox
smithakoduri closed pull request #4118: [AIRFLOW-3271] Airflow RBAC Permissions 
modification via UI do not persist
URL: https://github.com/apache/incubator-airflow/pull/4118
 
 
   

This is a PR merged from a forked repository.
As GitHub hides the original diff on merge, it is displayed below for
the sake of provenance:

As this is a foreign pull request (from a fork), the diff is supplied
below (as it won't show otherwise due to GitHub magic):

diff --git a/airflow/www_rbac/security.py b/airflow/www_rbac/security.py
index 6bb67d4d83..8f9b6287ac 100644
--- a/airflow/www_rbac/security.py
+++ b/airflow/www_rbac/security.py
@@ -181,13 +181,17 @@ def init_role(self, role_name, role_vms, role_perms):
         if not role:
             role = self.add_role(role_name)
 
-        role_pvms = []
-        for pvm in pvms:
-            if pvm.view_menu.name in role_vms and pvm.permission.name in role_perms:
-                role_pvms.append(pvm)
-        role.permissions = list(set(role_pvms))
-        self.get_session.merge(role)
-        self.get_session.commit()
+        if len(role.permissions) == 0:
+            logging.info('Initializing permissions for role:%s in the database.', role_name)
+            role_pvms = []
+            for pvm in pvms:
+                if pvm.view_menu.name in role_vms and pvm.permission.name in role_perms:
+                    role_pvms.append(pvm)
+            role.permissions = list(set(role_pvms))
+            self.get_session.merge(role)
+            self.get_session.commit()
+        else:
+            logging.info('Existing permissions for the role:%s within the database will persist.', role_name)
 
     def get_user_roles(self, user=None):
         """
diff --git a/tests/www_rbac/test_security.py b/tests/www_rbac/test_security.py
index 6e0b572639..9b32a86c9c 100644
--- a/tests/www_rbac/test_security.py
+++ b/tests/www_rbac/test_security.py
@@ -107,6 +107,21 @@ def test_init_role_modelview(self):
         self.assertIsNotNone(role)
         self.assertEqual(len(role_perms), len(role.permissions))
 
+    def test_update_and_verify_permission_role(self):
+        role_name = 'Test_Role'
+        self.security_manager.init_role(role_name, [], [])
+        role = self.security_manager.find_role(role_name)
+
+        perm = self.security_manager.\
+            find_permission_view_menu('can_edit', 'RoleModelView')
+        self.security_manager.add_permission_role(role, perm)
+        role_perms_len = len(role.permissions)
+
+        self.security_manager.init_role(role_name, [], [])
+        new_role_perms_len = len(role.permissions)
+
+        self.assertEqual(role_perms_len, new_role_perms_len)
+
     def test_get_user_roles(self):
         user = mock.MagicMock()
         user.is_anonymous = False
diff --git a/tests/www_rbac/test_views.py b/tests/www_rbac/test_views.py
index 4b6d9d7d12..746f27abd4 100644
--- a/tests/www_rbac/test_views.py
+++ b/tests/www_rbac/test_views.py
@@ -962,6 +962,9 @@ def add_permission_for_role(self):
         all_dag_role = self.appbuilder.sm.find_role('all_dag_role')
         self.appbuilder.sm.add_permission_role(all_dag_role, perm_on_all_dag)
 
+        role_user = self.appbuilder.sm.find_role('User')
+        self.appbuilder.sm.add_permission_role(role_user, perm_on_all_dag)
+
         read_only_perm_on_dag = self.appbuilder.sm.\
             find_permission_view_menu('can_dag_read', 'example_bash_operator')
         dag_read_only_role = self.appbuilder.sm.find_role('dag_acl_read_only')


 




[GitHub] codecov-io commented on issue #4181: [AIRFLOW-XXX] Remove duplicated line in changelog

2018-11-13 Thread GitBox
codecov-io commented on issue #4181: [AIRFLOW-XXX] Remove duplicated line in 
changelog
URL: 
https://github.com/apache/incubator-airflow/pull/4181#issuecomment-438345083
 
 
   # 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4181?src=pr&el=h1)
 Report
   > Merging 
[#4181](https://codecov.io/gh/apache/incubator-airflow/pull/4181?src=pr&el=desc)
 into 
[master](https://codecov.io/gh/apache/incubator-airflow/commit/86a83bfff3777ad228b515c6b58ee2ffd7250a26?src=pr&el=desc)
 will **not change** coverage.
   > The diff coverage is `n/a`.
   
   [![Impacted file tree 
graph](https://codecov.io/gh/apache/incubator-airflow/pull/4181/graphs/tree.svg?width=650&token=WdLKlKHOAU&height=150&src=pr)](https://codecov.io/gh/apache/incubator-airflow/pull/4181?src=pr&el=tree)
   
   ```diff
   @@   Coverage Diff   @@
   ##   master#4181   +/-   ##
   ===
 Coverage   77.67%   77.67%   
   ===
 Files 199  199   
 Lines   1629016290   
   ===
 Hits1265312653   
 Misses   3637 3637
   ```
   
   
   
   --
   
   [Continue to review full report at 
Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4181?src=pr&el=continue).
   > **Legend** - [Click here to learn 
more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute  (impact)`, `ø = not affected`, `? = missing data`
   > Powered by 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4181?src=pr&el=footer).
 Last update 
[86a83bf...78ae38f](https://codecov.io/gh/apache/incubator-airflow/pull/4181?src=pr&el=lastupdated).
 Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   




[GitHub] codecov-io edited a comment on issue #4181: [AIRFLOW-XXX] Remove duplicated line in changelog

2018-11-13 Thread GitBox
codecov-io edited a comment on issue #4181: [AIRFLOW-XXX] Remove duplicated 
line in changelog
URL: 
https://github.com/apache/incubator-airflow/pull/4181#issuecomment-438345083
 
 
   # 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4181?src=pr&el=h1)
 Report
   > Merging 
[#4181](https://codecov.io/gh/apache/incubator-airflow/pull/4181?src=pr&el=desc)
 into 
[master](https://codecov.io/gh/apache/incubator-airflow/commit/86a83bfff3777ad228b515c6b58ee2ffd7250a26?src=pr&el=desc)
 will **not change** coverage.
   > The diff coverage is `n/a`.
   
   [![Impacted file tree 
graph](https://codecov.io/gh/apache/incubator-airflow/pull/4181/graphs/tree.svg?width=650&token=WdLKlKHOAU&height=150&src=pr)](https://codecov.io/gh/apache/incubator-airflow/pull/4181?src=pr&el=tree)
   
   ```diff
   @@   Coverage Diff   @@
   ##   master#4181   +/-   ##
   ===
 Coverage   77.67%   77.67%   
   ===
 Files 199  199   
 Lines   1629016290   
   ===
 Hits1265312653   
 Misses   3637 3637
   ```
   
   
   
   --
   
   [Continue to review full report at 
Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4181?src=pr&el=continue).
   > **Legend** - [Click here to learn 
more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute  (impact)`, `ø = not affected`, `? = missing data`
   > Powered by 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4181?src=pr&el=footer).
 Last update 
[86a83bf...78ae38f](https://codecov.io/gh/apache/incubator-airflow/pull/4181?src=pr&el=lastupdated).
 Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   




[jira] [Created] (AIRFLOW-3336) Add ability for "skipped" state to be considered success

2018-11-13 Thread Ryan Nowacoski (JIRA)
Ryan Nowacoski created AIRFLOW-3336:
---

 Summary: Add ability for "skipped" state to be considered success
 Key: AIRFLOW-3336
 URL: https://issues.apache.org/jira/browse/AIRFLOW-3336
 Project: Apache Airflow
  Issue Type: Improvement
  Components: DAG
Reporter: Ryan Nowacoski
Assignee: Ryan Nowacoski


Take the case where a task has two or more upstream parents, one or more of which can be skipped. If TriggerRule ALL_DONE is used, the task will trigger even when upstream tasks fail. However, if TriggerRule ALL_SUCCESS is used, the task won't be triggered if any upstream task is skipped. This creates a gap in functionality: it needs to be possible for "skipped" to be treated as "success", so that the task runs only if all parents either succeed or are skipped. Put another way, this allows a task to run if none of its ancestors fail.
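The proposed semantics can be sketched as a pure-Python predicate, independent of Airflow's TriggerRule machinery (the function name `none_failed` is illustrative, not an existing Airflow API):

```python
def none_failed(upstream_states):
    """Proposed rule: trigger when every upstream task either
    succeeded or was skipped, i.e. when no ancestor failed."""
    return all(state in ('success', 'skipped') for state in upstream_states)


print(none_failed(['success', 'skipped']))   # True  - skipped parents do not block
print(none_failed(['success', 'failed']))    # False - a failed parent still blocks
print(none_failed(['skipped', 'skipped']))   # True
```

By contrast, ALL_SUCCESS rejects the first case and ALL_DONE accepts the second; neither matches the requested behavior.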





[GitHub] codecov-io edited a comment on issue #4111: [AIRFLOW-3266] Add AWS Athena Operator and hook

2018-11-13 Thread GitBox
codecov-io edited a comment on issue #4111: [AIRFLOW-3266] Add AWS Athena 
Operator and hook
URL: 
https://github.com/apache/incubator-airflow/pull/4111#issuecomment-433705746
 
 
   # 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4111?src=pr&el=h1)
 Report
   > Merging 
[#4111](https://codecov.io/gh/apache/incubator-airflow/pull/4111?src=pr&el=desc)
 into 
[master](https://codecov.io/gh/apache/incubator-airflow/commit/86a83bfff3777ad228b515c6b58ee2ffd7250a26?src=pr&el=desc)
 will **decrease** coverage by `1%`.
   > The diff coverage is `n/a`.
   
   [![Impacted file tree 
graph](https://codecov.io/gh/apache/incubator-airflow/pull/4111/graphs/tree.svg?width=650&token=WdLKlKHOAU&height=150&src=pr)](https://codecov.io/gh/apache/incubator-airflow/pull/4111?src=pr&el=tree)
   
   ```diff
   @@Coverage Diff @@
   ##   master#4111  +/-   ##
   ==
   - Coverage   77.67%   76.67%   -1.01% 
   ==
 Files 199  199  
 Lines   1629016212  -78 
   ==
   - Hits1265312430 -223 
   - Misses   3637 3782 +145
   ```
   
   
   | [Impacted 
Files](https://codecov.io/gh/apache/incubator-airflow/pull/4111?src=pr&el=tree) 
| Coverage Δ | |
   |---|---|---|
   | 
[airflow/operators/postgres\_operator.py](https://codecov.io/gh/apache/incubator-airflow/pull/4111/diff?src=pr&el=tree#diff-YWlyZmxvdy9vcGVyYXRvcnMvcG9zdGdyZXNfb3BlcmF0b3IucHk=)
 | `0% <0%> (-100%)` | :arrow_down: |
   | 
[airflow/operators/mysql\_to\_hive.py](https://codecov.io/gh/apache/incubator-airflow/pull/4111/diff?src=pr&el=tree#diff-YWlyZmxvdy9vcGVyYXRvcnMvbXlzcWxfdG9faGl2ZS5weQ==)
 | `0% <0%> (-100%)` | :arrow_down: |
   | 
[airflow/operators/mysql\_operator.py](https://codecov.io/gh/apache/incubator-airflow/pull/4111/diff?src=pr&el=tree#diff-YWlyZmxvdy9vcGVyYXRvcnMvbXlzcWxfb3BlcmF0b3IucHk=)
 | `0% <0%> (-100%)` | :arrow_down: |
   | 
[airflow/operators/generic\_transfer.py](https://codecov.io/gh/apache/incubator-airflow/pull/4111/diff?src=pr&el=tree#diff-YWlyZmxvdy9vcGVyYXRvcnMvZ2VuZXJpY190cmFuc2Zlci5weQ==)
 | `0% <0%> (-100%)` | :arrow_down: |
   | 
[airflow/www\_rbac/utils.py](https://codecov.io/gh/apache/incubator-airflow/pull/4111/diff?src=pr&el=tree#diff-YWlyZmxvdy93d3dfcmJhYy91dGlscy5weQ==)
 | `68.94% <0%> (-5.15%)` | :arrow_down: |
   | 
[airflow/hooks/dbapi\_hook.py](https://codecov.io/gh/apache/incubator-airflow/pull/4111/diff?src=pr&el=tree#diff-YWlyZmxvdy9ob29rcy9kYmFwaV9ob29rLnB5)
 | `79.83% <0%> (-3.23%)` | :arrow_down: |
   | 
[airflow/hooks/postgres\_hook.py](https://codecov.io/gh/apache/incubator-airflow/pull/4111/diff?src=pr&el=tree#diff-YWlyZmxvdy9ob29rcy9wb3N0Z3Jlc19ob29rLnB5)
 | `91.66% <0%> (-2.78%)` | :arrow_down: |
   | 
[airflow/hooks/mysql\_hook.py](https://codecov.io/gh/apache/incubator-airflow/pull/4111/diff?src=pr&el=tree#diff-YWlyZmxvdy9ob29rcy9teXNxbF9ob29rLnB5)
 | `90.38% <0%> (-2.6%)` | :arrow_down: |
   | 
[airflow/hooks/hive\_hooks.py](https://codecov.io/gh/apache/incubator-airflow/pull/4111/diff?src=pr&el=tree#diff-YWlyZmxvdy9ob29rcy9oaXZlX2hvb2tzLnB5)
 | `73.42% <0%> (-1.85%)` | :arrow_down: |
   | 
[airflow/www\_rbac/security.py](https://codecov.io/gh/apache/incubator-airflow/pull/4111/diff?src=pr&el=tree#diff-YWlyZmxvdy93d3dfcmJhYy9zZWN1cml0eS5weQ==)
 | `91.27% <0%> (-1.35%)` | :arrow_down: |
   | ... and [21 
more](https://codecov.io/gh/apache/incubator-airflow/pull/4111/diff?src=pr&el=tree-more)
 | |
   
   --
   
   [Continue to review full report at 
Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4111?src=pr&el=continue).
   > **Legend** - [Click here to learn 
more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute  (impact)`, `ø = not affected`, `? = missing data`
   > Powered by 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4111?src=pr&el=footer).
 Last update 
[86a83bf...aa4a46e](https://codecov.io/gh/apache/incubator-airflow/pull/4111?src=pr&el=lastupdated).
 Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   




[GitHub] ashb closed pull request #4181: [AIRFLOW-XXX] Remove duplicated line in changelog

2018-11-13 Thread GitBox
ashb closed pull request #4181: [AIRFLOW-XXX] Remove duplicated line in 
changelog
URL: https://github.com/apache/incubator-airflow/pull/4181
 
 
   

This is a PR merged from a forked repository.
As GitHub hides the original diff on merge, it is displayed below for
the sake of provenance:

As this is a foreign pull request (from a fork), the diff is supplied
below (as it won't show otherwise due to GitHub magic):

diff --git a/CHANGELOG.txt b/CHANGELOG.txt
index b4ee1755b4..8020bfbb70 100644
--- a/CHANGELOG.txt
+++ b/CHANGELOG.txt
@@ -11,7 +11,6 @@ AIRFLOW 1.10.0, 2018-08-03
 [AIRFLOW-2710] Clarify fernet key value in documentation
 [AIRFLOW-2606] Fix DB schema and SQLAlchemy model
 [AIRFLOW-2646] Fix setup.py not to install snakebite on Python3
-[AIRFLOW-2512][AIRFLOW-2522] Use google-auth instead of oauth2client
 [AIRFLOW-2604] Add index to task_fail
 [AIRFLOW-2650] Mark SchedulerJob as succeed when hitting Ctrl-c
 [AIRFLOW-2678] Fix db schema unit test to remove checking fab models


 




[GitHub] ashb commented on issue #4181: [AIRFLOW-XXX] Remove duplicated line in changelog

2018-11-13 Thread GitBox
ashb commented on issue #4181: [AIRFLOW-XXX] Remove duplicated line in changelog
URL: 
https://github.com/apache/incubator-airflow/pull/4181#issuecomment-438332338
 
 
   :D




[GitHub] bskim45 opened a new pull request #4181: [AIRFLOW-XXX] Remove duplicated line in changelog

2018-11-13 Thread GitBox
bskim45 opened a new pull request #4181: [AIRFLOW-XXX] Remove duplicated line 
in changelog
URL: https://github.com/apache/incubator-airflow/pull/4181
 
 
   I accidentally found duplicated line in changelog 😃  
([L14](https://github.com/apache/incubator-airflow/blob/abc9ebb973ade4d39780af293ff70217074300d0/CHANGELOG.txt#L14)
 / 
[L48](https://github.com/apache/incubator-airflow/blob/abc9ebb973ade4d39780af293ff70217074300d0/CHANGELOG.txt#L48))
   
   Make sure you have checked _all_ steps below.
   
   ### Jira
   
   - [x] My PR addresses the following [Airflow 
Jira](https://issues.apache.org/jira/browse/AIRFLOW/) issues and references 
them in the PR title. For example, "\[AIRFLOW-XXX\] My Airflow PR"
 - https://issues.apache.org/jira/browse/AIRFLOW-XXX
 - In case you are fixing a typo in the documentation you can prepend your 
commit with \[AIRFLOW-XXX\], code changes always need a Jira issue.
   
   ### Description
   
   - [x] Here are some details about my PR, including screenshots of any UI 
changes:
   
   ### Tests
   
   - [x] My PR adds the following unit tests __OR__ does not need testing for 
this extremely good reason:
   
   ### Commits
   
   - [x] My commits all reference Jira issues in their subject lines, and I 
have squashed multiple commits if they address the same issue. In addition, my 
commits follow the guidelines from "[How to write a good git commit 
message](http://chris.beams.io/posts/git-commit/)":
 1. Subject is separated from body by a blank line
 1. Subject is limited to 50 characters (not including Jira issue reference)
 1. Subject does not end with a period
 1. Subject uses the imperative mood ("add", not "adding")
 1. Body wraps at 72 characters
 1. Body explains "what" and "why", not "how"
   
   ### Documentation
   
   - [x] In case of new functionality, my PR adds documentation that describes 
how to use it.
 - When adding new operators/hooks/sensors, the autoclass documentation 
generation needs to be added.
   
   ### Code Quality
   
   - [x] Passes `flake8`
   




[jira] [Updated] (AIRFLOW-2731) Airflow psutil dependency is out of date

2018-11-13 Thread Ash Berlin-Taylor (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-2731?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ash Berlin-Taylor updated AIRFLOW-2731:
---
Fix Version/s: 1.10.2

If we do a 1.10.2, let's include this in there - this version of psutil is packaged as a wheel, so it is much easier to install!

> Airflow psutil dependency is out of date
> 
>
> Key: AIRFLOW-2731
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2731
> Project: Apache Airflow
>  Issue Type: Bug
>Reporter: George Leslie-Waksman
>Priority: Minor
> Fix For: 2.0.0, 1.10.2
>
>
> Airflow is pinned to psutil<5.0.0 and the current version is 5.4.6





[GitHub] phani8996 commented on issue #4111: [AIRFLOW-3266] Add AWS Athena Operator and hook

2018-11-13 Thread GitBox
phani8996 commented on issue #4111: [AIRFLOW-3266] Add AWS Athena Operator and 
hook
URL: 
https://github.com/apache/incubator-airflow/pull/4111#issuecomment-438315381
 
 
   @ashb @Fokko Thanks for guiding me through this. I am squashing my commits. 
Please merge this PR.




[jira] [Comment Edited] (AIRFLOW-1327) LocalExecutor won't reschedule on concurrency limit hit

2018-11-13 Thread pranav agrawal (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-1327?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16685374#comment-16685374
 ] 

pranav agrawal edited comment on AIRFLOW-1327 at 11/13/18 3:31 PM:
---

We are also hitting this issue many times in our Airflow setup; please suggest a workaround.
 airflow version: 1.9.0

using CeleryExecutor


was (Author: pranav.agrawal1):
We are also hitting this issue many times in our Airflow setup; please suggest a workaround.
airflow version: 1.10.0

using CeleryExecutor

> LocalExecutor won't reschedule on concurrency limit hit
> ---
>
> Key: AIRFLOW-1327
> URL: https://issues.apache.org/jira/browse/AIRFLOW-1327
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: scheduler
>Affects Versions: 1.8.1
> Environment: LocalExecutor
>Reporter: Dennis Muth
>Priority: Major
> Attachments: Airflow_logs.png, ti_unscheduled.png
>
>
> For several days we are trying to migrate from airflow 1.7.1.3 to 1.8.1.
> Unfortunately we ran into a serious issue that seems to be scheduler related 
> (we are using the LocalExecutor one).
> When running a SubDag some Task instances get queued (queues are defined), 
> switch to running and some time later finish. Well, that's how it should be.
> But: Some task instances get queued up, print some cryptic warning message 
> (we get to this in a sec) and then get no state (NONE).
> The warning message:
> {code}
> FIXME: Rescheduling due to concurrency limits reached at task runtime. 
> Attempt 1 of 2. State set to NONE.
> {code}
> This suggests that a limit is too low and that this instance will be picked 
> up later by the scheduler for processing, when there are probably more slots 
> available. 
> We have waited for quite some time now, but the task is not re-scheduled.
> When I rerun the subdag some previous failed task instances (state = None) 
> will now succeed, but other - previously successful ones - will fail. Weird...
> I've attached some screenshots to make this more transparent to you, too.
> Is this a bug or just on purpose? Do we need to switch to the CeleryExecutor?
> Please do not hesitate if you need additional logs or other stuff.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
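The "FIXME: Rescheduling due to concurrency limits" behaviour quoted in the issue can be sketched as follows (a simplified illustration with hypothetical names — not the actual scheduler/LocalExecutor code):

```python
# Simplified illustration of the behaviour quoted above: when the concurrency
# limit is already reached at task runtime, the task's state is reset to NONE
# so the scheduler can (in theory) pick it up again later. The bug report says
# such tasks are in practice never rescheduled.
class TaskInstance:
    def __init__(self, try_number=1, max_tries=2):
        self.state = "queued"
        self.try_number = try_number
        self.max_tries = max_tries

def try_to_run(ti, running_count, concurrency_limit):
    """Return True if the task may run; otherwise reset its state to None."""
    if running_count >= concurrency_limit:
        print("FIXME: Rescheduling due to concurrency limits reached at task "
              "runtime. Attempt {} of {}. State set to NONE."
              .format(ti.try_number, ti.max_tries))
        ti.state = None  # the state shown as "NONE" in the UI screenshots
        return False
    ti.state = "running"
    return True
```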


[jira] [Commented] (AIRFLOW-1327) LocalExecutor won't reschedule on concurrency limit hit

2018-11-13 Thread pranav agrawal (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-1327?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16685374#comment-16685374
 ] 

pranav agrawal commented on AIRFLOW-1327:
-

we are also hitting this issue many times in our Airflow setup; please assist 
with a workaround.
airflow version: 1.10.0

using CeleryExecutor

> LocalExecutor won't reschedule on concurrency limit hit
> ---
>
> Key: AIRFLOW-1327
> URL: https://issues.apache.org/jira/browse/AIRFLOW-1327
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: scheduler
>Affects Versions: 1.8.1
> Environment: LocalExecutor
>Reporter: Dennis Muth
>Priority: Major
> Attachments: Airflow_logs.png, ti_unscheduled.png
>
>
> For several days we are trying to migrate from airflow 1.7.1.3 to 1.8.1.
> Unfortunately we ran into a serious issue that seems to be scheduler related 
> (we are using the LocalExecutor one).
> When running a SubDag some Task instances get queued (queues are defined), 
> switch to running and some time later finish. Well, that's how it should be.
> But: Some task instances get queued up, print some cryptic warning message 
> (we get to this in a sec) and then get no state (NONE).
> The warning message:
> {code}
> FIXME: Rescheduling due to concurrency limits reached at task runtime. 
> Attempt 1 of 2. State set to NONE.
> {code}
> This suggests that a limit is too low and that this instance will be picked 
> up later by the scheduler for processing, when there are probably more slots 
> available. 
> We have waited for quite some time now, but the task is not re-scheduled.
> When I rerun the subdag some previous failed task instances (state = None) 
> will now succeed, but other - previously successful ones - will fail. Weird...
> I've attached some screenshots to make this more transparent to you, too.
> Is this a bug or just on purpose? Do we need to switch to the CeleryExecutor?
> Please do not hesitate if you need additional logs or other stuff.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


svn commit: r30879 - /dev/incubator/airflow/1.10.1rc1/

2018-11-13 Thread ash
Author: ash
Date: Tue Nov 13 15:24:31 2018
New Revision: 30879

Log:
Replace incorrectly published artefacts

These were built off the wrong git commit so weren't correct. I have re-used the 
version as no announcement email went out for this

Added:

dev/incubator/airflow/1.10.1rc1/apache-airflow-1.10.1rc1+incubating-bin.tar.gz  
 (with props)

dev/incubator/airflow/1.10.1rc1/apache-airflow-1.10.1rc1+incubating-bin.tar.gz.asc

dev/incubator/airflow/1.10.1rc1/apache-airflow-1.10.1rc1+incubating-bin.tar.gz.md5

dev/incubator/airflow/1.10.1rc1/apache-airflow-1.10.1rc1+incubating-bin.tar.gz.sha512

dev/incubator/airflow/1.10.1rc1/apache-airflow-1.10.1rc1+incubating-source.tar.gz
   (with props)

dev/incubator/airflow/1.10.1rc1/apache-airflow-1.10.1rc1+incubating-source.tar.gz.asc

dev/incubator/airflow/1.10.1rc1/apache-airflow-1.10.1rc1+incubating-source.tar.gz.md5

dev/incubator/airflow/1.10.1rc1/apache-airflow-1.10.1rc1+incubating-source.tar.gz.sha512

Added: 
dev/incubator/airflow/1.10.1rc1/apache-airflow-1.10.1rc1+incubating-bin.tar.gz
==
Binary file - no diff available.

Propchange: 
dev/incubator/airflow/1.10.1rc1/apache-airflow-1.10.1rc1+incubating-bin.tar.gz
--
svn:mime-type = application/octet-stream

Added: 
dev/incubator/airflow/1.10.1rc1/apache-airflow-1.10.1rc1+incubating-bin.tar.gz.asc
==
--- 
dev/incubator/airflow/1.10.1rc1/apache-airflow-1.10.1rc1+incubating-bin.tar.gz.asc
 (added)
+++ 
dev/incubator/airflow/1.10.1rc1/apache-airflow-1.10.1rc1+incubating-bin.tar.gz.asc
 Tue Nov 13 15:24:31 2018
@@ -0,0 +1,17 @@
+-BEGIN PGP SIGNATURE-
+
+iQJDBAABCAAtFiEEXMrqx1jtZMoyPwU7gHxzGoyCoJUFAlvq7EMPHGFzaEBhcGFj
+aGUub3JnAAoJEIB8cxqMgqCVVKAQAIANlemksc0IHt2zPF+ST1rPGKhkdDdjQIfs
+RWKqg1ebAAfsoxinW0uPdwgrDozzIR2QIt180zAdfeblraQfUrFDAZiKr+pdwKsK
+huafGm00gT3VnGogkJW21yGOg53PIdqpqb3MbWQS//2DKiwSQ2YTCXPhOX2kIdma
+UrtlilNrtfaYe6rNVyBEToLQIutLifrc+8Z/HYzio0VpkLE1ydoTTkrRvCNOPQKk
+9uFqTbOFIQt7/8m0Xd9EhZPnx4E77Y6qeTUwNvVUeOHNmrfKNcwMMMa9k4NhKwZ6
+fN4tdf0CULESLAV5mxa++ppIF5XdyJ4sPzn8A9vZStPbcNi5iTrwv+yOC3cHPuk8
+MGGiTlnyFN7a34qXR0Ol5oTyXT75LaHQih9YwpDJ7PQA6/d+9EBOH4A08+f+ctQL
+q6lzlF/4OSw1Kcly2bNcZjmRkrdZWnQUwyh7+n7OteQtoqwUwvSns4ajPnqYsLlV
+vEABdYr8LrRMS60+2HaRRkQqqGSWWlKeAVD2bGY29YkMRcDatqyn+Dbqx/mdTZjE
+PzpSRaXISWuq4aXyePnsNl6fhY7+7muAroew0zaVHfWkL+7KK7GLPecF/tUfJXIG
+9xmoUwgzY3CctCKarxiBI103JXLFpZfg2vCB/4sOmBsIG/8M0DUyLmglIoiU5TmK
+vY2DjVBE
+=Depw
+-END PGP SIGNATURE-

Added: 
dev/incubator/airflow/1.10.1rc1/apache-airflow-1.10.1rc1+incubating-bin.tar.gz.md5
==
--- 
dev/incubator/airflow/1.10.1rc1/apache-airflow-1.10.1rc1+incubating-bin.tar.gz.md5
 (added)
+++ 
dev/incubator/airflow/1.10.1rc1/apache-airflow-1.10.1rc1+incubating-bin.tar.gz.md5
 Tue Nov 13 15:24:31 2018
@@ -0,0 +1,2 @@
+../airflow-dist/1.10.1rc1/apache-airflow-1.10.1rc1+incubating-bin.tar.gz: 
+69 A3 BD C9 A2 33 54 B3  E9 AA 51 96 97 0B E0 A3

Added: 
dev/incubator/airflow/1.10.1rc1/apache-airflow-1.10.1rc1+incubating-bin.tar.gz.sha512
==
--- 
dev/incubator/airflow/1.10.1rc1/apache-airflow-1.10.1rc1+incubating-bin.tar.gz.sha512
 (added)
+++ 
dev/incubator/airflow/1.10.1rc1/apache-airflow-1.10.1rc1+incubating-bin.tar.gz.sha512
 Tue Nov 13 15:24:31 2018
@@ -0,0 +1,3 @@
+../airflow-dist/1.10.1rc1/apache-airflow-1.10.1rc1+incubating-bin.tar.gz: 
+69E64F29 77ACEC68 DAA7A028 58B39C1E 0B99E1EE B840681D 5154B5B3 D623BE2F 
BDB9F101
+ 44BF9457 E8763953 B9F0C365 48091569 7AE41E2C 7BCD059C FDADF489

Added: 
dev/incubator/airflow/1.10.1rc1/apache-airflow-1.10.1rc1+incubating-source.tar.gz
==
Binary file - no diff available.

Propchange: 
dev/incubator/airflow/1.10.1rc1/apache-airflow-1.10.1rc1+incubating-source.tar.gz
--
svn:mime-type = application/octet-stream

Added: 
dev/incubator/airflow/1.10.1rc1/apache-airflow-1.10.1rc1+incubating-source.tar.gz.asc
==
--- 
dev/incubator/airflow/1.10.1rc1/apache-airflow-1.10.1rc1+incubating-source.tar.gz.asc
 (added)
+++ 
dev/incubator/airflow/1.10.1rc1/apache-airflow-1.10.1rc1+incubating-source.tar.gz.asc
 Tue Nov 13 15:24:31 2018
@@ -0,0 +1,17 @@
+-BEGIN PGP SIGNATURE-
+
+iQJDBAABCAAtFiEEXMrqx1jtZMoyPwU7gHxzGoyCoJUFAlvq7EAPHGFzaEBhcGFj
+aGUub3JnAAoJEIB8cxqMgqCVQKcP/RZpCfr61TxOtW7XtZSB7ZB4M+8rlAkC4KDK
+fU3YSp42c03XK68TewYSpyas0JgVsxcP+ZLREWB/kCo3LkA9jpOLMG0N9ZzLR+/H
+y/9rJuUWSbhp2mBvoSdecAiIhZ5t2VaJ+5SEQ9akwgg0zkrKVmW71PVbjpO612oy
+A+dn3rE37RlcIOJfcR2UJQLB7CYZMGqFT

svn commit: r30878 - /dev/incubator/airflow/1.10.1rc1/

2018-11-13 Thread ash
Author: ash
Date: Tue Nov 13 15:13:59 2018
New Revision: 30878

Log:
Airflow 1.10.1 RC1

Added:
dev/incubator/airflow/1.10.1rc1/

dev/incubator/airflow/1.10.1rc1/apache-airflow-1.10.1rc1+incubating-bin.tar.gz  
 (with props)

dev/incubator/airflow/1.10.1rc1/apache-airflow-1.10.1rc1+incubating-bin.tar.gz.asc

dev/incubator/airflow/1.10.1rc1/apache-airflow-1.10.1rc1+incubating-bin.tar.gz.md5

dev/incubator/airflow/1.10.1rc1/apache-airflow-1.10.1rc1+incubating-bin.tar.gz.sha512

dev/incubator/airflow/1.10.1rc1/apache-airflow-1.10.1rc1+incubating-source.tar.gz
   (with props)

dev/incubator/airflow/1.10.1rc1/apache-airflow-1.10.1rc1+incubating-source.tar.gz.asc

dev/incubator/airflow/1.10.1rc1/apache-airflow-1.10.1rc1+incubating-source.tar.gz.md5

dev/incubator/airflow/1.10.1rc1/apache-airflow-1.10.1rc1+incubating-source.tar.gz.sha512

Added: 
dev/incubator/airflow/1.10.1rc1/apache-airflow-1.10.1rc1+incubating-bin.tar.gz
==
Binary file - no diff available.

Propchange: 
dev/incubator/airflow/1.10.1rc1/apache-airflow-1.10.1rc1+incubating-bin.tar.gz
--
svn:mime-type = application/octet-stream

Added: 
dev/incubator/airflow/1.10.1rc1/apache-airflow-1.10.1rc1+incubating-bin.tar.gz.asc
==
--- 
dev/incubator/airflow/1.10.1rc1/apache-airflow-1.10.1rc1+incubating-bin.tar.gz.asc
 (added)
+++ 
dev/incubator/airflow/1.10.1rc1/apache-airflow-1.10.1rc1+incubating-bin.tar.gz.asc
 Tue Nov 13 15:13:59 2018
@@ -0,0 +1,17 @@
+-BEGIN PGP SIGNATURE-
+
+iQJDBAABCAAtFiEEXMrqx1jtZMoyPwU7gHxzGoyCoJUFAlvq6ZYPHGFzaEBhcGFj
+aGUub3JnAAoJEIB8cxqMgqCVqZsP/jNva2N/GjQhSXinJ17RTo8PjsE7vYP7n9UE
+MpMXGehy9OSPDbfj59+u8JTYQjW3oI3xW6eKoy0jLfaYmIpZaWDmv10tsVRegf2G
+g0n2HNzuf/U0+P1xZSIx7z8twSaHzwqkUvVfLzMQir5ZPVALrYUCnNR2ndSsT2Sw
+bs3rFw6RUa3hDehJb+037ONPqvGCLnLsnp9f3Kn9MRtMJNAN4D34sRtY6e3xFofC
+HY6nMAmTpR53uth2uViYNYUiOm7H0+KH8Gw+brFhYlslm4BnRyv6I8Y+Ajg2QhLf
+VuO69B9f72X6Wpn21eKCZHofNBdrngRWUofHgvDR+O0qH5K6Aa1YNoqriIBjmYFO
+hTQ2+RlnQt9UzbCFTeE8gnvG/hPRBMxXpx4fAvFP6J/4hbv0Mr25Twd7/RlH2kVy
+YOJC753qJom96Hdc+md3AsToBpfxSrji4fkHKl1Js2rmh8BhdXzeATC1YiCLR5vW
+omfIMP8lK0C5ICOs0Llg4anWEUpayrd6kgXrtYTD1V8IkkwD1tI8x/nC5wEXfyv8
+i+zYdLG7lj9lyV75wQNkNKYQXsOoPArciyXVZP5kSHNPr5dqVrCcBbS6AZyjtWer
+DTnpgiDhpa44fViw2CCBosi1wN53YDN0AfaLJenmB55O+YkLV+SFqke3fB5mApNq
+FVCm5aRq
+=AIQy
+-END PGP SIGNATURE-

Added: 
dev/incubator/airflow/1.10.1rc1/apache-airflow-1.10.1rc1+incubating-bin.tar.gz.md5
==
--- 
dev/incubator/airflow/1.10.1rc1/apache-airflow-1.10.1rc1+incubating-bin.tar.gz.md5
 (added)
+++ 
dev/incubator/airflow/1.10.1rc1/apache-airflow-1.10.1rc1+incubating-bin.tar.gz.md5
 Tue Nov 13 15:13:59 2018
@@ -0,0 +1,2 @@
+../airflow-dist/1.10.1rc1/apache-airflow-1.10.1rc1+incubating-bin.tar.gz: 
+1E 63 83 1E C9 70 00 27  53 67 C6 BB 40 A6 3E 4F

Added: 
dev/incubator/airflow/1.10.1rc1/apache-airflow-1.10.1rc1+incubating-bin.tar.gz.sha512
==
--- 
dev/incubator/airflow/1.10.1rc1/apache-airflow-1.10.1rc1+incubating-bin.tar.gz.sha512
 (added)
+++ 
dev/incubator/airflow/1.10.1rc1/apache-airflow-1.10.1rc1+incubating-bin.tar.gz.sha512
 Tue Nov 13 15:13:59 2018
@@ -0,0 +1,3 @@
+../airflow-dist/1.10.1rc1/apache-airflow-1.10.1rc1+incubating-bin.tar.gz: 
+01FCFAA1 B6F2873E 26DEF043 A522BDA4 A7F3C722 827F676E D3236190 54D39E6A 
125CBD99
+ 39596519 5FB85AEE E5A48183 E4308267 45E15196 706C96D0 9DF07ECA

Added: 
dev/incubator/airflow/1.10.1rc1/apache-airflow-1.10.1rc1+incubating-source.tar.gz
==
Binary file - no diff available.

Propchange: 
dev/incubator/airflow/1.10.1rc1/apache-airflow-1.10.1rc1+incubating-source.tar.gz
--
svn:mime-type = application/octet-stream

Added: 
dev/incubator/airflow/1.10.1rc1/apache-airflow-1.10.1rc1+incubating-source.tar.gz.asc
==
--- 
dev/incubator/airflow/1.10.1rc1/apache-airflow-1.10.1rc1+incubating-source.tar.gz.asc
 (added)
+++ 
dev/incubator/airflow/1.10.1rc1/apache-airflow-1.10.1rc1+incubating-source.tar.gz.asc
 Tue Nov 13 15:13:59 2018
@@ -0,0 +1,17 @@
+-BEGIN PGP SIGNATURE-
+
+iQJDBAABCAAtFiEEXMrqx1jtZMoyPwU7gHxzGoyCoJUFAlvq6ZoPHGFzaEBhcGFj
+aGUub3JnAAoJEIB8cxqMgqCVS2cP/i3DSHM2siilGIO7+J+w8y6i/nHk1WQ0WpGL
+BYUCxPPms3HMVxH+Ku5eZatjto8NCpha4wUf/1LgB7gSF0sJzjA9TF5bYTeTpwnQ
+ttaBx6kKdYqUxBh4ir7mtOYny2dXq8TkJi5qvCPV4YFwjypJARIlP1jYKXiwN15S
+ohUiS80mP5nSCg6hqQpMm8t0kOsXh0s3f/j7HbmAOZ/NzJY6N7U4BVBAkBeYW++g
+RUPrNfzOcahyrKgv3N++CDVyoSxOBpSTUf9gP+n3kK7+sDzEKJL/yOI3iJgS0y/l
+zeiDZtivQY8f93gMJ

[jira] [Commented] (AIRFLOW-3323) Support Basic Authentication for Flower

2018-11-13 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-3323?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16685319#comment-16685319
 ] 

ASF GitHub Bot commented on AIRFLOW-3323:
-

ashb closed pull request #4166: [AIRFLOW-3323] Support HTTP basic 
authentication for Airflow Flower
URL: https://github.com/apache/incubator-airflow/pull/4166
 
 
   

This is a PR merged from a forked repository.
As GitHub hides the original diff on merge, it is displayed below for
the sake of provenance:

As this is a foreign pull request (from a fork), the diff is supplied
below (as it won't show otherwise due to GitHub magic):

diff --git a/airflow/bin/cli.py b/airflow/bin/cli.py
index cfc6c6b8d6..5ddac2f886 100644
--- a/airflow/bin/cli.py
+++ b/airflow/bin/cli.py
@@ -1276,6 +1276,10 @@ def flower(args):
 if args.url_prefix:
 url_prefix = '--url-prefix=' + args.url_prefix
 
+basic_auth = ''
+if args.basic_auth:
+basic_auth = '--basic_auth=' + args.basic_auth
+
 flower_conf = ''
 if args.flower_conf:
 flower_conf = '--conf=' + args.flower_conf
@@ -1297,7 +1301,7 @@ def flower(args):
 
 with ctx:
 os.execvp("flower", ['flower', '-b',
- broka, address, port, api, flower_conf, 
url_prefix])
+ broka, address, port, api, flower_conf, 
url_prefix, basic_auth])
 
 stdout.close()
 stderr.close()
@@ -1306,7 +1310,7 @@ def flower(args):
 signal.signal(signal.SIGTERM, sigint_handler)
 
 os.execvp("flower", ['flower', '-b',
- broka, address, port, api, flower_conf, 
url_prefix])
+ broka, address, port, api, flower_conf, 
url_prefix, basic_auth])
 
 
 @cli_utils.action_logging
@@ -1823,6 +1827,12 @@ class CLIFactory(object):
 ("-u", "--url_prefix"),
 default=conf.get('celery', 'FLOWER_URL_PREFIX'),
 help="URL prefix for Flower"),
+'flower_basic_auth': Arg(
+("-ba", "--basic_auth"),
+default=conf.get('celery', 'FLOWER_BASIC_AUTH'),
+help=("Securing Flower with Basic Authentication. "
+  "Accepts user:password pairs separated by a comma. "
+  "Example: flower_basic_auth = 
user1:password1,user2:password2")),
 'task_params': Arg(
 ("-tp", "--task_params"),
 help="Sends a JSON params dict to the task"),
@@ -2070,7 +2080,7 @@ class CLIFactory(object):
 'func': flower,
 'help': "Start a Celery Flower",
 'args': ('flower_hostname', 'flower_port', 'flower_conf', 
'flower_url_prefix',
- 'broker_api', 'pid', 'daemon', 'stdout', 'stderr', 
'log_file'),
+ 'flower_basic_auth', 'broker_api', 'pid', 'daemon', 
'stdout', 'stderr', 'log_file'),
 }, {
 'func': version,
 'help': "Show the version",
diff --git a/airflow/config_templates/default_airflow.cfg 
b/airflow/config_templates/default_airflow.cfg
index 5c2d2e1512..4d73fdf51d 100644
--- a/airflow/config_templates/default_airflow.cfg
+++ b/airflow/config_templates/default_airflow.cfg
@@ -392,6 +392,11 @@ flower_url_prefix =
 # This defines the port that Celery Flower runs on
 flower_port = 
 
+# Securing Flower with Basic Authentication
+# Accepts user:password pairs separated by a comma
+# Example: flower_basic_auth = user1:password1,user2:password2
+flower_basic_auth =
+
 # Default queue that tasks get assigned to and that worker listen on.
 default_queue = default
 
diff --git a/docs/security.rst b/docs/security.rst
index c14cd1c2c3..e332221347 100644
--- a/docs/security.rst
+++ b/docs/security.rst
@@ -402,3 +402,22 @@ not set.
 
 [core]
 default_impersonation = airflow
+
+
+Flower Authentication
+-
+
+Basic authentication for Celery Flower is supported.
+
+You can specify the details either as an optional argument in the Flower 
process launching
+command, or as a configuration item in your ``airflow.cfg``. For both cases, 
please provide
+`user:password` pairs separated by a comma.
+
+.. code-block:: bash
+
+airflow flower --basic_auth=user1:password1,user2:password2
+
+.. code-block:: bash
+
+[celery]
+flower_basic_auth = user1:password1,user2:password2
diff --git a/scripts/ci/kubernetes/kube/configmaps.yaml 
b/scripts/ci/kubernetes/kube/configmaps.yaml
index ab44931e59..93a6364f86 100644
--- a/scripts/ci/kubernetes/kube/configmaps.yaml
+++ b/scripts/ci/kubernetes/kube/configmaps.yaml
@@ -253,6 +253,11 @@ data:
 # This defines the port that Celery Flower runs on
 flower_port = 
 
+# Securing Flower with Basic Authentication
+# Accepts user:password pairs separated by a comma
+# Example: flower_basic_auth = user1:password1,user2:password2
+flower_basic_auth =
+
 # Default queue that 

[jira] [Resolved] (AIRFLOW-3323) Support Basic Authentication for Flower

2018-11-13 Thread Ash Berlin-Taylor (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-3323?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ash Berlin-Taylor resolved AIRFLOW-3323.

   Resolution: Fixed
Fix Version/s: 2.0.0

> Support Basic Authentication for Flower
> ---
>
> Key: AIRFLOW-3323
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3323
> Project: Apache Airflow
>  Issue Type: New Feature
>  Components: celery
>Affects Versions: 1.10.0
>Reporter: Xiaodong DENG
>Assignee: Xiaodong DENG
>Priority: Critical
> Fix For: 2.0.0
>
>
> The current `airflow flower` doesn't come with any authentication. This may 
> expose essential information in an untrusted environment.
> Currently Flower itself supports
>  * HTTP Basic Authentication
>  * Google OAuth 2.0
>  * GitHub OAuth
> Given Flower is not really the most essential component of Airflow, we don't 
> have to support all its authentication methods. But it may be good to at least 
> support Basic Authentication.
>  
> This ticket adds support to Basic Authentication for `Airflow Flower`.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] ashb closed pull request #4166: [AIRFLOW-3323] Support HTTP basic authentication for Airflow Flower

2018-11-13 Thread GitBox
ashb closed pull request #4166: [AIRFLOW-3323] Support HTTP basic 
authentication for Airflow Flower
URL: https://github.com/apache/incubator-airflow/pull/4166
 
 
   

This is a PR merged from a forked repository.
As GitHub hides the original diff on merge, it is displayed below for
the sake of provenance:

As this is a foreign pull request (from a fork), the diff is supplied
below (as it won't show otherwise due to GitHub magic):

diff --git a/airflow/bin/cli.py b/airflow/bin/cli.py
index cfc6c6b8d6..5ddac2f886 100644
--- a/airflow/bin/cli.py
+++ b/airflow/bin/cli.py
@@ -1276,6 +1276,10 @@ def flower(args):
 if args.url_prefix:
 url_prefix = '--url-prefix=' + args.url_prefix
 
+basic_auth = ''
+if args.basic_auth:
+basic_auth = '--basic_auth=' + args.basic_auth
+
 flower_conf = ''
 if args.flower_conf:
 flower_conf = '--conf=' + args.flower_conf
@@ -1297,7 +1301,7 @@ def flower(args):
 
 with ctx:
 os.execvp("flower", ['flower', '-b',
- broka, address, port, api, flower_conf, 
url_prefix])
+ broka, address, port, api, flower_conf, 
url_prefix, basic_auth])
 
 stdout.close()
 stderr.close()
@@ -1306,7 +1310,7 @@ def flower(args):
 signal.signal(signal.SIGTERM, sigint_handler)
 
 os.execvp("flower", ['flower', '-b',
- broka, address, port, api, flower_conf, 
url_prefix])
+ broka, address, port, api, flower_conf, 
url_prefix, basic_auth])
 
 
 @cli_utils.action_logging
@@ -1823,6 +1827,12 @@ class CLIFactory(object):
 ("-u", "--url_prefix"),
 default=conf.get('celery', 'FLOWER_URL_PREFIX'),
 help="URL prefix for Flower"),
+'flower_basic_auth': Arg(
+("-ba", "--basic_auth"),
+default=conf.get('celery', 'FLOWER_BASIC_AUTH'),
+help=("Securing Flower with Basic Authentication. "
+  "Accepts user:password pairs separated by a comma. "
+  "Example: flower_basic_auth = 
user1:password1,user2:password2")),
 'task_params': Arg(
 ("-tp", "--task_params"),
 help="Sends a JSON params dict to the task"),
@@ -2070,7 +2080,7 @@ class CLIFactory(object):
 'func': flower,
 'help': "Start a Celery Flower",
 'args': ('flower_hostname', 'flower_port', 'flower_conf', 
'flower_url_prefix',
- 'broker_api', 'pid', 'daemon', 'stdout', 'stderr', 
'log_file'),
+ 'flower_basic_auth', 'broker_api', 'pid', 'daemon', 
'stdout', 'stderr', 'log_file'),
 }, {
 'func': version,
 'help': "Show the version",
diff --git a/airflow/config_templates/default_airflow.cfg 
b/airflow/config_templates/default_airflow.cfg
index 5c2d2e1512..4d73fdf51d 100644
--- a/airflow/config_templates/default_airflow.cfg
+++ b/airflow/config_templates/default_airflow.cfg
@@ -392,6 +392,11 @@ flower_url_prefix =
 # This defines the port that Celery Flower runs on
 flower_port = 
 
+# Securing Flower with Basic Authentication
+# Accepts user:password pairs separated by a comma
+# Example: flower_basic_auth = user1:password1,user2:password2
+flower_basic_auth =
+
 # Default queue that tasks get assigned to and that worker listen on.
 default_queue = default
 
diff --git a/docs/security.rst b/docs/security.rst
index c14cd1c2c3..e332221347 100644
--- a/docs/security.rst
+++ b/docs/security.rst
@@ -402,3 +402,22 @@ not set.
 
 [core]
 default_impersonation = airflow
+
+
+Flower Authentication
+-
+
+Basic authentication for Celery Flower is supported.
+
+You can specify the details either as an optional argument in the Flower 
process launching
+command, or as a configuration item in your ``airflow.cfg``. For both cases, 
please provide
+`user:password` pairs separated by a comma.
+
+.. code-block:: bash
+
+airflow flower --basic_auth=user1:password1,user2:password2
+
+.. code-block:: bash
+
+[celery]
+flower_basic_auth = user1:password1,user2:password2
diff --git a/scripts/ci/kubernetes/kube/configmaps.yaml 
b/scripts/ci/kubernetes/kube/configmaps.yaml
index ab44931e59..93a6364f86 100644
--- a/scripts/ci/kubernetes/kube/configmaps.yaml
+++ b/scripts/ci/kubernetes/kube/configmaps.yaml
@@ -253,6 +253,11 @@ data:
 # This defines the port that Celery Flower runs on
 flower_port = 
 
+# Securing Flower with Basic Authentication
+# Accepts user:password pairs separated by a comma
+# Example: flower_basic_auth = user1:password1,user2:password2
+flower_basic_auth =
+
 # Default queue that tasks get assigned to and that worker listen on.
 default_queue = default
 


 


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use
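The cli.py change in the diff above appends the new `--basic_auth` flag to the argv handed to `os.execvp`. A condensed sketch of that argv assembly (simplified from the diff; `build_flower_argv` is a hypothetical helper name, not a real Airflow function):

```python
# Condensed sketch of the argv assembly shown in the diff above
# (airflow/bin/cli.py); build_flower_argv is a hypothetical helper name.
def build_flower_argv(broka, address, port, api, flower_conf, url_prefix,
                      basic_auth_value=""):
    # Mirror the diff: only build the flag when a value was supplied,
    # otherwise pass an empty string through to execvp as before.
    basic_auth = ""
    if basic_auth_value:
        basic_auth = "--basic_auth=" + basic_auth_value
    return ["flower", "-b", broka, address, port, api, flower_conf,
            url_prefix, basic_auth]

argv = build_flower_argv("redis://localhost:6379/0", "--address=0.0.0.0",
                         "--port=5555", "", "", "",
                         basic_auth_value="user1:password1,user2:password2")
print(argv[-1])  # --basic_auth=user1:password1,user2:password2
```

Note that, as the review comment below the diff points out, passing the credentials this way leaves them visible in `ps` output, since the Flower process is launched via `os.execvp`.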

[GitHub] ashb commented on a change in pull request #4166: [AIRFLOW-3323] Support HTTP basic authentication for Airflow Flower

2018-11-13 Thread GitBox
ashb commented on a change in pull request #4166: [AIRFLOW-3323] Support HTTP 
basic authentication for Airflow Flower
URL: https://github.com/apache/incubator-airflow/pull/4166#discussion_r233066697
 
 

 ##
 File path: docs/security.rst
 ##
 @@ -402,3 +402,22 @@ not set.
 
 [core]
 default_impersonation = airflow
+
+
+Flower Authentication
+-
+
+Basic authentication for Celery Flower is supported.
+
+You can specify the details either as an optional argument in the Flower 
process launching
+command, or as a configuration item in your ``airflow.cfg``. For both cases, 
please provide
+`user:password` pairs separated by a comma.
+
+.. code-block:: bash
+
+airflow flower --basic_auth=user1:password1,user2:password2
 
 Review comment:
   Fair point.


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] XD-DENG commented on a change in pull request #4166: [AIRFLOW-3323] Support HTTP basic authentication for Airflow Flower

2018-11-13 Thread GitBox
XD-DENG commented on a change in pull request #4166: [AIRFLOW-3323] Support 
HTTP basic authentication for Airflow Flower
URL: https://github.com/apache/incubator-airflow/pull/4166#discussion_r233065097
 
 

 ##
 File path: docs/security.rst
 ##
 @@ -402,3 +402,22 @@ not set.
 
 [core]
 default_impersonation = airflow
+
+
+Flower Authentication
+-
+
+Basic authentication for Celery Flower is supported.
+
+You can specify the details either as an optional argument in the Flower 
process launching
+command, or as a configuration item in your ``airflow.cfg``. For both cases, 
please provide
+`user:password` pairs separated by a comma.
+
+.. code-block:: bash
+
+airflow flower --basic_auth=user1:password1,user2:password2
 
 Review comment:
   In addition, actually both options provided here (specifying the Flower auth 
details as a CLI argument, or a configuration item in `airflow.cfg`) will 
expose the details in `ps` output, given eventually the Flower process is 
launched by `os.execvp` 
(https://github.com/apache/incubator-airflow/blob/6097f829ac5a4442180018ed56fa1b695badb131/airflow/bin/cli.py#L1299-L1300)
   
   But as shared, this is how Flower basic authentication works (implementing 
OAuth for Flower in the Airflow scenario is not really necessary, not to 
mention that Flower in Airflow is "streaking" at this moment)


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] ashb commented on issue #2460: [AIRFLOW-1424] make the next execution date of DAGs visible

2018-11-13 Thread GitBox
ashb commented on issue #2460: [AIRFLOW-1424] make the next execution date of 
DAGs visible
URL: 
https://github.com/apache/incubator-airflow/pull/2460#issuecomment-438284090
 
 
   That'll happen; in this case (since I've already been involved) there's no 
need to rebase to fix them just yet.
   
   I need to take a look at what that function in Scheduler does, and if it's 
actually doing more than we want. For instance if there are too many active DAG 
runs already for that DAG it will return None: do we want None displayed in the 
UI in that case or not?
   
   Also: I need to check whether creating an instance of SchedulerJob will end up 
creating a row in the Job table. (If it does then we don't want that, 
and the logic should be moved elsewhere.)


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services
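The "too many active DAG runs → return None" case described above can be illustrated with a small sketch (hypothetical names; this is not the SchedulerJob function under discussion, just the display logic question it raises):

```python
from datetime import datetime, timedelta

# Illustration of the behaviour discussed above (hypothetical helper, not the
# SchedulerJob method): a fixed-interval schedule yields last_run + interval,
# but yields None when the DAG already has too many active runs -- the case
# where the UI would have to decide how to display "None".
def next_execution_date(last_run, schedule_interval, active_runs, max_active_runs):
    if active_runs >= max_active_runs:
        return None
    return last_run + schedule_interval

nxt = next_execution_date(datetime(2018, 11, 13), timedelta(days=1), 1, 16)
print(nxt)  # 2018-11-14 00:00:00
```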


[GitHub] codecov-io edited a comment on issue #4139: [AIRFLOW-2715] Pick up the region setting while launching Dataflow templates

2018-11-13 Thread GitBox
codecov-io edited a comment on issue #4139: [AIRFLOW-2715] Pick up the region 
setting while launching Dataflow templates
URL: 
https://github.com/apache/incubator-airflow/pull/4139#issuecomment-438279332
 
 
   # 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4139?src=pr&el=h1)
 Report
   > Merging 
[#4139](https://codecov.io/gh/apache/incubator-airflow/pull/4139?src=pr&el=desc)
 into 
[master](https://codecov.io/gh/apache/incubator-airflow/commit/dc0eb58e97178b050b79584f18d8b9bd2c3dea5f?src=pr&el=desc)
 will **increase** coverage by `2.16%`.
   > The diff coverage is `n/a`.
   
   [![Impacted file tree 
graph](https://codecov.io/gh/apache/incubator-airflow/pull/4139/graphs/tree.svg?width=650&token=WdLKlKHOAU&height=150&src=pr)](https://codecov.io/gh/apache/incubator-airflow/pull/4139?src=pr&el=tree)
   
   ```diff
   @@            Coverage Diff             @@
   ##           master    #4139      +/-   ##
   ==========================================
   + Coverage   77.46%   79.62%    +2.16%
   ==========================================
     Files         199      199
     Lines       16272    18638     +2366
   ==========================================
   + Hits        12605    14841     +2236
   - Misses       3667     3797      +130
   ```
   
   
   | [Impacted 
Files](https://codecov.io/gh/apache/incubator-airflow/pull/4139?src=pr&el=tree) 
| Coverage Δ | |
   |---|---|---|
   | 
[airflow/www\_rbac/app.py](https://codecov.io/gh/apache/incubator-airflow/pull/4139/diff?src=pr&el=tree#diff-YWlyZmxvdy93d3dfcmJhYy9hcHAucHk=)
 | `97.95% <0%> (+0.9%)` | :arrow_up: |
   | 
[airflow/www/utils.py](https://codecov.io/gh/apache/incubator-airflow/pull/4139/diff?src=pr&el=tree#diff-YWlyZmxvdy93d3cvdXRpbHMucHk=)
 | `90.32% <0%> (+0.97%)` | :arrow_up: |
   | 
[airflow/www\_rbac/security.py](https://codecov.io/gh/apache/incubator-airflow/pull/4139/diff?src=pr&el=tree#diff-YWlyZmxvdy93d3dfcmJhYy9zZWN1cml0eS5weQ==)
 | `92.61% <0%> (+1.34%)` | :arrow_up: |
   | 
[airflow/models.py](https://codecov.io/gh/apache/incubator-airflow/pull/4139/diff?src=pr&el=tree#diff-YWlyZmxvdy9tb2RlbHMucHk=)
 | `94.3% <0%> (+2.1%)` | :arrow_up: |
   | 
[airflow/security/utils.py](https://codecov.io/gh/apache/incubator-airflow/pull/4139/diff?src=pr&el=tree#diff-YWlyZmxvdy9zZWN1cml0eS91dGlscy5weQ==)
 | `31.25% <0%> (+2.3%)` | :arrow_up: |
   | 
[airflow/www/views.py](https://codecov.io/gh/apache/incubator-airflow/pull/4139/diff?src=pr&el=tree#diff-YWlyZmxvdy93d3cvdmlld3MucHk=)
 | `74.28% <0%> (+5.27%)` | :arrow_up: |
   | 
[airflow/www\_rbac/views.py](https://codecov.io/gh/apache/incubator-airflow/pull/4139/diff?src=pr&el=tree#diff-YWlyZmxvdy93d3dfcmJhYy92aWV3cy5weQ==)
 | `78.05% <0%> (+6.17%)` | :arrow_up: |
   | 
[airflow/www\_rbac/utils.py](https://codecov.io/gh/apache/incubator-airflow/pull/4139/diff?src=pr&el=tree#diff-YWlyZmxvdy93d3dfcmJhYy91dGlscy5weQ==)
 | `75.57% <0%> (+6.62%)` | :arrow_up: |
   
   --
   
   [Continue to review full report at 
Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4139?src=pr&el=continue).
   > **Legend** - [Click here to learn 
more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data`
   > Powered by 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4139?src=pr&el=footer).
 Last update 
[dc0eb58...c1a88a6](https://codecov.io/gh/apache/incubator-airflow/pull/4139?src=pr&el=lastupdated).
 Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services
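As a sanity check on the Codecov report above, the Coverage column is Hits / Lines, truncated to two decimal places (the truncation rather than rounding is an assumption inferred from the reported figures):

```python
# Recompute the Coverage column of the Codecov report above: Hits / Lines,
# truncated (not rounded) to two decimals -- an assumption inferred from the
# figures shown (79.6277...% is reported as 79.62%).
def coverage_pct(hits, lines):
    return (10000 * hits // lines) / 100.0

master = coverage_pct(12605, 16272)   # master branch
pr = coverage_pct(14841, 18638)       # this PR
print(master, pr, round(pr - master, 2))  # 77.46 79.62 2.16
```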


[GitHub] codecov-io commented on issue #4139: [AIRFLOW-2715] Pick up the region setting while launching Dataflow templates

2018-11-13 Thread GitBox
codecov-io commented on issue #4139: [AIRFLOW-2715] Pick up the region setting 
while launching Dataflow templates
URL: 
https://github.com/apache/incubator-airflow/pull/4139#issuecomment-438279332
 
 
   # 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4139?src=pr&el=h1)
 Report
   > Merging 
[#4139](https://codecov.io/gh/apache/incubator-airflow/pull/4139?src=pr&el=desc)
 into 
[master](https://codecov.io/gh/apache/incubator-airflow/commit/dc0eb58e97178b050b79584f18d8b9bd2c3dea5f?src=pr&el=desc)
 will **increase** coverage by `2.16%`.
   > The diff coverage is `n/a`.
   
   [![Impacted file tree 
graph](https://codecov.io/gh/apache/incubator-airflow/pull/4139/graphs/tree.svg?width=650&token=WdLKlKHOAU&height=150&src=pr)](https://codecov.io/gh/apache/incubator-airflow/pull/4139?src=pr&el=tree)
   
   ```diff
   @@            Coverage Diff             @@
   ##           master    #4139      +/-   ##
   ==========================================
   + Coverage   77.46%   79.62%    +2.16%
   ==========================================
     Files         199      199
     Lines       16272    18638     +2366
   ==========================================
   + Hits        12605    14841     +2236
   - Misses       3667     3797      +130
   ```
   
   
   | [Impacted 
Files](https://codecov.io/gh/apache/incubator-airflow/pull/4139?src=pr&el=tree) 
| Coverage Δ | |
   |---|---|---|
   | 
[airflow/www\_rbac/app.py](https://codecov.io/gh/apache/incubator-airflow/pull/4139/diff?src=pr&el=tree#diff-YWlyZmxvdy93d3dfcmJhYy9hcHAucHk=)
 | `97.95% <0%> (+0.9%)` | :arrow_up: |
   | 
[airflow/www/utils.py](https://codecov.io/gh/apache/incubator-airflow/pull/4139/diff?src=pr&el=tree#diff-YWlyZmxvdy93d3cvdXRpbHMucHk=)
 | `90.32% <0%> (+0.97%)` | :arrow_up: |
   | 
[airflow/www\_rbac/security.py](https://codecov.io/gh/apache/incubator-airflow/pull/4139/diff?src=pr&el=tree#diff-YWlyZmxvdy93d3dfcmJhYy9zZWN1cml0eS5weQ==)
 | `92.61% <0%> (+1.34%)` | :arrow_up: |
   | 
[airflow/models.py](https://codecov.io/gh/apache/incubator-airflow/pull/4139/diff?src=pr&el=tree#diff-YWlyZmxvdy9tb2RlbHMucHk=)
 | `94.3% <0%> (+2.1%)` | :arrow_up: |
   | 
[airflow/security/utils.py](https://codecov.io/gh/apache/incubator-airflow/pull/4139/diff?src=pr&el=tree#diff-YWlyZmxvdy9zZWN1cml0eS91dGlscy5weQ==)
 | `31.25% <0%> (+2.3%)` | :arrow_up: |
   | 
[airflow/www/views.py](https://codecov.io/gh/apache/incubator-airflow/pull/4139/diff?src=pr&el=tree#diff-YWlyZmxvdy93d3cvdmlld3MucHk=)
 | `74.28% <0%> (+5.27%)` | :arrow_up: |
   | 
[airflow/www\_rbac/views.py](https://codecov.io/gh/apache/incubator-airflow/pull/4139/diff?src=pr&el=tree#diff-YWlyZmxvdy93d3dfcmJhYy92aWV3cy5weQ==)
 | `78.05% <0%> (+6.17%)` | :arrow_up: |
   | 
[airflow/www\_rbac/utils.py](https://codecov.io/gh/apache/incubator-airflow/pull/4139/diff?src=pr&el=tree#diff-YWlyZmxvdy93d3dfcmJhYy91dGlscy5weQ==)
 | `75.57% <0%> (+6.62%)` | :arrow_up: |
   
   --
   
   [Continue to review full report at 
Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4139?src=pr&el=continue).
   > **Legend** - [Click here to learn 
more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data`
   > Powered by 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4139?src=pr&el=footer).
 Last update 
[dc0eb58...c1a88a6](https://codecov.io/gh/apache/incubator-airflow/pull/4139?src=pr&el=lastupdated).
 Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   




[GitHub] Fokko commented on issue #4178: [AIRFLOW-2779] Add license headers to doc files

2018-11-13 Thread GitBox
Fokko commented on issue #4178: [AIRFLOW-2779] Add license headers to doc files
URL: 
https://github.com/apache/incubator-airflow/pull/4178#issuecomment-438278914
 
 
   Thanks! 👍 




[GitHub] ron819 commented on issue #2460: [AIRFLOW-1424] make the next execution date of DAGs visible

2018-11-13 Thread GitBox
ron819 commented on issue #2460: [AIRFLOW-1424] make the next execution date of 
DAGs visible
URL: 
https://github.com/apache/incubator-airflow/pull/2460#issuecomment-438277662
 
 
   @ultrabug There are conflicts to resolve




[GitHub] ashb commented on issue #4178: [AIRFLOW-2779] Add license headers to doc files

2018-11-13 Thread GitBox
ashb commented on issue #4178: [AIRFLOW-2779] Add license headers to doc files
URL: 
https://github.com/apache/incubator-airflow/pull/4178#issuecomment-438276498
 
 
   I'll do that - it'll conflict and I already know how to resolve it.




[jira] [Commented] (AIRFLOW-2779) Verify and correct licenses

2018-11-13 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-2779?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16685245#comment-16685245
 ] 

ASF GitHub Bot commented on AIRFLOW-2779:
-

Fokko closed pull request #4178: [AIRFLOW-2779] Add license headers to doc files
URL: https://github.com/apache/incubator-airflow/pull/4178
 
 
   


[GitHub] Fokko commented on issue #4178: [AIRFLOW-2779] Add license headers to doc files

2018-11-13 Thread GitBox
Fokko commented on issue #4178: [AIRFLOW-2779] Add license headers to doc files
URL: 
https://github.com/apache/incubator-airflow/pull/4178#issuecomment-438276071
 
 
   Who will cherry-pick it onto v1-10-test? :D




[GitHub] Fokko closed pull request #4178: [AIRFLOW-2779] Add license headers to doc files

2018-11-13 Thread GitBox
Fokko closed pull request #4178: [AIRFLOW-2779] Add license headers to doc files
URL: https://github.com/apache/incubator-airflow/pull/4178
 
 
   

This is a PR merged from a forked repository.
As GitHub hides the original diff on merge, it is displayed below for
the sake of provenance:

As this is a foreign pull request (from a fork), the diff is supplied
below (as it won't show otherwise due to GitHub magic):

diff --git a/.rat-excludes b/.rat-excludes
index b13c1648c4..05f7b98000 100644
--- a/.rat-excludes
+++ b/.rat-excludes
@@ -1,18 +1,21 @@
+# Note: these patterns are applied to single files or directories, not full paths
+# coverage/* will ignore any coverage dir, but airflow/www/static/coverage/* will match nothing
+
 .gitignore
 .gitattributes
+.airflowignore
 .coverage
 .coveragerc
 .codecov.yml
 .eslintrc
 .eslintignore
+.flake8
 .rat-excludes
 requirements.txt
 .*log
 .travis.yml
 .*pyc
 .*lock
-docs
-.*md
 dist
 build
 airflow.egg-info
@@ -20,16 +23,23 @@ apache_airflow.egg-info
 .idea
 metastore_db
 .*sql
+.*svg
 .*csv
 CHANGELOG.txt
 .*zip
 .*lock
+# Generated doc files
+.*html
+_build/*
+_static/*
+.buildinfo
+searchindex.js
+
 # Apache Rat does not detect BSD-2 clause properly
 # it is compatible according to http://www.apache.org/legal/resolved.html#category-a
 kerberos_auth.py
 airflow_api_auth_backend_kerberos_auth_py.html
 licenses/*
-airflow/www/static/docs
 parallel.js
 underscore.js
 jquery.dataTables.min.js
@@ -39,5 +49,12 @@ bootstrap-toggle.min.js
 bootstrap-toggle.min.css
 d3.v3.min.js
 ace.js
-airflow/www_rbac/node_modules
+node_modules/*
 .*json
+coverage/*
+git_version
+flake8_diff.sh
+
+rat-results.txt
+apache-airflow-.*\+incubating-source.tar.gz.*
+apache-airflow-.*\+incubating-bin.tar.gz.*
diff --git a/CONTRIBUTING.md b/CONTRIBUTING.md
index 08c033c028..5fd9bedbfe 100644
--- a/CONTRIBUTING.md
+++ b/CONTRIBUTING.md
@@ -1,3 +1,22 @@
+
+
 # Contributing
 
 Contributions are welcome and are greatly appreciated! Every
diff --git a/README.md b/README.md
index d9eebeda33..85c94aa8b5 100644
--- a/README.md
+++ b/README.md
@@ -1,3 +1,22 @@
+
+
 # Apache Airflow (Incubating)
 
 [![PyPI version](https://badge.fury.io/py/apache-airflow.svg)](https://badge.fury.io/py/apache-airflow)
diff --git a/TODO.md b/TODO.md
index 780ca20722..f49d99ce6f 100644
--- a/TODO.md
+++ b/TODO.md
@@ -1,3 +1,22 @@
+
+
  Roadmap items
 
 * UI page answering "Why isn't this task instance running?"
diff --git a/UPDATING.md b/UPDATING.md
index 909cd2649b..af448cfff8 100644
--- a/UPDATING.md
+++ b/UPDATING.md
@@ -1,3 +1,22 @@
+
+
 # Updating Airflow
 
 This file documents any backwards-incompatible changes in Airflow and
diff --git a/airflow/contrib/example_dags/example_twitter_README.md 
b/airflow/contrib/example_dags/example_twitter_README.md
index 0f3aededd4..67e95581d7 100644
--- a/airflow/contrib/example_dags/example_twitter_README.md
+++ b/airflow/contrib/example_dags/example_twitter_README.md
@@ -1,3 +1,22 @@
+
+
 # Example Twitter DAG
 
 ***Introduction:*** This example dag depicts a typical ETL process and is a 
perfect use case automation scenario for Airflow. Please note that the main 
scripts associated with the tasks are returning None. The purpose of this DAG 
is to demonstrate how to write a functional DAG within Airflow.
diff --git a/airflow/www/templates/airflow/variables/README.md 
b/airflow/www/templates/airflow/variables/README.md
index bf4d80b684..fcd00ad30f 100644
--- a/airflow/www/templates/airflow/variables/README.md
+++ b/airflow/www/templates/airflow/variables/README.md
@@ -1,3 +1,22 @@
+
+
 # Variable Editor
 
 This folder contains forms used to edit values in the "Variable" key-value
diff --git a/dev/README.md b/dev/README.md
index 0b3797ee0a..b9e9138468 100755
--- a/dev/README.md
+++ b/dev/README.md
@@ -1,3 +1,22 @@
+
+
 # Development Tools
 
 ## Airflow Pull Request Tool
diff --git a/docs/api.rst b/docs/api.rst
index 4ea19c8969..194809abc6 100644
--- a/docs/api.rst
+++ b/docs/api.rst
@@ -1,3 +1,20 @@
+..  Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+..http://www.apache.org/licenses/LICENSE-2.0
+
+..  Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+
 Experimental Rest API
 =
 
diff --git a/docs/cli.rst b/docs/cli.rst
index f05cbfbe27..4d68d0eef3 100644
--- a/docs/cli.rst
+++ b/docs/cli.rst
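The comment added at the top of `.rat-excludes` notes that patterns are matched against individual file or directory names, not full paths. A minimal sketch of that matching behaviour (my own illustration, not Apache Rat's actual implementation; patterns simplified to bare names):

```python
import re

def rat_excluded(path, patterns):
    """Return True if any pattern fully matches any single component of path.

    Illustrates Rat-style matching: excludes apply to file and directory
    names, so a nested path like 'airflow/www/static/coverage/*' would never
    match, while a bare name like 'node_modules' matches at any depth.
    """
    components = path.split("/")
    return any(re.fullmatch(pat, comp)
               for pat in patterns
               for comp in components)

patterns = [r"node_modules", r".*pyc", r"_build"]
print(rat_excluded("airflow/www_rbac/node_modules/x.js", patterns))  # True
print(rat_excluded("airflow/models.py", patterns))                   # False
```

This is presumably why the diff rewrites `airflow/www_rbac/node_modules` as `node_modules/*`: only the component name is ever compared.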

[GitHub] ashb commented on issue #4178: [AIRFLOW-2779] Add license headers to doc files

2018-11-13 Thread GitBox
ashb commented on issue #4178: [AIRFLOW-2779] Add license headers to doc files
URL: 
https://github.com/apache/incubator-airflow/pull/4178#issuecomment-438273732
 
 
   @Fokko I've got all the excludes that I needed for passing on Travis, if 
there are extra ones that we need for local dev we can add those later




[GitHub] Fokko commented on a change in pull request #4121: [AIRFLOW-2568] Azure Container Instances operator

2018-11-13 Thread GitBox
Fokko commented on a change in pull request #4121: [AIRFLOW-2568] Azure 
Container Instances operator
URL: https://github.com/apache/incubator-airflow/pull/4121#discussion_r233044770
 
 

 ##
 File path: airflow/contrib/hooks/azure_container_instance_hook.py
 ##
 @@ -0,0 +1,92 @@
+
 
 Review comment:
   Trim this line please.




[jira] [Resolved] (AIRFLOW-3309) Missing Mongo DB connection type

2018-11-13 Thread Fokko Driesprong (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-3309?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Fokko Driesprong resolved AIRFLOW-3309.
---
   Resolution: Fixed
Fix Version/s: (was: 1.10.2)
   2.0.0

> Missing Mongo DB connection type
> 
>
> Key: AIRFLOW-3309
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3309
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: database
>Affects Versions: 1.10.0
>Reporter: John Cheng
>Assignee: John Cheng
>Priority: Minor
> Fix For: 2.0.0
>
>
> Unable to choose Mongo DB on the admin console connection page.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (AIRFLOW-3245) resolve_template_files not processing lists

2018-11-13 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-3245?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16685204#comment-16685204
 ] 

ASF GitHub Bot commented on AIRFLOW-3245:
-

Fokko closed pull request #4086: [AIRFLOW-3245] fix list processing in 
resolve_template_files
URL: https://github.com/apache/incubator-airflow/pull/4086
 
 
   

This is a PR merged from a forked repository.
As GitHub hides the original diff on merge, it is displayed below for
the sake of provenance:

As this is a foreign pull request (from a fork), the diff is supplied
below (as it won't show otherwise due to GitHub magic):

diff --git a/airflow/models.py b/airflow/models.py
index fa33609852..35ece3617f 100755
--- a/airflow/models.py
+++ b/airflow/models.py
@@ -2902,14 +2902,24 @@ def resolve_template_files(self):
 # Getting the content of files for template_field / template_ext
 for attr in self.template_fields:
 content = getattr(self, attr)
-if content is not None and \
-isinstance(content, six.string_types) and \
+if content is None:
+continue
+elif isinstance(content, six.string_types) and \
 any([content.endswith(ext) for ext in self.template_ext]):
 env = self.dag.get_template_env()
 try:
 setattr(self, attr, env.loader.get_source(env, content)[0])
 except Exception as e:
 self.log.exception(e)
+elif isinstance(content, list):
+env = self.dag.get_template_env()
+for i in range(len(content)):
+if isinstance(content[i], six.string_types) and \
+any([content[i].endswith(ext) for ext in self.template_ext]):
+try:
+content[i] = env.loader.get_source(env, content[i])[0]
+except Exception as e:
+self.log.exception(e)
 self.prepare_template()
 
 @property
diff --git a/tests/models.py b/tests/models.py
index b16055b380..9f439605e1 100644
--- a/tests/models.py
+++ b/tests/models.py
@@ -454,6 +454,51 @@ def jinja_udf(name):
 result = task.render_template('', "{{ 'world' | hello}}", dict())
 self.assertEqual(result, 'Hello world')
 
+def test_resolve_template_files_value(self):
+
+with NamedTemporaryFile(suffix='.template') as f:
+f.write('{{ ds }}'.encode('utf8'))
+f.flush()
+template_dir = os.path.dirname(f.name)
+template_file = os.path.basename(f.name)
+
+dag = DAG('test-dag',
+  start_date=DEFAULT_DATE,
+  template_searchpath=template_dir)
+
+with dag:
+task = DummyOperator(task_id='op1')
+
+task.test_field = template_file
+task.template_fields = ('test_field',)
+task.template_ext = ('.template',)
+task.resolve_template_files()
+
+self.assertEqual(task.test_field, '{{ ds }}')
+
+def test_resolve_template_files_list(self):
+
+with NamedTemporaryFile(suffix='.template') as f:
+f = NamedTemporaryFile(suffix='.template')
+f.write('{{ ds }}'.encode('utf8'))
+f.flush()
+template_dir = os.path.dirname(f.name)
+template_file = os.path.basename(f.name)
+
+dag = DAG('test-dag',
+  start_date=DEFAULT_DATE,
+  template_searchpath=template_dir)
+
+with dag:
+task = DummyOperator(task_id='op1')
+
+task.test_field = [template_file, 'some_string']
+task.template_fields = ('test_field',)
+task.template_ext = ('.template',)
+task.resolve_template_files()
+
+self.assertEqual(task.test_field, ['{{ ds }}', 'some_string'])
+
 def test_cycle(self):
 # test empty
 dag = DAG(


 




> resolve_template_files not processing lists
> ---
>
> Key: AIRFLOW-3245
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3245
> Project: Apache Airflow
>  Issue Type: Bug
>Affects Versions: 1.10.0
>Reporter: Marcin Szymanski
>Assignee: Marcin Szymanski
>Priority: Major
> Fix For: 2.0.0
>
>
> Currently, resolve_template_files does not process template field value if it 
> is a list - only strings

[jira] [Resolved] (AIRFLOW-3245) resolve_template_files not processing lists

2018-11-13 Thread Fokko Driesprong (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-3245?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Fokko Driesprong resolved AIRFLOW-3245.
---
   Resolution: Fixed
Fix Version/s: 2.0.0

> resolve_template_files not processing lists
> ---
>
> Key: AIRFLOW-3245
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3245
> Project: Apache Airflow
>  Issue Type: Bug
>Affects Versions: 1.10.0
>Reporter: Marcin Szymanski
>Assignee: Marcin Szymanski
>Priority: Major
> Fix For: 2.0.0
>
>
> Currently, resolve_template_files does not process template field value if it 
> is a list - only strings are processed. Lists are correctly processed at a 
> further stage of template parsing, and are a common use case for submitting 
> multiple SQL statements



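The list handling added in AIRFLOW-3245 can be sketched outside Airflow as a small helper; `load_source` below stands in for Jinja's `env.loader.get_source`, and the function and variable names are illustrative only:

```python
def resolve_field(content, template_ext, load_source):
    """Resolve template files in a field that may be a string or a list.

    Mirrors the list-aware logic of AIRFLOW-3245: strings whose extension
    matches template_ext are replaced by the file's contents; lists are
    processed element by element; anything else is returned unchanged.
    """
    def is_template(value):
        return isinstance(value, str) and any(
            value.endswith(ext) for ext in template_ext)

    if content is None:
        return None
    if is_template(content):
        return load_source(content)
    if isinstance(content, list):
        return [load_source(item) if is_template(item) else item
                for item in content]
    return content

# Example with a plain dict lookup instead of a Jinja environment:
sources = {"query.sql": "SELECT {{ ds }}"}
resolved = resolve_field(["query.sql", "literal"], (".sql",), sources.get)
print(resolved)  # ['SELECT {{ ds }}', 'literal']
```

As the issue notes, this matters for the common case of submitting multiple SQL statements through one templated field.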


[GitHub] Fokko commented on issue #4139: [AIRFLOW-2715] Pick up the region setting while launching Dataflow templates

2018-11-13 Thread GitBox
Fokko commented on issue #4139: [AIRFLOW-2715] Pick up the region setting while 
launching Dataflow templates
URL: 
https://github.com/apache/incubator-airflow/pull/4139#issuecomment-438268600
 
 
   @janhicken Restarting Travis, can you rebase onto master? It might be that 
master was failing at that time.




[GitHub] Fokko closed pull request #4086: [AIRFLOW-3245] fix list processing in resolve_template_files

2018-11-13 Thread GitBox
Fokko closed pull request #4086: [AIRFLOW-3245] fix list processing in 
resolve_template_files
URL: https://github.com/apache/incubator-airflow/pull/4086
 
 
   

This is a PR merged from a forked repository.
As GitHub hides the original diff on merge, it is displayed below for
the sake of provenance:

As this is a foreign pull request (from a fork), the diff is supplied
below (as it won't show otherwise due to GitHub magic):

diff --git a/airflow/models.py b/airflow/models.py
index fa33609852..35ece3617f 100755
--- a/airflow/models.py
+++ b/airflow/models.py
@@ -2902,14 +2902,24 @@ def resolve_template_files(self):
 # Getting the content of files for template_field / template_ext
 for attr in self.template_fields:
 content = getattr(self, attr)
-if content is not None and \
-isinstance(content, six.string_types) and \
+if content is None:
+continue
+elif isinstance(content, six.string_types) and \
 any([content.endswith(ext) for ext in self.template_ext]):
 env = self.dag.get_template_env()
 try:
 setattr(self, attr, env.loader.get_source(env, content)[0])
 except Exception as e:
 self.log.exception(e)
+elif isinstance(content, list):
+env = self.dag.get_template_env()
+for i in range(len(content)):
+if isinstance(content[i], six.string_types) and \
+any([content[i].endswith(ext) for ext in 
self.template_ext]):
+try:
+content[i] = env.loader.get_source(env, 
content[i])[0]
+except Exception as e:
+self.log.exception(e)
 self.prepare_template()
 
 @property
diff --git a/tests/models.py b/tests/models.py
index b16055b380..9f439605e1 100644
--- a/tests/models.py
+++ b/tests/models.py
@@ -454,6 +454,51 @@ def jinja_udf(name):
 result = task.render_template('', "{{ 'world' | hello}}", dict())
 self.assertEqual(result, 'Hello world')
 
+def test_resolve_template_files_value(self):
+
+with NamedTemporaryFile(suffix='.template') as f:
+f.write('{{ ds }}'.encode('utf8'))
+f.flush()
+template_dir = os.path.dirname(f.name)
+template_file = os.path.basename(f.name)
+
+dag = DAG('test-dag',
+  start_date=DEFAULT_DATE,
+  template_searchpath=template_dir)
+
+with dag:
+task = DummyOperator(task_id='op1')
+
+task.test_field = template_file
+task.template_fields = ('test_field',)
+task.template_ext = ('.template',)
+task.resolve_template_files()
+
+self.assertEqual(task.test_field, '{{ ds }}')
+
+def test_resolve_template_files_list(self):
+
+with NamedTemporaryFile(suffix='.template') as f:
+f = NamedTemporaryFile(suffix='.template')
+f.write('{{ ds }}'.encode('utf8'))
+f.flush()
+template_dir = os.path.dirname(f.name)
+template_file = os.path.basename(f.name)
+
+dag = DAG('test-dag',
+  start_date=DEFAULT_DATE,
+  template_searchpath=template_dir)
+
+with dag:
+task = DummyOperator(task_id='op1')
+
+task.test_field = [template_file, 'some_string']
+task.template_fields = ('test_field',)
+task.template_ext = ('.template',)
+task.resolve_template_files()
+
+self.assertEqual(task.test_field, ['{{ ds }}', 'some_string'])
+
 def test_cycle(self):
 # test empty
 dag = DAG(


 




[GitHub] Fokko commented on issue #4090: [AIRFLOW-3250] Fix for Redis Hook for not authorised connection calls

2018-11-13 Thread GitBox
Fokko commented on issue #4090: [AIRFLOW-3250] Fix for Redis Hook for not 
authorised connection calls
URL: 
https://github.com/apache/incubator-airflow/pull/4090#issuecomment-438259430
 
 
   @smentek If you're interested, you can also test against a real Redis by 
adding it to the docker-compose: 
https://github.com/apache/incubator-airflow/blob/master/scripts/ci/docker-compose.yml
   What do you think?




[GitHub] codecov-io edited a comment on issue #4178: [AIRFLOW-2779] Add license headers to doc files

2018-11-13 Thread GitBox
codecov-io edited a comment on issue #4178: [AIRFLOW-2779] Add license headers 
to doc files
URL: 
https://github.com/apache/incubator-airflow/pull/4178#issuecomment-438041046
 
 
   # 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4178?src=pr&el=h1)
 Report
   > Merging 
[#4178](https://codecov.io/gh/apache/incubator-airflow/pull/4178?src=pr&el=desc)
 into 
[master](https://codecov.io/gh/apache/incubator-airflow/commit/1852afe772dc53901bd60bd86c4f1c205195ca4b?src=pr&el=desc)
 will **increase** coverage by `0.01%`.
   > The diff coverage is `n/a`.
   
   [![Impacted file tree 
graph](https://codecov.io/gh/apache/incubator-airflow/pull/4178/graphs/tree.svg?width=650&token=WdLKlKHOAU&height=150&src=pr)](https://codecov.io/gh/apache/incubator-airflow/pull/4178?src=pr&el=tree)
   
   ```diff
   @@            Coverage Diff            @@
   ##           master    #4178     +/-   ##
   =========================================
   + Coverage   77.64%   77.65%   +0.01%    
   =========================================
     Files         199      199             
     Lines       16277    16277             
   =========================================
   + Hits        12638    12640       +2    
   + Misses       3639     3637       -2    
   ```
   
   
   | [Impacted 
Files](https://codecov.io/gh/apache/incubator-airflow/pull/4178?src=pr&el=tree) 
| Coverage Δ | |
   |---|---|---|
   | 
[airflow/models.py](https://codecov.io/gh/apache/incubator-airflow/pull/4178/diff?src=pr&el=tree#diff-YWlyZmxvdy9tb2RlbHMucHk=)
 | `92.12% <0%> (+0.04%)` | :arrow_up: |
   | 
[airflow/configuration.py](https://codecov.io/gh/apache/incubator-airflow/pull/4178/diff?src=pr&el=tree#diff-YWlyZmxvdy9jb25maWd1cmF0aW9uLnB5)
 | `89.05% <0%> (+0.36%)` | :arrow_up: |
   
   --
   
   [Continue to review full report at 
Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4178?src=pr&el=continue).
   > **Legend** - [Click here to learn 
more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data`
   > Powered by 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4178?src=pr&el=footer).
 Last update 
[1852afe...449533f](https://codecov.io/gh/apache/incubator-airflow/pull/4178?src=pr&el=lastupdated).
 Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   




[GitHub] phani8996 commented on a change in pull request #4111: [AIRFLOW-3266] Add AWS Athena Operator and hook

2018-11-13 Thread GitBox
phani8996 commented on a change in pull request #4111: [AIRFLOW-3266] Add AWS 
Athena Operator and hook
URL: https://github.com/apache/incubator-airflow/pull/4111#discussion_r233023537
 
 

 ##
 File path: airflow/contrib/operators/aws_athena_operator.py
 ##
 @@ -0,0 +1,98 @@
+# -*- coding: utf-8 -*-
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+#
+
+from uuid import uuid4
+
+from airflow.models import BaseOperator
+from airflow.utils.decorators import apply_defaults
+from airflow.contrib.hooks.aws_athena_hook import AWSAthenaHook
+
+
+class AWSAthenaOperator(BaseOperator):
+"""
+Airflow operator to run presto queries on athena.
+
+:param query: Presto query to be run on athena. (templated)
+:type query: str
+:param database: Database to select. (templated)
+:type database: str
+:param output_location: s3 path to write the query results into. (templated)
+:type output_location: str
+:param aws_conn_id: aws connection to use.
+:type aws_conn_id: str
+:param sleep_time: Time to wait between two consecutive calls to check query status on athena
+:type sleep_time: int
+"""
+
+ui_color = '#44b5e2'
+template_fields = ('query', 'database', 'output_location')
+
+@apply_defaults
+def __init__(self, query, database, output_location, aws_conn_id='aws_default', client_request_token=None,
+ query_execution_context=None, result_configuration=None, sleep_time=30, *args, **kwargs):
+super(AWSAthenaOperator, self).__init__(*args, **kwargs)
+self.query = query
+self.database = database
+self.output_location = output_location
+self.aws_conn_id = aws_conn_id
+self.client_request_token = client_request_token or str(uuid4())
+self.query_execution_context = query_execution_context or {}
+self.result_configuration = result_configuration or {}
+self.sleep_time = sleep_time
+self.query_execution_id = None
+self.hook = None
+
+def get_hook(self):
+return AWSAthenaHook(self.aws_conn_id, self.sleep_time)
+
+def execute(self, context):
+"""
+Run Presto Query on Athena
+"""
+self.hook = self.get_hook()
+self.hook.get_conn()
+
+self.query_execution_context['Database'] = self.database
+self.result_configuration['OutputLocation'] = self.output_location
 
 Review comment:
   @ashb can you review the final changes and approve this PR? It's been open for a 
long time. 
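The `client_request_token` default in the quoted constructor uses a plain `or` fallback to a random UUID, so every operator instance gets an idempotency token even when none is passed. A minimal standalone sketch of that idiom (the `make_token` helper is hypothetical, for illustration only):

```python
from uuid import uuid4

def make_token(client_request_token=None):
    # Keep a caller-supplied token; otherwise fall back to a fresh UUID4
    # string, mirroring `client_request_token or str(uuid4())` above.
    return client_request_token or str(uuid4())

make_token("my-token")  # -> "my-token" (caller-supplied token wins)
make_token()            # -> a random 36-character UUID string
```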


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Updated] (AIRFLOW-3335) Bulk backfill & faster mark_success

2018-11-13 Thread belgacea (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-3335?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

belgacea updated AIRFLOW-3335:
--
Description: 
I'm using Airflow to schedule Spark jobs and I wanted to be able to `backfill` 
a large time range (to catch up dags that are far behind their schedules). I 
used the `backfill` command with the `mark_success` argument and I thought all 
the dagruns would be marked as successful in a second, but Airflow seems to 
mark dags one by one (with some parallelization, using the 
`parallelism`/`dag_concurrency` configuration). Each dag takes approximately 2 
seconds to be marked as successful, which makes the backfill process really 
slow for a large time range (or for small `time intervals`).

Is there a way to speed up the `mark_success` backfilling? Is there also a way 
to tell the Airflow scheduler to backfill dags with a single instance per task 
using the specified backfill time range (`start_date` + `end_date`) and then 
mark all dagruns within the time range as successful?

Note: The dag I tried to backfill doesn't `depends_on_past`.

  was:
I'm using Airflow to schedule Spark jobs and I wanted to be able to `backfill` 
a large time range (to catch up dags that are far beyond their schedules). I 
used the `backfill` command with the `mark_success` argument and I was thinking 
that all dagrun will be marked as succeed in a second, but airflow seems to 
mark dags one by one (with some parallelization, using the 
`parallelism`/`dag_concurrency` configuration). Each dag take approximately 2 
seconds to be marked as succeed and this makes the backfill process really slow 
for a large time range (or for small `time intervals`).

Is there a way to speed up the `mark_success` bakfilling ? And also is there a 
way to tell to Airflow scheduler to backfill dags with a single instance per 
task using the specified backfill time range (`start_date` + `end_date`) and 
then mark as succeed all dagruns within the time range ? 

Note : The dag I tried to backfill doesn't `depends_on_past`.


> Bulk backfill & faster mark_success
> ---
>
> Key: AIRFLOW-3335
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3335
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: backfill
>Reporter: belgacea
>Priority: Major
>  Labels: features, performance
>
> I'm using Airflow to schedule Spark jobs and I wanted to be able to 
> `backfill` a large time range (to catch up dags that are far behind their 
> schedules). I used the `backfill` command with the `mark_success` argument 
> and I thought all the dagruns would be marked as successful in a second, but 
> Airflow seems to mark dags one by one (with some parallelization, using the 
> `parallelism`/`dag_concurrency` configuration). Each dag takes approximately 2 
> seconds to be marked as successful, which makes the backfill process really 
> slow for a large time range (or for small `time intervals`).
> Is there a way to speed up the `mark_success` backfilling? Is there also a 
> way to tell the Airflow scheduler to backfill dags with a single instance 
> per task using the specified backfill time range (`start_date` + `end_date`) 
> and then mark all dagruns within the time range as successful?
> Note: The dag I tried to backfill doesn't `depends_on_past`.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] ashb commented on issue #4178: [AIRFLOW-2779] Add license headers to doc files

2018-11-13 Thread GitBox
ashb commented on issue #4178: [AIRFLOW-2779] Add license headers to doc files
URL: 
https://github.com/apache/incubator-airflow/pull/4178#issuecomment-438236380
 
 
   Tests failed because I applied the diff from 1-10-test to master wrong, so 
rat didn't run. Take two.




[jira] [Commented] (AIRFLOW-3334) Eliminate need for "Troubleshooting: Jinja template not found... Add a space after the script name when directly calling a Bash script with the bash_command argument.

2018-11-13 Thread Steven Ramey (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-3334?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16685069#comment-16685069
 ] 

Steven Ramey commented on AIRFLOW-3334:
---

Thanks for getting back so quickly. That would be excellent. 

> Eliminate need for "Troubleshooting: Jinja template not found... Add a space 
> after the script name when directly calling a Bash script with the 
> bash_command argument. 
> ---
>
> Key: AIRFLOW-3334
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3334
> Project: Apache Airflow
>  Issue Type: Wish
>Reporter: Steven Ramey
>Priority: Trivial
>
> I've been training some colleagues at my work on Airflow. A problematic and 
> silly thing to have to go over in a tutorial is the known error: "Jinja 
> template not found."
> https://airflow.apache.org/howto/operator.html?highlight=operator#jinja-template-not-found
> I have no idea how much work goes into it, but would it be possible to 
> eliminate this quirk of Airflow where a trailing space at the end of the 
> string is necessary, so the BashOperator simply reads the string correctly, 
> space or not? Thanks.





[jira] [Created] (AIRFLOW-3335) Bulk backfill & faster mark_success

2018-11-13 Thread belgacea (JIRA)
belgacea created AIRFLOW-3335:
-

 Summary: Bulk backfill & faster mark_success
 Key: AIRFLOW-3335
 URL: https://issues.apache.org/jira/browse/AIRFLOW-3335
 Project: Apache Airflow
  Issue Type: Improvement
  Components: backfill
Reporter: belgacea


I'm using Airflow to schedule Spark jobs and I wanted to be able to `backfill` 
a large time range (to catch up dags that are far behind their schedules). I 
used the `backfill` command with the `mark_success` argument and I thought all 
the dagruns would be marked as successful in a second, but Airflow seems to 
mark dags one by one (with some parallelization, using the 
`parallelism`/`dag_concurrency` configuration). Each dag takes approximately 2 
seconds to be marked as successful, which makes the backfill process really 
slow for a large time range (or for small `time intervals`).

Is there a way to speed up the `mark_success` backfilling? Is there also a way 
to tell the Airflow scheduler to backfill dags with a single instance per task 
using the specified backfill time range (`start_date` + `end_date`) and then 
mark all dagruns within the time range as successful?

Note: The dag I tried to backfill doesn't `depends_on_past`.
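The `mark_success` backfill described here is driven from the CLI; in Airflow 1.x the invocation looks roughly like the following (the dag id and dates are placeholders):

```bash
# Backfill a date range, marking each task instance as successful
# instead of actually running it (-m / --mark_success).
airflow backfill my_spark_dag -s 2018-01-01 -e 2018-06-30 -m
```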





[jira] [Commented] (AIRFLOW-3334) Eliminate need for "Troubleshooting: Jinja template not found... Add a space after the script name when directly calling a Bash script with the bash_command argument.

2018-11-13 Thread Ash Berlin-Taylor (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-3334?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16685065#comment-16685065
 ] 

Ash Berlin-Taylor commented on AIRFLOW-3334:


Oh yes I can see that being annoying.

I wonder if we can "simply" handle this by catching the error and not trying to 
template the script in that case, though I'm not sure whether that would break anything.

> Eliminate need for "Troubleshooting: Jinja template not found... Add a space 
> after the script name when directly calling a Bash script with the 
> bash_command argument. 
> ---
>
> Key: AIRFLOW-3334
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3334
> Project: Apache Airflow
>  Issue Type: Wish
>Reporter: Steven Ramey
>Priority: Trivial
>
> I've been training some colleagues at my work on Airflow. A problematic and 
> silly thing to have to go over in a tutorial is the known error: "Jinja 
> template not found."
> https://airflow.apache.org/howto/operator.html?highlight=operator#jinja-template-not-found
> I have no idea how much work goes into it, but would it be possible to 
> eliminate this quirk of Airflow where a trailing space at the end of the 
> string is necessary, so the BashOperator simply reads the string correctly, 
> space or not? Thanks.





[GitHub] ckljohn commented on a change in pull request #4111: [AIRFLOW-3266] Add AWS Athena Operator and hook

2018-11-13 Thread GitBox
ckljohn commented on a change in pull request #4111: [AIRFLOW-3266] Add AWS 
Athena Operator and hook
URL: https://github.com/apache/incubator-airflow/pull/4111#discussion_r233000355
 
 

 ##
 File path: airflow/contrib/operators/aws_athena_operator.py
 ##
 @@ -0,0 +1,98 @@
+self.query_execution_context['Database'] = self.database
+self.result_configuration['OutputLocation'] = self.output_location
 
 Review comment:
   Just thought it should be done as early as it can be. Anyway, I can handle it.




[jira] [Created] (AIRFLOW-3334) Eliminate need for "Troubleshooting: Jinja template not found... Add a space after the script name when directly calling a Bash script with the bash_command argument.

2018-11-13 Thread Steven Ramey (JIRA)
Steven Ramey created AIRFLOW-3334:
-

 Summary: Eliminate need for "Troubleshooting: Jinja template not 
found... Add a space after the script name when directly calling a Bash script 
with the bash_command argument. 
 Key: AIRFLOW-3334
 URL: https://issues.apache.org/jira/browse/AIRFLOW-3334
 Project: Apache Airflow
  Issue Type: Wish
Reporter: Steven Ramey


I've been training some colleagues at my work on Airflow. A problematic and 
silly thing to have to go over in a tutorial is the known error: "Jinja 
template not found."

https://airflow.apache.org/howto/operator.html?highlight=operator#jinja-template-not-found

I have no idea how much work goes into it, but would it be possible to 
eliminate this quirk of Airflow where a trailing space at the end of the 
string is necessary, so the BashOperator simply reads the string correctly, 
space or not? Thanks.
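The trailing-space requirement comes from how the BashOperator decides whether `bash_command` is an inline command or a Jinja template file: strings ending in a templated extension such as `.sh` are loaded as template files. A simplified, self-contained sketch of that heuristic (not Airflow's actual code):

```python
TEMPLATE_EXT = ('.sh', '.bash')  # BashOperator's templated file extensions

def treated_as_template_file(bash_command):
    # Simplified version of the check: a command string that *ends* with a
    # templated extension is resolved as a Jinja template file, which raises
    # "Jinja template not found" if no such file exists. A trailing space
    # defeats the endswith() check, so the string is executed verbatim.
    return bash_command.endswith(TEMPLATE_EXT)

treated_as_template_file("run_job.sh")   # True  -> looked up as a template
treated_as_template_file("run_job.sh ")  # False -> executed verbatim
```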





[GitHub] ckljohn commented on a change in pull request #4111: [AIRFLOW-3266] Add AWS Athena Operator and hook

2018-11-13 Thread GitBox
ckljohn commented on a change in pull request #4111: [AIRFLOW-3266] Add AWS 
Athena Operator and hook
URL: https://github.com/apache/incubator-airflow/pull/4111#discussion_r232998263
 
 

 ##
 File path: airflow/contrib/operators/aws_athena_operator.py
 ##
 @@ -0,0 +1,98 @@
+self.query_execution_context['Database'] = self.database
+self.result_configuration['OutputLocation'] = self.output_location
 
 Review comment:
   I want to run an Athena query in `pre_execute`, before `execute`. 




[GitHub] XD-DENG commented on a change in pull request #4166: [AIRFLOW-3323] Support HTTP basic authentication for Airflow Flower

2018-11-13 Thread GitBox
XD-DENG commented on a change in pull request #4166: [AIRFLOW-3323] Support 
HTTP basic authentication for Airflow Flower
URL: https://github.com/apache/incubator-airflow/pull/4166#discussion_r232996615
 
 

 ##
 File path: docs/security.rst
 ##
 @@ -402,3 +402,22 @@ not set.
 
 [core]
 default_impersonation = airflow
+
+
+Flower Authentication
+-
+
+Basic authentication for Celery Flower is supported.
+
+You can specify the details either as an optional argument to the command that 
launches the Flower process, or as a configuration item in your ``airflow.cfg``. 
In both cases, provide ``user:password`` pairs separated by commas.
+
+.. code-block:: bash
+
+airflow flower --basic_auth=user1:password1,user2:password2
 
 Review comment:
   Nice point! It's the raw password, though.
   
   The argument I would make is that this is how Flower itself works (please refer 
to the Flower doc https://flower.readthedocs.io/en/latest/auth.html).
   
   Whether we're running Flower for Airflow or for another Celery project, as 
long as we're using HTTP basic authentication, everything will be exposed 
in ps output.
   
   I think we are assuming that nobody except Admin or Op should have access to 
the server or pod running the Flower process, so the ps output is also reasonably safe.
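For the configuration-file route mentioned in the quoted doc change, the equivalent ``airflow.cfg`` entry would presumably look like the following (the ``flower_basic_auth`` key under ``[celery]`` is an assumption based on this PR, not documented behaviour at the time):

```ini
[celery]
# user:password pairs, comma-separated, same format as the CLI flag
flower_basic_auth = user1:password1,user2:password2
```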




[GitHub] ashb commented on a change in pull request #4111: [AIRFLOW-3266] Add AWS Athena Operator and hook

2018-11-13 Thread GitBox
ashb commented on a change in pull request #4111: [AIRFLOW-3266] Add AWS Athena 
Operator and hook
URL: https://github.com/apache/incubator-airflow/pull/4111#discussion_r232994644
 
 

 ##
 File path: airflow/contrib/operators/aws_athena_operator.py
 ##
 @@ -0,0 +1,98 @@
+self.query_execution_context['Database'] = self.database
+self.result_configuration['OutputLocation'] = self.output_location
 
 Review comment:
   @ckljohn I don't understand what you mean, sorry.




[GitHub] ashb commented on a change in pull request #4166: [AIRFLOW-3323] Support HTTP basic authentication for Airflow Flower

2018-11-13 Thread GitBox
ashb commented on a change in pull request #4166: [AIRFLOW-3323] Support HTTP 
basic authentication for Airflow Flower
URL: https://github.com/apache/incubator-airflow/pull/4166#discussion_r232993741
 
 

 ##
 File path: docs/security.rst
 ##
 @@ -402,3 +402,22 @@ not set.
 
+airflow flower --basic_auth=user1:password1,user2:password2
 
 Review comment:
   The CLI option should probably be avoided as it would show up in `ps` output.
   
   Is it also the raw password, or is it the salted/hashed output from 
`htpasswd` or similar?




[GitHub] ckljohn commented on a change in pull request #4111: [AIRFLOW-3266] Add AWS Athena Operator and hook

2018-11-13 Thread GitBox
ckljohn commented on a change in pull request #4111: [AIRFLOW-3266] Add AWS 
Athena Operator and hook
URL: https://github.com/apache/incubator-airflow/pull/4111#discussion_r232993497
 
 

 ##
 File path: airflow/contrib/operators/aws_athena_operator.py
 ##
 @@ -0,0 +1,98 @@
+self.query_execution_context['Database'] = self.database
+self.result_configuration['OutputLocation'] = self.output_location
 
 Review comment:
   Originally, I wanted to extend the operator to run pre- and post-SQL.




[GitHub] ashb commented on a change in pull request #4111: [AIRFLOW-3266] Add AWS Athena Operator and hook

2018-11-13 Thread GitBox
ashb commented on a change in pull request #4111: [AIRFLOW-3266] Add AWS Athena 
Operator and hook
URL: https://github.com/apache/incubator-airflow/pull/4111#discussion_r232992126
 
 

 ##
 File path: airflow/contrib/operators/aws_athena_operator.py
 ##
 @@ -0,0 +1,98 @@
+self.query_execution_context['Database'] = self.database
+self.result_configuration['OutputLocation'] = self.output_location
 
 Review comment:
   It's probably worth adding a comment explaining why this is done here and not in 
`__init__`, to avoid this question in the future.
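The likely reason the dicts are assembled in `execute()` rather than `__init__()` is that `database` and `output_location` are in `template_fields`: at construction time they may still hold unrendered Jinja strings, and rendering only happens just before execution. A simplified stand-in (not Airflow's real rendering machinery) showing why the assignment has to wait:

```python
class FakeAthenaOperator:
    def __init__(self, database):
        # At construction time, templated fields may still be raw Jinja
        # strings; building request dicts here would freeze the raw value.
        self.database = database
        self.query_execution_context = {}

    def render_templates(self, context):
        # Stand-in for Airflow's template-rendering pass, which runs
        # after __init__ but before execute.
        self.database = self.database.replace("{{ ds }}", context["ds"])

    def execute(self, context):
        self.render_templates(context)
        # Safe now: templated fields have been rendered.
        self.query_execution_context["Database"] = self.database
        return self.query_execution_context

op = FakeAthenaOperator("logs_{{ ds }}")
op.execute({"ds": "2018-11-13"})  # -> {'Database': 'logs_2018-11-13'}
```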




[GitHub] ashb commented on issue #4178: [AIRFLOW-2779] Add license headers to doc files

2018-11-13 Thread GitBox
ashb commented on issue #4178: [AIRFLOW-2779] Add license headers to doc files
URL: 
https://github.com/apache/incubator-airflow/pull/4178#issuecomment-438226413
 
 
   @Fokko Where is that message from? I have only been checking the licenses 
against files in git (i.e. in a clean checkout)




[GitHub] ashb commented on issue #4178: [AIRFLOW-2779] Add license headers to doc files

2018-11-13 Thread GitBox
ashb commented on issue #4178: [AIRFLOW-2779] Add license headers to doc files
URL: 
https://github.com/apache/incubator-airflow/pull/4178#issuecomment-438224554
 
 
   I'm updating the .rat-excludes to not ignore docs or md files now




[GitHub] XD-DENG commented on issue #4166: [AIRFLOW-3323] Support HTTP basic authentication for Airflow Flower

2018-11-13 Thread GitBox
XD-DENG commented on issue #4166: [AIRFLOW-3323] Support HTTP basic 
authentication for Airflow Flower
URL: 
https://github.com/apache/incubator-airflow/pull/4166#issuecomment-438221398
 
 
   @ashb @Fokko @kaxil a gentle ping, PTAL




[GitHub] phani8996 commented on a change in pull request #4111: [AIRFLOW-3266] Add AWS Athena Operator and hook

2018-11-13 Thread GitBox
phani8996 commented on a change in pull request #4111: [AIRFLOW-3266] Add AWS 
Athena Operator and hook
URL: https://github.com/apache/incubator-airflow/pull/4111#discussion_r232985039
 
 

 ##
 File path: airflow/contrib/operators/aws_athena_operator.py
 ##
 @@ -0,0 +1,98 @@
+# -*- coding: utf-8 -*-
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+#
+
+from uuid import uuid4
+
+from airflow.models import BaseOperator
+from airflow.utils.decorators import apply_defaults
+from airflow.contrib.hooks.aws_athena_hook import AWSAthenaHook
+
+
+class AWSAthenaOperator(BaseOperator):
+    """
+    Airflow operator to run Presto queries on Athena.
+
+    :param query: Presto query to be run on Athena. (templated)
+    :type query: str
+    :param database: Database to select. (templated)
+    :type database: str
+    :param output_location: S3 path to write the query results into. (templated)
+    :type output_location: str
+    :param aws_conn_id: AWS connection to use.
+    :type aws_conn_id: str
+    :param sleep_time: Time to wait between two consecutive calls to check query status on Athena
+    :type sleep_time: int
+    """
+
+    ui_color = '#44b5e2'
+    template_fields = ('query', 'database', 'output_location')
+
+    @apply_defaults
+    def __init__(self, query, database, output_location, aws_conn_id='aws_default',
+                 client_request_token=None, query_execution_context=None,
+                 result_configuration=None, sleep_time=30, *args, **kwargs):
+        super(AWSAthenaOperator, self).__init__(*args, **kwargs)
+        self.query = query
+        self.database = database
+        self.output_location = output_location
+        self.aws_conn_id = aws_conn_id
+        self.client_request_token = client_request_token or str(uuid4())
+        self.query_execution_context = query_execution_context or {}
+        self.result_configuration = result_configuration or {}
+        self.sleep_time = sleep_time
+        self.query_execution_id = None
+        self.hook = None
+
+    def get_hook(self):
+        return AWSAthenaHook(self.aws_conn_id, self.sleep_time)
+
+    def execute(self, context):
+        """
+        Run Presto Query on Athena
+        """
+        self.hook = self.get_hook()
+        self.hook.get_conn()
+
+        self.query_execution_context['Database'] = self.database
+        self.result_configuration['OutputLocation'] = self.output_location
 
 Review comment:
   As advised by @ashb connections are not supposed to be initiated in 
`__init__`. For the above operator, `database` and `output_location` are 
template_fields and templating will be applied after `__init__` and before 
`execute`.
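The ordering described above (templates rendered after `__init__` and before `execute`) can be illustrated with a toy, Airflow-free sketch. The class names `MiniOperator`/`AthenaLike` and the `$ds`-style substitution are hypothetical stand-ins for Airflow's `BaseOperator` and Jinja templating, not the real implementation:

```python
from string import Template


class MiniOperator:
    """Toy stand-in for BaseOperator: renders template_fields
    after __init__ and before execute(), mirroring Airflow's order."""
    template_fields = ()

    def render(self, context):
        # Templating happens here. Any value derived from a templated
        # attribute inside __init__ would still hold the raw "$ds"-style
        # string, which is why derived values are computed in execute().
        for name in self.template_fields:
            raw = getattr(self, name)
            setattr(self, name, Template(raw).safe_substitute(context))

    def run(self, context):
        self.render(context)
        return self.execute(context)


class AthenaLike(MiniOperator):
    template_fields = ('database', 'output_location')

    def __init__(self, database, output_location):
        self.database = database
        self.output_location = output_location
        self.query_execution_context = {}

    def execute(self, context):
        # Safe: templated attributes are rendered by now.
        self.query_execution_context['Database'] = self.database
        return self.query_execution_context


op = AthenaLike('logs_$ds', 's3://bucket/$ds/')
result = op.run({'ds': '2018-11-13'})
print(result)  # {'Database': 'logs_2018-11-13'}
```

Moving the `query_execution_context['Database'] = ...` assignment into `__init__` in this sketch would capture the unrendered `logs_$ds` string, which is exactly the problem the review comment describes.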




[GitHub] ckljohn commented on a change in pull request #4111: [AIRFLOW-3266] Add AWS Athena Operator and hook

2018-11-13 Thread GitBox
ckljohn commented on a change in pull request #4111: [AIRFLOW-3266] Add AWS 
Athena Operator and hook
URL: https://github.com/apache/incubator-airflow/pull/4111#discussion_r232983074
 
 

 ##
 File path: airflow/contrib/operators/aws_athena_operator.py
 ##
 @@ -0,0 +1,98 @@
+# -*- coding: utf-8 -*-
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+#
+
+from uuid import uuid4
+
+from airflow.models import BaseOperator
+from airflow.utils.decorators import apply_defaults
+from airflow.contrib.hooks.aws_athena_hook import AWSAthenaHook
+
+
+class AWSAthenaOperator(BaseOperator):
+    """
+    Airflow operator to run Presto queries on Athena.
+
+    :param query: Presto query to be run on Athena. (templated)
+    :type query: str
+    :param database: Database to select. (templated)
+    :type database: str
+    :param output_location: S3 path to write the query results into. (templated)
+    :type output_location: str
+    :param aws_conn_id: AWS connection to use.
+    :type aws_conn_id: str
+    :param sleep_time: Time to wait between two consecutive calls to check query status on Athena
+    :type sleep_time: int
+    """
+
+    ui_color = '#44b5e2'
+    template_fields = ('query', 'database', 'output_location')
+
+    @apply_defaults
+    def __init__(self, query, database, output_location, aws_conn_id='aws_default',
+                 client_request_token=None, query_execution_context=None,
+                 result_configuration=None, sleep_time=30, *args, **kwargs):
+        super(AWSAthenaOperator, self).__init__(*args, **kwargs)
+        self.query = query
+        self.database = database
+        self.output_location = output_location
+        self.aws_conn_id = aws_conn_id
+        self.client_request_token = client_request_token or str(uuid4())
+        self.query_execution_context = query_execution_context or {}
+        self.result_configuration = result_configuration or {}
+        self.sleep_time = sleep_time
+        self.query_execution_id = None
+        self.hook = None
+
+    def get_hook(self):
+        return AWSAthenaHook(self.aws_conn_id, self.sleep_time)
+
+    def execute(self, context):
+        """
+        Run Presto Query on Athena
+        """
+        self.hook = self.get_hook()
+        self.hook.get_conn()
+
+        self.query_execution_context['Database'] = self.database
+        self.result_configuration['OutputLocation'] = self.output_location
 
 Review comment:
   L72-73 should be in `__init__`




[jira] [Updated] (AIRFLOW-3275) Google Cloud SQL Query operator

2018-11-13 Thread Kaxil Naik (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-3275?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Kaxil Naik updated AIRFLOW-3275:

Priority: Trivial  (was: Major)

> Google Cloud SQL Query operator
> ---
>
> Key: AIRFLOW-3275
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3275
> Project: Apache Airflow
>  Issue Type: New Feature
>  Components: contrib
>Reporter: Jarek Potiuk
>Priority: Trivial
>
> Operator that performs a DDL or DML SQL query in a Google Cloud SQL instance. 
> DQL (retrieving data from Google Cloud SQL) is not supported - you may run 
> SELECT queries, but their results are discarded.
> You should be able to specify various connectivity methods to connect to a 
> running instance - from a plain Public IP connection, through Public IP with 
> SSL, to both TCP and socket connections via the Cloud SQL Proxy. The proxy 
> should be downloaded and started/stopped dynamically as needed by the operator.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] kaxil commented on issue #4179: [AIRFLOW-3332] Add insert_all to allow inserting rows into BigQuery table

2018-11-13 Thread GitBox
kaxil commented on issue #4179: [AIRFLOW-3332] Add insert_all to allow 
inserting rows into BigQuery table
URL: 
https://github.com/apache/incubator-airflow/pull/4179#issuecomment-438203471
 
 
   Please add some mock tests and follow the commit guidelines




[GitHub] kaxil closed pull request #4180: Add Iflix - We are using Airflow

2018-11-13 Thread GitBox
kaxil closed pull request #4180: Add Iflix - We are using Airflow
URL: https://github.com/apache/incubator-airflow/pull/4180
 
 
   

This is a PR merged from a forked repository.
As GitHub hides the original diff on merge, it is displayed below for
the sake of provenance:

As this is a foreign pull request (from a fork), the diff is supplied
below (as it won't show otherwise due to GitHub magic):

diff --git a/README.md b/README.md
index d9eebeda33..a4b2c18560 100644
--- a/README.md
+++ b/README.md
@@ -192,6 +192,7 @@ Currently **officially** using Airflow:
 1. [Hootsuite](https://github.com/hootsuite)
 1. [Hostnfly](https://www.hostnfly.com/) 
[[@CyrilLeMat](https://github.com/CyrilLeMat) & 
[@pierrechopin](https://github.com/pierrechopin) & 
[@alexisrosuel](https://github.com/alexisrosuel)]
 1. [HotelQuickly](https://github.com/HotelQuickly) 
[[@zinuzoid](https://github.com/zinuzoid)]
+1. [Iflix](https://piay.iflix.com) 
[[@ChaturvediSulabh](https://github.com/ChaturvediSulabh)]
 1. [IFTTT](https://www.ifttt.com/) 
[[@apurvajoshi](https://github.com/apurvajoshi)]
 1. [iHeartRadio](http://www.iheart.com/)[[@yiwang](https://github.com/yiwang)]
 1. [imgix](https://www.imgix.com/) [[@dclubb](https://github.com/dclubb)]


 




[jira] [Resolved] (AIRFLOW-3275) Google Cloud SQL Query operator

2018-11-13 Thread Kaxil Naik (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-3275?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Kaxil Naik resolved AIRFLOW-3275.
-
   Resolution: Fixed
Fix Version/s: 2.0.0

Resolved by https://github.com/apache/incubator-airflow/pull/4170

> Google Cloud SQL Query operator
> ---
>
> Key: AIRFLOW-3275
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3275
> Project: Apache Airflow
>  Issue Type: New Feature
>  Components: contrib, gcp
>Reporter: Jarek Potiuk
>Priority: Trivial
> Fix For: 2.0.0
>
>
> Operator that performs a DDL or DML SQL query in a Google Cloud SQL instance. 
> DQL (retrieving data from Google Cloud SQL) is not supported - you may run 
> SELECT queries, but their results are discarded.
> You should be able to specify various connectivity methods to connect to a 
> running instance - from a plain Public IP connection, through Public IP with 
> SSL, to both TCP and socket connections via the Cloud SQL Proxy. The proxy 
> should be downloaded and started/stopped dynamically as needed by the operator.





[jira] [Updated] (AIRFLOW-3275) Google Cloud SQL Query operator

2018-11-13 Thread Kaxil Naik (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-3275?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Kaxil Naik updated AIRFLOW-3275:

Component/s: gcp

> Google Cloud SQL Query operator
> ---
>
> Key: AIRFLOW-3275
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3275
> Project: Apache Airflow
>  Issue Type: New Feature
>  Components: contrib, gcp
>Reporter: Jarek Potiuk
>Priority: Trivial
> Fix For: 2.0.0
>
>
> Operator that performs a DDL or DML SQL query in a Google Cloud SQL instance. 
> DQL (retrieving data from Google Cloud SQL) is not supported - you may run 
> SELECT queries, but their results are discarded.
> You should be able to specify various connectivity methods to connect to a 
> running instance - from a plain Public IP connection, through Public IP with 
> SSL, to both TCP and socket connections via the Cloud SQL Proxy. The proxy 
> should be downloaded and started/stopped dynamically as needed by the operator.





[jira] [Commented] (AIRFLOW-3275) Google Cloud SQL Query operator

2018-11-13 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-3275?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16684953#comment-16684953
 ] 

ASF GitHub Bot commented on AIRFLOW-3275:
-

kaxil closed pull request #4170: [AIRFLOW-3275] Implement Google Cloud SQL 
Query operator
URL: https://github.com/apache/incubator-airflow/pull/4170
 
 
   

This is a PR merged from a forked repository.
As GitHub hides the original diff on merge, it is displayed below for
the sake of provenance:

As this is a foreign pull request (from a fork), the diff is supplied
below (as it won't show otherwise due to GitHub magic):

diff --git a/airflow/contrib/example_dags/example_gcp_compute.py 
b/airflow/contrib/example_dags/example_gcp_compute.py
index 51a55b6a99..928e9744b6 100644
--- a/airflow/contrib/example_dags/example_gcp_compute.py
+++ b/airflow/contrib/example_dags/example_gcp_compute.py
@@ -21,15 +21,15 @@
 Example Airflow DAG that starts, stops and sets the machine type of a Google 
Compute
 Engine instance.
 
-This DAG relies on the following Airflow variables
-https://airflow.apache.org/concepts.html#variables
+This DAG relies on the following OS environment variables
+
 * PROJECT_ID - Google Cloud Platform project where the Compute Engine instance 
exists.
 * ZONE - Google Cloud Platform zone where the instance exists.
 * INSTANCE - Name of the Compute Engine instance.
 * SHORT_MACHINE_TYPE_NAME - Machine type resource name to set, e.g. 
'n1-standard-1'.
 See https://cloud.google.com/compute/docs/machine-types
 """
-
+import os
 import datetime
 
 import airflow
@@ -38,17 +38,17 @@
 GceInstanceStopOperator, GceSetMachineTypeOperator
 
 # [START howto_operator_gce_args_common]
-PROJECT_ID = models.Variable.get('PROJECT_ID', 'example-airflow')
-ZONE = models.Variable.get('ZONE', 'europe-west1-b')
-INSTANCE = models.Variable.get('INSTANCE', 'test-instance')
+PROJECT_ID = os.environ.get('PROJECT_ID', 'example-project')
+ZONE = os.environ.get('ZONE', 'europe-west1-b')
+INSTANCE = os.environ.get('INSTANCE', 'testinstance')
+# [END howto_operator_gce_args_common]
 
 default_args = {
 'start_date': airflow.utils.dates.days_ago(1)
 }
-# [END howto_operator_gce_args_common]
 
 # [START howto_operator_gce_args_set_machine_type]
-SHORT_MACHINE_TYPE_NAME = models.Variable.get('SHORT_MACHINE_TYPE_NAME', 
'n1-standard-1')
+SHORT_MACHINE_TYPE_NAME = os.environ.get('SHORT_MACHINE_TYPE_NAME', 
'n1-standard-1')
 SET_MACHINE_TYPE_BODY = {
 'machineType': 'zones/{}/machineTypes/{}'.format(ZONE, 
SHORT_MACHINE_TYPE_NAME)
 }
diff --git a/airflow/contrib/example_dags/example_gcp_compute_igm.py 
b/airflow/contrib/example_dags/example_gcp_compute_igm.py
index dc24259f9f..3e4543c60d 100644
--- a/airflow/contrib/example_dags/example_gcp_compute_igm.py
+++ b/airflow/contrib/example_dags/example_gcp_compute_igm.py
@@ -50,11 +50,11 @@
 # [START howto_operator_compute_igm_common_args]
 PROJECT_ID = os.environ.get('PROJECT_ID', 'example-project')
 ZONE = os.environ.get('ZONE', 'europe-west1-b')
+# [END howto_operator_compute_igm_common_args]
 
 default_args = {
 'start_date': airflow.utils.dates.days_ago(1)
 }
-# [END howto_operator_compute_igm_common_args]
 
 # [START howto_operator_compute_template_copy_args]
 TEMPLATE_NAME = os.environ.get('TEMPLATE_NAME', 'instance-template-test')
diff --git a/airflow/contrib/example_dags/example_gcp_function_delete.py 
b/airflow/contrib/example_dags/example_gcp_function_delete.py
index d87eed39c5..642e3a744c 100644
--- a/airflow/contrib/example_dags/example_gcp_function_delete.py
+++ b/airflow/contrib/example_dags/example_gcp_function_delete.py
@@ -19,13 +19,13 @@
 
 """
 Example Airflow DAG that deletes a Google Cloud Function.
-This DAG relies on the following Airflow variables
-https://airflow.apache.org/concepts.html#variables
+This DAG relies on the following OS environment variables
 * PROJECT_ID - Google Cloud Project where the Cloud Function exists.
 * LOCATION - Google Cloud Functions region where the function exists.
 * ENTRYPOINT - Name of the executable function in the source code.
 """
 
+import os
 import datetime
 
 import airflow
@@ -33,17 +33,18 @@
 from airflow.contrib.operators.gcp_function_operator import 
GcfFunctionDeleteOperator
 
 # [START howto_operator_gcf_delete_args]
-PROJECT_ID = models.Variable.get('PROJECT_ID', 'example-airflow')
-LOCATION = models.Variable.get('LOCATION', 'europe-west1')
-ENTRYPOINT = models.Variable.get('ENTRYPOINT', 'helloWorld')
+PROJECT_ID = os.environ.get('PROJECT_ID', 'example-project')
+LOCATION = os.environ.get('LOCATION', 'europe-west1')
+ENTRYPOINT = os.environ.get('ENTRYPOINT', 'helloWorld')
 # A fully-qualified name of the function to delete
 
 FUNCTION_NAME = 'projects/{}/locations/{}/functions/{}'.format(PROJECT_ID, 
LOCATION,
ENTRYPOINT)
+# [END howto_operator_gcf_delete_args]
+
 default_ar
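The hunks in this (truncated) diff replace Airflow Variables (`models.Variable.get`) with OS environment variables in the example DAGs. A minimal, self-contained illustration of the resulting configuration pattern, with variable names copied from the diff and the defaults as placeholders:

```python
import os

# Pattern from the diff above: read example-DAG configuration from OS
# environment variables, falling back to safe defaults for local runs.
PROJECT_ID = os.environ.get('PROJECT_ID', 'example-project')
ZONE = os.environ.get('ZONE', 'europe-west1-b')
SHORT_MACHINE_TYPE_NAME = os.environ.get('SHORT_MACHINE_TYPE_NAME', 'n1-standard-1')

SET_MACHINE_TYPE_BODY = {
    'machineType': 'zones/{}/machineTypes/{}'.format(ZONE, SHORT_MACHINE_TYPE_NAME)
}
print(SET_MACHINE_TYPE_BODY['machineType'])
```

Unlike `models.Variable.get`, this reads no Airflow metadata database at import time, so the example DAGs can be parsed and tested without a running Airflow installation.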

[GitHub] kaxil closed pull request #4170: [AIRFLOW-3275] Implement Google Cloud SQL Query operator

2018-11-13 Thread GitBox
kaxil closed pull request #4170: [AIRFLOW-3275] Implement Google Cloud SQL 
Query operator
URL: https://github.com/apache/incubator-airflow/pull/4170
 
 
   

This is a PR merged from a forked repository.
As GitHub hides the original diff on merge, it is displayed below for
the sake of provenance:

As this is a foreign pull request (from a fork), the diff is supplied
below (as it won't show otherwise due to GitHub magic):

diff --git a/airflow/contrib/example_dags/example_gcp_compute.py 
b/airflow/contrib/example_dags/example_gcp_compute.py
index 51a55b6a99..928e9744b6 100644
--- a/airflow/contrib/example_dags/example_gcp_compute.py
+++ b/airflow/contrib/example_dags/example_gcp_compute.py
@@ -21,15 +21,15 @@
 Example Airflow DAG that starts, stops and sets the machine type of a Google 
Compute
 Engine instance.
 
-This DAG relies on the following Airflow variables
-https://airflow.apache.org/concepts.html#variables
+This DAG relies on the following OS environment variables
+
 * PROJECT_ID - Google Cloud Platform project where the Compute Engine instance 
exists.
 * ZONE - Google Cloud Platform zone where the instance exists.
 * INSTANCE - Name of the Compute Engine instance.
 * SHORT_MACHINE_TYPE_NAME - Machine type resource name to set, e.g. 
'n1-standard-1'.
 See https://cloud.google.com/compute/docs/machine-types
 """
-
+import os
 import datetime
 
 import airflow
@@ -38,17 +38,17 @@
 GceInstanceStopOperator, GceSetMachineTypeOperator
 
 # [START howto_operator_gce_args_common]
-PROJECT_ID = models.Variable.get('PROJECT_ID', 'example-airflow')
-ZONE = models.Variable.get('ZONE', 'europe-west1-b')
-INSTANCE = models.Variable.get('INSTANCE', 'test-instance')
+PROJECT_ID = os.environ.get('PROJECT_ID', 'example-project')
+ZONE = os.environ.get('ZONE', 'europe-west1-b')
+INSTANCE = os.environ.get('INSTANCE', 'testinstance')
+# [END howto_operator_gce_args_common]
 
 default_args = {
 'start_date': airflow.utils.dates.days_ago(1)
 }
-# [END howto_operator_gce_args_common]
 
 # [START howto_operator_gce_args_set_machine_type]
-SHORT_MACHINE_TYPE_NAME = models.Variable.get('SHORT_MACHINE_TYPE_NAME', 
'n1-standard-1')
+SHORT_MACHINE_TYPE_NAME = os.environ.get('SHORT_MACHINE_TYPE_NAME', 
'n1-standard-1')
 SET_MACHINE_TYPE_BODY = {
 'machineType': 'zones/{}/machineTypes/{}'.format(ZONE, 
SHORT_MACHINE_TYPE_NAME)
 }
diff --git a/airflow/contrib/example_dags/example_gcp_compute_igm.py 
b/airflow/contrib/example_dags/example_gcp_compute_igm.py
index dc24259f9f..3e4543c60d 100644
--- a/airflow/contrib/example_dags/example_gcp_compute_igm.py
+++ b/airflow/contrib/example_dags/example_gcp_compute_igm.py
@@ -50,11 +50,11 @@
 # [START howto_operator_compute_igm_common_args]
 PROJECT_ID = os.environ.get('PROJECT_ID', 'example-project')
 ZONE = os.environ.get('ZONE', 'europe-west1-b')
+# [END howto_operator_compute_igm_common_args]
 
 default_args = {
 'start_date': airflow.utils.dates.days_ago(1)
 }
-# [END howto_operator_compute_igm_common_args]
 
 # [START howto_operator_compute_template_copy_args]
 TEMPLATE_NAME = os.environ.get('TEMPLATE_NAME', 'instance-template-test')
diff --git a/airflow/contrib/example_dags/example_gcp_function_delete.py 
b/airflow/contrib/example_dags/example_gcp_function_delete.py
index d87eed39c5..642e3a744c 100644
--- a/airflow/contrib/example_dags/example_gcp_function_delete.py
+++ b/airflow/contrib/example_dags/example_gcp_function_delete.py
@@ -19,13 +19,13 @@
 
 """
 Example Airflow DAG that deletes a Google Cloud Function.
-This DAG relies on the following Airflow variables
-https://airflow.apache.org/concepts.html#variables
+This DAG relies on the following OS environment variables
 * PROJECT_ID - Google Cloud Project where the Cloud Function exists.
 * LOCATION - Google Cloud Functions region where the function exists.
 * ENTRYPOINT - Name of the executable function in the source code.
 """
 
+import os
 import datetime
 
 import airflow
@@ -33,17 +33,18 @@
 from airflow.contrib.operators.gcp_function_operator import 
GcfFunctionDeleteOperator
 
 # [START howto_operator_gcf_delete_args]
-PROJECT_ID = models.Variable.get('PROJECT_ID', 'example-airflow')
-LOCATION = models.Variable.get('LOCATION', 'europe-west1')
-ENTRYPOINT = models.Variable.get('ENTRYPOINT', 'helloWorld')
+PROJECT_ID = os.environ.get('PROJECT_ID', 'example-project')
+LOCATION = os.environ.get('LOCATION', 'europe-west1')
+ENTRYPOINT = os.environ.get('ENTRYPOINT', 'helloWorld')
 # A fully-qualified name of the function to delete
 
 FUNCTION_NAME = 'projects/{}/locations/{}/functions/{}'.format(PROJECT_ID, 
LOCATION,
ENTRYPOINT)
+# [END howto_operator_gcf_delete_args]
+
 default_args = {
 'start_date': airflow.utils.dates.days_ago(1)
 }
-# [END howto_operator_gcf_delete_args]
 
 with models.DAG(
 'example_gcp_function_delete',
diff --git a/airflow/contrib/example_dags/example_gcp_function_deploy_delete.py 
b/airflow/contrib/example

[GitHub] kaxil commented on issue #4170: [AIRFLOW-3275] Implement Google Cloud SQL Query operator

2018-11-13 Thread GitBox
kaxil commented on issue #4170: [AIRFLOW-3275] Implement Google Cloud SQL Query 
operator
URL: 
https://github.com/apache/incubator-airflow/pull/4170#issuecomment-438201656
 
 
   Thanks @potiuk 




[GitHub] ashb commented on issue #4178: [AIRFLOW-2779] Add license headers to doc files

2018-11-13 Thread GitBox
ashb commented on issue #4178: [AIRFLOW-2779] Add license headers to doc files
URL: 
https://github.com/apache/incubator-airflow/pull/4178#issuecomment-438196739
 
 
   > @ashb looks good. How did you ensure you covered all docs?
   
   `find . -iname '*.rst' -or -iname '*.md'` - so not foolproof
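That one-liner only lists candidate files; a rough, self-contained sketch of the kind of header check being discussed follows. The marker string, the 30-line window, and the temp-file names are assumptions for illustration, not project policy:

```shell
# Build a scratch tree with one compliant and one non-compliant doc file.
tmp=$(mktemp -d)
printf '.. Licensed to the Apache Software Foundation\n\ndoc body\n' > "$tmp/ok.rst"
printf '# Readme without a header\n' > "$tmp/missing.md"

# Same spirit as the `find` one-liner above, plus a grep that flags
# files whose first 30 lines lack a license marker.
missing=$(find "$tmp" -iname '*.rst' -o -iname '*.md' | while read -r f; do
  head -n 30 "$f" | grep -qi 'Licensed to' || echo "$f"
done)
echo "$missing"
```

This reports only `missing.md`; a clean checkout avoids the build artifacts that trip up RAT-style scans.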




[jira] [Created] (AIRFLOW-3333) New features enable transferring of files or data from GCS to a SFTP remote path and SFTP to GCS path.

2018-11-13 Thread Pulin Pathneja (JIRA)
Pulin Pathneja created AIRFLOW-:
---

 Summary: New features enable transferring of files or data from 
GCS to an SFTP remote path and from SFTP to a GCS path. 
 Key: AIRFLOW-
 URL: https://issues.apache.org/jira/browse/AIRFLOW-
 Project: Apache Airflow
  Issue Type: New Feature
Reporter: Pulin Pathneja
Assignee: Pulin Pathneja


New features enable transferring of files or data from S3 to an SFTP remote path 
and from SFTP to an S3 path. 
 





[GitHub] Fokko edited a comment on issue #4178: [AIRFLOW-2779] Add license headers to doc files

2018-11-13 Thread GitBox
Fokko edited a comment on issue #4178: [AIRFLOW-2779] Add license headers to 
doc files
URL: 
https://github.com/apache/incubator-airflow/pull/4178#issuecomment-438188535
 
 
   I see:
   ```
   [WARNING] Files with unapproved licenses:
 lang/csharp/README.md
 
lang/csharp/src/apache/ipc.test/GeneratedFiles/org/apache/avro/test/FixedTest.cs
 
lang/csharp/src/apache/ipc.test/GeneratedFiles/org/apache/avro/test/SimpleCallback.cs
 
lang/csharp/src/apache/ipc.test/GeneratedFiles/org/apache/avro/test/TestRecordWithUnion.cs
 
lang/csharp/src/apache/ipc.test/GeneratedFiles/org/apache/avro/test/AllTestRecordPartial.cs
 
lang/csharp/src/apache/ipc.test/GeneratedFiles/org/apache/avro/test/TestRecordExtensions.cs
 
lang/csharp/src/apache/ipc.test/GeneratedFiles/org/apache/avro/test/AllTestRecord.cs
 
lang/csharp/src/apache/ipc.test/GeneratedFiles/org/apache/avro/test/TestError.cs
 
lang/csharp/src/apache/ipc.test/GeneratedFiles/org/apache/avro/test/TestRecord.cs
 
lang/csharp/src/apache/ipc.test/GeneratedFiles/org/apache/avro/test/AllEnum.cs
 
lang/csharp/src/apache/ipc.test/GeneratedFiles/org/apache/avro/test/MailCallback.cs
 
lang/csharp/src/apache/ipc.test/GeneratedFiles/org/apache/avro/test/Message.cs
 lang/csharp/src/apache/ipc.test/GeneratedFiles/org/apache/avro/test/Kind.cs
 lang/csharp/src/apache/ipc.test/GeneratedFiles/org/apache/avro/test/MD5.cs
 lang/csharp/src/apache/ipc.test/GeneratedFiles/org/apache/avro/test/Mail.cs
 
lang/csharp/src/apache/ipc.test/GeneratedFiles/org/apache/avro/test/Simple.cs
   ```
   
   Would it be possible to add the license header to the generated file? 
Possibly you can make this part of the template. In the case of the readme, you 
can add it as an exception: 
https://github.com/apache/avro/blob/master/pom.xml#L229




[GitHub] Fokko commented on issue #4178: [AIRFLOW-2779] Add license headers to doc files

2018-11-13 Thread GitBox
Fokko commented on issue #4178: [AIRFLOW-2779] Add license headers to doc files
URL: 
https://github.com/apache/incubator-airflow/pull/4178#issuecomment-438188535
 
 
   I see:
   ```
   [WARNING] Files with unapproved licenses:
 lang/csharp/README.md
 
lang/csharp/src/apache/ipc.test/GeneratedFiles/org/apache/avro/test/FixedTest.cs
 
lang/csharp/src/apache/ipc.test/GeneratedFiles/org/apache/avro/test/SimpleCallback.cs
 
lang/csharp/src/apache/ipc.test/GeneratedFiles/org/apache/avro/test/TestRecordWithUnion.cs
 
lang/csharp/src/apache/ipc.test/GeneratedFiles/org/apache/avro/test/AllTestRecordPartial.cs
 
lang/csharp/src/apache/ipc.test/GeneratedFiles/org/apache/avro/test/TestRecordExtensions.cs
 
lang/csharp/src/apache/ipc.test/GeneratedFiles/org/apache/avro/test/AllTestRecord.cs
 
lang/csharp/src/apache/ipc.test/GeneratedFiles/org/apache/avro/test/TestError.cs
 
lang/csharp/src/apache/ipc.test/GeneratedFiles/org/apache/avro/test/TestRecord.cs
 
lang/csharp/src/apache/ipc.test/GeneratedFiles/org/apache/avro/test/AllEnum.cs
 
lang/csharp/src/apache/ipc.test/GeneratedFiles/org/apache/avro/test/MailCallback.cs
 
lang/csharp/src/apache/ipc.test/GeneratedFiles/org/apache/avro/test/Message.cs
 lang/csharp/src/apache/ipc.test/GeneratedFiles/org/apache/avro/test/Kind.cs
 lang/csharp/src/apache/ipc.test/GeneratedFiles/org/apache/avro/test/MD5.cs
 lang/csharp/src/apache/ipc.test/GeneratedFiles/org/apache/avro/test/Mail.cs
 
lang/csharp/src/apache/ipc.test/GeneratedFiles/org/apache/avro/test/Simple.cs
   ```
   
   Would it be possible to add the license header to the generated file? 
Possibly you can make this part of the template.




[GitHub] ChaturvediSulabh opened a new pull request #4180: Add Iflix - We are using Airflow

2018-11-13 Thread GitBox
ChaturvediSulabh opened a new pull request #4180: Add Iflix - We are using 
Airflow
URL: https://github.com/apache/incubator-airflow/pull/4180
 
 
   Make sure you have checked _all_ steps below.
   
   ### Jira
   
   - [ ] My PR addresses the following [Airflow 
Jira](https://issues.apache.org/jira/browse/AIRFLOW/) issues and references 
them in the PR title. For example, "\[AIRFLOW-XXX\] My Airflow PR"
 - https://issues.apache.org/jira/browse/AIRFLOW-XXX
 - In case you are fixing a typo in the documentation you can prepend your 
commit with \[AIRFLOW-XXX\], code changes always need a Jira issue.
   
   ### Description
   
   - [ ] Here are some details about my PR, including screenshots of any UI 
changes:
   
   ### Tests
   
   - [ ] My PR adds the following unit tests __OR__ does not need testing for 
this extremely good reason:
   
   ### Commits
   
   - [ ] My commits all reference Jira issues in their subject lines, and I 
have squashed multiple commits if they address the same issue. In addition, my 
commits follow the guidelines from "[How to write a good git commit 
message](http://chris.beams.io/posts/git-commit/)":
 1. Subject is separated from body by a blank line
 1. Subject is limited to 50 characters (not including Jira issue reference)
 1. Subject does not end with a period
 1. Subject uses the imperative mood ("add", not "adding")
 1. Body wraps at 72 characters
 1. Body explains "what" and "why", not "how"
   
   ### Documentation
   
   - [ ] In case of new functionality, my PR adds documentation that describes 
how to use it.
 - When adding new operators/hooks/sensors, the autoclass documentation 
generation needs to be added.
   
   ### Code Quality
   
   - [ ] Passes `flake8`
   




[GitHub] potiuk commented on issue #4170: [AIRFLOW-3275] Implement Google Cloud SQL Query operator

2018-11-13 Thread GitBox
potiuk commented on issue #4170: [AIRFLOW-3275] Implement Google Cloud SQL 
Query operator
URL: 
https://github.com/apache/incubator-airflow/pull/4170#issuecomment-438176102
 
 
   Yeah. I expected this. I should have warned you :). I rebased it, resolved 
all the conflicts, and reviewed all the documentation after conflict resolution 
(there were quite a number of "consistency" changes across all the GCE-related 
operators in this PR when it comes to documentation). I re-ran all the relevant 
tests, including the integration ones, so all should be fine.



