[jira] [Work started] (AIRFLOW-3322) Qubole Hook: Change hook to fetch command args dynamically from qds_sdk

2018-11-08 Thread Joy Lal Chattaraj (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-3322?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Work on AIRFLOW-3322 started by Joy Lal Chattaraj.
--
> Qubole Hook: Change hook to fetch command args dynamically from qds_sdk 
> 
>
> Key: AIRFLOW-3322
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3322
> Project: Apache Airflow
>  Issue Type: Improvement
>Reporter: Joy Lal Chattaraj
>Assignee: Joy Lal Chattaraj
>Priority: Major
>






[GitHub] Joylal4896 opened a new pull request #4165: [AIR-3322] Qubole Hook: Change hook to fetch command args dynamically from qds_sdk

2018-11-08 Thread GitBox
Joylal4896 opened a new pull request #4165: [AIR-3322] Qubole Hook: Change hook 
to fetch command args dynamically from qds_sdk
URL: https://github.com/apache/incubator-airflow/pull/4165
 
 
   Make sure you have checked _all_ steps below.
   
   ### Jira
   
   - [ ] My PR addresses the following [Airflow 
Jira](https://issues.apache.org/jira/browse/AIRFLOW-3322) issues and references 
them in the PR title. For example, "\[AIRFLOW-XXX\] My Airflow PR"
 - https://issues.apache.org/jira/browse/AIRFLOW-XXX
 - If you are only fixing a typo in the documentation, you can prepend your 
commit with \[AIRFLOW-XXX\]; code changes always need a Jira issue.
   
   ### Description
   
   - [ ] Here are some details about my PR, including screenshots of any UI 
changes:
   
   ### Tests
   
   - [ ] My PR adds the following unit tests __OR__ does not need testing for 
this extremely good reason:
   
   ### Commits
   
   - [ ] My commits all reference Jira issues in their subject lines, and I 
have squashed multiple commits if they address the same issue. In addition, my 
commits follow the guidelines from "[How to write a good git commit 
message](http://chris.beams.io/posts/git-commit/)":
 1. Subject is separated from body by a blank line
 1. Subject is limited to 50 characters (not including Jira issue reference)
 1. Subject does not end with a period
 1. Subject uses the imperative mood ("add", not "adding")
 1. Body wraps at 72 characters
 1. Body explains "what" and "why", not "how"
   
   ### Documentation
   
   - [ ] In case of new functionality, my PR adds documentation that describes 
how to use it.
 - When adding new operators/hooks/sensors, the autoclass documentation 
generation needs to be added.
   
   ### Code Quality
   
   - [ ] Passes `flake8`
   




[jira] [Created] (AIRFLOW-3322) Qubole Hook: Change hook to fetch command args dynamically from qds_sdk

2018-11-08 Thread Joy Lal Chattaraj (JIRA)
Joy Lal Chattaraj created AIRFLOW-3322:
--

 Summary: Qubole Hook: Change hook to fetch command args 
dynamically from qds_sdk 
 Key: AIRFLOW-3322
 URL: https://issues.apache.org/jira/browse/AIRFLOW-3322
 Project: Apache Airflow
  Issue Type: Improvement
Reporter: Joy Lal Chattaraj
Assignee: Joy Lal Chattaraj








[GitHub] codecov-io commented on issue #4164: BigQueryHook check if dataset exists

2018-11-08 Thread GitBox
codecov-io commented on issue #4164: BigQueryHook check if dataset exists
URL: 
https://github.com/apache/incubator-airflow/pull/4164#issuecomment-437261404
 
 
   # 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4164?src=pr&el=h1)
 Report
   > Merging 
[#4164](https://codecov.io/gh/apache/incubator-airflow/pull/4164?src=pr&el=desc)
 into 
[master](https://codecov.io/gh/apache/incubator-airflow/commit/52ee34c5fc7818797e0a7463d896e80e1482cbfd?src=pr&el=desc)
 will **decrease** coverage by `0.01%`.
   > The diff coverage is `n/a`.
   
   [![Impacted file tree 
graph](https://codecov.io/gh/apache/incubator-airflow/pull/4164/graphs/tree.svg?width=650&token=WdLKlKHOAU&height=150&src=pr)](https://codecov.io/gh/apache/incubator-airflow/pull/4164?src=pr&el=tree)
   
   ```diff
   @@            Coverage Diff             @@
   ##           master    #4164      +/-   ##
   ==========================================
   - Coverage   77.68%   77.66%   -0.02%
   ==========================================
     Files         199      199
     Lines       16273    16273
   ==========================================
   - Hits        12641    12638       -3
   - Misses       3632     3635       +3
   ```
   
   
   | [Impacted 
Files](https://codecov.io/gh/apache/incubator-airflow/pull/4164?src=pr&el=tree) 
| Coverage Δ | |
   |---|---|---|
   | 
[airflow/jobs.py](https://codecov.io/gh/apache/incubator-airflow/pull/4164/diff?src=pr&el=tree#diff-YWlyZmxvdy9qb2JzLnB5)
 | `77.09% <0%> (-0.28%)` | :arrow_down: |
   
   --
   
   [Continue to review full report at 
Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4164?src=pr&el=continue).
   > **Legend** - [Click here to learn 
more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data`
   > Powered by 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4164?src=pr&el=footer).
 Last update 
[52ee34c...23d324a](https://codecov.io/gh/apache/incubator-airflow/pull/4164?src=pr&el=lastupdated).
 Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   




[GitHub] ryanyuan opened a new pull request #4164: BigQueryHook check if dataset exists

2018-11-08 Thread GitBox
ryanyuan opened a new pull request #4164: BigQueryHook check if dataset exists
URL: https://github.com/apache/incubator-airflow/pull/4164
 
 
   Add a function to BigQueryHook to check the existence of a dataset. 
   Return True if the dataset exists.
   Return False if a 404 error is received.
   
   Make sure you have checked _all_ steps below.
   
   ### Jira
   
   - [ ] My PR addresses the following 
[Airflow-3318](https://issues.apache.org/jira/browse/AIRFLOW-3318) issues and 
references them in the PR title. 
   
   ### Description
   
   - [ ] Here are some details about my PR, including screenshots of any UI 
changes:
   To check the existence of a dataset in BigQuery, the existing BigQueryHook only 
supports either 1) using get_datasets_list() to fetch all datasets and then 
searching for the target dataset in the list, or 2) using get_dataset().
   
   However, get_dataset() raises AirflowException whenever an HttpError is 
received, so it cannot be used to determine whether the dataset exists.
   
   To solve this issue, I've added a function to return a boolean value 
representing the existence of the dataset.
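
For illustration only, a minimal sketch of the kind of check described, assuming the hook's google-api-python-client `service` object; the function name and signature here are hypothetical and not necessarily what the PR implements:

```python
# Hypothetical sketch, not the PR's exact code: assumes a BigQuery client
# built with google-api-python-client, as BigQueryHook uses internally.
from googleapiclient.errors import HttpError


def dataset_exists(service, project_id, dataset_id):
    """Return True if the dataset exists, False if the API answers 404."""
    try:
        service.datasets().get(projectId=project_id,
                               datasetId=dataset_id).execute()
        return True
    except HttpError as err:
        if err.resp.status == 404:
            return False
        raise  # any other HTTP error is still an error
```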
   
   ### Tests
   
   - [ ] My PR adds the following unit tests __OR__ does not need testing for 
this extremely good reason:
   
tests.contrib.hooks.test_bigquery_hook:TestDatasetsOperations.test_check_dataset_exists
   
tests.contrib.hooks.test_bigquery_hook:TestDatasetsOperations.test_check_dataset_exists_not_exist
   
   ### Commits
   
   - [ ] My commits all reference Jira issues in their subject lines, and I 
have squashed multiple commits if they address the same issue. In addition, my 
commits follow the guidelines from "[How to write a good git commit 
message](http://chris.beams.io/posts/git-commit/)":
 1. Subject is separated from body by a blank line
 1. Subject is limited to 50 characters (not including Jira issue reference)
 1. Subject does not end with a period
 1. Subject uses the imperative mood ("add", not "adding")
 1. Body wraps at 72 characters
 1. Body explains "what" and "why", not "how"
   
   ### Documentation
   
   - [ ] In case of new functionality, my PR adds documentation that describes 
how to use it.
 - When adding new operators/hooks/sensors, the autoclass documentation 
generation needs to be added.
   
   ### Code Quality
   
   - [ ] Passes `flake8`
   




[jira] [Created] (AIRFLOW-3321) Ability to change schedule interval for a dag from airflow UI

2018-11-08 Thread Anshu Agarwal (JIRA)
Anshu Agarwal created AIRFLOW-3321:
--

 Summary: Ability to change schedule interval for a dag from 
airflow UI
 Key: AIRFLOW-3321
 URL: https://issues.apache.org/jira/browse/AIRFLOW-3321
 Project: Apache Airflow
  Issue Type: New Feature
  Components: webapp
Affects Versions: 1.10.0
Reporter: Anshu Agarwal


There are situations when we need to experiment with a DAG's schedule interval in 
production. Having an option to edit the schedule interval in the Airflow 
webserver UI would make this possible without any code deployment.





[GitHub] codecov-io edited a comment on issue #4156: [AIRFLOW-3314] Changed auto inlets feature to work as described

2018-11-08 Thread GitBox
codecov-io edited a comment on issue #4156: [AIRFLOW-3314] Changed auto inlets 
feature to work as described
URL: 
https://github.com/apache/incubator-airflow/pull/4156#issuecomment-436867007
 
 
   # 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4156?src=pr&el=h1)
 Report
   > Merging 
[#4156](https://codecov.io/gh/apache/incubator-airflow/pull/4156?src=pr&el=desc)
 into 
[master](https://codecov.io/gh/apache/incubator-airflow/commit/52ee34c5fc7818797e0a7463d896e80e1482cbfd?src=pr&el=desc)
 will **increase** coverage by `<.01%`.
   > The diff coverage is `100%`.
   
   [![Impacted file tree 
graph](https://codecov.io/gh/apache/incubator-airflow/pull/4156/graphs/tree.svg?width=650&token=WdLKlKHOAU&height=150&src=pr)](https://codecov.io/gh/apache/incubator-airflow/pull/4156?src=pr&el=tree)
   
   ```diff
   @@            Coverage Diff             @@
   ##           master    #4156      +/-   ##
   ==========================================
   + Coverage   77.68%   77.68%   +<.01%
   ==========================================
     Files         199      199
     Lines       16273    16278       +5
   ==========================================
   + Hits        12641    12646       +5
     Misses       3632     3632
   ```
   
   
   | [Impacted 
Files](https://codecov.io/gh/apache/incubator-airflow/pull/4156?src=pr&el=tree) 
| Coverage Δ | |
   |---|---|---|
   | 
[airflow/lineage/\_\_init\_\_.py](https://codecov.io/gh/apache/incubator-airflow/pull/4156/diff?src=pr&el=tree#diff-YWlyZmxvdy9saW5lYWdlL19faW5pdF9fLnB5)
 | `96.87% <100%> (+0.26%)` | :arrow_up: |
   | 
[airflow/jobs.py](https://codecov.io/gh/apache/incubator-airflow/pull/4156/diff?src=pr&el=tree#diff-YWlyZmxvdy9qb2JzLnB5)
 | `77.09% <0%> (-0.28%)` | :arrow_down: |
   | 
[airflow/models.py](https://codecov.io/gh/apache/incubator-airflow/pull/4156/diff?src=pr&el=tree#diff-YWlyZmxvdy9tb2RlbHMucHk=)
 | `92.23% <0%> (+0.04%)` | :arrow_up: |
   | 
[airflow/lineage/datasets.py](https://codecov.io/gh/apache/incubator-airflow/pull/4156/diff?src=pr&el=tree#diff-YWlyZmxvdy9saW5lYWdlL2RhdGFzZXRzLnB5)
 | `87.32% <0%> (+2.81%)` | :arrow_up: |
   
   --
   
   [Continue to review full report at 
Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4156?src=pr&el=continue).
   > **Legend** - [Click here to learn 
more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data`
   > Powered by 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4156?src=pr&el=footer).
 Last update 
[52ee34c...71462f6](https://codecov.io/gh/apache/incubator-airflow/pull/4156?src=pr&el=lastupdated).
 Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   






[jira] [Created] (AIRFLOW-3320) Sagemaker operator never ends when having "Stopped" status

2018-11-08 Thread John Cheng (JIRA)
John Cheng created AIRFLOW-3320:
---

 Summary: Sagemaker operator never ends when having "Stopped" status
 Key: AIRFLOW-3320
 URL: https://issues.apache.org/jira/browse/AIRFLOW-3320
 Project: Apache Airflow
  Issue Type: Bug
  Components: aws
Reporter: John Cheng
Assignee: Yang Yu


The SageMaker operator never finishes when the job ends up in the "Stopped" status; it keeps polling, as the log below shows.
{code:java}
[2018-11-08 20:15:27,864] {logging_mixin.py:95} INFO - [2018-11-08 
20:15:27,864] {sagemaker_hook.py:129} INFO - Job still running for 2690 
seconds... current status is InProgress
[2018-11-08 20:15:32,917] {logging_mixin.py:95} INFO - [2018-11-08 
20:15:32,917] {sagemaker_hook.py:129} INFO - Job still running for 2695 
seconds... current status is Stopping
[2018-11-08 20:15:37,963] {logging_mixin.py:95} INFO - [2018-11-08 
20:15:37,963] {sagemaker_hook.py:129} INFO - Job still running for 2700 
seconds... current status is Stopping
[2018-11-08 20:15:43,012] {logging_mixin.py:95} INFO - [2018-11-08 
20:15:43,012] {sagemaker_hook.py:129} INFO - Job still running for 2705 
seconds... current status is Stopping
[2018-11-08 20:15:48,060] {logging_mixin.py:95} INFO - [2018-11-08 
20:15:48,060] {sagemaker_hook.py:129} INFO - Job still running for 2710 
seconds... current status is Stopping
[2018-11-08 20:15:53,108] {logging_mixin.py:95} INFO - [2018-11-08 
20:15:53,108] {sagemaker_hook.py:129} INFO - Job still running for 2715 
seconds... current status is Stopping
[2018-11-08 20:15:58,153] {logging_mixin.py:95} INFO - [2018-11-08 
20:15:58,153] {sagemaker_hook.py:129} INFO - Job still running for 2720 
seconds... current status is Stopping
[2018-11-08 20:16:03,206] {logging_mixin.py:95} INFO - [2018-11-08 
20:16:03,206] {sagemaker_hook.py:129} INFO - Job still running for 2725 
seconds... current status is Stopping
[2018-11-08 20:16:08,255] {logging_mixin.py:95} INFO - [2018-11-08 
20:16:08,255] {sagemaker_hook.py:129} INFO - Job still running for 2730 
seconds... current status is Stopping
[2018-11-08 20:16:13,298] {logging_mixin.py:95} INFO - [2018-11-08 
20:16:13,298] {sagemaker_hook.py:129} INFO - Job still running for 2735 
seconds... current status is Stopping
[2018-11-08 20:16:18,349] {logging_mixin.py:95} INFO - [2018-11-08 
20:16:18,349] {sagemaker_hook.py:129} INFO - Job still running for 2740 
seconds... current status is Stopping
[2018-11-08 20:16:23,398] {logging_mixin.py:95} INFO - [2018-11-08 
20:16:23,398] {sagemaker_hook.py:129} INFO - Job still running for 2745 
seconds... current status is Stopping
[2018-11-08 20:16:28,445] {logging_mixin.py:95} INFO - [2018-11-08 
20:16:28,445] {sagemaker_hook.py:129} INFO - Job still running for 2750 
seconds... current status is Stopping
[2018-11-08 20:16:33,497] {logging_mixin.py:95} INFO - [2018-11-08 
20:16:33,497] {sagemaker_hook.py:129} INFO - Job still running for 2755 
seconds... current status is Stopping
[2018-11-08 20:16:38,545] {logging_mixin.py:95} INFO - [2018-11-08 
20:16:38,545] {sagemaker_hook.py:129} INFO - Job still running for 2760 
seconds... current status is Stopping
[2018-11-08 20:16:43,593] {logging_mixin.py:95} INFO - [2018-11-08 
20:16:43,593] {sagemaker_hook.py:129} INFO - Job still running for 2765 
seconds... current status is Stopping
[2018-11-08 20:16:48,639] {logging_mixin.py:95} INFO - [2018-11-08 
20:16:48,639] {sagemaker_hook.py:129} INFO - Job still running for 2770 
seconds... current status is Stopping
[2018-11-08 20:16:53,724] {logging_mixin.py:95} INFO - [2018-11-08 
20:16:53,724] {sagemaker_hook.py:129} INFO - Job still running for 2775 
seconds... current status is Stopping
[2018-11-08 20:16:58,773] {logging_mixin.py:95} INFO - [2018-11-08 
20:16:58,773] {sagemaker_hook.py:129} INFO - Job still running for 2780 
seconds... current status is Stopping
[2018-11-08 20:17:03,835] {logging_mixin.py:95} INFO - [2018-11-08 
20:17:03,835] {sagemaker_hook.py:129} INFO - Job still running for 2785 
seconds... current status is Stopping
[2018-11-08 20:17:08,880] {logging_mixin.py:95} INFO - [2018-11-08 
20:17:08,880] {sagemaker_hook.py:129} INFO - Job still running for 2790 
seconds... current status is Stopping
[2018-11-08 20:17:13,931] {logging_mixin.py:95} INFO - [2018-11-08 
20:17:13,931] {sagemaker_hook.py:129} INFO - Job still running for 2795 
seconds... current status is Stopping
[2018-11-08 20:17:18,976] {logging_mixin.py:95} INFO - [2018-11-08 
20:17:18,976] {sagemaker_hook.py:129} INFO - Job still running for 2800 
seconds... current status is Stopping
[2018-11-08 20:17:24,020] {logging_mixin.py:95} INFO - [2018-11-08 
20:17:24,020] {sagemaker_hook.py:129} INFO - Job still running for 2805 
seconds... current status is Stopping
[2018-11-08 20:17:29,070] {logging_mixin.py:95} INFO - [2018-11-08 
20:17:29,070] {sagemaker_hook.py:129} INFO - Job still running for 2810 
seconds... current status is Stopping
[2018-11-08 20:17:34,119] {logging_mi

[GitHub] codecov-io commented on issue #4163: [AIRFLOW-3319] - KubernetsExecutor: Need in try_number in labels if getting them later

2018-11-08 Thread GitBox
codecov-io commented on issue #4163: [AIRFLOW-3319] - KubernetsExecutor: Need 
in try_number in  labels if getting them later
URL: 
https://github.com/apache/incubator-airflow/pull/4163#issuecomment-437223354
 
 
   # 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4163?src=pr&el=h1)
 Report
   > Merging 
[#4163](https://codecov.io/gh/apache/incubator-airflow/pull/4163?src=pr&el=desc)
 into 
[master](https://codecov.io/gh/apache/incubator-airflow/commit/52ee34c5fc7818797e0a7463d896e80e1482cbfd?src=pr&el=desc)
 will **decrease** coverage by `0.01%`.
   > The diff coverage is `n/a`.
   
   [![Impacted file tree 
graph](https://codecov.io/gh/apache/incubator-airflow/pull/4163/graphs/tree.svg?width=650&token=WdLKlKHOAU&height=150&src=pr)](https://codecov.io/gh/apache/incubator-airflow/pull/4163?src=pr&el=tree)
   
   ```diff
   @@            Coverage Diff             @@
   ##           master    #4163      +/-   ##
   ==========================================
   - Coverage   77.68%   77.66%   -0.02%
   ==========================================
     Files         199      199
     Lines       16273    16273
   ==========================================
   - Hits        12641    12638       -3
   - Misses       3632     3635       +3
   ```
   
   
   | [Impacted 
Files](https://codecov.io/gh/apache/incubator-airflow/pull/4163?src=pr&el=tree) 
| Coverage Δ | |
   |---|---|---|
   | 
[airflow/jobs.py](https://codecov.io/gh/apache/incubator-airflow/pull/4163/diff?src=pr&el=tree#diff-YWlyZmxvdy9qb2JzLnB5)
 | `77.09% <0%> (-0.28%)` | :arrow_down: |
   
   --
   
   [Continue to review full report at 
Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4163?src=pr&el=continue).
   > **Legend** - [Click here to learn 
more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data`
   > Powered by 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4163?src=pr&el=footer).
 Last update 
[52ee34c...588e292](https://codecov.io/gh/apache/incubator-airflow/pull/4163?src=pr&el=lastupdated).
 Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   




[GitHub] XD-DENG commented on a change in pull request #4118: [AIRFLOW-3271] Airflow RBAC Permissions modification via UI do not persist

2018-11-08 Thread GitBox
XD-DENG commented on a change in pull request #4118: [AIRFLOW-3271] Airflow 
RBAC Permissions modification via UI do not persist
URL: https://github.com/apache/incubator-airflow/pull/4118#discussion_r232119883
 
 

 ##
 File path: airflow/www_rbac/security.py
 ##
 @@ -181,13 +181,21 @@ def init_role(self, role_name, role_vms, role_perms):
         if not role:
             role = self.add_role(role_name)
 
-        role_pvms = []
-        for pvm in pvms:
-            if pvm.view_menu.name in role_vms and pvm.permission.name in role_perms:
-                role_pvms.append(pvm)
-        role.permissions = list(set(role_pvms))
-        self.get_session.merge(role)
-        self.get_session.commit()
+        if len(role.permissions) == 0:
+            logging.info('Initializing permissions for role:%s in the database.', role_name)
+            role_pvms = []
+            for pvm in pvms:
+                if pvm.view_menu.name in role_vms and pvm.permission.name in role_perms:
+                    role_pvms.append(pvm)
+            role.permissions = list(set(role_pvms))
+            self.get_session.merge(role)
+            self.get_session.commit()
+        else:
+            logging.info('Existing permissions for the role:%s within the database will persist.', role_name)
+            for pvm in pvms:
+                if pvm not in role.permissions:
+                    if pvm.view_menu.name in role_vms and pvm.permission.name in role_perms:
+                        role.permissions.append(pvm)
 
 Review comment:
   Glad to see the test passes!
   
   Just wondering how the `else` chunk works without session merge/commit?
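
For context, the SQLAlchemy behaviour in question as a standalone sketch (not Airflow code; assumes SQLAlchemy 1.4+): appending to a relationship of an object that is already attached to the session is tracked by the unit of work and flushed by whichever commit later runs on that session, without an explicit merge().

```python
from sqlalchemy import Column, ForeignKey, Integer, String, Table, create_engine
from sqlalchemy.orm import Session, declarative_base, relationship

Base = declarative_base()

role_perm = Table(
    'role_perm', Base.metadata,
    Column('role_id', ForeignKey('role.id'), primary_key=True),
    Column('perm_id', ForeignKey('perm.id'), primary_key=True),
)


class Role(Base):
    __tablename__ = 'role'
    id = Column(Integer, primary_key=True)
    name = Column(String)
    permissions = relationship('Perm', secondary=role_perm)


class Perm(Base):
    __tablename__ = 'perm'
    id = Column(Integer, primary_key=True)
    name = Column(String)


engine = create_engine('sqlite://')
Base.metadata.create_all(engine)

with Session(engine) as session:
    session.add(Role(name='Viewer'))
    session.add(Perm(name='can_read'))
    session.commit()

    role = session.query(Role).filter_by(name='Viewer').one()   # attached object
    perm = session.query(Perm).filter_by(name='can_read').one()
    role.permissions.append(perm)   # change is dirty-tracked; no merge() needed
    session.commit()                # the association row is flushed here
```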




[GitHub] XD-DENG commented on a change in pull request #4118: [AIRFLOW-3271] Airflow RBAC Permissions modification via UI do not persist

2018-11-08 Thread GitBox
XD-DENG commented on a change in pull request #4118: [AIRFLOW-3271] Airflow 
RBAC Permissions modification via UI do not persist
URL: https://github.com/apache/incubator-airflow/pull/4118#discussion_r232119883
 
 

 ##
 File path: airflow/www_rbac/security.py
 ##
 @@ -181,13 +181,21 @@ def init_role(self, role_name, role_vms, role_perms):
         if not role:
             role = self.add_role(role_name)
 
-        role_pvms = []
-        for pvm in pvms:
-            if pvm.view_menu.name in role_vms and pvm.permission.name in role_perms:
-                role_pvms.append(pvm)
-        role.permissions = list(set(role_pvms))
-        self.get_session.merge(role)
-        self.get_session.commit()
+        if len(role.permissions) == 0:
+            logging.info('Initializing permissions for role:%s in the database.', role_name)
+            role_pvms = []
+            for pvm in pvms:
+                if pvm.view_menu.name in role_vms and pvm.permission.name in role_perms:
+                    role_pvms.append(pvm)
+            role.permissions = list(set(role_pvms))
+            self.get_session.merge(role)
+            self.get_session.commit()
+        else:
+            logging.info('Existing permissions for the role:%s within the database will persist.', role_name)
+            for pvm in pvms:
+                if pvm not in role.permissions:
+                    if pvm.view_menu.name in role_vms and pvm.permission.name in role_perms:
+                        role.permissions.append(pvm)
 
 Review comment:
   Glad to see the test passes!
   
   Just wondering why the `else` chunk works without session merge/commit.




[jira] [Created] (AIRFLOW-3319) Kubernetes Executor attempts to get the "try_number" from labels but fails

2018-11-08 Thread Bo Blanton (JIRA)
Bo Blanton created AIRFLOW-3319:
---

 Summary: Kubernetes Executor attempts to get the "try_number" from 
labels but fails
 Key: AIRFLOW-3319
 URL: https://issues.apache.org/jira/browse/AIRFLOW-3319
 Project: Apache Airflow
  Issue Type: Bug
  Components: executor, kubernetes
Reporter: Bo Blanton


The {{_labels_to_key}} function attempts to read the {{try_number}} from the 
Kubernetes pod labels; however, no such label is applied to the pod, resulting in 
an exception.

The fix is to modify the pod executor so that it adds this label when the pod is created.





[jira] [Work started] (AIRFLOW-3318) Add a function to BigQueryHook to check the existence of a dataset.

2018-11-08 Thread Ryan Yuan (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-3318?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Work on AIRFLOW-3318 started by Ryan Yuan.
--
> Add a function to BigQueryHook to check the existence of a dataset.
> ---
>
> Key: AIRFLOW-3318
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3318
> Project: Apache Airflow
>  Issue Type: New Feature
>  Components: contrib, gcp, hooks
>Affects Versions: 1.10.0
>Reporter: Ryan Yuan
>Assignee: Ryan Yuan
>Priority: Major
>
> To check the existence of a dataset in BigQuery, the existing BigQueryHook only 
> supports either 1) using get_datasets_list() to fetch all datasets and then 
> searching for the target dataset in the list, or 2) using get_dataset().
> However, get_dataset() raises AirflowException whenever an HttpError is 
> received, so it cannot be used to determine whether the dataset exists.





[jira] [Created] (AIRFLOW-3318) Add a function to BigQueryHook to check the existence of a dataset.

2018-11-08 Thread Ryan Yuan (JIRA)
Ryan Yuan created AIRFLOW-3318:
--

 Summary: Add a function to BigQueryHook to check the existence of 
a dataset.
 Key: AIRFLOW-3318
 URL: https://issues.apache.org/jira/browse/AIRFLOW-3318
 Project: Apache Airflow
  Issue Type: New Feature
  Components: contrib, gcp, hooks
Affects Versions: 1.10.0
Reporter: Ryan Yuan
Assignee: Ryan Yuan


To check the existence of a dataset in BigQuery, the existing BigQueryHook only 
supports either 1) using get_datasets_list() to fetch all datasets and then 
searching for the target dataset in the list, or 2) using get_dataset().

However, get_dataset() raises AirflowException whenever an HttpError is received, 
so it cannot be used to determine whether the dataset exists.





[GitHub] wyndhblb opened a new pull request #4163: [AIRFLOW-XXX] - Need in try_number in labels if getting them later

2018-11-08 Thread GitBox
wyndhblb opened a new pull request #4163: [AIRFLOW-XXX] - Need in try_number in 
 labels if getting them later
URL: https://github.com/apache/incubator-airflow/pull/4163
 
 
   Make sure you have checked _all_ steps below.
   
   ### Jira
   
   - [ ] My PR addresses the following [Airflow 
Jira](https://issues.apache.org/jira/browse/AIRFLOW/) issues and references 
them in the PR title. For example, "\[AIRFLOW-XXX\] My Airflow PR"
 - https://issues.apache.org/jira/browse/AIRFLOW-XXX
 - If you are only fixing a typo in the documentation, you can prepend your 
commit with \[AIRFLOW-XXX\]; code changes always need a Jira issue.
   
   ### Description
   
   - [x] Here are some details about my PR, including screenshots of any UI changes:
   
   The `_labels_to_key` function attempts to read the `try_number` from the Kubernetes pod labels; however, no such label is applied to the pod, resulting in an exception.
   
   This PR modifies the pod executor to add that label when the pod is created; a rough sketch of the idea follows.
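
A hypothetical, self-contained sketch of the idea (the function and label handling here are invented for illustration and are not the executor's actual code):

```python
# Put try_number into the worker pod's labels so it can be recovered later.
def make_pod_labels(dag_id, task_id, execution_date, try_number):
    # Kubernetes label values must be strings; execution_date is assumed to
    # already be a label-safe string here.
    return {
        'dag_id': dag_id,
        'task_id': task_id,
        'execution_date': execution_date,
        'try_number': str(try_number),
    }


def labels_to_key(labels):
    try:
        return (labels['dag_id'], labels['task_id'],
                labels['execution_date'], int(labels['try_number']))
    except KeyError:
        return None  # pod was not created by this executor, or a label is missing


labels = make_pod_labels('my_dag', 'my_task', '2018-11-08T00-00-00', 1)
assert labels_to_key(labels) == ('my_dag', 'my_task', '2018-11-08T00-00-00', 1)
```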
   
   ### Tests
   
   - [ ] My PR adds the following unit tests __OR__ does not need testing for 
this extremely good reason:
   
   ### Commits
   
   - [ ] My commits all reference Jira issues in their subject lines, and I 
have squashed multiple commits if they address the same issue. In addition, my 
commits follow the guidelines from "[How to write a good git commit 
message](http://chris.beams.io/posts/git-commit/)":
 1. Subject is separated from body by a blank line
 1. Subject is limited to 50 characters (not including Jira issue reference)
 1. Subject does not end with a period
 1. Subject uses the imperative mood ("add", not "adding")
 1. Body wraps at 72 characters
 1. Body explains "what" and "why", not "how"
   
   ### Documentation
   
   - [ ] In case of new functionality, my PR adds documentation that describes 
how to use it.
 - When adding new operators/hooks/sensors, the autoclass documentation 
generation needs to be added.
   
   ### Code Quality
   
   - [x] Passes `flake8`
   




[jira] [Updated] (AIRFLOW-3165) Document use of interpolation by ConfigParser

2018-11-08 Thread Ash Berlin-Taylor (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-3165?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ash Berlin-Taylor updated AIRFLOW-3165:
---
Fix Version/s: (was: 2.0.0)
   1.10.1

> Document use of interpolation by ConfigParser
> -
>
> Key: AIRFLOW-3165
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3165
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: Documentation
>Reporter: Bolke de Bruin
>Priority: Major
> Fix For: 1.10.1
>
>
> The config parser interpolates '%' in variables. This can lead to issues when 
> specifying passwords. As we can't disable interpolation on a per-variable basis, 
> we need to document that people should not use a % sign in their passwords.
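
The behaviour is plain Python ConfigParser interpolation; a small stdlib-only example (the option name and values below are just illustrations, not real Airflow settings):

```python
import configparser

# A bare '%' (not doubled) breaks interpolation when the value is read.
bad = configparser.ConfigParser()
bad.read_string("[core]\nsql_alchemy_conn = mysql://user:pa%4sword@host/db\n")
try:
    bad.get('core', 'sql_alchemy_conn')
except configparser.InterpolationSyntaxError as err:
    print('bare % breaks the value:', err)

# '%%' is the escaped form and yields a literal '%'.
ok = configparser.ConfigParser()
ok.read_string("[core]\nsql_alchemy_conn = mysql://user:pa%%4sword@host/db\n")
print(ok.get('core', 'sql_alchemy_conn'))   # -> mysql://user:pa%4sword@host/db
```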





[jira] [Updated] (AIRFLOW-3103) Update flask-login

2018-11-08 Thread Ash Berlin-Taylor (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-3103?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ash Berlin-Taylor updated AIRFLOW-3103:
---
Fix Version/s: 1.10.1

> Update flask-login
> --
>
> Key: AIRFLOW-3103
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3103
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: authentication
>Reporter: Josh Carp
>Priority: Minor
> Fix For: 1.10.1
>
>
> Airflow uses a release of flask-login from 2014. Flask-login has fixed some 
> bugs and added some features since then, so we should upgrade. Note: 
> flask-appbuilder also pins to an old version of flask-login, so we'll have to 
> update that library as well; PR submitted at 
> https://github.com/dpgaspar/Flask-AppBuilder/pull/811.





[GitHub] ashb edited a comment on issue #4162: [AIRFLOW-XXX] Speed up RBAC view tests

2018-11-08 Thread GitBox
ashb edited a comment on issue #4162: [AIRFLOW-XXX] Speed up RBAC view tests
URL: 
https://github.com/apache/incubator-airflow/pull/4162#issuecomment-437187177
 
 
   I'm not certain, but this might have taken 15mins run time (not elapsed 
time) off the tests on Travis?!
   
   Nah, too noisy to tell. Anyway, it made it quicker _for me_ when running 
tests locally so it's def helpful.




[GitHub] ashb commented on issue #4162: [AIRFLOW-XXX] Speed up RBAC view tests

2018-11-08 Thread GitBox
ashb commented on issue #4162: [AIRFLOW-XXX] Speed up RBAC view tests
URL: 
https://github.com/apache/incubator-airflow/pull/4162#issuecomment-437187177
 
 
   I'm not certain, but this might have taken 15mins run time (not elapsed 
time) off the tests on Travis?!




[jira] [Resolved] (AIRFLOW-2524) Airflow integration with AWS Sagemaker

2018-11-08 Thread Ash Berlin-Taylor (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-2524?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ash Berlin-Taylor resolved AIRFLOW-2524.

Resolution: Fixed

> Airflow integration with AWS Sagemaker
> --
>
> Key: AIRFLOW-2524
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2524
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: aws, contrib
>Reporter: Rajeev Srinivasan
>Assignee: Yang Yu
>Priority: Major
>  Labels: AWS
> Fix For: 1.10.1
>
>  Time Spent: 10m
>  Remaining Estimate: 0h
>
> Would it be possible to orchestrate an end-to-end AWS SageMaker job using 
> Airflow?





[jira] [Updated] (AIRFLOW-1762) Use key_file in SSHHook.create_tunnel()

2018-11-08 Thread Ash Berlin-Taylor (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-1762?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ash Berlin-Taylor updated AIRFLOW-1762:
---
Summary: Use key_file in SSHHook.create_tunnel()  (was: Use key_file in 
create_tunnel())

> Use key_file in SSHHook.create_tunnel()
> ---
>
> Key: AIRFLOW-1762
> URL: https://issues.apache.org/jira/browse/AIRFLOW-1762
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: contrib
>Affects Versions: 1.8.0, 1.9.0
>Reporter: Nathan McIntyre
>Assignee: Nathan McIntyre
>Priority: Major
>  Labels: patch
> Fix For: 2.0.0, 1.10.1
>
>
> In contrib/hooks/ssh_hook.py, the ssh command created by the create_tunnel() 
> method does not use the key_file attribute. This prevents the creation of 
> tunnels where a key file is required. 
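
For illustration, a hypothetical sketch of the kind of command such a tunnel needs (names invented; this is not the hook's actual implementation):

```python
# Build an ssh local-forwarding command, adding -i only when a key file is set.
def build_tunnel_cmd(user, host, local_port, remote_port, key_file=None):
    cmd = ['ssh', '{}@{}'.format(user, host),
           '-L', '{}:localhost:{}'.format(local_port, remote_port),
           '-N']                      # no remote command, just the tunnel
    if key_file:
        cmd += ['-i', key_file]       # the missing piece this issue describes
    return cmd


print(' '.join(build_tunnel_cmd('airflow', 'gateway', 2380, 5432,
                                key_file='~/.ssh/id_rsa')))
```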





[jira] [Updated] (AIRFLOW-3265) Add support for "unix_socket" in connection extra for Mysql Hook

2018-11-08 Thread Ash Berlin-Taylor (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-3265?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ash Berlin-Taylor updated AIRFLOW-3265:
---
Summary: Add support for "unix_socket" in connection extra for Mysql Hook  
(was: Mysql Hook does not support "unix_socket" extra)

> Add support for "unix_socket" in connection extra for Mysql Hook
> 
>
> Key: AIRFLOW-3265
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3265
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: database
>Reporter: Jarek Potiuk
>Assignee: Jarek Potiuk
>Priority: Minor
> Fix For: 1.10.1
>
>
> The MySQL hook does not support the "unix_socket" extra, which allows specifying 
> a location for the Unix socket other than the default one. This is a blocker 
> for tools like cloud-sql-proxy that create sockets in arbitrary places.
> I will provide a fix shortly.
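
A hedged sketch of the idea (illustrative only, not the merged hook code): read a "unix_socket" key from the connection's extra JSON and pass it to the MySQL client, so the socket can live wherever cloud-sql-proxy puts it.

```python
import json

import MySQLdb   # mysqlclient


def connect_from_airflow_conn(conn):
    extra = json.loads(conn.extra or '{}')
    kwargs = dict(user=conn.login, passwd=conn.password or '',
                  host=conn.host or 'localhost', db=conn.schema or '')
    if 'unix_socket' in extra:
        kwargs['unix_socket'] = extra['unix_socket']   # e.g. a /cloudsql/... path
    return MySQLdb.connect(**kwargs)
```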





[jira] [Updated] (AIRFLOW-2476) Update tabulate dependency version to 0.8.2

2018-11-08 Thread Ash Berlin-Taylor (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-2476?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ash Berlin-Taylor updated AIRFLOW-2476:
---
Summary: Update tabulate dependency version to 0.8.2  (was: tabulate 
update: 0.8.2 is tested)

> Update tabulate dependency version to 0.8.2
> ---
>
> Key: AIRFLOW-2476
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2476
> Project: Apache Airflow
>  Issue Type: Improvement
>Affects Versions: 1.8.0, 1.9.0, 1.10.0, 2.0.0
>Reporter: Ruslan Dautkhanov
>Assignee: Kaxil Naik
>Priority: Major
> Fix For: 1.10.1
>
>
> As discussed on the dev list, tabulate==0.8.2 is good to go with Airflow.





[GitHub] reubenvanammers commented on a change in pull request #4156: [AIRFLOW-3314] Changed auto inlets feature to work as described

2018-11-08 Thread GitBox
reubenvanammers commented on a change in pull request #4156: [AIRFLOW-3314] 
Changed auto inlets feature to work as described
URL: https://github.com/apache/incubator-airflow/pull/4156#discussion_r232089644
 
 

 ##
 File path: airflow/lineage/__init__.py
 ##
 @@ -110,26 +111,32 @@ def wrapper(self, context, *args, **kwargs):
                       for i in inlets]
             self.inlets.extend(inlets)
 
-        if self._inlets['auto']:
-            # dont append twice
-            task_ids = set(self._inlets['task_ids']).symmetric_difference(
-                self.upstream_task_ids
-            )
-            inlets = self.xcom_pull(context,
-                                    task_ids=task_ids,
-                                    dag_id=self.dag_id,
-                                    key=PIPELINE_OUTLETS)
-            inlets = [item for sublist in inlets if sublist for item in sublist]
-            inlets = [DataSet.map_type(i['typeName'])(data=i['attributes'])
-                      for i in inlets]
-            self.inlets.extend(inlets)
-
-        if len(self._inlets['datasets']) > 0:
-            self.inlets.extend(self._inlets['datasets'])
+        if self._inlets["auto"]:  # Performs a tree traversal, starting with the current task. If outlets
 
 Review comment:
   Sure! I wanted to keep the documentation as-is, since I was writing this PR 
to match the features described in the documentation. In hindsight, since it does 
add behaviour not fully covered by the existing docstring, it makes sense to 
move the explanation into the docstring and clear the behaviour up a little.




[jira] [Commented] (AIRFLOW-2711) zendesk hook doesn't handle search endpoint properly

2018-11-08 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-2711?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16680472#comment-16680472
 ] 

ASF GitHub Bot commented on AIRFLOW-2711:
-

stale[bot] closed pull request #3575: [AIRFLOW-2711] zendesk hook doesn't 
handle search endpoint properly
URL: https://github.com/apache/incubator-airflow/pull/3575
 
 
   

This is a PR merged from a forked repository.
As GitHub hides the original diff on merge, it is displayed below for
the sake of provenance:

As this is a foreign pull request (from a fork), the diff is supplied
below (as it won't show otherwise due to GitHub magic):

diff --git a/airflow/hooks/zendesk_hook.py b/airflow/hooks/zendesk_hook.py
index 3cf8353344..102556a5cd 100644
--- a/airflow/hooks/zendesk_hook.py
+++ b/airflow/hooks/zendesk_hook.py
@@ -74,7 +74,11 @@ def call(self, path, query=None, get_all_pages=True, side_loading=False):
             self.__handle_rate_limit_exception(rle)
 
         # Find the key with the results
-        keys = [path.split("/")[-1].split(".json")[0]]
+        key = path.split("/")[-1].split(".json")[0]
+        if key == 'search':
+            keys = ['results']
+        else:
+            keys = [key]
         next_page = results['next_page']
         if side_loading:
             keys += query['include'].split(',')


 




> zendesk hook doesn't handle search endpoint properly
> 
>
> Key: AIRFLOW-2711
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2711
> Project: Apache Airflow
>  Issue Type: Bug
>Affects Versions: 1.9.0
>Reporter: Chris Chow
>Priority: Major
>
> The Zendesk hook assumes that the API's response includes the expected result 
> under a key with the same name as the API endpoint, e.g. that the response to a 
> query against /api/v2/users.json includes the key 'users'. However, 
> /api/v2/search.json actually returns its results under the key 'results'.
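
In miniature, the shape difference and the key selection the fix needs (response payloads below are illustrative, not real Zendesk data):

```python
# Most endpoints key their payload by the endpoint name; search.json does not.
users_response = {'users': [{'id': 1}], 'next_page': None}
search_response = {'results': [{'id': 2}], 'next_page': None}


def results_key(path):
    key = path.split('/')[-1].split('.json')[0]
    return 'results' if key == 'search' else key


assert users_response[results_key('/api/v2/users.json')] == [{'id': 1}]
assert search_response[results_key('/api/v2/search.json')] == [{'id': 2}]
```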





[GitHub] codecov-io commented on issue #4162: [AIRFLOW-XXX] Speed up RBAC view tests

2018-11-08 Thread GitBox
codecov-io commented on issue #4162: [AIRFLOW-XXX] Speed up RBAC view tests
URL: 
https://github.com/apache/incubator-airflow/pull/4162#issuecomment-437183773
 
 
   # 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4162?src=pr&el=h1)
 Report
   > Merging 
[#4162](https://codecov.io/gh/apache/incubator-airflow/pull/4162?src=pr&el=desc)
 into 
[master](https://codecov.io/gh/apache/incubator-airflow/commit/52ee34c5fc7818797e0a7463d896e80e1482cbfd?src=pr&el=desc)
 will **decrease** coverage by `0.01%`.
   > The diff coverage is `n/a`.
   
   [![Impacted file tree 
graph](https://codecov.io/gh/apache/incubator-airflow/pull/4162/graphs/tree.svg?width=650&token=WdLKlKHOAU&height=150&src=pr)](https://codecov.io/gh/apache/incubator-airflow/pull/4162?src=pr&el=tree)
   
   ```diff
   @@            Coverage Diff             @@
   ##           master    #4162      +/-   ##
   ==========================================
   - Coverage   77.68%   77.66%   -0.02%
   ==========================================
     Files         199      199
     Lines       16273    16273
   ==========================================
   - Hits        12641    12638       -3
   - Misses       3632     3635       +3
   ```
   
   
   | [Impacted 
Files](https://codecov.io/gh/apache/incubator-airflow/pull/4162?src=pr&el=tree) 
| Coverage Δ | |
   |---|---|---|
   | 
[airflow/jobs.py](https://codecov.io/gh/apache/incubator-airflow/pull/4162/diff?src=pr&el=tree#diff-YWlyZmxvdy9qb2JzLnB5)
 | `77.09% <0%> (-0.28%)` | :arrow_down: |
   
   --
   
   [Continue to review full report at 
Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4162?src=pr&el=continue).
   > **Legend** - [Click here to learn 
more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data`
   > Powered by 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4162?src=pr&el=footer).
 Last update 
[52ee34c...ac39538](https://codecov.io/gh/apache/incubator-airflow/pull/4162?src=pr&el=lastupdated).
 Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   




[jira] [Updated] (AIRFLOW-843) Exceptions now available in context during on_failure_callback

2018-11-08 Thread Ash Berlin-Taylor (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-843?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ash Berlin-Taylor updated AIRFLOW-843:
--
Summary: Exceptions now available in context during on_failure_callback  
(was: Store task exceptions in context)

> Exceptions now available in context during on_failure_callback
> --
>
> Key: AIRFLOW-843
> URL: https://issues.apache.org/jira/browse/AIRFLOW-843
> Project: Apache Airflow
>  Issue Type: Improvement
>Reporter: Scott Kruger
>Priority: Minor
> Fix For: 1.10.1
>
>
> If a task encounters an exception during execution, it should store the 
> exception on the execution context so that other methods (namely 
> `on_failure_callback`) can access it. This would help with custom error 
> integrations, e.g. Sentry.
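
A sketch of the kind of callback this would enable, assuming the failing task's exception is exposed as context['exception'] as proposed (not a guaranteed API; the error-reporting call is a placeholder):

```python
def notify_on_failure(context):
    exc = context.get('exception')            # the exception raised by the task
    ti = context.get('task_instance')
    print('Task %s failed: %r' % (ti, exc))   # a real callback would forward this
                                              # to Sentry/Slack/etc.


# Example wiring (assumption, for illustration):
# default_args = {'on_failure_callback': notify_on_failure}
```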





[jira] [Commented] (AIRFLOW-2594) Airflow run state change is non atomic

2018-11-08 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-2594?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16680473#comment-16680473
 ] 

ASF GitHub Bot commented on AIRFLOW-2594:
-

stale[bot] closed pull request #3489: [AIRFLOW-2594][WIP] Remove implicit 
commits
URL: https://github.com/apache/incubator-airflow/pull/3489
 
 
   

This is a PR merged from a forked repository.
As GitHub hides the original diff on merge, it is displayed below for
the sake of provenance:

As this is a foreign pull request (from a fork), the diff is supplied
below (as it won't show otherwise due to GitHub magic):

diff --git a/airflow/api/common/experimental/mark_tasks.py b/airflow/api/common/experimental/mark_tasks.py
index e9e4fec223..101cf8cf53 100644
--- a/airflow/api/common/experimental/mark_tasks.py
+++ b/airflow/api/common/experimental/mark_tasks.py
@@ -81,7 +81,7 @@ def set_state(task, execution_date, upstream=False, downstream=False,
     assert task.dag is not None
     dag = task.dag
 
-    latest_execution_date = dag.latest_execution_date
+    latest_execution_date = dag.latest_execution_date()
     assert latest_execution_date is not None
 
     # determine date range of dag runs and tasks to consider
diff --git a/airflow/jobs.py b/airflow/jobs.py
index ad114abda3..55b8ebec33 100644
--- a/airflow/jobs.py
+++ b/airflow/jobs.py
@@ -1402,7 +1402,7 @@ def _process_dags(self, dagbag, dags, tis_out):
         """
         for dag in dags:
             dag = dagbag.get_dag(dag.dag_id)
-            if dag.is_paused:
+            if dag.is_paused():
                 self.log.info("Not processing DAG %s since it's paused",
                               dag.dag_id)
                 continue
 
@@ -1795,7 +1795,7 @@ def process_file(self, file_path, pickle_dags=False, session=None):
             dag.sync_to_db()
 
         paused_dag_ids = [dag.dag_id for dag in dagbag.dags.values()
-                          if dag.is_paused]
+                          if dag.is_paused()]
 
         # Pickle the DAGs (if necessary) and put them into a SimpleDag
         for dag_id in dagbag.dags:
@@ -2391,7 +2391,7 @@ def _process_backfill_task_instances(self,
                     ti_status.active_runs.remove(run)
                     executed_run_dates.append(run.execution_date)
 
-                    if run.dag.is_paused:
+                    if run.dag.is_paused():
                         models.DagStat.update([run.dag_id], session=session)
 
             self._log_progress(ti_status)
diff --git a/airflow/models.py b/airflow/models.py
index 704dc808cf..767ff1db31 100755
--- a/airflow/models.py
+++ b/airflow/models.py
@@ -1241,7 +1241,6 @@ def are_dependents_done(self, session=None):
         count = ti[0][0]
         return count == len(task.downstream_task_ids)
 
-    @property
     @provide_session
     def previous_ti(self, session=None):
         """ The task instance for the task that ran before this task instance """
@@ -3454,7 +3453,6 @@ def folder(self):
     def owner(self):
         return ", ".join(list(set([t.owner for t in self.tasks])))
 
-    @property
     @provide_session
     def concurrency_reached(self, session=None):
         """
@@ -3468,7 +3466,6 @@ def concurrency_reached(self, session=None):
         )
         return qry.scalar() >= self.concurrency
 
-    @property
     @provide_session
     def is_paused(self, session=None):
         """
@@ -3557,7 +3554,6 @@ def get_dagrun(self, execution_date, session=None):
 
         return dagrun
 
-    @property
     @provide_session
     def latest_execution_date(self, session=None):
         """
diff --git a/airflow/ti_deps/deps/dag_ti_slots_available_dep.py b/airflow/ti_deps/deps/dag_ti_slots_available_dep.py
index c3245ebe53..7fc3561c39 100644
--- a/airflow/ti_deps/deps/dag_ti_slots_available_dep.py
+++ b/airflow/ti_deps/deps/dag_ti_slots_available_dep.py
@@ -26,7 +26,7 @@ class DagTISlotsAvailableDep(BaseTIDep):
 
     @provide_session
     def _get_dep_statuses(self, ti, session, dep_context):
-        if ti.task.dag.concurrency_reached:
+        if ti.task.dag.concurrency_reached(session):
             yield self._failing_status(
                 reason="The maximum number of running tasks ({0}) for this task's DAG "
                        "'{1}' has been reached.".format(ti.task.dag.concurrency,
diff --git a/airflow/ti_deps/deps/dag_unpaused_dep.py b/airflow/ti_deps/deps/dag_unpaused_dep.py
index adf150fb07..750a3b3bc2 100644
--- a/airflow/ti_deps/deps/dag_unpaused_dep.py
+++ b/airflow/ti_deps/deps/dag_unpaused_dep.py
@@ -26,6 +26,6 @@ class DagUnpausedDep(BaseTIDep):
 
     @provide_session
     def _get_dep_statuses(self, ti, session, dep_context):
-        if ti.task.dag.is_paused:
+        if ti.task.dag.is_paused(session):
             yield self._failing_status(
                 reason="Task's DAG '{0}' is paused.".format(ti.dag_id))
diff --git a/airflow/ti_deps/deps/prev_dagrun_dep.py b/airflow/ti_deps/deps/

[jira] [Commented] (AIRFLOW-2524) Airflow integration with AWS Sagemaker

2018-11-08 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-2524?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16680459#comment-16680459
 ] 

ASF GitHub Bot commented on AIRFLOW-2524:
-

ashb closed pull request #4126: [AIRFLOW-2524] More AWS SageMaker operators, 
sensors for model, endpoint-config and endpoint
URL: https://github.com/apache/incubator-airflow/pull/4126
 
 
   

This is a PR merged from a forked repository.
As GitHub hides the original diff on merge, it is displayed below for
the sake of provenance:

As this is a foreign pull request (from a fork), the diff is supplied
below (as it won't show otherwise due to GitHub magic):

diff --git a/airflow/contrib/hooks/aws_hook.py b/airflow/contrib/hooks/aws_hook.py
index 265d4e56af..9d4a73e1c0 100644
--- a/airflow/contrib/hooks/aws_hook.py
+++ b/airflow/contrib/hooks/aws_hook.py
@@ -183,7 +183,7 @@ def get_session(self, region_name=None):
     def get_credentials(self, region_name=None):
         """Get the underlying `botocore.Credentials` object.
 
-        This contains the attributes: access_key, secret_key and token.
+        This contains the following authentication attributes: access_key, secret_key and token.
         """
         session, _ = self._get_credentials(region_name)
         # Credentials are refreshable, so accessing your access key and
@@ -193,8 +193,8 @@ def get_credentials(self, region_name=None):
 
     def expand_role(self, role):
         """
-        Expand an IAM role name to an IAM role ARN. If role is already an IAM ARN,
-        no change is made.
+        If the IAM role is a role name, get the Amazon Resource Name (ARN) for the role.
+        If IAM role is already an IAM role ARN, no change is made.
 
         :param role: IAM role name or ARN
         :return: IAM role ARN
diff --git a/airflow/contrib/operators/sagemaker_base_operator.py b/airflow/contrib/operators/sagemaker_base_operator.py
index cf1e59387a..08d6d0eb6a 100644
--- a/airflow/contrib/operators/sagemaker_base_operator.py
+++ b/airflow/contrib/operators/sagemaker_base_operator.py
@@ -79,7 +79,7 @@ def parse_config_integers(self):
             self.parse_integer(self.config, field)
 
     def expand_role(self):
-        raise NotImplementedError('Please implement expand_role() in sub class!')
+        pass
 
     def preprocess_config(self):
         self.log.info(
diff --git a/airflow/contrib/operators/sagemaker_endpoint_config_operator.py b/airflow/contrib/operators/sagemaker_endpoint_config_operator.py
new file mode 100644
index 00..a94cf30229
--- /dev/null
+++ b/airflow/contrib/operators/sagemaker_endpoint_config_operator.py
@@ -0,0 +1,67 @@
+# -*- coding: utf-8 -*-
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+from airflow.contrib.operators.sagemaker_base_operator import 
SageMakerBaseOperator
+from airflow.utils.decorators import apply_defaults
+from airflow.exceptions import AirflowException
+
+
+class SageMakerEndpointConfigOperator(SageMakerBaseOperator):
+
+    """
+    Create a SageMaker endpoint config.
+
+    This operator returns The ARN of the endpoint config created in Amazon SageMaker
+
+    :param config: The configuration necessary to create an endpoint config.
+
+        For details of the configuration parameter, See:
+        https://boto3.amazonaws.com/v1/documentation/api/latest/reference/services/sagemaker.html#SageMaker.Client.create_endpoint_config
+    :type config: dict
+    :param aws_conn_id: The AWS connection ID to use.
+    :type aws_conn_id: str
+    """  # noqa: E501
+
+    integer_fields = [
+        ['ProductionVariants', 'InitialInstanceCount']
+    ]
+
+    @apply_defaults
+    def __init__(self,
+                 config,
+                 *args, **kwargs):
+        super(SageMakerEndpointConfigOperator, self).__init__(config=config,
+                                                              *args, **kwargs)
+
+        self.config = config
+
+    def execute(self, context):
+        self.preprocess_config()
+
+        self.log.info('Creating SageMaker Endpoint Config %s.', self.config['EndpointConfigName'])
+        response = self.hook.create_endpoint_config(self.confi

[jira] [Created] (AIRFLOW-3317) FTP Sensor fails immediately when file doesn't exist

2018-11-08 Thread Rolando Bernabe Tribo (JIRA)
Rolando Bernabe Tribo created AIRFLOW-3317:
--

 Summary: FTP Sensor fails immediately when file doesn't exist
 Key: AIRFLOW-3317
 URL: https://issues.apache.org/jira/browse/AIRFLOW-3317
 Project: Apache Airflow
  Issue Type: Bug
  Components: contrib
Affects Versions: 1.10.0
Reporter: Rolando Bernabe Tribo


The FTP sensor fails outright when the file does not exist on the FTP server:
{code:java}
Traceback (most recent call last):
  File "/usr/local/lib/python3.6/site-packages/airflow/models.py", line 1633, 
in _run_raw_task
result = task_copy.execute(context=context)
  File 
"/usr/local/lib/python3.6/site-packages/airflow/sensors/base_sensor_operator.py",
 line 68, in execute
while not self.poke(context):
  File 
"/usr/local/lib/python3.6/site-packages/airflow/contrib/sensors/ftp_sensor.py", 
line 56, in poke
raise e
  File 
"/usr/local/lib/python3.6/site-packages/airflow/contrib/sensors/ftp_sensor.py", 
line 52, in poke
hook.get_mod_time(self.path)
  File 
"/usr/local/lib/python3.6/site-packages/airflow/contrib/hooks/ftp_hook.py", 
line 234, in get_mod_time
ftp_mdtm = conn.sendcmd('MDTM ' + path)
  File "/usr/local/lib/python3.6/ftplib.py", line 273, in sendcmd
return self.getresp()
  File "/usr/local/lib/python3.6/ftplib.py", line 246, in getresp
raise error_perm(resp)
ftplib.error_perm: 550 The system cannot find the file specified. 
[2018-11-08 21:34:45,123] {{models.py:1756}} INFO - Marking task as 
UP_FOR_RETRY{code}
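
A sketch of the poke behaviour one would expect instead (illustrative names, not the sensor's actual code): treat FTP error 550 ("file not found") as "not there yet" rather than a hard failure.

```python
import ftplib


def poke_for_file(hook, path):
    try:
        hook.get_mod_time(path)      # raises ftplib.error_perm if the file is missing
    except ftplib.error_perm as err:
        if str(err).startswith('550'):
            return False             # keep sensing; the file may arrive later
        raise                        # other permission errors are real errors
    return True
```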





[GitHub] ashb commented on issue #3913: [AIRFLOW-3072] Assign permission get_logs_with_metadata to viewer role

2018-11-08 Thread GitBox
ashb commented on issue #3913: [AIRFLOW-3072] Assign permission 
get_logs_with_metadata to viewer role
URL: 
https://github.com/apache/incubator-airflow/pull/3913#issuecomment-437180010
 
 
   @seelmann The JIRA issue for this PR is marked as 1.10.1, but the diff 
doesn't apply cleanly there. Would you be able to come up with a patch against 
the v1-10-test branch please?


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Resolved] (AIRFLOW-3312) No log output from BashOperator under test

2018-11-08 Thread Ash Berlin-Taylor (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-3312?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ash Berlin-Taylor resolved AIRFLOW-3312.

Resolution: Duplicate

> No log output from BashOperator under test
> --
>
> Key: AIRFLOW-3312
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3312
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: logging, operators
>Affects Versions: 1.10.0
>Reporter: Chris Bandy
>Priority: Major
>
> The BashOperator logs some messages as well as the stdout of its command at 
> the info level, but none of these appear when running {{airflow test}} with 
> the default configuration.
> For example, this DAG emits the following in Airflow 1.10.0:
> {code:python}
> from airflow import DAG
> from airflow.operators.bash_operator import BashOperator
> from datetime import datetime
> dag = DAG('please', start_date=datetime(year=2018, month=11, day=1))
> BashOperator(dag=dag, task_id='mine', bash_command='echo thank you')
> {code}
> {noformat}
> $ airflow test please mine '2018-11-01'
> [2018-11-08 00:06:54,098] {__init__.py:51} INFO - Using executor 
> SequentialExecutor
> [2018-11-08 00:06:54,246] {models.py:258} INFO - Filling up the DagBag from 
> /usr/local/airflow/dags
> {noformat}
> When executed by the scheduler, logs go to a file:
> {noformat}
> $ airflow scheduler -n 1
> ...
> [2018-11-08 00:41:02,674] {dag_processing.py:582} INFO - Started a process 
> (PID: 9) to generate tasks for /usr/local/airflow/dags/please.py
> [2018-11-08 00:41:03,185] {dag_processing.py:495} INFO - Processor for 
> /usr/local/airflow/dags/please.py finished
> [2018-11-08 00:41:03,525] {jobs.py:1114} INFO - Tasks up for execution:
>   
> [2018-11-08 00:41:03,536] {jobs.py:1147} INFO - Figuring out tasks to run in 
> Pool(name=None) with 128 open slots and 1 task instances in queue
> [2018-11-08 00:41:03,539] {jobs.py:1184} INFO - DAG please has 0/16 running 
> and queued tasks
> [2018-11-08 00:41:03,540] {jobs.py:1216} INFO - Setting the follow tasks to 
> queued state:
>   
> [2018-11-08 00:41:03,573] {jobs.py:1297} INFO - Setting the follow tasks to 
> queued state:
>   
> [2018-11-08 00:41:03,576] {jobs.py:1339} INFO - Sending ('please', 'mine', 
> datetime.datetime(2018, 11, 1, 0, 0, tzinfo=)) to executor 
> with priority 1 and queue default
> [2018-11-08 00:41:03,578] {base_executor.py:56} INFO - Adding to queue: 
> airflow run please mine 2018-11-01T00:00:00+00:00 --local -sd 
> /usr/local/airflow/dags/please.py
> [2018-11-08 00:41:03,593] {sequential_executor.py:45} INFO - Executing 
> command: airflow run please mine 2018-11-01T00:00:00+00:00 --local -sd 
> /usr/local/airflow/dags/please.py
> [2018-11-08 00:41:04,262] {__init__.py:51} INFO - Using executor 
> SequentialExecutor
> [2018-11-08 00:41:04,406] {models.py:258} INFO - Filling up the DagBag from 
> /usr/local/airflow/dags/please.py
> [2018-11-08 00:41:04,458] {cli.py:492} INFO - Running  please.mine 2018-11-01T00:00:00+00:00 [queued]> on host e2e08cf4dfaa
> [2018-11-08 00:41:09,684] {jobs.py:1443} INFO - Executor reports please.mine 
> execution_date=2018-11-01 00:00:00+00:00 as success
> $ cat logs/please/mine/2018-11-01T00\:00\:00+00\:00/1.log
> [2018-11-08 00:41:04,554] {models.py:1335} INFO - Dependencies all met for 
> 
> [2018-11-08 00:41:04,564] {models.py:1335} INFO - Dependencies all met for 
> 
> [2018-11-08 00:41:04,565] {models.py:1547} INFO -
> 
> Starting attempt 1 of 1
> 
> [2018-11-08 00:41:04,605] {models.py:1569} INFO - Executing 
>  on 2018-11-01T00:00:00+00:00
> [2018-11-08 00:41:04,605] {base_task_runner.py:124} INFO - Running: ['bash', 
> '-c', 'airflow run please mine 2018-11-01T00:00:00+00:00 --job_id 142 --raw 
> -sd DAGS_FOLDER/please.py --cfg_path /tmp/tmp9prq7knr']
> [2018-11-08 00:41:05,214] {base_task_runner.py:107} INFO - Job 142: Subtask 
> mine [2018-11-08 00:41:05,213] {__init__.py:51} INFO - Using executor 
> SequentialExecutor
> [2018-11-08 00:41:05,334] {base_task_runner.py:107} INFO - Job 142: Subtask 
> mine [2018-11-08 00:41:05,333] {models.py:258} INFO - Filling up the DagBag 
> from /usr/local/airflow/dags/please.py
> [2018-11-08 00:41:05,368] {base_task_runner.py:107} INFO - Job 142: Subtask 
> mine [2018-11-08 00:41:05,367] {cli.py:492} INFO - Running  please.mine 2018-11-01T00:00:00+00:00 [running]> on host e2e08cf4dfaa
> [2018-11-08 00:41:05,398] {bash_operator.py:74} INFO - Tmp dir root location:
>  /tmp
> [2018-11-08 00:41:05,398] {bash_operator.py:87} INFO - Temporary script 
> location: /tmp/airflowtmp0is6wwxi/mine8tmew5y4
> [2018-11-08 00:41:05,399] {bash_operator.py:97} INFO - Running command: echo 
> tha

[GitHub] ashb opened a new pull request #4162: [AIRFLOW-XXX] Speed up RBAC view tests

2018-11-08 Thread GitBox
ashb opened a new pull request #4162: [AIRFLOW-XXX] Speed up RBAC view tests
URL: https://github.com/apache/incubator-airflow/pull/4162
 
 
   
   Make sure you have checked _all_ steps below.
   
   ### Jira
   
   - [x] No jira needed I think?
   
   ### Description
   
   - [x] Not re-creating the FAB app once per test function took the run time of
   TestAirflowBaseViews from 223s down to 53s on my laptop, _and_ made
   it only print the deprecation warning (fixed in another PR already open)
   once instead of 10+ times. A rough sketch of the approach is shown below.
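
   A hedged sketch of the idea (not the PR's actual diff; the `create_app` usage below is an assumption based on how the www_rbac test suite builds the app):

```python
import unittest

from airflow.www_rbac import app as application


class TestAirflowBaseViews(unittest.TestCase):

    @classmethod
    def setUpClass(cls):
        # Build the Flask-AppBuilder app once for the whole class instead of
        # re-creating it in setUp() for every single test method.
        cls.app, cls.appbuilder = application.create_app(testing=True)
        cls.client = cls.app.test_client()

    def test_index(self):
        resp = self.client.get('/', follow_redirects=True)
        self.assertEqual(resp.status_code, 200)
```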
   
   ### Tests
   
   - [x] My PR adds the following unit tests __OR__ does not need testing for 
this extremely good reason:
   
   ### Commits
   
   - [x] My commits all reference Jira issues in their subject lines, and I 
have squashed multiple commits if they address the same issue. In addition, my 
commits follow the guidelines from "[How to write a good git commit 
message](http://chris.beams.io/posts/git-commit/)":
 1. Subject is separated from body by a blank line
 1. Subject is limited to 50 characters (not including Jira issue reference)
 1. Subject does not end with a period
 1. Subject uses the imperative mood ("add", not "adding")
 1. Body wraps at 72 characters
 1. Body explains "what" and "why", not "how"
   
   ### Documentation
   
   - [x] In case of new functionality, my PR adds documentation that describes 
how to use it.
 - When adding new operators/hooks/sensors, the autoclass documentation 
generation needs to be added.
   
   ### Code Quality
   
   - [x] Passes `flake8`
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] stale[bot] closed pull request #3575: [AIRFLOW-2711] zendesk hook doesn't handle search endpoint properly

2018-11-08 Thread GitBox
stale[bot] closed pull request #3575: [AIRFLOW-2711] zendesk hook doesn't 
handle search endpoint properly
URL: https://github.com/apache/incubator-airflow/pull/3575
 
 
   

This is a PR merged from a forked repository.
As GitHub hides the original diff on merge, it is displayed below for
the sake of provenance:

As this is a foreign pull request (from a fork), the diff is supplied
below (as it won't show otherwise due to GitHub magic):

diff --git a/airflow/hooks/zendesk_hook.py b/airflow/hooks/zendesk_hook.py
index 3cf8353344..102556a5cd 100644
--- a/airflow/hooks/zendesk_hook.py
+++ b/airflow/hooks/zendesk_hook.py
@@ -74,7 +74,11 @@ def call(self, path, query=None, get_all_pages=True, 
side_loading=False):
 self.__handle_rate_limit_exception(rle)
 
 # Find the key with the results
-keys = [path.split("/")[-1].split(".json")[0]]
+key = path.split("/")[-1].split(".json")[0]
+if key == 'search':
+keys = ['results']
+else:
+keys = [key]
 next_page = results['next_page']
 if side_loading:
 keys += query['include'].split(',')
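
For context, a hedged usage sketch showing why the special case is needed: the search
endpoint nests its records under a `results` key rather than under a key derived from
the path, which is what the original `keys` computation assumed (the connection id and
query string below are made up):

```python
from airflow.hooks.zendesk_hook import ZendeskHook

hook = ZendeskHook(zendesk_conn_id='zendesk_default')  # hypothetical conn id

# The path ends in "search.json", so the old code looked for a "search" key
# in the response; the payload actually nests the records under "results".
open_tickets = hook.call('/api/v2/search.json',
                         query={'query': 'type:ticket status:open'},
                         get_all_pages=False)
```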


 


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] stale[bot] closed pull request #3489: [AIRFLOW-2594][WIP] Remove implicit commits

2018-11-08 Thread GitBox
stale[bot] closed pull request #3489: [AIRFLOW-2594][WIP] Remove implicit 
commits
URL: https://github.com/apache/incubator-airflow/pull/3489
 
 
   

This is a PR merged from a forked repository.
As GitHub hides the original diff on merge, it is displayed below for
the sake of provenance:

As this is a foreign pull request (from a fork), the diff is supplied
below (as it won't show otherwise due to GitHub magic):

diff --git a/airflow/api/common/experimental/mark_tasks.py 
b/airflow/api/common/experimental/mark_tasks.py
index e9e4fec223..101cf8cf53 100644
--- a/airflow/api/common/experimental/mark_tasks.py
+++ b/airflow/api/common/experimental/mark_tasks.py
@@ -81,7 +81,7 @@ def set_state(task, execution_date, upstream=False, 
downstream=False,
 assert task.dag is not None
 dag = task.dag
 
-latest_execution_date = dag.latest_execution_date
+latest_execution_date = dag.latest_execution_date()
 assert latest_execution_date is not None
 
 # determine date range of dag runs and tasks to consider
diff --git a/airflow/jobs.py b/airflow/jobs.py
index ad114abda3..55b8ebec33 100644
--- a/airflow/jobs.py
+++ b/airflow/jobs.py
@@ -1402,7 +1402,7 @@ def _process_dags(self, dagbag, dags, tis_out):
 """
 for dag in dags:
 dag = dagbag.get_dag(dag.dag_id)
-if dag.is_paused:
+if dag.is_paused():
 self.log.info("Not processing DAG %s since it's paused", 
dag.dag_id)
 continue
 
@@ -1795,7 +1795,7 @@ def process_file(self, file_path, pickle_dags=False, 
session=None):
 dag.sync_to_db()
 
 paused_dag_ids = [dag.dag_id for dag in dagbag.dags.values()
-  if dag.is_paused]
+  if dag.is_paused()]
 
 # Pickle the DAGs (if necessary) and put them into a SimpleDag
 for dag_id in dagbag.dags:
@@ -2391,7 +2391,7 @@ def _process_backfill_task_instances(self,
 ti_status.active_runs.remove(run)
 executed_run_dates.append(run.execution_date)
 
-if run.dag.is_paused:
+if run.dag.is_paused():
 models.DagStat.update([run.dag_id], session=session)
 
 self._log_progress(ti_status)
diff --git a/airflow/models.py b/airflow/models.py
index 704dc808cf..767ff1db31 100755
--- a/airflow/models.py
+++ b/airflow/models.py
@@ -1241,7 +1241,6 @@ def are_dependents_done(self, session=None):
 count = ti[0][0]
 return count == len(task.downstream_task_ids)
 
-@property
 @provide_session
 def previous_ti(self, session=None):
 """ The task instance for the task that ran before this task instance 
"""
@@ -3454,7 +3453,6 @@ def folder(self):
 def owner(self):
 return ", ".join(list(set([t.owner for t in self.tasks])))
 
-@property
 @provide_session
 def concurrency_reached(self, session=None):
 """
@@ -3468,7 +3466,6 @@ def concurrency_reached(self, session=None):
 )
 return qry.scalar() >= self.concurrency
 
-@property
 @provide_session
 def is_paused(self, session=None):
 """
@@ -3557,7 +3554,6 @@ def get_dagrun(self, execution_date, session=None):
 
 return dagrun
 
-@property
 @provide_session
 def latest_execution_date(self, session=None):
 """
diff --git a/airflow/ti_deps/deps/dag_ti_slots_available_dep.py 
b/airflow/ti_deps/deps/dag_ti_slots_available_dep.py
index c3245ebe53..7fc3561c39 100644
--- a/airflow/ti_deps/deps/dag_ti_slots_available_dep.py
+++ b/airflow/ti_deps/deps/dag_ti_slots_available_dep.py
@@ -26,7 +26,7 @@ class DagTISlotsAvailableDep(BaseTIDep):
 
 @provide_session
 def _get_dep_statuses(self, ti, session, dep_context):
-if ti.task.dag.concurrency_reached:
+if ti.task.dag.concurrency_reached(session):
 yield self._failing_status(
 reason="The maximum number of running tasks ({0}) for this 
task's DAG "
"'{1}' has been 
reached.".format(ti.task.dag.concurrency,
diff --git a/airflow/ti_deps/deps/dag_unpaused_dep.py 
b/airflow/ti_deps/deps/dag_unpaused_dep.py
index adf150fb07..750a3b3bc2 100644
--- a/airflow/ti_deps/deps/dag_unpaused_dep.py
+++ b/airflow/ti_deps/deps/dag_unpaused_dep.py
@@ -26,6 +26,6 @@ class DagUnpausedDep(BaseTIDep):
 
 @provide_session
 def _get_dep_statuses(self, ti, session, dep_context):
-if ti.task.dag.is_paused:
+if ti.task.dag.is_paused(session):
 yield self._failing_status(
 reason="Task's DAG '{0}' is paused.".format(ti.dag_id))
diff --git a/airflow/ti_deps/deps/prev_dagrun_dep.py 
b/airflow/ti_deps/deps/prev_dagrun_dep.py
index 2dd311f3aa..fd886db553 100644
--- a/airflow/ti_deps/deps/prev_dagrun_dep.py
+++ b/airflow/ti_deps/deps/prev_dagrun_dep.py
@@ -56,15 +56,15 @@ def _get_dep_statuses(self, ti, session, dep_context):
 reason="This task in

[jira] [Commented] (AIRFLOW-3312) No log output from BashOperator under test

2018-11-08 Thread Ash Berlin-Taylor (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-3312?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16680395#comment-16680395
 ] 

Ash Berlin-Taylor commented on AIRFLOW-3312:


This'll be fixed in 1.10.1

> No log output from BashOperator under test
> --
>
> Key: AIRFLOW-3312
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3312
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: logging, operators
>Affects Versions: 1.10.0
>Reporter: Chris Bandy
>Priority: Major
>
> The BashOperator logs some messages as well as the stdout of its command at 
> the info level, but none of these appear when running {{airflow test}} with 
> the default configuration.
> For example, this DAG emits the following in Airflow 1.10.0:
> {code:python}
> from airflow import DAG
> from airflow.operators.bash_operator import BashOperator
> from datetime import datetime
> dag = DAG('please', start_date=datetime(year=2018, month=11, day=1))
> BashOperator(dag=dag, task_id='mine', bash_command='echo thank you')
> {code}
> {noformat}
> $ airflow test please mine '2018-11-01'
> [2018-11-08 00:06:54,098] {__init__.py:51} INFO - Using executor 
> SequentialExecutor
> [2018-11-08 00:06:54,246] {models.py:258} INFO - Filling up the DagBag from 
> /usr/local/airflow/dags
> {noformat}
> When executed by the scheduler, logs go to a file:
> {noformat}
> $ airflow scheduler -n 1
> ...
> [2018-11-08 00:41:02,674] {dag_processing.py:582} INFO - Started a process 
> (PID: 9) to generate tasks for /usr/local/airflow/dags/please.py
> [2018-11-08 00:41:03,185] {dag_processing.py:495} INFO - Processor for 
> /usr/local/airflow/dags/please.py finished
> [2018-11-08 00:41:03,525] {jobs.py:1114} INFO - Tasks up for execution:
>   
> [2018-11-08 00:41:03,536] {jobs.py:1147} INFO - Figuring out tasks to run in 
> Pool(name=None) with 128 open slots and 1 task instances in queue
> [2018-11-08 00:41:03,539] {jobs.py:1184} INFO - DAG please has 0/16 running 
> and queued tasks
> [2018-11-08 00:41:03,540] {jobs.py:1216} INFO - Setting the follow tasks to 
> queued state:
>   
> [2018-11-08 00:41:03,573] {jobs.py:1297} INFO - Setting the follow tasks to 
> queued state:
>   
> [2018-11-08 00:41:03,576] {jobs.py:1339} INFO - Sending ('please', 'mine', 
> datetime.datetime(2018, 11, 1, 0, 0, tzinfo=)) to executor 
> with priority 1 and queue default
> [2018-11-08 00:41:03,578] {base_executor.py:56} INFO - Adding to queue: 
> airflow run please mine 2018-11-01T00:00:00+00:00 --local -sd 
> /usr/local/airflow/dags/please.py
> [2018-11-08 00:41:03,593] {sequential_executor.py:45} INFO - Executing 
> command: airflow run please mine 2018-11-01T00:00:00+00:00 --local -sd 
> /usr/local/airflow/dags/please.py
> [2018-11-08 00:41:04,262] {__init__.py:51} INFO - Using executor 
> SequentialExecutor
> [2018-11-08 00:41:04,406] {models.py:258} INFO - Filling up the DagBag from 
> /usr/local/airflow/dags/please.py
> [2018-11-08 00:41:04,458] {cli.py:492} INFO - Running  please.mine 2018-11-01T00:00:00+00:00 [queued]> on host e2e08cf4dfaa
> [2018-11-08 00:41:09,684] {jobs.py:1443} INFO - Executor reports please.mine 
> execution_date=2018-11-01 00:00:00+00:00 as success
> $ cat logs/please/mine/2018-11-01T00\:00\:00+00\:00/1.log
> [2018-11-08 00:41:04,554] {models.py:1335} INFO - Dependencies all met for 
> 
> [2018-11-08 00:41:04,564] {models.py:1335} INFO - Dependencies all met for 
> 
> [2018-11-08 00:41:04,565] {models.py:1547} INFO -
> 
> Starting attempt 1 of 1
> 
> [2018-11-08 00:41:04,605] {models.py:1569} INFO - Executing 
>  on 2018-11-01T00:00:00+00:00
> [2018-11-08 00:41:04,605] {base_task_runner.py:124} INFO - Running: ['bash', 
> '-c', 'airflow run please mine 2018-11-01T00:00:00+00:00 --job_id 142 --raw 
> -sd DAGS_FOLDER/please.py --cfg_path /tmp/tmp9prq7knr']
> [2018-11-08 00:41:05,214] {base_task_runner.py:107} INFO - Job 142: Subtask 
> mine [2018-11-08 00:41:05,213] {__init__.py:51} INFO - Using executor 
> SequentialExecutor
> [2018-11-08 00:41:05,334] {base_task_runner.py:107} INFO - Job 142: Subtask 
> mine [2018-11-08 00:41:05,333] {models.py:258} INFO - Filling up the DagBag 
> from /usr/local/airflow/dags/please.py
> [2018-11-08 00:41:05,368] {base_task_runner.py:107} INFO - Job 142: Subtask 
> mine [2018-11-08 00:41:05,367] {cli.py:492} INFO - Running  please.mine 2018-11-01T00:00:00+00:00 [running]> on host e2e08cf4dfaa
> [2018-11-08 00:41:05,398] {bash_operator.py:74} INFO - Tmp dir root location:
>  /tmp
> [2018-11-08 00:41:05,398] {bash_operator.py:87} INFO - Temporary script 
> location: /tmp/airflowtmp0is6wwxi/mine8tmew5y4
> [2018-11-08 00:41:05,399] 

[jira] [Comment Edited] (AIRFLOW-2780) Adds IMAP Hook to interact with a mail server

2018-11-08 Thread Ash Berlin-Taylor (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-2780?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16680390#comment-16680390
 ] 

Ash Berlin-Taylor edited comment on AIRFLOW-2780 at 11/8/18 9:16 PM:
-

You can pick what you like, and I (since I'm doing the release of 1.10.1) treat 
it as an indication that people want it in the next release; if it's a 
small change that's easy to fix, there's a reasonable chance of it being pulled 
into the release branch.

Generally the committer who merges (or someone who comes along after) sets the 
fix version to the next release - 2.0.0 at the moment.


was (Author: ashb):
You can pick what you like, and I (since I'm doing the release of 1.10.1) treat 
it as an indication that people want it in the next release, and if it's a 
small change, easy to fix there's a reasonable chance of it being pulled in to 
the release branch.

> Adds IMAP Hook to interact with a mail server
> -
>
> Key: AIRFLOW-2780
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2780
> Project: Apache Airflow
>  Issue Type: New Feature
>Reporter: Felix Uellendall
>Assignee: Felix Uellendall
>Priority: Major
> Fix For: 1.10.1
>
>
> This Hook connects to a mail server via IMAP to be able to retrieve email 
> attachments by using [Python's IMAP 
> Library.|https://docs.python.org/3.6/library/imaplib.html]
> Features:
> - `has_mail_attachment`: Can be used in a `Sensor` to check if there is an 
> attachment on the mail server with the given name.
> - `retrieve_mail_attachments`: Can be used in an `Operator` to do something with 
> the attachments, returned as a list of tuples.
> - `download_mail_attachments`: Can be used in an `Operator` to download the 
> attachment.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (AIRFLOW-2780) Adds IMAP Hook to interact with a mail server

2018-11-08 Thread Ash Berlin-Taylor (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-2780?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16680390#comment-16680390
 ] 

Ash Berlin-Taylor commented on AIRFLOW-2780:


You can pick what you like, and I (since I'm doing the release of 1.10.1) treat 
it as an indication that people want it in the next release; if it's a 
small change that's easy to fix, there's a reasonable chance of it being pulled 
into the release branch.

> Adds IMAP Hook to interact with a mail server
> -
>
> Key: AIRFLOW-2780
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2780
> Project: Apache Airflow
>  Issue Type: New Feature
>Reporter: Felix Uellendall
>Assignee: Felix Uellendall
>Priority: Major
> Fix For: 1.10.1
>
>
> This Hook connects to a mail server via IMAP to be able to retrieve email 
> attachments by using [Python's IMAP 
> Library.|https://docs.python.org/3.6/library/imaplib.html]
> Features:
> - `has_mail_attachment`: Can be used in a `Sensor` to check if there is an 
> attachment on the mail server with the given name.
> - `retrieve_mail_attachments`: Can be used in an `Operator` to do something with 
> the attachments, returned as a list of tuples.
> - `download_mail_attachments`: Can be used in an `Operator` to download the 
> attachment.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] ashb closed pull request #4126: [AIRFLOW-2524] More AWS SageMaker operators, sensors for model, endpoint-config and endpoint

2018-11-08 Thread GitBox
ashb closed pull request #4126: [AIRFLOW-2524] More AWS SageMaker operators, 
sensors for model, endpoint-config and endpoint
URL: https://github.com/apache/incubator-airflow/pull/4126
 
 
   

This is a PR merged from a forked repository.
As GitHub hides the original diff on merge, it is displayed below for
the sake of provenance:

As this is a foreign pull request (from a fork), the diff is supplied
below (as it won't show otherwise due to GitHub magic):

diff --git a/airflow/contrib/hooks/aws_hook.py 
b/airflow/contrib/hooks/aws_hook.py
index 265d4e56af..9d4a73e1c0 100644
--- a/airflow/contrib/hooks/aws_hook.py
+++ b/airflow/contrib/hooks/aws_hook.py
@@ -183,7 +183,7 @@ def get_session(self, region_name=None):
 def get_credentials(self, region_name=None):
 """Get the underlying `botocore.Credentials` object.
 
-This contains the attributes: access_key, secret_key and token.
+This contains the following authentication attributes: access_key, 
secret_key and token.
 """
 session, _ = self._get_credentials(region_name)
 # Credentials are refreshable, so accessing your access key and
@@ -193,8 +193,8 @@ def get_credentials(self, region_name=None):
 
 def expand_role(self, role):
 """
-Expand an IAM role name to an IAM role ARN. If role is already an IAM 
ARN,
-no change is made.
+If the IAM role is a role name, get the Amazon Resource Name (ARN) for 
the role.
+If IAM role is already an IAM role ARN, no change is made.
 
 :param role: IAM role name or ARN
 :return: IAM role ARN
diff --git a/airflow/contrib/operators/sagemaker_base_operator.py 
b/airflow/contrib/operators/sagemaker_base_operator.py
index cf1e59387a..08d6d0eb6a 100644
--- a/airflow/contrib/operators/sagemaker_base_operator.py
+++ b/airflow/contrib/operators/sagemaker_base_operator.py
@@ -79,7 +79,7 @@ def parse_config_integers(self):
 self.parse_integer(self.config, field)
 
 def expand_role(self):
-raise NotImplementedError('Please implement expand_role() in sub 
class!')
+pass
 
 def preprocess_config(self):
 self.log.info(
diff --git a/airflow/contrib/operators/sagemaker_endpoint_config_operator.py 
b/airflow/contrib/operators/sagemaker_endpoint_config_operator.py
new file mode 100644
index 00..a94cf30229
--- /dev/null
+++ b/airflow/contrib/operators/sagemaker_endpoint_config_operator.py
@@ -0,0 +1,67 @@
+# -*- coding: utf-8 -*-
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+from airflow.contrib.operators.sagemaker_base_operator import 
SageMakerBaseOperator
+from airflow.utils.decorators import apply_defaults
+from airflow.exceptions import AirflowException
+
+
+class SageMakerEndpointConfigOperator(SageMakerBaseOperator):
+
+"""
+Create a SageMaker endpoint config.
+
+This operator returns The ARN of the endpoint config created in Amazon 
SageMaker
+
+:param config: The configuration necessary to create an endpoint config.
+
+For details of the configuration parameter, See:
+
https://boto3.amazonaws.com/v1/documentation/api/latest/reference/services/sagemaker.html#SageMaker.Client.create_endpoint_config
+:type config: dict
+:param aws_conn_id: The AWS connection ID to use.
+:type aws_conn_id: str
+"""  # noqa: E501
+
+integer_fields = [
+['ProductionVariants', 'InitialInstanceCount']
+]
+
+@apply_defaults
+def __init__(self,
+ config,
+ *args, **kwargs):
+super(SageMakerEndpointConfigOperator, self).__init__(config=config,
+  *args, **kwargs)
+
+self.config = config
+
+def execute(self, context):
+self.preprocess_config()
+
+self.log.info('Creating SageMaker Endpoint Config %s.', 
self.config['EndpointConfigName'])
+response = self.hook.create_endpoint_config(self.config)
+if response['ResponseMetadata']['HTTPStatusCode'] != 200:
+raise AirflowException(
+'Sagemaker endpoint config creation failed: %s' % response)
+else:
+return {
+'EndpointConfig': self.ho
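
For orientation, a hedged sketch of wiring the new operator into a DAG (the DAG id,
dates and config values are placeholders, not taken from the PR):

```python
from datetime import datetime

from airflow import DAG
from airflow.contrib.operators.sagemaker_endpoint_config_operator import (
    SageMakerEndpointConfigOperator
)

dag = DAG('sagemaker_endpoint_config_example',
          start_date=datetime(2018, 11, 1),
          schedule_interval=None)

# Minimal create_endpoint_config payload; every value is a placeholder.
endpoint_config = {
    'EndpointConfigName': 'my-endpoint-config',
    'ProductionVariants': [{
        'VariantName': 'AllTraffic',
        'ModelName': 'my-model',
        'InitialInstanceCount': 1,
        'InstanceType': 'ml.m4.xlarge',
        'InitialVariantWeight': 1.0,
    }],
}

create_endpoint_config = SageMakerEndpointConfigOperator(
    task_id='create_endpoint_config',
    config=endpoint_config,
    aws_conn_id='aws_default',
    dag=dag,
)
```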

[jira] [Created] (AIRFLOW-3314) The lineage automatic inlets feature does not work as described.

2018-11-08 Thread Reuben van Ammers (JIRA)
Reuben van Ammers created AIRFLOW-3314:
--

 Summary: The lineage automatic inlets feature does not work as 
described.
 Key: AIRFLOW-3314
 URL: https://issues.apache.org/jira/browse/AIRFLOW-3314
 Project: Apache Airflow
  Issue Type: Bug
Affects Versions: 1.10.0
Reporter: Reuben van Ammers
 Attachments: test_lineage_broken.py

Currently, this is how the prepare lineage wrapper in airflow/lineage/__init__.py 
describes the inlets argument:

inlets can be:
 "auto" -> picks up any outlets from direct upstream tasks that have outlets
 defined, as such that if A -> B -> C and B does not have outlets but A does,
 these are provided as inlets.



This implies that tasks which do not produce state should have no effect on the 
behaviour of inlets and outlets, which is desirable so that operators that don't 
change the state of files (such as tracking or communication tasks) can be added 
easily.

 

This is not the current behaviour. It can be seen by changing the test case 
in tests/lineage/test_lineage.py so that operation 5 uses the auto feature. 
Given the description, one would expect 1 inlet file, namely the file 
produced by op3. However, as can be seen in the attached broken test, this 
is not the case, and the presence of the non-state-affecting operator breaks 
the test.
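
To make the expectation concrete, a minimal sketch of the A -> B -> C case from the
docstring (assuming the 1.10 lineage API, with placeholder file names), where only A
declares outlets:

{code:python}
from datetime import datetime

from airflow import DAG
from airflow.lineage.datasets import File
from airflow.operators.dummy_operator import DummyOperator

dag = DAG('lineage_auto_example', start_date=datetime(2018, 11, 1),
          schedule_interval=None)

f_a = File('/tmp/produced_by_a')

op_a = DummyOperator(task_id='a', dag=dag, outlets={'datasets': [f_a]})
op_b = DummyOperator(task_id='b', dag=dag)           # declares no outlets
op_c = DummyOperator(task_id='c', dag=dag, inlets={'auto': True})

op_a >> op_b >> op_c
# Per the documented behaviour, op_c's inlets should resolve to [f_a] even
# though its direct upstream op_b has no outlets of its own.
{code}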



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (AIRFLOW-2143) Try number displays incorrect values in the web UI

2018-11-08 Thread Chris Bandy (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-2143?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16679843#comment-16679843
 ] 

Chris Bandy commented on AIRFLOW-2143:
--

Affects 1.10.0 as well.

> Try number displays incorrect values in the web UI
> --
>
> Key: AIRFLOW-2143
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2143
> Project: Apache Airflow
>  Issue Type: Bug
>Affects Versions: 1.9.0
>Reporter: James Davidheiser
>Priority: Minor
> Attachments: adhoc_query.png, task_instance_page.png
>
>
> This was confusing us a lot in our task runs - in the database, a task that 
> ran is marked as 1 try.  However, when we view it in the UI, it shows as 2 
> tries in several places.  These include:
>  * Task Instance Details (ie 
> [https://airflow/task?execution_date=xxx&dag_id=xxx&task_id=xxx 
> )|https://airflow/task?execution_date=xxx&dag_id=xxx&task_id=xxx]
>  * Task instance browser (/admin/taskinstance/)
>  * Task Tries graph (/admin/airflow/tries)
> Notably, it is correctly shown as 1 try in the log filenames, on the log 
> viewer page (admin/airflow/log?execution_date=), and some other places.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] codecov-io edited a comment on issue #4126: [AIRFLOW-2524] More AWS SageMaker operators, sensors for model, endpoint-config and endpoint

2018-11-08 Thread GitBox
codecov-io edited a comment on issue #4126: [AIRFLOW-2524] More AWS SageMaker 
operators, sensors for model, endpoint-config and endpoint
URL: 
https://github.com/apache/incubator-airflow/pull/4126#issuecomment-435335196
 
 
   # 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4126?src=pr&el=h1)
 Report
   > Merging 
[#4126](https://codecov.io/gh/apache/incubator-airflow/pull/4126?src=pr&el=desc)
 into 
[master](https://codecov.io/gh/apache/incubator-airflow/commit/9248e37727a0ff510103fa24088513845dafa711?src=pr&el=desc)
 will **increase** coverage by `0.01%`.
   > The diff coverage is `n/a`.
   
   [![Impacted file tree 
graph](https://codecov.io/gh/apache/incubator-airflow/pull/4126/graphs/tree.svg?width=650&token=WdLKlKHOAU&height=150&src=pr)](https://codecov.io/gh/apache/incubator-airflow/pull/4126?src=pr&el=tree)
   
   ```diff
   @@Coverage Diff @@
   ##   master#4126  +/-   ##
   ==
   + Coverage   77.66%   77.68%   +0.01% 
   ==
 Files 199  199  
 Lines   1627316273  
   ==
   + Hits1263812641   +3 
   + Misses   3635 3632   -3
   ```
   
   
   | [Impacted 
Files](https://codecov.io/gh/apache/incubator-airflow/pull/4126?src=pr&el=tree) 
| Coverage Δ | |
   |---|---|---|
   | 
[airflow/hooks/S3\_hook.py](https://codecov.io/gh/apache/incubator-airflow/pull/4126/diff?src=pr&el=tree#diff-YWlyZmxvdy9ob29rcy9TM19ob29rLnB5)
 | `93.03% <ø> (ø)` | :arrow_up: |
   | 
[airflow/jobs.py](https://codecov.io/gh/apache/incubator-airflow/pull/4126/diff?src=pr&el=tree#diff-YWlyZmxvdy9qb2JzLnB5)
 | `77.36% <0%> (+0.27%)` | :arrow_up: |
   
   --
   
   [Continue to review full report at 
Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4126?src=pr&el=continue).
   > **Legend** - [Click here to learn 
more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute  (impact)`, `ø = not affected`, `? = missing data`
   > Powered by 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4126?src=pr&el=footer).
 Last update 
[9248e37...a090ec6](https://codecov.io/gh/apache/incubator-airflow/pull/4126?src=pr&el=lastupdated).
 Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Created] (AIRFLOW-3315) Add ImapAttachmentSensor to poke for mail attachments

2018-11-08 Thread Felix Uellendall (JIRA)
Felix Uellendall created AIRFLOW-3315:
-

 Summary: Add ImapAttachmentSensor to poke for mail attachments
 Key: AIRFLOW-3315
 URL: https://issues.apache.org/jira/browse/AIRFLOW-3315
 Project: Apache Airflow
  Issue Type: New Feature
Reporter: Felix Uellendall
Assignee: Felix Uellendall


This kind of sensor pokes a mail server for attachments in mails with a given 
name.

It will use the existing 
[ImapHook|https://issues.apache.org/jira/projects/AIRFLOW/issues/AIRFLOW-2780] 
to establish a connection to the mail server 
and search for the attachment in all mails.
If an attachment with the given name is found, the sensor stops immediately and 
reports that it has been found.
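
A hedged sketch of what the sensor's poke could look like (the constructor arguments
and the hook call are assumptions based on this description, not the submitted
implementation):

{code:python}
from airflow.contrib.hooks.imap_hook import ImapHook
from airflow.sensors.base_sensor_operator import BaseSensorOperator


class ImapAttachmentSensorSketch(BaseSensorOperator):
    """Pokes the mail server until an attachment with the given name appears."""

    def __init__(self, attachment_name, imap_conn_id='imap_default',
                 *args, **kwargs):
        super(ImapAttachmentSensorSketch, self).__init__(*args, **kwargs)
        self.attachment_name = attachment_name
        self.imap_conn_id = imap_conn_id

    def poke(self, context):
        self.log.info('Poking for attachment %s', self.attachment_name)
        hook = ImapHook(imap_conn_id=self.imap_conn_id)
        # Returning True stops the sensor; False makes it poke again later.
        return hook.has_mail_attachment(self.attachment_name)
{code}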



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (AIRFLOW-2903) Default owner should be "airflow" and not "Airflow"

2018-11-08 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-2903?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16679069#comment-16679069
 ] 

ASF GitHub Bot commented on AIRFLOW-2903:
-

kaxil closed pull request #4151: [AIRFLOW-2903] Change default owner to 
`airflow`
URL: https://github.com/apache/incubator-airflow/pull/4151
 
 
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


> Default owner should be "airflow" and not "Airflow"
> ---
>
> Key: AIRFLOW-2903
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2903
> Project: Apache Airflow
>  Issue Type: Task
>  Components: configuration
>Affects Versions: 1.8.2, 1.9.0
>Reporter: Kaxil Naik
>Priority: Minor
> Fix For: 2.0.0
>
> Attachments: image-2018-08-15-10-46-20-698.png
>
>
>  !image-2018-08-15-10-46-20-698.png! 
> As shown in the image above, if the owner is not set in the DAG, it defaults 
> to "Airflow" user when in reality it should be "airflow" user.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (AIRFLOW-2799) Filtering UI objects by datetime is broken

2018-11-08 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-2799?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16679841#comment-16679841
 ] 

ASF GitHub Bot commented on AIRFLOW-2799:
-

Fokko closed pull request #4061: [AIRFLOW-2799] Fix filtering UI objects by 
datetime
URL: https://github.com/apache/incubator-airflow/pull/4061
 
 
   

This is a PR merged from a forked repository.
As GitHub hides the original diff on merge, it is displayed below for
the sake of provenance:

As this is a foreign pull request (from a fork), the diff is supplied
below (as it won't show otherwise due to GitHub magic):

diff --git a/airflow/utils/timezone.py b/airflow/utils/timezone.py
index 6d49fbcbb3..5adaa2f5c4 100644
--- a/airflow/utils/timezone.py
+++ b/airflow/utils/timezone.py
@@ -164,9 +164,9 @@ def datetime(*args, **kwargs):
 return dt.datetime(*args, **kwargs)
 
 
-def parse(string):
+def parse(string, timezone=None):
 """
 Parse a time string and return an aware datetime
 :param string: time string
 """
-return pendulum.parse(string, tz=TIMEZONE)
+return pendulum.parse(string, tz=timezone or TIMEZONE)
diff --git a/airflow/www/utils.py b/airflow/www/utils.py
index 6404d27f95..757d2268bf 100644
--- a/airflow/www/utils.py
+++ b/airflow/www/utils.py
@@ -38,7 +38,7 @@
 
 from flask import after_this_request, request, Response
 from flask_admin.model import filters
-from flask_admin.contrib.sqla.filters import FilterConverter
+import flask_admin.contrib.sqla.filters as sqlafilters
 from flask_login import current_user
 
 from airflow import configuration, models, settings
@@ -448,7 +448,43 @@ def __call__(self, field, **kwargs):
 return wtforms.widgets.core.HTMLString(html)
 
 
-class UtcFilterConverter(FilterConverter):
+class UtcDateTimeFilterMixin(object):
+def clean(self, value):
+dt = super(UtcDateTimeFilterMixin, self).clean(value)
+return timezone.make_aware(dt, timezone=timezone.utc)
+
+
+class UtcDateTimeEqualFilter(UtcDateTimeFilterMixin, 
sqlafilters.DateTimeEqualFilter):
+pass
+
+
+class UtcDateTimeNotEqualFilter(UtcDateTimeFilterMixin, 
sqlafilters.DateTimeNotEqualFilter):
+pass
+
+
+class UtcDateTimeGreaterFilter(UtcDateTimeFilterMixin, 
sqlafilters.DateTimeGreaterFilter):
+pass
+
+
+class UtcDateTimeSmallerFilter(UtcDateTimeFilterMixin, 
sqlafilters.DateTimeSmallerFilter):
+pass
+
+
+class UtcDateTimeBetweenFilter(UtcDateTimeFilterMixin, 
sqlafilters.DateTimeBetweenFilter):
+pass
+
+
+class UtcDateTimeNotBetweenFilter(UtcDateTimeFilterMixin, 
sqlafilters.DateTimeNotBetweenFilter):
+pass
+
+
+class UtcFilterConverter(sqlafilters.FilterConverter):
+
+utcdatetime_filters = (UtcDateTimeEqualFilter, UtcDateTimeNotEqualFilter,
+   UtcDateTimeGreaterFilter, UtcDateTimeSmallerFilter,
+   UtcDateTimeBetweenFilter, 
UtcDateTimeNotBetweenFilter,
+   sqlafilters.FilterEmpty)
+
 @filters.convert('utcdatetime')
 def conv_utcdatetime(self, column, name, **kwargs):
-return self.conv_datetime(column, name, **kwargs)
+return [f(column, name, **kwargs) for f in self.utcdatetime_filters]
diff --git a/airflow/www_rbac/utils.py b/airflow/www_rbac/utils.py
index 0176a5312c..b25e1541ab 100644
--- a/airflow/www_rbac/utils.py
+++ b/airflow/www_rbac/utils.py
@@ -37,7 +37,10 @@
 from pygments import highlight, lexers
 from pygments.formatters import HtmlFormatter
 from flask import request, Response, Markup, url_for
-from airflow import configuration
+from flask_appbuilder.models.sqla.interface import SQLAInterface
+import flask_appbuilder.models.sqla.filters as fab_sqlafilters
+import sqlalchemy as sqla
+from airflow import configuration, settings
 from airflow.models import BaseOperator
 from airflow.operators.subdag_operator import SubDagOperator
 from airflow.utils import timezone
@@ -378,3 +381,69 @@ def get_chart_height(dag):
 charts, that is charts that take up space based on the size of the 
components within.
 """
 return 600 + len(dag.tasks) * 10
+
+
+class UtcAwareFilterMixin(object):
+def apply(self, query, value):
+value = timezone.parse(value, timezone=timezone.utc)
+
+return super(UtcAwareFilterMixin, self).apply(query, value)
+
+
+class UtcAwareFilterEqual(UtcAwareFilterMixin, fab_sqlafilters.FilterEqual):
+pass
+
+
+class UtcAwareFilterGreater(UtcAwareFilterMixin, 
fab_sqlafilters.FilterGreater):
+pass
+
+
+class UtcAwareFilterSmaller(UtcAwareFilterMixin, 
fab_sqlafilters.FilterSmaller):
+pass
+
+
+class UtcAwareFilterNotEqual(UtcAwareFilterMixin, 
fab_sqlafilters.FilterNotEqual):
+pass
+
+
+class UtcAwareFilterConverter(fab_sqlafilters.SQLAFilterConverter):
+
+conversion_table = (
+(('is_utcdatetime', [UtcAwareFilterEqual,
+ UtcAwareFilterGreater,
+ 

[GitHub] codecov-io commented on issue #4155: [AIRFLOW-XXX] Add missing docs for SNS classes

2018-11-08 Thread GitBox
codecov-io commented on issue #4155: [AIRFLOW-XXX] Add missing docs for SNS 
classes
URL: 
https://github.com/apache/incubator-airflow/pull/4155#issuecomment-437166545
 
 
   # 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4155?src=pr&el=h1)
 Report
   > Merging 
[#4155](https://codecov.io/gh/apache/incubator-airflow/pull/4155?src=pr&el=desc)
 into 
[master](https://codecov.io/gh/apache/incubator-airflow/commit/9248e37727a0ff510103fa24088513845dafa711?src=pr&el=desc)
 will **not change** coverage.
   > The diff coverage is `n/a`.
   
   [![Impacted file tree 
graph](https://codecov.io/gh/apache/incubator-airflow/pull/4155/graphs/tree.svg?width=650&token=WdLKlKHOAU&height=150&src=pr)](https://codecov.io/gh/apache/incubator-airflow/pull/4155?src=pr&el=tree)
   
   ```diff
   @@   Coverage Diff   @@
   ##   master#4155   +/-   ##
   ===
 Coverage   77.66%   77.66%   
   ===
 Files 199  199   
 Lines   1627316273   
   ===
 Hits1263812638   
 Misses   3635 3635
   ```
   
   
   
   --
   
   [Continue to review full report at 
Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4155?src=pr&el=continue).
   > **Legend** - [Click here to learn 
more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute  (impact)`, `ø = not affected`, `? = missing data`
   > Powered by 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4155?src=pr&el=footer).
 Last update 
[9248e37...8c44dad](https://codecov.io/gh/apache/incubator-airflow/pull/4155?src=pr&el=lastupdated).
 Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] codecov-io edited a comment on issue #4155: [AIRFLOW-XXX] Add missing docs for SNS classes

2018-11-08 Thread GitBox
codecov-io edited a comment on issue #4155: [AIRFLOW-XXX] Add missing docs for 
SNS classes
URL: 
https://github.com/apache/incubator-airflow/pull/4155#issuecomment-437166545
 
 
   # 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4155?src=pr&el=h1)
 Report
   > Merging 
[#4155](https://codecov.io/gh/apache/incubator-airflow/pull/4155?src=pr&el=desc)
 into 
[master](https://codecov.io/gh/apache/incubator-airflow/commit/9248e37727a0ff510103fa24088513845dafa711?src=pr&el=desc)
 will **not change** coverage.
   > The diff coverage is `n/a`.
   
   [![Impacted file tree 
graph](https://codecov.io/gh/apache/incubator-airflow/pull/4155/graphs/tree.svg?width=650&token=WdLKlKHOAU&height=150&src=pr)](https://codecov.io/gh/apache/incubator-airflow/pull/4155?src=pr&el=tree)
   
   ```diff
   @@   Coverage Diff   @@
   ##   master#4155   +/-   ##
   ===
 Coverage   77.66%   77.66%   
   ===
 Files 199  199   
 Lines   1627316273   
   ===
 Hits1263812638   
 Misses   3635 3635
   ```
   
   
   
   --
   
   [Continue to review full report at 
Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4155?src=pr&el=continue).
   > **Legend** - [Click here to learn 
more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute  (impact)`, `ø = not affected`, `? = missing data`
   > Powered by 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4155?src=pr&el=footer).
 Last update 
[9248e37...8c44dad](https://codecov.io/gh/apache/incubator-airflow/pull/4155?src=pr&el=lastupdated).
 Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Commented] (AIRFLOW-3315) Add ImapAttachmentSensor to poke for mail attachments

2018-11-08 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-3315?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16679808#comment-16679808
 ] 

ASF GitHub Bot commented on AIRFLOW-3315:
-

feluelle opened a new pull request #4161: [AIRFLOW-3315] Add 
ImapAttachmentSensor
URL: https://github.com/apache/incubator-airflow/pull/4161
 
 
   Make sure you have checked _all_ steps below.
   
   ### Jira
   
   - [x] My PR addresses the following [Airflow 
Jira](https://issues.apache.org/jira/browse/AIRFLOW/) issues and references 
them in the PR title. For example, "\[AIRFLOW-XXX\] My Airflow PR"
 - https://issues.apache.org/jira/browse/AIRFLOW-3315
 - In case you are fixing a typo in the documentation you can prepend your 
commit with \[AIRFLOW-XXX\], code changes always need a Jira issue.
   
   ### Description
   
   - [x] Here are some details about my PR, including screenshots of any UI 
changes:
   
   This PR adds a sensor that pokes a mail server for attachments in mails with 
a given name.
   If an attachment with the given name is found, the sensor stops immediately and 
reports that it has been found.
   **This PR also updates the license header in imap_hook and test_imap_hook**
   
   ### Tests
   
   - [x] My PR adds the following unit tests __OR__ does not need testing for 
this extremely good reason:
   
   This PR adds tests for:
   * test_poke_with_attachment_found
   * test_poke_with_attachment_not_found
   * test_poke_with_check_regex_true (tests if `check_regex=True` will be 
passed to the Hook's method)
   * test_poke_with_different_mail_folder (tests a mail folder other than 
`INBOX`)
   
   ### Commits
   
   - [x] My commits all reference Jira issues in their subject lines, and I 
have squashed multiple commits if they address the same issue. In addition, my 
commits follow the guidelines from "[How to write a good git commit 
message](http://chris.beams.io/posts/git-commit/)":
 1. Subject is separated from body by a blank line
 1. Subject is limited to 50 characters (not including Jira issue reference)
 1. Subject does not end with a period
 1. Subject uses the imperative mood ("add", not "adding")
 1. Body wraps at 72 characters
 1. Body explains "what" and "why", not "how"
   
   ### Documentation
   
   - [x] In case of new functionality, my PR adds documentation that describes 
how to use it.
 - When adding new operators/hooks/sensors, the autoclass documentation 
generation needs to be added.
   
   ### Code Quality
   
   - [x] Passes `flake8`
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


> Add ImapAttachmentSensor to poke for mail attachments
> -
>
> Key: AIRFLOW-3315
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3315
> Project: Apache Airflow
>  Issue Type: New Feature
>Reporter: Felix Uellendall
>Assignee: Felix Uellendall
>Priority: Major
>
> This kind of sensor pokes a mail server for attachments in mails with a given 
> name.
> It will use the existing 
> [ImapHook|https://issues.apache.org/jira/projects/AIRFLOW/issues/AIRFLOW-2780]
>  to establish a connection to the mail server 
> and search for the attachment in all mails.
> If an attachment has been found it will immediately stop and return that an 
> attachment has been found for the given name.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Updated] (AIRFLOW-3278) update operators uploading to GCS to support gzip flag

2018-11-08 Thread jack (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-3278?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

jack updated AIRFLOW-3278:
--
Description: 
gzip flag was added to hook by PR:

[https://issues.apache.org/jira/browse/AIRFLOW-2932|https://github.com/apache/incubator-airflow/pull/3893]

 

*The following operators need to be modified to support the gzip flag:*

S3ToGoogleCloudStorageOperator

PostgresToGoogleCloudStorageOperator

CassandraToGoogleCloudStorageOperator

MySqlToGoogleCloudStorageOperator

GoogleCloudStorageToGoogleCloudStorageOperator  (can be used to convert non 
gzip file to gzip)

 

*Operators that already support the gzip flag:*

FileToGoogleCloudStorageOperator

 

*Others:*

BigQueryToCloudStorageOperator  - has separated compression flag

  was:
gzip flag was added to hook by PR:

[https://issues.apache.org/jira/browse/AIRFLOW-2932|https://github.com/apache/incubator-airflow/pull/3893]

 

*Needs to modify following operators to support the gzip flag:*

FileToGoogleCloudStorageOperator

S3ToGoogleCloudStorageOperator

PostgresToGoogleCloudStorageOperator

CassandraToGoogleCloudStorageOperator

MySqlToGoogleCloudStorageOperator

GoogleCloudStorageToGoogleCloudStorageOperator  (can be used to convert non 
gzip file to gzip)

 

*Operators that already support the gzip flag:*

FileToGoogleCloudStorageOperator

 

*Others:*

BigQueryToCloudStorageOperator  - has separated compression flag


> update operators uploading to GCS to support gzip flag
> --
>
> Key: AIRFLOW-3278
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3278
> Project: Apache Airflow
>  Issue Type: Task
>Affects Versions: 1.10.0
>Reporter: jack
>Priority: Major
>
> gzip flag was added to hook by PR:
> [https://issues.apache.org/jira/browse/AIRFLOW-2932|https://github.com/apache/incubator-airflow/pull/3893]
>  
> *The following operators need to be modified to support the gzip flag:*
> S3ToGoogleCloudStorageOperator
> PostgresToGoogleCloudStorageOperator
> CassandraToGoogleCloudStorageOperator
> MySqlToGoogleCloudStorageOperator
> GoogleCloudStorageToGoogleCloudStorageOperator  (can be used to convert non 
> gzip file to gzip)
>  
> *Operators that already support the gzip flag:*
> FileToGoogleCloudStorageOperator
>  
> *Others:*
> BigQueryToCloudStorageOperator  - has separated compression flag
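
As a point of reference, a hedged sketch of the flag on the operator that already
supports it (bucket, paths and DAG details are placeholders):

{code:python}
from datetime import datetime

from airflow import DAG
from airflow.contrib.operators.file_to_gcs import FileToGoogleCloudStorageOperator

dag = DAG('gcs_gzip_example', start_date=datetime(2018, 11, 1),
          schedule_interval=None)

upload_gzipped = FileToGoogleCloudStorageOperator(
    task_id='upload_gzipped',
    src='/tmp/export.csv',                  # placeholder local file
    dst='exports/export.csv.gz',            # placeholder object name
    bucket='my-bucket',                     # placeholder bucket
    google_cloud_storage_conn_id='google_cloud_default',
    gzip=True,                              # compress before uploading
    dag=dag,
)
{code}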



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] ashb commented on issue #4006: [AIRFLOW-3164] Verify server certificate when connecting to LDAP

2018-11-08 Thread GitBox
ashb commented on issue #4006: [AIRFLOW-3164] Verify server certificate when 
connecting to LDAP
URL: 
https://github.com/apache/incubator-airflow/pull/4006#issuecomment-437162375
 
 
   @Fokko @kaxil tests are green on this now, and I've verified that it uses 
TLS 1.2 by default, so I'm happy with it. Could one of you take a final 
look?
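
   For reference, a hedged sketch of the kind of ldap3 configuration this implies (the server name and CA bundle path are placeholders; this is not the PR's diff):

```python
import ssl

from ldap3 import Server, Tls

# Require certificate validation and pin the protocol to TLS 1.2.
tls = Tls(validate=ssl.CERT_REQUIRED,
          version=ssl.PROTOCOL_TLSv1_2,
          ca_certs_file='/etc/ssl/certs/ca-certificates.crt')

server = Server('ldaps://ldap.example.com', use_ssl=True, tls=tls)
```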


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Commented] (AIRFLOW-3253) KubernetesPodOperator Unauthorized Code 401

2018-11-08 Thread Sunny Gupta (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-3253?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16679604#comment-16679604
 ] 

Sunny Gupta commented on AIRFLOW-3253:
--

Great. Will be waiting for the next release.

> KubernetesPodOperator Unauthorized Code 401
> ---
>
> Key: AIRFLOW-3253
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3253
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: authentication, gcp, kubernetes
>Affects Versions: 1.10.0
>Reporter: Sunny Gupta
>Assignee: Trevor Edwards
>Priority: Minor
> Attachments: Screenshot from 2018-10-25 02-08-28.png
>
>
> apache-airflow==1.10.0
> kubernetes==7.0.0 (Tried)
>  kubernetes==8.0.0b1 (Tried)
>  
> Every time, after a couple of successful scheduled runs, some runs fail and 
> throw the error below.
> The error looks related to k8s authorization, and it seems to follow a pattern in 
> my case: whenever the token expiry approaches the job fails, and after the expiry 
> is refreshed it runs for a while and then fails again.
> !Screenshot from 2018-10-25 02-08-28.png!
> The above speculation could be wrong; I need help fixing this issue. I am running 
> one sample Python hello DAG and planning to move production workloads, but this 
> is a blocker for me.
> Tried:
>  * Clearing the ~/.kube folder and regenerating the token with `gcloud container 
> clusters get-credentials ***`; I even tried setting up a cron job to force-update 
> the tokens.
>  * kubernetes==7.0.0 up to the latest beta version.
> Below is my kubectl config. When I run the *kubectl* CLI to do GET operations on 
> pods and nodes resources, there are no issues.
>  
> {code:java}
> $ kubectl config view 
> apiVersion: v1
> clusters:
> - cluster:
>     certificate-authority-data: DATA+OMITTED
>     server: https://XX.XX.XX.XX
>   name: gke_us-central1-b_dev-kube-cluster
> contexts:
> - context:
>     cluster: gke_us-central1-b_dev-kube-cluster
>     user: gke_us-central1-b_dev-kube-cluster
>   name: gke_us-central1-b_dev-kube-cluster
> current-context: gke_us-central1-b_dev-kube-cluster
> kind: Config
> preferences: {}
> users:
> - name: gke_us-central1-b_dev-kube-cluster
>   user:
>     auth-provider:
>   config:
>     access-token: ya29.c.TOKEN5EREdigv
>     cmd-args: config config-helper --format=json
>     cmd-path: /usr/lib/google-cloud-sdk/bin/gcloud
>     expiry: 2018-10-24T20:54:37Z
>     expiry-key: '{.credential.token_expiry}'
>     token-key: '{.credential.access_token}'
>   name: gcp
> {code}
>  
>  
> In an hour, running every */5 min, 2-3 jobs fails. with below error.
>  
> {code:java}
> kubernetes.client.rest.ApiException: (401)
> Reason: Unauthorized
> HTTP response headers: HTTPHeaderDict({'Date': 'Wed, 24 Oct 2018 06:20:04 
> GMT', 'Content-Length': '129', 'Audit-Id': 
> '89dcda61-a60f-4b23-85d6-9d28a6bfeed0', 'Www-Authenticate': 'Basic 
> realm="kubernetes-master"', 'Content-Type': 'application/json'})
> HTTP response body: 
> {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"Unauthorized","reason":"Unauthorized","code":401}{code}
>  
>  
> {code:java}
> // complete logs
> 
> *** Log file does not exist: 
> /root/airflow/logs/pyk8s.v3/python-hello/2018-10-24T06:16:00+00:00/1.log
> *** Fetching from: 
> http://aflow-worker.internal:8793/log/pyk8s.v3/python-hello/2018-10-24T06:16:00+00:00/1.log
> [2018-10-24 06:20:02,947] {models.py:1335} INFO - Dependencies all met for 
> 
> [2018-10-24 06:20:02,952] {models.py:1335} INFO - Dependencies all met for 
> 
> [2018-10-24 06:20:02,952] {models.py:1547} INFO -
> 
> Starting attempt 1 of 1
> 
> [2018-10-24 06:20:02,966] {models.py:1569} INFO - Executing 
>  on 2018-10-24T06:16:00+00:00
> [2018-10-24 06:20:02,967] {base_task_runner.py:124} INFO - Running: ['bash', 
> '-c', 'airflow run pyk8s.v3 python-hello 2018-10-24T06:16:00+00:00 --job_id 
> 354 --raw -sd DAGS_FOLDER/pyk8s.v3.py --cfg_path /tmp/tmpf0saygt7']
> [2018-10-24 06:20:03,405] {base_task_runner.py:107} INFO - Job 354: Subtask 
> python-hello [2018-10-24 06:20:03,404] {settings.py:174} INFO - 
> setting.configure_orm(): Using pool settings. pool_size=5, pool_recycle=1800
> [2018-10-24 06:20:03,808] {base_task_runner.py:107} INFO - Job 354: Subtask 
> python-hello [2018-10-24 06:20:03,807] {__init__.py:51} INFO - Using executor 
> CeleryExecutor
> [2018-10-24 06:20:03,970] {base_task_runner.py:107} INFO - Job 354: Subtask 
> python-hello [2018-10-24 06:20:03,970] {models.py:258} INFO - Filling up the 
> DagBag from /root/airflow/dags/pyk8s.v3.py
> [2018-10-24 06:20:04,255] {base_task_runner.py:107} INFO - Job 354: Subtask 
> python-hello [2018-10-24 0

[GitHub] codecov-io commented on issue #4118: [AIRFLOW-3271] Airflow RBAC Permissions modification via UI do not persist

2018-11-08 Thread GitBox
codecov-io commented on issue #4118: [AIRFLOW-3271] Airflow RBAC Permissions 
modification via UI do not persist
URL: 
https://github.com/apache/incubator-airflow/pull/4118#issuecomment-437130562
 
 
   # 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4118?src=pr&el=h1)
 Report
   > Merging 
[#4118](https://codecov.io/gh/apache/incubator-airflow/pull/4118?src=pr&el=desc)
 into 
[master](https://codecov.io/gh/apache/incubator-airflow/commit/c4e5151bcd095eae1cd6ca1b4e96b302df3a2166?src=pr&el=desc)
 will **increase** coverage by `0.99%`.
   > The diff coverage is `100%`.
   
   [![Impacted file tree 
graph](https://codecov.io/gh/apache/incubator-airflow/pull/4118/graphs/tree.svg?width=650&token=WdLKlKHOAU&height=150&src=pr)](https://codecov.io/gh/apache/incubator-airflow/pull/4118?src=pr&el=tree)
   
   ```diff
   @@Coverage Diff @@
   ##   master#4118  +/-   ##
   ==
   + Coverage   76.68%   77.67%   +0.99% 
   ==
 Files 199  199  
 Lines   1618916280  +91 
   ==
   + Hits1241412645 +231 
   + Misses   3775 3635 -140
   ```
   
   
   | [Impacted 
Files](https://codecov.io/gh/apache/incubator-airflow/pull/4118?src=pr&el=tree) 
| Coverage Δ | |
   |---|---|---|
   | 
[airflow/www\_rbac/security.py](https://codecov.io/gh/apache/incubator-airflow/pull/4118/diff?src=pr&el=tree#diff-YWlyZmxvdy93d3dfcmJhYy9zZWN1cml0eS5weQ==)
 | `92.94% <100%> (+1.67%)` | :arrow_up: |
   | 
[airflow/utils/sqlalchemy.py](https://codecov.io/gh/apache/incubator-airflow/pull/4118/diff?src=pr&el=tree#diff-YWlyZmxvdy91dGlscy9zcWxhbGNoZW15LnB5)
 | `78.57% <0%> (-2.86%)` | :arrow_down: |
   | 
[airflow/security/utils.py](https://codecov.io/gh/apache/incubator-airflow/pull/4118/diff?src=pr&el=tree#diff-YWlyZmxvdy9zZWN1cml0eS91dGlscy5weQ==)
 | `26.92% <0%> (-2.03%)` | :arrow_down: |
   | 
[airflow/hooks/S3\_hook.py](https://codecov.io/gh/apache/incubator-airflow/pull/4118/diff?src=pr&el=tree#diff-YWlyZmxvdy9ob29rcy9TM19ob29rLnB5)
 | `93.03% <0%> (-1.29%)` | :arrow_down: |
   | 
[airflow/settings.py](https://codecov.io/gh/apache/incubator-airflow/pull/4118/diff?src=pr&el=tree#diff-YWlyZmxvdy9zZXR0aW5ncy5weQ==)
 | `80.41% <0%> (-0.74%)` | :arrow_down: |
   | 
[airflow/configuration.py](https://codecov.io/gh/apache/incubator-airflow/pull/4118/diff?src=pr&el=tree#diff-YWlyZmxvdy9jb25maWd1cmF0aW9uLnB5)
 | `89.05% <0%> (-0.37%)` | :arrow_down: |
   | 
[airflow/jobs.py](https://codecov.io/gh/apache/incubator-airflow/pull/4118/diff?src=pr&el=tree#diff-YWlyZmxvdy9qb2JzLnB5)
 | `77.09% <0%> (-0.22%)` | :arrow_down: |
   | 
[airflow/www\_rbac/views.py](https://codecov.io/gh/apache/incubator-airflow/pull/4118/diff?src=pr&el=tree#diff-YWlyZmxvdy93d3dfcmJhYy92aWV3cy5weQ==)
 | `72.32% <0%> (-0.06%)` | :arrow_down: |
   | 
[airflow/operators/python\_operator.py](https://codecov.io/gh/apache/incubator-airflow/pull/4118/diff?src=pr&el=tree#diff-YWlyZmxvdy9vcGVyYXRvcnMvcHl0aG9uX29wZXJhdG9yLnB5)
 | `95.03% <0%> (ø)` | :arrow_up: |
   | 
[airflow/sensors/s3\_prefix\_sensor.py](https://codecov.io/gh/apache/incubator-airflow/pull/4118/diff?src=pr&el=tree#diff-YWlyZmxvdy9zZW5zb3JzL3MzX3ByZWZpeF9zZW5zb3IucHk=)
 | `100% <0%> (ø)` | :arrow_up: |
   | ... and [21 
more](https://codecov.io/gh/apache/incubator-airflow/pull/4118/diff?src=pr&el=tree-more)
 | |
   
   --
   
   [Continue to review full report at 
Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4118?src=pr&el=continue).
   > **Legend** - [Click here to learn 
more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute  (impact)`, `ø = not affected`, `? = missing data`
   > Powered by 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4118?src=pr&el=footer).
 Last update 
[c4e5151...241bb97](https://codecov.io/gh/apache/incubator-airflow/pull/4118?src=pr&el=lastupdated).
 Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] codecov-io edited a comment on issue #4112: [AIRFLOW-3212] Add AwsGlueCatalogPartitionSensor

2018-11-08 Thread GitBox
codecov-io edited a comment on issue #4112: [AIRFLOW-3212] Add 
AwsGlueCatalogPartitionSensor
URL: 
https://github.com/apache/incubator-airflow/pull/4112#issuecomment-434413640
 
 
   # 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4112?src=pr&el=h1)
 Report
   > Merging 
[#4112](https://codecov.io/gh/apache/incubator-airflow/pull/4112?src=pr&el=desc)
 into 
[master](https://codecov.io/gh/apache/incubator-airflow/commit/9248e37727a0ff510103fa24088513845dafa711?src=pr&el=desc)
 will **decrease** coverage by `0.98%`.
   > The diff coverage is `n/a`.
   
   [![Impacted file tree 
graph](https://codecov.io/gh/apache/incubator-airflow/pull/4112/graphs/tree.svg?width=650&token=WdLKlKHOAU&height=150&src=pr)](https://codecov.io/gh/apache/incubator-airflow/pull/4112?src=pr&el=tree)
   
   ```diff
   @@Coverage Diff @@
   ##   master#4112  +/-   ##
   ==
   - Coverage   77.66%   76.68%   -0.99% 
   ==
 Files 199  199  
 Lines   1627316189  -84 
   ==
   - Hits1263812414 -224 
   - Misses   3635 3775 +140
   ```
   
   
   | [Impacted 
Files](https://codecov.io/gh/apache/incubator-airflow/pull/4112?src=pr&el=tree) 
| Coverage Δ | |
   |---|---|---|
   | 
[airflow/operators/postgres\_operator.py](https://codecov.io/gh/apache/incubator-airflow/pull/4112/diff?src=pr&el=tree#diff-YWlyZmxvdy9vcGVyYXRvcnMvcG9zdGdyZXNfb3BlcmF0b3IucHk=)
 | `0% <0%> (-100%)` | :arrow_down: |
   | 
[airflow/operators/mysql\_to\_hive.py](https://codecov.io/gh/apache/incubator-airflow/pull/4112/diff?src=pr&el=tree#diff-YWlyZmxvdy9vcGVyYXRvcnMvbXlzcWxfdG9faGl2ZS5weQ==)
 | `0% <0%> (-100%)` | :arrow_down: |
   | 
[airflow/operators/mysql\_operator.py](https://codecov.io/gh/apache/incubator-airflow/pull/4112/diff?src=pr&el=tree#diff-YWlyZmxvdy9vcGVyYXRvcnMvbXlzcWxfb3BlcmF0b3IucHk=)
 | `0% <0%> (-100%)` | :arrow_down: |
   | 
[airflow/operators/generic\_transfer.py](https://codecov.io/gh/apache/incubator-airflow/pull/4112/diff?src=pr&el=tree#diff-YWlyZmxvdy9vcGVyYXRvcnMvZ2VuZXJpY190cmFuc2Zlci5weQ==)
 | `0% <0%> (-100%)` | :arrow_down: |
   | 
[airflow/www\_rbac/utils.py](https://codecov.io/gh/apache/incubator-airflow/pull/4112/diff?src=pr&el=tree#diff-YWlyZmxvdy93d3dfcmJhYy91dGlscy5weQ==)
 | `68.94% <0%> (-5.15%)` | :arrow_down: |
   | 
[airflow/hooks/dbapi\_hook.py](https://codecov.io/gh/apache/incubator-airflow/pull/4112/diff?src=pr&el=tree#diff-YWlyZmxvdy9ob29rcy9kYmFwaV9ob29rLnB5)
 | `79.83% <0%> (-3.23%)` | :arrow_down: |
   | 
[airflow/hooks/postgres\_hook.py](https://codecov.io/gh/apache/incubator-airflow/pull/4112/diff?src=pr&el=tree#diff-YWlyZmxvdy9ob29rcy9wb3N0Z3Jlc19ob29rLnB5)
 | `91.66% <0%> (-2.78%)` | :arrow_down: |
   | 
[airflow/hooks/mysql\_hook.py](https://codecov.io/gh/apache/incubator-airflow/pull/4112/diff?src=pr&el=tree#diff-YWlyZmxvdy9ob29rcy9teXNxbF9ob29rLnB5)
 | `90.38% <0%> (-2.6%)` | :arrow_down: |
   | 
[airflow/hooks/hive\_hooks.py](https://codecov.io/gh/apache/incubator-airflow/pull/4112/diff?src=pr&el=tree#diff-YWlyZmxvdy9ob29rcy9oaXZlX2hvb2tzLnB5)
 | `73.42% <0%> (-1.85%)` | :arrow_down: |
   | 
[airflow/www\_rbac/security.py](https://codecov.io/gh/apache/incubator-airflow/pull/4112/diff?src=pr&el=tree#diff-YWlyZmxvdy93d3dfcmJhYy9zZWN1cml0eS5weQ==)
 | `91.27% <0%> (-1.35%)` | :arrow_down: |
   | ... and [20 
more](https://codecov.io/gh/apache/incubator-airflow/pull/4112/diff?src=pr&el=tree-more)
 | |
   
   --
   
   [Continue to review full report at 
Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4112?src=pr&el=continue).
   > **Legend** - [Click here to learn 
more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute  (impact)`, `ø = not affected`, `? = missing data`
   > Powered by 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4112?src=pr&el=footer).
 Last update 
[9248e37...9514482](https://codecov.io/gh/apache/incubator-airflow/pull/4112?src=pr&el=lastupdated).
 Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] mikemole commented on issue #4112: [AIRFLOW-3212] Add AwsGlueCatalogPartitionSensor

2018-11-08 Thread GitBox
mikemole commented on issue #4112: [AIRFLOW-3212] Add 
AwsGlueCatalogPartitionSensor
URL: 
https://github.com/apache/incubator-airflow/pull/4112#issuecomment-437116378
 
 
   @Fokko Ok, I rebased. Tests are running now.


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Created] (AIRFLOW-3313) Fix doc strings in dataproc operator referencing wrong type of properties

2018-11-08 Thread holdenk (JIRA)
holdenk created AIRFLOW-3313:


 Summary: Fix doc strings in dataproc operator referencing wrong 
type of properties
 Key: AIRFLOW-3313
 URL: https://issues.apache.org/jira/browse/AIRFLOW-3313
 Project: Apache Airflow
  Issue Type: Improvement
Reporter: holdenk
Assignee: holdenk


The docstrings in the Dataproc operators more or less copied the Pig docstring 
(which is fine), but forgot to change the property name to match the operator 
type.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Created] (AIRFLOW-3316) GCS to BQ operator leaves schema_fields operator unset when autodetect=True

2018-11-08 Thread Conrad Lee (JIRA)
Conrad Lee created AIRFLOW-3316:
---

 Summary: GCS to BQ operator leaves schema_fields operator unset 
when autodetect=True
 Key: AIRFLOW-3316
 URL: https://issues.apache.org/jira/browse/AIRFLOW-3316
 Project: Apache Airflow
  Issue Type: Bug
  Components: operators
Affects Versions: 1.10.1
Reporter: Conrad Lee
Assignee: Conrad Lee


When I use the GoogleCloudStorageToBigQueryOperator to load data from Parquet 
into BigQuery, I leave the schema_fields argument set to 'None' and set 
autodetect=True.

 

This causes the following error: 

 
{code:java}
[2018-11-08 09:42:03,690] {models.py:1736} ERROR - local variable 
'schema_fields' referenced before assignment
Traceback (most recent call last)
  File "/usr/local/lib/airflow/airflow/models.py", line 1633, in _run_raw_tas
result = task_copy.execute(context=context
  File "/home/airflow/gcs/plugins/bq_operator_updated.py", line 2018, in execut
schema_fields=schema_fields
UnboundLocalError: local variable 'schema_fields' referenced before assignmen
{code}
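For reference, a minimal invocation of the kind described above might look like 
the following sketch (bucket, object, and table names are placeholders, and a 
`dag` object is assumed; only `schema_fields` and `autodetect` are the 
parameters at issue):
{code:python}
from airflow.contrib.operators.gcs_to_bq import GoogleCloudStorageToBigQueryOperator

load_parquet = GoogleCloudStorageToBigQueryOperator(
    task_id='gcs_to_bq_autodetect',
    bucket='my-bucket',                          # placeholder
    source_objects=['data/part-*.parquet'],      # placeholder
    destination_project_dataset_table='my_project.my_dataset.my_table',
    source_format='PARQUET',
    schema_fields=None,    # left unset, as described above
    autodetect=True,       # the combination that triggers the UnboundLocalError
    dag=dag)
{code}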
 

The problem is that this set of checks, in which the schema_fields variable is 
set, neglects to cover all the cases:
{code:java}
if not self.schema_fields:
    if self.schema_object and self.source_format != 'DATASTORE_BACKUP':
        gcs_hook = GoogleCloudStorageHook(
            google_cloud_storage_conn_id=self.google_cloud_storage_conn_id,
            delegate_to=self.delegate_to)

        schema_fields = json.loads(gcs_hook.download(
            self.bucket,
            self.schema_object).decode("utf-8"))
    elif self.schema_object is None and self.autodetect is False:
        raise ValueError('At least one of `schema_fields`, `schema_object`, '
                         'or `autodetect` must be passed.')
else:
    schema_fields = self.schema_fields

{code}
After the `elif` we need to handle the case where autodetect is set to True.  
This can be done by simply adding two lines:
{code:java}
if not self.schema_fields:
    if self.schema_object and self.source_format != 'DATASTORE_BACKUP':
        gcs_hook = GoogleCloudStorageHook(
            google_cloud_storage_conn_id=self.google_cloud_storage_conn_id,
            delegate_to=self.delegate_to)

        schema_fields = json.loads(gcs_hook.download(
            self.bucket,
            self.schema_object).decode("utf-8"))
    elif self.schema_object is None and self.autodetect is False:
        raise ValueError('At least one of `schema_fields`, `schema_object`, '
                         'or `autodetect` must be passed.')
    else:
        schema_fields = None
else:
    schema_fields = self.schema_fields
{code}
 



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Created] (AIRFLOW-3312) No log output from BashOperator under test

2018-11-08 Thread Chris Bandy (JIRA)
Chris Bandy created AIRFLOW-3312:


 Summary: No log output from BashOperator under test
 Key: AIRFLOW-3312
 URL: https://issues.apache.org/jira/browse/AIRFLOW-3312
 Project: Apache Airflow
  Issue Type: Bug
  Components: logging, operators
Affects Versions: 1.10.0
Reporter: Chris Bandy


The BashOperator logs some messages as well as the stdout of its command at the 
info level, but none of these appear when running {{airflow test}} with the 
default configuration.

For example, this DAG emits the following in Airflow 1.10.0:
{code:python}
from airflow import DAG
from airflow.operators.bash_operator import BashOperator
from datetime import datetime

dag = DAG('please', start_date=datetime(year=2018, month=11, day=1))

BashOperator(dag=dag, task_id='mine', bash_command='echo thank you')
{code}

{noformat}
$ airflow test please mine '2018-11-01'
[2018-11-08 00:06:54,098] {__init__.py:51} INFO - Using executor 
SequentialExecutor
[2018-11-08 00:06:54,246] {models.py:258} INFO - Filling up the DagBag from 
/usr/local/airflow/dags
{noformat}

When executed by the scheduler, logs go to a file:

{noformat}
$ airflow scheduler -n 1
...
[2018-11-08 00:41:02,674] {dag_processing.py:582} INFO - Started a process 
(PID: 9) to generate tasks for /usr/local/airflow/dags/please.py
[2018-11-08 00:41:03,185] {dag_processing.py:495} INFO - Processor for 
/usr/local/airflow/dags/please.py finished
[2018-11-08 00:41:03,525] {jobs.py:1114} INFO - Tasks up for execution:

[2018-11-08 00:41:03,536] {jobs.py:1147} INFO - Figuring out tasks to run in 
Pool(name=None) with 128 open slots and 1 task instances in queue
[2018-11-08 00:41:03,539] {jobs.py:1184} INFO - DAG please has 0/16 running and 
queued tasks
[2018-11-08 00:41:03,540] {jobs.py:1216} INFO - Setting the follow tasks to 
queued state:

[2018-11-08 00:41:03,573] {jobs.py:1297} INFO - Setting the follow tasks to 
queued state:

[2018-11-08 00:41:03,576] {jobs.py:1339} INFO - Sending ('please', 'mine', 
datetime.datetime(2018, 11, 1, 0, 0, tzinfo=)) to executor with 
priority 1 and queue default
[2018-11-08 00:41:03,578] {base_executor.py:56} INFO - Adding to queue: airflow 
run please mine 2018-11-01T00:00:00+00:00 --local -sd 
/usr/local/airflow/dags/please.py
[2018-11-08 00:41:03,593] {sequential_executor.py:45} INFO - Executing command: 
airflow run please mine 2018-11-01T00:00:00+00:00 --local -sd 
/usr/local/airflow/dags/please.py
[2018-11-08 00:41:04,262] {__init__.py:51} INFO - Using executor 
SequentialExecutor
[2018-11-08 00:41:04,406] {models.py:258} INFO - Filling up the DagBag from 
/usr/local/airflow/dags/please.py
[2018-11-08 00:41:04,458] {cli.py:492} INFO - Running  on host e2e08cf4dfaa
[2018-11-08 00:41:09,684] {jobs.py:1443} INFO - Executor reports please.mine 
execution_date=2018-11-01 00:00:00+00:00 as success

$ cat logs/please/mine/2018-11-01T00\:00\:00+00\:00/1.log
[2018-11-08 00:41:04,554] {models.py:1335} INFO - Dependencies all met for 

[2018-11-08 00:41:04,564] {models.py:1335} INFO - Dependencies all met for 

[2018-11-08 00:41:04,565] {models.py:1547} INFO -

Starting attempt 1 of 1


[2018-11-08 00:41:04,605] {models.py:1569} INFO - Executing 
 on 2018-11-01T00:00:00+00:00
[2018-11-08 00:41:04,605] {base_task_runner.py:124} INFO - Running: ['bash', 
'-c', 'airflow run please mine 2018-11-01T00:00:00+00:00 --job_id 142 --raw -sd 
DAGS_FOLDER/please.py --cfg_path /tmp/tmp9prq7knr']
[2018-11-08 00:41:05,214] {base_task_runner.py:107} INFO - Job 142: Subtask 
mine [2018-11-08 00:41:05,213] {__init__.py:51} INFO - Using executor 
SequentialExecutor
[2018-11-08 00:41:05,334] {base_task_runner.py:107} INFO - Job 142: Subtask 
mine [2018-11-08 00:41:05,333] {models.py:258} INFO - Filling up the DagBag 
from /usr/local/airflow/dags/please.py
[2018-11-08 00:41:05,368] {base_task_runner.py:107} INFO - Job 142: Subtask 
mine [2018-11-08 00:41:05,367] {cli.py:492} INFO - Running  on host e2e08cf4dfaa
[2018-11-08 00:41:05,398] {bash_operator.py:74} INFO - Tmp dir root location:
 /tmp
[2018-11-08 00:41:05,398] {bash_operator.py:87} INFO - Temporary script 
location: /tmp/airflowtmp0is6wwxi/mine8tmew5y4
[2018-11-08 00:41:05,399] {bash_operator.py:97} INFO - Running command: echo 
thank you
[2018-11-08 00:41:05,402] {bash_operator.py:106} INFO - Output:
[2018-11-08 00:41:05,404] {bash_operator.py:110} INFO - thank you
[2018-11-08 00:41:05,404] {bash_operator.py:114} INFO - Command exited with 
return code 0
[2018-11-08 00:41:09,504] {logging_mixin.py:95} INFO - [2018-11-08 
00:41:09,503] {jobs.py:2612} INFO - Task exited with return code 0
{noformat}

 


This appears to be a regression. In Airflow 1.9.0, the same DAG with default 
configuration emi

[jira] [Commented] (AIRFLOW-3311) Allow Pod Operator to Retain Failed Pods

2018-11-08 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-3311?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16679791#comment-16679791
 ] 

ASF GitHub Bot commented on AIRFLOW-3311:
-

EamonKeane opened a new pull request #4160: [AIRFLOW-3311] Allow pod operator 
to keep failed pods
URL: https://github.com/apache/incubator-airflow/pull/4160
 
 
   Make sure you have checked _all_ steps below.
   
   ### Jira
   
   - [ ✔] My PR addresses the following [Airflow Jira]
   
   [AIRFLOW-3311] Allow pod operator to keep failed pods
   
https://issues.apache.org/jira/browse/AIRFLOW-3311?jql=text%20~%20%22keep%20failed%20pod%22
   
   ### Description
   
   - [✔ ] Here are some details about my PR, including screenshots of any UI 
changes:
   Extends AIRFLOW-2854 to enable keeping pods with non-zero exit codes for 
log inspection in Kubernetes clusters.
   
   ### Tests
   
   - [✔ ] My PR adds the following unit tests __OR__ does not need testing for 
this extremely good reason:
   tests/contrib/minikube/test_kubernetes_pod_operator.py:
   test_keep_failed_pod()
   
   ### Commits
   
   - [✔ ] My commits all reference Jira issues in their subject lines, and I 
have squashed multiple commits if they address the same issue. In addition, my 
commits follow the guidelines from "[How to write a good git commit 
message](http://chris.beams.io/posts/git-commit/)":
 1. Subject is separated from body by a blank line
 1. Subject is limited to 50 characters (not including Jira issue reference)
 1. Subject does not end with a period
 1. Subject uses the imperative mood ("add", not "adding")
 1. Body wraps at 72 characters
 1. Body explains "what" and "why", not "how"
   
   ### Documentation
   
   - [✔ ] In case of new functionality, my PR adds documentation that describes 
how to use it.
 - When adding new operators/hooks/sensors, the autoclass documentation 
generation needs to be added.
   
   ### Code Quality
   
   - [✔ ] Passes `flake8`
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


> Allow Pod Operator to Retain Failed Pods
> 
>
> Key: AIRFLOW-3311
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3311
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: contrib
>Affects Versions: 2.0.0
>Reporter: Eamon Keane
>Assignee: Eamon Keane
>Priority: Minor
>
> When using the pod operator it is convenient to be able to retain failed pods 
> for log inspection.
> AIRFLOW-2854 introduced the ability to delete pods made with the pod 
> operator; however, it only has configuration to delete all pods.
> This extends AIRFLOW-2854 to allow the user to specify a flag to keep pods 
> that exit with a non-zero exit code.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Resolved] (AIRFLOW-2780) Adds IMAP Hook to interact with a mail server

2018-11-08 Thread Felix Uellendall (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-2780?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Felix Uellendall resolved AIRFLOW-2780.
---
   Resolution: Done
Fix Version/s: 1.10.1

Hey [~ashb] 

Can I decide in which "Fix Version/s" the hook will be released, or who decides 
this?
I mean, when I pick "1.10.1" here, is it certain that it will be in this release?

> Adds IMAP Hook to interact with a mail server
> -
>
> Key: AIRFLOW-2780
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2780
> Project: Apache Airflow
>  Issue Type: New Feature
>Reporter: Felix Uellendall
>Assignee: Felix Uellendall
>Priority: Major
> Fix For: 1.10.1
>
>
> This Hook connects to a mail server via IMAP to be able to retrieve email 
> attachments by using [Python's IMAP 
> Library.|https://docs.python.org/3.6/library/imaplib.html]
> Features:
> - `has_mail_attachment`: Can be used in a `Sensor` to check if there is an 
> attachment on the mail server with the given name.
> - `retrieve_mail_attachments`: Can be used in an `Operator` to do something 
> with the attachments, which are returned as a list of tuples.
> - `download_mail_attachments`: Can be used in an `Operator` to download the 
> attachment.
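A minimal usage sketch based only on the method names listed above (the import 
path, connection id, and argument names are assumptions, not the hook's 
confirmed signature):
{code:python}
# Sketch only -- argument names and import path are assumptions.
from airflow.contrib.hooks.imap_hook import ImapHook

hook = ImapHook(imap_conn_id='imap_default')
if hook.has_mail_attachment(name='report.csv'):
    # Download matching attachments to a local directory for processing.
    hook.download_mail_attachments(name='report.csv',
                                   local_output_directory='/tmp')
{code}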



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (AIRFLOW-660) Impossible to record second task failure

2018-11-08 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-660?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16679822#comment-16679822
 ] 

ASF GitHub Bot commented on AIRFLOW-660:


ashb closed pull request #1911: [AIRFLOW-660] Add id autoincrement column to 
task_fail table
URL: https://github.com/apache/incubator-airflow/pull/1911
 
 
   

This is a PR merged from a forked repository.
As GitHub hides the original diff on merge, it is displayed below for
the sake of provenance:

As this is a foreign pull request (from a fork), the diff is supplied
below (as it won't show otherwise due to GitHub magic):

diff --git a/airflow/models.py b/airflow/models.py
index 61ea359a0d..5cdab3aefb 100755
--- a/airflow/models.py
+++ b/airflow/models.py
@@ -1611,6 +1611,7 @@ class TaskFail(Base):
 
 __tablename__ = "task_fail"
 
+id = Column(Integer, primary_key=True, autoincrement=True)
 task_id = Column(String(ID_LEN), primary_key=True)
 dag_id = Column(String(ID_LEN), primary_key=True)
 execution_date = Column(DateTime, primary_key=True)


 


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


> Impossible to record second task failure
> 
>
> Key: AIRFLOW-660
> URL: https://issues.apache.org/jira/browse/AIRFLOW-660
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: db
>Affects Versions: 1.8.0
>Reporter: Alexander Shorin
>Priority: Blocker
>
> {code}
> /var/log/airflow/airflow_scheduler_err.log.10: [SQL: 'INSERT INTO task_fail 
> (task_id, dag_id, execution_date, start_date, end_date, duration) VALUES 
> (%(task_id)s, %(dag_id)s, %(execution_date)s, %(start_date)s, %(end_date)s, 
> %(duration)s)'] [parameters: {'task_id': 'test_task', 'end_date': 
> datetime.datetime(2016, 11, 30, 14, 38, 39, 197485), 'execution_date': 
> datetime.datetime(2016, 11, 30, 0, 0), 'duration': 331.723087, 'start_date': 
> datetime.datetime(2016, 11, 30, 14, 33, 7, 474398), 'dag_id': 'test_dag'}]
> /var/log/airflow/airflow_scheduler_err.log.10-Process 
> DagFileProcessor314-Process:
> /var/log/airflow/airflow_scheduler_err.log.10-Traceback (most recent call 
> last):
> /var/log/airflow/airflow_scheduler_err.log.10-  File 
> "/usr/local/lib/python2.7/multiprocessing/process.py", line 258, in _bootstrap
> /var/log/airflow/airflow_scheduler_err.log.10-self.run()
> /var/log/airflow/airflow_scheduler_err.log.10-  File 
> "/usr/local/lib/python2.7/multiprocessing/process.py", line 114, in run
> /var/log/airflow/airflow_scheduler_err.log.10-self._target(*self._args, 
> **self._kwargs)
> /var/log/airflow/airflow_scheduler_err.log.10-  File 
> "/usr/local/lib/python2.7/site-packages/airflow/jobs.py", line 318, in helper
> /var/log/airflow/airflow_scheduler_err.log.10-pickle_dags)
> /var/log/airflow/airflow_scheduler_err.log.10-  File 
> "/usr/local/lib/python2.7/site-packages/airflow/utils/db.py", line 56, in 
> wrapper
> /var/log/airflow/airflow_scheduler_err.log.10-session.commit()
> /var/log/airflow/airflow_scheduler_err.log.10-  File 
> "/usr/local/lib/python2.7/site-packages/sqlalchemy/orm/session.py", line 813, 
> in commit
> /var/log/airflow/airflow_scheduler_err.log.10-self.transaction.commit()
> /var/log/airflow/airflow_scheduler_err.log.10-  File 
> "/usr/local/lib/python2.7/site-packages/sqlalchemy/orm/session.py", line 390, 
> in commit
> /var/log/airflow/airflow_scheduler_err.log.10-
> self._assert_active(prepared_ok=True)
> /var/log/airflow/airflow_scheduler_err.log.10-  File 
> "/usr/local/lib/python2.7/site-packages/sqlalchemy/orm/session.py", line 214, 
> in _assert_active
> /var/log/airflow/airflow_scheduler_err.log.10-% self._rollback_exception
> /var/log/airflow/airflow_scheduler_err.log.10:InvalidRequestError: This 
> Session's transaction has been rolled back due to a previous exception during 
> flush. To begin a new transaction with this Session, first issue 
> Session.rollback(). Original exception was: (psycopg2.IntegrityError) 
> duplicate key value violates unique constraint "task_fail_pkey"
> /var/log/airflow/airflow_scheduler_err.log.10-DETAIL:  Key (task_id, dag_id, 
> execution_date)=(test_dag, test_task, 2016-11-30 00:00:00) already exists.
> {code}



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (AIRFLOW-3182) 'all_done' trigger rule works incorrectly with BranchPythonOperator upstream tasks

2018-11-08 Thread jack (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-3182?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16679567#comment-16679567
 ] 

jack commented on AIRFLOW-3182:
---

Could the trigger rule you set be wrong?

I think you should do:
run_aggregation = PythonOperator(
    task_id='daily_aggregation',
    python_callable=run_daily_aggregation,
    provide_context=True,
    trigger_rule=TriggerRule.ALL_SUCCESS,
    dag=dag
)

This means that daily_aggregation will start only when start, hour_branching and 
task_for_hour-23 are successful.

In your example daily_aggregation will start when start, hour_branching and 
task_for_hour-23 are done, and SKIPPED counts as done, so it makes sense that it 
runs every hour.

> 'all_done' trigger rule works incorrectly with BranchPythonOperator upstream 
> tasks
> --
>
> Key: AIRFLOW-3182
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3182
> Project: Apache Airflow
>  Issue Type: Bug
>Affects Versions: 1.9.0, 1.10.0
>Reporter: Greg H
>Priority: Major
> Attachments: BrannchPythonOperator.png
>
>
> We have a job that runs some data processing every hour. At the end of the 
> day we need to run aggregation on all data generated by the 'hourly' jobs, 
> regardless if any 'hourly' job failed or not. For this purpose we have 
> prepared DAG that uses BranchPythonOperator in order to decide which 'hourly' 
> job needs to be run in given time and when task for hour 23 is done, we 
> trigger the aggregation (downstream). For this to work regardless of the last 
> 'hourly' task status the *'all_done'* trigger rule is set in the aggregation 
> task. Unfortunately, such configuration works incorrectly causing aggregation 
> task to be run after every 'hourly' task, despite the fact the aggregation 
> task is set as downstream for 'task_for_hour-23' +only+:
>   !BrannchPythonOperator.png!
> Here's sample code:
> {code:java}
> # coding: utf-8
> from airflow import DAG
> from airflow.operators.python_operator import PythonOperator
> from airflow.operators.python_operator import BranchPythonOperator
> from airflow.operators.dummy_operator import DummyOperator
> from airflow.models import TriggerRule
> from datetime import datetime
> import logging
> dag_id = 'test'
> today = datetime.today().strftime("%Y-%m-%d");
> task_prefix = 'task_for_hour-'
> default_args = {
> 'owner': 'airflow',
> 'depends_on_past': False,
> 'start_date': datetime(2018, 6, 18),
> 'catchup': False,
> }
> dag = DAG(
> dag_id=dag_id,
> default_args=default_args,
> schedule_interval="@hourly",
> catchup=False
> )
> # Setting the current hour
> def get_current_hour():
> return datetime.now().hour
> # Returns the name id of the task to launch next (task_for_hour-0, 
> task_for_hour-1, etc.)
> def branch():
> return task_prefix + str(get_current_hour())
> # Running hourly job
> def run_hourly_job(**kwargs):
> current_hour = get_current_hour()
> logging.info("Running job for hour: %s" % current_hour)
> # Main daily aggregation
> def run_daily_aggregation(**kwargs):
> logging.info("Running daily aggregation for %s" % today)
> 
> start_task = DummyOperator(
> task_id='start',
> dag=dag
> )
> # 'branch' method returns name of the task to be run next.
> hour_branching = BranchPythonOperator(
> task_id='hour_branching',
> python_callable=branch,
> dag=dag)
> run_aggregation = PythonOperator(
> task_id='daily_aggregation',
> python_callable=run_daily_aggregation,
> provide_context=True,
> trigger_rule=TriggerRule.ALL_DONE,
> dag=dag
> )
> start_task.set_downstream(hour_branching)
> # Create tasks for each hour
> for hour in range(24):
> if hour == 23:
> task_for_hour_23 = PythonOperator(
> task_id=task_prefix + '23',
> python_callable=run_hourly_job,
> provide_context=True,
> dag=dag
> )
> hour_branching.set_downstream(task_for_hour_23)
> task_for_hour_23.set_downstream(run_aggregation)
> else:
> hour_branching.set_downstream(PythonOperator(
> task_id=task_prefix + str(hour),
> python_callable=run_hourly_job,
> provide_context=True,
> dag=dag)
> )
> {code}
> This may also be related to AIRFLOW-1419



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Resolved] (AIRFLOW-2799) Filtering UI objects by datetime is broken

2018-11-08 Thread Fokko Driesprong (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-2799?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Fokko Driesprong resolved AIRFLOW-2799.
---
Resolution: Fixed

> Filtering UI objects by datetime is broken 
> ---
>
> Key: AIRFLOW-2799
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2799
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: ui, webserver
>Affects Versions: 1.10.0
> Environment: Debian Stretch, Python 3.5.3
>Reporter: Kevin Campbell
>Assignee: Ash Berlin-Taylor
>Priority: Major
> Fix For: 1.10.1
>
>
> On master (49fd23a3ee0269e2b974648f4a823c1d0b6c12ec) searching objects via 
> the user interface is broken for datetime fields.
> Create a new installation
>  Create a test dag (example_bash_operator)
>  Start webserver and scheduler
>  Enable dag
> On web UI, go to Browse > Task Instances
>  Search for task instances with execution_date greater than 5 days ago
>  You will get an exception
> {code:java}
>   / (  ()   )  \___
>  /( (  (  )   _))  )   )\
>(( (   )()  )   (   )  )
>  ((/  ( _(   )   (   _) ) (  () )  )
> ( (  ( (_)   (((   )  .((_ ) .  )_
>( (  )(  (  ))   ) . ) (   )
>   (  (   (  (   ) (  _  ( _) ).  ) . ) ) ( )
>   ( (  (   ) (  )   (  )) ) _)(   )  )  )
>  ( (  ( \ ) ((_  ( ) ( )  )   ) )  )) ( )
>   (  (   (  (   (_ ( ) ( _)  ) (  )  )   )
>  ( (  ( (  (  ) (_  )  ) )  _)   ) _( ( )
>   ((  (   )(( _)   _) _(_ (  (_ )
>(_((__(_(__(( ( ( |  ) ) ) )_))__))_)___)
>((__)\\||lll|l||///  \_))
> (   /(/ (  )  ) )\   )
>   (( ( ( | | ) ) )\   )
>(   /(| / ( )) ) ) )) )
>  ( ( _(|)_) )
>   (  ||\(|(|)|/|| )
> (|(||(||))
>   ( //|/l|||)|\\ \ )
> (/ / //  /|//\\  \ \  \ _)
> ---
> Node: wave.diffractive.io
> ---
> Traceback (most recent call last):
>   File 
> "/home/kev/.virtualenvs/airflow/local/lib/python3.5/site-packages/sqlalchemy/engine/base.py",
>  line 1116, in _execute_context
> context = constructor(dialect, self, conn, *args)
>   File 
> "/home/kev/.virtualenvs/airflow/local/lib/python3.5/site-packages/sqlalchemy/engine/default.py",
>  line 649, in _init_compiled
> for key in compiled_params
>   File 
> "/home/kev/.virtualenvs/airflow/local/lib/python3.5/site-packages/sqlalchemy/engine/default.py",
>  line 649, in 
> for key in compiled_params
>   File 
> "/home/kev/.virtualenvs/airflow/local/lib/python3.5/site-packages/sqlalchemy/sql/type_api.py",
>  line 1078, in process
> return process_param(value, dialect)
>   File 
> "/home/kev/.virtualenvs/airflow/local/lib/python3.5/site-packages/sqlalchemy_utc/sqltypes.py",
>  line 30, in process_bind_param
> raise ValueError('naive datetime is disallowed')
> ValueError: naive datetime is disallowed
> The above exception was the direct cause of the following exception:
> Traceback (most recent call last):
>   File 
> "/home/kev/.virtualenvs/airflow/local/lib/python3.5/site-packages/flask/app.py",
>  line 1982, in wsgi_app
> response = self.full_dispatch_request()
>   File 
> "/home/kev/.virtualenvs/airflow/local/lib/python3.5/site-packages/flask/app.py",
>  line 1614, in full_dispatch_request
> rv = self.handle_user_exception(e)
>   File 
> "/home/kev/.virtualenvs/airflow/local/lib/python3.5/site-packages/flask/app.py",
>  line 1517, in handle_user_exception
> reraise(exc_type, exc_value, tb)
>   File 
> "/home/kev/.virtualenvs/airflow/local/lib/python3.5/site-packages/flask/_compat.py",
>  line 33, in reraise
> raise value
>   File 
> "/home/kev/.virtualenvs/airflow/local/lib/python3.5/site-packages/flask/app.py",
>  line 1612, in full_dispatch_request
> rv = self.dispatch_request()
>   File 
> "/home/kev/.virtualenvs/airflow/local/lib/python3.5/site-packages/flask/app.py",
>  line 1598, in dispatch_request
> return self.view_functions[rule.endpoint](**req.view_args)
>   File 
> "/home/kev/.virtualenvs/airflow/local/lib/python3.5/site-packages/flask_admin/base.py",
>  line 69, in inner
> return self._run_view(f, *args, **kwargs)
>   File 
> "/home/kev/.virtualenvs/airflow/local/lib/python3.5/site-pack

[jira] [Commented] (AIRFLOW-3299) Logs for currently running sensors not visible in the UI

2018-11-08 Thread Chris Bandy (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-3299?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16679846#comment-16679846
 ] 

Chris Bandy commented on AIRFLOW-3299:
--

Possibly related to AIRFLOW-2143?

> Logs for currently running sensors not visible in the UI
> 
>
> Key: AIRFLOW-3299
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3299
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: ui
>Reporter: Brad Holmes
>Priority: Major
>
> When a task is actively running, the logs are not appearing.  I have tracked 
> this down to the {{next_try_number}} logic of task-instances.
> In [the view at line 
> 836|https://github.com/apache/incubator-airflow/blame/master/airflow/www/views.py#L836],
>  we have
> {code:java}
> logs = [''] * (ti.next_try_number - 1 if ti is not None else 0)
> {code}
> The length of the {{logs}} array informs the frontend on the number of 
> {{attempts}} that exist, and thus how many AJAX calls to make to load the 
> logs.
> Here is the current logic I have observed
> ||Task State||Current length of 'logs'||Needed length of 'logs'||
> |Successfully completed in 1 attempt|1|1|
> |Successfully completed in 2 attempt|2|2|
> |Not yet attempted|0|0|
> |Actively running task, first time|0|1|
> That last case is the bug.  Perhaps task-instance needs a method like 
> {{most_recent_try_number}} ?  I don't see how to make use of {{try_number()}} 
> or {{next_try_number()}} to meet the need here.
> ||Task State||try_number()||next_try_number()||Number of Attempts _Should_ 
> Display||
> |Successfully completed in 1 attempt|2|2|1|
> |Successfully completed in 2 attempt|3|3|2|
> |Not yet attempted|1|1|0|
> |Actively running task, first time|0|1|1|
> [~ashb] : You implemented this portion of task-instance 11 months ago.  Any 
> suggestions?  Or perhaps the problem is elsewhere?
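Purely as an illustration of the helper being asked about, here is a sketch that 
reproduces the "Should Display" column from the table above (the function name, 
the RUNNING check, and the reliance on next_try_number are assumptions, not the 
actual TaskInstance API):
{code:python}
from airflow.utils.state import State

# Illustrative only: derives "number of attempts to display" from the
# behaviour table above, not from the real TaskInstance internals.
def attempts_to_display(ti):
    if ti is None:
        return 0                       # no task instance yet
    if ti.state == State.RUNNING:
        return ti.next_try_number      # actively running first try -> 1
    return ti.next_try_number - 1      # finished: n attempts, never started: 0
{code}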



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (AIRFLOW-2854) kubernetes_pod_operator add more configuration items

2018-11-08 Thread Eamon Keane (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-2854?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16680069#comment-16680069
 ] 

Eamon Keane commented on AIRFLOW-2854:
--

hi [~devbarry], nice work. I extended this in 3311 to allow users to keep 
failed pods (handy for log inspection). Let me know if you have any comments or 
would have done it differently.

 

https://issues.apache.org/jira/browse/AIRFLOW-3311

> kubernetes_pod_operator add more configuration items
> 
>
> Key: AIRFLOW-2854
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2854
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: contrib
>Affects Versions: 2.0.0
>Reporter: pengchen
>Assignee: pengchen
>Priority: Minor
> Fix For: 2.0.0, 1.10.1
>
>
> kubernetes_pod_operator is missing several Kubernetes pod-related configuration 
> items, as follows (a sketch follows the list):
> 1. image_pull_secrets
> _Pull secrets_ are used to _pull_ private container _images_ from registries. 
> In this case, we need to configure image_pull_secrets in the pod spec file.
> 2. service_account_name
> When Kubernetes is running with RBAC authorization and a job needs to operate 
> on Kubernetes resources, we need to configure a service account.
> 3. is_delete_operator_pod
> This option can be given to the user to decide whether to delete the job pod 
> created by the pod operator, which is currently not handled.
> 4. hostnetwork
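Purely as an illustration of the four items above, a minimal sketch of passing 
them to the pod operator (image, secret, and service-account values are 
placeholders, and a `dag` object is assumed; this is a sketch, not the merged 
API surface):
{code:python}
from airflow.contrib.operators.kubernetes_pod_operator import KubernetesPodOperator

task = KubernetesPodOperator(
    task_id='pod-example',
    name='pod-example',
    namespace='default',
    image='private.registry.example/app:latest',  # placeholder private image
    image_pull_secrets='my-registry-secret',      # item 1
    service_account_name='airflow-worker',        # item 2 (RBAC clusters)
    is_delete_operator_pod=True,                  # item 3
    hostnetwork=False,                            # item 4
    dag=dag)
{code}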



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] codecov-io commented on issue #4160: [AIRFLOW-3311] Allow pod operator to keep failed pods

2018-11-08 Thread GitBox
codecov-io commented on issue #4160: [AIRFLOW-3311] Allow pod operator to keep 
failed pods
URL: 
https://github.com/apache/incubator-airflow/pull/4160#issuecomment-437080435
 
 
   # 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4160?src=pr&el=h1)
 Report
   > Merging 
[#4160](https://codecov.io/gh/apache/incubator-airflow/pull/4160?src=pr&el=desc)
 into 
[master](https://codecov.io/gh/apache/incubator-airflow/commit/9248e37727a0ff510103fa24088513845dafa711?src=pr&el=desc)
 will **decrease** coverage by `<.01%`.
   > The diff coverage is `n/a`.
   
   [![Impacted file tree 
graph](https://codecov.io/gh/apache/incubator-airflow/pull/4160/graphs/tree.svg?width=650&token=WdLKlKHOAU&height=150&src=pr)](https://codecov.io/gh/apache/incubator-airflow/pull/4160?src=pr&el=tree)
   
   ```diff
   @@Coverage Diff @@
   ##   master#4160  +/-   ##
   ==
   - Coverage   77.66%   77.65%   -0.01% 
   ==
 Files 199  199  
 Lines   1627316273  
   ==
   - Hits1263812637   -1 
   - Misses   3635 3636   +1
   ```
   
   
   | [Impacted 
Files](https://codecov.io/gh/apache/incubator-airflow/pull/4160?src=pr&el=tree) 
| Coverage Δ | |
   |---|---|---|
   | 
[airflow/models.py](https://codecov.io/gh/apache/incubator-airflow/pull/4160/diff?src=pr&el=tree#diff-YWlyZmxvdy9tb2RlbHMucHk=)
 | `92.15% <0%> (-0.05%)` | :arrow_down: |
   
   --
   
   [Continue to review full report at 
Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4160?src=pr&el=continue).
   > **Legend** - [Click here to learn 
more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute  (impact)`, `ø = not affected`, `? = missing data`
   > Powered by 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4160?src=pr&el=footer).
 Last update 
[9248e37...a8203aa](https://codecov.io/gh/apache/incubator-airflow/pull/4160?src=pr&el=lastupdated).
 Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] XD-DENG closed pull request #4158: [AIRFLOW-XXX] Pin version of MarkupSafe to fix "TestAirflowBaseViews" CI failure

2018-11-08 Thread GitBox
XD-DENG closed pull request #4158: [AIRFLOW-XXX] Pin version of MarkupSafe to 
fix "TestAirflowBaseViews" CI failure
URL: https://github.com/apache/incubator-airflow/pull/4158
 
 
   

This is a PR merged from a forked repository.
As GitHub hides the original diff on merge, it is displayed below for
the sake of provenance:

As this is a foreign pull request (from a fork), the diff is supplied
below (as it won't show otherwise due to GitHub magic):

diff --git a/setup.py b/setup.py
index d093d4971f..a1e109a135 100644
--- a/setup.py
+++ b/setup.py
@@ -329,6 +329,7 @@ def do_setup():
 'unicodecsv>=0.14.1',
 'werkzeug>=0.14.1, <0.15.0',
 'zope.deprecation>=4.0, <5.0',
+'MarkupSafe==1.0',
 ],
 setup_requires=[
 'docutils>=0.14, <1.0',


 


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] XD-DENG commented on issue #4158: [AIRFLOW-XXX] Pin version of MarkupSafe to fix "TestAirflowBaseViews" CI failure

2018-11-08 Thread GitBox
XD-DENG commented on issue #4158: [AIRFLOW-XXX] Pin version of MarkupSafe to 
fix "TestAirflowBaseViews" CI failure
URL: 
https://github.com/apache/incubator-airflow/pull/4158#issuecomment-437056278
 
 
   Closing this PR now. Let's re-open it later if necessary.


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] oelesinsc24 commented on a change in pull request #4068: [AIRFLOW-2310]: Add AWS Glue Job Compatibility to Airflow

2018-11-08 Thread GitBox
oelesinsc24 commented on a change in pull request #4068: [AIRFLOW-2310]: Add 
AWS Glue Job Compatibility to Airflow
URL: https://github.com/apache/incubator-airflow/pull/4068#discussion_r231928642
 
 

 ##
 File path: airflow/contrib/hooks/aws_glue_job_hook.py
 ##
 @@ -0,0 +1,126 @@
+# -*- coding: utf-8 -*-
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+
+from airflow.exceptions import AirflowException
+from airflow.contrib.hooks.aws_hook import AwsHook
+import time
+from botocore.exceptions import ClientError
+
+
+class AwsGlueJobHook(AwsHook):
+"""
+Interact with AWS Glue - create job
+
+:param job_name: unique job name per AWS account
+:type str
+:param desc: job description
+:type str
+:param aws_conn_id: aws connection id
+:type aws_conn_id: str
+"""
+
+def __init__(self,
+ job_name=None,
+ desc=None,
+ aws_conn_id='aws_default',
+ *args, **kwargs):
+self.job_name = job_name
+self.desc = desc
+self.aws_conn_id = aws_conn_id
+super(AwsGlueJobHook, self).__init__(self.aws_conn_id, *args, **kwargs)
+
+def get_conn(self):
+return self.get_client_type('glue')
+
+def list_jobs(self):
+conn = self.get_conn()
+return conn.get_jobs()
+
+def initialize_job(self, script_arguments=None):
+"""
+Initializes connection with AWS Glue
+to run job
+:return:
+"""
+glue_client = self.get_conn()
+
+try:
+job_response = self.get_glue_job()
+job_name = job_response['Name']
+job_run = glue_client.start_job_run(
+JobName=job_name,
+Arguments=script_arguments
+)
+return self.job_completion(job_name, job_run['JobRunId'])
+except ClientError as general_error:
+raise AirflowException(
+'Failed to run aws glue job, error: {error}'.format(
+error=str(general_error)
+)
+)
+
+def job_completion(self, job_name=None, run_id=None):
+"""
+:param job_name:
+:param run_id:
+:return:
+"""
+glue_client = self.get_conn()
+job_status = glue_client.get_job_run(
+JobName=job_name,
+RunId=run_id,
+PredecessorsIncluded=True
+)
+job_run_state = job_status['JobRun']['JobRunState']
+
+while True:
 
 Review comment:
   You're right. Adjusting this now
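   For illustration, one bounded variant of that polling loop might look like 
the sketch below (the terminal states, poll interval, and retry limit are 
assumptions, not what this PR actually implements):
   ```python
   # Sketch only: bounds the poll instead of looping with `while True`.
   # AirflowException and self.get_conn() come from the hook shown above;
   # the terminal states, interval, and limit here are assumptions.
   import time

   GLUE_TERMINAL_STATES = ('SUCCEEDED', 'FAILED', 'STOPPED', 'TIMEOUT')

   def job_completion(self, job_name=None, run_id=None,
                      poll_interval=10, max_polls=360):
       glue_client = self.get_conn()
       for _ in range(max_polls):
           job_status = glue_client.get_job_run(
               JobName=job_name,
               RunId=run_id,
               PredecessorsIncluded=True
           )
           state = job_status['JobRun']['JobRunState']
           if state in GLUE_TERMINAL_STATES:
               return {'JobRunState': state, 'JobRunId': run_id}
           time.sleep(poll_interval)
       raise AirflowException(
           'AWS Glue job run {run_id} did not reach a terminal state '
           'after {n} polls'.format(run_id=run_id, n=max_polls))
   ```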


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] XD-DENG commented on issue #4158: [AIRFLOW-XXX] Pin version of MarkupSafe to fix "TestAirflowBaseViews" CI failure

2018-11-08 Thread GitBox
XD-DENG commented on issue #4158: [AIRFLOW-XXX] Pin version of MarkupSafe to 
fix "TestAirflowBaseViews" CI failure
URL: 
https://github.com/apache/incubator-airflow/pull/4158#issuecomment-437027850
 
 
   Yes, I'm chatting with Ash in Slack and checking the release history of the 
dependencies. Let's change the project name to "incubator-holmes"


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] kaxil commented on issue #4158: [AIRFLOW-XXX] Pin version of MarkupSafe to fix "TestAirflowBaseViews" CI failure

2018-11-08 Thread GitBox
kaxil commented on issue #4158: [AIRFLOW-XXX] Pin version of MarkupSafe to fix 
"TestAirflowBaseViews" CI failure
URL: 
https://github.com/apache/incubator-airflow/pull/4158#issuecomment-437026677
 
 
   Yup - Let's wait to see if we need this. The fact that it is transient is 
very, very frustrating


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] kaxil closed pull request #4159: [AIRFLOW-XXX] Add GeneCards to companies list

2018-11-08 Thread GitBox
kaxil closed pull request #4159: [AIRFLOW-XXX] Add GeneCards to companies list
URL: https://github.com/apache/incubator-airflow/pull/4159
 
 
   

This is a PR merged from a forked repository.
As GitHub hides the original diff on merge, it is displayed below for
the sake of provenance:

As this is a foreign pull request (from a fork), the diff is supplied
below (as it won't show otherwise due to GitHub magic):

diff --git a/README.md b/README.md
index ab3fca4b35..5df519983b 100644
--- a/README.md
+++ b/README.md
@@ -167,6 +167,7 @@ Currently **officially** using Airflow:
 1. [Fundera](https://fundera.com) 
[[@andyxhadji](https://github.com/andyxhadji)]
 1. [G Adventures](https://gadventures.com) 
[[@samuelmullin](https://github.com/samuelmullin)]
 1. [GameWisp](https://gamewisp.com) [[@tjbiii](https://github.com/TJBIII) & 
[@theryanwalls](https://github.com/theryanwalls)]
+1. [GeneCards](https://www.genecards.org) 
[[@oferze](https://github.com/oferze)]
 1. [Gentner Lab](http://github.com/gentnerlab) 
[[@neuromusic](https://github.com/neuromusic)]
 1. [Glassdoor](https://github.com/Glassdoor) 
[[@syvineckruyk](https://github.com/syvineckruyk) & 
[@sid88in](https://github.com/sid88in)]
 1. [Global Fashion Group](http://global-fashion-group.com) 
[[@GFG](https://github.com/GFG)]


 


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] codecov-io commented on issue #4161: [AIRFLOW-3315] Add ImapAttachmentSensor

2018-11-08 Thread GitBox
codecov-io commented on issue #4161: [AIRFLOW-3315] Add ImapAttachmentSensor
URL: 
https://github.com/apache/incubator-airflow/pull/4161#issuecomment-437019165
 
 
   # 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4161?src=pr&el=h1)
 Report
   > Merging 
[#4161](https://codecov.io/gh/apache/incubator-airflow/pull/4161?src=pr&el=desc)
 into 
[master](https://codecov.io/gh/apache/incubator-airflow/commit/848e432865da82c86918bac8b466d17dc12cd84c?src=pr&el=desc)
 will **increase** coverage by `0.01%`.
   > The diff coverage is `n/a`.
   
   [![Impacted file tree 
graph](https://codecov.io/gh/apache/incubator-airflow/pull/4161/graphs/tree.svg?width=650&token=WdLKlKHOAU&height=150&src=pr)](https://codecov.io/gh/apache/incubator-airflow/pull/4161?src=pr&el=tree)
   
   ```diff
   @@Coverage Diff @@
   ##   master#4161  +/-   ##
   ==
   + Coverage   77.52%   77.54%   +0.01% 
   ==
 Files 199  199  
 Lines   1623416234  
   ==
   + Hits1258612588   +2 
   + Misses   3648 3646   -2
   ```
   
   
   | [Impacted 
Files](https://codecov.io/gh/apache/incubator-airflow/pull/4161?src=pr&el=tree) 
| Coverage Δ | |
   |---|---|---|
   | 
[airflow/models.py](https://codecov.io/gh/apache/incubator-airflow/pull/4161/diff?src=pr&el=tree#diff-YWlyZmxvdy9tb2RlbHMucHk=)
 | `92.15% <0%> (-0.05%)` | :arrow_down: |
   | 
[airflow/jobs.py](https://codecov.io/gh/apache/incubator-airflow/pull/4161/diff?src=pr&el=tree#diff-YWlyZmxvdy9qb2JzLnB5)
 | `77.36% <0%> (+0.27%)` | :arrow_up: |
   
   --
   
   [Continue to review full report at 
Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4161?src=pr&el=continue).
   > **Legend** - [Click here to learn 
more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute  (impact)`, `ø = not affected`, `? = missing data`
   > Powered by 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4161?src=pr&el=footer).
 Last update 
[848e432...edff092](https://codecov.io/gh/apache/incubator-airflow/pull/4161?src=pr&el=lastupdated).
 Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] Fokko closed pull request #4061: [AIRFLOW-2799] Fix filtering UI objects by datetime

2018-11-08 Thread GitBox
Fokko closed pull request #4061: [AIRFLOW-2799] Fix filtering UI objects by 
datetime
URL: https://github.com/apache/incubator-airflow/pull/4061
 
 
   

This is a PR merged from a forked repository.
As GitHub hides the original diff on merge, it is displayed below for
the sake of provenance:

As this is a foreign pull request (from a fork), the diff is supplied
below (as it won't show otherwise due to GitHub magic):

diff --git a/airflow/utils/timezone.py b/airflow/utils/timezone.py
index 6d49fbcbb3..5adaa2f5c4 100644
--- a/airflow/utils/timezone.py
+++ b/airflow/utils/timezone.py
@@ -164,9 +164,9 @@ def datetime(*args, **kwargs):
 return dt.datetime(*args, **kwargs)
 
 
-def parse(string):
+def parse(string, timezone=None):
 """
 Parse a time string and return an aware datetime
 :param string: time string
 """
-return pendulum.parse(string, tz=TIMEZONE)
+return pendulum.parse(string, tz=timezone or TIMEZONE)
diff --git a/airflow/www/utils.py b/airflow/www/utils.py
index 6404d27f95..757d2268bf 100644
--- a/airflow/www/utils.py
+++ b/airflow/www/utils.py
@@ -38,7 +38,7 @@
 
 from flask import after_this_request, request, Response
 from flask_admin.model import filters
-from flask_admin.contrib.sqla.filters import FilterConverter
+import flask_admin.contrib.sqla.filters as sqlafilters
 from flask_login import current_user
 
 from airflow import configuration, models, settings
@@ -448,7 +448,43 @@ def __call__(self, field, **kwargs):
 return wtforms.widgets.core.HTMLString(html)
 
 
-class UtcFilterConverter(FilterConverter):
+class UtcDateTimeFilterMixin(object):
+def clean(self, value):
+dt = super(UtcDateTimeFilterMixin, self).clean(value)
+return timezone.make_aware(dt, timezone=timezone.utc)
+
+
+class UtcDateTimeEqualFilter(UtcDateTimeFilterMixin, 
sqlafilters.DateTimeEqualFilter):
+pass
+
+
+class UtcDateTimeNotEqualFilter(UtcDateTimeFilterMixin, 
sqlafilters.DateTimeNotEqualFilter):
+pass
+
+
+class UtcDateTimeGreaterFilter(UtcDateTimeFilterMixin, 
sqlafilters.DateTimeGreaterFilter):
+pass
+
+
+class UtcDateTimeSmallerFilter(UtcDateTimeFilterMixin, 
sqlafilters.DateTimeSmallerFilter):
+pass
+
+
+class UtcDateTimeBetweenFilter(UtcDateTimeFilterMixin, 
sqlafilters.DateTimeBetweenFilter):
+pass
+
+
+class UtcDateTimeNotBetweenFilter(UtcDateTimeFilterMixin, 
sqlafilters.DateTimeNotBetweenFilter):
+pass
+
+
+class UtcFilterConverter(sqlafilters.FilterConverter):
+
+utcdatetime_filters = (UtcDateTimeEqualFilter, UtcDateTimeNotEqualFilter,
+   UtcDateTimeGreaterFilter, UtcDateTimeSmallerFilter,
+   UtcDateTimeBetweenFilter, 
UtcDateTimeNotBetweenFilter,
+   sqlafilters.FilterEmpty)
+
 @filters.convert('utcdatetime')
 def conv_utcdatetime(self, column, name, **kwargs):
-return self.conv_datetime(column, name, **kwargs)
+return [f(column, name, **kwargs) for f in self.utcdatetime_filters]
diff --git a/airflow/www_rbac/utils.py b/airflow/www_rbac/utils.py
index 0176a5312c..b25e1541ab 100644
--- a/airflow/www_rbac/utils.py
+++ b/airflow/www_rbac/utils.py
@@ -37,7 +37,10 @@
 from pygments import highlight, lexers
 from pygments.formatters import HtmlFormatter
 from flask import request, Response, Markup, url_for
-from airflow import configuration
+from flask_appbuilder.models.sqla.interface import SQLAInterface
+import flask_appbuilder.models.sqla.filters as fab_sqlafilters
+import sqlalchemy as sqla
+from airflow import configuration, settings
 from airflow.models import BaseOperator
 from airflow.operators.subdag_operator import SubDagOperator
 from airflow.utils import timezone
@@ -378,3 +381,69 @@ def get_chart_height(dag):
 charts, that is charts that take up space based on the size of the 
components within.
 """
 return 600 + len(dag.tasks) * 10
+
+
+class UtcAwareFilterMixin(object):
+def apply(self, query, value):
+value = timezone.parse(value, timezone=timezone.utc)
+
+return super(UtcAwareFilterMixin, self).apply(query, value)
+
+
+class UtcAwareFilterEqual(UtcAwareFilterMixin, fab_sqlafilters.FilterEqual):
+pass
+
+
+class UtcAwareFilterGreater(UtcAwareFilterMixin, 
fab_sqlafilters.FilterGreater):
+pass
+
+
+class UtcAwareFilterSmaller(UtcAwareFilterMixin, 
fab_sqlafilters.FilterSmaller):
+pass
+
+
+class UtcAwareFilterNotEqual(UtcAwareFilterMixin, 
fab_sqlafilters.FilterNotEqual):
+pass
+
+
+class UtcAwareFilterConverter(fab_sqlafilters.SQLAFilterConverter):
+
+conversion_table = (
+(('is_utcdatetime', [UtcAwareFilterEqual,
+ UtcAwareFilterGreater,
+ UtcAwareFilterSmaller,
+ UtcAwareFilterNotEqual]),) +
+fab_sqlafilters.SQLAFilterConverter.conversion_table
+)
+
+
+class CustomSQLAInterface(SQLAInterface):
+"""
+FAB does not know how to handle columns with leadin

[GitHub] ashb closed pull request #1911: [AIRFLOW-660] Add id autoincrement column to task_fail table

2018-11-08 Thread GitBox
ashb closed pull request #1911: [AIRFLOW-660] Add id autoincrement column to 
task_fail table
URL: https://github.com/apache/incubator-airflow/pull/1911
 
 
   

This is a PR merged from a forked repository.
As GitHub hides the original diff on merge, it is displayed below for
the sake of provenance:

As this is a foreign pull request (from a fork), the diff is supplied
below (as it won't show otherwise due to GitHub magic):

diff --git a/airflow/models.py b/airflow/models.py
index 61ea359a0d..5cdab3aefb 100755
--- a/airflow/models.py
+++ b/airflow/models.py
@@ -1611,6 +1611,7 @@ class TaskFail(Base):
 
     __tablename__ = "task_fail"
 
+    id = Column(Integer, primary_key=True, autoincrement=True)
    task_id = Column(String(ID_LEN), primary_key=True)
    dag_id = Column(String(ID_LEN), primary_key=True)
    execution_date = Column(DateTime, primary_key=True)
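
As context, a simplified, self-contained SQLAlchemy sketch of the surrogate-key idea. This is not Airflow's models.py, and unlike the diff above it drops the other columns from the primary key so the example stays runnable on SQLite:

from datetime import datetime

from sqlalchemy import Column, DateTime, Integer, String, create_engine
from sqlalchemy.ext.declarative import declarative_base
from sqlalchemy.orm import sessionmaker

Base = declarative_base()


class TaskFailDemo(Base):
    __tablename__ = 'task_fail_demo'

    # Autoincrementing surrogate key: repeated failures of the same
    # task instance get distinct rows instead of colliding.
    id = Column(Integer, primary_key=True, autoincrement=True)
    task_id = Column(String(250))
    dag_id = Column(String(250))
    execution_date = Column(DateTime)


engine = create_engine('sqlite://')
Base.metadata.create_all(engine)
session = sessionmaker(bind=engine)()
session.add_all([
    TaskFailDemo(task_id='t', dag_id='d', execution_date=datetime(2018, 11, 8)),
    TaskFailDemo(task_id='t', dag_id='d', execution_date=datetime(2018, 11, 8)),
])
session.commit()
print([row.id for row in session.query(TaskFailDemo)])  # e.g. [1, 2]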


 


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] exploy commented on issue #1911: [AIRFLOW-660] Add id autoincrement column to task_fail table

2018-11-08 Thread GitBox
exploy commented on issue #1911: [AIRFLOW-660] Add id autoincrement column to 
task_fail table
URL: 
https://github.com/apache/incubator-airflow/pull/1911#issuecomment-437008764
 
 
   Can someone close it?


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] exploy commented on issue #1911: [AIRFLOW-660] Add id autoincrement column to task_fail table

2018-11-08 Thread GitBox
exploy commented on issue #1911: [AIRFLOW-660] Add id autoincrement column to 
task_fail table
URL: 
https://github.com/apache/incubator-airflow/pull/1911#issuecomment-437008548
 
 
   The `id` field is currently `primary_key=True`, so to me this PR is no longer 
needed. 


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] feluelle opened a new pull request #4161: [AIRFLOW-3315] Add ImapAttachmentSensor

2018-11-08 Thread GitBox
feluelle opened a new pull request #4161: [AIRFLOW-3315] Add 
ImapAttachmentSensor
URL: https://github.com/apache/incubator-airflow/pull/4161
 
 
   Make sure you have checked _all_ steps below.
   
   ### Jira
   
   - [x] My PR addresses the following [Airflow 
Jira](https://issues.apache.org/jira/browse/AIRFLOW/) issues and references 
them in the PR title. For example, "\[AIRFLOW-XXX\] My Airflow PR"
 - https://issues.apache.org/jira/browse/AIRFLOW-3315
 - In case you are fixing a typo in the documentation you can prepend your 
commit with \[AIRFLOW-XXX\], code changes always need a Jira issue.
   
   ### Description
   
   - [x] Here are some details about my PR, including screenshots of any UI 
changes:
   
   This PR adds a sensor that pokes a mail server for attachments whose name 
matches a given value.
   If a matching attachment is found, the sensor stops poking immediately and 
reports success.
   **This PR also updates the license header in imap_hook and test_imap_hook**
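   
   For orientation, a rough sketch of what such a sensor's poke() typically looks like in Airflow 1.10. This is not the PR's code; the ImapHook import path, constructor argument, and has_mail_attachment signature used below are assumptions based on the description above:

from airflow.contrib.hooks.imap_hook import ImapHook
from airflow.sensors.base_sensor_operator import BaseSensorOperator
from airflow.utils.decorators import apply_defaults


class ImapAttachmentSensorSketch(BaseSensorOperator):
    """Wait until a mail attachment matching `attachment_name` shows up."""

    @apply_defaults
    def __init__(self, attachment_name, mail_folder='INBOX', check_regex=False,
                 conn_id='imap_default', *args, **kwargs):
        super(ImapAttachmentSensorSketch, self).__init__(*args, **kwargs)
        self.attachment_name = attachment_name
        self.mail_folder = mail_folder
        self.check_regex = check_regex
        self.conn_id = conn_id

    def poke(self, context):
        hook = ImapHook(imap_conn_id=self.conn_id)
        # has_mail_attachment(...) is an assumed hook method returning a bool;
        # poke() simply reports whether a matching attachment exists yet.
        return hook.has_mail_attachment(name=self.attachment_name,
                                        mail_folder=self.mail_folder,
                                        check_regex=self.check_regex)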
   
   ### Tests
   
   - [x] My PR adds the following unit tests __OR__ does not need testing for 
this extremely good reason:
   
   This PR adds tests for:
   * test_poke_with_attachment_found
   * test_poke_with_attachment_not_found
   * test_poke_with_check_regex_true (tests if `check_regex=True` will be 
passed to the Hook's method)
   * test_poke_with_different_mail_folder (tests a mail folder other than 
`INBOX`)
   
   ### Commits
   
   - [x] My commits all reference Jira issues in their subject lines, and I 
have squashed multiple commits if they address the same issue. In addition, my 
commits follow the guidelines from "[How to write a good git commit 
message](http://chris.beams.io/posts/git-commit/)":
 1. Subject is separated from body by a blank line
 1. Subject is limited to 50 characters (not including Jira issue reference)
 1. Subject does not end with a period
 1. Subject uses the imperative mood ("add", not "adding")
 1. Body wraps at 72 characters
 1. Body explains "what" and "why", not "how"
   
   ### Documentation
   
   - [x] In case of new functionality, my PR adds documentation that describes 
how to use it.
 - When adding new operators/hooks/sensors, the autoclass documentation 
generation needs to be added.
   
   ### Code Quality
   
   - [x] Passes `flake8`
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] codecov-io edited a comment on issue #4158: [AIRFLOW-XXX] Try to pin version of MarkupSafe to fix CI failure

2018-11-08 Thread GitBox
codecov-io edited a comment on issue #4158: [AIRFLOW-XXX] Try to pin version of 
MarkupSafe to fix CI failure
URL: 
https://github.com/apache/incubator-airflow/pull/4158#issuecomment-436958003
 
 
   # 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4158?src=pr&el=h1)
 Report
   > Merging 
[#4158](https://codecov.io/gh/apache/incubator-airflow/pull/4158?src=pr&el=desc)
 into 
[master](https://codecov.io/gh/apache/incubator-airflow/commit/848e432865da82c86918bac8b466d17dc12cd84c?src=pr&el=desc)
 will **not change** coverage.
   > The diff coverage is `n/a`.
   
   [![Impacted file tree 
graph](https://codecov.io/gh/apache/incubator-airflow/pull/4158/graphs/tree.svg?width=650&token=WdLKlKHOAU&height=150&src=pr)](https://codecov.io/gh/apache/incubator-airflow/pull/4158?src=pr&el=tree)
   
   ```diff
    @@           Coverage Diff           @@
    ##           master    #4158   +/-   ##
    =======================================
      Coverage   77.52%   77.52%
    =======================================
      Files         199      199
      Lines       16234    16234
    =======================================
      Hits        12586    12586
      Misses       3648     3648
   ```
   
   
   
   --
   
   [Continue to review full report at 
Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4158?src=pr&el=continue).
   > **Legend** - [Click here to learn 
more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute  (impact)`, `ø = not affected`, `? = missing data`
   > Powered by 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4158?src=pr&el=footer).
 Last update 
[848e432...21faffd](https://codecov.io/gh/apache/incubator-airflow/pull/4158?src=pr&el=lastupdated).
 Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] XD-DENG edited a comment on issue #4158: [AIRFLOW-XXX] Try to pin version of MarkupSafe to fix CI failure

2018-11-08 Thread GitBox
XD-DENG edited a comment on issue #4158: [AIRFLOW-XXX] Try to pin version of 
MarkupSafe to fix CI failure
URL: 
https://github.com/apache/incubator-airflow/pull/4158#issuecomment-437001694
 
 
   Given that the failure this PR intends to resolve is transient, I’m not sure 
this is the correct solution. 
   
   We don’t have to merge it right now, but it may be worth revisiting/considering 
if we keep seeing **TestAirflowBaseViews**-related transient failures.


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] EamonKeane opened a new pull request #4160: [AIRFLOW-3311] Allow pod operator to keep failed pods

2018-11-08 Thread GitBox
EamonKeane opened a new pull request #4160: [AIRFLOW-3311] Allow pod operator 
to keep failed pods
URL: https://github.com/apache/incubator-airflow/pull/4160
 
 
   Make sure you have checked _all_ steps below.
   
   ### Jira
   
   - [ ✔] My PR addresses the following [Airflow Jira]
   
   [AIRFLOW-3311] Allow pod operator to keep failed pods
   
https://issues.apache.org/jira/browse/AIRFLOW-3311?jql=text%20~%20%22keep%20failed%20pod%22
   
   ### Description
   
   - [✔ ] Here are some details about my PR, including screenshots of any UI 
changes:
   Extends AIRFLOW-2854 to allow keeping pods with non-zero exit codes for 
log inspection in Kubernetes clusters.
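   
   For context, a hedged usage sketch (not the PR's code) of the existing AIRFLOW-2854 flag this change builds on; exactly how the new keep-failed-pods behaviour is exposed is not shown in this message. Image, namespace and command are illustrative values:

from airflow.contrib.operators.kubernetes_pod_operator import KubernetesPodOperator

keep_failed_pod_example = KubernetesPodOperator(
    task_id='keep_failed_pod_example',
    name='keep-failed-pod-example',
    namespace='default',
    image='python:3.6-alpine',
    cmds=['python', '-c', 'raise SystemExit(1)'],  # forces a non-zero exit code
    is_delete_operator_pod=False,  # existing AIRFLOW-2854 flag; the PR adjusts deletion of failed pods
)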
   
   ### Tests
   
   - [✔ ] My PR adds the following unit tests __OR__ does not need testing for 
this extremely good reason:
   tests/contrib/minikube/test_kubernetes_pod_operator.py:
   test_keep_failed_pod()
   
   ### Commits
   
   - [✔ ] My commits all reference Jira issues in their subject lines, and I 
have squashed multiple commits if they address the same issue. In addition, my 
commits follow the guidelines from "[How to write a good git commit 
message](http://chris.beams.io/posts/git-commit/)":
 1. Subject is separated from body by a blank line
 1. Subject is limited to 50 characters (not including Jira issue reference)
 1. Subject does not end with a period
 1. Subject uses the imperative mood ("add", not "adding")
 1. Body wraps at 72 characters
 1. Body explains "what" and "why", not "how"
   
   ### Documentation
   
   - [✔ ] In case of new functionality, my PR adds documentation that describes 
how to use it.
 - When adding new operators/hooks/sensors, the autoclass documentation 
generation needs to be added.
   
   ### Code Quality
   
   - [✔ ] Passes `flake8`
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] XD-DENG commented on issue #4158: [AIRFLOW-XXX] Try to pin version of MarkupSafe to fix CI failure

2018-11-08 Thread GitBox
XD-DENG commented on issue #4158: [AIRFLOW-XXX] Try to pin version of 
MarkupSafe to fix CI failure
URL: 
https://github.com/apache/incubator-airflow/pull/4158#issuecomment-437001694
 
 
   Given that the failure this PR intends to resolve is transient, I’m not sure 
this is the correct solution. 
   
   We don’t have to merge it right now, but it may be worth revisiting/considering 
if we keep seeing *TestAirflowBaseViews*-related transient failures.


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] exploy commented on a change in pull request #4156: [AIRFLOW-3314] Changed auto inlets feature to work as described

2018-11-08 Thread GitBox
exploy commented on a change in pull request #4156: [AIRFLOW-3314] Changed auto 
inlets feature to work as described
URL: https://github.com/apache/incubator-airflow/pull/4156#discussion_r231892080
 
 

 ##
 File path: airflow/lineage/__init__.py
 ##
 @@ -110,26 +111,32 @@ def wrapper(self, context, *args, **kwargs):
                   for i in inlets]
         self.inlets.extend(inlets)

-        if self._inlets['auto']:
-            # dont append twice
-            task_ids = set(self._inlets['task_ids']).symmetric_difference(
-                self.upstream_task_ids
-            )
-            inlets = self.xcom_pull(context,
-                                    task_ids=task_ids,
-                                    dag_id=self.dag_id,
-                                    key=PIPELINE_OUTLETS)
-            inlets = [item for sublist in inlets if sublist for item in sublist]
-            inlets = [DataSet.map_type(i['typeName'])(data=i['attributes'])
-                      for i in inlets]
-            self.inlets.extend(inlets)
-
-        if len(self._inlets['datasets']) > 0:
-            self.inlets.extend(self._inlets['datasets'])
+        if self._inlets["auto"]:  # Performs a tree traversal, starting with the current task. If outlets
 
 Review comment:
   Looks like it should be a multi-line docstring: 
https://www.python.org/dev/peps/pep-0257/#multi-line-docstrings
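   
   For reference, the PEP 257 multi-line docstring shape being suggested looks roughly like this (the wording is illustrative, not the PR's final text):

def wrapper(self, context, *args, **kwargs):
    """Collect lineage inlets before the task runs.

    With "auto" inlets, perform a tree traversal starting from the current
    task and pull the outlets of upstream tasks from XCom, rather than
    explaining this in a trailing inline comment.
    """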


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] codecov-io commented on issue #4159: [AIRFLOW-XXX] Add GeneCards to companies list

2018-11-08 Thread GitBox
codecov-io commented on issue #4159: [AIRFLOW-XXX] Add GeneCards to companies 
list
URL: 
https://github.com/apache/incubator-airflow/pull/4159#issuecomment-436970917
 
 
   # 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4159?src=pr&el=h1)
 Report
   > Merging 
[#4159](https://codecov.io/gh/apache/incubator-airflow/pull/4159?src=pr&el=desc)
 into 
[master](https://codecov.io/gh/apache/incubator-airflow/commit/848e432865da82c86918bac8b466d17dc12cd84c?src=pr&el=desc)
 will **increase** coverage by `<.01%`.
   > The diff coverage is `n/a`.
   
   [![Impacted file tree 
graph](https://codecov.io/gh/apache/incubator-airflow/pull/4159/graphs/tree.svg?width=650&token=WdLKlKHOAU&height=150&src=pr)](https://codecov.io/gh/apache/incubator-airflow/pull/4159?src=pr&el=tree)
   
   ```diff
    @@            Coverage Diff             @@
    ##           master    #4159      +/-   ##
    ==========================================
    + Coverage   77.52%   77.52%   +<.01%
    ==========================================
      Files         199      199
      Lines       16234    16234
    ==========================================
    + Hits        12585    12586       +1
    + Misses      3649     3648       -1
   ```
   
   
   | [Impacted 
Files](https://codecov.io/gh/apache/incubator-airflow/pull/4159?src=pr&el=tree) 
| Coverage Δ | |
   |---|---|---|
   | 
[airflow/models.py](https://codecov.io/gh/apache/incubator-airflow/pull/4159/diff?src=pr&el=tree#diff-YWlyZmxvdy9tb2RlbHMucHk=)
 | `92.19% <0%> (+0.04%)` | :arrow_up: |
   
   --
   
   [Continue to review full report at 
Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4159?src=pr&el=continue).
   > **Legend** - [Click here to learn 
more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute  (impact)`, `ø = not affected`, `? = missing data`
   > Powered by 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4159?src=pr&el=footer).
 Last update 
[848e432...159dc6f](https://codecov.io/gh/apache/incubator-airflow/pull/4159?src=pr&el=lastupdated).
 Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] codecov-io edited a comment on issue #4158: [AIRFLOW-XXX] Try to pin version of MarkupSafe to fix CI failure

2018-11-08 Thread GitBox
codecov-io edited a comment on issue #4158: [AIRFLOW-XXX] Try to pin version of 
MarkupSafe to fix CI failure
URL: 
https://github.com/apache/incubator-airflow/pull/4158#issuecomment-436958003
 
 
   # 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4158?src=pr&el=h1)
 Report
   > Merging 
[#4158](https://codecov.io/gh/apache/incubator-airflow/pull/4158?src=pr&el=desc)
 into 
[master](https://codecov.io/gh/apache/incubator-airflow/commit/848e432865da82c86918bac8b466d17dc12cd84c?src=pr&el=desc)
 will **increase** coverage by `<.01%`.
   > The diff coverage is `n/a`.
   
   [![Impacted file tree 
graph](https://codecov.io/gh/apache/incubator-airflow/pull/4158/graphs/tree.svg?width=650&token=WdLKlKHOAU&height=150&src=pr)](https://codecov.io/gh/apache/incubator-airflow/pull/4158?src=pr&el=tree)
   
   ```diff
    @@            Coverage Diff             @@
    ##           master    #4158      +/-   ##
    ==========================================
    + Coverage   77.52%   77.52%   +<.01%
    ==========================================
      Files         199      199
      Lines       16234    16234
    ==========================================
    + Hits        12585    12586       +1
    + Misses      3649     3648       -1
   ```
   
   
   | [Impacted 
Files](https://codecov.io/gh/apache/incubator-airflow/pull/4158?src=pr&el=tree) 
| Coverage Δ | |
   |---|---|---|
   | 
[airflow/models.py](https://codecov.io/gh/apache/incubator-airflow/pull/4158/diff?src=pr&el=tree#diff-YWlyZmxvdy9tb2RlbHMucHk=)
 | `92.19% <0%> (+0.04%)` | :arrow_up: |
   
   --
   
   [Continue to review full report at 
Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4158?src=pr&el=continue).
   > **Legend** - [Click here to learn 
more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute  (impact)`, `ø = not affected`, `? = missing data`
   > Powered by 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4158?src=pr&el=footer).
 Last update 
[848e432...21faffd](https://codecov.io/gh/apache/incubator-airflow/pull/4158?src=pr&el=lastupdated).
 Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] oferze opened a new pull request #4159: [AIRFLOW-XXX] Add GeneCards to companies list

2018-11-08 Thread GitBox
oferze opened a new pull request #4159: [AIRFLOW-XXX] Add GeneCards to 
companies list
URL: https://github.com/apache/incubator-airflow/pull/4159
 
 
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] codecov-io edited a comment on issue #4006: [AIRFLOW-3164] Verify server certificate when connecting to LDAP

2018-11-08 Thread GitBox
codecov-io edited a comment on issue #4006: [AIRFLOW-3164] Verify server 
certificate when connecting to LDAP
URL: 
https://github.com/apache/incubator-airflow/pull/4006#issuecomment-427468686
 
 
   # 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4006?src=pr&el=h1)
 Report
   > Merging 
[#4006](https://codecov.io/gh/apache/incubator-airflow/pull/4006?src=pr&el=desc)
 into 
[master](https://codecov.io/gh/apache/incubator-airflow/commit/848e432865da82c86918bac8b466d17dc12cd84c?src=pr&el=desc)
 will **increase** coverage by `<.01%`.
   > The diff coverage is `n/a`.
   
   [![Impacted file tree 
graph](https://codecov.io/gh/apache/incubator-airflow/pull/4006/graphs/tree.svg?width=650&token=WdLKlKHOAU&height=150&src=pr)](https://codecov.io/gh/apache/incubator-airflow/pull/4006?src=pr&el=tree)
   
   ```diff
    @@            Coverage Diff             @@
    ##           master    #4006      +/-   ##
    ==========================================
    + Coverage   77.52%   77.52%   +<.01%
    ==========================================
      Files         199      199
      Lines       16234    16234
    ==========================================
    + Hits        12585    12586       +1
    + Misses      3649     3648       -1
   ```
   
   
   | [Impacted 
Files](https://codecov.io/gh/apache/incubator-airflow/pull/4006?src=pr&el=tree) 
| Coverage Δ | |
   |---|---|---|
   | 
[airflow/models.py](https://codecov.io/gh/apache/incubator-airflow/pull/4006/diff?src=pr&el=tree#diff-YWlyZmxvdy9tb2RlbHMucHk=)
 | `92.19% <0%> (+0.04%)` | :arrow_up: |
   
   --
   
   [Continue to review full report at 
Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4006?src=pr&el=continue).
   > **Legend** - [Click here to learn 
more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute  (impact)`, `ø = not affected`, `? = missing data`
   > Powered by 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4006?src=pr&el=footer).
 Last update 
[848e432...d8d0e8c](https://codecov.io/gh/apache/incubator-airflow/pull/4006?src=pr&el=lastupdated).
 Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] codecov-io commented on issue #4158: [AIRFLOW-XXX] Try to pin version of MarkupSafe to fix CI failure

2018-11-08 Thread GitBox
codecov-io commented on issue #4158: [AIRFLOW-XXX] Try to pin version of 
MarkupSafe to fix CI failure
URL: 
https://github.com/apache/incubator-airflow/pull/4158#issuecomment-436958003
 
 
   # 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4158?src=pr&el=h1)
 Report
   > Merging 
[#4158](https://codecov.io/gh/apache/incubator-airflow/pull/4158?src=pr&el=desc)
 into 
[master](https://codecov.io/gh/apache/incubator-airflow/commit/848e432865da82c86918bac8b466d17dc12cd84c?src=pr&el=desc)
 will **increase** coverage by `<.01%`.
   > The diff coverage is `n/a`.
   
   [![Impacted file tree 
graph](https://codecov.io/gh/apache/incubator-airflow/pull/4158/graphs/tree.svg?width=650&token=WdLKlKHOAU&height=150&src=pr)](https://codecov.io/gh/apache/incubator-airflow/pull/4158?src=pr&el=tree)
   
   ```diff
    @@            Coverage Diff             @@
    ##           master    #4158      +/-   ##
    ==========================================
    + Coverage   77.52%   77.52%   +<.01%
    ==========================================
      Files         199      199
      Lines       16234    16234
    ==========================================
    + Hits        12585    12586       +1
    + Misses      3649     3648       -1
   ```
   
   
   | [Impacted 
Files](https://codecov.io/gh/apache/incubator-airflow/pull/4158?src=pr&el=tree) 
| Coverage Δ | |
   |---|---|---|
   | 
[airflow/models.py](https://codecov.io/gh/apache/incubator-airflow/pull/4158/diff?src=pr&el=tree#diff-YWlyZmxvdy9tb2RlbHMucHk=)
 | `92.19% <0%> (+0.04%)` | :arrow_up: |
   
   --
   
   [Continue to review full report at 
Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4158?src=pr&el=continue).
   > **Legend** - [Click here to learn 
more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute  (impact)`, `ø = not affected`, `? = missing data`
   > Powered by 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4158?src=pr&el=footer).
 Last update 
[848e432...21faffd](https://codecov.io/gh/apache/incubator-airflow/pull/4158?src=pr&el=lastupdated).
 Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] feluelle commented on issue #4120: [AIRFLOW-XXX] Update Contributing Guide - Git Hooks

2018-11-08 Thread GitBox
feluelle commented on issue #4120: [AIRFLOW-XXX] Update Contributing Guide - 
Git Hooks
URL: 
https://github.com/apache/incubator-airflow/pull/4120#issuecomment-436952323
 
 
   @ashb Done. :)


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] codecov-io edited a comment on issue #4120: [AIRFLOW-XXX] Update Contributing Guide - Git Hooks

2018-11-08 Thread GitBox
codecov-io edited a comment on issue #4120: [AIRFLOW-XXX] Update Contributing 
Guide - Git Hooks
URL: 
https://github.com/apache/incubator-airflow/pull/4120#issuecomment-434876896
 
 
   # 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4120?src=pr&el=h1)
 Report
   > Merging 
[#4120](https://codecov.io/gh/apache/incubator-airflow/pull/4120?src=pr&el=desc)
 into 
[master](https://codecov.io/gh/apache/incubator-airflow/commit/848e432865da82c86918bac8b466d17dc12cd84c?src=pr&el=desc)
 will **increase** coverage by `<.01%`.
   > The diff coverage is `n/a`.
   
   [![Impacted file tree 
graph](https://codecov.io/gh/apache/incubator-airflow/pull/4120/graphs/tree.svg?width=650&token=WdLKlKHOAU&height=150&src=pr)](https://codecov.io/gh/apache/incubator-airflow/pull/4120?src=pr&el=tree)
   
   ```diff
    @@            Coverage Diff             @@
    ##           master    #4120      +/-   ##
    ==========================================
    + Coverage   77.52%   77.52%   +<.01%
    ==========================================
      Files         199      199
      Lines       16234    16234
    ==========================================
    + Hits        12585    12586       +1
    + Misses      3649     3648       -1
   ```
   
   
   | [Impacted 
Files](https://codecov.io/gh/apache/incubator-airflow/pull/4120?src=pr&el=tree) 
| Coverage Δ | |
   |---|---|---|
   | 
[airflow/models.py](https://codecov.io/gh/apache/incubator-airflow/pull/4120/diff?src=pr&el=tree#diff-YWlyZmxvdy9tb2RlbHMucHk=)
 | `92.19% <0%> (+0.04%)` | :arrow_up: |
   
   --
   
   [Continue to review full report at 
Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4120?src=pr&el=continue).
   > **Legend** - [Click here to learn 
more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute  (impact)`, `ø = not affected`, `? = missing data`
   > Powered by 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4120?src=pr&el=footer).
 Last update 
[848e432...462fee5](https://codecov.io/gh/apache/incubator-airflow/pull/4120?src=pr&el=lastupdated).
 Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] XD-DENG edited a comment on issue #4158: [AIRFLOW-XXX] Try to pin version of MarkupSafe to fix CI failure

2018-11-08 Thread GitBox
XD-DENG edited a comment on issue #4158: [AIRFLOW-XXX] Try to pin version of 
MarkupSafe to fix CI failure
URL: 
https://github.com/apache/incubator-airflow/pull/4158#issuecomment-436949645
 
 
   Please don't merge until multiple Travis CI runs are triggered & pass to 
confirm this pinning does fix the transient CI failure (in 
`TestAirflowBaseViews`).


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] XD-DENG commented on issue #4158: [AIRFLOW-XXX] Try to pin version of MarkupSafe to fix CI failure

2018-11-08 Thread GitBox
XD-DENG commented on issue #4158: [AIRFLOW-XXX] Try to pin version of 
MarkupSafe to fix CI failure
URL: 
https://github.com/apache/incubator-airflow/pull/4158#issuecomment-436949645
 
 
   Please don't merge until multiple Travis CI runs pass to confirm this does 
fix the transient CI failure (in `TestAirflowBaseViews`).


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] ashb commented on issue #4006: [AIRFLOW-3164] Verify server certificate when connecting to LDAP

2018-11-08 Thread GitBox
ashb commented on issue #4006: [AIRFLOW-3164] Verify server certificate when 
connecting to LDAP
URL: 
https://github.com/apache/incubator-airflow/pull/4006#issuecomment-436948799
 
 
   I've rebased this (on Bolke's fork)


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] XD-DENG opened a new pull request #4158: [AIRFLOW-XXX] Try to pin version of MarkupSafe to fix CI failure

2018-11-08 Thread GitBox
XD-DENG opened a new pull request #4158: [AIRFLOW-XXX] Try to pin version of 
MarkupSafe to fix CI failure
URL: https://github.com/apache/incubator-airflow/pull/4158
 
 
   MarkupSafe is a dependency of jinja2.
   
   The release time of its latest version (1.1.0) matches the point at which we 
started seeing weird errors (mainly in `TestAirflowBaseViews`).
   
   Try pinning it to see if that fixes the CI.
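   
   Illustrative only -- the exact pin chosen in this PR is not quoted in this message. In setup.py, excluding the freshly released MarkupSafe 1.1.0 would look roughly like:

install_requires = [
    'markupsafe<1.1.0',  # hypothetical constraint; MarkupSafe normally arrives via Jinja2
]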


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] codecov-io edited a comment on issue #4086: [AIRFLOW-3245] fix list processing in resolve_template_files

2018-11-08 Thread GitBox
codecov-io edited a comment on issue #4086: [AIRFLOW-3245] fix list processing 
in resolve_template_files
URL: 
https://github.com/apache/incubator-airflow/pull/4086#issuecomment-432321357
 
 
   # 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4086?src=pr&el=h1)
 Report
   > Merging 
[#4086](https://codecov.io/gh/apache/incubator-airflow/pull/4086?src=pr&el=desc)
 into 
[master](https://codecov.io/gh/apache/incubator-airflow/commit/b1d9111fc6d8305e4720111029a9212083e717d7?src=pr&el=desc)
 will **increase** coverage by `0.06%`.
   > The diff coverage is `81.81%`.
   
   [![Impacted file tree 
graph](https://codecov.io/gh/apache/incubator-airflow/pull/4086/graphs/tree.svg?width=650&token=WdLKlKHOAU&height=150&src=pr)](https://codecov.io/gh/apache/incubator-airflow/pull/4086?src=pr&el=tree)
   
   ```diff
    @@            Coverage Diff             @@
    ##           master    #4086      +/-   ##
    ==========================================
    + Coverage   77.49%   77.56%   +0.06%
    ==========================================
      Files         199      199
      Lines       16246    16244       -2
    ==========================================
    + Hits        12590    12599       +9
    + Misses      3656     3645      -11
   ```
   
   
   | [Impacted 
Files](https://codecov.io/gh/apache/incubator-airflow/pull/4086?src=pr&el=tree) 
| Coverage Δ | |
   |---|---|---|
   | 
[airflow/models.py](https://codecov.io/gh/apache/incubator-airflow/pull/4086/diff?src=pr&el=tree#diff-YWlyZmxvdy9tb2RlbHMucHk=)
 | `92.35% <81.81%> (+0.15%)` | :arrow_up: |
   | 
[airflow/security/utils.py](https://codecov.io/gh/apache/incubator-airflow/pull/4086/diff?src=pr&el=tree#diff-YWlyZmxvdy9zZWN1cml0eS91dGlscy5weQ==)
 | `26.92% <0%> (-2.03%)` | :arrow_down: |
   
   --
   
   [Continue to review full report at 
Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4086?src=pr&el=continue).
   > **Legend** - [Click here to learn 
more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute  (impact)`, `ø = not affected`, `? = missing data`
   > Powered by 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4086?src=pr&el=footer).
 Last update 
[b1d9111...f346e0e](https://codecov.io/gh/apache/incubator-airflow/pull/4086?src=pr&el=lastupdated).
 Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] VincentKetelaars opened a new pull request #4157: [AIRFLOW-XXX] Correct scheduling period in scheduler.rst

2018-11-08 Thread GitBox
VincentKetelaars opened a new pull request #4157: [AIRFLOW-XXX] Correct 
scheduling period in scheduler.rst
URL: https://github.com/apache/incubator-airflow/pull/4157
 
 
   The accompanying description of the code extract depicts a DAG that is 
scheduled daily, not hourly.
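   
   For comparison, a minimal sketch of a daily-scheduled DAG (illustrative dag_id and start_date, not taken from scheduler.rst):

from datetime import datetime

from airflow import DAG
from airflow.operators.dummy_operator import DummyOperator

dag = DAG(
    dag_id='example_daily',            # illustrative name
    start_date=datetime(2018, 11, 1),
    schedule_interval='@daily',        # one run per day, not per hour
)

DummyOperator(task_id='noop', dag=dag)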
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services

