[GitHub] codecov-io edited a comment on issue #3992: [AIRFLOW-620] Feature to tail custom number of logs instead of rendering whole log

2018-11-05 Thread GitBox
codecov-io edited a comment on issue #3992: [AIRFLOW-620] Feature to tail 
custom number of logs instead of rendering whole log
URL: 
https://github.com/apache/incubator-airflow/pull/3992#issuecomment-426519197
 
 
   # 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/3992?src=pr=h1)
 Report
   > Merging 
[#3992](https://codecov.io/gh/apache/incubator-airflow/pull/3992?src=pr=desc)
 into 
[master](https://codecov.io/gh/apache/incubator-airflow/commit/3ea61de78512f73e7098fd3035fdfd57bd8c6ab2?src=pr=desc)
 will **decrease** coverage by `1.03%`.
   > The diff coverage is `52.74%`.
   
   [![Impacted file tree 
graph](https://codecov.io/gh/apache/incubator-airflow/pull/3992/graphs/tree.svg?width=650=WdLKlKHOAU=150=pr)](https://codecov.io/gh/apache/incubator-airflow/pull/3992?src=pr=tree)
   
   ```diff
   @@            Coverage Diff             @@
   ##           master    #3992      +/-   ##
   ==========================================
   - Coverage   76.67%   75.63%    -1.04%
   ==========================================
     Files         199      199
     Lines       16189    16024      -165
   ==========================================
   - Hits        12413    12120      -293
   - Misses       3776     3904      +128
   ```
   
   
   | [Impacted 
Files](https://codecov.io/gh/apache/incubator-airflow/pull/3992?src=pr=tree) 
| Coverage Δ | |
   |---|---|---|
   | 
[airflow/utils/log/file\_task\_handler.py](https://codecov.io/gh/apache/incubator-airflow/pull/3992/diff?src=pr=tree#diff-YWlyZmxvdy91dGlscy9sb2cvZmlsZV90YXNrX2hhbmRsZXIucHk=)
 | `76.41% <32.14%> (-13%)` | :arrow_down: |
   | 
[airflow/bin/cli.py](https://codecov.io/gh/apache/incubator-airflow/pull/3992/diff?src=pr=tree#diff-YWlyZmxvdy9iaW4vY2xpLnB5)
 | `63.05% <4.76%> (-1.78%)` | :arrow_down: |
   | 
[airflow/www/views.py](https://codecov.io/gh/apache/incubator-airflow/pull/3992/diff?src=pr=tree#diff-YWlyZmxvdy93d3cvdmlld3MucHk=)
 | `69.03% <90.47%> (+0.29%)` | :arrow_up: |
   | 
[airflow/www\_rbac/views.py](https://codecov.io/gh/apache/incubator-airflow/pull/3992/diff?src=pr=tree#diff-YWlyZmxvdy93d3dfcmJhYy92aWV3cy5weQ==)
 | `72.21% <90.47%> (-0.18%)` | :arrow_down: |
   | 
[airflow/operators/docker\_operator.py](https://codecov.io/gh/apache/incubator-airflow/pull/3992/diff?src=pr=tree#diff-YWlyZmxvdy9vcGVyYXRvcnMvZG9ja2VyX29wZXJhdG9yLnB5)
 | `0% <0%> (-97.65%)` | :arrow_down: |
   | 
[airflow/operators/slack\_operator.py](https://codecov.io/gh/apache/incubator-airflow/pull/3992/diff?src=pr=tree#diff-YWlyZmxvdy9vcGVyYXRvcnMvc2xhY2tfb3BlcmF0b3IucHk=)
 | `0% <0%> (-97.37%)` | :arrow_down: |
   | 
[airflow/operators/s3\_to\_hive\_operator.py](https://codecov.io/gh/apache/incubator-airflow/pull/3992/diff?src=pr=tree#diff-YWlyZmxvdy9vcGVyYXRvcnMvczNfdG9faGl2ZV9vcGVyYXRvci5weQ==)
 | `0% <0%> (-94.02%)` | :arrow_down: |
   | 
[airflow/security/kerberos.py](https://codecov.io/gh/apache/incubator-airflow/pull/3992/diff?src=pr=tree#diff-YWlyZmxvdy9zZWN1cml0eS9rZXJiZXJvcy5weQ==)
 | `0% <0%> (-71.43%)` | :arrow_down: |
   | 
[airflow/operators/latest\_only\_operator.py](https://codecov.io/gh/apache/incubator-airflow/pull/3992/diff?src=pr=tree#diff-YWlyZmxvdy9vcGVyYXRvcnMvbGF0ZXN0X29ubHlfb3BlcmF0b3IucHk=)
 | `25% <0%> (-65%)` | :arrow_down: |
   | 
[airflow/operators/subdag\_operator.py](https://codecov.io/gh/apache/incubator-airflow/pull/3992/diff?src=pr=tree#diff-YWlyZmxvdy9vcGVyYXRvcnMvc3ViZGFnX29wZXJhdG9yLnB5)
 | `70.96% <0%> (-19.36%)` | :arrow_down: |
   | ... and [49 
more](https://codecov.io/gh/apache/incubator-airflow/pull/3992/diff?src=pr=tree-more)
 | |
   
   --
   
   [Continue to review full report at 
Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/3992?src=pr=continue).
   > **Legend** - [Click here to learn 
more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data`
   > Powered by 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/3992?src=pr=footer).
 Last update 
[3ea61de...8694d71](https://codecov.io/gh/apache/incubator-airflow/pull/3992?src=pr=lastupdated).
 Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Created] (AIRFLOW-3302) Small CSS fixes

2018-11-05 Thread Sumit Maheshwari (JIRA)
Sumit Maheshwari created AIRFLOW-3302:
-

 Summary: Small CSS fixes
 Key: AIRFLOW-3302
 URL: https://issues.apache.org/jira/browse/AIRFLOW-3302
 Project: Apache Airflow
  Issue Type: Improvement
Reporter: Sumit Maheshwari
Assignee: Sumit Maheshwari






--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] bolkedebruin commented on issue #4136: Fix for scheduler infinite loop when evaluating non-UTC DAGs after DST

2018-11-05 Thread GitBox
bolkedebruin commented on issue #4136: Fix for scheduler infinite loop when 
evaluating non-UTC DAGs after DST
URL: 
https://github.com/apache/incubator-airflow/pull/4136#issuecomment-436151122
 
 
   Please verify whether this issue still exists on master. A fix that should 
have addressed it has already gone in.


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] msumit commented on issue #4129: [AIRFLOW-3294] Update connections form and integration docs

2018-11-05 Thread GitBox
msumit commented on issue #4129: [AIRFLOW-3294] Update connections form and 
integration docs
URL: 
https://github.com/apache/incubator-airflow/pull/4129#issuecomment-436147329
 
 
   @ashb yeah sure, will keep that in mind.  


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Resolved] (AIRFLOW-3062) Add Qubole in integration docs

2018-11-05 Thread Sumit Maheshwari (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-3062?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Sumit Maheshwari resolved AIRFLOW-3062.
---
   Resolution: Fixed
Fix Version/s: 2.0.0

> Add Qubole in integration docs
> --
>
> Key: AIRFLOW-3062
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3062
> Project: Apache Airflow
>  Issue Type: Improvement
>Reporter: Sumit Maheshwari
>Assignee: Sumit Maheshwari
>Priority: Major
> Fix For: 2.0.0
>
>




--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] XD-DENG edited a comment on issue #4138: [AIRFLOW-3301] Update DockerOperator unit test for PR #3977 to fix CI failure

2018-11-05 Thread GitBox
XD-DENG edited a comment on issue #4138: [AIRFLOW-3301] Update DockerOperator 
unit test for PR #3977 to fix CI failure
URL: 
https://github.com/apache/incubator-airflow/pull/4138#issuecomment-436117694
 
 
   The CI is still failing due to an unrelated test 
(`tests.www_rbac.test_views.TestAirflowBaseViews.test_failed`).
   
   But the DockerOperator-related test is working fine now.


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] XD-DENG commented on issue #4138: [AIRFLOW-3301] Update unit test for PR #3977 to fix CI failure

2018-11-05 Thread GitBox
XD-DENG commented on issue #4138: [AIRFLOW-3301] Update unit test for PR #3977 
to fix CI failure
URL: 
https://github.com/apache/incubator-airflow/pull/4138#issuecomment-436117694
 
 
   The CI is still failing, but due to an unrelated test 
(`tests.www_rbac.test_views.TestAirflowBaseViews.test_failed`).


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] Cplo commented on issue #3519: [AIRFLOW-2642] fix wrong value git-sync initcontainer env GIT_SYNC_ROOT

2018-11-05 Thread GitBox
Cplo commented on issue #3519: [AIRFLOW-2642] fix wrong value git-sync 
initcontainer env GIT_SYNC_ROOT
URL: 
https://github.com/apache/incubator-airflow/pull/3519#issuecomment-436117521
 
 
   I have run it in our Kubernetes environment and it is very stable. My 
configuration is as follows (XXX values redacted):
   
   [kubernetes]
   airflow_configmap = airflow-configmap
   git_repo = https://XXX/Airflow-DAGs.git
   git_branch = prod
   git_subpath =
   git_user = XXX
   git_password = XXX
   # dags_volume_claim = airflow-dags
   # dags_volume_subpath =
   logs_volume_claim = airflow
   logs_volume_subpath =
   in_cluster = True
   namespace = airflow
   gcp_service_account_keys =
   
   # For cloning DAGs from git repositories into volumes: 
https://github.com/kubernetes/git-sync
   git_sync_container_repository = XXX/library/git-sync-amd64
   git_sync_container_tag = v2.0.5
   git_sync_init_container_name = git-sync-clone
   delete_worker_pods = true
   image_pull_secrets = XXX
   worker_container_image_pull_policy = Always
   worker_container_repository = XXX/library/airflow
   worker_container_tag = master-ce8b0fe5


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] Cplo removed a comment on issue #3519: [AIRFLOW-2642] fix wrong value git-sync initcontainer env GIT_SYNC_ROOT

2018-11-05 Thread GitBox
Cplo removed a comment on issue #3519: [AIRFLOW-2642] fix wrong value git-sync 
initcontainer env GIT_SYNC_ROOT
URL: 
https://github.com/apache/incubator-airflow/pull/3519#issuecomment-436117076
 
 
   I have run it in our Kubernetes environment and it is very stable. My 
configuration is as follows:
   ```
   [kubernetes]
   airflow_configmap = airflow-configmap
   git_repo = https://code.devops.xiaohongshu.com/devops/Airflow-DAGs.git
   git_branch = prod
   git_subpath =
   git_user = pengchen
   git_password = 283049lo
   # dags_volume_claim = airflow-dags
   # dags_volume_subpath =
   logs_volume_claim = airflow
   logs_volume_subpath =
   in_cluster = True
   namespace = airflow
   gcp_service_account_keys =
   
   # For cloning DAGs from git repositories into volumes: 
https://github.com/kubernetes/git-sync
   git_sync_container_repository = 
docker-reg.devops.xiaohongshu.com/library/git-sync-amd64
   git_sync_container_tag = v2.0.5
   git_sync_init_container_name = git-sync-clone
   delete_worker_pods = true
   image_pull_secrets = xhs-docker-registry
   worker_container_image_pull_policy = Always
   worker_container_repository = 
docker-reg.devops.xiaohongshu.com/library/airflow
   worker_container_tag = master-ce8b0fe5
   ```


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] Cplo commented on issue #3519: [AIRFLOW-2642] fix wrong value git-sync initcontainer env GIT_SYNC_ROOT

2018-11-05 Thread GitBox
Cplo commented on issue #3519: [AIRFLOW-2642] fix wrong value git-sync 
initcontainer env GIT_SYNC_ROOT
URL: 
https://github.com/apache/incubator-airflow/pull/3519#issuecomment-436117076
 
 
   I have run it in our Kubernetes environment and it is very stable. My 
configuration is as follows:
   ```
   [kubernetes]
   airflow_configmap = airflow-configmap
   git_repo = https://code.devops.xiaohongshu.com/devops/Airflow-DAGs.git
   git_branch = prod
   git_subpath =
   git_user = pengchen
   git_password = 283049lo
   # dags_volume_claim = airflow-dags
   # dags_volume_subpath =
   logs_volume_claim = airflow
   logs_volume_subpath =
   in_cluster = True
   namespace = airflow
   gcp_service_account_keys =
   
   # For cloning DAGs from git repositories into volumes: 
https://github.com/kubernetes/git-sync
   git_sync_container_repository = 
docker-reg.devops.xiaohongshu.com/library/git-sync-amd64
   git_sync_container_tag = v2.0.5
   git_sync_init_container_name = git-sync-clone
   delete_worker_pods = true
   image_pull_secrets = xhs-docker-registry
   worker_container_image_pull_policy = Always
   worker_container_repository = 
docker-reg.devops.xiaohongshu.com/library/airflow
   worker_container_tag = master-ce8b0fe5
   ```


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] XD-DENG commented on issue #4138: [AIRFLOW-3301] Update CI test for PR #3977

2018-11-05 Thread GitBox
XD-DENG commented on issue #4138: [AIRFLOW-3301] Update CI test for PR #3977
URL: 
https://github.com/apache/incubator-airflow/pull/4138#issuecomment-436113109
 
 
   No worries. We all made/make/making mistakes ;-)


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] deagon commented on issue #4138: [AIRFLOW-3301] Update CI test for PR #3977

2018-11-05 Thread GitBox
deagon commented on issue #4138: [AIRFLOW-3301] Update CI test for PR #3977
URL: 
https://github.com/apache/incubator-airflow/pull/4138#issuecomment-436112371
 
 
   Sorry for my mistake. LGTM.


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] XD-DENG edited a comment on issue #4138: [AIRFLOW-3301] Update CI test for PR #3977

2018-11-05 Thread GitBox
XD-DENG edited a comment on issue #4138: [AIRFLOW-3301] Update CI test for PR 
#3977
URL: 
https://github.com/apache/incubator-airflow/pull/4138#issuecomment-436111972
 
 
   Hi @deagon, in your PR 
https://github.com/apache/incubator-airflow/pull/3977 a new argument 
`auto_remove` was added but the relevant test was not updated. This is causing 
a CI failure 
(`tests.operators.test_docker_operator.DockerOperatorTestCase.test_execute`). 
   
   Details:
   ```
  AssertionError: Expected call: 
create_host_config(binds=['/host/path:/container/path', 
'/mkdtemp:/tmp/airflow'], cpu_shares=1024, dns=None, dns_search=None, 
mem_limit=None, network_mode='bridge', shm_size=1000)
  Actual call: create_host_config(auto_remove=False, 
binds=['/host/path:/container/path', '/mkdtemp:/tmp/airflow'], cpu_shares=1024, 
dns=None, dns_search=None, mem_limit=None, network_mode='bridge', shm_size=1000)
   ```
   
   I have updated the relevant unit test here. FYI @Fokko @ashb 
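   
   A minimal sketch of the resulting test update, assuming the mock layout 
implied by the assertion above (`client_mock` standing in for the patched 
Docker API client; the name is hypothetical): the expected call simply gains 
the new `auto_remove=False` default.
   
   ```python
   client_mock.create_host_config.assert_called_with(
       auto_remove=False,
       binds=['/host/path:/container/path', '/mkdtemp:/tmp/airflow'],
       cpu_shares=1024,
       dns=None,
       dns_search=None,
       mem_limit=None,
       network_mode='bridge',
       shm_size=1000)
   ```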


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] XD-DENG commented on issue #4138: [AIRFLOW-3301] Update CI test for PR #3977

2018-11-05 Thread GitBox
XD-DENG commented on issue #4138: [AIRFLOW-3301] Update CI test for PR #3977
URL: 
https://github.com/apache/incubator-airflow/pull/4138#issuecomment-436111972
 
 
   Hi @deagon, in your PR 
https://github.com/apache/incubator-airflow/pull/3977 a new argument 
`auto_remove` was added but the relevant test was not updated. This is causing 
a CI failure 
(`tests.operators.test_docker_operator.DockerOperatorTestCase.test_execute`). 
   
   Details:
   ```
  AssertionError: Expected call: 
create_host_config(binds=['/host/path:/container/path', 
'/mkdtemp:/tmp/airflow'], cpu_shares=1024, dns=None, dns_search=None, 
mem_limit=None, network_mode='bridge', shm_size=1000)
  Actual call: create_host_config(auto_remove=False, 
binds=['/host/path:/container/path', '/mkdtemp:/tmp/airflow'], cpu_shares=1024, 
dns=None, dns_search=None, mem_limit=None, network_mode='bridge', shm_size=1000)
   ```
   
   I have updated the relevant unit test here. FYI @Fokko 


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] XD-DENG opened a new pull request #4138: [AIRFLOW-3301] Update CI test for PR #3977

2018-11-05 Thread GitBox
XD-DENG opened a new pull request #4138: [AIRFLOW-3301] Update CI test for PR 
#3977
URL: https://github.com/apache/incubator-airflow/pull/4138
 
 
   ### Jira
   
 - https://issues.apache.org/jira/browse/AIRFLOW-3301
   
   ### Description
   
   In PR https://github.com/apache/incubator-airflow/pull/3977, a new argument 
`auto_remove` was added but the test was not updated accordingly, which 
results in a CI failure.


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Commented] (AIRFLOW-3301) Update CI test for [AIRFLOW-3132] (PR #3977)

2018-11-05 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-3301?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16676055#comment-16676055
 ] 

ASF GitHub Bot commented on AIRFLOW-3301:
-

XD-DENG opened a new pull request #4138: [AIRFLOW-3301] Update CI test for PR 
#3977
URL: https://github.com/apache/incubator-airflow/pull/4138
 
 
   ### Jira
   
 - https://issues.apache.org/jira/browse/AIRFLOW-3301
   
   ### Description
   
   In PR https://github.com/apache/incubator-airflow/pull/3977, a new argument 
`auto_remove` was added but the test was not updated accordingly, which 
results in a CI failure.


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


> Update CI test for [AIRFLOW-3132] (PR #3977)
> 
>
> Key: AIRFLOW-3301
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3301
> Project: Apache Airflow
>  Issue Type: Test
>  Components: tests
>Reporter: Xiaodong DENG
>Assignee: Xiaodong DENG
>Priority: Critical
>
> In PR https://github.com/apache/incubator-airflow/pull/3977, the test was not 
> updated accordingly, which results in a CI failure.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Created] (AIRFLOW-3301) Update CI test for [AIRFLOW-3132] (PR #3977)

2018-11-05 Thread Xiaodong DENG (JIRA)
Xiaodong DENG created AIRFLOW-3301:
--

 Summary: Update CI test for [AIRFLOW-3132] (PR #3977)
 Key: AIRFLOW-3301
 URL: https://issues.apache.org/jira/browse/AIRFLOW-3301
 Project: Apache Airflow
  Issue Type: Test
  Components: tests
Reporter: Xiaodong DENG
Assignee: Xiaodong DENG


In PR https://github.com/apache/incubator-airflow/pull/3977, the test was not 
updated accordingly, which results in a CI failure.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] r39132 commented on issue #4137: [AIRFLOW-XXX] Fix Docstrings in Hooks, Sensors & Operators

2018-11-05 Thread GitBox
r39132 commented on issue #4137: [AIRFLOW-XXX] Fix Docstrings in Hooks, Sensors 
& Operators
URL: 
https://github.com/apache/incubator-airflow/pull/4137#issuecomment-436109582
 
 
   @kaxil Do you have a count before and after your change that shows a 
reduction in flake8 errors?


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] fordguo closed pull request #2227: [AIRFLOW-1083] Fixes the connect error when jaydebeapi >1.0 and the jdbc's autocommit bug

2018-11-05 Thread GitBox
fordguo closed pull request #2227: [AIRFLOW-1083] Fixes the connect error when 
jaydebeapi >1.0 and the jdbc's autocommit bug
URL: https://github.com/apache/incubator-airflow/pull/2227
 
 
   

This is a PR merged from a forked repository.
As GitHub hides the original diff on merge, it is displayed below for
the sake of provenance:

As this is a foreign pull request (from a fork), the diff is supplied
below (as it won't show otherwise due to GitHub magic):

diff --git a/airflow/hooks/jdbc_hook.py b/airflow/hooks/jdbc_hook.py
index bc1f352ecc..4d51dd306c 100644
--- a/airflow/hooks/jdbc_hook.py
+++ b/airflow/hooks/jdbc_hook.py
@@ -44,6 +44,9 @@ class JdbcHook(DbApiHook):
     default_conn_name = 'jdbc_default'
     supports_autocommit = True
 
+    def is_jaydebeapi_v1(self):
+        return jaydebeapi.__version_info__[0] >= 1
+
     def get_conn(self):
         conn = self.get_connection(getattr(self, self.conn_name_attr))
         host = conn.host
@@ -52,9 +55,15 @@ def get_conn(self):
         jdbc_driver_loc = conn.extra_dejson.get('extra__jdbc__drv_path')
         jdbc_driver_name = conn.extra_dejson.get('extra__jdbc__drv_clsname')
 
-        conn = jaydebeapi.connect(jdbc_driver_name,
-                                  [str(host), str(login), str(psw)],
-                                  jdbc_driver_loc,)
+        if self.is_jaydebeapi_v1():
+            conn = jaydebeapi.connect(jdbc_driver_name,
+                                      str(host),
+                                      [str(login), str(psw)],
+                                      jdbc_driver_loc,)
+        else:
+            conn = jaydebeapi.connect(jdbc_driver_name,
+                                      [str(host), str(login), str(psw)],
+                                      jdbc_driver_loc,)
         return conn
 
     def set_autocommit(self, conn, autocommit):
@@ -64,4 +73,7 @@ def set_autocommit(self, conn, autocommit):
         :param conn: The connection
         :return:
         """
-        conn.jconn.autocommit = autocommit
+        if self.is_jaydebeapi_v1():
+            conn.jconn.autoCommit = autocommit
+        else:
+            conn.jconn.autocommit = autocommit
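
A brief illustration grounded in the diff above: the two `jaydebeapi.connect` 
signatures the patch switches between. The driver class, URL, credentials, and 
jar path are hypothetical placeholders, not values from the PR.

```python
import jaydebeapi

if jaydebeapi.__version_info__[0] >= 1:
    # jaydebeapi >= 1.0: the JDBC URL is its own positional argument
    conn = jaydebeapi.connect('org.h2.Driver',        # driver class (placeholder)
                              'jdbc:h2:mem:demo',     # JDBC URL (placeholder)
                              ['sa', ''],             # [login, password]
                              '/path/to/h2.jar')      # driver jar (placeholder)
else:
    # jaydebeapi < 1.0: the URL travels inside the driver_args list
    conn = jaydebeapi.connect('org.h2.Driver',
                              ['jdbc:h2:mem:demo', 'sa', ''],
                              '/path/to/h2.jar')
```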


 


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Commented] (AIRFLOW-1083) after jaydebeapi >=1.0, use the connect(jclassname, url, driver_args...)

2018-11-05 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-1083?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16676004#comment-16676004
 ] 

ASF GitHub Bot commented on AIRFLOW-1083:
-

fordguo closed pull request #2227: [AIRFLOW-1083] Fixes the connect error when 
jaydebeapi >1.0 and the jdbc's autocommit bug
URL: https://github.com/apache/incubator-airflow/pull/2227
 
 
   

This is a PR merged from a forked repository.
As GitHub hides the original diff on merge, it is displayed below for
the sake of provenance:

As this is a foreign pull request (from a fork), the diff is supplied
below (as it won't show otherwise due to GitHub magic):

diff --git a/airflow/hooks/jdbc_hook.py b/airflow/hooks/jdbc_hook.py
index bc1f352ecc..4d51dd306c 100644
--- a/airflow/hooks/jdbc_hook.py
+++ b/airflow/hooks/jdbc_hook.py
@@ -44,6 +44,9 @@ class JdbcHook(DbApiHook):
     default_conn_name = 'jdbc_default'
     supports_autocommit = True
 
+    def is_jaydebeapi_v1(self):
+        return jaydebeapi.__version_info__[0] >= 1
+
     def get_conn(self):
         conn = self.get_connection(getattr(self, self.conn_name_attr))
         host = conn.host
@@ -52,9 +55,15 @@ def get_conn(self):
         jdbc_driver_loc = conn.extra_dejson.get('extra__jdbc__drv_path')
         jdbc_driver_name = conn.extra_dejson.get('extra__jdbc__drv_clsname')
 
-        conn = jaydebeapi.connect(jdbc_driver_name,
-                                  [str(host), str(login), str(psw)],
-                                  jdbc_driver_loc,)
+        if self.is_jaydebeapi_v1():
+            conn = jaydebeapi.connect(jdbc_driver_name,
+                                      str(host),
+                                      [str(login), str(psw)],
+                                      jdbc_driver_loc,)
+        else:
+            conn = jaydebeapi.connect(jdbc_driver_name,
+                                      [str(host), str(login), str(psw)],
+                                      jdbc_driver_loc,)
         return conn
 
     def set_autocommit(self, conn, autocommit):
@@ -64,4 +73,7 @@ def set_autocommit(self, conn, autocommit):
         :param conn: The connection
         :return:
         """
-        conn.jconn.autocommit = autocommit
+        if self.is_jaydebeapi_v1():
+            conn.jconn.autoCommit = autocommit
+        else:
+            conn.jconn.autocommit = autocommit


 


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


> after jaydebeapi >=1.0, use the connect(jclassname, url, driver_args...)
> 
>
> Key: AIRFLOW-1083
> URL: https://issues.apache.org/jira/browse/AIRFLOW-1083
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: hooks
>Affects Versions: 1.8.0
>Reporter: Ford Guo
>Priority: Major
>
> I use the latest jaydebeapi (1.1.1) and don't get a correct connection with 
> jdbc_hook.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] fordguo commented on issue #2227: [AIRFLOW-1083] Fixes the connect error when jaydebeapi >1.0 and the jdbc's autocommit bug

2018-11-05 Thread GitBox
fordguo commented on issue #2227: [AIRFLOW-1083] Fixes the connect error when 
jaydebeapi >1.0 and the jdbc's autocommit bug
URL: 
https://github.com/apache/incubator-airflow/pull/2227#issuecomment-436107108
 
 
   > @fordguo are you still working on this?
   
   @ron819 it worked on my legacy project with an old Superset version that I modified.


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Commented] (AIRFLOW-3272) Create gRPC hook for creating generic grpc connection

2018-11-05 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-3272?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16675947#comment-16675947
 ] 

ASF GitHub Bot commented on AIRFLOW-3272:
-

morgendave opened a new pull request #4101: [AIRFLOW-3272] Add base grpc hook
URL: https://github.com/apache/incubator-airflow/pull/4101
 
 
   Make sure you have checked all steps below.
   
   ### Jira
   
   - My PR addresses the following Airflow Jira issues and references them in 
     the PR title. For example, "[AIRFLOW-3272] My Airflow PR"
     - https://issues.apache.org/jira/browse/AIRFLOW-3272
   
   ### Description
   
   - Add support for gRPC connections in Airflow.
   
     In Airflow there are use cases of calling gRPC services, so instead of 
     creating the channel in a PythonOperator each time, there should be a 
     basic GrpcHook to take care of it. The hook needs to take care of the 
     authentication.
   
   ### Tests
   
   - My PR adds the following unit tests OR does not need testing for this 
     extremely good reason:
   
   ### Commits
   
   - My commits all reference Jira issues in their subject lines, and I have 
     squashed multiple commits if they address the same issue. In addition, my 
     commits follow the guidelines from "How to write a good git commit message":
     1. Subject is separated from body by a blank line
     1. Subject is limited to 50 characters (not including Jira issue reference)
     1. Subject does not end with a period
     1. Subject uses the imperative mood ("add", not "adding")
     1. Body wraps at 72 characters
     1. Body explains "what" and "why", not "how"
   
   ### Documentation
   
   - In case of new functionality, my PR adds documentation that describes how 
     to use it.
     - When adding new operators/hooks/sensors, the autoclass documentation 
       generation needs to be added.
   
   ### Code Quality
   
   - Passes `flake8`


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


> Create gRPC hook for creating generic grpc connection
> -
>
> Key: AIRFLOW-3272
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3272
> Project: Apache Airflow
>  Issue Type: Improvement
>Reporter: Zhiwei Zhao
>Assignee: Zhiwei Zhao
>Priority: Minor
>
> Add support for gRPC connections in Airflow. 
> In Airflow there are use cases of calling gRPC services, so instead of 
> creating the channel in a PythonOperator each time, there should be a basic 
> GrpcHook to take care of it. The hook needs to take care of the 
> authentication.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (AIRFLOW-3272) Create gRPC hook for creating generic grpc connection

2018-11-05 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-3272?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16675946#comment-16675946
 ] 

ASF GitHub Bot commented on AIRFLOW-3272:
-

morgendave closed pull request #4101: [AIRFLOW-3272] Add base grpc hook
URL: https://github.com/apache/incubator-airflow/pull/4101
 
 
   

This is a PR merged from a forked repository.
As GitHub hides the original diff on merge, it is displayed below for
the sake of provenance:

As this is a foreign pull request (from a fork), the diff is supplied
below (as it won't show otherwise due to GitHub magic):

diff --git a/airflow/contrib/hooks/grpc_hook.py 
b/airflow/contrib/hooks/grpc_hook.py
new file mode 100644
index 00..b260847f19
--- /dev/null
+++ b/airflow/contrib/hooks/grpc_hook.py
@@ -0,0 +1,118 @@
+# -*- coding: utf-8 -*-
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+
+import grpc
+from google import auth as google_auth
+from google.auth import jwt as google_auth_jwt
+from google.auth.transport import grpc as google_auth_transport_grpc
+from google.auth.transport import requests as google_auth_transport_requests
+
+from airflow.hooks.base_hook import BaseHook
+from airflow.exceptions import AirflowConfigException
+
+
+class GrpcHook(BaseHook):
+    """
+    General interaction with gRPC servers.
+
+    :param grpc_conn_id: The connection ID to use when fetching connection info.
+    :type grpc_conn_id: str
+    :param interceptors: a list of gRPC interceptor objects which would be applied
+        to the connected gRPC channel. None by default.
+    :type interceptors: a list of gRPC interceptors based on or extending the four
+        official gRPC interceptors, e.g. UnaryUnaryClientInterceptor,
+        UnaryStreamClientInterceptor, StreamUnaryClientInterceptor,
+        StreamStreamClientInterceptor.
+    :param custom_connection_func: The customized connection function to return
+        a gRPC channel.
+    :type custom_connection_func: a Python callable that accepts the connection as
+        its only arg. Could be partial or lambda.
+    """
+
+    def __init__(self, grpc_conn_id, interceptors=None, custom_connection_func=None):
+        self.grpc_conn_id = grpc_conn_id
+        self.conn = self.get_connection(self.grpc_conn_id)
+        self.extras = self.conn.extra_dejson
+        self.interceptors = interceptors if interceptors else []
+        self.custom_connection_func = custom_connection_func
+
+    def get_conn(self):
+        if "://" in self.conn.host:
+            base_url = self.conn.host
+        else:
+            # schema defaults to HTTP
+            schema = self.conn.schema if self.conn.schema else "http"
+            base_url = schema + "://" + self.conn.host
+
+        if self.conn.port:
+            base_url = base_url + ":" + str(self.conn.port) + "/"
+
+        auth_type = self._get_field("auth_type")
+
+        if auth_type == "NO_AUTH":
+            channel = grpc.insecure_channel(base_url)
+        elif auth_type == "SSL" or auth_type == "TLS":
+            credential_file_name = self._get_field("credential_pem_file")
+            creds = grpc.ssl_channel_credentials(open(credential_file_name).read())
+            channel = grpc.secure_channel(base_url, creds)
+        elif auth_type == "JWT_GOOGLE":
+            credentials, _ = google_auth.default()
+            jwt_creds = google_auth_jwt.OnDemandCredentials.from_signing_credentials(
+                credentials)
+            channel = google_auth_transport_grpc.secure_authorized_channel(
+                jwt_creds, None, base_url)
+        elif auth_type == "OATH_GOOGLE":
+            scopes = self._get_field("scopes").split(",")
+            credentials, _ = google_auth.default(scopes=scopes)
+            request = google_auth_transport_requests.Request()
+            channel = google_auth_transport_grpc.secure_authorized_channel(
+                credentials, request, base_url)
+        elif auth_type == "CUSTOM":
+            if not self.custom_connection_func:
+                raise AirflowConfigException(
+                    "Customized connection function not set, not able to establish a channel")
+            channel = self.custom_connection_func(self.conn)
+
+        if self.interceptors:
+            for interceptor in self.interceptors:
+                channel = grpc.intercept_channel(channel,
+                                                 interceptor)
+

[GitHub] morgendave closed pull request #4101: [AIRFLOW-3272] Add base grpc hook

2018-11-05 Thread GitBox
morgendave closed pull request #4101: [AIRFLOW-3272] Add base grpc hook
URL: https://github.com/apache/incubator-airflow/pull/4101
 
 
   

This is a PR merged from a forked repository.
As GitHub hides the original diff on merge, it is displayed below for
the sake of provenance:

As this is a foreign pull request (from a fork), the diff is supplied
below (as it won't show otherwise due to GitHub magic):

diff --git a/airflow/contrib/hooks/grpc_hook.py 
b/airflow/contrib/hooks/grpc_hook.py
new file mode 100644
index 00..b260847f19
--- /dev/null
+++ b/airflow/contrib/hooks/grpc_hook.py
@@ -0,0 +1,118 @@
+# -*- coding: utf-8 -*-
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+
+import grpc
+from google import auth as google_auth
+from google.auth import jwt as google_auth_jwt
+from google.auth.transport import grpc as google_auth_transport_grpc
+from google.auth.transport import requests as google_auth_transport_requests
+
+from airflow.hooks.base_hook import BaseHook
+from airflow.exceptions import AirflowConfigException
+
+
+class GrpcHook(BaseHook):
+    """
+    General interaction with gRPC servers.
+
+    :param grpc_conn_id: The connection ID to use when fetching connection info.
+    :type grpc_conn_id: str
+    :param interceptors: a list of gRPC interceptor objects which would be applied
+        to the connected gRPC channel. None by default.
+    :type interceptors: a list of gRPC interceptors based on or extending the four
+        official gRPC interceptors, e.g. UnaryUnaryClientInterceptor,
+        UnaryStreamClientInterceptor, StreamUnaryClientInterceptor,
+        StreamStreamClientInterceptor.
+    :param custom_connection_func: The customized connection function to return
+        a gRPC channel.
+    :type custom_connection_func: a Python callable that accepts the connection as
+        its only arg. Could be partial or lambda.
+    """
+
+    def __init__(self, grpc_conn_id, interceptors=None, custom_connection_func=None):
+        self.grpc_conn_id = grpc_conn_id
+        self.conn = self.get_connection(self.grpc_conn_id)
+        self.extras = self.conn.extra_dejson
+        self.interceptors = interceptors if interceptors else []
+        self.custom_connection_func = custom_connection_func
+
+    def get_conn(self):
+        if "://" in self.conn.host:
+            base_url = self.conn.host
+        else:
+            # schema defaults to HTTP
+            schema = self.conn.schema if self.conn.schema else "http"
+            base_url = schema + "://" + self.conn.host
+
+        if self.conn.port:
+            base_url = base_url + ":" + str(self.conn.port) + "/"
+
+        auth_type = self._get_field("auth_type")
+
+        if auth_type == "NO_AUTH":
+            channel = grpc.insecure_channel(base_url)
+        elif auth_type == "SSL" or auth_type == "TLS":
+            credential_file_name = self._get_field("credential_pem_file")
+            creds = grpc.ssl_channel_credentials(open(credential_file_name).read())
+            channel = grpc.secure_channel(base_url, creds)
+        elif auth_type == "JWT_GOOGLE":
+            credentials, _ = google_auth.default()
+            jwt_creds = google_auth_jwt.OnDemandCredentials.from_signing_credentials(
+                credentials)
+            channel = google_auth_transport_grpc.secure_authorized_channel(
+                jwt_creds, None, base_url)
+        elif auth_type == "OATH_GOOGLE":
+            scopes = self._get_field("scopes").split(",")
+            credentials, _ = google_auth.default(scopes=scopes)
+            request = google_auth_transport_requests.Request()
+            channel = google_auth_transport_grpc.secure_authorized_channel(
+                credentials, request, base_url)
+        elif auth_type == "CUSTOM":
+            if not self.custom_connection_func:
+                raise AirflowConfigException(
+                    "Customized connection function not set, not able to establish a channel")
+            channel = self.custom_connection_func(self.conn)
+
+        if self.interceptors:
+            for interceptor in self.interceptors:
+                channel = grpc.intercept_channel(channel,
+                                                 interceptor)
+
+        return channel
+
+    def run(self, stub_class, call_func, streaming=False, data={}):
+        with self.get_conn() as channel:
+            stub = stub_class(channel)
+            try:
+                response = stub.call_func(**data)

[GitHub] morgendave opened a new pull request #4101: [AIRFLOW-3272] Add base grpc hook

2018-11-05 Thread GitBox
morgendave opened a new pull request #4101: [AIRFLOW-3272] Add base grpc hook
URL: https://github.com/apache/incubator-airflow/pull/4101
 
 
   Make sure you have checked all steps below.
   
   ### Jira
   
   - My PR addresses the following Airflow Jira issues and references them in 
     the PR title. For example, "[AIRFLOW-3272] My Airflow PR"
     - https://issues.apache.org/jira/browse/AIRFLOW-3272
   
   ### Description
   
   - Add support for gRPC connections in Airflow.
   
     In Airflow there are use cases of calling gRPC services, so instead of 
     creating the channel in a PythonOperator each time, there should be a 
     basic GrpcHook to take care of it. The hook needs to take care of the 
     authentication.
   
   ### Tests
   
   - My PR adds the following unit tests OR does not need testing for this 
     extremely good reason:
   
   ### Commits
   
   - My commits all reference Jira issues in their subject lines, and I have 
     squashed multiple commits if they address the same issue. In addition, my 
     commits follow the guidelines from "How to write a good git commit message":
     1. Subject is separated from body by a blank line
     1. Subject is limited to 50 characters (not including Jira issue reference)
     1. Subject does not end with a period
     1. Subject uses the imperative mood ("add", not "adding")
     1. Body wraps at 72 characters
     1. Body explains "what" and "why", not "how"
   
   ### Documentation
   
   - In case of new functionality, my PR adds documentation that describes how 
     to use it.
     - When adding new operators/hooks/sensors, the autoclass documentation 
       generation needs to be added.
   
   ### Code Quality
   
   - Passes `flake8`


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] codecov-io edited a comment on issue #4030: [AIRFLOW-XXX] Log the task_id in the PendingDeprecationWarning for BaseOperator

2018-11-05 Thread GitBox
codecov-io edited a comment on issue #4030: [AIRFLOW-XXX] Log the task_id in 
the PendingDeprecationWarning for BaseOperator
URL: 
https://github.com/apache/incubator-airflow/pull/4030#issuecomment-428644838
 
 
   # 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4030?src=pr=h1)
 Report
   > Merging 
[#4030](https://codecov.io/gh/apache/incubator-airflow/pull/4030?src=pr=desc)
 into 
[master](https://codecov.io/gh/apache/incubator-airflow/commit/e703d6beeb379ee88ef5e7df495e8a785666f8af?src=pr=desc)
 will **decrease** coverage by `0.9%`.
   > The diff coverage is `n/a`.
   
   [![Impacted file tree 
graph](https://codecov.io/gh/apache/incubator-airflow/pull/4030/graphs/tree.svg?width=650=WdLKlKHOAU=150=pr)](https://codecov.io/gh/apache/incubator-airflow/pull/4030?src=pr=tree)
   
   ```diff
   @@            Coverage Diff             @@
   ##           master    #4030      +/-   ##
   ==========================================
   - Coverage   76.67%   75.76%    -0.91%
   ==========================================
     Files         199      199
     Lines       16186    15946      -240
   ==========================================
   - Hits        12410    12082      -328
   - Misses       3776     3864       +88
   ```
   
   
   | [Impacted 
Files](https://codecov.io/gh/apache/incubator-airflow/pull/4030?src=pr=tree) 
| Coverage Δ | |
   |---|---|---|
   | 
[airflow/models.py](https://codecov.io/gh/apache/incubator-airflow/pull/4030/diff?src=pr=tree#diff-YWlyZmxvdy9tb2RlbHMucHk=)
 | `91.67% <ø> (-0.38%)` | :arrow_down: |
   | 
[airflow/operators/docker\_operator.py](https://codecov.io/gh/apache/incubator-airflow/pull/4030/diff?src=pr=tree#diff-YWlyZmxvdy9vcGVyYXRvcnMvZG9ja2VyX29wZXJhdG9yLnB5)
 | `0% <0%> (-97.65%)` | :arrow_down: |
   | 
[airflow/operators/slack\_operator.py](https://codecov.io/gh/apache/incubator-airflow/pull/4030/diff?src=pr=tree#diff-YWlyZmxvdy9vcGVyYXRvcnMvc2xhY2tfb3BlcmF0b3IucHk=)
 | `0% <0%> (-97.37%)` | :arrow_down: |
   | 
[airflow/operators/s3\_to\_hive\_operator.py](https://codecov.io/gh/apache/incubator-airflow/pull/4030/diff?src=pr=tree#diff-YWlyZmxvdy9vcGVyYXRvcnMvczNfdG9faGl2ZV9vcGVyYXRvci5weQ==)
 | `0% <0%> (-94.02%)` | :arrow_down: |
   | 
[airflow/security/kerberos.py](https://codecov.io/gh/apache/incubator-airflow/pull/4030/diff?src=pr=tree#diff-YWlyZmxvdy9zZWN1cml0eS9rZXJiZXJvcy5weQ==)
 | `0% <0%> (-71.43%)` | :arrow_down: |
   | 
[airflow/operators/latest\_only\_operator.py](https://codecov.io/gh/apache/incubator-airflow/pull/4030/diff?src=pr=tree#diff-YWlyZmxvdy9vcGVyYXRvcnMvbGF0ZXN0X29ubHlfb3BlcmF0b3IucHk=)
 | `25% <0%> (-65%)` | :arrow_down: |
   | 
[airflow/operators/subdag\_operator.py](https://codecov.io/gh/apache/incubator-airflow/pull/4030/diff?src=pr=tree#diff-YWlyZmxvdy9vcGVyYXRvcnMvc3ViZGFnX29wZXJhdG9yLnB5)
 | `70.96% <0%> (-19.36%)` | :arrow_down: |
   | 
[airflow/example\_dags/example\_python\_operator.py](https://codecov.io/gh/apache/incubator-airflow/pull/4030/diff?src=pr=tree#diff-YWlyZmxvdy9leGFtcGxlX2RhZ3MvZXhhbXBsZV9weXRob25fb3BlcmF0b3IucHk=)
 | `78.94% <0%> (-15.79%)` | :arrow_down: |
   | 
[airflow/operators/python\_operator.py](https://codecov.io/gh/apache/incubator-airflow/pull/4030/diff?src=pr=tree#diff-YWlyZmxvdy9vcGVyYXRvcnMvcHl0aG9uX29wZXJhdG9yLnB5)
 | `81.98% <0%> (-13.05%)` | :arrow_down: |
   | 
[airflow/utils/file.py](https://codecov.io/gh/apache/incubator-airflow/pull/4030/diff?src=pr=tree#diff-YWlyZmxvdy91dGlscy9maWxlLnB5)
 | `76% <0%> (-8%)` | :arrow_down: |
   | ... and [46 
more](https://codecov.io/gh/apache/incubator-airflow/pull/4030/diff?src=pr=tree-more)
 | |
   
   --
   
   [Continue to review full report at 
Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4030?src=pr=continue).
   > **Legend** - [Click here to learn 
more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data`
   > Powered by 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4030?src=pr=footer).
 Last update 
[e703d6b...fb5f25d](https://codecov.io/gh/apache/incubator-airflow/pull/4030?src=pr=lastupdated).
 Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] kaxil edited a comment on issue #4137: [AIRFLOW-XXX] Fix Docstrings in Hooks, Sensors & Operators

2018-11-05 Thread GitBox
kaxil edited a comment on issue #4137: [AIRFLOW-XXX] Fix Docstrings in Hooks, 
Sensors & Operators
URL: 
https://github.com/apache/incubator-airflow/pull/4137#issuecomment-436084787
 
 
   @Fokko @ashb @r39132 I found https://pypi.org/project/flake8-docstrings/, 
which we can reference in `.github/PULL_REQUEST_TEMPLATE.md` to enforce that 
docstrings are included and correctly formatted. 
   
   A user should then do the following:
   
   ```
   git diff upstream/master -u -- "*.py" | flake8 --diff 
   ```
   
   We only need to add `flake8-docstrings` as a dependency.
   
   However, we might want to ignore many of its rules; we would need to either 
try it ourselves or find some documentation and decide which rules to exclude.
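   
   For example, a hypothetical shared-config sketch (say, in `setup.cfg`); the 
two `D` codes shown are pydocstyle rules surfaced by `flake8-docstrings` and 
are illustrative only, not a decided list:
   
   ```
   [flake8]
   # D100/D104: "missing docstring in public module/package" (pydocstyle codes);
   # the real exclusion list is still to be decided
   ignore = D100,D104
   ```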


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] kaxil commented on issue #4137: [AIRFLOW-XXX] Fix Docstrings in Hooks, Sensors & Operators

2018-11-05 Thread GitBox
kaxil commented on issue #4137: [AIRFLOW-XXX] Fix Docstrings in Hooks, Sensors 
& Operators
URL: 
https://github.com/apache/incubator-airflow/pull/4137#issuecomment-436084787
 
 
   @Fokko @ashb I found https://pypi.org/project/flake8-docstrings/, which we 
can reference in `.github/PULL_REQUEST_TEMPLATE.md` to enforce that docstrings 
are included and correctly formatted.
   
   A user should then do the following:
   
   ```
   git diff upstream/master -u -- "*.py" | flake8 --diff 
   ```
   
   We only need to add `flake8-docstrings` as a dependency.
   
   However, we might want to ignore many of its rules; we would need to either 
try it ourselves or find some documentation and decide which rules to exclude.


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] kaxil opened a new pull request #4137: [AIRFLOW-XXX] Fix Docstrings in Hooks, Sensors & Operators

2018-11-05 Thread GitBox
kaxil opened a new pull request #4137: [AIRFLOW-XXX] Fix Docstrings in Hooks, 
Sensors & Operators
URL: https://github.com/apache/incubator-airflow/pull/4137
 
 
   Make sure you have checked _all_ steps below.
   
   ### Jira
   
   - [x] My PR addresses the following [Airflow 
Jira](https://issues.apache.org/jira/browse/AIRFLOW/) issues and references 
them in the PR title. For example, "\[AIRFLOW-XXX\] My Airflow PR"
 - https://issues.apache.org/jira/browse/AIRFLOW-XXX
   
   
   ### Description
   
   - [x] Here are some details about my PR, including screenshots of any UI 
changes:
   Fix issues like the below ones:
   
![image](https://user-images.githubusercontent.com/8811558/48034781-a4059100-e158-11e8-8232-0b7d5cad3ea1.png)
   
   
![image](https://user-images.githubusercontent.com/8811558/48034817-c1d2f600-e158-11e8-8fe4-a5cd3b0f2d48.png)
   
   
   ### Tests
   
   - [x] My PR adds the following unit tests __OR__ does not need testing for 
this extremely good reason:
   Doc change. Not needed
   
   ### Commits
   
   - [x] My commits all reference Jira issues in their subject lines, and I 
have squashed multiple commits if they address the same issue. In addition, my 
commits follow the guidelines from "[How to write a good git commit 
message](http://chris.beams.io/posts/git-commit/)":
 1. Subject is separated from body by a blank line
 1. Subject is limited to 50 characters (not including Jira issue reference)
 1. Subject does not end with a period
 1. Subject uses the imperative mood ("add", not "adding")
 1. Body wraps at 72 characters
 1. Body explains "what" and "why", not "how"
   
   ### Documentation
   
   - [x] In case of new functionality, my PR adds documentation that describes 
how to use it.
 - When adding new operators/hooks/sensors, the autoclass documentation 
generation needs to be added.
   
   ### Code Quality
   
   - [x] Passes `flake8`
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] exploy commented on a change in pull request #4134: [AIRFLOW-3213] Create ADLS to GCS operator

2018-11-05 Thread GitBox
exploy commented on a change in pull request #4134: [AIRFLOW-3213] Create ADLS 
to GCS operator
URL: https://github.com/apache/incubator-airflow/pull/4134#discussion_r230953238
 
 

 ##
 File path: airflow/contrib/operators/adls_to_gcs.py
 ##
 @@ -0,0 +1,144 @@
+# -*- coding: utf-8 -*-
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+from airflow.contrib.hooks.azure_data_lake_hook import AzureDataLakeHook
+from airflow.contrib.operators.adls_list_operator import AzureDataLakeStorageListOperator
+from airflow.utils.decorators import apply_defaults
+from airflow.contrib.hooks.gcs_hook import GoogleCloudStorageHook, _parse_gcs_url
+
+import os
+from tempfile import NamedTemporaryFile
+
+
+class AdlsToGoogleCloudStorageOperator(AzureDataLakeStorageListOperator):
+    """
+    Synchronizes an Azure Data Lake Storage path with a GCS bucket
+
+    :param path: The Azure Data Lake path to find the objects (templated)
+    :type path: str
+    :param dest_gcs: The Google Cloud Storage bucket and prefix to
+        store the objects. (templated)
+    :type dest_gcs: str
+    :param replace: If true, replaces same-named files in GCS
+    :type replace: bool
+    :param azure_data_lake_conn_id: The connection ID to use when
+        connecting to Azure Data Lake Storage.
+    :type azure_data_lake_conn_id: str
+    :param dest_google_cloud_storage_conn_id: The connection ID to use when
+        connecting to Google Cloud Storage.
+    :type dest_google_cloud_storage_conn_id: str
+    :param delegate_to: The account to impersonate, if any.
+        For this to work, the service account making the request must have
+        domain-wide delegation enabled.
+    :type delegate_to: str
+
+    **Examples**:
+        The following Operator would copy a single file named
+        ``hello/world.avro`` from ADLS to the GCS bucket ``mybucket``. Its full
+        resulting gcs path will be ``gs://mybucket/hello/world.avro`` ::
+
+            copy_single_file = AdlsToGoogleCloudStorageOperator(
+                task_id='copy_single_file',
+                path='hello/world.avro',
+                dest_gcs='gs://mybucket',
+                replace=False,
+                azure_data_lake_conn_id='azure_data_lake_default',
+                google_cloud_storage_conn_id='google_cloud_default'
+            )
+
+        The following Operator would copy all parquet files from ADLS
+        to the GCS bucket ``mybucket``. ::
+
+            copy_all_files = AdlsToGoogleCloudStorageOperator(
+                task_id='copy_all_files',
+                path='*.parquet',
+                dest_gcs='gs://mybucket',
+                replace=False,
+                azure_data_lake_conn_id='azure_data_lake_default',
+                google_cloud_storage_conn_id='google_cloud_default'
+            )
+
+        The following Operator would copy all parquet files from the ADLS
+        path ``/hello/world`` to the GCS bucket ``mybucket``. ::
+
+            copy_world_files = AdlsToGoogleCloudStorageOperator(
+                task_id='copy_world_files',
+                path='hello/world/*.parquet',
+                dest_gcs='gs://mybucket',
+                replace=False,
+                azure_data_lake_conn_id='azure_data_lake_default',
+                google_cloud_storage_conn_id='google_cloud_default'
+            )
+    """
+    template_fields = ('path', 'dest_gcs')
+    ui_color = '#f0eee4'
+
+    @apply_defaults
+    def __init__(self,
+                 path,
+                 dest_gcs=None,
+                 replace=False,
+                 azure_data_lake_conn_id='azure_data_lake_default',
+                 google_cloud_storage_conn_id='google_cloud_default',
 
 Review comment:
   What about leaving both `azure_data_lake_conn_id` and 
`google_cloud_storage_conn_id` as `None`? 
   Or maybe even making them mandatory arguments? 
   Similar applies to `dest_gcs` defaulting to `None`; it is hard for me to 
come up with a good example where this default value makes sense. 
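   
   A sketch of the mandatory-argument variant this suggests (hypothetical; the 
argument order and `*args, **kwargs` passthrough are assumptions, while the 
names come from the excerpt above):
   
   ```python
   @apply_defaults
   def __init__(self,
                path,
                dest_gcs,                      # mandatory instead of defaulting to None
                azure_data_lake_conn_id,       # mandatory instead of a default conn id
                google_cloud_storage_conn_id,  # mandatory instead of a default conn id
                replace=False,
                *args,
                **kwargs):
       ...
   ```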


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For 

[GitHub] exploy commented on a change in pull request #4134: [AIRFLOW-3213] Create ADLS to GCS operator

2018-11-05 Thread GitBox
exploy commented on a change in pull request #4134: [AIRFLOW-3213] Create ADLS 
to GCS operator
URL: https://github.com/apache/incubator-airflow/pull/4134#discussion_r230950581
 
 

 ##
 File path: airflow/contrib/operators/adls_to_gcs.py
 ##
 @@ -0,0 +1,144 @@
+# -*- coding: utf-8 -*-
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+from airflow.contrib.hooks.azure_data_lake_hook import AzureDataLakeHook
+from airflow.contrib.operators.adls_list_operator import AzureDataLakeStorageListOperator
+from airflow.utils.decorators import apply_defaults
+from airflow.contrib.hooks.gcs_hook import GoogleCloudStorageHook, _parse_gcs_url
+
+import os
+from tempfile import NamedTemporaryFile
+
+
+class AdlsToGoogleCloudStorageOperator(AzureDataLakeStorageListOperator):
+    """
+    Synchronizes an Azure Data Lake Storage path with a GCS bucket
+
+    :param path: The Azure Data Lake path to find the objects (templated)
 
 Review comment:
   I see that you are following the style of other similar operators like 
`S3ToGoogleCloudStorageOperator`. But what do you think about naming this 
param `src_adls`? That would match `dest_gcs`.


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] Padarn commented on issue #4089: adding image_pull_secrets into pod at creation inside operator

2018-11-05 Thread GitBox
Padarn commented on issue #4089: adding image_pull_secrets into pod at creation 
inside operator
URL: 
https://github.com/apache/incubator-airflow/pull/4089#issuecomment-436070775
 
 
   Apologies, I thought I had; I'll read more closely.
   
   On Tue, 6 Nov 2018 at 1:24 AM, Fokko Driesprong wrote:
   
   > @Padarn Can you adhere to the git commit guidelines?
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] exploy commented on a change in pull request #4134: [AIRFLOW-3213] Create ADLS to GCS operator

2018-11-05 Thread GitBox
exploy commented on a change in pull request #4134: [AIRFLOW-3213] Create ADLS 
to GCS operator
URL: https://github.com/apache/incubator-airflow/pull/4134#discussion_r230948173
 
 

 ##
 File path: airflow/contrib/operators/adls_to_gcs.py
 ##
 @@ -0,0 +1,144 @@
+# -*- coding: utf-8 -*-
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+from airflow.contrib.hooks.azure_data_lake_hook import AzureDataLakeHook
+from airflow.contrib.operators.adls_list_operator import AzureDataLakeStorageListOperator
+from airflow.utils.decorators import apply_defaults
+from airflow.contrib.hooks.gcs_hook import GoogleCloudStorageHook, _parse_gcs_url
+
+import os
 
 Review comment:
   What do you think about ordering imports as PEP 8 suggests?
   ```
   1) Standard library imports.
   2) Related third party imports.
   3) Local application/library specific imports.
   ```
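   For example, the imports at the top of this file could be grouped like this 
(same modules as in the diff, just reordered):
   
   ```python
   # 1) Standard library imports
   import os
   from tempfile import NamedTemporaryFile
   
   # 2)/3) Third party and local application (airflow) imports
   from airflow.contrib.hooks.azure_data_lake_hook import AzureDataLakeHook
   from airflow.contrib.hooks.gcs_hook import GoogleCloudStorageHook, _parse_gcs_url
   from airflow.contrib.operators.adls_list_operator import AzureDataLakeStorageListOperator
   from airflow.utils.decorators import apply_defaults
   ```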


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] exploy commented on a change in pull request #4118: [AIRFLOW-3271] Airflow RBAC Permissions modification via UI do not persist

2018-11-05 Thread GitBox
exploy commented on a change in pull request #4118: [AIRFLOW-3271] Airflow RBAC 
Permissions modification via UI do not persist
URL: https://github.com/apache/incubator-airflow/pull/4118#discussion_r230946740
 
 

 ##
 File path: airflow/www_rbac/security.py
 ##
 @@ -181,13 +181,17 @@ def init_role(self, role_name, role_vms, role_perms):
         if not role:
             role = self.add_role(role_name)
 
-        role_pvms = []
-        for pvm in pvms:
-            if pvm.view_menu.name in role_vms and pvm.permission.name in role_perms:
-                role_pvms.append(pvm)
-        role.permissions = list(set(role_pvms))
-        self.get_session.merge(role)
-        self.get_session.commit()
+        if (len(role.permissions) == 0):
 
 Review comment:
   Python does not require you to surround the `if` condition with parentheses.
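   That is, using the `role` from the diff above, the new check can simply read:
   
   ```python
   # Equivalent, without the redundant parentheses:
   if len(role.permissions) == 0:
       pass
   # or, more idiomatically for an empty list:
   if not role.permissions:
       pass
   ```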


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] matt-land opened a new pull request #4136: Fix for scheduler infinite loop when evaluating non-UTC DAGs after DST

2018-11-05 Thread GitBox
matt-land opened a new pull request #4136: Fix for scheduler infinite loop when 
evaluating non-UTC DAGs after DST
URL: https://github.com/apache/incubator-airflow/pull/4136
 
 
   …ing than their start state
   
   Make sure you have checked _all_ steps below.
   
   ### Jira
   
   - [ ] My PR addresses the following 
[AIRFLOW-1710](https://issues.apache.org/jira/browse/AIRFLOW-1710) issue and 
references it in the PR title. For example, "\[AIRFLOW-1710\] My Airflow PR"
 - https://issues.apache.org/jira/browse/AIRFLOW-1710
 - In case you are fixing a typo in the documentation you can prepend your 
commit with \[AIRFLOW-XXX\]; code changes always need a Jira issue.
   
   ### Description
   
   - [X] Here are some details about my PR, including screenshots of any UI 
changes:
   
   This fixes the infinite loop in the Airflow scheduler when it scans a DAG 
whose start date falls in Daylight Saving Time while evaluating a potential run 
date in Standard Time.
   
   The infinite loop Airflow gets trapped in is:
   
   ```python
   while next_run_date <= last_run.execution_date:
       next_run_date = dag.following_schedule(next_run_date)
   ```
   
   `dag.following_schedule` is not incrementing as expected. Instead it starts 
repeating values it has already returned.
   
   This is caused by an interaction with the pendulum instance timezone having 
a DST offset that is 'illegal' for the evaluated next run date.
   
   I.e., if I call `following_schedule` with a `dag.timezone` that has a DST 
offset of -5 while the date being evaluated should have -6, the function stops 
incrementing and repeats ranges of values.
   
   The fix is simply to drop the DST offset from `dag.timezone`; 
`dag.following_schedule` then increments normally, which fixes the infinite loop.
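   
   A minimal illustration of the offset flip described above (pytz is used here 
only for clarity; the timezone and dates are illustrative, not from the PR):
   
   ```python
   import pytz
   from datetime import datetime
   
   tz = pytz.timezone('America/Chicago')
   before = tz.localize(datetime(2018, 11, 3, 12, 0))  # CDT, UTC-5
   after = tz.localize(datetime(2018, 11, 5, 12, 0))   # CST, UTC-6
   print(before.utcoffset(), after.utcoffset())
   # -1 day, 19:00:00 vs -1 day, 18:00:00: schedules computed with the stale
   # -5 offset stop advancing once the wall clock crosses the DST boundary.
   ```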
   
   ### Tests
   
   - [X] My PR adds the following unit tests __OR__ does not need testing for 
this extremely good reason:
   
   I need help with my PR
   
   ### Commits
   
   - [ ] My commits all reference Jira issues in their subject lines, and I 
have squashed multiple commits if they address the same issue. In addition, my 
commits follow the guidelines from "[How to write a good git commit 
message](http://chris.beams.io/posts/git-commit/)":
 1. Subject is separated from body by a blank line
 1. Subject is limited to 50 characters (not including Jira issue reference)
 1. Subject does not end with a period
 1. Subject uses the imperative mood ("add", not "adding")
 1. Body wraps at 72 characters
 1. Body explains "what" and "why", not "how"
   
   ### Documentation
   
   - [ ] In case of new functionality, my PR adds documentation that describes 
how to use it.
 - When adding new operators/hooks/sensors, the autoclass documentation 
generation needs to be added.
   
   ### Code Quality
   
   - [ ] Passes `flake8`
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Commented] (AIRFLOW-3272) Create gRPC hook for creating generic grpc connection

2018-11-05 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-3272?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16675856#comment-16675856
 ] 

ASF GitHub Bot commented on AIRFLOW-3272:
-

morgendave opened a new pull request #4101: [AIRFLOW-3272] Add base grpc hook
URL: https://github.com/apache/incubator-airflow/pull/4101
 
 
   Make sure you have checked all steps below.
   
   Jira
 My PR addresses the following Airflow Jira issues and references them in 
the PR title. For example, "[AIRFLOW-3272] My Airflow PR"
   https://issues.apache.org/jira/browse/AIRFLOW-3272
   Description
 Add support for gRPC connections in Airflow. 
   
   In Airflow there are use cases of calling gRPC services, so instead of creating 
the channel each time in a PythonOperator, there should be a basic GrpcHook 
to take care of it. The hook needs to take care of the authentication.
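   
   For context, a sketch of the per-task channel wiring this hook would replace 
(the target address and callable are hypothetical):
   
   ```python
   import grpc
   
   def call_my_service(**context):
       # Without a hook, each PythonOperator callable builds its own channel:
       channel = grpc.insecure_channel('my-service.example.com:50051')
       # ... create a stub from generated proto classes and make calls ...
       channel.close()
   ```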
   
   Tests
 My PR adds the following unit tests OR does not need testing for this 
extremely good reason:
   Commits
 My commits all reference Jira issues in their subject lines, and I have 
squashed multiple commits if they address the same issue. In addition, my 
commits follow the guidelines from "How to write a good git commit message":
   Subject is separated from body by a blank line
   Subject is limited to 50 characters (not including Jira issue reference)
   Subject does not end with a period
   Subject uses the imperative mood ("add", not "adding")
   Body wraps at 72 characters
   Body explains "what" and "why", not "how"
   Documentation
 In case of new functionality, my PR adds documentation that describes how 
to use it.
   When adding new operators/hooks/sensors, the autoclass documentation 
generation needs to be added.
   Code Quality
 Passes flake8


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


> Create gRPC hook for creating generic grpc connection
> -
>
> Key: AIRFLOW-3272
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3272
> Project: Apache Airflow
>  Issue Type: Improvement
>Reporter: Zhiwei Zhao
>Assignee: Zhiwei Zhao
>Priority: Minor
>
> Add support for gRPC connections in Airflow. 
> In Airflow there are use cases of calling gRPC services, so instead of creating 
> the channel each time in a PythonOperator, there should be a basic GrpcHook 
> to take care of it. The hook needs to take care of the authentication.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] aoen commented on issue #4005: [AIRFLOW-3160] Load latest_dagruns asynchronously, speed up front page load time

2018-11-05 Thread GitBox
aoen commented on issue #4005: [AIRFLOW-3160] Load latest_dagruns 
asynchronously, speed up front page load time
URL: 
https://github.com/apache/incubator-airflow/pull/4005#issuecomment-436066368
 
 
   @Fokko thanks! I owe you a PR review :).


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Commented] (AIRFLOW-3272) Create gRPC hook for creating generic grpc connection

2018-11-05 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-3272?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16675855#comment-16675855
 ] 

ASF GitHub Bot commented on AIRFLOW-3272:
-

morgendave closed pull request #4101: [AIRFLOW-3272] Add base grpc hook
URL: https://github.com/apache/incubator-airflow/pull/4101
 
 
   

This is a PR merged from a forked repository.
As GitHub hides the original diff on merge, it is displayed below for
the sake of provenance:

As this is a foreign pull request (from a fork), the diff is supplied
below (as it won't show otherwise due to GitHub magic):

diff --git a/airflow/contrib/hooks/grpc_hook.py 
b/airflow/contrib/hooks/grpc_hook.py
new file mode 100644
index 00..b260847f19
--- /dev/null
+++ b/airflow/contrib/hooks/grpc_hook.py
@@ -0,0 +1,118 @@
+# -*- coding: utf-8 -*-
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+
+import grpc
+from google import auth as google_auth
+from google.auth import jwt as google_auth_jwt
+from google.auth.transport import grpc as google_auth_transport_grpc
+from google.auth.transport import requests as google_auth_transport_requests
+
+from airflow.hooks.base_hook import BaseHook
+from airflow.exceptions import AirflowConfigException
+
+
+class GrpcHook(BaseHook):
+    """
+    General interaction with gRPC servers.
+
+    :param grpc_conn_id: The connection ID to use when fetching connection info.
+    :type grpc_conn_id: str
+    :param interceptors: a list of gRPC interceptor objects to be applied to the
+        connected gRPC channel. None by default.
+    :type interceptors: a list of gRPC interceptors based on or extending the four
+        official gRPC interceptors, e.g. UnaryUnaryClientInterceptor,
+        UnaryStreamClientInterceptor, StreamUnaryClientInterceptor,
+        StreamStreamClientInterceptor.
+    :param custom_connection_func: The customized connection function to return
+        a gRPC channel.
+    :type custom_connection_func: a Python callable that accepts the connection as
+        its only arg. Could be partial or lambda.
+    """
+
+    def __init__(self, grpc_conn_id, interceptors=None, custom_connection_func=None):
+        self.grpc_conn_id = grpc_conn_id
+        self.conn = self.get_connection(self.grpc_conn_id)
+        self.extras = self.conn.extra_dejson
+        self.interceptors = interceptors if interceptors else []
+        self.custom_connection_func = custom_connection_func
+
+    def get_conn(self):
+        if "://" in self.conn.host:
+            base_url = self.conn.host
+        else:
+            # schema defaults to HTTP
+            schema = self.conn.schema if self.conn.schema else "http"
+            base_url = schema + "://" + self.conn.host
+
+        if self.conn.port:
+            base_url = base_url + ":" + str(self.conn.port) + "/"
+
+        auth_type = self._get_field("auth_type")
+
+        if auth_type == "NO_AUTH":
+            channel = grpc.insecure_channel(base_url)
+        elif auth_type == "SSL" or auth_type == "TLS":
+            credential_file_name = self._get_field("credential_pem_file")
+            creds = grpc.ssl_channel_credentials(open(credential_file_name).read())
+            channel = grpc.secure_channel(base_url, creds)
+        elif auth_type == "JWT_GOOGLE":
+            credentials, _ = google_auth.default()
+            jwt_creds = google_auth_jwt.OnDemandCredentials.from_signing_credentials(
+                credentials)
+            channel = google_auth_transport_grpc.secure_authorized_channel(
+                jwt_creds, None, base_url)
+        elif auth_type == "OATH_GOOGLE":
+            scopes = self._get_field("scopes").split(",")
+            credentials, _ = google_auth.default(scopes=scopes)
+            request = google_auth_transport_requests.Request()
+            channel = google_auth_transport_grpc.secure_authorized_channel(
+                credentials, request, base_url)
+        elif auth_type == "CUSTOM":
+            if not self.custom_connection_func:
+                raise AirflowConfigException(
+                    "Customized connection function not set, not able to "
+                    "establish a channel")
+            channel = self.custom_connection_func(self.conn)
+
+        if self.interceptors:
+            for interceptor in self.interceptors:
+                channel = grpc.intercept_channel(channel, interceptor)
+
+

[GitHub] morgendave opened a new pull request #4101: [AIRFLOW-3272] Add base grpc hook

2018-11-05 Thread GitBox
morgendave opened a new pull request #4101: [AIRFLOW-3272] Add base grpc hook
URL: https://github.com/apache/incubator-airflow/pull/4101
 
 
   Make sure you have checked all steps below.
   
   Jira
 My PR addresses the following Airflow Jira issues and references them in 
the PR title. For example, "[AIRFLOW-3272] My Airflow PR"
   https://issues.apache.org/jira/browse/AIRFLOW-3272
   Description
 Add support for gRPC connections in Airflow. 
   
   In Airflow there are use cases of calling gRPC services, so instead of creating 
the channel each time in a PythonOperator, there should be a basic GrpcHook 
to take care of it. The hook needs to take care of the authentication.
   
   Tests
 My PR adds the following unit tests OR does not need testing for this 
extremely good reason:
   Commits
 My commits all reference Jira issues in their subject lines, and I have 
squashed multiple commits if they address the same issue. In addition, my 
commits follow the guidelines from "How to write a good git commit message":
   Subject is separated from body by a blank line
   Subject is limited to 50 characters (not including Jira issue reference)
   Subject does not end with a period
   Subject uses the imperative mood ("add", not "adding")
   Body wraps at 72 characters
   Body explains "what" and "why", not "how"
   Documentation
 In case of new functionality, my PR adds documentation that describes how 
to use it.
   When adding new operators/hooks/sensors, the autoclass documentation 
generation needs to be added.
   Code Quality
 Passes flake8


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] morgendave closed pull request #4101: [AIRFLOW-3272] Add base grpc hook

2018-11-05 Thread GitBox
morgendave closed pull request #4101: [AIRFLOW-3272] Add base grpc hook
URL: https://github.com/apache/incubator-airflow/pull/4101
 
 
   

This is a PR merged from a forked repository.
As GitHub hides the original diff on merge, it is displayed below for
the sake of provenance:

As this is a foreign pull request (from a fork), the diff is supplied
below (as it won't show otherwise due to GitHub magic):

diff --git a/airflow/contrib/hooks/grpc_hook.py 
b/airflow/contrib/hooks/grpc_hook.py
new file mode 100644
index 00..b260847f19
--- /dev/null
+++ b/airflow/contrib/hooks/grpc_hook.py
@@ -0,0 +1,118 @@
+# -*- coding: utf-8 -*-
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+
+import grpc
+from google import auth as google_auth
+from google.auth import jwt as google_auth_jwt
+from google.auth.transport import grpc as google_auth_transport_grpc
+from google.auth.transport import requests as google_auth_transport_requests
+
+from airflow.hooks.base_hook import BaseHook
+from airflow.exceptions import AirflowConfigException
+
+
+class GrpcHook(BaseHook):
+    """
+    General interaction with gRPC servers.
+
+    :param grpc_conn_id: The connection ID to use when fetching connection info.
+    :type grpc_conn_id: str
+    :param interceptors: a list of gRPC interceptor objects to be applied to the
+        connected gRPC channel. None by default.
+    :type interceptors: a list of gRPC interceptors based on or extending the four
+        official gRPC interceptors, e.g. UnaryUnaryClientInterceptor,
+        UnaryStreamClientInterceptor, StreamUnaryClientInterceptor,
+        StreamStreamClientInterceptor.
+    :param custom_connection_func: The customized connection function to return
+        a gRPC channel.
+    :type custom_connection_func: a Python callable that accepts the connection as
+        its only arg. Could be partial or lambda.
+    """
+
+    def __init__(self, grpc_conn_id, interceptors=None, custom_connection_func=None):
+        self.grpc_conn_id = grpc_conn_id
+        self.conn = self.get_connection(self.grpc_conn_id)
+        self.extras = self.conn.extra_dejson
+        self.interceptors = interceptors if interceptors else []
+        self.custom_connection_func = custom_connection_func
+
+    def get_conn(self):
+        if "://" in self.conn.host:
+            base_url = self.conn.host
+        else:
+            # schema defaults to HTTP
+            schema = self.conn.schema if self.conn.schema else "http"
+            base_url = schema + "://" + self.conn.host
+
+        if self.conn.port:
+            base_url = base_url + ":" + str(self.conn.port) + "/"
+
+        auth_type = self._get_field("auth_type")
+
+        if auth_type == "NO_AUTH":
+            channel = grpc.insecure_channel(base_url)
+        elif auth_type == "SSL" or auth_type == "TLS":
+            credential_file_name = self._get_field("credential_pem_file")
+            creds = grpc.ssl_channel_credentials(open(credential_file_name).read())
+            channel = grpc.secure_channel(base_url, creds)
+        elif auth_type == "JWT_GOOGLE":
+            credentials, _ = google_auth.default()
+            jwt_creds = google_auth_jwt.OnDemandCredentials.from_signing_credentials(
+                credentials)
+            channel = google_auth_transport_grpc.secure_authorized_channel(
+                jwt_creds, None, base_url)
+        elif auth_type == "OATH_GOOGLE":
+            scopes = self._get_field("scopes").split(",")
+            credentials, _ = google_auth.default(scopes=scopes)
+            request = google_auth_transport_requests.Request()
+            channel = google_auth_transport_grpc.secure_authorized_channel(
+                credentials, request, base_url)
+        elif auth_type == "CUSTOM":
+            if not self.custom_connection_func:
+                raise AirflowConfigException(
+                    "Customized connection function not set, not able to "
+                    "establish a channel")
+            channel = self.custom_connection_func(self.conn)
+
+        if self.interceptors:
+            for interceptor in self.interceptors:
+                channel = grpc.intercept_channel(channel, interceptor)
+
+        return channel
+
+    def run(self, stub_class, call_func, streaming=False, data={}):
+        with self.get_conn() as channel:
+            stub = stub_class(channel)
+            try:
+                response = stub.call_func(**data)
+
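
A minimal usage sketch of the hook above (the connection ID and stub class are 
made-up names; note also that `run()` as written invokes a method literally 
named `call_func` on the stub rather than looking up the `call_func` argument):

```python
# Hypothetical usage; "my_grpc_conn" and PingServiceStub are illustrative names.
from airflow.contrib.hooks.grpc_hook import GrpcHook

hook = GrpcHook(grpc_conn_id="my_grpc_conn")
channel = hook.get_conn()        # channel built per the configured auth_type
stub = PingServiceStub(channel)  # PingServiceStub would come from generated proto code
```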

[GitHub] janhicken commented on issue #4125: [AIRFLOW-2715] Pick up the region setting while launching Dataflow templates

2018-11-05 Thread GitBox
janhicken commented on issue #4125: [AIRFLOW-2715] Pick up the region setting 
while launching Dataflow templates
URL: 
https://github.com/apache/incubator-airflow/pull/4125#issuecomment-436064621
 
 
   Alright, I will do this as you described @kaxil 


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] codecov-io edited a comment on issue #2551: [AIRFLOW-1543] Improve error message for incorrect fernet_key

2018-11-05 Thread GitBox
codecov-io edited a comment on issue #2551: [AIRFLOW-1543] Improve error 
message for incorrect fernet_key
URL: 
https://github.com/apache/incubator-airflow/pull/2551#issuecomment-325708902
 
 
   # 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/2551?src=pr=h1)
 Report
   > Merging 
[#2551](https://codecov.io/gh/apache/incubator-airflow/pull/2551?src=pr=desc)
 into 
[master](https://codecov.io/gh/apache/incubator-airflow/commit/e703d6beeb379ee88ef5e7df495e8a785666f8af?src=pr=desc)
 will **decrease** coverage by `5.81%`.
   > The diff coverage is `66.66%`.
   
   [![Impacted file tree 
graph](https://codecov.io/gh/apache/incubator-airflow/pull/2551/graphs/tree.svg?width=650=WdLKlKHOAU=150=pr)](https://codecov.io/gh/apache/incubator-airflow/pull/2551?src=pr=tree)
   
   ```diff
   @@            Coverage Diff             @@
   ##           master    #2551      +/-   ##
   ==========================================
   - Coverage   76.67%   70.85%    -5.82% 
   ==========================================
     Files         199      150      -49 
     Lines       16186    11585    -4601 
   ==========================================
   - Hits        12410     8209    -4201 
   + Misses       3776     3376     -400
   ```
   
   
   | [Impacted 
Files](https://codecov.io/gh/apache/incubator-airflow/pull/2551?src=pr=tree) 
| Coverage Δ | |
   |---|---|---|
   | 
[airflow/models.py](https://codecov.io/gh/apache/incubator-airflow/pull/2551/diff?src=pr=tree#diff-YWlyZmxvdy9tb2RlbHMucHk=)
 | `87.17% <66.66%> (-4.88%)` | :arrow_down: |
   | 
[airflow/operators/email\_operator.py](https://codecov.io/gh/apache/incubator-airflow/pull/2551/diff?src=pr=tree#diff-YWlyZmxvdy9vcGVyYXRvcnMvZW1haWxfb3BlcmF0b3IucHk=)
 | `0% <0%> (-100%)` | :arrow_down: |
   | 
[airflow/hooks/pig\_hook.py](https://codecov.io/gh/apache/incubator-airflow/pull/2551/diff?src=pr=tree#diff-YWlyZmxvdy9ob29rcy9waWdfaG9vay5weQ==)
 | `0% <0%> (-100%)` | :arrow_down: |
   | 
[airflow/operators/slack\_operator.py](https://codecov.io/gh/apache/incubator-airflow/pull/2551/diff?src=pr=tree#diff-YWlyZmxvdy9vcGVyYXRvcnMvc2xhY2tfb3BlcmF0b3IucHk=)
 | `0% <0%> (-97.37%)` | :arrow_down: |
   | 
[airflow/operators/s3\_file\_transform\_operator.py](https://codecov.io/gh/apache/incubator-airflow/pull/2551/diff?src=pr=tree#diff-YWlyZmxvdy9vcGVyYXRvcnMvczNfZmlsZV90cmFuc2Zvcm1fb3BlcmF0b3IucHk=)
 | `0% <0%> (-96.23%)` | :arrow_down: |
   | 
[airflow/operators/redshift\_to\_s3\_operator.py](https://codecov.io/gh/apache/incubator-airflow/pull/2551/diff?src=pr=tree#diff-YWlyZmxvdy9vcGVyYXRvcnMvcmVkc2hpZnRfdG9fczNfb3BlcmF0b3IucHk=)
 | `0% <0%> (-95.46%)` | :arrow_down: |
   | 
[airflow/hooks/jdbc\_hook.py](https://codecov.io/gh/apache/incubator-airflow/pull/2551/diff?src=pr=tree#diff-YWlyZmxvdy9ob29rcy9qZGJjX2hvb2sucHk=)
 | `0% <0%> (-94.45%)` | :arrow_down: |
   | 
[airflow/executors/celery\_executor.py](https://codecov.io/gh/apache/incubator-airflow/pull/2551/diff?src=pr=tree#diff-YWlyZmxvdy9leGVjdXRvcnMvY2VsZXJ5X2V4ZWN1dG9yLnB5)
 | `0% <0%> (-80.62%)` | :arrow_down: |
   | 
[airflow/hooks/S3\_hook.py](https://codecov.io/gh/apache/incubator-airflow/pull/2551/diff?src=pr=tree#diff-YWlyZmxvdy9ob29rcy9TM19ob29rLnB5)
 | `22.27% <0%> (-72.05%)` | :arrow_down: |
   | 
[airflow/hooks/mssql\_hook.py](https://codecov.io/gh/apache/incubator-airflow/pull/2551/diff?src=pr=tree#diff-YWlyZmxvdy9ob29rcy9tc3NxbF9ob29rLnB5)
 | `6.66% <0%> (-66.67%)` | :arrow_down: |
   | ... and [199 
more](https://codecov.io/gh/apache/incubator-airflow/pull/2551/diff?src=pr=tree-more)
 | |
   
   --
   
   [Continue to review full report at 
Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/2551?src=pr=continue).
   > **Legend** - [Click here to learn 
more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data`
   > Powered by 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/2551?src=pr=footer).
 Last update 
[e703d6b...1126a50](https://codecov.io/gh/apache/incubator-airflow/pull/2551?src=pr=lastupdated).
 Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] kaxil commented on issue #4125: [AIRFLOW-2715] Pick up the region setting while launching Dataflow templates

2018-11-05 Thread GitBox
kaxil commented on issue #4125: [AIRFLOW-2715] Pick up the region setting while 
launching Dataflow templates
URL: 
https://github.com/apache/incubator-airflow/pull/4125#issuecomment-436060616
 
 
   I have reverted the changes because, for the locations API, `location` is 
required. You will need to add this parameter to the DataflowTemplateOperator; 
if it is not passed as a default option or a parameter, you will need to 
specify a default location.
   
   Add tests to check that the location is provided and, if not, which default 
location is used, etc.
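   
   A sketch of that fallback (the helper name and the `'us-central1'` default 
are assumptions, not from this PR; only `variables['region']` appears in the 
diff below):
   
   ```python
   def _resolve_dataflow_location(variables, default='us-central1'):
       """Return the region to launch the template in, with a fallback.

       Hypothetical helper; the default value is an illustrative assumption."""
       return variables.get('region') or default
   ```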


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] kaxil closed pull request #4125: [AIRFLOW-2715] Pick up the region setting while launching Dataflow templates

2018-11-05 Thread GitBox
kaxil closed pull request #4125: [AIRFLOW-2715] Pick up the region setting 
while launching Dataflow templates
URL: https://github.com/apache/incubator-airflow/pull/4125
 
 
   

This is a PR merged from a forked repository.
As GitHub hides the original diff on merge, it is displayed below for
the sake of provenance:

As this is a foreign pull request (from a fork), the diff is supplied
below (as it won't show otherwise due to GitHub magic):

diff --git a/airflow/contrib/hooks/gcp_dataflow_hook.py 
b/airflow/contrib/hooks/gcp_dataflow_hook.py
index 4fdb07c74d..f6d7768cf7 100644
--- a/airflow/contrib/hooks/gcp_dataflow_hook.py
+++ b/airflow/contrib/hooks/gcp_dataflow_hook.py
@@ -278,8 +278,9 @@ def _start_template_dataflow(self, name, variables, parameters,
             "parameters": parameters,
             "environment": environment}
         service = self.get_conn()
-        request = service.projects().templates().launch(
+        request = service.projects().locations().templates().launch(
             projectId=variables['project'],
+            location=variables['region'],
             gcsPath=dataflow_template,
             body=body
         )


 


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Commented] (AIRFLOW-2715) Dataflow template operator does not support region parameter

2018-11-05 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-2715?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16675820#comment-16675820
 ] 

ASF GitHub Bot commented on AIRFLOW-2715:
-

kaxil closed pull request #4125: [AIRFLOW-2715] Pick up the region setting 
while launching Dataflow templates
URL: https://github.com/apache/incubator-airflow/pull/4125
 
 
   

This is a PR merged from a forked repository.
As GitHub hides the original diff on merge, it is displayed below for
the sake of provenance:

As this is a foreign pull request (from a fork), the diff is supplied
below (as it won't show otherwise due to GitHub magic):

diff --git a/airflow/contrib/hooks/gcp_dataflow_hook.py 
b/airflow/contrib/hooks/gcp_dataflow_hook.py
index 4fdb07c74d..f6d7768cf7 100644
--- a/airflow/contrib/hooks/gcp_dataflow_hook.py
+++ b/airflow/contrib/hooks/gcp_dataflow_hook.py
@@ -278,8 +278,9 @@ def _start_template_dataflow(self, name, variables, parameters,
             "parameters": parameters,
             "environment": environment}
         service = self.get_conn()
-        request = service.projects().templates().launch(
+        request = service.projects().locations().templates().launch(
             projectId=variables['project'],
+            location=variables['region'],
             gcsPath=dataflow_template,
             body=body
         )


 


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


> Dataflow template operator does not support region parameter
> ---
>
> Key: AIRFLOW-2715
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2715
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: operators
>Affects Versions: 1.9.0
>Reporter: Mohammed Tameem
>Priority: Critical
> Fix For: 2.0.0
>
>
> The DataflowTemplateOperator uses dataflow.projects.templates.launch, which 
> has a region parameter but only supports execution of the dataflow job in the 
> us-central1 region. Alternatively, there is another API, 
> dataflow.projects.locations.templates.launch, which supports execution of the 
> template in all regional endpoints provided by Google Cloud.
> It would be great if:
>  # The base REST API of this operator could be changed from 
> "dataflow.projects.templates.launch" to 
> "dataflow.projects.locations.templates.launch"
>  # A templated region parameter was included in the operator to run the 
> dataflow job in the requested regional endpoint.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] morgendave commented on issue #4101: [AIRFLOW-3272] Add base grpc hook

2018-11-05 Thread GitBox
morgendave commented on issue #4101: [AIRFLOW-3272] Add base grpc hook
URL: 
https://github.com/apache/incubator-airflow/pull/4101#issuecomment-436054758
 
 
   > At the bottom of your PR, click “close pull request”, and then re-open it 
to trigger the build.
   > […]
   
   Thanks a lot


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Commented] (AIRFLOW-3272) Create gRPC hook for creating generic grpc connection

2018-11-05 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-3272?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16675813#comment-16675813
 ] 

ASF GitHub Bot commented on AIRFLOW-3272:
-

morgendave closed pull request #4101: [AIRFLOW-3272] Add base grpc hook
URL: https://github.com/apache/incubator-airflow/pull/4101
 
 
   

This is a PR merged from a forked repository.
As GitHub hides the original diff on merge, it is displayed below for
the sake of provenance:

As this is a foreign pull request (from a fork), the diff is supplied
below (as it won't show otherwise due to GitHub magic):

diff --git a/airflow/contrib/hooks/grpc_hook.py 
b/airflow/contrib/hooks/grpc_hook.py
new file mode 100644
index 00..b260847f19
--- /dev/null
+++ b/airflow/contrib/hooks/grpc_hook.py
@@ -0,0 +1,118 @@
+# -*- coding: utf-8 -*-
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+
+import grpc
+from google import auth as google_auth
+from google.auth import jwt as google_auth_jwt
+from google.auth.transport import grpc as google_auth_transport_grpc
+from google.auth.transport import requests as google_auth_transport_requests
+
+from airflow.hooks.base_hook import BaseHook
+from airflow.exceptions import AirflowConfigException
+
+
+class GrpcHook(BaseHook):
+    """
+    General interaction with gRPC servers.
+
+    :param grpc_conn_id: The connection ID to use when fetching connection info.
+    :type grpc_conn_id: str
+    :param interceptors: a list of gRPC interceptor objects to be applied to the
+        connected gRPC channel. None by default.
+    :type interceptors: a list of gRPC interceptors based on or extending the four
+        official gRPC interceptors, e.g. UnaryUnaryClientInterceptor,
+        UnaryStreamClientInterceptor, StreamUnaryClientInterceptor,
+        StreamStreamClientInterceptor.
+    :param custom_connection_func: The customized connection function to return
+        a gRPC channel.
+    :type custom_connection_func: a Python callable that accepts the connection as
+        its only arg. Could be partial or lambda.
+    """
+
+    def __init__(self, grpc_conn_id, interceptors=None, custom_connection_func=None):
+        self.grpc_conn_id = grpc_conn_id
+        self.conn = self.get_connection(self.grpc_conn_id)
+        self.extras = self.conn.extra_dejson
+        self.interceptors = interceptors if interceptors else []
+        self.custom_connection_func = custom_connection_func
+
+    def get_conn(self):
+        if "://" in self.conn.host:
+            base_url = self.conn.host
+        else:
+            # schema defaults to HTTP
+            schema = self.conn.schema if self.conn.schema else "http"
+            base_url = schema + "://" + self.conn.host
+
+        if self.conn.port:
+            base_url = base_url + ":" + str(self.conn.port) + "/"
+
+        auth_type = self._get_field("auth_type")
+
+        if auth_type == "NO_AUTH":
+            channel = grpc.insecure_channel(base_url)
+        elif auth_type == "SSL" or auth_type == "TLS":
+            credential_file_name = self._get_field("credential_pem_file")
+            creds = grpc.ssl_channel_credentials(open(credential_file_name).read())
+            channel = grpc.secure_channel(base_url, creds)
+        elif auth_type == "JWT_GOOGLE":
+            credentials, _ = google_auth.default()
+            jwt_creds = google_auth_jwt.OnDemandCredentials.from_signing_credentials(
+                credentials)
+            channel = google_auth_transport_grpc.secure_authorized_channel(
+                jwt_creds, None, base_url)
+        elif auth_type == "OATH_GOOGLE":
+            scopes = self._get_field("scopes").split(",")
+            credentials, _ = google_auth.default(scopes=scopes)
+            request = google_auth_transport_requests.Request()
+            channel = google_auth_transport_grpc.secure_authorized_channel(
+                credentials, request, base_url)
+        elif auth_type == "CUSTOM":
+            if not self.custom_connection_func:
+                raise AirflowConfigException(
+                    "Customized connection function not set, not able to "
+                    "establish a channel")
+            channel = self.custom_connection_func(self.conn)
+
+        if self.interceptors:
+            for interceptor in self.interceptors:
+                channel = grpc.intercept_channel(channel, interceptor)
+
+

[jira] [Commented] (AIRFLOW-3272) Create gRPC hook for creating generic grpc connection

2018-11-05 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-3272?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16675814#comment-16675814
 ] 

ASF GitHub Bot commented on AIRFLOW-3272:
-

morgendave opened a new pull request #4101: [AIRFLOW-3272] Add base grpc hook
URL: https://github.com/apache/incubator-airflow/pull/4101
 
 
   Make sure you have checked all steps below.
   
   Jira
 My PR addresses the following Airflow Jira issues and references them in 
the PR title. For example, "[AIRFLOW-3272] My Airflow PR"
   https://issues.apache.org/jira/browse/AIRFLOW-3272
   Description
 Add support for gRPC connections in Airflow. 
   
   In Airflow there are use cases of calling gRPC services, so instead of creating 
the channel each time in a PythonOperator, there should be a basic GrpcHook 
to take care of it. The hook needs to take care of the authentication.
   
   Tests
 My PR adds the following unit tests OR does not need testing for this 
extremely good reason:
   Commits
 My commits all reference Jira issues in their subject lines, and I have 
squashed multiple commits if they address the same issue. In addition, my 
commits follow the guidelines from "How to write a good git commit message":
   Subject is separated from body by a blank line
   Subject is limited to 50 characters (not including Jira issue reference)
   Subject does not end with a period
   Subject uses the imperative mood ("add", not "adding")
   Body wraps at 72 characters
   Body explains "what" and "why", not "how"
   Documentation
 In case of new functionality, my PR adds documentation that describes how 
to use it.
   When adding new operators/hooks/sensors, the autoclass documentation 
generation needs to be added.
   Code Quality
 Passes flake8


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


> Create gRPC hook for creating generic grpc connection
> -
>
> Key: AIRFLOW-3272
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3272
> Project: Apache Airflow
>  Issue Type: Improvement
>Reporter: Zhiwei Zhao
>Assignee: Zhiwei Zhao
>Priority: Minor
>
> Add support for gRPC connections in Airflow. 
> In Airflow there are use cases of calling gRPC services, so instead of creating 
> the channel each time in a PythonOperator, there should be a basic GrpcHook 
> to take care of it. The hook needs to take care of the authentication.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] morgendave opened a new pull request #4101: [AIRFLOW-3272] Add base grpc hook

2018-11-05 Thread GitBox
morgendave opened a new pull request #4101: [AIRFLOW-3272] Add base grpc hook
URL: https://github.com/apache/incubator-airflow/pull/4101
 
 
   Make sure you have checked all steps below.
   
   Jira
 My PR addresses the following Airflow Jira issues and references them in 
the PR title. For example, "[AIRFLOW-3272] My Airflow PR"
   https://issues.apache.org/jira/browse/AIRFLOW-3272
   Description
 Add support for gRPC connections in Airflow. 
   
   In Airflow there are use cases of calling gRPC services, so instead of creating 
the channel each time in a PythonOperator, there should be a basic GrpcHook 
to take care of it. The hook needs to take care of the authentication.
   
   Tests
 My PR adds the following unit tests OR does not need testing for this 
extremely good reason:
   Commits
 My commits all reference Jira issues in their subject lines, and I have 
squashed multiple commits if they address the same issue. In addition, my 
commits follow the guidelines from "How to write a good git commit message":
   Subject is separated from body by a blank line
   Subject is limited to 50 characters (not including Jira issue reference)
   Subject does not end with a period
   Subject uses the imperative mood ("add", not "adding")
   Body wraps at 72 characters
   Body explains "what" and "why", not "how"
   Documentation
 In case of new functionality, my PR adds documentation that describes how 
to use it.
   When adding new operators/hooks/sensors, the autoclass documentation 
generation needs to be added.
   Code Quality
 Passes flake8


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] morgendave closed pull request #4101: [AIRFLOW-3272] Add base grpc hook

2018-11-05 Thread GitBox
morgendave closed pull request #4101: [AIRFLOW-3272] Add base grpc hook
URL: https://github.com/apache/incubator-airflow/pull/4101
 
 
   

This is a PR merged from a forked repository.
As GitHub hides the original diff on merge, it is displayed below for
the sake of provenance:

As this is a foreign pull request (from a fork), the diff is supplied
below (as it won't show otherwise due to GitHub magic):

diff --git a/airflow/contrib/hooks/grpc_hook.py 
b/airflow/contrib/hooks/grpc_hook.py
new file mode 100644
index 00..b260847f19
--- /dev/null
+++ b/airflow/contrib/hooks/grpc_hook.py
@@ -0,0 +1,118 @@
+# -*- coding: utf-8 -*-
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+
+import grpc
+from google import auth as google_auth
+from google.auth import jwt as google_auth_jwt
+from google.auth.transport import grpc as google_auth_transport_grpc
+from google.auth.transport import requests as google_auth_transport_requests
+
+from airflow.hooks.base_hook import BaseHook
+from airflow.exceptions import AirflowConfigException
+
+
+class GrpcHook(BaseHook):
+    """
+    General interaction with gRPC servers.
+
+    :param grpc_conn_id: The connection ID to use when fetching connection info.
+    :type grpc_conn_id: str
+    :param interceptors: a list of gRPC interceptor objects to be applied to the
+        connected gRPC channel. None by default.
+    :type interceptors: a list of gRPC interceptors based on or extending the four
+        official gRPC interceptors, e.g. UnaryUnaryClientInterceptor,
+        UnaryStreamClientInterceptor, StreamUnaryClientInterceptor,
+        StreamStreamClientInterceptor.
+    :param custom_connection_func: The customized connection function to return
+        a gRPC channel.
+    :type custom_connection_func: a Python callable that accepts the connection as
+        its only arg. Could be partial or lambda.
+    """
+
+    def __init__(self, grpc_conn_id, interceptors=None, custom_connection_func=None):
+        self.grpc_conn_id = grpc_conn_id
+        self.conn = self.get_connection(self.grpc_conn_id)
+        self.extras = self.conn.extra_dejson
+        self.interceptors = interceptors if interceptors else []
+        self.custom_connection_func = custom_connection_func
+
+    def get_conn(self):
+        if "://" in self.conn.host:
+            base_url = self.conn.host
+        else:
+            # schema defaults to HTTP
+            schema = self.conn.schema if self.conn.schema else "http"
+            base_url = schema + "://" + self.conn.host
+
+        if self.conn.port:
+            base_url = base_url + ":" + str(self.conn.port) + "/"
+
+        auth_type = self._get_field("auth_type")
+
+        if auth_type == "NO_AUTH":
+            channel = grpc.insecure_channel(base_url)
+        elif auth_type == "SSL" or auth_type == "TLS":
+            credential_file_name = self._get_field("credential_pem_file")
+            creds = grpc.ssl_channel_credentials(open(credential_file_name).read())
+            channel = grpc.secure_channel(base_url, creds)
+        elif auth_type == "JWT_GOOGLE":
+            credentials, _ = google_auth.default()
+            jwt_creds = google_auth_jwt.OnDemandCredentials.from_signing_credentials(
+                credentials)
+            channel = google_auth_transport_grpc.secure_authorized_channel(
+                jwt_creds, None, base_url)
+        elif auth_type == "OATH_GOOGLE":
+            scopes = self._get_field("scopes").split(",")
+            credentials, _ = google_auth.default(scopes=scopes)
+            request = google_auth_transport_requests.Request()
+            channel = google_auth_transport_grpc.secure_authorized_channel(
+                credentials, request, base_url)
+        elif auth_type == "CUSTOM":
+            if not self.custom_connection_func:
+                raise AirflowConfigException(
+                    "Customized connection function not set, not able to "
+                    "establish a channel")
+            channel = self.custom_connection_func(self.conn)
+
+        if self.interceptors:
+            for interceptor in self.interceptors:
+                channel = grpc.intercept_channel(channel, interceptor)
+
+        return channel
+
+    def run(self, stub_class, call_func, streaming=False, data={}):
+        with self.get_conn() as channel:
+            stub = stub_class(channel)
+            try:
+                response = stub.call_func(**data)
+

[GitHub] odracci commented on issue #3770: [AIRFLOW-3281] Fix Kubernetes operator with git-sync

2018-11-05 Thread GitBox
odracci commented on issue #3770: [AIRFLOW-3281] Fix Kubernetes operator with 
git-sync
URL: 
https://github.com/apache/incubator-airflow/pull/3770#issuecomment-436051806
 
 
   @Fokko rebased


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] odracci commented on a change in pull request #3770: [AIRFLOW-3281] Fix Kubernetes operator with git-sync

2018-11-05 Thread GitBox
odracci commented on a change in pull request #3770: [AIRFLOW-3281] Fix 
Kubernetes operator with git-sync
URL: https://github.com/apache/incubator-airflow/pull/3770#discussion_r230929668
 
 

 ##
 File path: scripts/ci/kubernetes/kube/deploy.sh
 ##
 @@ -22,16 +22,130 @@ set -x
 IMAGE=${1:-airflow/ci}
 TAG=${2:-latest}
 DIRNAME=$(cd "$(dirname "$0")"; pwd)
+TEMPLATE_DIRNAME=${DIRNAME}/templates
+BUILD_DIRNAME=${DIRNAME}/build
+
+usage() {
+cat << EOF
+  usage: $0 options
+  OPTIONS:
+-d Use PersistentVolume or GitSync for dags_folder. Available options are "persistent" or "git"
 
 Review comment:
   Fixed


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] odracci commented on a change in pull request #3770: [AIRFLOW-3281] Fix Kubernetes operator with git-sync

2018-11-05 Thread GitBox
odracci commented on a change in pull request #3770: [AIRFLOW-3281] Fix 
Kubernetes operator with git-sync
URL: https://github.com/apache/incubator-airflow/pull/3770#discussion_r230929468
 
 

 ##
 File path: airflow/contrib/executors/kubernetes_executor.py
 ##
 @@ -197,10 +209,15 @@ def __init__(self):
         self._validate()
 
     def _validate(self):
-        if not self.dags_volume_claim and (not self.git_repo or not self.git_branch):
+        # TODO: use XOR for dags_volume_claim and git_dags_folder_mount_point
+        if not self.dags_volume_claim \
+           and (not self.git_repo or not self.git_branch
+                or not self.git_dags_folder_mount_point) \
+           and not self.dags_volume_host:
 
 Review comment:
   fixed


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] cbandy commented on a change in pull request #3770: [AIRFLOW-3281] Fix Kubernetes operator with git-sync

2018-11-05 Thread GitBox
cbandy commented on a change in pull request #3770: [AIRFLOW-3281] Fix 
Kubernetes operator with git-sync
URL: https://github.com/apache/incubator-airflow/pull/3770#discussion_r230925125
 
 

 ##
 File path: airflow/contrib/executors/kubernetes_executor.py
 ##
 @@ -197,10 +209,15 @@ def __init__(self):
         self._validate()
 
     def _validate(self):
-        if not self.dags_volume_claim and (not self.git_repo or not self.git_branch):
+        # TODO: use XOR for dags_volume_claim and git_dags_folder_mount_point
+        if not self.dags_volume_claim \
+           and (not self.git_repo or not self.git_branch
+                or not self.git_dags_folder_mount_point) \
+           and not self.dags_volume_host:
 
 Review comment:
   Can this condition look like the error message? e.g.
   
   ```python
   if not self.dags_volume_claim \
      and not self.dags_volume_host \
      and not (self.git_repo and self.git_branch and self.git_dags_folder_mount_point):
   ```


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] BasPH commented on issue #4101: [AIRFLOW-3272] Add base grpc hook

2018-11-05 Thread GitBox
BasPH commented on issue #4101: [AIRFLOW-3272] Add base grpc hook
URL: 
https://github.com/apache/incubator-airflow/pull/4101#issuecomment-436046161
 
 
   At the bottom of your PR, click “close pull request”, and then re-open it to 
trigger the build.
   
   > On 5 Nov 2018, at 20:59, morgendave  wrote:
   > 
   > New test failures seem to be irrelevant to this PR, how can I re-trigger 
the build?
   
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] morgendave commented on issue #4101: [AIRFLOW-3272] Add base grpc hook

2018-11-05 Thread GitBox
morgendave commented on issue #4101: [AIRFLOW-3272] Add base grpc hook
URL: 
https://github.com/apache/incubator-airflow/pull/4101#issuecomment-436014689
 
 
   New test failures seem to be irrelevant to this PR, how can I re-trigger the 
build?  


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Commented] (AIRFLOW-2192) Don't authenticate on Google Authentication

2018-11-05 Thread Fokko Driesprong (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-2192?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16675696#comment-16675696
 ] 

Fokko Driesprong commented on AIRFLOW-2192:
---

Ah check, thanks! 

> Don't authenticate on Google Authentication
> ---
>
> Key: AIRFLOW-2192
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2192
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: authentication
>Affects Versions: 1.8.0
> Environment: OS: Amazon Linux AMI release 2017.09
> RAM: 30.5
> CPU: 4
> Amazon Instance Type: R4.xlarge
> Python: 2.7.13
>Reporter: Fernando Ike
>Assignee: holdenk
>Priority: Critical
> Fix For: 1.10.1
>
> Attachments: airflow.log
>
>
> It's weird: I tried to log in using Google Authentication and Airflow 
> returned "_UnicodeEncodeError: 'latin-1' codec can't encode character 
> u'\u200b' in position 8: ordinal not in range(256)_". 
> My Google profile was:
> _First Name: Fernando_
> _Last Name: Ike_
> I changed the "First Name" in my profile to just "_Ike_" and now I can log 
> in. The related log is in the attachment:



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] Fokko commented on issue #4133: [AIRFLOW-3270] Allow passwordless-binding for LDAP auth backend

2018-11-05 Thread GitBox
Fokko commented on issue #4133: [AIRFLOW-3270] Allow passwordless-binding for 
LDAP auth backend
URL: 
https://github.com/apache/incubator-airflow/pull/4133#issuecomment-436011338
 
 
   @ashb Would it be possible to test this?


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] Fokko commented on a change in pull request #4121: [AIRFLOW-2568] Azure Container Instances operator

2018-11-05 Thread GitBox
Fokko commented on a change in pull request #4121: [AIRFLOW-2568] Azure 
Container Instances operator
URL: https://github.com/apache/incubator-airflow/pull/4121#discussion_r230886597
 
 

 ##
 File path: airflow/contrib/operators/azure_container_instances_operator.py
 ##
 @@ -0,0 +1,233 @@
+# -*- coding: utf-8 -*-
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+from time import sleep
+
+from airflow.contrib.hooks.azure_container_hook import (AzureContainerInstanceHook,
+                                                        AzureContainerRegistryHook,
+                                                        AzureContainerVolumeHook)
+from airflow.exceptions import AirflowException, AirflowTaskTimeout
+from airflow.models import BaseOperator
+
+from azure.mgmt.containerinstance.models import (EnvironmentVariable,
+                                                 VolumeMount,
+                                                 ResourceRequests,
+                                                 ResourceRequirements,
+                                                 Container,
+                                                 ContainerGroup)
+from msrestazure.azure_exceptions import CloudError
+
+
+class AzureContainerInstancesOperator(BaseOperator):
+    """
+    Start a container on Azure Container Instances
+
+    :param ci_conn_id: connection id of a service principal which will be used
+        to start the container instance
+    :type ci_conn_id: str
+    :param registry_conn_id: connection id of a user which can login to a
+        private docker registry. If None, we assume a public registry
+    :type registry_conn_id: str
+    :param resource_group: name of the resource group wherein this container
+        instance should be started
+    :type resource_group: str
+    :param name: name of this container instance. Please note this name has
+        to be unique in order to run containers in parallel.
+    :type name: str
+    :param image: the docker image to be used
+    :type image: str
+    :param region: the region wherein this container instance should be started
+    :type region: str
+    :param environment_variables: key,value pairs containing environment variables
+        which will be passed to the running container
+    :type environment_variables: dict
+    :param volumes: list of volumes to be mounted to the container.
+        Currently only Azure Fileshares are supported.
+    :type volumes: list
+    :param memory_in_gb: the amount of memory to allocate to this container
+    :type memory_in_gb: double
+    :param cpu: the number of cpus to allocate to this container
+    :type cpu: double
+
+    :Example:
+
+    >>> a = AzureContainerInstancesOperator(
+            'azure_service_principal',
+            'azure_registry_user',
+            'my-resource-group',
+            'my-container-name-{{ ds }}',
+            'myprivateregistry.azurecr.io/my_container:latest',
+            'westeurope',
+            {'EXECUTION_DATE': '{{ ds }}'},
+            [('azure_wasb_conn_id',
+              'my_storage_container',
+              'my_fileshare',
+              '/input-data',
+              True), ],
+            memory_in_gb=14.0,
+            cpu=4.0,
+            task_id='start_container'
+        )
+    """
+
+    template_fields = ('name', 'environment_variables')
+    template_ext = tuple()
+
 
 Review comment:
   Can you add `@apply_defaults`, as in: 
https://github.com/apache/incubator-airflow/blob/master/airflow/operators/oracle_operator.py#L46
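   For context, `@apply_defaults` lets an operator fall back to the DAG's 
   `default_args` for any constructor arguments that are not passed explicitly. 
   A minimal sketch of the suggestion, with the constructor abbreviated to a few 
   of the arguments from the diff above:
   
   ```python
   from airflow.models import BaseOperator
   from airflow.utils.decorators import apply_defaults
   
   
   class AzureContainerInstancesOperator(BaseOperator):
   
       @apply_defaults  # merges the DAG's default_args into missing kwargs
       def __init__(self, ci_conn_id, registry_conn_id, resource_group,
                    name, image, region, *args, **kwargs):
           super(AzureContainerInstancesOperator, self).__init__(*args, **kwargs)
           self.ci_conn_id = ci_conn_id
           self.registry_conn_id = registry_conn_id
           self.resource_group = resource_group
           self.name = name
           self.image = image
           self.region = region
   ```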


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] Fokko commented on a change in pull request #4121: [AIRFLOW-2568] Azure Container Instances operator

2018-11-05 Thread GitBox
Fokko commented on a change in pull request #4121: [AIRFLOW-2568] Azure 
Container Instances operator
URL: https://github.com/apache/incubator-airflow/pull/4121#discussion_r230887240
 
 

 ##
 File path: airflow/contrib/operators/azure_container_instances_operator.py
 ##
 @@ -0,0 +1,233 @@
+# -*- coding: utf-8 -*-
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+from time import sleep
+
+from airflow.contrib.hooks.azure_container_hook import (AzureContainerInstanceHook,
+                                                        AzureContainerRegistryHook,
+                                                        AzureContainerVolumeHook)
+from airflow.exceptions import AirflowException, AirflowTaskTimeout
+from airflow.models import BaseOperator
+
+from azure.mgmt.containerinstance.models import (EnvironmentVariable,
+ VolumeMount,
+ ResourceRequests,
+ ResourceRequirements,
+ Container,
+ ContainerGroup)
+from msrestazure.azure_exceptions import CloudError
+
+
+class AzureContainerInstancesOperator(BaseOperator):
+"""
+Start a container on Azure Container Instances
+
+:param ci_conn_id: connection id of a service principal which will be used
+to start the container instance
+:type ci_conn_id: str
+:param registry_conn_id: connection id of a user which can login to a
+private docker registry. If None, we assume a public registry
+:type registry_conn_id: str
+:param resource_group: name of the resource group wherein this container
+instance should be started
+:type resource_group: str
+:param name: name of this container instance. Please note this name has
+to be unique in order to run containers in parallel.
+:type name: str
+:param image: the docker image to be used
+:type image: str
+:param region: the region wherein this container instance should be started
+:type region: str
+:param: environment_variables: key,value pairs containing environment 
variables
+which will be passed to the running container
+:type: environment_variables: dict
+:param: volumes: list of volumes to be mounted to the container.
+Currently only Azure Fileshares are supported.
+:type: volumes: list[]
+:param: memory_in_gb: the amount of memory to allocate to this container
+:type: memory_in_gb: double
+:param: cpu: the number of cpus to allocate to this container
+:type: cpu: double
+
+:Example:
+
+>>>  a = AzureContainerInstancesOperator(
+'azure_service_principal',
+'azure_registry_user',
+'my-resource-group',
+'my-container-name-{{ ds }}',
+'myprivateregistry.azurecr.io/my_container:latest',
+'westeurope',
+{'EXECUTION_DATE': '{{ ds }}'},
+[('azure_wasb_conn_id',
+  'my_storage_container',
+  'my_fileshare',
+  '/input-data',
+  True),],
+memory_in_gb=14.0,
+cpu=4.0,
+task_id='start_container'
+)
+"""
+
+template_fields = ('name', 'environment_variables')
+template_ext = tuple()
+
+def __init__(self, ci_conn_id, registry_conn_id, resource_group, name, image, region,
+             environment_variables={}, volumes=[], memory_in_gb=2.0, cpu=1.0,
 
 Review comment:
   Can you use `None` for the default args, as explained in: 
https://graysonkoonce.com/always-use-none-for-default-args-in-python/
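   The linked post is about Python's mutable default arguments: a `{}` or `[]` 
   default is created once, at function definition, and shared by every call. 
   A sketch of the suggested change, assuming the rest of the constructor stays 
   as in the diff:
   
   ```python
   from airflow.models import BaseOperator
   
   
   class AzureContainerInstancesOperator(BaseOperator):
   
       def __init__(self, ci_conn_id, registry_conn_id, resource_group, name,
                    image, region, environment_variables=None, volumes=None,
                    memory_in_gb=2.0, cpu=1.0, *args, **kwargs):
           super(AzureContainerInstancesOperator, self).__init__(*args, **kwargs)
           # replace None with a fresh object so instances never share state
           self.environment_variables = environment_variables or {}
           self.volumes = volumes or []
           self.memory_in_gb = memory_in_gb
           self.cpu = cpu
   ```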


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] Fokko commented on a change in pull request #4121: [AIRFLOW-2568] Azure Container Instances operator

2018-11-05 Thread GitBox
Fokko commented on a change in pull request #4121: [AIRFLOW-2568] Azure 
Container Instances operator
URL: https://github.com/apache/incubator-airflow/pull/4121#discussion_r230887303
 
 

 ##
 File path: airflow/contrib/operators/azure_container_instances_operator.py
 ##
 @@ -0,0 +1,233 @@
+# -*- coding: utf-8 -*-
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+from time import sleep
+
+from airflow.contrib.hooks.azure_container_hook import (AzureContainerInstanceHook,
+                                                        AzureContainerRegistryHook,
+                                                        AzureContainerVolumeHook)
+from airflow.exceptions import AirflowException, AirflowTaskTimeout
+from airflow.models import BaseOperator
+
+from azure.mgmt.containerinstance.models import (EnvironmentVariable,
+ VolumeMount,
+ ResourceRequests,
+ ResourceRequirements,
+ Container,
+ ContainerGroup)
+from msrestazure.azure_exceptions import CloudError
+
+
+class AzureContainerInstancesOperator(BaseOperator):
+"""
+Start a container on Azure Container Instances
+
+:param ci_conn_id: connection id of a service principal which will be used
+to start the container instance
+:type ci_conn_id: str
+:param registry_conn_id: connection id of a user which can login to a
+private docker registry. If None, we assume a public registry
+:type registry_conn_id: str
+:param resource_group: name of the resource group wherein this container
+instance should be started
+:type resource_group: str
+:param name: name of this container instance. Please note this name has
+to be unique in order to run containers in parallel.
+:type name: str
+:param image: the docker image to be used
+:type image: str
+:param region: the region wherein this container instance should be started
+:type region: str
+:param: environment_variables: key,value pairs containing environment 
variables
+which will be passed to the running container
+:type: environment_variables: dict
+:param: volumes: list of volumes to be mounted to the container.
+Currently only Azure Fileshares are supported.
+:type: volumes: list[]
+:param: memory_in_gb: the amount of memory to allocate to this container
+:type: memory_in_gb: double
+:param: cpu: the number of cpus to allocate to this container
+:type: cpu: double
+
+:Example:
+
+>>>  a = AzureContainerInstancesOperator(
+'azure_service_principal',
+'azure_registry_user',
+'my-resource-group',
+'my-container-name-{{ ds }}',
+'myprivateregistry.azurecr.io/my_container:latest',
+'westeurope',
+{'EXECUTION_DATE': '{{ ds }}'},
+[('azure_wasb_conn_id',
+  'my_storage_container',
+  'my_fileshare',
+  '/input-data',
+  True),],
+memory_in_gb=14.0,
+cpu=4.0,
+task_id='start_container'
+)
+"""
+
+template_fields = ('name', 'environment_variables')
+template_ext = tuple()
+
+def __init__(self, ci_conn_id, registry_conn_id, resource_group, name, image, region,
+             environment_variables={}, volumes=[], memory_in_gb=2.0, cpu=1.0,
+             *args, **kwargs):
+self.ci_conn_id = ci_conn_id
+self.resource_group = resource_group
+self.name = name
+self.image = image
+self.region = region
+self.registry_conn_id = registry_conn_id
+self.environment_variables = environment_variables
+self.volumes = volumes
+self.memory_in_gb = memory_in_gb
+self.cpu = cpu
+
+super(AzureContainerInstancesOperator, self).__init__(*args, **kwargs)
 
 Review comment:
   Can we apply `super()` first?
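   That is, run `BaseOperator.__init__` (which wires up `task_id`, the DAG, and 
   the rest of the common arguments) before the subclass assigns its own 
   attributes. A sketch of the reordering:
   
   ```python
   from airflow.models import BaseOperator
   
   
   class AzureContainerInstancesOperator(BaseOperator):
   
       def __init__(self, ci_conn_id, registry_conn_id, resource_group, name,
                    image, region, *args, **kwargs):
           # initialize BaseOperator first, then set the subclass attributes
           super(AzureContainerInstancesOperator, self).__init__(*args, **kwargs)
           self.ci_conn_id = ci_conn_id
           self.registry_conn_id = registry_conn_id
           self.resource_group = resource_group
           self.name = name
           self.image = image
           self.region = region
   ```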


[GitHub] Fokko commented on a change in pull request #4121: [AIRFLOW-2568] Azure Container Instances operator

2018-11-05 Thread GitBox
Fokko commented on a change in pull request #4121: [AIRFLOW-2568] Azure 
Container Instances operator
URL: https://github.com/apache/incubator-airflow/pull/4121#discussion_r230887887
 
 

 ##
 File path: airflow/contrib/hooks/azure_container_hook.py
 ##
 @@ -0,0 +1,129 @@
+
+# -*- coding: utf-8 -*-
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+
+import os
+
+from airflow.hooks.base_hook import BaseHook
+from airflow.exceptions import AirflowException
+
+from azure.common.client_factory import get_client_from_auth_file
+from azure.common.credentials import ServicePrincipalCredentials
+
+from azure.mgmt.containerinstance import ContainerInstanceManagementClient
+from azure.mgmt.containerinstance.models import (ImageRegistryCredential,
+ Volume,
+ AzureFileVolume)
+
+
+class AzureContainerInstanceHook(BaseHook):
+
+def __init__(self, conn_id='azure_default'):
+self.conn_id = conn_id
+self.connection = self.get_conn()
+
+def get_conn(self):
+conn = self.get_connection(self.conn_id)
+key_path = conn.extra_dejson.get('key_path', False)
+if key_path:
+if key_path.endswith('.json'):
+self.log.info('Getting connection using a JSON key file.')
+return get_client_from_auth_file(ContainerInstanceManagementClient,
+                                 key_path)
+else:
+raise AirflowException('Unrecognised extension for key file.')
+
+if os.environ.get('AZURE_AUTH_LOCATION'):
+key_path = os.environ.get('AZURE_AUTH_LOCATION')
+if key_path.endswith('.json'):
+self.log.info('Getting connection using a JSON key file.')
+return get_client_from_auth_file(ContainerInstanceManagementClient,
+                                 key_path)
+else:
+raise AirflowException('Unrecognised extension for key file.')
+
+credentials = ServicePrincipalCredentials(
+client_id=conn.login,
+secret=conn.password,
+tenant=conn.extra_dejson['tenantId']
+)
+
+subscription_id = conn.extra_dejson['subscriptionId']
+return ContainerInstanceManagementClient(credentials, str(subscription_id))
+
+def create_or_update(self, resource_group, name, container_group):
+self.connection.container_groups.create_or_update(resource_group,
+  name,
+  container_group)
+
+def get_state_exitcode(self, resource_group, name):
+response = self.connection.container_groups.get(resource_group,
+                                                name,
+                                                raw=True).response.json()
+containers = response['properties']['containers']
+instance_view = containers[0]['properties'].get('instanceView', {})
+current_state = instance_view.get('currentState', {})
+
+return current_state.get('state'), current_state.get('exitCode', 0)
+
+def get_messages(self, resource_group, name):
+response = self.connection.container_groups.get(resource_group,
+                                                name,
+                                                raw=True).response.json()
+containers = response['properties']['containers']
+instance_view = containers[0]['properties'].get('instanceView', {})
+
+return [event['message'] for event in instance_view.get('events', [])]
+
+def get_logs(self, resource_group, name, tail=1000):
+logs = self.connection.container.list_logs(resource_group, name, name, tail=tail)
+return logs.content.splitlines(True)
+
+def delete(self, resource_group, name):
+self.connection.container_groups.delete(resource_group, name)
+
+
+class AzureContainerRegistryHook(BaseHook):
 
 Review comment:
   Can we split these into separate files, in case these classes grow in the future?
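   A hypothetical layout for such a split (module names illustrative, not from 
   the PR):
   
   ```
   airflow/contrib/hooks/azure_container_instance_hook.py  # AzureContainerInstanceHook
   airflow/contrib/hooks/azure_container_registry_hook.py  # AzureContainerRegistryHook
   airflow/contrib/hooks/azure_container_volume_hook.py    # AzureContainerVolumeHook
   ```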


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact 

[GitHub] Fokko commented on a change in pull request #4121: [AIRFLOW-2568] Azure Container Instances operator

2018-11-05 Thread GitBox
Fokko commented on a change in pull request #4121: [AIRFLOW-2568] Azure 
Container Instances operator
URL: https://github.com/apache/incubator-airflow/pull/4121#discussion_r230887912
 
 

 ##
 File path: airflow/contrib/hooks/azure_container_hook.py
 ##
 @@ -0,0 +1,129 @@
+
+# -*- coding: utf-8 -*-
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+
+import os
+
+from airflow.hooks.base_hook import BaseHook
+from airflow.exceptions import AirflowException
+
+from azure.common.client_factory import get_client_from_auth_file
+from azure.common.credentials import ServicePrincipalCredentials
+
+from azure.mgmt.containerinstance import ContainerInstanceManagementClient
+from azure.mgmt.containerinstance.models import (ImageRegistryCredential,
+ Volume,
+ AzureFileVolume)
+
+
+class AzureContainerInstanceHook(BaseHook):
+
+def __init__(self, conn_id='azure_default'):
+self.conn_id = conn_id
+self.connection = self.get_conn()
+
+def get_conn(self):
+conn = self.get_connection(self.conn_id)
+key_path = conn.extra_dejson.get('key_path', False)
+if key_path:
+if key_path.endswith('.json'):
+self.log.info('Getting connection using a JSON key file.')
+return get_client_from_auth_file(ContainerInstanceManagementClient,
+                                 key_path)
+else:
+raise AirflowException('Unrecognised extension for key file.')
+
+if os.environ.get('AZURE_AUTH_LOCATION'):
+key_path = os.environ.get('AZURE_AUTH_LOCATION')
+if key_path.endswith('.json'):
+self.log.info('Getting connection using a JSON key file.')
+return get_client_from_auth_file(ContainerInstanceManagementClient,
+                                 key_path)
+else:
+raise AirflowException('Unrecognised extension for key file.')
+
+credentials = ServicePrincipalCredentials(
+client_id=conn.login,
+secret=conn.password,
+tenant=conn.extra_dejson['tenantId']
+)
+
+subscription_id = conn.extra_dejson['subscriptionId']
+return ContainerInstanceManagementClient(credentials, str(subscription_id))
+
+def create_or_update(self, resource_group, name, container_group):
+self.connection.container_groups.create_or_update(resource_group,
+  name,
+  container_group)
+
+def get_state_exitcode(self, resource_group, name):
+response = self.connection.container_groups.get(resource_group,
+                                                name,
+                                                raw=True).response.json()
+containers = response['properties']['containers']
+instance_view = containers[0]['properties'].get('instanceView', {})
+current_state = instance_view.get('currentState', {})
+
+return current_state.get('state'), current_state.get('exitCode', 0)
+
+def get_messages(self, resource_group, name):
+response = self.connection.container_groups.get(resource_group,
+                                                name,
+                                                raw=True).response.json()
+containers = response['properties']['containers']
+instance_view = containers[0]['properties'].get('instanceView', {})
+
+return [event['message'] for event in instance_view.get('events', [])]
+
+def get_logs(self, resource_group, name, tail=1000):
+logs = self.connection.container.list_logs(resource_group, name, name, tail=tail)
+return logs.content.splitlines(True)
+
+def delete(self, resource_group, name):
+self.connection.container_groups.delete(resource_group, name)
+
+
+class AzureContainerRegistryHook(BaseHook):
+
+def __init__(self, conn_id='azure_registry'):
+self.conn_id = conn_id
+self.connection = self.get_conn()
+
+def get_conn(self):
+conn = self.get_connection(self.conn_id)
+return ImageRegistryCredential(conn.host, conn.login, conn.password)
+
+
+class AzureContainerVolumeHook(BaseHook):
 
 Review comment:
   Can we split these into separate files, in case these classes grow in the future?

[jira] [Comment Edited] (AIRFLOW-2192) Don't authenticate on Google Authentication

2018-11-05 Thread holdenk (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-2192?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16675691#comment-16675691
 ] 

holdenk edited comment on AIRFLOW-2192 at 11/5/18 7:47 PM:
---

[~Fokko] Looking up the specific character the report flags as the issue, it's a 
"zero width space" (U+200B), so it could either be hiding in the original string 
or being used as a separator between first/last name automatically by something 
else.
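A minimal Python 2 reproduction of the failure, plus one possible sanitization, 
assuming the name string carries U+200B where the error message says it does:

{code:python}
# -*- coding: utf-8 -*-
name = u'Fernando\u200b Ike'  # zero width space hiding at index 8

try:
    name.encode('latin-1')
except UnicodeEncodeError as exc:
    # 'latin-1' codec can't encode character u'\u200b' in position 8:
    # ordinal not in range(256)
    print(exc)

# one possible fix: strip zero-width characters before encoding
cleaned = name.replace(u'\u200b', u'')
cleaned.encode('latin-1')  # now succeeds
{code}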


was (Author: holdenk):
[~Fokko]So looking up the specific character the user reports as the issue it's 
"zero width whitespace", so it could either be hiding in the original string or 
being used as a separator between first/last name automatically by something 
else.

> Don't authenticate on Google Authentication
> ---
>
> Key: AIRFLOW-2192
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2192
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: authentication
>Affects Versions: 1.8.0
> Environment: OS: Amazon Linux AMI release 2017.09
> RAM: 30.5 GiB
> CPU: 4
> Amazon Instance Type: R4.xlarge
> Python: 2.7.13
>Reporter: Fernando Ike
>Assignee: holdenk
>Priority: Critical
> Fix For: 1.10.1
>
> Attachments: airflow.log
>
>
> It's weird: I tried to log in using Google Authentication and Airflow 
> returned "_UnicodeEncodeError: 'latin-1' codec can't encode character 
> u'\u200b' in position 8: ordinal not in range(256)_". 
> My Google profile was:
> _First Name: Fernando_
> _Last Name: Ike_
> I changed the "First Name" in my profile to just "_Ike_" and now I can 
> log in. The related log is in the attachment:



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (AIRFLOW-2192) Don't authenticate on Google Authentication

2018-11-05 Thread holdenk (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-2192?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16675691#comment-16675691
 ] 

holdenk commented on AIRFLOW-2192:
--

[~Fokko] Looking up the specific character the user reports as the issue, it's a 
"zero width space" (U+200B), so it could either be hiding in the original string 
or being used as a separator between first/last name automatically by something 
else.

> Don't authenticate on Google Authentication
> ---
>
> Key: AIRFLOW-2192
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2192
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: authentication
>Affects Versions: 1.8.0
> Environment: OS: Amazon Linux AMI release 2017.09
> RAM: 30.5 GiB
> CPU: 4
> Amazon Instance Type: R4.xlarge
> Python: 2.7.13
>Reporter: Fernando Ike
>Assignee: holdenk
>Priority: Critical
> Fix For: 1.10.1
>
> Attachments: airflow.log
>
>
> It's weird: I tried to log in using Google Authentication and Airflow 
> returned "_UnicodeEncodeError: 'latin-1' codec can't encode character 
> u'\u200b' in position 8: ordinal not in range(256)_". 
> So, my Google profile was:
> _First Name: Fernando_
> _Last Name: Ike_
> I changed the "First Name" in my profile to just "_Ike_" and now I can 
> log in. The related log is in the attachment:



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] codecov-io commented on issue #4133: [AIRFLOW-3270] Allow passwordless-binding for LDAP auth backend

2018-11-05 Thread GitBox
codecov-io commented on issue #4133: [AIRFLOW-3270] Allow passwordless-binding 
for LDAP auth backend
URL: 
https://github.com/apache/incubator-airflow/pull/4133#issuecomment-436009600
 
 
   # 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4133?src=pr=h1)
 Report
   > Merging 
[#4133](https://codecov.io/gh/apache/incubator-airflow/pull/4133?src=pr=desc)
 into 
[master](https://codecov.io/gh/apache/incubator-airflow/commit/5ea0d97b494674a52c0e91e6a04d201cbecd8f86?src=pr=desc)
 will **decrease** coverage by `6.93%`.
   > The diff coverage is `n/a`.
   
   [![Impacted file tree 
graph](https://codecov.io/gh/apache/incubator-airflow/pull/4133/graphs/tree.svg?width=650=WdLKlKHOAU=150=pr)](https://codecov.io/gh/apache/incubator-airflow/pull/4133?src=pr=tree)
   
   ```diff
   @@Coverage Diff @@
   ##   master#4133  +/-   ##
   ==
   - Coverage   77.49%   70.55%   -6.94% 
   ==
 Files 199  199  
 Lines   1623318116+1883 
   ==
   + Hits1257912782 +203 
   - Misses   3654 5334+1680
   ```
   
   
   | [Impacted 
Files](https://codecov.io/gh/apache/incubator-airflow/pull/4133?src=pr=tree) 
| Coverage Δ | |
   |---|---|---|
   | 
[airflow/hooks/mysql\_hook.py](https://codecov.io/gh/apache/incubator-airflow/pull/4133/diff?src=pr=tree#diff-YWlyZmxvdy9ob29rcy9teXNxbF9ob29rLnB5)
 | `61.03% <0%> (-29.35%)` | :arrow_down: |
   | 
[airflow/operators/docker\_operator.py](https://codecov.io/gh/apache/incubator-airflow/pull/4133/diff?src=pr=tree#diff-YWlyZmxvdy9vcGVyYXRvcnMvZG9ja2VyX29wZXJhdG9yLnB5)
 | `72.17% <0%> (-25.48%)` | :arrow_down: |
   | 
[airflow/www\_rbac/views.py](https://codecov.io/gh/apache/incubator-airflow/pull/4133/diff?src=pr=tree#diff-YWlyZmxvdy93d3dfcmJhYy92aWV3cy5weQ==)
 | `51.83% <0%> (-20.55%)` | :arrow_down: |
   | 
[airflow/www/views.py](https://codecov.io/gh/apache/incubator-airflow/pull/4133/diff?src=pr=tree#diff-YWlyZmxvdy93d3cvdmlld3MucHk=)
 | `50.42% <0%> (-18.38%)` | :arrow_down: |
   | 
[airflow/sensors/external\_task\_sensor.py](https://codecov.io/gh/apache/incubator-airflow/pull/4133/diff?src=pr=tree#diff-YWlyZmxvdy9zZW5zb3JzL2V4dGVybmFsX3Rhc2tfc2Vuc29yLnB5)
 | `81.57% <0%> (-15.3%)` | :arrow_down: |
   | 
[airflow/models.py](https://codecov.io/gh/apache/incubator-airflow/pull/4133/diff?src=pr=tree#diff-YWlyZmxvdy9tb2RlbHMucHk=)
 | `78.35% <0%> (-13.81%)` | :arrow_down: |
   | 
[airflow/settings.py](https://codecov.io/gh/apache/incubator-airflow/pull/4133/diff?src=pr=tree#diff-YWlyZmxvdy9zZXR0aW5ncy5weQ==)
 | `78.18% <0%> (-2.98%)` | :arrow_down: |
   | 
[airflow/jobs.py](https://codecov.io/gh/apache/incubator-airflow/pull/4133/diff?src=pr=tree#diff-YWlyZmxvdy9qb2JzLnB5)
 | `77.09% <0%> (-0.28%)` | :arrow_down: |
   | 
[airflow/configuration.py](https://codecov.io/gh/apache/incubator-airflow/pull/4133/diff?src=pr=tree#diff-YWlyZmxvdy9jb25maWd1cmF0aW9uLnB5)
 | `89.05% <0%> (+0.36%)` | :arrow_up: |
   
   --
   
   [Continue to review full report at 
Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4133?src=pr=continue).
   > **Legend** - [Click here to learn 
more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute  (impact)`, `ø = not affected`, `? = missing data`
   > Powered by 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4133?src=pr=footer).
 Last update 
[5ea0d97...618f82d](https://codecov.io/gh/apache/incubator-airflow/pull/4133?src=pr=lastupdated).
 Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] Fokko closed pull request #4135: [AIRFLOW-3262] Update SimpleHttpOpTests to check Example.com

2018-11-05 Thread GitBox
Fokko closed pull request #4135: [AIRFLOW-3262] Update SimpleHttpOpTests to 
check Example.com
URL: https://github.com/apache/incubator-airflow/pull/4135
 
 
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Commented] (AIRFLOW-3262) Can't get log containing Response when using SimpleHttpOperator

2018-11-05 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-3262?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16675671#comment-16675671
 ] 

ASF GitHub Bot commented on AIRFLOW-3262:
-

Fokko closed pull request #4135: [AIRFLOW-3262] Update SimpleHttpOpTests to 
check Example.com
URL: https://github.com/apache/incubator-airflow/pull/4135
 
 
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


> Can't get log containing Response when using SimpleHttpOperator
> ---
>
> Key: AIRFLOW-3262
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3262
> Project: Apache Airflow
>  Issue Type: Improvement
>Reporter: Kaxil Naik
>Assignee: Kaxil Naik
>Priority: Trivial
> Fix For: 1.10.1
>
>
> When you use SimpleHttpOperator for things like Elasticsearch, you want to 
> get the response in the logs as well. Currently, the only workaround is to 
> use `xcom_push` to push the content to XCom and fetch the response in the 
> next task.
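A sketch of that workaround as it would look in a DAG file; the connection id 
and endpoint are illustrative, and `dag` is assumed to be defined elsewhere:

{code:python}
from airflow.operators.http_operator import SimpleHttpOperator
from airflow.operators.python_operator import PythonOperator

query = SimpleHttpOperator(
    task_id='query_es',
    http_conn_id='http_default',
    method='GET',
    endpoint='my-index/_search',
    xcom_push=True,  # push the response body to XCom instead of logging it
    dag=dag,
)

def log_response(**context):
    # the follow-up task pulls the response out of XCom just to log it
    print(context['ti'].xcom_pull(task_ids='query_es'))

show = PythonOperator(task_id='log_response', python_callable=log_response,
                      provide_context=True, dag=dag)
query >> show
{code}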



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] Fokko commented on a change in pull request #4134: [AIRFLOW-3213] Create ADLS to GCS operator

2018-11-05 Thread GitBox
Fokko commented on a change in pull request #4134: [AIRFLOW-3213] Create ADLS 
to GCS operator
URL: https://github.com/apache/incubator-airflow/pull/4134#discussion_r230882277
 
 

 ##
 File path: airflow/contrib/operators/adls_to_gcs.py
 ##
 @@ -0,0 +1,144 @@
+# -*- coding: utf-8 -*-
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+from airflow.contrib.hooks.azure_data_lake_hook import AzureDataLakeHook
+from airflow.contrib.operators.adls_list_operator import AzureDataLakeStorageListOperator
+from airflow.utils.decorators import apply_defaults
+from airflow.contrib.hooks.gcs_hook import GoogleCloudStorageHook, _parse_gcs_url
+
+import os
+from tempfile import NamedTemporaryFile
+
+
+class AdlsToGoogleCloudStorageOperator(AzureDataLakeStorageListOperator):
+"""
+Synchronizes an Azure Data Lake Storage path with a GCS bucket
+:param path: The Azure Data Lake path to find the objects (templated)
+:type path: str
+:param dest_gcs: The Google Cloud Storage bucket and prefix to
+store the objects. (templated)
+:type dest_gcs: str
+:param replace: If true, replaces same-named files in GCS
+:type replace: bool
+:param azure_data_lake_conn_id: The connection ID to use when
+connecting to Azure Data Lake Storage.
+:type azure_data_lake_conn_id: str
+:param dest_google_cloud_storage_conn_id: The connection ID to use when
+connecting to Google Cloud Storage.
+:type dest_google_cloud_storage_conn_id: str
+:param delegate_to: The account to impersonate, if any.
+For this to work, the service account making the request must have
+domain-wide delegation enabled.
+:type delegate_to: str
+
+**Examples**:
+The following Operator would copy a single file named
+``hello/world.avro`` from ADLS to the GCS bucket ``mybucket``. Its full
+resulting gcs path will be ``gs://mybucket/hello/world.avro`` ::
+copy_single_file = AdlsToGoogleCloudStorageOperator(
+task_id='copy_single_file',
+path='hello/world.avro',
+dest_gcs='gs://mybucket',
+replace=False,
+azure_data_lake_conn_id='azure_data_lake_default',
+google_cloud_storage_conn_id='google_cloud_default'
+)
+
+The following Operator would copy all parquet files from ADLS
+to the GCS bucket ``mybucket``. ::
+copy_all_files = AdlsToGoogleCloudStorageOperator(
+task_id='copy_all_files',
+path='*.parquet',
+dest_gcs='gs://mybucket',
+replace=False,
+azure_data_lake_conn_id='azure_data_lake_default',
+google_cloud_storage_conn_id='google_cloud_default'
+)
+
+ The following Operator would copy all parquet files from ADLS
+path ``/hello/world``to the GCS bucket ``mybucket``. ::
 
 Review comment:
   Indentation
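   For reference, the flagged lines as they would read with the stray leading 
   space removed and a space restored after the path (a guess at the intended 
   text):
   
   ```
   The following Operator would copy all parquet files from ADLS
   path ``/hello/world`` to the GCS bucket ``mybucket``. ::
   ```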


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] Fokko commented on a change in pull request #4134: [AIRFLOW-3213] Create ADLS to GCS operator

2018-11-05 Thread GitBox
Fokko commented on a change in pull request #4134: [AIRFLOW-3213] Create ADLS 
to GCS operator
URL: https://github.com/apache/incubator-airflow/pull/4134#discussion_r230882680
 
 

 ##
 File path: airflow/contrib/operators/adls_to_gcs.py
 ##
 @@ -0,0 +1,144 @@
+# -*- coding: utf-8 -*-
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+from airflow.contrib.hooks.azure_data_lake_hook import AzureDataLakeHook
+from airflow.contrib.operators.adls_list_operator import AzureDataLakeStorageListOperator
+from airflow.utils.decorators import apply_defaults
+from airflow.contrib.hooks.gcs_hook import GoogleCloudStorageHook, _parse_gcs_url
+
+import os
+from tempfile import NamedTemporaryFile
+
+
+class AdlsToGoogleCloudStorageOperator(AzureDataLakeStorageListOperator):
+"""
+Synchronizes an Azure Data Lake Storage path with a GCS bucket
+:param path: The Azure Data Lake path to find the objects (templated)
+:type path: str
+:param dest_gcs: The Google Cloud Storage bucket and prefix to
+store the objects. (templated)
+:type dest_gcs: str
+:param replace: If true, replaces same-named files in GCS
+:type replace: bool
+:param azure_data_lake_conn_id: The connection ID to use when
+connecting to Azure Data Lake Storage.
+:type azure_data_lake_conn_id: str
+:param dest_google_cloud_storage_conn_id: The connection ID to use when
+connecting to Google Cloud Storage.
+:type dest_google_cloud_storage_conn_id: str
+:param delegate_to: The account to impersonate, if any.
+For this to work, the service account making the request must have
+domain-wide delegation enabled.
+:type delegate_to: str
+
+**Examples**:
+The following Operator would copy a single file named
+``hello/world.avro`` from ADLS to the GCS bucket ``mybucket``. Its full
+resulting gcs path will be ``gs://mybucket/hello/world.avro`` ::
+copy_single_file = AdlsToGoogleCloudStorageOperator(
+task_id='copy_single_file',
+path='hello/world.avro',
+dest_gcs='gs://mybucket',
+replace=False,
+azure_data_lake_conn_id='azure_data_lake_default',
+google_cloud_storage_conn_id='google_cloud_default'
+)
+
+The following Operator would copy all parquet files from ADLS
+to the GCS bucket ``mybucket``. ::
+copy_all_files = AdlsToGoogleCloudStorageOperator(
+task_id='copy_all_files',
+path='*.parquet',
+dest_gcs='gs://mybucket',
+replace=False,
+azure_data_lake_conn_id='azure_data_lake_default',
+google_cloud_storage_conn_id='google_cloud_default'
+)
+
+ The following Operator would copy all parquet files from ADLS
+path ``/hello/world``to the GCS bucket ``mybucket``. ::
+copy_world_files = AdlsToGoogleCloudStorageOperator(
+task_id='copy_world_files',
+path='hello/world/*.parquet',
+dest_gcs='gs://mybucket',
+replace=False,
+azure_data_lake_conn_id='azure_data_lake_default',
+google_cloud_storage_conn_id='google_cloud_default'
+)
+"""
+template_fields = ('path', 'dest_gcs')
+ui_color = '#f0eee4'
+
+@apply_defaults
+def __init__(self,
+ path,
+ dest_gcs=None,
+ replace=False,
+ azure_data_lake_conn_id='azure_data_lake_default',
+ google_cloud_storage_conn_id='google_cloud_default',
+ delegate_to=None,
+ *args,
+ **kwargs):
+
+super(AdlsToGoogleCloudStorageOperator, self).__init__(
+path=path,
+azure_data_lake_conn_id=azure_data_lake_conn_id,
+*args,
+**kwargs
+)
+self.dest_gcs = dest_gcs
+self.replace = replace
+self.google_cloud_storage_conn_id = google_cloud_storage_conn_id
+self.delegate_to = delegate_to
+
+def execute(self, context):
+# use the super to 

[GitHub] Fokko commented on a change in pull request #4134: [AIRFLOW-3213] Create ADLS to GCS operator

2018-11-05 Thread GitBox
Fokko commented on a change in pull request #4134: [AIRFLOW-3213] Create ADLS 
to GCS operator
URL: https://github.com/apache/incubator-airflow/pull/4134#discussion_r230882903
 
 

 ##
 File path: airflow/contrib/operators/adls_to_gcs.py
 ##
 @@ -0,0 +1,144 @@
+# -*- coding: utf-8 -*-
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+from airflow.contrib.hooks.azure_data_lake_hook import AzureDataLakeHook
+from airflow.contrib.operators.adls_list_operator import AzureDataLakeStorageListOperator
+from airflow.utils.decorators import apply_defaults
+from airflow.contrib.hooks.gcs_hook import GoogleCloudStorageHook, _parse_gcs_url
+
+import os
+from tempfile import NamedTemporaryFile
+
+
+class AdlsToGoogleCloudStorageOperator(AzureDataLakeStorageListOperator):
+"""
+Synchronizes an Azure Data Lake Storage path with a GCS bucket
+:param path: The Azure Data Lake path to find the objects (templated)
+:type path: str
+:param dest_gcs: The Google Cloud Storage bucket and prefix to
+store the objects. (templated)
+:type dest_gcs: str
+:param replace: If true, replaces same-named files in GCS
+:type replace: bool
+:param azure_data_lake_conn_id: The connection ID to use when
+connecting to Azure Data Lake Storage.
+:type azure_data_lake_conn_id: str
+:param dest_google_cloud_storage_conn_id: The connection ID to use when
+connecting to Google Cloud Storage.
+:type dest_google_cloud_storage_conn_id: str
+:param delegate_to: The account to impersonate, if any.
+For this to work, the service account making the request must have
+domain-wide delegation enabled.
+:type delegate_to: str
+
+**Examples**:
+The following Operator would copy a single file named
+``hello/world.avro`` from ADLS to the GCS bucket ``mybucket``. Its full
+resulting gcs path will be ``gs://mybucket/hello/world.avro`` ::
+copy_single_file = AdlsToGoogleCloudStorageOperator(
+task_id='copy_single_file',
+path='hello/world.avro',
+dest_gcs='gs://mybucket',
+replace=False,
+azure_data_lake_conn_id='azure_data_lake_default',
+google_cloud_storage_conn_id='google_cloud_default'
+)
+
+The following Operator would copy all parquet files from ADLS
+to the GCS bucket ``mybucket``. ::
+copy_all_files = AdlsToGoogleCloudStorageOperator(
+task_id='copy_all_files',
+path='*.parquet',
+dest_gcs='gs://mybucket',
+replace=False,
+azure_data_lake_conn_id='azure_data_lake_default',
+google_cloud_storage_conn_id='google_cloud_default'
+)
+
+ The following Operator would copy all parquet files from ADLS
+path ``/hello/world``to the GCS bucket ``mybucket``. ::
+copy_world_files = AdlsToGoogleCloudStorageOperator(
+task_id='copy_world_files',
+path='hello/world/*.parquet',
+dest_gcs='gs://mybucket',
+replace=False,
+azure_data_lake_conn_id='azure_data_lake_default',
+google_cloud_storage_conn_id='google_cloud_default'
+)
+"""
+template_fields = ('path', 'dest_gcs')
+ui_color = '#f0eee4'
+
+@apply_defaults
+def __init__(self,
+ path,
+ dest_gcs=None,
+ replace=False,
+ azure_data_lake_conn_id='azure_data_lake_default',
+ google_cloud_storage_conn_id='google_cloud_default',
+ delegate_to=None,
+ *args,
+ **kwargs):
+
+super(AdlsToGoogleCloudStorageOperator, self).__init__(
+path=path,
+azure_data_lake_conn_id=azure_data_lake_conn_id,
+*args,
+**kwargs
+)
+self.dest_gcs = dest_gcs
+self.replace = replace
+self.google_cloud_storage_conn_id = google_cloud_storage_conn_id
+self.delegate_to = delegate_to
+
+def execute(self, context):
+# use the super to 

[GitHub] codecov-io commented on issue #4125: [AIRFLOW-2715] Pick up the region setting while launching Dataflow templates

2018-11-05 Thread GitBox
codecov-io commented on issue #4125: [AIRFLOW-2715] Pick up the region setting 
while launching Dataflow templates
URL: 
https://github.com/apache/incubator-airflow/pull/4125#issuecomment-436005504
 
 
   # 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4125?src=pr=h1)
 Report
   > Merging 
[#4125](https://codecov.io/gh/apache/incubator-airflow/pull/4125?src=pr=desc)
 into 
[master](https://codecov.io/gh/apache/incubator-airflow/commit/bc3108edc1f63208be1d3bf8893c22bb12c7bc9f?src=pr=desc)
 will **increase** coverage by `2.13%`.
   > The diff coverage is `n/a`.
   
   [![Impacted file tree 
graph](https://codecov.io/gh/apache/incubator-airflow/pull/4125/graphs/tree.svg?width=650=WdLKlKHOAU=150=pr)](https://codecov.io/gh/apache/incubator-airflow/pull/4125?src=pr=tree)
   
   ```diff
   @@Coverage Diff @@
   ##   master#4125  +/-   ##
   ==
   + Coverage   76.66%   78.79%   +2.13% 
   ==
 Files 199  199  
 Lines   1620918211+2002 
   ==
   + Hits1242714350+1923 
   - Misses   3782 3861  +79
   ```
   
   
   | [Impacted 
Files](https://codecov.io/gh/apache/incubator-airflow/pull/4125?src=pr=tree) 
| Coverage Δ | |
   |---|---|---|
   | 
[airflow/operators/http\_operator.py](https://codecov.io/gh/apache/incubator-airflow/pull/4125/diff?src=pr=tree#diff-YWlyZmxvdy9vcGVyYXRvcnMvaHR0cF9vcGVyYXRvci5weQ==)
 | `90.69% <0%> (-2.16%)` | :arrow_down: |
   | 
[airflow/executors/dask\_executor.py](https://codecov.io/gh/apache/incubator-airflow/pull/4125/diff?src=pr=tree#diff-YWlyZmxvdy9leGVjdXRvcnMvZGFza19leGVjdXRvci5weQ==)
 | `1.63% <0%> (-0.37%)` | :arrow_down: |
   | 
[airflow/jobs.py](https://codecov.io/gh/apache/incubator-airflow/pull/4125/diff?src=pr=tree#diff-YWlyZmxvdy9qb2JzLnB5)
 | `77.09% <0%> (+0.09%)` | :arrow_up: |
   | 
[airflow/sensors/external\_task\_sensor.py](https://codecov.io/gh/apache/incubator-airflow/pull/4125/diff?src=pr=tree#diff-YWlyZmxvdy9zZW5zb3JzL2V4dGVybmFsX3Rhc2tfc2Vuc29yLnB5)
 | `97.36% <0%> (+0.49%)` | :arrow_up: |
   | 
[airflow/settings.py](https://codecov.io/gh/apache/incubator-airflow/pull/4125/diff?src=pr=tree#diff-YWlyZmxvdy9zZXR0aW5ncy5weQ==)
 | `83.03% <0%> (+1.87%)` | :arrow_up: |
   | 
[airflow/models.py](https://codecov.io/gh/apache/incubator-airflow/pull/4125/diff?src=pr=tree#diff-YWlyZmxvdy9tb2RlbHMucHk=)
 | `94.26% <0%> (+2.17%)` | :arrow_up: |
   | 
[airflow/hooks/dbapi\_hook.py](https://codecov.io/gh/apache/incubator-airflow/pull/4125/diff?src=pr=tree#diff-YWlyZmxvdy9ob29rcy9kYmFwaV9ob29rLnB5)
 | `82.25% <0%> (+2.41%)` | :arrow_up: |
   | 
[airflow/hooks/postgres\_hook.py](https://codecov.io/gh/apache/incubator-airflow/pull/4125/diff?src=pr=tree#diff-YWlyZmxvdy9ob29rcy9wb3N0Z3Jlc19ob29rLnB5)
 | `94.44% <0%> (+2.77%)` | :arrow_up: |
   | 
[airflow/www\_rbac/views.py](https://codecov.io/gh/apache/incubator-airflow/pull/4125/diff?src=pr=tree#diff-YWlyZmxvdy93d3dfcmJhYy92aWV3cy5weQ==)
 | `77.24% <0%> (+4.86%)` | :arrow_up: |
   | 
[airflow/www/views.py](https://codecov.io/gh/apache/incubator-airflow/pull/4125/diff?src=pr=tree#diff-YWlyZmxvdy93d3cvdmlld3MucHk=)
 | `73.88% <0%> (+5.14%)` | :arrow_up: |
   | ... and [3 
more](https://codecov.io/gh/apache/incubator-airflow/pull/4125/diff?src=pr=tree-more)
 | |
   
   --
   
   [Continue to review full report at 
Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4125?src=pr=continue).
   > **Legend** - [Click here to learn 
more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute  (impact)`, `ø = not affected`, `? = missing data`
   > Powered by 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4125?src=pr=footer).
 Last update 
[bc3108e...6759626](https://codecov.io/gh/apache/incubator-airflow/pull/4125?src=pr=lastupdated).
 Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Commented] (AIRFLOW-1368) Automatically remove the container when it exits

2018-11-05 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-1368?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16675652#comment-16675652
 ] 

ASF GitHub Bot commented on AIRFLOW-1368:
-

Fokko closed pull request #3741: [AIRFLOW-1368] Add auto_remove for 
DockerOperator
URL: https://github.com/apache/incubator-airflow/pull/3741
 
 
   

This is a PR merged from a forked repository.
As GitHub hides the original diff on merge, it is displayed below for
the sake of provenance:

As this is a foreign pull request (from a fork), the diff is supplied
below (as it won't show otherwise due to GitHub magic):

diff --git a/airflow/operators/docker_operator.py 
b/airflow/operators/docker_operator.py
index 69dc1ebef7..72cd931793 100644
--- a/airflow/operators/docker_operator.py
+++ b/airflow/operators/docker_operator.py
@@ -66,6 +66,8 @@ class DockerOperator(BaseOperator):
 :type mem_limit: float or str
 :param network_mode: Network mode for the container.
 :type network_mode: str
+:param auto_remove: Remove the container after execution. Default is false.
+:type auto_remove: bool
 :param tls_ca_cert: Path to a PEM-encoded certificate authority
 to secure the docker connection.
 :type tls_ca_cert: str
@@ -127,10 +129,12 @@ def __init__(
 xcom_push=False,
 xcom_all=False,
 docker_conn_id=None,
+auto_remove=False,
 *args,
 **kwargs):
 
 super(DockerOperator, self).__init__(*args, **kwargs)
+self.auto_remove = auto_remove
 self.api_version = api_version
 self.command = command
 self.cpus = cpus
@@ -203,7 +207,8 @@ def execute(self, context):
 host_config=self.cli.create_host_config(
 binds=self.volumes,
 network_mode=self.network_mode,
-shm_size=self.shm_size),
+shm_size=self.shm_size,
+auto_remove=self.auto_remove),
 image=image,
 mem_limit=self.mem_limit,
 user=self.user,
diff --git a/tests/operators/docker_operator.py 
b/tests/operators/docker_operator.py
index 59d6d58416..0576842063 100644
--- a/tests/operators/docker_operator.py
+++ b/tests/operators/docker_operator.py
@@ -77,7 +77,8 @@ def test_execute(self, client_class_mock, mkdtemp_mock):
 
 client_mock.create_host_config.assert_called_with(binds=['/host/path:/container/path',
                                                          '/mkdtemp:/tmp/airflow'],
                                                   network_mode='bridge',
-                                                  shm_size=1000)
+                                                  shm_size=1000,
+                                                  auto_remove=False)
 client_mock.images.assert_called_with(name='ubuntu:latest')
 client_mock.logs.assert_called_with(container='some_id', stream=True)
 client_mock.pull.assert_called_with('ubuntu:latest', stream=True)


 


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


> Automatically remove the container when it exits
> 
>
> Key: AIRFLOW-1368
> URL: https://issues.apache.org/jira/browse/AIRFLOW-1368
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: docker, operators
> Environment: MacOS Sierra Version 10.12.5
> Docker Community Edition Version 17.06.0-ce-mac18 Stable Channel
> Docker Base Image: puckel/docker-airflow
>Reporter: Nathaniel Varona
>Assignee: Nathaniel Varona
>Priority: Major
>  Labels: docker
> Fix For: 1.9.0
>
>   Original Estimate: 24h
>  Remaining Estimate: 24h
>
> A container should automatically remove when it exits for short-term 
> foreground processes.
> Manual Example:
> {{$ docker run -i --rm busybox echo 'Hello World!'}}
> {{> Hello World!}}
> {{$ docker ps -a}}
> Command output should have a list of clean running processes without having 
> an {{Exited (0)}} status.
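With the `auto_remove` flag this PR adds, the same cleanup happens from a DAG; a 
short sketch reusing the manual example above (`dag` assumed defined elsewhere):

{code:python}
from airflow.operators.docker_operator import DockerOperator

hello = DockerOperator(
    task_id='hello_world',
    image='busybox',
    command="echo 'Hello World!'",
    auto_remove=True,  # the daemon removes the container when its process exits
    dag=dag,
)
{code}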



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] Fokko closed pull request #3741: [AIRFLOW-1368] Add auto_remove for DockerOperator

2018-11-05 Thread GitBox
Fokko closed pull request #3741: [AIRFLOW-1368] Add auto_remove for 
DockerOperator
URL: https://github.com/apache/incubator-airflow/pull/3741
 
 
   

This is a PR merged from a forked repository.
As GitHub hides the original diff on merge, it is displayed below for
the sake of provenance:

As this is a foreign pull request (from a fork), the diff is supplied
below (as it won't show otherwise due to GitHub magic):

diff --git a/airflow/operators/docker_operator.py 
b/airflow/operators/docker_operator.py
index 69dc1ebef7..72cd931793 100644
--- a/airflow/operators/docker_operator.py
+++ b/airflow/operators/docker_operator.py
@@ -66,6 +66,8 @@ class DockerOperator(BaseOperator):
 :type mem_limit: float or str
 :param network_mode: Network mode for the container.
 :type network_mode: str
+:param auto_remove: Remove the container after execution. Default is false.
+:type auto_remove: bool
 :param tls_ca_cert: Path to a PEM-encoded certificate authority
 to secure the docker connection.
 :type tls_ca_cert: str
@@ -127,10 +129,12 @@ def __init__(
 xcom_push=False,
 xcom_all=False,
 docker_conn_id=None,
+auto_remove=False,
 *args,
 **kwargs):
 
 super(DockerOperator, self).__init__(*args, **kwargs)
+self.auto_remove = auto_remove
 self.api_version = api_version
 self.command = command
 self.cpus = cpus
@@ -203,7 +207,8 @@ def execute(self, context):
 host_config=self.cli.create_host_config(
 binds=self.volumes,
 network_mode=self.network_mode,
-shm_size=self.shm_size),
+shm_size=self.shm_size,
+auto_remove=self.auto_remove),
 image=image,
 mem_limit=self.mem_limit,
 user=self.user,
diff --git a/tests/operators/docker_operator.py 
b/tests/operators/docker_operator.py
index 59d6d58416..0576842063 100644
--- a/tests/operators/docker_operator.py
+++ b/tests/operators/docker_operator.py
@@ -77,7 +77,8 @@ def test_execute(self, client_class_mock, mkdtemp_mock):
 
 client_mock.create_host_config.assert_called_with(binds=['/host/path:/container/path',
                                                          '/mkdtemp:/tmp/airflow'],
                                                   network_mode='bridge',
-                                                  shm_size=1000)
+                                                  shm_size=1000,
+                                                  auto_remove=False)
 client_mock.images.assert_called_with(name='ubuntu:latest')
 client_mock.logs.assert_called_with(container='some_id', stream=True)
 client_mock.pull.assert_called_with('ubuntu:latest', stream=True)


 


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] Fokko commented on issue #3741: [AIRFLOW-1368] Add auto_remove for DockerOperator

2018-11-05 Thread GitBox
Fokko commented on issue #3741: [AIRFLOW-1368] Add auto_remove for 
DockerOperator
URL: 
https://github.com/apache/incubator-airflow/pull/3741#issuecomment-436003996
 
 
   Superseded by https://github.com/apache/incubator-airflow/pull/3977


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Resolved] (AIRFLOW-3132) Allow to specify auto_remove option for DockerOperator

2018-11-05 Thread Fokko Driesprong (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-3132?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Fokko Driesprong resolved AIRFLOW-3132.
---
Resolution: Fixed

> Allow to specify auto_remove option for DockerOperator
> --
>
> Key: AIRFLOW-3132
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3132
> Project: Apache Airflow
>  Issue Type: Improvement
>Reporter: Guoqiang Ding
>Assignee: Guoqiang Ding
>Priority: Major
>
> Sometimes we want to run a docker container command just once. The Docker API 
> client allows specifying the auto_remove option when starting a container.
>  



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Reopened] (AIRFLOW-3132) Allow to specify auto_remove option for DockerOperator

2018-11-05 Thread Fokko Driesprong (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-3132?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Fokko Driesprong reopened AIRFLOW-3132:
---

> Allow to specify auto_remove option for DockerOperator
> --
>
> Key: AIRFLOW-3132
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3132
> Project: Apache Airflow
>  Issue Type: Improvement
>Reporter: Guoqiang Ding
>Assignee: Guoqiang Ding
>Priority: Major
> Fix For: 2.0.0
>
>
> Sometimes we want to run a docker container command just once. The Docker API 
> client allows specifying the auto_remove option when starting a container.
>  



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Resolved] (AIRFLOW-3132) Allow to specify auto_remove option for DockerOperator

2018-11-05 Thread Fokko Driesprong (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-3132?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Fokko Driesprong resolved AIRFLOW-3132.
---
   Resolution: Fixed
Fix Version/s: 2.0.0

> Allow to specify auto_remove option for DockerOperator
> --
>
> Key: AIRFLOW-3132
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3132
> Project: Apache Airflow
>  Issue Type: Improvement
>Reporter: Guoqiang Ding
>Assignee: Guoqiang Ding
>Priority: Major
> Fix For: 2.0.0
>
>
> Sometimes we want to run a docker container command just once. The Docker API 
> client allows specifying the auto_remove option when starting a container.
>  



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Resolved] (AIRFLOW-516) docker_operator - remove unused container

2018-11-05 Thread Fokko Driesprong (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-516?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Fokko Driesprong resolved AIRFLOW-516.
--
   Resolution: Fixed
Fix Version/s: 2.0.0

> docker_operator - remove unused container 
> --
>
> Key: AIRFLOW-516
> URL: https://issues.apache.org/jira/browse/AIRFLOW-516
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: docker
>Reporter: Amikam Snir
>Assignee: Amikam Snir
>Priority: Minor
> Fix For: 2.0.0
>
>
> The docker operator doesn't clean everything up; what is needed is the 
> equivalent of the following command: docker rm
> Reference: https://docs.docker.com/engine/reference/commandline/rm/



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Resolved] (AIRFLOW-465) docker_operator - Destroy docker container on success

2018-11-05 Thread Fokko Driesprong (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-465?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Fokko Driesprong resolved AIRFLOW-465.
--
   Resolution: Fixed
Fix Version/s: 2.0.0

> docker_operator - Destroy docker container on success
> -
>
> Key: AIRFLOW-465
> URL: https://issues.apache.org/jira/browse/AIRFLOW-465
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: docker, operators
>Reporter: Felipe Lolas
>Priority: Trivial
> Fix For: 2.0.0
>
>
> It would be nice to have an option to automatically delete containers after a 
> successful run (docker-py doesn't implement the --rm flag in its API).



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (AIRFLOW-3132) Allow to specify auto_remove option for DockerOperator

2018-11-05 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-3132?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16675646#comment-16675646
 ] 

ASF GitHub Bot commented on AIRFLOW-3132:
-

Fokko closed pull request #3977: [AIRFLOW-3132] Add option for DockerOperator
URL: https://github.com/apache/incubator-airflow/pull/3977
 
 
   

This is a PR merged from a forked repository.
As GitHub hides the original diff on merge, it is displayed below for
the sake of provenance:

As this is a foreign pull request (from a fork), the diff is supplied
below (as it won't show otherwise due to GitHub magic):

diff --git a/airflow/operators/docker_operator.py 
b/airflow/operators/docker_operator.py
index 517199be51..697f809657 100644
--- a/airflow/operators/docker_operator.py
+++ b/airflow/operators/docker_operator.py
@@ -47,6 +47,10 @@ class DockerOperator(BaseOperator):
 :param api_version: Remote API version. Set to ``auto`` to automatically
 detect the server's version.
 :type api_version: str
+:param auto_remove: Auto-removal of the container on daemon side when the
+container's process exits.
+The default is False.
+:type auto_remove: bool
 :param command: Command to be run in the container. (templated)
 :type command: str or list
 :param cpus: Number of CPUs to assign to the container.
@@ -133,11 +137,13 @@ def __init__(
 docker_conn_id=None,
 dns=None,
 dns_search=None,
+auto_remove=False,
 *args,
 **kwargs):
 
 super(DockerOperator, self).__init__(*args, **kwargs)
 self.api_version = api_version
+self.auto_remove = auto_remove
 self.command = command
 self.cpus = cpus
 self.dns = dns
@@ -209,6 +215,7 @@ def execute(self, context):
 cpu_shares=cpu_shares,
 environment=self.environment,
 host_config=self.cli.create_host_config(
+auto_remove=self.auto_remove,
 binds=self.volumes,
 network_mode=self.network_mode,
 shm_size=self.shm_size,


 


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


> Allow to specify auto_remove option for DockerOperator
> --
>
> Key: AIRFLOW-3132
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3132
> Project: Apache Airflow
>  Issue Type: Improvement
>Reporter: Guoqiang Ding
>Assignee: Guoqiang Ding
>Priority: Major
>
> Sometimes we want to run a docker container command just once. The Docker API 
> client allows specifying the auto_remove option when starting a container.
>  



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
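
For illustration, a minimal usage sketch of the new flag (not from the PR 
itself); the task_id, image and command values are made up, and the dag= 
assignment is omitted:

{code:python}
# A minimal sketch, assuming the PR above is merged as shown.
from airflow.operators.docker_operator import DockerOperator

run_once = DockerOperator(
    task_id='run_once',       # illustrative name
    image='alpine:3.8',       # illustrative image
    command='echo hello',
    auto_remove=True,         # the daemon removes the container when it exits
)
{code}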


[GitHub] Fokko closed pull request #3977: [AIRFLOW-3132] Add option for DockerOperator

2018-11-05 Thread GitBox
Fokko closed pull request #3977: [AIRFLOW-3132] Add option for DockerOperator
URL: https://github.com/apache/incubator-airflow/pull/3977
 
 
   

This is a PR merged from a forked repository.
As GitHub hides the original diff on merge, it is displayed below for
the sake of provenance:

As this is a foreign pull request (from a fork), the diff is supplied
below (as it won't show otherwise due to GitHub magic):

diff --git a/airflow/operators/docker_operator.py 
b/airflow/operators/docker_operator.py
index 517199be51..697f809657 100644
--- a/airflow/operators/docker_operator.py
+++ b/airflow/operators/docker_operator.py
@@ -47,6 +47,10 @@ class DockerOperator(BaseOperator):
 :param api_version: Remote API version. Set to ``auto`` to automatically
 detect the server's version.
 :type api_version: str
+:param auto_remove: Auto-removal of the container on daemon side when the
+container's process exits.
+The default is False.
+:type auto_remove: bool
 :param command: Command to be run in the container. (templated)
 :type command: str or list
 :param cpus: Number of CPUs to assign to the container.
@@ -133,11 +137,13 @@ def __init__(
 docker_conn_id=None,
 dns=None,
 dns_search=None,
+auto_remove=False,
 *args,
 **kwargs):
 
 super(DockerOperator, self).__init__(*args, **kwargs)
 self.api_version = api_version
+self.auto_remove = auto_remove
 self.command = command
 self.cpus = cpus
 self.dns = dns
@@ -209,6 +215,7 @@ def execute(self, context):
 cpu_shares=cpu_shares,
 environment=self.environment,
 host_config=self.cli.create_host_config(
+auto_remove=self.auto_remove,
 binds=self.volumes,
 network_mode=self.network_mode,
 shm_size=self.shm_size,


 


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] bolkedebruin commented on issue #4006: [AIRFLOW-3164] Verify server certificate when connecting to LDAP

2018-11-05 Thread GitBox
bolkedebruin commented on issue #4006: [AIRFLOW-3164] Verify server certificate 
when connecting to LDAP
URL: 
https://github.com/apache/incubator-airflow/pull/4006#issuecomment-436002752
 
 
   @ashb I think we should require LDAP to be secure. If somebody doesn't want 
security, they should hack the code. You should not pass usernames and passwords 
unencrypted; that kind of defeats the purpose. 


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Commented] (AIRFLOW-3193) Pin docker requirement version

2018-11-05 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-3193?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16675639#comment-16675639
 ] 

ASF GitHub Bot commented on AIRFLOW-3193:
-

Fokko closed pull request #4130: [AIRFLOW-3193] Pin docker requirement version
URL: https://github.com/apache/incubator-airflow/pull/4130
 
 
   

This is a PR merged from a forked repository.
As GitHub hides the original diff on merge, it is displayed below for
the sake of provenance:

As this is a foreign pull request (from a fork), the diff is supplied
below (as it won't show otherwise due to GitHub magic):

diff --git a/setup.py b/setup.py
index 8c6c927153..1d2d1598e5 100644
--- a/setup.py
+++ b/setup.py
@@ -174,7 +174,7 @@ def write_version(filename=os.path.join(*['airflow',
 'sphinx-rtd-theme>=0.1.6',
 'Sphinx-PyPI-upload>=0.2.1'
 ]
-docker = ['docker>=2.0.0']
+docker = ['docker>=3.0.0']
 druid = ['pydruid>=0.4.1']
 elasticsearch = [
 'elasticsearch>=5.0.0,<6.0.0',


 


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


> Pin docker requirement version
> --
>
> Key: AIRFLOW-3193
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3193
> Project: Apache Airflow
>  Issue Type: Improvement
>Reporter: Guoqiang Ding
>Assignee: Guoqiang Ding
>Priority: Major
> Fix For: 2.0.0
>
>
> The method "create_container" in docker's APIClient has been incompatible 
> since version 3.0.0.
>  
> Usage in `_airflow.operators.docker_operator_` is as follows.
>  
> {code:java}
> self.container = self.cli.create_container(
> command=self.get_command(),
> cpu_shares=cpu_shares,
> environment=self.environment,
> host_config=self.cli.create_host_config(
> binds=self.volumes,
> network_mode=self.network_mode,
> shm_size=self.shm_size,
> dns=self.dns,
> dns_search=self.dns_search),
> image=image,
> mem_limit=self.mem_limit,
> user=self.user,
> working_dir=self.working_dir
> )
> {code}
>  
> Arguments such as "cpu_shares" and "mem_limit" have been removed. In other 
> words, from version 3.0.0 onward, they should be passed to the 
> `create_host_config` method.
>  
> {quote}airflow usage code link:
> https://github.com/apache/incubator-airflow/blob/cdbdcae7c0645ac2987360fced43407202716b99/airflow/operators/docker_operator.py#L207
> {quote}
>  
> {quote}version 3.0.0 code link: 
> https://github.com/docker/docker-py/blob/91bc75cc92f578ae9d659ad7e8ed11a0877b70aa/docker/api/container.py#L206
> {quote}
>  



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
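
For illustration, a sketch of the calling convention the issue describes, 
assuming docker>=3.0.0 is installed (the image, command and resource values 
are made up):

{code:python}
# docker-py >= 3.0.0: cpu_shares and mem_limit move into the host config.
import docker

cli = docker.APIClient(version='auto')
container = cli.create_container(
    image='alpine:3.8',
    command='echo hello',
    host_config=cli.create_host_config(
        cpu_shares=512,      # rejected by create_container() since 3.0.0
        mem_limit='512m',    # likewise
    ),
)
{code}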


[jira] [Resolved] (AIRFLOW-3193) Pin docker requirement version

2018-11-05 Thread Fokko Driesprong (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-3193?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Fokko Driesprong resolved AIRFLOW-3193.
---
   Resolution: Fixed
Fix Version/s: 2.0.0

> Pin docker requirement version
> --
>
> Key: AIRFLOW-3193
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3193
> Project: Apache Airflow
>  Issue Type: Improvement
>Reporter: Guoqiang Ding
>Assignee: Guoqiang Ding
>Priority: Major
> Fix For: 2.0.0
>
>
> The method "create_container" in docker's APIClient has been incompatible 
> since version 3.0.0.
>  
> Usage in `_airflow.operators.docker_operator_` is as follows.
>  
> {code:java}
> self.container = self.cli.create_container(
> command=self.get_command(),
> cpu_shares=cpu_shares,
> environment=self.environment,
> host_config=self.cli.create_host_config(
> binds=self.volumes,
> network_mode=self.network_mode,
> shm_size=self.shm_size,
> dns=self.dns,
> dns_search=self.dns_search),
> image=image,
> mem_limit=self.mem_limit,
> user=self.user,
> working_dir=self.working_dir
> )
> {code}
>  
> Arguments such as "cpu_shares" and "mem_limit" have been removed. In other 
> words, from version 3.0.0 onward, they should be passed to the 
> `create_host_config` method.
>  
> {quote}airflow usage code link:
> https://github.com/apache/incubator-airflow/blob/cdbdcae7c0645ac2987360fced43407202716b99/airflow/operators/docker_operator.py#L207
> {quote}
>  
> {quote}version 3.0.0 code link: 
> https://github.com/docker/docker-py/blob/91bc75cc92f578ae9d659ad7e8ed11a0877b70aa/docker/api/container.py#L206
> {quote}
>  



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Reopened] (AIRFLOW-3193) Pin docker requirement version

2018-11-05 Thread Fokko Driesprong (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-3193?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Fokko Driesprong reopened AIRFLOW-3193:
---

> Pin docker requirement version
> --
>
> Key: AIRFLOW-3193
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3193
> Project: Apache Airflow
>  Issue Type: Improvement
>Reporter: Guoqiang Ding
>Assignee: Guoqiang Ding
>Priority: Major
> Fix For: 2.0.0
>
>
> The method "create_container" in docker's APIClient has been incompatible 
> since version 3.0.0.
>  
> Usage in `_airflow.operators.docker_operator_` is as follows.
>  
> {code:java}
> self.container = self.cli.create_container(
> command=self.get_command(),
> cpu_shares=cpu_shares,
> environment=self.environment,
> host_config=self.cli.create_host_config(
> binds=self.volumes,
> network_mode=self.network_mode,
> shm_size=self.shm_size,
> dns=self.dns,
> dns_search=self.dns_search),
> image=image,
> mem_limit=self.mem_limit,
> user=self.user,
> working_dir=self.working_dir
> )
> {code}
>  
> Arguments such as "cpu_shares" and "mem_limit" have been removed. In other 
> words, from version 3.0.0 onward, they should be passed to the 
> `create_host_config` method.
>  
> {quote}airflow usage code link:
> https://github.com/apache/incubator-airflow/blob/cdbdcae7c0645ac2987360fced43407202716b99/airflow/operators/docker_operator.py#L207
> {quote}
>  
> {quote}version 3.0.0 code link: 
> https://github.com/docker/docker-py/blob/91bc75cc92f578ae9d659ad7e8ed11a0877b70aa/docker/api/container.py#L206
> {quote}
>  



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] Fokko closed pull request #4130: [AIRFLOW-3193] Pin docker requirement version

2018-11-05 Thread GitBox
Fokko closed pull request #4130: [AIRFLOW-3193] Pin docker requirement version
URL: https://github.com/apache/incubator-airflow/pull/4130
 
 
   

This is a PR merged from a forked repository.
As GitHub hides the original diff on merge, it is displayed below for
the sake of provenance:

As this is a foreign pull request (from a fork), the diff is supplied
below (as it won't show otherwise due to GitHub magic):

diff --git a/setup.py b/setup.py
index 8c6c927153..1d2d1598e5 100644
--- a/setup.py
+++ b/setup.py
@@ -174,7 +174,7 @@ def write_version(filename=os.path.join(*['airflow',
 'sphinx-rtd-theme>=0.1.6',
 'Sphinx-PyPI-upload>=0.2.1'
 ]
-docker = ['docker>=2.0.0']
+docker = ['docker>=3.0.0']
 druid = ['pydruid>=0.4.1']
 elasticsearch = [
 'elasticsearch>=5.0.0,<6.0.0',


 


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] ashb commented on a change in pull request #4006: [AIRFLOW-3164] Verify server certificate when connecting to LDAP

2018-11-05 Thread GitBox
ashb commented on a change in pull request #4006: [AIRFLOW-3164] Verify server 
certificate when connecting to LDAP
URL: https://github.com/apache/incubator-airflow/pull/4006#discussion_r230875122
 
 

 ##
 File path: airflow/contrib/auth/backends/ldap_auth.py
 ##
 @@ -55,16 +55,20 @@ class LdapException(Exception):
 
 
 def get_ldap_connection(dn=None, password=None):
-tls_configuration = None
-use_ssl = False
+cacert = None
 try:
 cacert = configuration.conf.get("ldap", "cacert")
-tls_configuration = Tls(validate=ssl.CERT_REQUIRED, 
ca_certs_file=cacert)
-use_ssl = True
-except Exception:
+except AirflowConfigException:
 pass
 
-server = Server(configuration.conf.get("ldap", "uri"), use_ssl, 
tls_configuration)
+tls_configuration = Tls(validate=ssl.CERT_REQUIRED,
+version=ssl.PROTOCOL_SSLv23,
 
 Review comment:
   Elsewhere in that doc:
   
   > PROTOCOL_SSLv23
   >
   >  Alias for PROTOCOL_TLS
   
   The wording I quoted above is from 
https://docs.python.org/2/library/ssl.html#ssl.create_default_context and is 
called here 
https://github.com/cannatag/ldap3/blob/master/ldap3/core/tls.py#L172-L187 - it 
still sets the validate flag.


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] bolkedebruin commented on issue #4006: [AIRFLOW-3164] Verify server certificate when connecting to LDAP

2018-11-05 Thread GitBox
bolkedebruin commented on issue #4006: [AIRFLOW-3164] Verify server certificate 
when connecting to LDAP
URL: 
https://github.com/apache/incubator-airflow/pull/4006#issuecomment-435997392
 
 
   Please verify that you are really negotiating the highest level of security, 
maybe with a test. The docs seem ambiguous here. 


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] bolkedebruin commented on a change in pull request #4006: [AIRFLOW-3164] Verify server certificate when connecting to LDAP

2018-11-05 Thread GitBox
bolkedebruin commented on a change in pull request #4006: [AIRFLOW-3164] Verify 
server certificate when connecting to LDAP
URL: https://github.com/apache/incubator-airflow/pull/4006#discussion_r230873322
 
 

 ##
 File path: airflow/contrib/auth/backends/ldap_auth.py
 ##
 @@ -55,16 +55,20 @@ class LdapException(Exception):
 
 
 def get_ldap_connection(dn=None, password=None):
-tls_configuration = None
-use_ssl = False
+cacert = None
 try:
 cacert = configuration.conf.get("ldap", "cacert")
-tls_configuration = Tls(validate=ssl.CERT_REQUIRED, 
ca_certs_file=cacert)
-use_ssl = True
-except Exception:
+except AirflowConfigException:
 pass
 
-server = Server(configuration.conf.get("ldap", "uri"), use_ssl, 
tls_configuration)
+tls_configuration = Tls(validate=ssl.CERT_REQUIRED,
+version=ssl.PROTOCOL_SSLv23,
 
 Review comment:
   I probably misread the docs, but I'm not sure if you are looking at the 
right place either. From https://docs.python.org/2/library/ssl.html
   
   The parameter ssl_version specifies which version of the SSL protocol to 
use. Typically, the server chooses a particular protocol version, and the 
client must adapt to the server’s choice. Most of the versions are not 
interoperable with the other versions. If not specified, the default is 
PROTOCOL_SSLv23; it provides the most compatibility with other versions.
   
   @ashb


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services
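
For illustration, a minimal self-contained sketch of the TLS setup under 
discussion, assuming the ldap3 library; the uri, user, password and cacert 
values are made up:

{code:python}
# PROTOCOL_SSLv23 is an alias for PROTOCOL_TLS, so client and server still
# negotiate the highest TLS version both support, while CERT_REQUIRED
# forces validation of the server certificate against the CA bundle.
import ssl
from ldap3 import Connection, Server, Tls

tls_configuration = Tls(validate=ssl.CERT_REQUIRED,
                        version=ssl.PROTOCOL_SSLv23,
                        ca_certs_file='/etc/ssl/certs/ca.pem')
server = Server('ldaps://ldap.example.com', use_ssl=True,
                tls=tls_configuration)
conn = Connection(server, user='cn=airflow,dc=example,dc=com',
                  password='secret')
conn.bind()
{code}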


[jira] [Commented] (AIRFLOW-3253) KubernetesPodOperator Unauthorized Code 401

2018-11-05 Thread Trevor Edwards (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-3253?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16675618#comment-16675618
 ] 

Trevor Edwards commented on AIRFLOW-3253:
-

The fix is merged now. I think it will be in the Kubernetes Python client 
v8.0.1 or later, but that is not out yet. Once it's out, we can consider 
updating setup.py, which currently pins a rather outdated client version (from 
August 2017): 
https://github.com/apache/incubator-airflow/blob/master/setup.py#L207

> KubernetesPodOperator Unauthorized Code 401
> ---
>
> Key: AIRFLOW-3253
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3253
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: authentication, gcp, kubernetes
>Affects Versions: 1.10.0
>Reporter: Sunny Gupta
>Assignee: Trevor Edwards
>Priority: Minor
> Attachments: Screenshot from 2018-10-25 02-08-28.png
>
>
> apache-airflow==1.10.0
> kubernetes==7.0.0 (Tried)
>  kubernetes==8.0.0b1 (Tried)
>  
> Every time, after a couple of successful scheduled runs, some runs fail and 
> throw the error below.
> The error looks related to k8s authorization, and it seems like a pattern in 
> my case: every time the token expiry comes near, the job fails, and after the 
> expiry is refreshed it runs for a while and then fails again.
> !Screenshot from 2018-10-25 02-08-28.png!
> The above speculation could be wrong; I need help fixing this issue. I am 
> running one sample python hello DAG and planning to move a production 
> workload, but this is a blocker for me.
> Tried:
>  * Clearing the ~/.kube folder and regenerating the token with `gcloud container 
> clusters get-credentials ***`; even tried setting a cron to force-update tokens.
>  * Tried kubernetes==7.0.0 up to the latest beta version.
> Below is my kubectl config. When I run the *kubectl* CLI to do GET ops on pod 
> and node resources, there are no issues.
>  
> {code:java}
> $ kubectl config view 
> apiVersion: v1
> clusters:
> - cluster:
>     certificate-authority-data: DATA+OMITTED
>     server: https://XX.XX.XX.XX
>   name: gke_us-central1-b_dev-kube-cluster
> contexts:
> - context:
>     cluster: gke_us-central1-b_dev-kube-cluster
>     user: gke_us-central1-b_dev-kube-cluster
>   name: gke_us-central1-b_dev-kube-cluster
> current-context: gke_us-central1-b_dev-kube-cluster
> kind: Config
> preferences: {}
> users:
> - name: gke_us-central1-b_dev-kube-cluster
>   user:
>     auth-provider:
>   config:
>     access-token: ya29.c.TOKEN5EREdigv
>     cmd-args: config config-helper --format=json
>     cmd-path: /usr/lib/google-cloud-sdk/bin/gcloud
>     expiry: 2018-10-24T20:54:37Z
>     expiry-key: '{.credential.token_expiry}'
>     token-key: '{.credential.access_token}'
>   name: gcp
> {code}
>  
>  
> Within an hour, running every */5 min, 2-3 jobs fail with the error below.
>  
> {code:java}
> kubernetes.client.rest.ApiException: (401)
> Reason: Unauthorized
> HTTP response headers: HTTPHeaderDict({'Date': 'Wed, 24 Oct 2018 06:20:04 
> GMT', 'Content-Length': '129', 'Audit-Id': 
> '89dcda61-a60f-4b23-85d6-9d28a6bfeed0', 'Www-Authenticate': 'Basic 
> realm="kubernetes-master"', 'Content-Type': 'application/json'})
> HTTP response body: 
> {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"Unauthorized","reason":"Unauthorized","code":401}{code}
>  
>  
> {code:java}
> // complete logs
> 
> *** Log file does not exist: 
> /root/airflow/logs/pyk8s.v3/python-hello/2018-10-24T06:16:00+00:00/1.log
> *** Fetching from: 
> http://aflow-worker.internal:8793/log/pyk8s.v3/python-hello/2018-10-24T06:16:00+00:00/1.log
> [2018-10-24 06:20:02,947] {models.py:1335} INFO - Dependencies all met for 
> 
> [2018-10-24 06:20:02,952] {models.py:1335} INFO - Dependencies all met for 
> 
> [2018-10-24 06:20:02,952] {models.py:1547} INFO -
> 
> Starting attempt 1 of 1
> 
> [2018-10-24 06:20:02,966] {models.py:1569} INFO - Executing 
>  on 2018-10-24T06:16:00+00:00
> [2018-10-24 06:20:02,967] {base_task_runner.py:124} INFO - Running: ['bash', 
> '-c', 'airflow run pyk8s.v3 python-hello 2018-10-24T06:16:00+00:00 --job_id 
> 354 --raw -sd DAGS_FOLDER/pyk8s.v3.py --cfg_path /tmp/tmpf0saygt7']
> [2018-10-24 06:20:03,405] {base_task_runner.py:107} INFO - Job 354: Subtask 
> python-hello [2018-10-24 06:20:03,404] {settings.py:174} INFO - 
> setting.configure_orm(): Using pool settings. pool_size=5, pool_recycle=1800
> [2018-10-24 06:20:03,808] {base_task_runner.py:107} INFO - Job 354: Subtask 
> python-hello [2018-10-24 06:20:03,807] {__init__.py:51} INFO - Using executor 
> CeleryExecutor
> [2018-10-24 06:20:03,970] {base_task_runner.py:107} INFO 
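
For illustration, a workaround sketch of the refresh-on-401 pattern the report 
suggests; this is not the fix that was merged upstream, and the function name 
and namespace are made up:

{code:python}
# Reload kube config once when a 401 is raised, picking up a rotated
# GCP access token, then retry the call a single time.
from kubernetes import client, config
from kubernetes.client.rest import ApiException

def list_pods_with_refresh(namespace='default'):
    config.load_kube_config()
    try:
        return client.CoreV1Api().list_namespaced_pod(namespace)
    except ApiException as e:
        if e.status == 401:
            config.load_kube_config()  # token expired mid-run; reload once
            return client.CoreV1Api().list_namespaced_pod(namespace)
        raise
{code}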

[GitHub] Fokko commented on issue #4127: Bug Fix: Secrets object and key separated by ":"

2018-11-05 Thread GitBox
Fokko commented on issue #4127: Bug Fix: Secrets object and key separated by ":"
URL: 
https://github.com/apache/incubator-airflow/pull/4127#issuecomment-435996676
 
 
   @uesenthi Travis is failing. Can't we supply a dict instead of exploding 
strings?


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services
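
For illustration, the contrast the comment suggests, with made-up names that 
are not from the PR:

{code:python}
# String form that has to be "exploded" by the operator:
secret_spec = 'airflow-secrets:sql_alchemy_conn'
secret_name, secret_key = secret_spec.split(':', 1)

# Dict form suggested above: no parsing, and the separator can never
# collide with a character inside the secret name or key.
secret_spec = {'secret': 'airflow-secrets', 'key': 'sql_alchemy_conn'}
{code}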


[GitHub] Fokko commented on a change in pull request #4126: [AIRFLOW-2524] More AWS SageMaker operators, sensors for model, endpoint-config and endpoint

2018-11-05 Thread GitBox
Fokko commented on a change in pull request #4126: [AIRFLOW-2524] More AWS 
SageMaker operators, sensors for model, endpoint-config and endpoint
URL: https://github.com/apache/incubator-airflow/pull/4126#discussion_r230872182
 
 

 ##
 File path: airflow/contrib/operators/sagemaker_endpoint_config_operator.py
 ##
 @@ -0,0 +1,67 @@
+# -*- coding: utf-8 -*-
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+from airflow.contrib.operators.sagemaker_base_operator import 
SageMakerBaseOperator
+from airflow.utils.decorators import apply_defaults
+from airflow.exceptions import AirflowException
+
+
+class SageMakerEndpointConfigOperator(SageMakerBaseOperator):
+
+"""
+Create a SageMaker endpoint config.
+
+This operator returns The ARN of the endpoint config created in Amazon 
SageMaker
+
+:param config: The configuration necessary to create an endpoint config.
+
+For details of the configuration parameter, See:
+
https://boto3.amazonaws.com/v1/documentation/api/latest/reference/services/sagemaker.html#SageMaker.Client.create_endpoint_config
+:type config: dict
+:param aws_conn_id: The AWS connection ID to use.
+:type aws_conn_id: str
+"""  # noqa
 
 Review comment:
   Why the `noqa`?


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services
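
For illustration, a hedged usage sketch for the operator above; the import 
path follows the file path in the PR, the config dict mirrors boto3's 
create_endpoint_config, and all values are made up:

{code:python}
from airflow.contrib.operators.sagemaker_endpoint_config_operator import \
    SageMakerEndpointConfigOperator

create_endpoint_config = SageMakerEndpointConfigOperator(
    task_id='create_endpoint_config',
    aws_conn_id='aws_default',
    config={
        'EndpointConfigName': 'my-endpoint-config',
        'ProductionVariants': [{
            'VariantName': 'AllTraffic',
            'ModelName': 'my-model',
            'InitialInstanceCount': 1,
            'InstanceType': 'ml.m4.xlarge',
        }],
    },
)
{code}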


[GitHub] Fokko commented on issue #4125: [AIRFLOW-2715] Pick up the region setting while launching Dataflow templates

2018-11-05 Thread GitBox
Fokko commented on issue #4125: [AIRFLOW-2715] Pick up the region setting while 
launching Dataflow templates
URL: 
https://github.com/apache/incubator-airflow/pull/4125#issuecomment-435994411
 
 
   @janhicken Retriggered the CI.


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] ashb commented on issue #4006: [AIRFLOW-3164] Verify server certificate when connecting to LDAP

2018-11-05 Thread GitBox
ashb commented on issue #4006: [AIRFLOW-3164] Verify server certificate when 
connecting to LDAP
URL: 
https://github.com/apache/incubator-airflow/pull/4006#issuecomment-435993735
 
 
   In the fixup commit I just pushed I have re-enabled the ability to not have 
to use a certificate on the ldap server - do we think we should force people to 
use ldaps or not?


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] Fokko closed pull request #4087: [AIRFLOW-2192] Allow non-latin1 usernames with MySQL back-end

2018-11-05 Thread GitBox
Fokko closed pull request #4087: [AIRFLOW-2192] Allow non-latin1 usernames with 
MySQL back-end
URL: https://github.com/apache/incubator-airflow/pull/4087
 
 
   

This is a PR merged from a forked repository.
As GitHub hides the original diff on merge, it is displayed below for
the sake of provenance:

As this is a foreign pull request (from a fork), the diff is supplied
below (as it won't show otherwise due to GitHub magic):

diff --git a/airflow/config_templates/default_airflow.cfg 
b/airflow/config_templates/default_airflow.cfg
index 9a60bb99a7..b09f62c16d 100644
--- a/airflow/config_templates/default_airflow.cfg
+++ b/airflow/config_templates/default_airflow.cfg
@@ -86,6 +86,9 @@ executor = SequentialExecutor
 # their website
 sql_alchemy_conn = sqlite:///{AIRFLOW_HOME}/airflow.db
 
+# The encoding for the databases
+sql_engine_encoding = utf-8
+
 # If SqlAlchemy should pool database connections.
 sql_alchemy_pool_enabled = True
 
diff --git a/airflow/settings.py b/airflow/settings.py
index 098164cdc7..8f8420ea22 100644
--- a/airflow/settings.py
+++ b/airflow/settings.py
@@ -155,7 +155,7 @@ def configure_orm(disable_connection_pool=False):
 engine_args['poolclass'] = NullPool
 log.debug("settings.configure_orm(): Using NullPool")
 elif 'sqlite' not in SQL_ALCHEMY_CONN:
-# Engine args not supported by sqlite.
+# Pool size engine args not supported by sqlite.
 # If no config value is defined for the pool size, select a reasonable 
value.
 # 0 means no limit, which could lead to exceeding the Database 
connection limit.
 try:
@@ -177,6 +177,16 @@ def configure_orm(disable_connection_pool=False):
 engine_args['pool_size'] = pool_size
 engine_args['pool_recycle'] = pool_recycle
 
+try:
+# Allow the user to specify an encoding for their DB otherwise default
+# to utf-8 so jobs & users with non-latin1 characters can still use
+# us.
+engine_args['encoding'] = conf.get('core', 'SQL_ENGINE_ENCODING')
+except conf.AirflowConfigException:
+engine_args['encoding'] = 'utf-8'
+# For Python2 we get back a newstr and need a str
+engine_args['encoding'] = engine_args['encoding'].__str__()
+
 engine = create_engine(SQL_ALCHEMY_CONN, **engine_args)
 reconnect_timeout = conf.getint('core', 'SQL_ALCHEMY_RECONNECT_TIMEOUT')
 setup_event_handlers(engine, reconnect_timeout)
diff --git a/tests/core.py b/tests/core.py
index 679ddbc125..191af11ff3 100644
--- a/tests/core.py
+++ b/tests/core.py
@@ -2121,6 +2121,11 @@ def test_password_user_authenticate(self):
 self.password_user.password = "secure_password"
 self.assertTrue(self.password_user.authenticate("secure_password"))
 
+def test_password_unicode_user_authenticate(self):
+self.password_user.username = u"🐼"  # This is a panda
+self.password_user.password = "secure_password"
+self.assertTrue(self.password_user.authenticate("secure_password"))
+
 def test_password_authenticate_session(self):
 from airflow.contrib.auth.backends.password_auth import PasswordUser
 self.password_user.password = 'test_password'


 


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services
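
For illustration, how the new setting would be used in airflow.cfg, assuming 
the patch above is applied (the connection string is made up):

{code}
[core]
sql_alchemy_conn = mysql://airflow:airflow@localhost/airflow
# Optional; the code falls back to utf-8 when the key is absent.
sql_engine_encoding = utf-8
{code}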


[jira] [Commented] (AIRFLOW-2192) Don't authenticate on Google Authentication

2018-11-05 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-2192?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16675596#comment-16675596
 ] 

ASF GitHub Bot commented on AIRFLOW-2192:
-

Fokko closed pull request #4087: [AIRFLOW-2192] Allow non-latin1 usernames with 
MySQL back-end
URL: https://github.com/apache/incubator-airflow/pull/4087
 
 
   

This is a PR merged from a forked repository.
As GitHub hides the original diff on merge, it is displayed below for
the sake of provenance:

As this is a foreign pull request (from a fork), the diff is supplied
below (as it won't show otherwise due to GitHub magic):

diff --git a/airflow/config_templates/default_airflow.cfg 
b/airflow/config_templates/default_airflow.cfg
index 9a60bb99a7..b09f62c16d 100644
--- a/airflow/config_templates/default_airflow.cfg
+++ b/airflow/config_templates/default_airflow.cfg
@@ -86,6 +86,9 @@ executor = SequentialExecutor
 # their website
 sql_alchemy_conn = sqlite:///{AIRFLOW_HOME}/airflow.db
 
+# The encoding for the databases
+sql_engine_encoding = utf-8
+
 # If SqlAlchemy should pool database connections.
 sql_alchemy_pool_enabled = True
 
diff --git a/airflow/settings.py b/airflow/settings.py
index 098164cdc7..8f8420ea22 100644
--- a/airflow/settings.py
+++ b/airflow/settings.py
@@ -155,7 +155,7 @@ def configure_orm(disable_connection_pool=False):
 engine_args['poolclass'] = NullPool
 log.debug("settings.configure_orm(): Using NullPool")
 elif 'sqlite' not in SQL_ALCHEMY_CONN:
-# Engine args not supported by sqlite.
+# Pool size engine args not supported by sqlite.
 # If no config value is defined for the pool size, select a reasonable 
value.
 # 0 means no limit, which could lead to exceeding the Database 
connection limit.
 try:
@@ -177,6 +177,16 @@ def configure_orm(disable_connection_pool=False):
 engine_args['pool_size'] = pool_size
 engine_args['pool_recycle'] = pool_recycle
 
+try:
+# Allow the user to specify an encoding for their DB otherwise default
+# to utf-8 so jobs & users with non-latin1 characters can still use
+# us.
+engine_args['encoding'] = conf.get('core', 'SQL_ENGINE_ENCODING')
+except conf.AirflowConfigException:
+engine_args['encoding'] = 'utf-8'
+# For Python2 we get back a newstr and need a str
+engine_args['encoding'] = engine_args['encoding'].__str__()
+
 engine = create_engine(SQL_ALCHEMY_CONN, **engine_args)
 reconnect_timeout = conf.getint('core', 'SQL_ALCHEMY_RECONNECT_TIMEOUT')
 setup_event_handlers(engine, reconnect_timeout)
diff --git a/tests/core.py b/tests/core.py
index 679ddbc125..191af11ff3 100644
--- a/tests/core.py
+++ b/tests/core.py
@@ -2121,6 +2121,11 @@ def test_password_user_authenticate(self):
 self.password_user.password = "secure_password"
 self.assertTrue(self.password_user.authenticate("secure_password"))
 
+def test_password_unicode_user_authenticate(self):
+self.password_user.username = u"🐼"  # This is a panda
+self.password_user.password = "secure_password"
+self.assertTrue(self.password_user.authenticate("secure_password"))
+
 def test_password_authenticate_session(self):
 from airflow.contrib.auth.backends.password_auth import PasswordUser
 self.password_user.password = 'test_password'


 


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


> Don't authenticate on Google Authentication
> ---
>
> Key: AIRFLOW-2192
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2192
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: authentication
>Affects Versions: 1.8.0
> Environment: OS: Amazon Linux AMI release 2017.09
> RAM: 30.5
> CPU: 4
> Amazon Instance Type: R4.xlarge
> Python: 2.7.13
>Reporter: Fernando Ike
>Assignee: holdenk
>Priority: Critical
> Fix For: 1.10.1
>
> Attachments: airflow.log
>
>
> It's a weird, I tried to login using Google Authentication and Airflow 
> returned "_UnicodeEncodeError: 'latin-1' codec can't encode character 
> u'\u200b' in position 8: ordinal not in range(256)_". 
> So, my google profile was:
> _First Name: Fernando_
> _Last Name: Ike_
> I changed my profile just "_Ike"_ in the "First Name" and now I can login. In 
> the attachment is the log related:



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] Fokko commented on issue #4087: [AIRFLOW-2192] Allow non-latin1 usernames with MySQL back-end

2018-11-05 Thread GitBox
Fokko commented on issue #4087: [AIRFLOW-2192] Allow non-latin1 usernames with 
MySQL back-end
URL: 
https://github.com/apache/incubator-airflow/pull/4087#issuecomment-435992032
 
 
   Thanks Holden, another happy 🐼 saved


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services

