[GitHub] Fokko commented on issue #4283: [AIRFLOW-3450] Remove unnecessary sigint handler

2018-12-13 Thread GitBox
Fokko commented on issue #4283: [AIRFLOW-3450] Remove unnecessary sigint handler
URL: 
https://github.com/apache/incubator-airflow/pull/4283#issuecomment-446878286
 
 
   Very good point, @NielsZeilemaker. Thanks for the elaboration. Setting a 
flag that causes the scheduler to shut down sounds like a better plan. In that 
case we should catch the `SIGTERM`, set a flag that stops the scheduler loop, 
and shut down the application within 30 seconds. Ref: 
https://pracucci.com/graceful-shutdown-of-kubernetes-pods.html
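
   The flag-based pattern described above could be sketched as follows 
(hypothetical names; this is a sketch, not the actual Airflow scheduler code):

```python
import signal


class SchedulerShutdown:
    """Translate SIGTERM into a flag the scheduler loop can poll."""

    def __init__(self):
        self.should_stop = False
        signal.signal(signal.SIGTERM, self._handle_sigterm)

    def _handle_sigterm(self, signum, frame):
        # No sys.exit() here: just record the request so the loop can
        # finish its current pass and shut down cleanly.
        self.should_stop = True


def scheduler_loop(shutdown, max_passes=100):
    """Run scheduling passes until asked to stop; returns passes done."""
    passes = 0
    while passes < max_passes and not shutdown.should_stop:
        # ... one scheduling pass would go here ...
        passes += 1
    return passes
```

   An external supervisor (e.g. Kubernetes) would then send `SIGKILL` only 
after the grace period expires.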


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] Fokko closed pull request #4283: [AIRFLOW-3450] Remove unnecessary sigint handler

2018-12-13 Thread GitBox
Fokko closed pull request #4283: [AIRFLOW-3450] Remove unnecessary sigint 
handler
URL: https://github.com/apache/incubator-airflow/pull/4283
 
 
   

This is a PR merged from a forked repository.
As GitHub hides the original diff on merge, it is displayed below for
the sake of provenance:


diff --git a/airflow/bin/cli.py b/airflow/bin/cli.py
index de6f0c9cab..bcb65c1052 100644
--- a/airflow/bin/cli.py
+++ b/airflow/bin/cli.py
@@ -85,10 +85,6 @@
 DAGS_FOLDER = '[AIRFLOW_HOME]/dags'
 
 
-def sigint_handler(sig, frame):
-    sys.exit(0)
-
-
 def sigquit_handler(sig, frame):
     """Helps debug deadlocks by printing stacktraces when this gets a SIGQUIT
     e.g. kill -s QUIT <PID> or CTRL+\
@@ -997,8 +993,6 @@ def scheduler(args):
         stdout.close()
         stderr.close()
     else:
-        signal.signal(signal.SIGINT, sigint_handler)
-        signal.signal(signal.SIGTERM, sigint_handler)
         signal.signal(signal.SIGQUIT, sigquit_handler)
         job.run()
 
@@ -1076,9 +1070,6 @@ def worker(args):
         stdout.close()
         stderr.close()
     else:
-        signal.signal(signal.SIGINT, sigint_handler)
-        signal.signal(signal.SIGTERM, sigint_handler)
-
         sp = subprocess.Popen(['airflow', 'serve_logs'], env=env, close_fds=True)
 
         worker.run(**options)
@@ -1306,9 +1297,6 @@ def flower(args):
         stdout.close()
         stderr.close()
     else:
-        signal.signal(signal.SIGINT, sigint_handler)
-        signal.signal(signal.SIGTERM, sigint_handler)
-
         os.execvp("flower", ['flower', '-b',
                              broka, address, port, api, flower_conf,
                              url_prefix, basic_auth])
 


 




[jira] [Commented] (AIRFLOW-3450) Remove unnecessary signal handlers

2018-12-13 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-3450?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16719892#comment-16719892
 ] 

ASF GitHub Bot commented on AIRFLOW-3450:
-

Fokko closed pull request #4283: [AIRFLOW-3450] Remove unnecessary sigint 
handler
URL: https://github.com/apache/incubator-airflow/pull/4283
 
 
   

This is a PR merged from a forked repository.
As GitHub hides the original diff on merge, it is displayed below for
the sake of provenance:


diff --git a/airflow/bin/cli.py b/airflow/bin/cli.py
index de6f0c9cab..bcb65c1052 100644
--- a/airflow/bin/cli.py
+++ b/airflow/bin/cli.py
@@ -85,10 +85,6 @@
 DAGS_FOLDER = '[AIRFLOW_HOME]/dags'
 
 
-def sigint_handler(sig, frame):
-    sys.exit(0)
-
-
 def sigquit_handler(sig, frame):
     """Helps debug deadlocks by printing stacktraces when this gets a SIGQUIT
     e.g. kill -s QUIT <PID> or CTRL+\
@@ -997,8 +993,6 @@ def scheduler(args):
         stdout.close()
         stderr.close()
     else:
-        signal.signal(signal.SIGINT, sigint_handler)
-        signal.signal(signal.SIGTERM, sigint_handler)
         signal.signal(signal.SIGQUIT, sigquit_handler)
         job.run()
 
@@ -1076,9 +1070,6 @@ def worker(args):
         stdout.close()
         stderr.close()
     else:
-        signal.signal(signal.SIGINT, sigint_handler)
-        signal.signal(signal.SIGTERM, sigint_handler)
-
         sp = subprocess.Popen(['airflow', 'serve_logs'], env=env, close_fds=True)
 
         worker.run(**options)
@@ -1306,9 +1297,6 @@ def flower(args):
         stdout.close()
         stderr.close()
     else:
-        signal.signal(signal.SIGINT, sigint_handler)
-        signal.signal(signal.SIGTERM, sigint_handler)
-
         os.execvp("flower", ['flower', '-b',
                              broka, address, port, api, flower_conf,
                              url_prefix, basic_auth])
 


 




> Remove unnecessary signal handlers 
> ---
>
> Key: AIRFLOW-3450
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3450
> Project: Apache Airflow
>  Issue Type: Task
>  Components: cli
>Reporter: Fokko Driesprong
>Assignee: Fokko Driesprong
>Priority: Major
>




--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] Fokko commented on issue #4068: [AIRFLOW-2310]: Add AWS Glue Job Compatibility to Airflow

2018-12-13 Thread GitBox
Fokko commented on issue #4068: [AIRFLOW-2310]: Add AWS Glue Job Compatibility 
to Airflow
URL: 
https://github.com/apache/incubator-airflow/pull/4068#issuecomment-446880175
 
 
   As stated earlier, my preference would be to have the operator itself poll 
for completion. If you think about atomicity: if the job fails, you will need 
to clear both the sensor and the operator. That makes life more complicated and 
adds very little value. Also, your DAGs will be twice as big.
   
   Please fix the flake8 issue: 
`./tests/contrib/operators/test_aws_glue_job_operator.py:42:9: F841 local 
variable 'some_script' is assigned to but never used`




[GitHub] Fokko commented on a change in pull request #4298: [AIRFLOW-3478] Make sure that the session is closed

2018-12-13 Thread GitBox
Fokko commented on a change in pull request #4298: [AIRFLOW-3478] Make sure 
that the session is closed
URL: https://github.com/apache/incubator-airflow/pull/4298#discussion_r241308955
 
 

 ##
 File path: airflow/bin/cli.py
 ##
 @@ -456,14 +448,12 @@ def _run(args, dag, ti):
     if args.ship_dag:
         try:
             # Running remotely, so pickling the DAG
-            session = settings.Session()
-            pickle = DagPickle(dag)
-            session.add(pickle)
-            session.commit()
-            pickle_id = pickle.id
-            # TODO: This should be written to a log
-            print('Pickled dag {dag} as pickle_id:{pickle_id}'
 
 Review comment:
   Good point @mik-laj 
   
   The line is unrelated to the code, but I will push the fix in this PR anyway.




[GitHub] Fokko commented on issue #4057: [AIRFLOW-3216] HiveServer2Hook need a password with LDAP authentication

2018-12-13 Thread GitBox
Fokko commented on issue #4057: [AIRFLOW-3216] HiveServer2Hook need a password 
with LDAP authentication
URL: 
https://github.com/apache/incubator-airflow/pull/4057#issuecomment-446887716
 
 
   Thanks @jongyoul 




[jira] [Resolved] (AIRFLOW-3216) HiveServer2Hook need a password with LDAP authentication

2018-12-13 Thread Fokko Driesprong (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-3216?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Fokko Driesprong resolved AIRFLOW-3216.
---
   Resolution: Fixed
Fix Version/s: 2.0.0

> HiveServer2Hook need a password with LDAP authentication
> 
>
> Key: AIRFLOW-3216
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3216
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: hive_hooks
>Affects Versions: 1.10.0
>Reporter: Jongyoul Lee
>Assignee: Jongyoul Lee
>Priority: Major
> Fix For: 2.0.0
>
>
> In the case where HiveServer2 is used in an LDAP auth environment, we need to 
> set a password to access HiveServer2.





[GitHub] NielsZeilemaker commented on issue #4283: [AIRFLOW-3450] Remove unnecessary sigint handler

2018-12-13 Thread GitBox
NielsZeilemaker commented on issue #4283: [AIRFLOW-3450] Remove unnecessary 
sigint handler
URL: 
https://github.com/apache/incubator-airflow/pull/4283#issuecomment-446890350
 
 
   If you're going to fix/mess with it, also have a look at this line:
   
https://github.com/apache/incubator-airflow/blob/1d53f939669102cd0c8461ad9d756b3e0cf74dbe/airflow/jobs.py#L206
   
   Doesn't seem like a good idea to mark a job as a success upon a `SystemExit`…
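
   To illustrate the concern: a job runner should let `SystemExit` propagate 
rather than record a success, roughly like this sketch (hypothetical helper, 
not the actual `jobs.py` code):

```python
def run_job(job_callable):
    """Sketch: run a job, letting SystemExit propagate instead of being
    treated as a successful completion."""
    try:
        job_callable()
    except SystemExit:
        # A SystemExit means the process is being terminated; re-raising
        # keeps the job from being marked as a success.
        raise
    return 'success'
```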




[GitHub] oelesinsc24 commented on issue #4068: [AIRFLOW-2310]: Add AWS Glue Job Compatibility to Airflow

2018-12-13 Thread GitBox
oelesinsc24 commented on issue #4068: [AIRFLOW-2310]: Add AWS Glue Job 
Compatibility to Airflow
URL: 
https://github.com/apache/incubator-airflow/pull/4068#issuecomment-446891406
 
 
   > Do you ever do anything else than submit the job and then immediately 
start a sensor to poll for it?
   
   Now I get the point and I agree as well. I will add the necessary changes. 
@Fokko, thanks for the clarification as well.




[GitHub] Fokko commented on issue #4283: [AIRFLOW-3450] Remove unnecessary sigint handler

2018-12-13 Thread GitBox
Fokko commented on issue #4283: [AIRFLOW-3450] Remove unnecessary sigint handler
URL: 
https://github.com/apache/incubator-airflow/pull/4283#issuecomment-446891697
 
 
   @NielsZeilemaker My suggestion would be to keep the PRs nice and small, so 
that might be something for you to pick up :-)




[GitHub] Bl3f commented on issue #4084: [AIRFLOW-3205] Support multipart uploads to GCS

2018-12-13 Thread GitBox
Bl3f commented on issue #4084: [AIRFLOW-3205] Support multipart uploads to GCS
URL: 
https://github.com/apache/incubator-airflow/pull/4084#issuecomment-446894266
 
 
   I'm sorry to ask, but why don't you use the `resumable` param of 
`MediaFileUpload`? The upload by chunks is actually already implemented in this 
class (cf. 
https://developers.google.com/api-client-library/python/guide/media_upload) :/ 




[GitHub] Bl3f edited a comment on issue #4084: [AIRFLOW-3205] Support multipart uploads to GCS

2018-12-13 Thread GitBox
Bl3f edited a comment on issue #4084: [AIRFLOW-3205] Support multipart uploads 
to GCS
URL: 
https://github.com/apache/incubator-airflow/pull/4084#issuecomment-446894266
 
 
   I'm sorry to ask, but why don't you use the `resumable` param of 
`MediaFileUpload`? The upload by chunks is actually already implemented in this 
class (cf. 
https://developers.google.com/api-client-library/python/guide/media_upload) :/ 
   
   It would avoid maintaining some duplicated code in the operator, I guess.




[GitHub] Fokko commented on issue #4068: [AIRFLOW-2310]: Add AWS Glue Job Compatibility to Airflow

2018-12-13 Thread GitBox
Fokko commented on issue #4068: [AIRFLOW-2310]: Add AWS Glue Job Compatibility 
to Airflow
URL: 
https://github.com/apache/incubator-airflow/pull/4068#issuecomment-446896104
 
 
   Any time, @oelesinsc24. Let me know when you're ready, so we can get this 
in! 👍 




[GitHub] MarcusSorealheis commented on issue #4295: AIRFLOW-3452 removed an unused/dangerous display-none

2018-12-13 Thread GitBox
MarcusSorealheis commented on issue #4295: AIRFLOW-3452 removed an 
unused/dangerous display-none
URL: 
https://github.com/apache/incubator-airflow/pull/4295#issuecomment-446905720
 
 
   This is ready to go. It solves a simple problem.




[GitHub] MarcusSorealheis removed a comment on issue #4295: AIRFLOW-3452 removed an unused/dangerous display-none

2018-12-13 Thread GitBox
MarcusSorealheis removed a comment on issue #4295: AIRFLOW-3452 removed an 
unused/dangerous display-none
URL: 
https://github.com/apache/incubator-airflow/pull/4295#issuecomment-446905720
 
 
   This is ready to go. It solves a simple problem.




[GitHub] ashb commented on a change in pull request #2450: [Airflow-1413] Fix FTPSensor failing on error message with unexpected text.

2018-12-13 Thread GitBox
ashb commented on a change in pull request #2450: [Airflow-1413] Fix FTPSensor 
failing on error message with unexpected text.
URL: https://github.com/apache/incubator-airflow/pull/2450#discussion_r241337462
 
 

 ##
 File path: tests/contrib/sensors/test_ftp_sensor.py
 ##
 @@ -66,6 +71,27 @@ def test_poke_fails_due_error(self):
 
         self.assertTrue("530" in str(context.exception))
 
+    def test_poke_fail_on_transient_error(self):
+        op = FTPSensor(path="foobar.json", ftp_conn_id="bob_ftp",
+                       task_id="test_task")
+
+        self.hook_mock.get_mod_time.side_effect = \
+            error_perm("434: Host unavailable")
+
+        with self.assertRaises(error_perm) as context:
+            op.execute(None)
+
+        self.assertTrue("434" in str(context.exception))
+
+    def test_poke_ignore_transient_error(self):
+        op = FTPSensor(path="foobar.json", ftp_conn_id="bob_ftp",
+                       task_id="test_task", fail_on_transient_errors=False)
+
+        self.hook_mock.get_mod_time.side_effect = \
+            [error_perm("434: Host unavailable"), None]
+
+        self.assertFalse(op.poke(None))
+        self.assertTrue(op.poke(None))
 
 if __name__ == '__main__':
 
 Review comment:
   Tests are now failing the style check here with:
   
   > ./tests/contrib/sensors/test_ftp_sensor.py:96:1: E305 expected 2 blank 
lines after class or function definition, found 1
   
   ```suggestion
   
   if __name__ == '__main__':
   ```




[jira] [Commented] (AIRFLOW-3223) RBAC with GitHub Authentication

2018-12-13 Thread Ash Berlin-Taylor (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-3223?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16720011#comment-16720011
 ] 

Ash Berlin-Taylor commented on AIRFLOW-3223:


FAB uses Flask-OpenID for OAuth, and that should be workable for GitHub. I 
don't know the exact config you'd need, but it feels like it should be doable 
without any code changes.

Useful docs:
https://flask-appbuilder.readthedocs.io/en/latest/security.html#authentication-openid
https://pythonhosted.org/Flask-OpenID/
https://help.github.com/articles/authorizing-oauth-apps/
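
A sketch of what such a `webserver_config.py` might look like with FAB's OAuth 
support and a GitHub provider (the exact keys depend on the FAB version, so 
treat every name below as an assumption, not verified config):

```python
# webserver_config.py (untested sketch; key names assumed from FAB docs)
from flask_appbuilder.security.manager import AUTH_OAUTH

AUTH_TYPE = AUTH_OAUTH
OAUTH_PROVIDERS = [{
    'name': 'github',
    'token_key': 'access_token',
    'icon': 'fa-github',
    'remote_app': {
        'base_url': 'https://api.github.com',
        'request_token_params': {'scope': 'read:user'},
        'access_token_url': 'https://github.com/login/oauth/access_token',
        'authorize_url': 'https://github.com/login/oauth/authorize',
        'consumer_key': 'YOUR_CLIENT_ID',      # from the GitHub OAuth app
        'consumer_secret': 'YOUR_CLIENT_SECRET',
    },
}]
```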

> RBAC with GitHub Authentication
> 
>
> Key: AIRFLOW-3223
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3223
> Project: Apache Airflow
>  Issue Type: Wish
>  Components: authentication
>Affects Versions: 1.10.0
>Reporter: Vikram Fugro
>Assignee: Sai Phanindhra
>Priority: Major
>
> With airflow 1.10 released having RBAC support, I was wondering how to 
> configure GitHub Auth with Airflow's RBAC. In that case, I believe we don't 
> have to create any users via Airflow. Are there any notes on this?





[GitHub] Bl3f removed a comment on issue #4084: [AIRFLOW-3205] Support multipart uploads to GCS

2018-12-13 Thread GitBox
Bl3f removed a comment on issue #4084: [AIRFLOW-3205] Support multipart uploads 
to GCS
URL: 
https://github.com/apache/incubator-airflow/pull/4084#issuecomment-446894266
 
 
   I'm sorry to ask, but why don't you use the `resumable` param of 
`MediaFileUpload`? The upload by chunks is actually already implemented in this 
class (cf. 
https://developers.google.com/api-client-library/python/guide/media_upload) :/ 
   
   It would avoid maintaining some duplicated code in the operator, I guess.




[jira] [Created] (AIRFLOW-3509) add dataflow parameters validation

2018-12-13 Thread Tomoki Takahashi (JIRA)
Tomoki Takahashi created AIRFLOW-3509:
-

 Summary: add dataflow parameters validation
 Key: AIRFLOW-3509
 URL: https://issues.apache.org/jira/browse/AIRFLOW-3509
 Project: Apache Airflow
  Issue Type: Bug
Reporter: Tomoki Takahashi


[https://github.com/apache/incubator-airflow/blob/9f7f5e4a1eaae1da5f3ecdabe26984b9bcaa69fb/airflow/contrib/hooks/gcp_dataflow_hook.py#L274]

[https://cloud.google.com/dataflow/docs/reference/rest/v1b3/RuntimeEnvironment]

The additionalExperiments, network and subnetwork parameters were added; I 
would like them to be added to the validation as well.
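
A sketch of what such a validation could look like (hypothetical helper and 
key set, not the actual hook code):

```python
# Hypothetical sketch of validating template launch parameters against the
# Dataflow RuntimeEnvironment keys, including the newly added ones.
RUNTIME_ENVIRONMENT_KEYS = {
    'numWorkers', 'maxWorkers', 'zone', 'serviceAccountEmail',
    'tempLocation', 'bypassTempDirValidation', 'machineType',
    # keys recently added to the RuntimeEnvironment API:
    'additionalExperiments', 'network', 'subnetwork',
}


def validate_environment(environment):
    """Raise ValueError for any key the Dataflow API would reject."""
    unknown = set(environment) - RUNTIME_ENVIRONMENT_KEYS
    if unknown:
        raise ValueError(
            'Unsupported Dataflow environment keys: %s' % sorted(unknown))
    return environment
```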





[GitHub] gyamxxx opened a new pull request #4313: Update gcp_dataflow_hook.py

2018-12-13 Thread GitBox
gyamxxx opened a new pull request #4313: Update gcp_dataflow_hook.py
URL: https://github.com/apache/incubator-airflow/pull/4313
 
 
   #1
   
   Make sure you have checked _all_ steps below.
   
   ### Jira
   
   - [ ]  My PR addresses the following 
https://issues.apache.org/jira/browse/AIRFLOW-3509
   
   
   ### Description
   
   - [ ] Dataflow launch parameters were added, so I would like to add them to 
the parameter validation.
   
   ### Tests
   
   - [ ] My PR adds the following unit tests __OR__ does not need testing for 
this extremely good reason:
   
   ### Commits
   
   - [ ] My commits all reference Jira issues in their subject lines, and I 
have squashed multiple commits if they address the same issue. In addition, my 
commits follow the guidelines from "[How to write a good git commit 
message](http://chris.beams.io/posts/git-commit/)":
 1. Subject is separated from body by a blank line
 1. Subject is limited to 50 characters (not including Jira issue reference)
 1. Subject does not end with a period
 1. Subject uses the imperative mood ("add", not "adding")
 1. Body wraps at 72 characters
 1. Body explains "what" and "why", not "how"
   
   ### Documentation
   
   - [ ] In case of new functionality, my PR adds documentation that describes 
how to use it.
 - When adding new operators/hooks/sensors, the autoclass documentation 
generation needs to be added.
 - All the public functions and the classes in the PR contain docstrings 
that explain what it does
   
   ### Code Quality
   
   - [ ] Passes `flake8`
   




[jira] [Created] (AIRFLOW-3510) DockerOperator on OSX: Mounts denied. The path /var/folders/mk/xxx is not shared from OS X and is not known to Docker.\r\nYou can configure shared paths from Docker ->

2018-12-13 Thread Nar Kumar Chhantyal (JIRA)
Nar Kumar Chhantyal created AIRFLOW-3510:


 Summary: DockerOperator on OSX: Mounts denied. The path 
/var/folders/mk/xxx is not shared from OS X and is not known to Docker.\r\nYou 
can configure shared paths from Docker -> Preferences... -> File Sharing.
 Key: AIRFLOW-3510
 URL: https://issues.apache.org/jira/browse/AIRFLOW-3510
 Project: Apache Airflow
  Issue Type: Bug
  Components: docker, operators
Affects Versions: 1.10.1
Reporter: Nar Kumar Chhantyal
Assignee: Nar Kumar Chhantyal
 Fix For: 1.10.2


{{I get this when using DockerOperator on OSX}}
{code:java}
Mounts denied: \r\nThe path 
/var/folders/mk/_n3w1bts11bg3wvy1ln5d7c4k9_mgh/T/airflowtmpj94b7r9v\r\nis not 
shared from OS X and is not known to Docker.\r\nYou can configure shared paths 
from Docker -> Preferences... -> File Sharing.\r\nSee 
https://docs.docker.com/docker-for-mac/osxfs/#namespaces for more info.\r\n.'
{code}
{{This is a well-known issue with Docker for Mac: 
[https://stackoverflow.com/questions/45122459/docker-mounts-denied-the-paths-are-not-shared-from-os-x-and-are-not-known]}}

{{The solution mentioned there doesn't work because Airflow always creates a 
directory with a cryptic name like:}}
{code:java}
var/folders/mk/_n3w1bts11bg3wvy1ln5d7c4k9_mgh/T/airflowtmpj94b7r9v{code}

A solution could be to pass a directory name to TemporaryDirectory. I will 
send a patch later.
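
The proposed fix could look roughly like this (a sketch; `/tmp` is assumed 
here to be one of the paths Docker for Mac shares by default):

```python
from tempfile import TemporaryDirectory

# Create the scratch directory under a path that Docker for Mac shares by
# default (here /tmp, an assumed shared path), instead of letting tempfile
# pick the cryptic /var/folders/... default.
with TemporaryDirectory(prefix='airflowtmp', dir='/tmp') as tmp_dir:
    host_path = tmp_dir  # a path Docker can bind-mount into the container
```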






[jira] [Resolved] (AIRFLOW-3510) DockerOperator on OSX: Mounts denied. The path /var/folders/mk/xxx is not shared from OS X and is not known to Docker.\r\nYou can configure shared paths from Docker ->

2018-12-13 Thread Ash Berlin-Taylor (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-3510?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ash Berlin-Taylor resolved AIRFLOW-3510.

Resolution: Duplicate

Duplicate of AIRFLOW-1381, which has an abandoned PR; if you want to pick it up 
and open a new PR, that would be ace.

> DockerOperator on OSX: Mounts denied. The path /var/folders/mk/xxx is not 
> shared from OS X and is not known to Docker.\r\nYou can configure shared 
> paths from Docker -> Preferences... -> File Sharing.
> ---
>
> Key: AIRFLOW-3510
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3510
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: docker, operators
>Affects Versions: 1.10.1
>Reporter: Nar Kumar Chhantyal
>Assignee: Nar Kumar Chhantyal
>Priority: Major
>  Labels: bug
> Fix For: 1.10.2
>
>
> {{I get this when using DockerOperator on OSX}}
> {code:java}
> Mounts denied: \r\nThe path 
> /var/folders/mk/_n3w1bts11bg3wvy1ln5d7c4k9_mgh/T/airflowtmpj94b7r9v\r\nis not 
> shared from OS X and is not known to Docker.\r\nYou can configure shared 
> paths from Docker -> Preferences... -> File Sharing.\r\nSee 
> https://docs.docker.com/docker-for-mac/osxfs/#namespaces for more info.\r\n.'
> {code}
> {{This is a well-known issue with Docker for Mac: 
> [https://stackoverflow.com/questions/45122459/docker-mounts-denied-the-paths-are-not-shared-from-os-x-and-are-not-known]}}
> {{The solution mentioned there doesn't work because Airflow always creates a 
> directory with a cryptic name like:}}
> {code:java}
> var/folders/mk/_n3w1bts11bg3wvy1ln5d7c4k9_mgh/T/airflowtmpj94b7r9v{code}
> A solution could be to pass a directory name to TemporaryDirectory. I will 
> send a patch later.





[GitHub] sprzedwojski opened a new pull request #4314: [AIRFLOW-3398] Google Cloud Spanner instance database query operator

2018-12-13 Thread GitBox
sprzedwojski opened a new pull request #4314: [AIRFLOW-3398] Google Cloud 
Spanner instance database query operator
URL: https://github.com/apache/incubator-airflow/pull/4314
 
 
   ### Jira
   
   - [x] My PR addresses the following [Airflow 
Jira](https://issues.apache.org/jira/browse/AIRFLOW-3398) issues and references 
them in the PR title.
   
   ### Description
   
   - [x] Here are some details about my PR, including screenshots of any UI 
changes:
   
   Created an operator to enable executing arbitrary DML query (INSERT, UPDATE, 
DELETE) in a Transaction in Cloud Spanner:
   `CloudSpannerInstanceDatabaseQueryOperator`
   
   ### Tests
   
   - [x] My PR adds the following unit tests __OR__ does not need testing for 
this extremely good reason:
   
   `test_gcp_spanner_operator.py`
   
   ### Commits
   
   - [x] My commits all reference Jira issues in their subject lines, and I 
have squashed multiple commits if they address the same issue. In addition, my 
commits follow the guidelines from "[How to write a good git commit 
message](http://chris.beams.io/posts/git-commit/)":
 1. Subject is separated from body by a blank line
 1. Subject is limited to 50 characters (not including Jira issue reference)
 1. Subject does not end with a period
 1. Subject uses the imperative mood ("add", not "adding")
 1. Body wraps at 72 characters
 1. Body explains "what" and "why", not "how"
   
   ### Documentation
   
   - [x] In case of new functionality, my PR adds documentation that describes 
how to use it.
 - When adding new operators/hooks/sensors, the autoclass documentation 
generation needs to be added.
 - All the public functions and the classes in the PR contain docstrings 
that explain what it does
   
   ### Code Quality
   
   - [x] Passes `flake8`
   




[jira] [Commented] (AIRFLOW-3398) Google Cloud Spanner instance database query operator

2018-12-13 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-3398?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16720046#comment-16720046
 ] 

ASF GitHub Bot commented on AIRFLOW-3398:
-

sprzedwojski opened a new pull request #4314: [AIRFLOW-3398] Google Cloud 
Spanner instance database query operator
URL: https://github.com/apache/incubator-airflow/pull/4314
 
 
   ### Jira
   
   - [x] My PR addresses the following [Airflow 
Jira](https://issues.apache.org/jira/browse/AIRFLOW-3398) issues and references 
them in the PR title.
   
   ### Description
   
   - [x] Here are some details about my PR, including screenshots of any UI 
changes:
   
   Created an operator to enable executing arbitrary DML query (INSERT, UPDATE, 
DELETE) in a Transaction in Cloud Spanner:
   `CloudSpannerInstanceDatabaseQueryOperator`
   
   ### Tests
   
   - [x] My PR adds the following unit tests __OR__ does not need testing for 
this extremely good reason:
   
   `test_gcp_spanner_operator.py`
   
   ### Commits
   
   - [x] My commits all reference Jira issues in their subject lines, and I 
have squashed multiple commits if they address the same issue. In addition, my 
commits follow the guidelines from "[How to write a good git commit 
message](http://chris.beams.io/posts/git-commit/)":
 1. Subject is separated from body by a blank line
 1. Subject is limited to 50 characters (not including Jira issue reference)
 1. Subject does not end with a period
 1. Subject uses the imperative mood ("add", not "adding")
 1. Body wraps at 72 characters
 1. Body explains "what" and "why", not "how"
   
   ### Documentation
   
   - [x] In case of new functionality, my PR adds documentation that describes 
how to use it.
 - When adding new operators/hooks/sensors, the autoclass documentation 
generation needs to be added.
 - All the public functions and the classes in the PR contain docstrings 
that explain what it does
   
   ### Code Quality
   
   - [x] Passes `flake8`
   




> Google Cloud Spanner instance database query operator
> -
>
> Key: AIRFLOW-3398
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3398
> Project: Apache Airflow
>  Issue Type: New Feature
>  Components: gcp
>Reporter: Szymon Przedwojski
>Assignee: Szymon Przedwojski
>Priority: Minor
>
> Creating an operator to enable executing arbitrary SQL in a Transaction in 
> Cloud Spanner.
> https://googleapis.github.io/google-cloud-python/latest/spanner/index.html#executing-arbitrary-sql-in-a-transaction





[jira] [Created] (AIRFLOW-3511) Create GCP Memorystore Redis Hook and Operators

2018-12-13 Thread Ryan Yuan (JIRA)
Ryan Yuan created AIRFLOW-3511:
--

 Summary: Create GCP Memorystore Redis Hook and Operators
 Key: AIRFLOW-3511
 URL: https://issues.apache.org/jira/browse/AIRFLOW-3511
 Project: Apache Airflow
  Issue Type: New Feature
Reporter: Ryan Yuan
Assignee: Ryan Yuan


Add Google Cloud Memorystore hook and operator to Airflow 
(https://cloud.google.com/memorystore/docs/redis/reference/rest/)





[jira] [Closed] (AIRFLOW-3433) Create Google Cloud Spanner Hook

2018-12-13 Thread Ryan Yuan (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-3433?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ryan Yuan closed AIRFLOW-3433.
--
Resolution: Duplicate

> Create Google Cloud Spanner Hook
> 
>
> Key: AIRFLOW-3433
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3433
> Project: Apache Airflow
>  Issue Type: New Feature
>Reporter: Ryan Yuan
>Assignee: Ryan Yuan
>Priority: Major
>
> Add Google Cloud Spanner hook to Airflow 
> (https://cloud.google.com/spanner/docs/apis)





[jira] [Work started] (AIRFLOW-3511) Create GCP Memorystore Redis Hook and Operators

2018-12-13 Thread Ryan Yuan (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-3511?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Work on AIRFLOW-3511 started by Ryan Yuan.
--
> Create GCP Memorystore Redis Hook and Operators
> ---
>
> Key: AIRFLOW-3511
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3511
> Project: Apache Airflow
>  Issue Type: New Feature
>Reporter: Ryan Yuan
>Assignee: Ryan Yuan
>Priority: Major
>
> Add Google Cloud Memorystore hook and operator to Airflow 
> (https://cloud.google.com/memorystore/docs/redis/reference/rest/)





[jira] [Updated] (AIRFLOW-3511) Create GCP Memorystore Redis Hook

2018-12-13 Thread Ryan Yuan (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-3511?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ryan Yuan updated AIRFLOW-3511:
---
Summary: Create GCP Memorystore Redis Hook  (was: Create GCP Memorystore 
Redis Hook and Operators)

> Create GCP Memorystore Redis Hook
> -
>
> Key: AIRFLOW-3511
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3511
> Project: Apache Airflow
>  Issue Type: New Feature
>Reporter: Ryan Yuan
>Assignee: Ryan Yuan
>Priority: Major
>
> Add Google Cloud Memorystore hook and operator to Airflow 
> (https://cloud.google.com/memorystore/docs/redis/reference/rest/)





[jira] [Updated] (AIRFLOW-3511) Create GCP Memorystore Redis Hook

2018-12-13 Thread Ryan Yuan (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-3511?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ryan Yuan updated AIRFLOW-3511:
---
Description: Add Google Cloud Memorystore hook to Airflow 
([https://cloud.google.com/memorystore/docs/redis/reference/rest/])  (was: Add 
Google Cloud Memorystore hook and operator to Airflow 
(https://cloud.google.com/memorystore/docs/redis/reference/rest/))

> Create GCP Memorystore Redis Hook
> -
>
> Key: AIRFLOW-3511
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3511
> Project: Apache Airflow
>  Issue Type: New Feature
>Reporter: Ryan Yuan
>Assignee: Ryan Yuan
>Priority: Major
>
> Add Google Cloud Memorystore hook to Airflow 
> ([https://cloud.google.com/memorystore/docs/redis/reference/rest/])





[jira] [Created] (AIRFLOW-3512) Create GCP Memorystore Redis Operators

2018-12-13 Thread Ryan Yuan (JIRA)
Ryan Yuan created AIRFLOW-3512:
--

 Summary: Create GCP Memorystore Redis Operators
 Key: AIRFLOW-3512
 URL: https://issues.apache.org/jira/browse/AIRFLOW-3512
 Project: Apache Airflow
  Issue Type: New Feature
Reporter: Ryan Yuan
Assignee: Ryan Yuan


Add Google Cloud Memorystore operators to Airflow 
([https://cloud.google.com/memorystore/docs/redis/reference/rest/])





[jira] [Work started] (AIRFLOW-3512) Create GCP Memorystore Redis Operators

2018-12-13 Thread Ryan Yuan (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-3512?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Work on AIRFLOW-3512 started by Ryan Yuan.
--
> Create GCP Memorystore Redis Operators
> --
>
> Key: AIRFLOW-3512
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3512
> Project: Apache Airflow
>  Issue Type: New Feature
>Reporter: Ryan Yuan
>Assignee: Ryan Yuan
>Priority: Major
>
> Add Google Cloud Memorystore operators to Airflow 
> ([https://cloud.google.com/memorystore/docs/redis/reference/rest/])





[GitHub] stale[bot] commented on issue #3783: [AIRFLOW-2937] Support HTTPS in Http connection form environment variables

2018-12-13 Thread GitBox
stale[bot] commented on issue #3783: [AIRFLOW-2937] Support HTTPS in Http 
connection form environment variables
URL: 
https://github.com/apache/incubator-airflow/pull/3783#issuecomment-446943837
 
 
   This issue has been automatically marked as stale because it has not had 
recent activity. It will be closed if no further activity occurs. Thank you for 
your contributions.
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] chhantyal opened a new pull request #4315: [AIRFLOW-1381] Specify host temporary directory

2018-12-13 Thread GitBox
chhantyal opened a new pull request #4315: [AIRFLOW-1381] Specify host 
temporary directory
URL: https://github.com/apache/incubator-airflow/pull/4315
 
 
   
   Allow user to specify temporary directory to use on the host machine;
   default settings will cause an error on OS X due to the standard
   temporary directory not being shared to Docker.
   
   This is updated PR for https://github.com/apache/incubator-airflow/pull/2418
   
   Dear Airflow maintainers,
   
   Please accept this PR. I understand that it will not be reviewed until I 
have checked off all the steps below!
   JIRA
   
   My PR addresses the following Airflow JIRA issues and references them in 
the PR title. For example, "[AIRFLOW-XXX] My Airflow PR"
   https://issues.apache.org/jira/browse/AIRFLOW-1381
   
   Description
   
   Here are some details about my PR, including screenshots of any UI 
changes:
   
   DockerOperator currently uses the standard TemporaryDirectory helper, which 
calls the standard library. On OS X the default directory cannot be shared to 
Docker containers. This PR allows the host temporary directory to be specified 
by the user.
   Tests
   
   My PR adds the following unit tests OR does not need testing for this 
extremely good reason:
   
   Added a test into docker_operator to ensure that mkdtemp is called correctly.
   Commits
   
   My commits all reference JIRA issues in their subject lines, and I have 
squashed multiple commits if they address the same issue. In addition, my 
commits follow the guidelines from "How to write a good git commit message":
   
   Subject is separated from body by a blank line
   Subject is limited to 50 characters
   Subject does not end with a period
   Subject uses the imperative mood ("add", not "adding")
   Body wraps at 72 characters
   Body explains "what" and "why", not "how"
   




[jira] [Commented] (AIRFLOW-1381) DockerOperator does not allow specification of temporary directory on host

2018-12-13 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-1381?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16720093#comment-16720093
 ] 

ASF GitHub Bot commented on AIRFLOW-1381:
-

chhantyal opened a new pull request #4315: [AIRFLOW-1381] Specify host 
temporary directory
URL: https://github.com/apache/incubator-airflow/pull/4315
 
 
   




> DockerOperator does not allow specification of temporary directory on host
> --
>
> Key: AIRFLOW-1381
> URL: https://issues.apache.org/jira/browse/AIRFLOW-1381
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: docker, operators
>Affects Versions: 1.9.0
> Environment: Causes failure under default conditions in OS X
>Reporter: Benjamin Sims
>Priority: Minor
>   Original Estimate: 1h
>  Remaining Estimate: 1h
>
> The DockerOperator uses the standard TemporaryDirectory mechanism to add a volume to 
> the container which is mapped to a temporary directory on the host. 
> By default, TemporaryDirectory places this in a location such as 
> /var/folders/xxx/x/T/airflowtmpyi9pnn6w. 
> However, Docker on OS X only allows certain folders to be shared into a 
> container. /var/folders is not shared on OS X (and cannot be shared) and the 
> operator will therefore fail with a 'Mounts denied' message.
> This can be solved by setting the environment variable TMPDIR to a path which 
> Docker is able to share. However, a solution to set this in the Operator 
> itself would also be useful.
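A minimal sketch of the workaround described above: create the directory that will be mounted into the container under a base path the Docker daemon is allowed to share. The `host_tmp_dir` parameter name and the `/tmp` default are illustrative assumptions here, not necessarily Airflow's actual API.

```python
import os
import tempfile

def make_host_tmp_dir(host_tmp_dir="/tmp"):
    """Create the host-side temporary directory to be mounted into the
    container, under a base path Docker for Mac is allowed to share
    (unlike the default /var/folders/... location)."""
    return tempfile.mkdtemp(prefix="airflowtmp", dir=host_tmp_dir)

# The resulting path (e.g. /tmp/airflowtmpXXXXXXXX) can be bind-mounted safely.
mount_source = make_host_tmp_dir()
```

Passing `dir=` to `mkdtemp` is the same mechanism the `TMPDIR` environment variable controls, but scoped to the operator instead of the whole process.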



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] ryanyuan opened a new pull request #4316: [AIRFLOW-3511] Create GCP Memorystore Redis Hook

2018-12-13 Thread GitBox
ryanyuan opened a new pull request #4316:  [AIRFLOW-3511] Create GCP 
Memorystore Redis Hook
URL: https://github.com/apache/incubator-airflow/pull/4316
 
 
   Make sure you have checked _all_ steps below.
   
   ### Jira
   
   - [ ] My PR addresses the following 
[Airflow-3511](https://issues.apache.org/jira/browse/AIRFLOW-3511) issues and 
references them in the PR title. For example, "\[AIRFLOW-XXX\] My Airflow PR"
   
   ### Description
   
   - [ ] Here are some details about my PR, including screenshots of any UI 
changes:
   Add a hook to connect to GCP Cloud Memorystore and perform operations such 
as instance creation, instance deletion, etc.
   
   ### Tests
   
   - [ ] My PR adds the following unit tests __OR__ does not need testing for 
this extremely good reason:
   tests.contrib.hooks.test_gcp_memorystore_hook
   
   ### Commits
   
   - [ ] My commits all reference Jira issues in their subject lines, and I 
have squashed multiple commits if they address the same issue. In addition, my 
commits follow the guidelines from "[How to write a good git commit 
message](http://chris.beams.io/posts/git-commit/)":
 1. Subject is separated from body by a blank line
 1. Subject is limited to 50 characters (not including Jira issue reference)
 1. Subject does not end with a period
 1. Subject uses the imperative mood ("add", not "adding")
 1. Body wraps at 72 characters
 1. Body explains "what" and "why", not "how"
   
   ### Documentation
   
   - [ ] In case of new functionality, my PR adds documentation that describes 
how to use it.
 - When adding new operators/hooks/sensors, the autoclass documentation 
generation needs to be added.
 - All the public functions and the classes in the PR contain docstrings 
that explain what it does
   
   ### Code Quality
   
   - [ ] Passes `flake8`
   




[jira] [Commented] (AIRFLOW-3511) Create GCP Memorystore Redis Hook

2018-12-13 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-3511?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16720096#comment-16720096
 ] 

ASF GitHub Bot commented on AIRFLOW-3511:
-

ryanyuan opened a new pull request #4316:  [AIRFLOW-3511] Create GCP 
Memorystore Redis Hook
URL: https://github.com/apache/incubator-airflow/pull/4316
 
 




> Create GCP Memorystore Redis Hook
> -
>
> Key: AIRFLOW-3511
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3511
> Project: Apache Airflow
>  Issue Type: New Feature
>Reporter: Ryan Yuan
>Assignee: Ryan Yuan
>Priority: Major
>
> Add Google Cloud Memorystore hook to Airflow 
> ([https://cloud.google.com/memorystore/docs/redis/reference/rest/])



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] ashb commented on issue #4316: [AIRFLOW-3511] Create GCP Memorystore Redis Hook

2018-12-13 Thread GitBox
ashb commented on issue #4316:  [AIRFLOW-3511] Create GCP Memorystore Redis Hook
URL: 
https://github.com/apache/incubator-airflow/pull/4316#issuecomment-446956464
 
 
   I'm not too familiar with GCP, so this may not be a sensible question.
   
   What is the use case here that you want to create and delete these resources 
from Airflow? My immediate thought is that this sounds like slightly the wrong 
fit for Airflow, and a tool like Terraform would be better suited to this task.




[GitHub] chhantyal commented on issue #4315: [AIRFLOW-1381] Specify host temporary directory

2018-12-13 Thread GitBox
chhantyal commented on issue #4315: [AIRFLOW-1381] Specify host temporary 
directory
URL: 
https://github.com/apache/incubator-airflow/pull/4315#issuecomment-446965912
 
 
   CI build failed but I think it's not related to this PR?




[GitHub] eladkal opened a new pull request #4317: [AIRFLOW-2629] Change reference of hive_hooks to hive_hook everywhere

2018-12-13 Thread GitBox
eladkal opened a new pull request #4317: [AIRFLOW-2629] Change reference of 
hive_hooks to hive_hook everywhere
URL: https://github.com/apache/incubator-airflow/pull/4317
 
 
   
   ### Jira
   
   - [ ] My PR addresses the following [Airflow 
Jira](https://issues.apache.org/jira/browse/AIRFLOW-2629) 
   
   ### Description
   Renaming hive_hooks to hive_hook for consistency with other packages.
   Changing all references to match the updated hook.
   




[jira] [Commented] (AIRFLOW-2629) Rename a.h.hive_hooks to a.h.hive_hook

2018-12-13 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-2629?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16720163#comment-16720163
 ] 

ASF GitHub Bot commented on AIRFLOW-2629:
-

eladkal opened a new pull request #4317: [AIRFLOW-2629] Change reference of 
hive_hooks to hive_hook everywhere
URL: https://github.com/apache/incubator-airflow/pull/4317
 
 
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


> Rename a.h.hive_hooks to a.h.hive_hook
> --
>
> Key: AIRFLOW-2629
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2629
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: hive_hooks, hooks
>Reporter: Kengo Seki
>Priority: Minor
>
> As with AIRFLOW-2211, {{airflow.hooks.hive_hooks}} should be renamed to 
> {{airflow.hooks.hive_hook}} for consistency with other packages.





[GitHub] ashb commented on issue #4312: AIRFLOW-3508: add slugify env to build.sh script

2018-12-13 Thread GitBox
ashb commented on issue #4312: AIRFLOW-3508: add slugify env to build.sh script
URL: 
https://github.com/apache/incubator-airflow/pull/4312#issuecomment-446969349
 
 
   Tests are failing cos Cloudflare are having issues :(
   
   > ERROR: error pulling image configuration: Get 
https://production.cloudflare.docker.com/registry-v2/docker/registry/v2/blobs/sha256/c1/c188f257942c5263c7c3063363c0b876884eb458d212cf57aa2c063219016ace/data?verify=1544687920-4x9vDIdsWCdPsFeS%2B%2FnKninhw50%3D:
 read tcp 10.20.1.242:56400->104.18.123.25:443: read: connection reset by peer




[GitHub] ashb commented on issue #4317: [AIRFLOW-2629] Change reference of hive_hooks to hive_hook everywhere

2018-12-13 Thread GitBox
ashb commented on issue #4317: [AIRFLOW-2629] Change reference of hive_hooks to 
hive_hook everywhere
URL: 
https://github.com/apache/incubator-airflow/pull/4317#issuecomment-446969682
 
 
   Please add a mention of this in UPDATING.md




[GitHub] ashb closed pull request #2209: [AIRFLOW-766] Skip conn.commit() when in Auto-commit

2018-12-13 Thread GitBox
ashb closed pull request #2209: [AIRFLOW-766] Skip conn.commit() when in 
Auto-commit
URL: https://github.com/apache/incubator-airflow/pull/2209
 
 
   

This is a PR merged from a forked repository.
As GitHub hides the original diff on merge, it is displayed below for
the sake of provenance:

As this is a foreign pull request (from a fork), the diff is supplied
below (as it won't show otherwise due to GitHub magic):

diff --git a/airflow/hooks/dbapi_hook.py b/airflow/hooks/dbapi_hook.py
index df52e54ea9..753dea179f 100644
--- a/airflow/hooks/dbapi_hook.py
+++ b/airflow/hooks/dbapi_hook.py
@@ -170,7 +170,16 @@ def run(self, sql, autocommit=False, parameters=None):
 else:
 cur.execute(s)
 cur.close()
-conn.commit()
+
+# Skip commit when autocommit is activated
+if self.supports_autocommit and autocommit:
+pass
+elif not self.supports_autocommit and autocommit:
+logging.warn(("%s connection doesn't support " +
+"autocommit but autocommit activated: ")
+% getattr(self, self.conn_name_attr))
+else:
+conn.commit()
 conn.close()
 
 def set_autocommit(self, conn, autocommit):
diff --git a/airflow/hooks/jdbc_hook.py b/airflow/hooks/jdbc_hook.py
index bc1f352ecc..5a1271f204 100644
--- a/airflow/hooks/jdbc_hook.py
+++ b/airflow/hooks/jdbc_hook.py
@@ -64,4 +64,4 @@ def set_autocommit(self, conn, autocommit):
 :param conn: The connection
 :return:
 """
-conn.jconn.autocommit = autocommit
+conn.jconn.setAutoCommit(autocommit)
diff --git a/airflow/utils/db.py b/airflow/utils/db.py
index 54254f61dd..a993dd8a76 100644
--- a/airflow/utils/db.py
+++ b/airflow/utils/db.py
@@ -253,6 +253,17 @@ def initdb():
 models.Connection(
 conn_id='databricks_default', conn_type='databricks',
 host='localhost'))
+merge_conn(
+models.Connection(
+conn_id='jdbc_default', conn_type='jdbc',
+host='jdbc:mysql://localhost:3306/airflow', port=3306,
+login='airflow', password='airflow', schema='airflow',
+extra='''
+{
+"extra__jdbc__drv_path": 
"/tmp/mysql-connector-java-5.1.40-bin.jar",
+"extra__jdbc__drv_clsname": "com.mysql.jdbc.Driver"
+}
+'''))
 
 # Known event types
 KET = models.KnownEventType
diff --git a/tests/hooks/__init__.py b/tests/hooks/__init__.py
new file mode 100644
index 00..a85b77269c
--- /dev/null
+++ b/tests/hooks/__init__.py
@@ -0,0 +1,13 @@
+# -*- coding: utf-8 -*-
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
diff --git a/tests/hooks/dbapi_hook.py b/tests/hooks/dbapi_hook.py
new file mode 100644
index 00..281ab033e7
--- /dev/null
+++ b/tests/hooks/dbapi_hook.py
@@ -0,0 +1,52 @@
+# -*- coding: utf-8 -*-
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+import logging
+import unittest
+import mock
+from mock import MagicMock, Mock
+from airflow.hooks.jdbc_hook import JdbcHook
+from airflow.hooks.postgres_hook import PostgresHook
+
+class TestDbApiHook(unittest.TestCase):
+
+def test_set_autocommit(self): 
+hook = JdbcHook(jdbc_conn_id='jdbc_default')
+conn = MagicMock(name='conn')
+conn.jconn.setAutoCommit = Mock(return_value=None)
+
+hook.set_autocommit(conn, False)
+conn.jconn.setAutoCommit.assert_called_with(False)
+
+hook.set_autocommit(conn, True)
+conn.jconn.setAutoCommit.assert_called_with(True)
+
+def test_autocommit(self):
+logging.info("Test autocommit when connection supports autocommit")
+jdbc_hook = JdbcHook(jdbc_conn_id='jdbc_default')
+jdbc_hook.run(sql='SELECT 1', autocommit=True)
+self.assertTrue('Query ran success with supports_autocommit=True, 
autocommit=True')
+jdbc_hook.run(sql='SELECT 1', autocommit=False)
+self.as
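The commit-skipping decision introduced in the dbapi_hook diff above can be sketched as a standalone helper. The function name is illustrative (Airflow's real code lives inside `DbApiHook.run`); the branch structure mirrors the diff, including that a commit is skipped, with only a warning, when autocommit is requested on a driver that does not support it.

```python
import logging
from unittest.mock import MagicMock

def finalize(conn, supports_autocommit, autocommit):
    """End-of-run handling from the diff: skip the explicit commit
    whenever autocommit was requested."""
    if supports_autocommit and autocommit:
        pass  # driver commits each statement on its own
    elif not supports_autocommit and autocommit:
        # Mirrors the diff: warn only, no commit is issued
        logging.warning("connection does not support autocommit, "
                        "but autocommit was requested")
    else:
        conn.commit()  # plain transactional session: commit explicitly
    conn.close()

# Quick check with a mock connection: autocommit mode skips conn.commit()
conn = MagicMock()
finalize(conn, supports_autocommit=True, autocommit=True)
```

Calling `conn.commit()` on a connection that is already in autocommit mode is harmless on most drivers, but skipping it avoids a needless round-trip and, on some JDBC bridges, an exception.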

[jira] [Commented] (AIRFLOW-766) Skip conn.commit() when in Auto-commit

2018-12-13 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-766?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16720179#comment-16720179
 ] 

ASF GitHub Bot commented on AIRFLOW-766:


ashb closed pull request #2209: [AIRFLOW-766] Skip conn.commit() when in 
Auto-commit
URL: https://github.com/apache/incubator-airflow/pull/2209
 
 
   


[jira] [Assigned] (AIRFLOW-1552) Airflow Filter_by_owner not working with password_auth

2018-12-13 Thread Thomas Brockmeier (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-1552?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Thomas Brockmeier reassigned AIRFLOW-1552:
--

Assignee: Thomas Brockmeier

> Airflow Filter_by_owner not working with password_auth
> --
>
> Key: AIRFLOW-1552
> URL: https://issues.apache.org/jira/browse/AIRFLOW-1552
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: configuration
>Affects Versions: 1.8.0
> Environment: CentOS , python 2.7
>Reporter: raghu ram reddy
>Assignee: Thomas Brockmeier
>Priority: Major
> Fix For: 1.10.2
>
>
> Airflow Filter_by_owner parameter is not working with password_auth.
> I created sample user using the below code from airflow documentation and 
> enabled password_auth.
> I'm able to log in as the created user, but by default this user is a superuser 
> and there is no way to modify it; by default, all users created by PasswordUser 
> are superusers.
> import airflow
> from airflow import models, settings
> from airflow.contrib.auth.backends.password_auth import PasswordUser
> user = PasswordUser(models.User())
> user.username = 'test1'
> user.password = 'test1'
> user.is_superuser()
> session = settings.Session()
> session.add(user)
> session.commit()
> session.close()
> exit()





[jira] [Commented] (AIRFLOW-3223) RBAC with GitHub Authentication

2018-12-13 Thread Sai Phanindhra (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-3223?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16720211#comment-16720211
 ] 

Sai Phanindhra commented on AIRFLOW-3223:
-

[~ashb] I think airflow already supports github authentication

> RBAC with  GitHub Authentication
> 
>
> Key: AIRFLOW-3223
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3223
> Project: Apache Airflow
>  Issue Type: Wish
>  Components: authentication
>Affects Versions: 1.10.0
>Reporter: Vikram Fugro
>Assignee: Sai Phanindhra
>Priority: Major
>
> With airflow 1.10 released with RBAC support, I was wondering how to
> configure GitHub Auth with airflow's RBAC.  In that case, I believe we don't
> have to create any users in airflow.  Are there any notes on this?





[GitHub] eladkal commented on issue #4317: [AIRFLOW-2629] Change reference of hive_hooks to hive_hook everywhere

2018-12-13 Thread GitBox
eladkal commented on issue #4317: [AIRFLOW-2629] Change reference of hive_hooks 
to hive_hook everywhere
URL: 
https://github.com/apache/incubator-airflow/pull/4317#issuecomment-446987017
 
 
   @ashb  done.


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Resolved] (AIRFLOW-3176) Duration tooltip on Tree View of Tasks

2018-12-13 Thread Ash Berlin-Taylor (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-3176?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ash Berlin-Taylor resolved AIRFLOW-3176.

Resolution: Duplicate

> Duration tooltip on Tree View of Tasks
> --
>
> Key: AIRFLOW-3176
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3176
> Project: Apache Airflow
>  Issue Type: Wish
>  Components: ui
>Affects Versions: 1.10.0
>Reporter: Nicolás Kittsteiner
>Priority: Minor
>  Labels: easy-fix
> Attachments: Screen Shot 2018-10-09 at 13.27.42.png
>
>
> In the Tree View of the UI, hovering over the task squares shows a tooltip with
> task details. The duration field could use a friendlier format such as
> HH:MM:SS.
> Thanks :)





[GitHub] stale[bot] commented on issue #4003: [AIRFLOW-3163] add operator to enable setting table description in BigQuery table

2018-12-13 Thread GitBox
stale[bot] commented on issue #4003: [AIRFLOW-3163] add operator to enable 
setting table description in BigQuery table
URL: 
https://github.com/apache/incubator-airflow/pull/4003#issuecomment-446998073
 
 
   This issue has been automatically marked as stale because it has not had 
recent activity. It will be closed if no further activity occurs. Thank you for 
your contributions.
   




[GitHub] stale[bot] commented on issue #4050: [AIRFLOW-3178] Don't bake ENV and _cmd into tmp config for non-sudo

2018-12-13 Thread GitBox
stale[bot] commented on issue #4050: [AIRFLOW-3178] Don't bake ENV and _cmd 
into tmp config for non-sudo
URL: 
https://github.com/apache/incubator-airflow/pull/4050#issuecomment-446998066
 
 
   This issue has been automatically marked as stale because it has not had 
recent activity. It will be closed if no further activity occurs. Thank you for 
your contributions.
   




[GitHub] ashb commented on issue #3683: [AIRFLOW-2770] kubernetes: add support for dag folder in the docker i…

2018-12-13 Thread GitBox
ashb commented on issue #3683: [AIRFLOW-2770] kubernetes: add support for dag 
folder in the docker i…
URL: 
https://github.com/apache/incubator-airflow/pull/3683#issuecomment-447017461
 
 
   I'd like it if we reverted this before the weekend - it's not great to see 
tests failing on other PRs and I feel uneasy about "merging those PRs anyway".




[GitHub] ashb commented on issue #4312: AIRFLOW-3508: add slugify env to build.sh script

2018-12-13 Thread GitBox
ashb commented on issue #4312: AIRFLOW-3508: add slugify env to build.sh script
URL: 
https://github.com/apache/incubator-airflow/pull/4312#issuecomment-447018890
 
 
   What is this change needed for? Travis already sets that env var globally 
https://github.com/apache/incubator-airflow/blob/457ad83e4eb02b7348e5ce00292ca9bd27032651/.travis.yml#L25
   




[jira] [Commented] (AIRFLOW-1919) Add option to query for DAG runs given a DAG ID

2018-12-13 Thread JIRA


[ 
https://issues.apache.org/jira/browse/AIRFLOW-1919?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16720335#comment-16720335
 ] 

Victor Villas BĂ´as Chaves commented on AIRFLOW-1919:


[~kaxilnaik]

AFAICS this is not in the CLI documentation, but the docs are auto-generated.
So is the feature still missing, or are the docs incomplete?

> Add option to query for DAG runs given a DAG ID
> ---
>
> Key: AIRFLOW-1919
> URL: https://issues.apache.org/jira/browse/AIRFLOW-1919
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: cli
>Affects Versions: 1.8.0
>Reporter: Steen Manniche
>Assignee: Tao Feng
>Priority: Trivial
> Fix For: 2.0.0
>
>
> Having a way to list all DAG runs for a given DAG identifier would be useful 
> when trying to get a programmatic overview of running DAGs. Something along 
> the lines of
> {code}
> airflow list_runs $DAG_ID
> {code}
> Which would return the running DAGs for {{$DAG_ID}}
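The requested command is essentially a filter over DAG runs by DAG id; as a plain-Python sketch (the data shape and function name are illustrative, not Airflow's API):

```python
def list_runs(runs, dag_id, state=None):
    """Return runs matching a DAG id, optionally narrowed by state.
    `runs` is a list of dicts; this shape is hypothetical."""
    return [r for r in runs
            if r["dag_id"] == dag_id and (state is None or r["state"] == state)]


runs = [
    {"dag_id": "etl", "run_id": "r1", "state": "running"},
    {"dag_id": "etl", "run_id": "r2", "state": "success"},
    {"dag_id": "ml",  "run_id": "r3", "state": "running"},
]
print(list_runs(runs, "etl", state="running"))
```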





[GitHub] eran-levy commented on issue #4312: AIRFLOW-3508: add slugify env to build.sh script

2018-12-13 Thread GitBox
eran-levy commented on issue #4312: AIRFLOW-3508: add slugify env to build.sh 
script
URL: 
https://github.com/apache/incubator-airflow/pull/4312#issuecomment-447023278
 
 
   @ashb 
   we need this change, otherwise build.sh fails - it executes the following:
   python setup.py sdist -q
   and verify_gpl_dependency() in setup.py throws a runtime error




[GitHub] eran-levy commented on issue #4312: AIRFLOW-3508: add slugify env to build.sh script

2018-12-13 Thread GitBox
eran-levy commented on issue #4312: AIRFLOW-3508: add slugify env to build.sh 
script
URL: 
https://github.com/apache/incubator-airflow/pull/4312#issuecomment-447024110
 
 
   can we re-run the Travis build as soon as the Cloudflare issues are fixed? It
seems like the build failed for all PRs




[GitHub] tiagovrtr commented on issue #3880: [AIRFLOW-461] Support autodetected schemas in BigQuery run_load

2018-12-13 Thread GitBox
tiagovrtr commented on issue #3880: [AIRFLOW-461]  Support autodetected schemas 
in BigQuery run_load
URL: 
https://github.com/apache/incubator-airflow/pull/3880#issuecomment-447025564
 
 
   Great commit, thank you
   Can you also please add the argument to the docstring?
   




[GitHub] kaxil commented on issue #3880: [AIRFLOW-461] Support autodetected schemas in BigQuery run_load

2018-12-13 Thread GitBox
kaxil commented on issue #3880: [AIRFLOW-461]  Support autodetected schemas in 
BigQuery run_load
URL: 
https://github.com/apache/incubator-airflow/pull/3880#issuecomment-447026406
 
 
   @tiagovtr it has been added to the docstring




[GitHub] kaxil removed a comment on issue #3880: [AIRFLOW-461] Support autodetected schemas in BigQuery run_load

2018-12-13 Thread GitBox
kaxil removed a comment on issue #3880: [AIRFLOW-461]  Support autodetected 
schemas in BigQuery run_load
URL: 
https://github.com/apache/incubator-airflow/pull/3880#issuecomment-447026406
 
 
   @tiagovtr it has been added to the docstring




[GitHub] ashb commented on issue #4312: AIRFLOW-3508: add slugify env to build.sh script

2018-12-13 Thread GitBox
ashb commented on issue #4312: AIRFLOW-3508: add slugify env to build.sh script
URL: 
https://github.com/apache/incubator-airflow/pull/4312#issuecomment-447042090
 
 
   Something on master broke the Kube tests - we're working on fixing/reverting 
that break.
   
   But before that the Kube tests were running fine, so I'm a little bit 
confused why this change is needed? Where is this script being run from?




[GitHub] kppullin commented on issue #4307: [AIRFLOW-3501] k8s executor - Support loading dags from image.

2018-12-13 Thread GitBox
kppullin commented on issue #4307: [AIRFLOW-3501] k8s executor - Support 
loading dags from image.
URL: 
https://github.com/apache/incubator-airflow/pull/4307#issuecomment-447049548
 
 
   Yup, this is a dupe of 2270.  I'll close this out and make a separate PR to 
include the tests from this PR.




[GitHub] kppullin closed pull request #4307: [AIRFLOW-3501] k8s executor - Support loading dags from image.

2018-12-13 Thread GitBox
kppullin closed pull request #4307: [AIRFLOW-3501] k8s executor - Support 
loading dags from image.
URL: https://github.com/apache/incubator-airflow/pull/4307
 
 
   

This is a PR merged from a forked repository.
As GitHub hides the original diff on merge, it is displayed below for
the sake of provenance:

As this is a foreign pull request (from a fork), the diff is supplied
below (as it won't show otherwise due to GitHub magic):

diff --git a/airflow/config_templates/default_airflow.cfg 
b/airflow/config_templates/default_airflow.cfg
index a9473178c1..2024bfc34e 100644
--- a/airflow/config_templates/default_airflow.cfg
+++ b/airflow/config_templates/default_airflow.cfg
@@ -624,6 +624,10 @@ git_user =
 git_password =
 git_subpath =
 
+# If True, use the dags that exist in the docker container instead of pulling
+# from git or a dag volume claim.
+use_container_dags =
+
 # For cloning DAGs from git repositories into volumes: 
https://github.com/kubernetes/git-sync
 git_sync_container_repository = gcr.io/google-containers/git-sync-amd64
 git_sync_container_tag = v2.0.5
diff --git a/airflow/contrib/executors/kubernetes_executor.py 
b/airflow/contrib/executors/kubernetes_executor.py
index 6c1bd222b9..d4ac7f0dd7 100644
--- a/airflow/contrib/executors/kubernetes_executor.py
+++ b/airflow/contrib/executors/kubernetes_executor.py
@@ -150,6 +150,10 @@ def __init__(self):
 self.git_user = conf.get(self.kubernetes_section, 'git_user')
 self.git_password = conf.get(self.kubernetes_section, 'git_password')
 
+# If True, use the dags that exist in the docker container instead of 
pulling
+# from git or a dag volume claim.
+self.use_container_dags = conf.get(self.kubernetes_section, 
'use_container_dags')
+
 # NOTE: The user may optionally use a volume claim to mount a PV 
containing
 # DAGs directly
 self.dags_volume_claim = conf.get(self.kubernetes_section, 
'dags_volume_claim')
@@ -204,10 +208,13 @@ def __init__(self):
 self._validate()
 
 def _validate(self):
-if not self.dags_volume_claim and (not self.git_repo or not 
self.git_branch):
+if not self.dags_volume_claim \
+and (not self.git_repo or not self.git_branch) \
+and not self.use_container_dags:
 raise AirflowConfigException(
 'In kubernetes mode the following must be set in the 
`kubernetes` '
-'config section: `dags_volume_claim` or `git_repo and 
git_branch`')
+'config section: `dags_volume_claim` or `use_container_dags` '
+'or `git_repo and git_branch`')
 
 
 class KubernetesJobWatcher(multiprocessing.Process, LoggingMixin, object):
diff --git a/airflow/contrib/kubernetes/worker_configuration.py 
b/airflow/contrib/kubernetes/worker_configuration.py
index f857cbc237..a98d083b72 100644
--- a/airflow/contrib/kubernetes/worker_configuration.py
+++ b/airflow/contrib/kubernetes/worker_configuration.py
@@ -37,8 +37,8 @@ def __init__(self, kube_config):
 
 def _get_init_containers(self, volume_mounts):
 """When using git to retrieve the DAGs, use the GitSync Init 
Container"""
-# If we're using volume claims to mount the dags, no init container is 
needed
-if self.kube_config.dags_volume_claim:
+# If we're using container dags or a volume claim to mount the dags, 
no init container is needed
+if self.kube_config.dags_volume_claim or 
self.kube_config.use_container_dags:
 return []
 
 # Otherwise, define a git-sync init container
@@ -128,33 +128,12 @@ def _construct_volume(name, claim):
 return volume
 
 volumes = [
-_construct_volume(
-dags_volume_name,
-self.kube_config.dags_volume_claim
-),
 _construct_volume(
 logs_volume_name,
 self.kube_config.logs_volume_claim
 )
 ]
 
-dag_volume_mount_path = ""
-
-if self.kube_config.dags_volume_claim:
-dag_volume_mount_path = self.worker_airflow_dags
-else:
-dag_volume_mount_path = os.path.join(
-self.worker_airflow_dags,
-self.kube_config.git_subpath
-)
-dags_volume_mount = {
-'name': dags_volume_name,
-'mountPath': dag_volume_mount_path,
-'readOnly': True,
-}
-if self.kube_config.dags_volume_subpath:
-dags_volume_mount['subPath'] = self.kube_config.dags_volume_subpath
-
 logs_volume_mount = {
 'name': logs_volume_name,
 'mountPath': self.worker_airflow_logs,
@@ -162,10 +141,34 @@ def _construct_volume(name, claim):
 if self.kube_config.logs_volume_subpath:
 logs_volume_mount['subPath'] = self.kube_config.logs_volume_subpath
 
-volume_mounts = [
-dags_volume_mount,
-logs_volume_mou
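The `_validate` hunk in the diff above enforces that at least one DAG source is configured. A standalone sketch of that rule (hypothetical names, independent of Airflow's config machinery):

```python
class ConfigError(Exception):
    """Raised when no usable DAG source is configured (stand-in for
    AirflowConfigException)."""


def validate_dag_source(dags_volume_claim=None, git_repo=None,
                        git_branch=None, use_container_dags=False):
    """Mirror of the PR's rule: require a volume claim, a complete git
    source (repo AND branch), or DAGs baked into the container image."""
    has_git = bool(git_repo) and bool(git_branch)
    if not (dags_volume_claim or has_git or use_container_dags):
        raise ConfigError(
            "set `dags_volume_claim`, `use_container_dags`, "
            "or `git_repo` and `git_branch`")


validate_dag_source(use_container_dags=True)           # ok
validate_dag_source(git_repo="repo", git_branch="x")   # ok
try:
    validate_dag_source(git_repo="repo")  # branch missing -> rejected
except ConfigError as e:
    print("rejected:", e)
```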

[jira] [Commented] (AIRFLOW-3501) Add config option to load dags in an image with the kubernetes executor.

2018-12-13 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-3501?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16720414#comment-16720414
 ] 

ASF GitHub Bot commented on AIRFLOW-3501:
-

kppullin closed pull request #4307: [AIRFLOW-3501] k8s executor - Support 
loading dags from image.
URL: https://github.com/apache/incubator-airflow/pull/4307
 
 
   

(Diff identical to the one attached to the pull-request close notification above.)

[jira] [Commented] (AIRFLOW-1919) Add option to query for DAG runs given a DAG ID

2018-12-13 Thread Tao Feng (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-1919?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16720421#comment-16720421
 ] 

Tao Feng commented on AIRFLOW-1919:
---

[~villasv] , the code has been checked in.

> Add option to query for DAG runs given a DAG ID
> ---
>
> Key: AIRFLOW-1919
> (Issue description identical to the quote in the earlier AIRFLOW-1919 message.)





[jira] [Closed] (AIRFLOW-1919) Add option to query for DAG runs given a DAG ID

2018-12-13 Thread Tao Feng (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-1919?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Tao Feng closed AIRFLOW-1919.
-

> Add option to query for DAG runs given a DAG ID
> ---
>
> Key: AIRFLOW-1919
> (Issue description identical to the quote in the earlier AIRFLOW-1919 message.)





[GitHub] feng-tao opened a new pull request #4318: Revert [AIRFLOW-2770] [AIRFLOW-3505]

2018-12-13 Thread GitBox
feng-tao opened a new pull request #4318: Revert  [AIRFLOW-2770] [AIRFLOW-3505]
URL: https://github.com/apache/incubator-airflow/pull/4318
 
 
   The k8s CI seems to be broken after
https://github.com/apache/incubator-airflow/pull/3683 was merged (although the
local PR tests passed; the PR likely wasn't rebased on master).
   
   Reverting the PRs to unblock CI. @dimberman will investigate before
we re-push the change.




[jira] [Commented] (AIRFLOW-2770) kubernetes: add support for dag folder in the docker image

2018-12-13 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-2770?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16720426#comment-16720426
 ] 

ASF GitHub Bot commented on AIRFLOW-2770:
-

feng-tao opened a new pull request #4318: Revert  [AIRFLOW-2770] [AIRFLOW-3505]
URL: https://github.com/apache/incubator-airflow/pull/4318
 
 
   The k8s CI seems to be broken after
https://github.com/apache/incubator-airflow/pull/3683 was merged (although the
local PR tests passed; the PR likely wasn't rebased on master).
   
   Reverting the PRs to unblock CI. @dimberman will investigate before
we re-push the change.




> kubernetes: add support for dag folder in the docker image
> --
>
> Key: AIRFLOW-2770
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2770
> Project: Apache Airflow
>  Issue Type: Improvement
>Reporter: Rurui Ye
>Assignee: Rurui Ye
>Priority: Critical
> Fix For: 1.10.2
>
>
> Currently the kube executor needs dags_volume_claim or a git repo in
> the config file, but if the user has built the DAGs into their docker image they
> don't need either option; they can manage their DAG
> version by managing the docker image version.
> So I propose adding a new configuration option,
> kube.config.dag_folder_path, alongside dags_volume_claim and the git repo. With
> this config, the worker can run using only the DAGs in the docker image.





[GitHub] feng-tao commented on issue #4318: Revert [AIRFLOW-2770] [AIRFLOW-3505]

2018-12-13 Thread GitBox
feng-tao commented on issue #4318: Revert  [AIRFLOW-2770] [AIRFLOW-3505]
URL: 
https://github.com/apache/incubator-airflow/pull/4318#issuecomment-447053612
 
 
   PTAL @dimberman  @ashb 




[GitHub] feng-tao commented on issue #3683: [AIRFLOW-2770] kubernetes: add support for dag folder in the docker i…

2018-12-13 Thread GitBox
feng-tao commented on issue #3683: [AIRFLOW-2770] kubernetes: add support for 
dag folder in the docker i…
URL: 
https://github.com/apache/incubator-airflow/pull/3683#issuecomment-447053900
 
 
   @ashb @dimberman the revert branch is created
(https://github.com/apache/incubator-airflow/pull/4318). @dimberman,
we can re-push the change once the CI is fixed.




[jira] [Assigned] (AIRFLOW-3501) Add config option to load dags in an image with the kubernetes executor.

2018-12-13 Thread Anonymous (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-3501?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Anonymous reassigned AIRFLOW-3501:
--

Assignee: Kevin Pullin

> Add config option to load dags in an image with the kubernetes executor.
> 
>
> Key: AIRFLOW-3501
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3501
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: kubernetes
>Reporter: Kevin Pullin
>Assignee: Kevin Pullin
>Priority: Major
>
> Currently the airflow kubernetes executor forces loading dags either from a 
> volume claim or an init container.  There should be an option to bypass these 
> settings and instead use dags packaged into the running image.
> The motivation for this change is to allow for an airflow image to be built 
> and released via a CI/CD pipeline upon a new commit to a dag repository.  For 
> example, given a new git commit to a dag repo, a CI/CD server can build an 
> airflow docker image, run tests against the current dags, and finally push 
> the entire bundle as a single, complete, well-known unit to kubernetes.
> There's no need to worry that a git init container will fail, having to have 
> a separate pipeline to update dags on a shared volume, etc.  And if issues 
> arise from an update, the configuration can be easily rolled back to the 
> prior version of the image.





[jira] [Closed] (AIRFLOW-3501) Add config option to load dags in an image with the kubernetes executor.

2018-12-13 Thread Kevin Pullin (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-3501?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Kevin Pullin closed AIRFLOW-3501.
-
Resolution: Duplicate

> Add config option to load dags in an image with the kubernetes executor.
> 
>
> Key: AIRFLOW-3501
> (Issue description identical to the quote in the earlier AIRFLOW-3501 message.)





[GitHub] codecov-io commented on issue #4318: Revert [AIRFLOW-2770] [AIRFLOW-3505]

2018-12-13 Thread GitBox
codecov-io commented on issue #4318: Revert  [AIRFLOW-2770] [AIRFLOW-3505]
URL: 
https://github.com/apache/incubator-airflow/pull/4318#issuecomment-447069143
 
 
   # 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4318?src=pr&el=h1)
 Report
   > Merging 
[#4318](https://codecov.io/gh/apache/incubator-airflow/pull/4318?src=pr&el=desc)
 into 
[master](https://codecov.io/gh/apache/incubator-airflow/commit/457ad83e4eb02b7348e5ce00292ca9bd27032651?src=pr&el=desc)
 will **increase** coverage by `0.06%`.
   > The diff coverage is `n/a`.
   
   [![Impacted file tree 
graph](https://codecov.io/gh/apache/incubator-airflow/pull/4318/graphs/tree.svg?width=650&token=WdLKlKHOAU&height=150&src=pr)](https://codecov.io/gh/apache/incubator-airflow/pull/4318?src=pr&el=tree)
   
   ```diff
   @@Coverage Diff @@
   ##   master#4318  +/-   ##
   ==
   + Coverage   78.02%   78.08%   +0.06% 
   ==
 Files 201  201  
 Lines   1646616466  
   ==
   + Hits1284712858  +11 
   + Misses   3619 3608  -11
   ```
   
   
   | [Impacted 
Files](https://codecov.io/gh/apache/incubator-airflow/pull/4318?src=pr&el=tree) 
| Coverage Δ | |
   |---|---|---|
   | 
[airflow/models.py](https://codecov.io/gh/apache/incubator-airflow/pull/4318/diff?src=pr&el=tree#diff-YWlyZmxvdy9tb2RlbHMucHk=)
 | `92.3% <0%> (-0.05%)` | :arrow_down: |
   | 
[airflow/configuration.py](https://codecov.io/gh/apache/incubator-airflow/pull/4318/diff?src=pr&el=tree#diff-YWlyZmxvdy9jb25maWd1cmF0aW9uLnB5)
 | `89.24% <0%> (+4.3%)` | :arrow_up: |
   
   --
   
   [Continue to review full report at 
Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4318?src=pr&el=continue).
   > **Legend** - [Click here to learn 
more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute  (impact)`, `ø = not affected`, `? = missing data`
   > Powered by 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4318?src=pr&el=footer).
 Last update 
[457ad83...37969ce](https://codecov.io/gh/apache/incubator-airflow/pull/4318?src=pr&el=lastupdated).
 Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   




[GitHub] feng-tao closed pull request #4318: Revert [AIRFLOW-2770] [AIRFLOW-3505]

2018-12-13 Thread GitBox
feng-tao closed pull request #4318: Revert  [AIRFLOW-2770] [AIRFLOW-3505]
URL: https://github.com/apache/incubator-airflow/pull/4318
 
 
   

This is a PR merged from a forked repository.
As GitHub hides the original diff on merge, it is displayed below for
the sake of provenance:

As this is a foreign pull request (from a fork), the diff is supplied
below (as it won't show otherwise due to GitHub magic):

diff --git a/airflow/config_templates/default_airflow.cfg 
b/airflow/config_templates/default_airflow.cfg
index 9c21f5d47e..a9473178c1 100644
--- a/airflow/config_templates/default_airflow.cfg
+++ b/airflow/config_templates/default_airflow.cfg
@@ -605,10 +605,6 @@ namespace = default
 # The name of the Kubernetes ConfigMap Containing the Airflow Configuration 
(this file)
 airflow_configmap =
 
-# For docker image already contains DAGs, this is set to `True`, and the 
worker will search for dags in dags_folder,
-# otherwise use git sync or dags volumn chaim to mount DAGs
-dags_in_image = FALSE
-
 # For either git sync or volume mounted DAGs, the worker will look in this 
subpath for DAGs
 dags_volume_subpath =
 
diff --git a/airflow/contrib/executors/kubernetes_executor.py 
b/airflow/contrib/executors/kubernetes_executor.py
index ca0cc1d128..f9d9ddb0fc 100644
--- a/airflow/contrib/executors/kubernetes_executor.py
+++ b/airflow/contrib/executors/kubernetes_executor.py
@@ -137,10 +137,6 @@ def __init__(self):
 self.kubernetes_section, 'worker_service_account_name')
 self.image_pull_secrets = conf.get(self.kubernetes_section, 
'image_pull_secrets')
 
-# NOTE: user can build the dags into the docker image directly,
-# this will set to True if so
-self.dags_in_image = conf.get(self.kubernetes_section, 'dags_in_image')
-
 # NOTE: `git_repo` and `git_branch` must be specified together as a 
pair
 # The http URL of the git repository to clone from
 self.git_repo = conf.get(self.kubernetes_section, 'git_repo')
@@ -208,12 +204,10 @@ def __init__(self):
 self._validate()
 
 def _validate(self):
-if not self.dags_volume_claim and not self.dags_in_image \
-and (not self.git_repo or not self.git_branch):
+if not self.dags_volume_claim and (not self.git_repo or not 
self.git_branch):
 raise AirflowConfigException(
 'In kubernetes mode the following must be set in the 
`kubernetes` '
-'config section: `dags_volume_claim` or `git_repo and 
git_branch` '
-'or `dags_in_image`')
+'config section: `dags_volume_claim` or `git_repo and 
git_branch`')
 
 
 class KubernetesJobWatcher(multiprocessing.Process, LoggingMixin, object):
diff --git a/airflow/contrib/kubernetes/worker_configuration.py 
b/airflow/contrib/kubernetes/worker_configuration.py
index 58cf9cbd20..f857cbc237 100644
--- a/airflow/contrib/kubernetes/worker_configuration.py
+++ b/airflow/contrib/kubernetes/worker_configuration.py
@@ -38,7 +38,7 @@ def __init__(self, kube_config):
 def _get_init_containers(self, volume_mounts):
 """When using git to retrieve the DAGs, use the GitSync Init 
Container"""
 # If we're using volume claims to mount the dags, no init container is 
needed
-if self.kube_config.dags_volume_claim or 
self.kube_config.dags_in_image:
+if self.kube_config.dags_volume_claim:
 return []
 
 # Otherwise, define a git-sync init container
@@ -128,19 +128,32 @@ def _construct_volume(name, claim):
 return volume
 
 volumes = [
+_construct_volume(
+dags_volume_name,
+self.kube_config.dags_volume_claim
+),
 _construct_volume(
 logs_volume_name,
 self.kube_config.logs_volume_claim
 )
 ]
 
-if not self.kube_config.dags_in_image:
-volumes.append(
-_construct_volume(
-dags_volume_name,
-self.kube_config.dags_volume_claim
-)
+dag_volume_mount_path = ""
+
+if self.kube_config.dags_volume_claim:
+dag_volume_mount_path = self.worker_airflow_dags
+else:
+dag_volume_mount_path = os.path.join(
+self.worker_airflow_dags,
+self.kube_config.git_subpath
 )
+dags_volume_mount = {
+'name': dags_volume_name,
+'mountPath': dag_volume_mount_path,
+'readOnly': True,
+}
+if self.kube_config.dags_volume_subpath:
+dags_volume_mount['subPath'] = self.kube_config.dags_volume_subpath
 
 logs_volume_mount = {
 'name': logs_volume_name,
@@ -150,28 +163,10 @@ def _construct_volume(name, claim):
 logs_volume_mount['subPath'] = self.kube_config.logs_volume_subpath
 
 volume_mounts = [
+dags_volume_mount,

[jira] [Commented] (AIRFLOW-2770) kubernetes: add support for dag folder in the docker image

2018-12-13 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-2770?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16720460#comment-16720460
 ] 

ASF GitHub Bot commented on AIRFLOW-2770:
-

feng-tao closed pull request #4318: Revert  [AIRFLOW-2770] [AIRFLOW-3505]
URL: https://github.com/apache/incubator-airflow/pull/4318
 
 
   


[GitHub] kppullin opened a new pull request #4319: [AIRFLOW-2770] Read `dags_in_image` config value as a boolean

2018-12-13 Thread GitBox
kppullin opened a new pull request #4319: [AIRFLOW-2770] Read `dags_in_image` 
config value as a boolean
URL: https://github.com/apache/incubator-airflow/pull/4319
 
 
   This PR is a minor fix for #3683
   
   The `dags_in_image` config value is read as a string. However, the existing 
code expects this to be a boolean.
   
   For example, in `worker_configuration.py` there is the statement: `if not 
self.kube_config.dags_in_image:`
   
   Since the value is a non-empty string ('False') and not a boolean, this 
evaluates to true (since non-empty strings are truthy)
   and skips the logic to add the `dags_volume_claim` volume mount.
   
   This results in the CI tests failing because the dag volume is missing in 
the k8s pod definition.
   
   This PR reads `dags_in_image` using `conf.getboolean` to fix this error.
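The truthiness pitfall described above can be reproduced with the standard library's `configparser`, on which Airflow's `conf` object is built (a minimal sketch; the section and option names mirror the Airflow config):

```python
from configparser import ConfigParser

cfg = ConfigParser()
cfg.read_string("[kubernetes]\ndags_in_image = False\n")

# Reading as a string: 'False' is a non-empty string, hence truthy,
# so `if not self.kube_config.dags_in_image:` never fires.
raw = cfg.get("kubernetes", "dags_in_image")
assert raw == "False" and bool(raw) is True

# Reading with getboolean parses the value into a real boolean.
assert cfg.getboolean("kubernetes", "dags_in_image") is False
```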
   




[jira] [Commented] (AIRFLOW-2770) kubernetes: add support for dag folder in the docker image

2018-12-13 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-2770?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16720481#comment-16720481
 ] 

ASF GitHub Bot commented on AIRFLOW-2770:
-

kppullin opened a new pull request #4319: [AIRFLOW-2770] Read `dags_in_image` 
config value as a boolean
URL: https://github.com/apache/incubator-airflow/pull/4319
 
 


> kubernetes: add support for dag folder in the docker image
> --
>
> Key: AIRFLOW-2770
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2770
> Project: Apache Airflow
>  Issue Type: Improvement
>Reporter: Rurui Ye
>Assignee: Rurui Ye
>Priority: Critical
> Fix For: 1.10.2
>
>
> currently the kube executor needs `dags_volume_claim` or a git repo in
> the config file, but if the user has built the DAGs into their Docker image, they
> don't need to provide either of these two options, and they can manage their DAG
> version by managing the Docker image version.
> So I suppose we can add a new configuration option,
> kube.config.dag_folder_path, alongside `dags_volume_claim` and the git repo. With
> this config, we can run the worker just from the DAGs in the Docker image.
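A minimal sketch of the proposed validation (hypothetical: the function is illustrative, mirroring the `_validate` logic quoted in the diffs above, with option names matching the Airflow config):

```python
def validate_dag_source(dags_volume_claim, git_repo, git_branch, dags_in_image):
    """Require at least one way for workers to obtain DAGs."""
    if not dags_volume_claim and not dags_in_image \
            and (not git_repo or not git_branch):
        raise ValueError(
            "In kubernetes mode set `dags_volume_claim`, "
            "`git_repo` and `git_branch`, or `dags_in_image`"
        )

# DAGs baked into the Docker image: no volume claim or git-sync needed.
validate_dag_source("", "", "", dags_in_image=True)
```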



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] kppullin commented on a change in pull request #4319: [AIRFLOW-2770] Read `dags_in_image` config value as a boolean

2018-12-13 Thread GitBox
kppullin commented on a change in pull request #4319: [AIRFLOW-2770] Read 
`dags_in_image` config value as a boolean
URL: https://github.com/apache/incubator-airflow/pull/4319#discussion_r241518434
 
 

 ##
 File path: airflow/contrib/executors/kubernetes_executor.py
 ##
 @@ -139,7 +139,7 @@ def __init__(self):
 
 # NOTE: user can build the dags into the docker image directly,
 # this will set to True if so
-self.dags_in_image = conf.get(self.kubernetes_section, 'dags_in_image')
+self.dags_in_image = conf.getboolean(self.kubernetes_section, 
'dags_in_image')
 
 Review comment:
   This _might_ be considered a breaking change.  It appears that `getboolean` 
throws an exception if the value is empty, so if someone does not update their 
config file to include `dags_in_image` their setup will break.
   
   Looking for suggestions on how best to work around this.
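One possible workaround (an assumption, shown with the stdlib `configparser` that Airflow's config wraps): pass a `fallback` so a missing key returns a default instead of raising:

```python
from configparser import ConfigParser

cfg = ConfigParser()
cfg.read_string("[kubernetes]\nnamespace = default\n")  # no dags_in_image key

# Without a fallback, a missing option raises NoOptionError;
# with fallback=, the default is returned instead.
assert cfg.getboolean("kubernetes", "dags_in_image", fallback=False) is False
```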




[GitHub] ashb commented on issue #4318: Revert [AIRFLOW-2770] [AIRFLOW-3505]

2018-12-13 Thread GitBox
ashb commented on issue #4318: Revert  [AIRFLOW-2770] [AIRFLOW-3505]
URL: 
https://github.com/apache/incubator-airflow/pull/4318#issuecomment-447098930
 
 
   Thanks @feng-tao!




[GitHub] kppullin commented on issue #3683: [AIRFLOW-2770] kubernetes: add support for dag folder in the docker i…

2018-12-13 Thread GitBox
kppullin commented on issue #3683: [AIRFLOW-2770] kubernetes: add support for 
dag folder in the docker i…
URL: 
https://github.com/apache/incubator-airflow/pull/3683#issuecomment-447107253
 
 
   PR #4319 fixes the issues with failing CI tests from this PR.
   
   I do have one concern with the fix, which may require further changes, 
called out here: 
https://github.com/apache/incubator-airflow/pull/4319#discussion_r241518434




[GitHub] feng-tao commented on issue #4319: [AIRFLOW-2770] Read `dags_in_image` config value as a boolean

2018-12-13 Thread GitBox
feng-tao commented on issue #4319: [AIRFLOW-2770] Read `dags_in_image` config 
value as a boolean
URL: 
https://github.com/apache/incubator-airflow/pull/4319#issuecomment-447114648
 
 
   PTAL @dimberman 




[GitHub] ultrabug commented on issue #2455: [AIRFLOW-1423] Add logs to the scheduler DAG run decision logic

2018-12-13 Thread GitBox
ultrabug commented on issue #2455: [AIRFLOW-1423] Add logs to the scheduler DAG 
run decision logic
URL: 
https://github.com/apache/incubator-airflow/pull/2455#issuecomment-447116347
 
 
   @ron819 well, AFAIK I've done what was asked of me, but of course now there
are conflicts.
   
   I'd be happier to see #2460 getting attention though, since it gives a
better user experience for this kind of information




[jira] [Created] (AIRFLOW-3513) Pakegecloud

2018-12-13 Thread pakegecloud.atlassian.net (JIRA)
pakegecloud.atlassian.net created AIRFLOW-3513:
--

 Summary: Pakegecloud
 Key: AIRFLOW-3513
 URL: https://issues.apache.org/jira/browse/AIRFLOW-3513
 Project: Apache Airflow
  Issue Type: Improvement
  Components: api, authentication, configuration, core, database, 
Dataflow, db, docker
Reporter: pakegecloud.atlassian.net


pakegecloud.atlassian.net





[jira] [Commented] (AIRFLOW-3513) Pakegecloud

2018-12-13 Thread Ash Berlin-Taylor (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-3513?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16720642#comment-16720642
 ] 

Ash Berlin-Taylor commented on AIRFLOW-3513:


It is not clear what you are asking for here.

> Pakegecloud
> ---
>
> Key: AIRFLOW-3513
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3513
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: api, authentication, configuration, core, database, 
> Dataflow, db, docker
>Reporter: pakegecloud.atlassian.net
>Priority: Major
>   Original Estimate: 1,311h
>  Remaining Estimate: 1,311h
>
> pakegecloud.atlassian.net





[jira] [Created] (AIRFLOW-3514) Documentation for run_query slightly off for bigquery_hook

2018-12-13 Thread joyce chan (JIRA)
joyce chan created AIRFLOW-3514:
---

 Summary: Documentation for run_query slightly off for bigquery_hook
 Key: AIRFLOW-3514
 URL: https://issues.apache.org/jira/browse/AIRFLOW-3514
 Project: Apache Airflow
  Issue Type: Bug
  Components: hooks
Affects Versions: 1.10.0
Reporter: joyce chan


The Python docs for the `run_query` method of `BigQueryHook` say
 
{code:java}
:param query_params a dictionary containing query parameter types and values, 
passed to BigQuery   

:type query_params: dict{code}
 
but it should be an array of dictionaries, according to the documentation
https://cloud.google.com/bigquery/docs/parameterized-queries#bigquery-query-params-arrays-api
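For illustration, the shape the BigQuery API expects is a *list* of parameter dicts, not a single dict (the parameter names and values below are hypothetical):

```python
# Each element names one query parameter and gives its type and value,
# following the BigQuery parameterized-query REST structure.
query_params = [
    {
        "name": "corpus",
        "parameterType": {"type": "STRING"},
        "parameterValue": {"value": "romeoandjuliet"},
    },
    {
        "name": "min_word_count",
        "parameterType": {"type": "INT64"},
        "parameterValue": {"value": "250"},
    },
]

assert isinstance(query_params, list)
assert all(isinstance(p, dict) for p in query_params)
```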





[GitHub] codecov-io edited a comment on issue #4225: [AIRFLOW-3383] Rotate fernet keys.

2018-12-13 Thread GitBox
codecov-io edited a comment on issue #4225: [AIRFLOW-3383] Rotate fernet keys.
URL: 
https://github.com/apache/incubator-airflow/pull/4225#issuecomment-441103479
 
 
   # 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4225?src=pr&el=h1)
 Report
   > Merging 
[#4225](https://codecov.io/gh/apache/incubator-airflow/pull/4225?src=pr&el=desc)
 into 
[master](https://codecov.io/gh/apache/incubator-airflow/commit/77c368fd228fe5edfdb3304ed4cb000a50667010?src=pr&el=desc)
 will **decrease** coverage by `<.01%`.
   > The diff coverage is `62.5%`.
   
   [![Impacted file tree 
graph](https://codecov.io/gh/apache/incubator-airflow/pull/4225/graphs/tree.svg?width=650&token=WdLKlKHOAU&height=150&src=pr)](https://codecov.io/gh/apache/incubator-airflow/pull/4225?src=pr&el=tree)
   
   ```diff
   @@            Coverage Diff             @@
   ##           master    #4225      +/-   ##
   ==========================================
   - Coverage   78.09%   78.08%   -0.01%
   ==========================================
     Files         201      201
     Lines       16466    16480      +14
   ==========================================
   + Hits        12859    12869      +10
   - Misses       3607     3611       +4
   ```
   
   
   | [Impacted 
Files](https://codecov.io/gh/apache/incubator-airflow/pull/4225?src=pr&el=tree) 
| Coverage Δ | |
   |---|---|---|
   | 
[airflow/bin/cli.py](https://codecov.io/gh/apache/incubator-airflow/pull/4225/diff?src=pr&el=tree#diff-YWlyZmxvdy9iaW4vY2xpLnB5)
 | `64.09% <16.66%> (-0.34%)` | :arrow_down: |
   | 
[airflow/models.py](https://codecov.io/gh/apache/incubator-airflow/pull/4225/diff?src=pr&el=tree#diff-YWlyZmxvdy9tb2RlbHMucHk=)
 | `92.4% <90%> (+0.06%)` | :arrow_up: |
   
   --
   
   [Continue to review full report at 
Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4225?src=pr&el=continue).
   > **Legend** - [Click here to learn 
more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data`
   > Powered by 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4225?src=pr&el=footer).
 Last update 
[77c368f...d4008e0](https://codecov.io/gh/apache/incubator-airflow/pull/4225?src=pr&el=lastupdated).
 Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   






[GitHub] eran-levy edited a comment on issue #4312: AIRFLOW-3508: add slugify env to build.sh script

2018-12-13 Thread GitBox
eran-levy edited a comment on issue #4312: AIRFLOW-3508: add slugify env to 
build.sh script
URL: 
https://github.com/apache/incubator-airflow/pull/4312#issuecomment-447202253
 
 
   @ashb 
   the kube tests are running fine because this slugify env variable is set in
Travis; see the logs:
   _Setting environment variables from .travis.yml
   $ export DOCKER_COMPOSE_VERSION=1.20.0
   $ export SLUGIFY_USES_TEXT_UNIDECODE=yes_
   ...
   Anyway, as I said before, in order to run Airflow on minikube you need to
execute the build.sh script as described in the following:
   https://github.com/apache/incubator-airflow/tree/master/scripts/ci/kubernetes
   
   See in the script that it executes the following command:
   python setup.py sdist -q
   which fails with the following error: "By default one of Airflow's
dependencies installs a GPL dependency (unidecode). To avoid this..." - this
happens because the env variable is not set
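A sketch of the manual workaround the PR automates (the export is the documented 1.10 requirement; run it before invoking build.sh or `python setup.py sdist -q` in the Airflow source tree):

```shell
# Export the env var before building, otherwise setup.py aborts
# with the GPL-dependency (unidecode) error.
export SLUGIFY_USES_TEXT_UNIDECODE=yes
echo "$SLUGIFY_USES_TEXT_UNIDECODE"
```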





[GitHub] feng-tao commented on issue #4312: AIRFLOW-3508: add slugify env to build.sh script

2018-12-13 Thread GitBox
feng-tao commented on issue #4312: AIRFLOW-3508: add slugify env to build.sh 
script
URL: 
https://github.com/apache/incubator-airflow/pull/4312#issuecomment-447216474
 
 
   @eran-levy, please take a look at the release notes for 1.10
(https://github.com/apache/incubator-airflow/blob/master/UPDATING.md), which
require setting this env var.




[GitHub] feng-tao removed a comment on issue #4312: AIRFLOW-3508: add slugify env to build.sh script

2018-12-13 Thread GitBox
feng-tao removed a comment on issue #4312: AIRFLOW-3508: add slugify env to 
build.sh script
URL: 
https://github.com/apache/incubator-airflow/pull/4312#issuecomment-447216474
 
 
   @eran-levy, please take a look at the release notes for 1.10
(https://github.com/apache/incubator-airflow/blob/master/UPDATING.md), which
require setting this env var.




[GitHub] feng-tao closed pull request #4295: AIRFLOW-3452 removed an unused/dangerous display-none

2018-12-13 Thread GitBox
feng-tao closed pull request #4295: AIRFLOW-3452 removed an unused/dangerous 
display-none
URL: https://github.com/apache/incubator-airflow/pull/4295
 
 
   

This is a PR merged from a forked repository.
As GitHub hides the original diff on merge, it is displayed below for
the sake of provenance:

As this is a foreign pull request (from a fork), the diff is supplied
below (as it won't show otherwise due to GitHub magic):

diff --git a/airflow/www/templates/airflow/dags.html 
b/airflow/www/templates/airflow/dags.html
index a34c6720f6..1c4e0227e6 100644
--- a/airflow/www/templates/airflow/dags.html
+++ b/airflow/www/templates/airflow/dags.html
@@ -28,7 +28,7 @@
 {% block body %}
   DAGs
 
-  
+  
 
   
   
diff --git a/airflow/www_rbac/templates/airflow/dags.html 
b/airflow/www_rbac/templates/airflow/dags.html
index d1aab0218f..c6b59e98a0 100644
--- a/airflow/www_rbac/templates/airflow/dags.html
+++ b/airflow/www_rbac/templates/airflow/dags.html
@@ -28,7 +28,7 @@
 {% block content %}
   DAGs
 
-  
+  
 
   
   


 




[GitHub] feng-tao commented on issue #4295: AIRFLOW-3452 removed an unused/dangerous display-none

2018-12-13 Thread GitBox
feng-tao commented on issue #4295: AIRFLOW-3452 removed an unused/dangerous 
display-none
URL: 
https://github.com/apache/incubator-airflow/pull/4295#issuecomment-447237713
 
 
   lgtm. thanks @MarcusSorealheis 




[jira] [Commented] (AIRFLOW-3452) Cannot view dags at /home page

2018-12-13 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-3452?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16721012#comment-16721012
 ] 

ASF GitHub Bot commented on AIRFLOW-3452:
-

feng-tao closed pull request #4295: AIRFLOW-3452 removed an unused/dangerous 
display-none
URL: https://github.com/apache/incubator-airflow/pull/4295
 
 
   


> Cannot view dags at /home page
> --
>
> Key: AIRFLOW-3452
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3452
> Project: Apache Airflow
>  Issue Type: Bug
>Affects Versions: 2.0.0
>Reporter: Jinhui Zhang
>Priority: Blocker
>
> I checked out the latest master branch(commit 
> {{[9dce1f0|https://github.com/apache/incubator-airflow/commit/9dce1f0740f69af0ee86709a1a34a002b245aa3e]}})
>  and restarted my Airflow webserver. But I cannot view any dag at the home 
> page. I inspected the frontend code and found there's a 
> {{style="display:none;"}} on the {{main-content}}, and the source code says 
> so at 
> [https://github.com/apache/incubator-airflow/blob/master/airflow/www_rbac/templates/airflow/dags.html#L31]
>  . Is this a known issue? How should I fix it? 
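The diffs quoted in this thread lost their actual markup in the archive, so 
here is a hypothetical reconstruction of the change, inferred only from the 
issue description above; the element name, id, and surrounding attributes are 
assumptions:

```html
<!-- Hypothetical before/after, inferred from the issue description;
     the real markup was stripped from the archived diff. -->

<!-- Before: the main-content wrapper carries an inline style that
     hides the entire DAG list on /home. -->
<div id="main-content" style="display:none;">
  <!-- DAG table rendered here -->
</div>

<!-- After: the inline style is removed, so the DAG list renders. -->
<div id="main-content">
  <!-- DAG table rendered here -->
</div>
```

Dropping the inline `display:none;` (rather than toggling it from JavaScript) 
is what the PR title refers to as removing an "unused/dangerous" style.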



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)