incubator-airflow git commit: closes apache/incubator-airflow#3032 *Closed for inactivity*

2018-04-25 Thread sanand
Repository: incubator-airflow
Updated Branches:
  refs/heads/master 2a8bb0e1b -> dde066d00


closes apache/incubator-airflow#3032 *Closed for inactivity*


Project: http://git-wip-us.apache.org/repos/asf/incubator-airflow/repo
Commit: http://git-wip-us.apache.org/repos/asf/incubator-airflow/commit/dde066d0
Tree: http://git-wip-us.apache.org/repos/asf/incubator-airflow/tree/dde066d0
Diff: http://git-wip-us.apache.org/repos/asf/incubator-airflow/diff/dde066d0

Branch: refs/heads/master
Commit: dde066d0042cf444afec3694f3d31f0117d6ff3a
Parents: 2a8bb0e
Author: r39132 
Authored: Wed Apr 25 21:30:25 2018 -0700
Committer: r39132 
Committed: Wed Apr 25 21:30:25 2018 -0700

--

--




[jira] [Commented] (AIRFLOW-2378) Add Groupon to README

2018-04-25 Thread steven casey (JIRA)

[ 
https://issues.apache.org/jira/browse/AIRFLOW-2378?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16453450#comment-16453450
 ] 

steven casey commented on AIRFLOW-2378:
---

https://github.com/apache/incubator-airflow/pull/3267

> Add Groupon to README
> -
>
> Key: AIRFLOW-2378
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2378
> Project: Apache Airflow
>  Issue Type: Wish
>Reporter: steven casey
>Assignee: steven casey
>Priority: Trivial
>
> Add Groupon to current list of Airflow users



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Updated] (AIRFLOW-2378) Add Groupon to README

2018-04-25 Thread steven casey (JIRA)

 [ 
https://issues.apache.org/jira/browse/AIRFLOW-2378?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

steven casey updated AIRFLOW-2378:
--
Description: Add Groupon to current list of Airflow users  (was: Add 
Bonnier Broadcasting to current list of Airflow users)

> Add Groupon to README
> -
>
> Key: AIRFLOW-2378
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2378
> Project: Apache Airflow
>  Issue Type: Wish
>Reporter: steven casey
>Assignee: steven casey
>Priority: Trivial
>
> Add Groupon to current list of Airflow users





[jira] [Created] (AIRFLOW-2378) Add Groupon to README

2018-04-25 Thread steven casey (JIRA)
steven casey created AIRFLOW-2378:
-

 Summary: Add Groupon to README
 Key: AIRFLOW-2378
 URL: https://issues.apache.org/jira/browse/AIRFLOW-2378
 Project: Apache Airflow
  Issue Type: Wish
Reporter: steven casey
Assignee: Guillermo Rodríguez Cano


Add Bonnier Broadcasting to current list of Airflow users





[jira] [Assigned] (AIRFLOW-2378) Add Groupon to README

2018-04-25 Thread steven casey (JIRA)

 [ 
https://issues.apache.org/jira/browse/AIRFLOW-2378?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

steven casey reassigned AIRFLOW-2378:
-

Assignee: steven casey  (was: Guillermo Rodríguez Cano)

> Add Groupon to README
> -
>
> Key: AIRFLOW-2378
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2378
> Project: Apache Airflow
>  Issue Type: Wish
>Reporter: steven casey
>Assignee: steven casey
>Priority: Trivial
>
> Add Bonnier Broadcasting to current list of Airflow users





[jira] [Commented] (AIRFLOW-2363) S3 remote logging appending tuple instead of str

2018-04-25 Thread Kevin Yang (JIRA)

[ 
https://issues.apache.org/jira/browse/AIRFLOW-2363?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16453189#comment-16453189
 ] 

Kevin Yang commented on AIRFLOW-2363:
-

It seems the ORM is somehow not configured. The root cause is not obvious, so 
I'll set up an S3 env on my side to debug. Sorry for any trouble the bug 
might cause.

> S3 remote logging appending tuple instead of str
> 
>
> Key: AIRFLOW-2363
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2363
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: logging
>Reporter: Kyle Hamlin
>Assignee: Kevin Yang
>Priority: Major
> Fix For: 1.10.0
>
>
> A recent merge into master that added support for Elasticsearch logging seems 
> to have broken S3 logging by returning a tuple instead of a string.
> [https://github.com/apache/incubator-airflow/commit/ec38ba9594395de04ec932481212a86fbe9ae107#diff-0442332ecbe42ebbf426911c68d8cd4aR128]
>  
> following errors thrown:
>  
> *Session NoneType error*
>  Traceback (most recent call last):
>    File 
> "/usr/local/lib/python3.6/site-packages/airflow/utils/log/s3_task_handler.py",
>  line 171, in s3_write
>      encrypt=configuration.conf.getboolean('core', 'ENCRYPT_S3_LOGS'),
>    File "/usr/local/lib/python3.6/site-packages/airflow/hooks/S3_hook.py", 
> line 274, in load_string
>      encrypt=encrypt)
>    File "/usr/local/lib/python3.6/site-packages/airflow/hooks/S3_hook.py", 
> line 313, in load_bytes
>      client = self.get_conn()
>    File "/usr/local/lib/python3.6/site-packages/airflow/hooks/S3_hook.py", 
> line 34, in get_conn
>      return self.get_client_type('s3')
>    File 
> "/usr/local/lib/python3.6/site-packages/airflow/contrib/hooks/aws_hook.py", 
> line 151, in get_client_type
>      session, endpoint_url = self._get_credentials(region_name)
>    File 
> "/usr/local/lib/python3.6/site-packages/airflow/contrib/hooks/aws_hook.py", 
> line 97, in _get_credentials
>      connection_object = self.get_connection(self.aws_conn_id)
>    File "/usr/local/lib/python3.6/site-packages/airflow/hooks/base_hook.py", 
> line 82, in get_connection
>      conn = random.choice(cls.get_connections(conn_id))
>    File "/usr/local/lib/python3.6/site-packages/airflow/hooks/base_hook.py", 
> line 77, in get_connections
>      conns = cls._get_connections_from_db(conn_id)
>    File "/usr/local/lib/python3.6/site-packages/airflow/utils/db.py", line 
> 72, in wrapper
>      with create_session() as session:
>    File "/usr/local/lib/python3.6/contextlib.py", line 81, in __enter__
>      return next(self.gen)
>    File "/usr/local/lib/python3.6/site-packages/airflow/utils/db.py", line 
> 41, in create_session
>      session = settings.Session()
>  TypeError: 'NoneType' object is not callable
>  
> *TypeError must be str not tuple*
>  [2018-04-16 18:37:28,200] ERROR in app: Exception on 
> /admin/airflow/get_logs_with_metadata [GET]
>  Traceback (most recent call last):
>    File "/usr/local/lib/python3.6/site-packages/flask/app.py", line 1982, in 
> wsgi_app
>      response = self.full_dispatch_request()
>    File "/usr/local/lib/python3.6/site-packages/flask/app.py", line 1614, in 
> full_dispatch_request
>      rv = self.handle_user_exception(e)
>    File "/usr/local/lib/python3.6/site-packages/flask/app.py", line 1517, in 
> handle_user_exception
>      reraise(exc_type, exc_value, tb)
>    File "/usr/local/lib/python3.6/site-packages/flask/_compat.py", line 33, 
> in reraise
>      raise value
>    File "/usr/local/lib/python3.6/site-packages/flask/app.py", line 1612, in 
> full_dispatch_request
>      rv = self.dispatch_request()
>    File "/usr/local/lib/python3.6/site-packages/flask/app.py", line 1598, in 
> dispatch_request
>      return self.view_functions[rule.endpoint](**req.view_args)
>    File "/usr/local/lib/python3.6/site-packages/flask_admin/base.py", line 
> 69, in inner
>      return self._run_view(f, *args, **kwargs)
>    File "/usr/local/lib/python3.6/site-packages/flask_admin/base.py", line 
> 368, in _run_view
>      return fn(self, *args, **kwargs)
>    File "/usr/local/lib/python3.6/site-packages/flask_login.py", line 755, in 
> decorated_view
>      return func(*args, **kwargs)
>    File "/usr/local/lib/python3.6/site-packages/airflow/www/utils.py", line 
> 269, in wrapper
>      return f(*args, **kwargs)
>    File "/usr/local/lib/python3.6/site-packages/airflow/utils/db.py", line 
> 74, in wrapper
>      return func(*args, **kwargs)
>    File "/usr/local/lib/python3.6/site-packages/airflow/www/views.py", line 
> 770, in get_logs_with_metadata
>      logs, metadatas = handler.read(ti, try_number, metadata=metadata)
>    File 
> "/usr/local/lib/python3.6/site-packages/airflow/utils/log/file_task_handler.py",
>  line 165, in read
>      logs[i] += log
>  
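The `TypeError` at the bottom of the trace comes from `logs[i] += log` once the handler's read path started returning `(log_text, metadata)` tuples instead of plain strings. A minimal sketch of the mismatch (variable and chunk names are illustrative, not Airflow's actual values):

```python
logs = [""]  # per-try log buffers, as in file_task_handler.read()

# After the Elasticsearch change, each chunk arrives as a tuple.
log_chunk = ("some task log text", {"end_of_log": True})

try:
    logs[0] += log_chunk  # the failing concatenation: str += tuple
except TypeError as exc:
    print("TypeError:", exc)

# Concatenating only the text portion works as before.
logs[0] += log_chunk[0]
print(logs[0])
```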

[jira] [Resolved] (AIRFLOW-1835) Documentation doesn't have file format for variable file uploads

2018-04-25 Thread Siddharth Anand (JIRA)

 [ 
https://issues.apache.org/jira/browse/AIRFLOW-1835?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Siddharth Anand resolved AIRFLOW-1835.
--
   Resolution: Fixed
Fix Version/s: 2.0.0

Issue resolved by pull request #2802
[https://github.com/apache/incubator-airflow/pull/2802]

> Documentation doesn't have file format for variable file uploads
> 
>
> Key: AIRFLOW-1835
> URL: https://issues.apache.org/jira/browse/AIRFLOW-1835
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: Documentation
>Reporter: Bovard Doerschuk-Tiberi
>Assignee: Bovard Doerschuk-Tiberi
>Priority: Trivial
> Fix For: 2.0.0
>
>
> Currently the documentation doesn't tell you that to upload settings you need 
> a json file.





[jira] [Commented] (AIRFLOW-1835) Documentation doesn't have file format for variable file uploads

2018-04-25 Thread ASF subversion and git services (JIRA)

[ 
https://issues.apache.org/jira/browse/AIRFLOW-1835?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16453107#comment-16453107
 ] 

ASF subversion and git services commented on AIRFLOW-1835:
--

Commit 2a8bb0e1b7a15a70519020e69e7369700df5dc37 in incubator-airflow's branch 
refs/heads/master from [~bovard]
[ https://git-wip-us.apache.org/repos/asf?p=incubator-airflow.git;h=2a8bb0e ]

[AIRFLOW-1835] Update docs: Variable file is json

Searching through all the documentation I couldn't
find anywhere
that explained what file format it expected for
uploading settings.

Closes #2802 from bovard/variable_files_are_json


> Documentation doesn't have file format for variable file uploads
> 
>
> Key: AIRFLOW-1835
> URL: https://issues.apache.org/jira/browse/AIRFLOW-1835
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: Documentation
>Reporter: Bovard Doerschuk-Tiberi
>Assignee: Bovard Doerschuk-Tiberi
>Priority: Trivial
> Fix For: 2.0.0
>
>
> Currently the documentation doesn't tell you that to upload settings you need 
> a json file.





incubator-airflow git commit: [AIRFLOW-1835] Update docs: Variable file is json

2018-04-25 Thread sanand
Repository: incubator-airflow
Updated Branches:
  refs/heads/master ec0d227ef -> 2a8bb0e1b


[AIRFLOW-1835] Update docs: Variable file is json

Searching through all the documentation I couldn't
find anywhere
that explained what file format it expected for
uploading settings.

Closes #2802 from bovard/variable_files_are_json


Project: http://git-wip-us.apache.org/repos/asf/incubator-airflow/repo
Commit: http://git-wip-us.apache.org/repos/asf/incubator-airflow/commit/2a8bb0e1
Tree: http://git-wip-us.apache.org/repos/asf/incubator-airflow/tree/2a8bb0e1
Diff: http://git-wip-us.apache.org/repos/asf/incubator-airflow/diff/2a8bb0e1

Branch: refs/heads/master
Commit: 2a8bb0e1b7a15a70519020e69e7369700df5dc37
Parents: ec0d227
Author: Bovard Doerschuk-Tiberi 
Authored: Wed Apr 25 14:21:22 2018 -0700
Committer: r39132 
Committed: Wed Apr 25 14:21:35 2018 -0700

--
 docs/concepts.rst | 3 ++-
 1 file changed, 2 insertions(+), 1 deletion(-)
--


http://git-wip-us.apache.org/repos/asf/incubator-airflow/blob/2a8bb0e1/docs/concepts.rst
--
diff --git a/docs/concepts.rst b/docs/concepts.rst
index e85238d..89c25fe 100644
--- a/docs/concepts.rst
+++ b/docs/concepts.rst
@@ -410,7 +410,8 @@ Variables
 Variables are a generic way to store and retrieve arbitrary content or
 settings as a simple key value store within Airflow. Variables can be
 listed, created, updated and deleted from the UI (``Admin -> Variables``),
-code or CLI. While your pipeline code definition and most of your constants
+code or CLI. In addition, json settings files can be bulk uploaded through 
+the UI. While your pipeline code definition and most of your constants
 and variables should be defined in code and stored in source control,
 it can be useful to have some variables or configuration items
 accessible and modifiable through the UI.
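The bulk-upload file the doc change refers to is a flat JSON object of key/value pairs. A minimal sketch of producing such a file (the variable names below are made up for illustration):

```python
import json

# Hypothetical variables; the bulk upload expects a flat JSON object
# mapping variable keys to their values.
variables = {
    "environment": "staging",
    "retry_limit": "3",
}

with open("variables.json", "w") as f:
    json.dump(variables, f, indent=2)
```

The resulting `variables.json` can then be uploaded through ``Admin -> Variables`` in the UI.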



[jira] [Commented] (AIRFLOW-2363) S3 remote logging appending tuple instead of str

2018-04-25 Thread James Davidheiser (JIRA)

[ 
https://issues.apache.org/jira/browse/AIRFLOW-2363?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16453104#comment-16453104
 ] 

James Davidheiser commented on AIRFLOW-2363:


FWIW, I hit this bug too and tried installing the version from the pull request 
(hash 0f526bb6c244a974cae5d68d088706ed90d6b916), but it still failed with the 
NoneType error. I went back to a commit before the breaking change 
(5cb530b455be54e6b58eae19c8c10ef8f5cf955d) and it worked again.

> S3 remote logging appending tuple instead of str
> 
>
> Key: AIRFLOW-2363
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2363
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: logging
>Reporter: Kyle Hamlin
>Assignee: Kevin Yang
>Priority: Major
> Fix For: 1.10.0
>
>
> A recent merge into master that added support for Elasticsearch logging seems 
> to have broken S3 logging by returning a tuple instead of a string.
> [https://github.com/apache/incubator-airflow/commit/ec38ba9594395de04ec932481212a86fbe9ae107#diff-0442332ecbe42ebbf426911c68d8cd4aR128]
>  
> following errors thrown:
>  
> *Session NoneType error*
>  Traceback (most recent call last):
>    File 
> "/usr/local/lib/python3.6/site-packages/airflow/utils/log/s3_task_handler.py",
>  line 171, in s3_write
>      encrypt=configuration.conf.getboolean('core', 'ENCRYPT_S3_LOGS'),
>    File "/usr/local/lib/python3.6/site-packages/airflow/hooks/S3_hook.py", 
> line 274, in load_string
>      encrypt=encrypt)
>    File "/usr/local/lib/python3.6/site-packages/airflow/hooks/S3_hook.py", 
> line 313, in load_bytes
>      client = self.get_conn()
>    File "/usr/local/lib/python3.6/site-packages/airflow/hooks/S3_hook.py", 
> line 34, in get_conn
>      return self.get_client_type('s3')
>    File 
> "/usr/local/lib/python3.6/site-packages/airflow/contrib/hooks/aws_hook.py", 
> line 151, in get_client_type
>      session, endpoint_url = self._get_credentials(region_name)
>    File 
> "/usr/local/lib/python3.6/site-packages/airflow/contrib/hooks/aws_hook.py", 
> line 97, in _get_credentials
>      connection_object = self.get_connection(self.aws_conn_id)
>    File "/usr/local/lib/python3.6/site-packages/airflow/hooks/base_hook.py", 
> line 82, in get_connection
>      conn = random.choice(cls.get_connections(conn_id))
>    File "/usr/local/lib/python3.6/site-packages/airflow/hooks/base_hook.py", 
> line 77, in get_connections
>      conns = cls._get_connections_from_db(conn_id)
>    File "/usr/local/lib/python3.6/site-packages/airflow/utils/db.py", line 
> 72, in wrapper
>      with create_session() as session:
>    File "/usr/local/lib/python3.6/contextlib.py", line 81, in __enter__
>      return next(self.gen)
>    File "/usr/local/lib/python3.6/site-packages/airflow/utils/db.py", line 
> 41, in create_session
>      session = settings.Session()
>  TypeError: 'NoneType' object is not callable
>  
> *TypeError must be str not tuple*
>  [2018-04-16 18:37:28,200] ERROR in app: Exception on 
> /admin/airflow/get_logs_with_metadata [GET]
>  Traceback (most recent call last):
>    File "/usr/local/lib/python3.6/site-packages/flask/app.py", line 1982, in 
> wsgi_app
>      response = self.full_dispatch_request()
>    File "/usr/local/lib/python3.6/site-packages/flask/app.py", line 1614, in 
> full_dispatch_request
>      rv = self.handle_user_exception(e)
>    File "/usr/local/lib/python3.6/site-packages/flask/app.py", line 1517, in 
> handle_user_exception
>      reraise(exc_type, exc_value, tb)
>    File "/usr/local/lib/python3.6/site-packages/flask/_compat.py", line 33, 
> in reraise
>      raise value
>    File "/usr/local/lib/python3.6/site-packages/flask/app.py", line 1612, in 
> full_dispatch_request
>      rv = self.dispatch_request()
>    File "/usr/local/lib/python3.6/site-packages/flask/app.py", line 1598, in 
> dispatch_request
>      return self.view_functions[rule.endpoint](**req.view_args)
>    File "/usr/local/lib/python3.6/site-packages/flask_admin/base.py", line 
> 69, in inner
>      return self._run_view(f, *args, **kwargs)
>    File "/usr/local/lib/python3.6/site-packages/flask_admin/base.py", line 
> 368, in _run_view
>      return fn(self, *args, **kwargs)
>    File "/usr/local/lib/python3.6/site-packages/flask_login.py", line 755, in 
> decorated_view
>      return func(*args, **kwargs)
>    File "/usr/local/lib/python3.6/site-packages/airflow/www/utils.py", line 
> 269, in wrapper
>      return f(*args, **kwargs)
>    File "/usr/local/lib/python3.6/site-packages/airflow/utils/db.py", line 
> 74, in wrapper
>      return func(*args, **kwargs)
>    File "/usr/local/lib/python3.6/site-packages/airflow/www/views.py", line 
> 770, in get_logs_with_metadata
>      logs, metadatas = handler.read(ti, try_number, metadata=metadata)
>    File 
> 

[jira] [Resolved] (AIRFLOW-1781) Incorrect search user in group in LDAP authentication

2018-04-25 Thread Siddharth Anand (JIRA)

 [ 
https://issues.apache.org/jira/browse/AIRFLOW-1781?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Siddharth Anand resolved AIRFLOW-1781.
--
   Resolution: Fixed
Fix Version/s: 2.0.0

Issue resolved by pull request #2750
[https://github.com/apache/incubator-airflow/pull/2750]

> Incorrect search user in group in LDAP authentication
> -
>
> Key: AIRFLOW-1781
> URL: https://issues.apache.org/jira/browse/AIRFLOW-1781
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: authentication
>Reporter: Konstantin Privezentsev
>Assignee: Konstantin Privezentsev
>Priority: Minor
> Fix For: 2.0.0
>
>
> LDAP DNs are case-insensitive, but the search inside a group is 
> case-sensitive. As a result, a user does not get the correct permissions 
> after login when they log in as "user" but their DN in LDAP is "User".





[jira] [Assigned] (AIRFLOW-1781) Incorrect search user in group in LDAP authentication

2018-04-25 Thread Siddharth Anand (JIRA)

 [ 
https://issues.apache.org/jira/browse/AIRFLOW-1781?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Siddharth Anand reassigned AIRFLOW-1781:


Assignee: Konstantin Privezentsev

> Incorrect search user in group in LDAP authentication
> -
>
> Key: AIRFLOW-1781
> URL: https://issues.apache.org/jira/browse/AIRFLOW-1781
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: authentication
>Reporter: Konstantin Privezentsev
>Assignee: Konstantin Privezentsev
>Priority: Minor
> Fix For: 2.0.0
>
>
> LDAP DNs are case-insensitive, but the search inside a group is 
> case-sensitive. As a result, a user does not get the correct permissions 
> after login when they log in as "user" but their DN in LDAP is "User".





[jira] [Commented] (AIRFLOW-1781) Incorrect search user in group in LDAP authentication

2018-04-25 Thread ASF subversion and git services (JIRA)

[ 
https://issues.apache.org/jira/browse/AIRFLOW-1781?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16453093#comment-16453093
 ] 

ASF subversion and git services commented on AIRFLOW-1781:
--

Commit ec0d227ef8bc02f63515e7cb011f15d8fcf7730b in incubator-airflow's branch 
refs/heads/master from k.privezentsev
[ https://git-wip-us.apache.org/repos/asf?p=incubator-airflow.git;h=ec0d227 ]

[AIRFLOW-1781] Make search case-insensitive in LDAP group

Closes #2750 from patsak/f/ldap_search


> Incorrect search user in group in LDAP authentication
> -
>
> Key: AIRFLOW-1781
> URL: https://issues.apache.org/jira/browse/AIRFLOW-1781
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: authentication
>Reporter: Konstantin Privezentsev
>Priority: Minor
>
> LDAP DNs are case-insensitive, but the search inside a group is 
> case-sensitive. As a result, a user does not get the correct permissions 
> after login when they log in as "user" but their DN in LDAP is "User".





incubator-airflow git commit: [AIRFLOW-1781] Make search case-insensitive in LDAP group

2018-04-25 Thread sanand
Repository: incubator-airflow
Updated Branches:
  refs/heads/master 8e8c4eb4d -> ec0d227ef


[AIRFLOW-1781] Make search case-insensitive in LDAP group

Closes #2750 from patsak/f/ldap_search


Project: http://git-wip-us.apache.org/repos/asf/incubator-airflow/repo
Commit: http://git-wip-us.apache.org/repos/asf/incubator-airflow/commit/ec0d227e
Tree: http://git-wip-us.apache.org/repos/asf/incubator-airflow/tree/ec0d227e
Diff: http://git-wip-us.apache.org/repos/asf/incubator-airflow/diff/ec0d227e

Branch: refs/heads/master
Commit: ec0d227ef8bc02f63515e7cb011f15d8fcf7730b
Parents: 8e8c4eb
Author: k.privezentsev 
Authored: Wed Apr 25 14:07:30 2018 -0700
Committer: r39132 
Committed: Wed Apr 25 14:07:53 2018 -0700

--
 airflow/contrib/auth/backends/ldap_auth.py | 3 ++-
 1 file changed, 2 insertions(+), 1 deletion(-)
--


http://git-wip-us.apache.org/repos/asf/incubator-airflow/blob/ec0d227e/airflow/contrib/auth/backends/ldap_auth.py
--
diff --git a/airflow/contrib/auth/backends/ldap_auth.py 
b/airflow/contrib/auth/backends/ldap_auth.py
index c887ffb..1ab5fcd 100644
--- a/airflow/contrib/auth/backends/ldap_auth.py
+++ b/airflow/contrib/auth/backends/ldap_auth.py
@@ -77,12 +77,13 @@ def get_ldap_connection(dn=None, password=None):
 
 def group_contains_user(conn, search_base, group_filter, user_name_attr, 
username):
 search_filter = '(&({0}))'.format(group_filter)
+
 if not conn.search(native(search_base), native(search_filter),
attributes=[native(user_name_attr)]):
 log.warning("Unable to find group for %s %s", search_base, 
search_filter)
 else:
 for entry in conn.entries:
-if username in getattr(entry, user_name_attr).values:
+if username.lower() in map(lambda attr: attr.lower(), 
getattr(entry, user_name_attr).values):
 return True
 
 return False
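The essence of the patch is comparing both sides of the membership test in lower case. A standalone sketch of that comparison, assuming the attribute values are plain strings (the real code reads them off ldap3 entry attributes):

```python
def group_contains_user(member_values, username):
    # LDAP DNs are case-insensitive, so compare both sides lower-cased,
    # mirroring the patched line in ldap_auth.py.
    return username.lower() in (value.lower() for value in member_values)

print(group_contains_user(["Alice", "User"], "user"))  # True: matches despite case
print(group_contains_user(["Alice"], "bob"))           # False
```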



[jira] [Created] (AIRFLOW-2377) Improve Sendgrid sender support

2018-04-25 Thread Marcin Szymanski (JIRA)
Marcin Szymanski created AIRFLOW-2377:
-

 Summary: Improve Sendgrid sender support
 Key: AIRFLOW-2377
 URL: https://issues.apache.org/jira/browse/AIRFLOW-2377
 Project: Apache Airflow
  Issue Type: Improvement
  Components: contrib
Reporter: Marcin Szymanski
Assignee: Marcin Szymanski


* Add support for sender name
 * Allow passing sender email and name via kwargs





[jira] [Closed] (AIRFLOW-2230) [possible dup] tutorial does not specify initdb/upgradedb prerequisite command, although quick start does

2018-04-25 Thread andy dreyfuss (JIRA)

 [ 
https://issues.apache.org/jira/browse/AIRFLOW-2230?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

andy dreyfuss closed AIRFLOW-2230.
--
Resolution: Not A Problem

> [possible dup] tutorial does not specify initdb/upgradedb prerequisite 
> command, although quick start does
> -
>
> Key: AIRFLOW-2230
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2230
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: docs, Documentation
>Reporter: andy dreyfuss
>Priority: Critical
>
> Quick start specifies `initdb` but full tutorial docs afaict do not specify 
> this prerequisite command. If this is not run before everything else you end 
> up with:
> sqlalchemy.exc.OperationalError: (sqlite3.OperationalError) no such table: 
> connection [SQL: 'SELECT connection.conn_id AS connection_conn_id \nFROM 
> connection GROUP BY connection.conn_id'] (Background on this error at: 
> [http://sqlalche.me/e/e3q8])





[jira] [Commented] (AIRFLOW-2042) Default browser menu appears over autocomplete menu in DAG search box

2018-04-25 Thread ASF subversion and git services (JIRA)

[ 
https://issues.apache.org/jira/browse/AIRFLOW-2042?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16452854#comment-16452854
 ] 

ASF subversion and git services commented on AIRFLOW-2042:
--

Commit 8e8c4eb4d6a5460de342afa7bd79f6c48dd124cc in incubator-airflow's branch 
refs/heads/master from [~hakeraj]
[ https://git-wip-us.apache.org/repos/asf?p=incubator-airflow.git;h=8e8c4eb ]

[AIRFLOW-2042] Fix browser menu appearing over the autocomplete menu

Closes #2984 from fox/fix-autocomplete


> Default browser menu appears over autocomplete menu in DAG search box
> -
>
> Key: AIRFLOW-2042
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2042
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: ui
>Affects Versions: 1.9.0
> Environment: Safari 11.0.2
>Reporter: Saša Branković
>Assignee: Saša Branković
>Priority: Minor
> Fix For: 2.0.0
>
> Attachments: issue.png
>
>
> When you type something in the DAG search bar, the default browser menu 
> appears over the autocomplete menu, making it difficult to search. This 
> happens in Safari only; I'm not seeing it in Chrome or Firefox.





[jira] [Resolved] (AIRFLOW-2042) Default browser menu appears over autocomplete menu in DAG search box

2018-04-25 Thread Siddharth Anand (JIRA)

 [ 
https://issues.apache.org/jira/browse/AIRFLOW-2042?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Siddharth Anand resolved AIRFLOW-2042.
--
   Resolution: Fixed
Fix Version/s: (was: 1.9.0)
   2.0.0

Issue resolved by pull request #2984
[https://github.com/apache/incubator-airflow/pull/2984]

> Default browser menu appears over autocomplete menu in DAG search box
> -
>
> Key: AIRFLOW-2042
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2042
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: ui
>Affects Versions: 1.9.0
> Environment: Safari 11.0.2
>Reporter: Saša Branković
>Assignee: Saša Branković
>Priority: Minor
> Fix For: 2.0.0
>
> Attachments: issue.png
>
>
> When you type something in the DAG search bar, the default browser menu 
> appears over the autocomplete menu, making it difficult to search. This 
> happens in Safari only; I'm not seeing it in Chrome or Firefox.





incubator-airflow git commit: [AIRFLOW-2042] Fix browser menu appearing over the autocomplete menu

2018-04-25 Thread sanand
Repository: incubator-airflow
Updated Branches:
  refs/heads/master 0889770dc -> 8e8c4eb4d


[AIRFLOW-2042] Fix browser menu appearing over the autocomplete menu

Closes #2984 from fox/fix-autocomplete


Project: http://git-wip-us.apache.org/repos/asf/incubator-airflow/repo
Commit: http://git-wip-us.apache.org/repos/asf/incubator-airflow/commit/8e8c4eb4
Tree: http://git-wip-us.apache.org/repos/asf/incubator-airflow/tree/8e8c4eb4
Diff: http://git-wip-us.apache.org/repos/asf/incubator-airflow/diff/8e8c4eb4

Branch: refs/heads/master
Commit: 8e8c4eb4d6a5460de342afa7bd79f6c48dd124cc
Parents: 0889770
Author: Sasa Brankovic 
Authored: Wed Apr 25 11:39:26 2018 -0700
Committer: r39132 
Committed: Wed Apr 25 11:39:26 2018 -0700

--
 airflow/www/templates/airflow/dags.html | 3 ++-
 1 file changed, 2 insertions(+), 1 deletion(-)
--


http://git-wip-us.apache.org/repos/asf/incubator-airflow/blob/8e8c4eb4/airflow/www/templates/airflow/dags.html
--
diff --git a/airflow/www/templates/airflow/dags.html 
b/airflow/www/templates/airflow/dags.html
index eb22708..d22bfb3 100644
--- a/airflow/www/templates/airflow/dags.html
+++ b/airflow/www/templates/airflow/dags.html
@@ -278,9 +278,10 @@
 
   $input.change(function() {
 var current = $input.typeahead("getActive");
-
   });
 
+  $input.attr("autocomplete", "off");
+
   $('#dags').dataTable({
 "iDisplayLength": 500,
 "bSort": false,



[jira] [Created] (AIRFLOW-2376) backports.configparser.NoSectionError: No section: u'hive'

2018-04-25 Thread Ruslan Dautkhanov (JIRA)
Ruslan Dautkhanov created AIRFLOW-2376:
--

 Summary: backports.configparser.NoSectionError: No section: u'hive'
 Key: AIRFLOW-2376
 URL: https://issues.apache.org/jira/browse/AIRFLOW-2376
 Project: Apache Airflow
  Issue Type: Improvement
  Components: configuration, core
Affects Versions: 1.10.0, 2.0.0
Reporter: Ruslan Dautkhanov


Airflow master throws 

{noformat}
[2018-04-25 11:42:42,452] {models.py:1509} INFO - Executing 
 on 2018-04-25T10:28:11.172589-06:00
[2018-04-25 11:42:42,452] {base_task_runner.py:123} INFO - Running: ['bash', 
'-c', u'airflow run DISCOVER-Oracle-Load-Mar2017-v1 start 
2018-04-25T10:28:11.172589-06:00 --job_id 545 --pool DISCOVER-Prod --raw -sd 
DAGS_FOLDER/discover/discover-ora-load-2.py --cfg_path /tmp/tmpyI6FvX']
[2018-04-25 11:42:43,241] {base_task_runner.py:106} INFO - Job 545: Subtask 
start 
/opt/cloudera/parcels/Anaconda/lib/python2.7/site-packages/psycopg2/__init__.py:144:
 UserWarning: The psycopg2 wheel package will be renamed from release 2.8; in 
order to keep installing from binary please use "pip install psycopg2-binary" 
instead. For details see: 
.
[2018-04-25 11:42:43,242] {base_task_runner.py:106} INFO - Job 545: Subtask 
start   """)
[2018-04-25 11:42:43,355] {base_task_runner.py:106} INFO - Job 545: Subtask 
start [2018-04-25 11:42:43,353] {__init__.py:50} INFO - Using executor 
LocalExecutor
[2018-04-25 11:42:43,546] {base_task_runner.py:106} INFO - Job 545: Subtask 
start Traceback (most recent call last):
[2018-04-25 11:42:43,546] {base_task_runner.py:106} INFO - Job 545: Subtask 
start   File "/opt/cloudera/parcels/Anaconda/bin/airflow", line 6, in 
[2018-04-25 11:42:43,546] {base_task_runner.py:106} INFO - Job 545: Subtask 
start exec(compile(open(__file__).read(), __file__, 'exec'))
[2018-04-25 11:42:43,546] {base_task_runner.py:106} INFO - Job 545: Subtask 
start   File 
"/opt/airflow/airflow-20180420/src/apache-airflow/airflow/bin/airflow", line 
32, in 
[2018-04-25 11:42:43,547] {base_task_runner.py:106} INFO - Job 545: Subtask 
start args.func(args)
[2018-04-25 11:42:43,547] {base_task_runner.py:106} INFO - Job 545: Subtask 
start   File 
"/opt/airflow/airflow-20180420/src/apache-airflow/airflow/utils/cli.py", line 
77, in wrapper
[2018-04-25 11:42:43,547] {base_task_runner.py:106} INFO - Job 545: Subtask 
start raise e
[2018-04-25 11:42:43,547] {base_task_runner.py:106} INFO - Job 545: Subtask 
start backports.configparser.NoSectionError: No section: u'hive'
[2018-04-25 11:42:47,415] {logging_mixin.py:95} INFO - [2018-04-25 
11:42:47,415] {jobs.py:2548} INFO - Task exited with return code 1

{noformat}

Adding an empty "hive" section to airflow.cfg works around this. 
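The failure mode and the workaround can be reproduced with the stock `configparser` (the option name below is illustrative, not a real Airflow setting):

```python
import configparser

cfg = configparser.ConfigParser()
cfg.read_string("[core]\nexecutor = LocalExecutor\n")

# Looking up an option in a section that was never declared raises
# NoSectionError when no fallback is supplied.
try:
    cfg.get("hive", "some_hive_option")
    raise AssertionError("expected NoSectionError")
except configparser.NoSectionError:
    pass

# Workaround from the report: declare an empty [hive] section, after which
# lookups can fall back cleanly instead of blowing up.
cfg.read_string("[hive]\n")
print(repr(cfg.get("hive", "some_hive_option", fallback="")))  # ''
```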







incubator-airflow git commit: [AIRFLOW-XXX] Remove wheelhouse files from travis not owned by travis

2018-04-25 Thread bolke
Repository: incubator-airflow
Updated Branches:
  refs/heads/master 6c45b8c5f -> 0889770dc


[AIRFLOW-XXX] Remove wheelhouse files from travis not owned by travis


Project: http://git-wip-us.apache.org/repos/asf/incubator-airflow/repo
Commit: http://git-wip-us.apache.org/repos/asf/incubator-airflow/commit/0889770d
Tree: http://git-wip-us.apache.org/repos/asf/incubator-airflow/tree/0889770d
Diff: http://git-wip-us.apache.org/repos/asf/incubator-airflow/diff/0889770d

Branch: refs/heads/master
Commit: 0889770dc56d20e4e3e57e994e0f8b5993219f76
Parents: 6c45b8c
Author: Bolke de Bruin 
Authored: Wed Apr 25 19:46:43 2018 +0200
Committer: Bolke de Bruin 
Committed: Wed Apr 25 19:46:43 2018 +0200

--
 .travis.yml | 1 +
 1 file changed, 1 insertion(+)
--


http://git-wip-us.apache.org/repos/asf/incubator-airflow/blob/0889770d/.travis.yml
--
diff --git a/.travis.yml b/.travis.yml
index d59e885..6d29a7a 100644
--- a/.travis.yml
+++ b/.travis.yml
@@ -105,6 +105,7 @@ before_install:
   - sudo apt-get install -y oracle-java8-installer
   - jdk_switcher use oraclejdk8
   - cd $TRAVIS_BUILD_DIR
+  - find ${HOME}/.wheelhouse/ \! -user ${USER} -exec sudo rm -rf {} \;
 install:
   - pip install --upgrade pip
   - pip install tox



[jira] [Updated] (AIRFLOW-2351) timezone fix for @once DAGs

2018-04-25 Thread Ruslan Dautkhanov (JIRA)

 [ 
https://issues.apache.org/jira/browse/AIRFLOW-2351?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ruslan Dautkhanov updated AIRFLOW-2351:
---
Fix Version/s: 1.10.0

> timezone fix for @once DAGs
> ---
>
> Key: AIRFLOW-2351
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2351
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: core, scheduler
>Affects Versions: Airflow 2.0, 1.10.0, 1.9.1, 2.0.0
>Reporter: Ruslan Dautkhanov
>Assignee: Ruslan Dautkhanov
>Priority: Major
> Fix For: 1.10.0, 2.0.0
>
>
> As discussed on the dev list: 
>  
> {quote}
> Upgraded Airflow .. getting following error [1] when processing a DAG
> We have 'start_date': None set in default_args, but this used to work in 
> previous airflow versions.
> This is an '@once' DAG, so we don't need a start_date (no backfill).
> {quote}
>  
> {quote}[2018-01-16 16:05:25,283] \{models.py:293} ERROR - Failed to import: 
> /home/rdautkha/airflow/dags/discover/discover-ora-load-2.py
> Traceback (most recent call last):
>   File "/opt/airflow/airflow-20180116/src/apache-airflow/airflow/models.py", 
> line 290, in process_file
>     m = imp.load_source(mod_name, filepath)
>   File "/home/rdautkha/airflow/dags/discover/discover-ora-load-2.py", line 
> 66, in 
>     orientation                    = 'TB',                          # default 
> graph view
>   File "/opt/airflow/airflow-20180116/src/apache-airflow/airflow/models.py", 
> line 2951, in __init__
>     self.timezone = self.default_args['start_date'].tzinfo
> AttributeError: 'NoneType' object has no attribute 'tzinfo'{quote}
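The crash comes from unconditionally reading `.tzinfo` off `default_args['start_date']`, which is `None` for this '@once' DAG. A minimal sketch of the defensive pattern (illustrative only, not the actual patch; the `resolve_dag_timezone` helper and the UTC fallback are assumptions):

```python
from datetime import datetime, timezone

def resolve_dag_timezone(default_args, fallback=timezone.utc):
    """Only read tzinfo when a start_date is actually present,
    so 'start_date': None no longer raises AttributeError."""
    start_date = (default_args or {}).get('start_date')
    if start_date is not None and start_date.tzinfo is not None:
        return start_date.tzinfo
    return fallback

# A DAG declared with 'start_date': None falls back to UTC:
tz = resolve_dag_timezone({'start_date': None})
```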



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Created] (AIRFLOW-2375) GCS logging documentation needs updating

2018-04-25 Thread Berislav Lopac (JIRA)
Berislav Lopac created AIRFLOW-2375:
---

 Summary: GCS logging documentation needs updating
 Key: AIRFLOW-2375
 URL: https://issues.apache.org/jira/browse/AIRFLOW-2375
 Project: Apache Airflow
  Issue Type: Task
Reporter: Berislav Lopac
Assignee: Berislav Lopac


The instructions on how to save log files on Google Cloud Storage seem to be out 
of date, as the code there differs from the actual code in 
{{airflow/config_templates/airflow_local_settings.py}}.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Updated] (AIRFLOW-2374) Airflow fails to show logs

2018-04-25 Thread Berislav Lopac (JIRA)

 [ 
https://issues.apache.org/jira/browse/AIRFLOW-2374?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Berislav Lopac updated AIRFLOW-2374:

Priority: Blocker  (was: Major)

> Airflow fails to show logs
> --
>
> Key: AIRFLOW-2374
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2374
> Project: Apache Airflow
>  Issue Type: Bug
>Reporter: Berislav Lopac
>Assignee: Berislav Lopac
>Priority: Blocker
>
> When viewing a log in the webserver, the page shows a loading gif and the log 
> never appears. Looking in the Javascript console, the problem appears to be 
> error 500 when loading the {{get_logs_with_metadata}} endpoint, giving the 
> following trace:
> {code:java}
> [Airflow error-page ASCII art omitted]
> ---
> Node: airflow-nods-dev
> ---
> Traceback (most recent call last):
>   File 
> "/opt/airflow/src/apache-airflow/airflow/utils/log/gcs_task_handler.py", line 
> 113, in _read
> remote_log = self.gcs_read(remote_loc)
>   File 
> "/opt/airflow/src/apache-airflow/airflow/utils/log/gcs_task_handler.py", line 
> 131, in gcs_read
> return self.hook.download(bkt, blob).decode()
>   File "/opt/airflow/src/apache-airflow/airflow/contrib/hooks/gcs_hook.py", 
> line 107, in download
> .get_media(bucket=bucket, object=object) \
>   File "/usr/local/lib/python3.6/dist-packages/oauth2client/_helpers.py", 
> line 133, in positional_wrapper
> return wrapped(*args, **kwargs)
>   File "/usr/local/lib/python3.6/dist-packages/googleapiclient/http.py", line 
> 841, in execute
> raise HttpError(resp, content, uri=self.uri)
> googleapiclient.errors.HttpError:  https://www.googleapis.com/storage/v1/b/bucket-af/o/test-logs%2Fgeneric_transfer_single%2Ftransfer_file%2F2018-04-25T13%3A00%3A51.250983%2B00%3A00%2F1.log?alt=media
>  returned "Not Found">
> During handling of the above exception, another exception occurred:
> Traceback (most recent call last):
>   File "/usr/local/lib/python3.6/dist-packages/flask/app.py", line 1982, in 
> wsgi_app
> response = self.full_dispatch_request()
>   File "/usr/local/lib/python3.6/dist-packages/flask/app.py", line 1614, in 
> full_dispatch_request
> rv = self.handle_user_exception(e)
>   File "/usr/local/lib/python3.6/dist-packages/flask/app.py", line 1517, in 
> handle_user_exception
> reraise(exc_type, exc_value, tb)
>   File "/usr/local/lib/python3.6/dist-packages/flask/_compat.py", line 33, in 
> reraise
> raise value
>   File "/usr/local/lib/python3.6/dist-packages/flask/app.py", line 1612, in 
> full_dispatch_request
> rv = self.dispatch_request()
>   File "/usr/local/lib/python3.6/dist-packages/flask/app.py", line 1598, in 
> dispatch_request
> return self.view_functions[rule.endpoint](**req.view_args)
>   File "/usr/local/lib/python3.6/dist-packages/flask_admin/base.py", line 69, 
> in inner
> return self._run_view(f, *args, **kwargs)
>   File "/usr/local/lib/python3.6/dist-packages/flask_admin/base.py", line 
> 368, in _run_view
> return fn(self, *args, **kwargs)
>   File "/usr/local/lib/python3.6/dist-packages/flask_login.py", line 758, in 
> decorated_view
> return func(*args, **kwargs)
>   File "/opt/airflow/src/apache-airflow/airflow/www/utils.py", line 269, in 
> wrapper
> return f(*args, **kwargs)
>   File "/opt/airflow/src/apache-airflow/airflow/utils/db.py", line 74, in 
> wrapper
> return func(*args, **kwargs)
>   File 

[jira] [Created] (AIRFLOW-2374) Airflow fails to show logs

2018-04-25 Thread Berislav Lopac (JIRA)
Berislav Lopac created AIRFLOW-2374:
---

 Summary: Airflow fails to show logs
 Key: AIRFLOW-2374
 URL: https://issues.apache.org/jira/browse/AIRFLOW-2374
 Project: Apache Airflow
  Issue Type: Bug
Reporter: Berislav Lopac
Assignee: Berislav Lopac


When viewing a log in the webserver, the page shows a loading gif and the log 
never appears. Looking in the Javascript console, the problem appears to be 
error 500 when loading the {{get_logs_with_metadata}} endpoint, giving the 
following trace:
{code:java}
[Airflow error-page ASCII art omitted]
---
Node: airflow-nods-dev
---
Traceback (most recent call last):
  File "/opt/airflow/src/apache-airflow/airflow/utils/log/gcs_task_handler.py", 
line 113, in _read
remote_log = self.gcs_read(remote_loc)
  File "/opt/airflow/src/apache-airflow/airflow/utils/log/gcs_task_handler.py", 
line 131, in gcs_read
return self.hook.download(bkt, blob).decode()
  File "/opt/airflow/src/apache-airflow/airflow/contrib/hooks/gcs_hook.py", 
line 107, in download
.get_media(bucket=bucket, object=object) \
  File "/usr/local/lib/python3.6/dist-packages/oauth2client/_helpers.py", line 
133, in positional_wrapper
return wrapped(*args, **kwargs)
  File "/usr/local/lib/python3.6/dist-packages/googleapiclient/http.py", line 
841, in execute
raise HttpError(resp, content, uri=self.uri)
googleapiclient.errors.HttpError: https://www.googleapis.com/storage/v1/b/bucket-af/o/test-logs%2Fgeneric_transfer_single%2Ftransfer_file%2F2018-04-25T13%3A00%3A51.250983%2B00%3A00%2F1.log?alt=media
 returned "Not Found">

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.6/dist-packages/flask/app.py", line 1982, in 
wsgi_app
response = self.full_dispatch_request()
  File "/usr/local/lib/python3.6/dist-packages/flask/app.py", line 1614, in 
full_dispatch_request
rv = self.handle_user_exception(e)
  File "/usr/local/lib/python3.6/dist-packages/flask/app.py", line 1517, in 
handle_user_exception
reraise(exc_type, exc_value, tb)
  File "/usr/local/lib/python3.6/dist-packages/flask/_compat.py", line 33, in 
reraise
raise value
  File "/usr/local/lib/python3.6/dist-packages/flask/app.py", line 1612, in 
full_dispatch_request
rv = self.dispatch_request()
  File "/usr/local/lib/python3.6/dist-packages/flask/app.py", line 1598, in 
dispatch_request
return self.view_functions[rule.endpoint](**req.view_args)
  File "/usr/local/lib/python3.6/dist-packages/flask_admin/base.py", line 69, 
in inner
return self._run_view(f, *args, **kwargs)
  File "/usr/local/lib/python3.6/dist-packages/flask_admin/base.py", line 368, 
in _run_view
return fn(self, *args, **kwargs)
  File "/usr/local/lib/python3.6/dist-packages/flask_login.py", line 758, in 
decorated_view
return func(*args, **kwargs)
  File "/opt/airflow/src/apache-airflow/airflow/www/utils.py", line 269, in 
wrapper
return f(*args, **kwargs)
  File "/opt/airflow/src/apache-airflow/airflow/utils/db.py", line 74, in 
wrapper
return func(*args, **kwargs)
  File "/opt/airflow/src/apache-airflow/airflow/www/views.py", line 770, in 
get_logs_with_metadata
logs, metadatas = handler.read(ti, try_number, metadata=metadata)
  File 
"/opt/airflow/src/apache-airflow/airflow/utils/log/file_task_handler.py", line 
164, in read
log, metadata = self._read(task_instance, try_number, metadata)
  File 

[jira] [Commented] (AIRFLOW-2336) Update hive_hook dependencies so that it can work with Python 3

2018-04-25 Thread ASF subversion and git services (JIRA)

[ 
https://issues.apache.org/jira/browse/AIRFLOW-2336?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16452013#comment-16452013
 ] 

ASF subversion and git services commented on AIRFLOW-2336:
--

Commit 6c45b8c5f2ad1af8faea13529dae01cee10b4937 in incubator-airflow's branch 
refs/heads/master from [~lanzani]
[ https://git-wip-us.apache.org/repos/asf?p=incubator-airflow.git;h=6c45b8c ]

[AIRFLOW-2336] Use hmsclient in hive_hook

The package hmsclient is Python2/3 compatible and
offer a handy context
manager to handle opening and closing connections.

Closes #3239 from gglanzani/AIRFLOW-2336


> Update hive_hook dependencies so that it can work with Python 3
> ---
>
> Key: AIRFLOW-2336
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2336
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: hooks
>Affects Versions: Airflow 1.9.0
>Reporter: Giovanni Lanzani
>Assignee: Giovanni Lanzani
>Priority: Major
>  Labels: pull-request-available
> Fix For: 1.10.0
>
>   Original Estimate: 24h
>  Remaining Estimate: 24h
>
> I already have a new version of the hive metastore thrift client out. I'm 
> updating it and I will update Airflow accordingly (without changing the API)



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (AIRFLOW-2336) Update hive_hook dependencies so that it can work with Python 3

2018-04-25 Thread ASF subversion and git services (JIRA)

[ 
https://issues.apache.org/jira/browse/AIRFLOW-2336?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16452016#comment-16452016
 ] 

ASF subversion and git services commented on AIRFLOW-2336:
--

Commit 84cfbf6a1f0402a3c41921a5bd24564b8024328e in incubator-airflow's branch 
refs/heads/v1-10-test from [~lanzani]
[ https://git-wip-us.apache.org/repos/asf?p=incubator-airflow.git;h=84cfbf6 ]

[AIRFLOW-2336] Use hmsclient in hive_hook

The package hmsclient is Python2/3 compatible and
offer a handy context
manager to handle opening and closing connections.

Closes #3239 from gglanzani/AIRFLOW-2336

(cherry picked from commit 6c45b8c5f2ad1af8faea13529dae01cee10b4937)
Signed-off-by: Bolke de Bruin 


> Update hive_hook dependencies so that it can work with Python 3
> ---
>
> Key: AIRFLOW-2336
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2336
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: hooks
>Affects Versions: Airflow 1.9.0
>Reporter: Giovanni Lanzani
>Assignee: Giovanni Lanzani
>Priority: Major
>  Labels: pull-request-available
> Fix For: 1.10.0
>
>   Original Estimate: 24h
>  Remaining Estimate: 24h
>
> I already have a new version of the hive metastore thrift client out. I'm 
> updating it and I will update Airflow accordingly (without changing the API)



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Resolved] (AIRFLOW-2336) Update hive_hook dependencies so that it can work with Python 3

2018-04-25 Thread Bolke de Bruin (JIRA)

 [ 
https://issues.apache.org/jira/browse/AIRFLOW-2336?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Bolke de Bruin resolved AIRFLOW-2336.
-
   Resolution: Fixed
Fix Version/s: 1.10.0

Issue resolved by pull request #3239
[https://github.com/apache/incubator-airflow/pull/3239]

> Update hive_hook dependencies so that it can work with Python 3
> ---
>
> Key: AIRFLOW-2336
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2336
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: hooks
>Affects Versions: Airflow 1.9.0
>Reporter: Giovanni Lanzani
>Assignee: Giovanni Lanzani
>Priority: Major
>  Labels: pull-request-available
> Fix For: 1.10.0
>
>   Original Estimate: 24h
>  Remaining Estimate: 24h
>
> I already have a new version of the hive metastore thrift client out. I'm 
> updating it and I will update Airflow accordingly (without changing the API)



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (AIRFLOW-2336) Update hive_hook dependencies so that it can work with Python 3

2018-04-25 Thread ASF subversion and git services (JIRA)

[ 
https://issues.apache.org/jira/browse/AIRFLOW-2336?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16452014#comment-16452014
 ] 

ASF subversion and git services commented on AIRFLOW-2336:
--

Commit 6c45b8c5f2ad1af8faea13529dae01cee10b4937 in incubator-airflow's branch 
refs/heads/master from [~lanzani]
[ https://git-wip-us.apache.org/repos/asf?p=incubator-airflow.git;h=6c45b8c ]

[AIRFLOW-2336] Use hmsclient in hive_hook

The package hmsclient is Python2/3 compatible and
offer a handy context
manager to handle opening and closing connections.

Closes #3239 from gglanzani/AIRFLOW-2336


> Update hive_hook dependencies so that it can work with Python 3
> ---
>
> Key: AIRFLOW-2336
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2336
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: hooks
>Affects Versions: Airflow 1.9.0
>Reporter: Giovanni Lanzani
>Assignee: Giovanni Lanzani
>Priority: Major
>  Labels: pull-request-available
> Fix For: 1.10.0
>
>   Original Estimate: 24h
>  Remaining Estimate: 24h
>
> I already have a new version of the hive metastore thrift client out. I'm 
> updating it and I will update Airflow accordingly (without changing the API)



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (AIRFLOW-2336) Update hive_hook dependencies so that it can work with Python 3

2018-04-25 Thread ASF subversion and git services (JIRA)

[ 
https://issues.apache.org/jira/browse/AIRFLOW-2336?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16452017#comment-16452017
 ] 

ASF subversion and git services commented on AIRFLOW-2336:
--

Commit 84cfbf6a1f0402a3c41921a5bd24564b8024328e in incubator-airflow's branch 
refs/heads/v1-10-test from [~lanzani]
[ https://git-wip-us.apache.org/repos/asf?p=incubator-airflow.git;h=84cfbf6 ]

[AIRFLOW-2336] Use hmsclient in hive_hook

The package hmsclient is Python2/3 compatible and
offer a handy context
manager to handle opening and closing connections.

Closes #3239 from gglanzani/AIRFLOW-2336

(cherry picked from commit 6c45b8c5f2ad1af8faea13529dae01cee10b4937)
Signed-off-by: Bolke de Bruin 


> Update hive_hook dependencies so that it can work with Python 3
> ---
>
> Key: AIRFLOW-2336
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2336
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: hooks
>Affects Versions: Airflow 1.9.0
>Reporter: Giovanni Lanzani
>Assignee: Giovanni Lanzani
>Priority: Major
>  Labels: pull-request-available
> Fix For: 1.10.0
>
>   Original Estimate: 24h
>  Remaining Estimate: 24h
>
> I already have a new version of the hive metastore thrift client out. I'm 
> updating it and I will update Airflow accordingly (without changing the API)



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


incubator-airflow git commit: [AIRFLOW-2336] Use hmsclient in hive_hook

2018-04-25 Thread bolke
Repository: incubator-airflow
Updated Branches:
  refs/heads/v1-10-test 92e2ea60f -> 84cfbf6a1


[AIRFLOW-2336] Use hmsclient in hive_hook

The package hmsclient is Python2/3 compatible and
offer a handy context
manager to handle opening and closing connections.

Closes #3239 from gglanzani/AIRFLOW-2336

(cherry picked from commit 6c45b8c5f2ad1af8faea13529dae01cee10b4937)
Signed-off-by: Bolke de Bruin 


Project: http://git-wip-us.apache.org/repos/asf/incubator-airflow/repo
Commit: http://git-wip-us.apache.org/repos/asf/incubator-airflow/commit/84cfbf6a
Tree: http://git-wip-us.apache.org/repos/asf/incubator-airflow/tree/84cfbf6a
Diff: http://git-wip-us.apache.org/repos/asf/incubator-airflow/diff/84cfbf6a

Branch: refs/heads/v1-10-test
Commit: 84cfbf6a1f0402a3c41921a5bd24564b8024328e
Parents: 92e2ea6
Author: Giovanni Lanzani 
Authored: Wed Apr 25 12:23:59 2018 +0200
Committer: Bolke de Bruin 
Committed: Wed Apr 25 12:24:17 2018 +0200

--
 airflow/hooks/hive_hooks.py   | 127 +++---
 setup.py  |   6 +-
 tests/hooks/test_hive_hook.py | 136 -
 3 files changed, 192 insertions(+), 77 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/incubator-airflow/blob/84cfbf6a/airflow/hooks/hive_hooks.py
--
diff --git a/airflow/hooks/hive_hooks.py b/airflow/hooks/hive_hooks.py
index d278483..65238df 100644
--- a/airflow/hooks/hive_hooks.py
+++ b/airflow/hooks/hive_hooks.py
@@ -7,9 +7,9 @@
 # to you under the Apache License, Version 2.0 (the
 # "License"); you may not use this file except in compliance
 # with the License.  You may obtain a copy of the License at
-# 
+#
 #   http://www.apache.org/licenses/LICENSE-2.0
-# 
+#
 # Unless required by applicable law or agreed to in writing,
 # software distributed under the License is distributed on an
 # "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
@@ -28,7 +28,7 @@ import re
 import subprocess
 import time
 from tempfile import NamedTemporaryFile
-import hive_metastore
+import hmsclient
 
 from airflow import configuration as conf
 from airflow.exceptions import AirflowException
@@ -460,7 +460,6 @@ class HiveMetastoreHook(BaseHook):
 """
 from thrift.transport import TSocket, TTransport
 from thrift.protocol import TBinaryProtocol
-from hive_service import ThriftHive
 ms = self.metastore_conn
 auth_mechanism = ms.extra_dejson.get('authMechanism', 'NOSASL')
 if configuration.conf.get('core', 'security') == 'kerberos':
@@ -489,7 +488,7 @@ class HiveMetastoreHook(BaseHook):
 
 protocol = TBinaryProtocol.TBinaryProtocol(transport)
 
-return ThriftHive.Client(protocol)
+return hmsclient.HMSClient(iprot=protocol)
 
 def get_conn(self):
 return self.metastore
@@ -512,10 +511,10 @@ class HiveMetastoreHook(BaseHook):
 >>> hh.check_for_partition('airflow', t, "ds='2015-01-01'")
 True
 """
-self.metastore._oprot.trans.open()
-partitions = self.metastore.get_partitions_by_filter(
-schema, table, partition, 1)
-self.metastore._oprot.trans.close()
+with self.metastore as client:
+partitions = client.get_partitions_by_filter(
+schema, table, partition, 1)
+
 if partitions:
 return True
 else:
@@ -540,15 +539,8 @@ class HiveMetastoreHook(BaseHook):
 >>> hh.check_for_named_partition('airflow', t, "ds=xxx")
 False
 """
-self.metastore._oprot.trans.open()
-try:
-self.metastore.get_partition_by_name(
-schema, table, partition_name)
-return True
-except hive_metastore.ttypes.NoSuchObjectException:
-return False
-finally:
-self.metastore._oprot.trans.close()
+with self.metastore as client:
+return client.check_for_named_partition(schema, table, 
partition_name)
 
 def get_table(self, table_name, db='default'):
 """Get a metastore table object
@@ -560,31 +552,25 @@ class HiveMetastoreHook(BaseHook):
 >>> [col.name for col in t.sd.cols]
 ['state', 'year', 'name', 'gender', 'num']
 """
-self.metastore._oprot.trans.open()
 if db == 'default' and '.' in table_name:
 db, table_name = table_name.split('.')[:2]
-table = self.metastore.get_table(dbname=db, tbl_name=table_name)
-self.metastore._oprot.trans.close()
-return table
+with self.metastore as client:
+return client.get_table(dbname=db, tbl_name=table_name)
 
 def get_tables(self, db, pattern='*'):
 """
 Get a metastore table 

incubator-airflow git commit: [AIRFLOW-2336] Use hmsclient in hive_hook

2018-04-25 Thread bolke
Repository: incubator-airflow
Updated Branches:
  refs/heads/master fd6f1d1a0 -> 6c45b8c5f


[AIRFLOW-2336] Use hmsclient in hive_hook

The package hmsclient is Python2/3 compatible and
offer a handy context
manager to handle opening and closing connections.

Closes #3239 from gglanzani/AIRFLOW-2336


Project: http://git-wip-us.apache.org/repos/asf/incubator-airflow/repo
Commit: http://git-wip-us.apache.org/repos/asf/incubator-airflow/commit/6c45b8c5
Tree: http://git-wip-us.apache.org/repos/asf/incubator-airflow/tree/6c45b8c5
Diff: http://git-wip-us.apache.org/repos/asf/incubator-airflow/diff/6c45b8c5

Branch: refs/heads/master
Commit: 6c45b8c5f2ad1af8faea13529dae01cee10b4937
Parents: fd6f1d1
Author: Giovanni Lanzani 
Authored: Wed Apr 25 12:23:59 2018 +0200
Committer: Bolke de Bruin 
Committed: Wed Apr 25 12:23:59 2018 +0200

--
 airflow/hooks/hive_hooks.py   | 127 +++---
 setup.py  |   6 +-
 tests/hooks/test_hive_hook.py | 136 -
 3 files changed, 192 insertions(+), 77 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/incubator-airflow/blob/6c45b8c5/airflow/hooks/hive_hooks.py
--
diff --git a/airflow/hooks/hive_hooks.py b/airflow/hooks/hive_hooks.py
index d278483..65238df 100644
--- a/airflow/hooks/hive_hooks.py
+++ b/airflow/hooks/hive_hooks.py
@@ -7,9 +7,9 @@
 # to you under the Apache License, Version 2.0 (the
 # "License"); you may not use this file except in compliance
 # with the License.  You may obtain a copy of the License at
-# 
+#
 #   http://www.apache.org/licenses/LICENSE-2.0
-# 
+#
 # Unless required by applicable law or agreed to in writing,
 # software distributed under the License is distributed on an
 # "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
@@ -28,7 +28,7 @@ import re
 import subprocess
 import time
 from tempfile import NamedTemporaryFile
-import hive_metastore
+import hmsclient
 
 from airflow import configuration as conf
 from airflow.exceptions import AirflowException
@@ -460,7 +460,6 @@ class HiveMetastoreHook(BaseHook):
 """
 from thrift.transport import TSocket, TTransport
 from thrift.protocol import TBinaryProtocol
-from hive_service import ThriftHive
 ms = self.metastore_conn
 auth_mechanism = ms.extra_dejson.get('authMechanism', 'NOSASL')
 if configuration.conf.get('core', 'security') == 'kerberos':
@@ -489,7 +488,7 @@ class HiveMetastoreHook(BaseHook):
 
 protocol = TBinaryProtocol.TBinaryProtocol(transport)
 
-return ThriftHive.Client(protocol)
+return hmsclient.HMSClient(iprot=protocol)
 
 def get_conn(self):
 return self.metastore
@@ -512,10 +511,10 @@ class HiveMetastoreHook(BaseHook):
 >>> hh.check_for_partition('airflow', t, "ds='2015-01-01'")
 True
 """
-self.metastore._oprot.trans.open()
-partitions = self.metastore.get_partitions_by_filter(
-schema, table, partition, 1)
-self.metastore._oprot.trans.close()
+with self.metastore as client:
+partitions = client.get_partitions_by_filter(
+schema, table, partition, 1)
+
 if partitions:
 return True
 else:
@@ -540,15 +539,8 @@ class HiveMetastoreHook(BaseHook):
 >>> hh.check_for_named_partition('airflow', t, "ds=xxx")
 False
 """
-self.metastore._oprot.trans.open()
-try:
-self.metastore.get_partition_by_name(
-schema, table, partition_name)
-return True
-except hive_metastore.ttypes.NoSuchObjectException:
-return False
-finally:
-self.metastore._oprot.trans.close()
+with self.metastore as client:
+return client.check_for_named_partition(schema, table, 
partition_name)
 
 def get_table(self, table_name, db='default'):
 """Get a metastore table object
@@ -560,31 +552,25 @@ class HiveMetastoreHook(BaseHook):
 >>> [col.name for col in t.sd.cols]
 ['state', 'year', 'name', 'gender', 'num']
 """
-self.metastore._oprot.trans.open()
 if db == 'default' and '.' in table_name:
 db, table_name = table_name.split('.')[:2]
-table = self.metastore.get_table(dbname=db, tbl_name=table_name)
-self.metastore._oprot.trans.close()
-return table
+with self.metastore as client:
+return client.get_table(dbname=db, tbl_name=table_name)
 
 def get_tables(self, db, pattern='*'):
 """
 Get a metastore table object
 """
-self.metastore._oprot.trans.open()
-tables = self.metastore.get_tables(db_name=db, 

[jira] [Commented] (AIRFLOW-2041) Syntax errors in python examples

2018-04-25 Thread ASF subversion and git services (JIRA)

[ 
https://issues.apache.org/jira/browse/AIRFLOW-2041?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16452011#comment-16452011
 ] 

ASF subversion and git services commented on AIRFLOW-2041:
--

Commit 92e2ea60f7ffc584d3808e55ee1982d5da2e49c9 in incubator-airflow's branch 
refs/heads/v1-10-test from [~0atman]
[ https://git-wip-us.apache.org/repos/asf?p=incubator-airflow.git;h=92e2ea6 ]

[AIRFLOW-2041] Correct Syntax in python examples

I parsed it with the ol' eyeball compiler. Someone
could flake8 it better, perhaps.
Changes:

 - correct `def` syntax on line 50
 - use literal dict on line 67

Closes #2479 from 0atman/patch-1

(cherry picked from commit fd6f1d1a07a914e23e44b853f81124c9b423d22e)
Signed-off-by: Bolke de Bruin 


> Syntax errors in python examples
> 
>
> Key: AIRFLOW-2041
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2041
> Project: Apache Airflow
>  Issue Type: Bug
>Reporter: Tristram Oaten
>Assignee: Tristram Oaten
>Priority: Trivial
> Fix For: 2.0.0
>
>   Original Estimate: 10m
>  Remaining Estimate: 10m
>
> There are a few trivial syntax errors in 
> [docs/concepts.rst|https://github.com/apache/incubator-airflow/pull/2479/files#diff-c5dffe6b7f756e456fe8b2bdcf70c3c3],
>  likely because they've been written on-the-fly, and never seen a python 
> interpreter.
>  
> I fixed them in [https://github.com/apache/incubator-airflow/pull/2479]
>  



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (AIRFLOW-74) SubdagOperators can consume all celeryd worker processes

2018-04-25 Thread ASF subversion and git services (JIRA)

[ 
https://issues.apache.org/jira/browse/AIRFLOW-74?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16452010#comment-16452010
 ] 

ASF subversion and git services commented on AIRFLOW-74:


Commit d35902cb4a20afcee977327ff550e540eada5e4e in incubator-airflow's branch 
refs/heads/v1-10-test from Tao feng
[ https://git-wip-us.apache.org/repos/asf?p=incubator-airflow.git;h=d35902c ]

[AIRFLOW-74] SubdagOperators can consume all celeryd worker processes

Closes #3251 from feng-tao/airflow-74

(cherry picked from commit 64d950166773749c0e4aa0d7032b080cadd56a53)
Signed-off-by: Bolke de Bruin 


> SubdagOperators can consume all celeryd worker processes
> 
>
> Key: AIRFLOW-74
> URL: https://issues.apache.org/jira/browse/AIRFLOW-74
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: celery
>Affects Versions: Airflow 1.7.1, Airflow 1.7.0, Airflow 1.6.2
> Environment: Airflow 1.7.1rc3 with CeleryExecutor
> 1  webserver
> 1 scheduler
> 2 workers 
>Reporter: Steven Yvinec-Kruyk
>Assignee: zgl
>Priority: Major
> Fix For: 1.10.0
>
>
> If the number of concurrent ```SubdagOperator```s running >= the no. of celery 
> worker processes, tasks are unable to run. All SDOs come to a complete halt. 
> Furthermore, performance of a DAG is drastically reduced even before full 
> saturation of the workers, as fewer workers are gradually available for actual 
> tasks. A workaround for this is to specify that ```SequentialExecutor``` be used 
> by the ```SubdagOperator```
> ```
> from datetime import timedelta, datetime
> from airflow.models import DAG, Pool
> from airflow.operators import BashOperator, SubDagOperator, DummyOperator
> from airflow.executors import SequentialExecutor
> import airflow
> # -\
> # DEFINE THE POOLS
> # -/
> session = airflow.settings.Session()
> for p in ['test_pool_1', 'test_pool_2', 'test_pool_3']:
> pool = (
> session.query(Pool)
> .filter(Pool.pool == p)
> .first())
> if not pool:
> session.add(Pool(pool=p, slots=8))
> session.commit()
> # -\
> # DEFINE THE DAG
> # -/
> # Define the Dag Name. This must be unique.
> dag_name = 'hanging_subdags_n16_sqe'
> # Default args are passed to each task
> default_args = {
> 'owner': 'Airflow',
> 'depends_on_past': False,
> 'start_date': datetime(2016, 4, 10),
> 'retries': 0,
> 'retry_interval': timedelta(minutes=5),
> 'email': ['y...@email.com'],
> 'email_on_failure': True,
> 'email_on_retry': True,
> 'wait_for_downstream': False,
> }
> # Create the dag object
> dag = DAG(dag_name,
>   default_args=default_args,
>   schedule_interval='0 0 * * *'
>   )
> # -\
> # DEFINE THE TASKS
> # -/
> def get_subdag(dag, sd_id, pool=None):
> subdag = DAG(
> dag_id='{parent_dag}.{sd_id}'.format(
> parent_dag=dag.dag_id,
> sd_id=sd_id),
> params=dag.params,
> default_args=dag.default_args,
> template_searchpath=dag.template_searchpath,
> user_defined_macros=dag.user_defined_macros,
> )
> t1 = BashOperator(
> task_id='{sd_id}_step_1'.format(
> sd_id=sd_id
> ),
> bash_command='echo "hello" && sleep 60',
> dag=subdag,
> pool=pool,
> executor=SequentialExecutor
> )
> t2 = BashOperator(
> task_id='{sd_id}_step_two'.format(
> sd_id=sd_id
> ),
> bash_command='echo "hello" && sleep 15',
> dag=subdag,
> pool=pool,
> executor=SequentialExecutor
> )
> t2.set_upstream(t1)
> sdo = SubDagOperator(
> task_id=sd_id,
> subdag=subdag,
> retries=0,
> retry_delay=timedelta(seconds=5),
> dag=dag,
> depends_on_past=True,
> )
> return sdo
> start_task = DummyOperator(
> task_id='start',
> dag=dag
> )
> for n in range(1, 17):
> sd_i = get_subdag(dag=dag, sd_id='level_1_{n}'.format(n=n), 
> pool='test_pool_1')
> sd_ii = get_subdag(dag=dag, sd_id='level_2_{n}'.format(n=n), 
> pool='test_pool_2')
> sd_iii = get_subdag(dag=dag, sd_id='level_3_{n}'.format(n=n), 
> pool='test_pool_3')
> sd_i.set_upstream(start_task)
> sd_ii.set_upstream(sd_i)
> sd_iii.set_upstream(sd_ii)
> ```



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[2/2] incubator-airflow git commit: [AIRFLOW-2041] Correct Syntax in python examples

2018-04-25 Thread bolke
[AIRFLOW-2041] Correct Syntax in python examples

I parsed it with the ol' eyeball compiler. Someone
could flake8 it better, perhaps.
Changes:

 - correct `def` syntax on line 50
 - use literal dict on line 67

Closes #2479 from 0atman/patch-1

(cherry picked from commit fd6f1d1a07a914e23e44b853f81124c9b423d22e)
Signed-off-by: Bolke de Bruin 


Project: http://git-wip-us.apache.org/repos/asf/incubator-airflow/repo
Commit: http://git-wip-us.apache.org/repos/asf/incubator-airflow/commit/92e2ea60
Tree: http://git-wip-us.apache.org/repos/asf/incubator-airflow/tree/92e2ea60
Diff: http://git-wip-us.apache.org/repos/asf/incubator-airflow/diff/92e2ea60

Branch: refs/heads/v1-10-test
Commit: 92e2ea60f7ffc584d3808e55ee1982d5da2e49c9
Parents: d35902c
Author: Tristram Oaten 
Authored: Tue Apr 24 23:04:28 2018 -0700
Committer: Bolke de Bruin 
Committed: Wed Apr 25 12:22:15 2018 +0200

--
 docs/concepts.rst | 9 +
 1 file changed, 5 insertions(+), 4 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/incubator-airflow/blob/92e2ea60/docs/concepts.rst
--
diff --git a/docs/concepts.rst b/docs/concepts.rst
index 3f70555..e85238d 100644
--- a/docs/concepts.rst
+++ b/docs/concepts.rst
@@ -47,7 +47,7 @@ scope.
 
 dag_1 = DAG('this_dag_will_be_discovered')
 
-def my_function()
+def my_function():
 dag_2 = DAG('but_this_dag_will_not')
 
 my_function()
@@ -64,9 +64,10 @@ any of its operators. This makes it easy to apply a common parameter to many operators
 
 .. code:: python
 
-default_args=dict(
-start_date=datetime(2016, 1, 1),
-owner='Airflow')
+default_args = {
+'start_date': datetime(2016, 1, 1),
+'owner': 'Airflow'
+}
 
 dag = DAG('my_dag', default_args=default_args)
 op = DummyOperator(task_id='dummy', dag=dag)
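
The corrected example works because `default_args` are merged into each operator's own keyword arguments, with explicit operator kwargs taking precedence. A minimal, Airflow-free sketch of that merging (the `Operator` class here is hypothetical, standing in for Airflow's `BaseOperator`/`apply_defaults` machinery, not its real API):

```python
from datetime import datetime

class Operator:
    """Hypothetical stand-in for Airflow's BaseOperator + apply_defaults,
    showing only how DAG-level default_args merge with per-operator kwargs."""
    def __init__(self, task_id, default_args=None, **kwargs):
        merged = dict(default_args or {})  # start from the DAG-level defaults
        merged.update(kwargs)              # explicit operator kwargs win
        self.task_id = task_id
        self.owner = merged.get('owner')
        self.start_date = merged.get('start_date')

default_args = {
    'start_date': datetime(2016, 1, 1),
    'owner': 'Airflow',
}

op = Operator('dummy', default_args=default_args)
override = Operator('special', default_args=default_args, owner='alice')
```

Here `op.owner` comes from the defaults (`'Airflow'`), while `override.owner` is `'alice'` because the explicit keyword argument wins over the default.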



[1/2] incubator-airflow git commit: [AIRFLOW-74] SubdagOperators can consume all celeryd worker processes

2018-04-25 Thread bolke
Repository: incubator-airflow
Updated Branches:
  refs/heads/v1-10-test c1264e714 -> 92e2ea60f


[AIRFLOW-74] SubdagOperators can consume all celeryd worker processes

Closes #3251 from feng-tao/airflow-74

(cherry picked from commit 64d950166773749c0e4aa0d7032b080cadd56a53)
Signed-off-by: Bolke de Bruin 


Project: http://git-wip-us.apache.org/repos/asf/incubator-airflow/repo
Commit: http://git-wip-us.apache.org/repos/asf/incubator-airflow/commit/d35902cb
Tree: http://git-wip-us.apache.org/repos/asf/incubator-airflow/tree/d35902cb
Diff: http://git-wip-us.apache.org/repos/asf/incubator-airflow/diff/d35902cb

Branch: refs/heads/v1-10-test
Commit: d35902cb4a20afcee977327ff550e540eada5e4e
Parents: c1264e7
Author: Tao feng 
Authored: Tue Apr 24 10:13:25 2018 -0700
Committer: Bolke de Bruin 
Committed: Wed Apr 25 12:22:04 2018 +0200

--
 UPDATING.md  |  2 ++
 airflow/operators/subdag_operator.py | 22 ++
 tests/operators/subdag_operator.py   | 19 +--
 3 files changed, 29 insertions(+), 14 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/incubator-airflow/blob/d35902cb/UPDATING.md
--
diff --git a/UPDATING.md b/UPDATING.md
index 881539f..609c8db 100644
--- a/UPDATING.md
+++ b/UPDATING.md
@@ -5,6 +5,8 @@ assists users migrating to a new version.
 
 ## Airflow Master
 
+### Default executor for SubDagOperator is changed to SequentialExecutor
+
 ### New Webserver UI with Role-Based Access Control
 
 The current webserver UI uses the Flask-Admin extension. The new webserver UI 
uses the [Flask-AppBuilder (FAB)](https://github.com/dpgaspar/Flask-AppBuilder) 
extension. FAB has built-in authentication support and Role-Based Access 
Control (RBAC), which provides configurable roles and permissions for 
individual users.
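
The default-executor change above exists because a SubDagOperator run under a distributed executor holds one worker slot for its whole lifetime while its subdag tasks queue for the same worker pool; enough concurrent subdags can occupy every slot and deadlock. A toy model of that starvation (assumed numbers and simplifications, not Airflow code):

```python
def can_make_progress(total_workers, running_parents, pending_children):
    """Toy model of the starvation behind AIRFLOW-74: each running
    SubDagOperator (parent) holds one worker slot until its subdag
    finishes, and each subdag task (child) also needs a free slot.
    If parents hold every slot, children can never start and the
    parents never finish: deadlock."""
    free_slots = total_workers - running_parents
    # Progress is possible iff a child can grab a free slot,
    # or there are no children left to wait for.
    return free_slots > 0 or pending_children == 0

# 16 concurrent subdags on 16 celery workers: every slot is a blocked parent.
deadlocked = not can_make_progress(16, 16, 32)
```

Running each subdag's tasks in-process with SequentialExecutor sidesteps this, because the parent's slot is the only one the subdag consumes.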

http://git-wip-us.apache.org/repos/asf/incubator-airflow/blob/d35902cb/airflow/operators/subdag_operator.py
--
diff --git a/airflow/operators/subdag_operator.py b/airflow/operators/subdag_operator.py
index c3c7591..052095e 100644
--- a/airflow/operators/subdag_operator.py
+++ b/airflow/operators/subdag_operator.py
@@ -7,9 +7,9 @@
 # to you under the Apache License, Version 2.0 (the
 # "License"); you may not use this file except in compliance
 # with the License.  You may obtain a copy of the License at
-# 
+#
 #   http://www.apache.org/licenses/LICENSE-2.0
-# 
+#
 # Unless required by applicable law or agreed to in writing,
 # software distributed under the License is distributed on an
 # "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
@@ -18,10 +18,10 @@
 # under the License.
 
 from airflow.exceptions import AirflowException
+from airflow.executors.sequential_executor import SequentialExecutor
 from airflow.models import BaseOperator, Pool
 from airflow.utils.decorators import apply_defaults
 from airflow.utils.db import provide_session
-from airflow.executors import GetDefaultExecutor
 
 
 class SubDagOperator(BaseOperator):
@@ -35,16 +35,19 @@ class SubDagOperator(BaseOperator):
 def __init__(
 self,
 subdag,
-executor=GetDefaultExecutor(),
+executor=SequentialExecutor(),
 *args, **kwargs):
 """
-Yo dawg. This runs a sub dag. By convention, a sub dag's dag_id
+This runs a sub dag. By convention, a sub dag's dag_id
 should be prefixed by its parent and a dot. As in `parent.child`.
 
 :param subdag: the DAG object to run as a subdag of the current DAG.
-:type subdag: airflow.DAG
-:param dag: the parent DAG
-:type subdag: airflow.DAG
+:type subdag: airflow.DAG.
+:param dag: the parent DAG for the subdag.
+:type dag: airflow.DAG.
+:param executor: the executor for this subdag. Default to use SequentialExecutor.
+ Please find AIRFLOW-74 for more details.
+:type executor: airflow.executors.
 """
 import airflow.models
 dag = kwargs.get('dag') or airflow.models._CONTEXT_MANAGER_DAG
@@ -88,6 +91,9 @@ class SubDagOperator(BaseOperator):
 )
 
 self.subdag = subdag
+# Airflow pool is not honored by SubDagOperator.
+# Hence resources could be consumed by SubdagOperators
+# Use other executor with your own risk.
 self.executor = executor
 
 def execute(self, context):

http://git-wip-us.apache.org/repos/asf/incubator-airflow/blob/d35902cb/tests/operators/subdag_operator.py
--
diff --git a/tests/operators/subdag_operator.py b/tests/operators/subdag_operator.py
index 

[jira] [Created] (AIRFLOW-2373) Do not run tasks when DagRun state is not running

2018-04-25 Thread Kevin Yang (JIRA)
Kevin Yang created AIRFLOW-2373:
---

 Summary: Do not run tasks when DagRun state is not running
 Key: AIRFLOW-2373
 URL: https://issues.apache.org/jira/browse/AIRFLOW-2373
 Project: Apache Airflow
  Issue Type: Improvement
Reporter: Kevin Yang


Logically it might make sense to stop tasks from being started when DagRun is 
not running.

 

Note that this will affect the ability to run a task from the UI; it might make sense to add an additional "ignore" option in the UI when running tasks manually.
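
The proposed check could look roughly like this (hypothetical helper and names; a sketch of the idea in the issue, not Airflow's scheduler code):

```python
RUNNING = 'running'

def should_run_task(dagrun_state, ignore_dagrun_state=False):
    """Sketch of the AIRFLOW-2373 proposal: refuse to start a task
    unless its DagRun is in the 'running' state, with an escape
    hatch for manual triggering from the UI (the suggested
    'ignore' option)."""
    if ignore_dagrun_state:
        return True  # the UI's proposed 'ignore' option bypasses the check
    return dagrun_state == RUNNING
```

With this gate, a task belonging to a failed or stopped DagRun would be skipped by the scheduler, but could still be launched manually with the ignore flag set.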





[jira] [Resolved] (AIRFLOW-2041) Syntax errors in python examples

2018-04-25 Thread Siddharth Anand (JIRA)

 [ 
https://issues.apache.org/jira/browse/AIRFLOW-2041?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Siddharth Anand resolved AIRFLOW-2041.
--
   Resolution: Fixed
Fix Version/s: 2.0.0

Issue resolved by pull request #2479
[https://github.com/apache/incubator-airflow/pull/2479]

> Syntax errors in python examples
> 
>
> Key: AIRFLOW-2041
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2041
> Project: Apache Airflow
>  Issue Type: Bug
>Reporter: Tristram Oaten
>Assignee: Tristram Oaten
>Priority: Trivial
> Fix For: 2.0.0
>
>   Original Estimate: 10m
>  Remaining Estimate: 10m
>
> There are a few trivial syntax errors in 
> [docs/concepts.rst|https://github.com/apache/incubator-airflow/pull/2479/files#diff-c5dffe6b7f756e456fe8b2bdcf70c3c3],
>  likely because they've been written on-the-fly, and never seen a python 
> interpreter.
>  
> I fixed them in [https://github.com/apache/incubator-airflow/pull/2479]
>  





[jira] [Commented] (AIRFLOW-2041) Syntax errors in python examples

2018-04-25 Thread ASF subversion and git services (JIRA)

[ 
https://issues.apache.org/jira/browse/AIRFLOW-2041?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16451696#comment-16451696
 ] 

ASF subversion and git services commented on AIRFLOW-2041:
--

Commit fd6f1d1a07a914e23e44b853f81124c9b423d22e in incubator-airflow's branch 
refs/heads/master from [~0atman]
[ https://git-wip-us.apache.org/repos/asf?p=incubator-airflow.git;h=fd6f1d1 ]

[AIRFLOW-2041] Correct Syntax in python examples

I parsed it with the ol' eyeball compiler. Someone
could flake8 it better, perhaps.
Changes:

 - correct `def` syntax on line 50
 - use literal dict on line 67

Closes #2479 from 0atman/patch-1


> Syntax errors in python examples
> 
>
> Key: AIRFLOW-2041
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2041
> Project: Apache Airflow
>  Issue Type: Bug
>Reporter: Tristram Oaten
>Assignee: Tristram Oaten
>Priority: Trivial
> Fix For: 2.0.0
>
>   Original Estimate: 10m
>  Remaining Estimate: 10m
>
> There are a few trivial syntax errors in 
> [docs/concepts.rst|https://github.com/apache/incubator-airflow/pull/2479/files#diff-c5dffe6b7f756e456fe8b2bdcf70c3c3],
>  likely because they've been written on-the-fly, and never seen a python 
> interpreter.
>  
> I fixed them in [https://github.com/apache/incubator-airflow/pull/2479]
>  





incubator-airflow git commit: [AIRFLOW-2041] Correct Syntax in python examples

2018-04-25 Thread sanand
Repository: incubator-airflow
Updated Branches:
  refs/heads/master 64d950166 -> fd6f1d1a0


[AIRFLOW-2041] Correct Syntax in python examples

I parsed it with the ol' eyeball compiler. Someone
could flake8 it better, perhaps.
Changes:

 - correct `def` syntax on line 50
 - use literal dict on line 67

Closes #2479 from 0atman/patch-1


Project: http://git-wip-us.apache.org/repos/asf/incubator-airflow/repo
Commit: http://git-wip-us.apache.org/repos/asf/incubator-airflow/commit/fd6f1d1a
Tree: http://git-wip-us.apache.org/repos/asf/incubator-airflow/tree/fd6f1d1a
Diff: http://git-wip-us.apache.org/repos/asf/incubator-airflow/diff/fd6f1d1a

Branch: refs/heads/master
Commit: fd6f1d1a07a914e23e44b853f81124c9b423d22e
Parents: 64d9501
Author: Tristram Oaten 
Authored: Tue Apr 24 23:04:28 2018 -0700
Committer: r39132 
Committed: Tue Apr 24 23:04:38 2018 -0700

--
 docs/concepts.rst | 9 +
 1 file changed, 5 insertions(+), 4 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/incubator-airflow/blob/fd6f1d1a/docs/concepts.rst
--
diff --git a/docs/concepts.rst b/docs/concepts.rst
index 3f70555..e85238d 100644
--- a/docs/concepts.rst
+++ b/docs/concepts.rst
@@ -47,7 +47,7 @@ scope.
 
 dag_1 = DAG('this_dag_will_be_discovered')
 
-def my_function()
+def my_function():
 dag_2 = DAG('but_this_dag_will_not')
 
 my_function()
@@ -64,9 +64,10 @@ any of its operators. This makes it easy to apply a common parameter to many operators
 
 .. code:: python
 
-default_args=dict(
-start_date=datetime(2016, 1, 1),
-owner='Airflow')
+default_args = {
+'start_date': datetime(2016, 1, 1),
+'owner': 'Airflow'
+}
 
 dag = DAG('my_dag', default_args=default_args)
 op = DummyOperator(task_id='dummy', dag=dag)