[jira] [Resolved] (AIRFLOW-533) DB API hook's insert_rows sets autocommit non-generically
[ https://issues.apache.org/jira/browse/AIRFLOW-533?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Siddharth Anand resolved AIRFLOW-533.
-------------------------------------
    Resolution: Fixed

Issue resolved by pull request #1813
[https://github.com/apache/incubator-airflow/pull/1813]

> DB API hook's insert_rows sets autocommit non-generically
> ---------------------------------------------------------
>
>                 Key: AIRFLOW-533
>                 URL: https://issues.apache.org/jira/browse/AIRFLOW-533
>             Project: Apache Airflow
>          Issue Type: Bug
>          Components: hooks
>            Reporter: Luke Rohde
>            Assignee: Luke Rohde
>
> The DB API hook has different behavior between
> https://github.com/apache/incubator-airflow/blob/189e6b88742ace8c46e72d59d7662284e34b7a2e/airflow/hooks/dbapi_hook.py#L189
> and
> https://github.com/apache/incubator-airflow/blob/189e6b88742ace8c46e72d59d7662284e34b7a2e/airflow/hooks/dbapi_hook.py#L142
> in how it handles autocommit. The latter seems correct to me, as it delegates
> to the implementation how to set autocommit.

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
incubator-airflow git commit: [AIRFLOW-533] Set autocommit via set_autocommit
Repository: incubator-airflow
Updated Branches:
  refs/heads/master 12e48b4c6 -> f34b292f5

[AIRFLOW-533] Set autocommit via set_autocommit

Delegate setting autocommit in insert_rows to set_autocommit

Closes #1813 from thyming/fix-insert-rows-autocommit

Project: http://git-wip-us.apache.org/repos/asf/incubator-airflow/repo
Commit: http://git-wip-us.apache.org/repos/asf/incubator-airflow/commit/f34b292f
Tree: http://git-wip-us.apache.org/repos/asf/incubator-airflow/tree/f34b292f
Diff: http://git-wip-us.apache.org/repos/asf/incubator-airflow/diff/f34b292f

Branch: refs/heads/master
Commit: f34b292f5889a6c155d967d1987e052e0bc92075
Parents: 12e48b4
Author: Luke Rohde
Authored: Mon Nov 14 23:34:51 2016 -0800
Committer: Siddharth Anand
Committed: Mon Nov 14 23:34:56 2016 -0800

----------------------------------------------------------------------
 airflow/hooks/dbapi_hook.py | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)
----------------------------------------------------------------------

http://git-wip-us.apache.org/repos/asf/incubator-airflow/blob/f34b292f/airflow/hooks/dbapi_hook.py
----------------------------------------------------------------------
diff --git a/airflow/hooks/dbapi_hook.py b/airflow/hooks/dbapi_hook.py
index 939bae2..df52e54 100644
--- a/airflow/hooks/dbapi_hook.py
+++ b/airflow/hooks/dbapi_hook.py
@@ -203,10 +203,10 @@ class DbApiHook(BaseHook):
         else:
             target_fields = ''
         conn = self.get_conn()
-        cur = conn.cursor()
         if self.supports_autocommit:
-            cur.execute('SET autocommit = 0')
+            self.set_autocommit(conn, False)
         conn.commit()
+        cur = conn.cursor()
         i = 0
         for row in rows:
             i += 1
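The patch above replaces the MySQL-flavored `SET autocommit = 0` statement with a call to the hook's own `set_autocommit` method, so each DB-API implementation decides how autocommit is toggled. A minimal sketch of that pattern (the class and method names below are illustrative, not the actual Airflow code):

```python
class FakeConnection:
    """Stand-in for a DB-API connection object (hypothetical)."""
    def __init__(self):
        self.autocommit = True
        self.commits = 0

    def commit(self):
        self.commits += 1

    def cursor(self):
        return "cursor"


class GenericDbApiHook:
    """Sketch of the delegation pattern: the base class calls
    set_autocommit(), and driver-specific hooks override it."""
    supports_autocommit = True

    def set_autocommit(self, conn, autocommit):
        # Generic default; a driver whose connection object exposes a
        # different autocommit API would override this method instead.
        conn.autocommit = autocommit

    def prepare_insert(self, conn):
        # Mirrors the patched insert_rows ordering: disable autocommit,
        # commit, then open the cursor.
        if self.supports_autocommit:
            self.set_autocommit(conn, False)
        conn.commit()
        return conn.cursor()
```

A subclass for a driver with, say, a `conn.set_autocommit()` method would only override `set_autocommit`, leaving `prepare_insert` untouched.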
[jira] [Commented] (AIRFLOW-533) DB API hook's insert_rows sets autocommit non-generically
[ https://issues.apache.org/jira/browse/AIRFLOW-533?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15666397#comment-15666397 ]

ASF subversion and git services commented on AIRFLOW-533:
---------------------------------------------------------

Commit f34b292f5889a6c155d967d1987e052e0bc92075 in incubator-airflow's branch
refs/heads/master from [~lukerohde]
[ https://git-wip-us.apache.org/repos/asf?p=incubator-airflow.git;h=f34b292 ]

[AIRFLOW-533] Set autocommit via set_autocommit

Delegate setting autocommit in insert_rows to set_autocommit

Closes #1813 from thyming/fix-insert-rows-autocommit

> DB API hook's insert_rows sets autocommit non-generically
> ---------------------------------------------------------
>
>                 Key: AIRFLOW-533
>                 URL: https://issues.apache.org/jira/browse/AIRFLOW-533
>             Project: Apache Airflow
>          Issue Type: Bug
>          Components: hooks
>            Reporter: Luke Rohde
>            Assignee: Luke Rohde
>
> The DB API hook has different behavior between
> https://github.com/apache/incubator-airflow/blob/189e6b88742ace8c46e72d59d7662284e34b7a2e/airflow/hooks/dbapi_hook.py#L189
> and
> https://github.com/apache/incubator-airflow/blob/189e6b88742ace8c46e72d59d7662284e34b7a2e/airflow/hooks/dbapi_hook.py#L142
> in how it handles autocommit. The latter seems correct to me, as it delegates
> to the implementation how to set autocommit.

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
[jira] [Commented] (AIRFLOW-479) Config: Add LDAP Bind Password captured by stdout
[ https://issues.apache.org/jira/browse/AIRFLOW-479?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15666387#comment-15666387 ]

Yongjun Park commented on AIRFLOW-479:
--------------------------------------

Can I take this issue?

> Config: Add LDAP Bind Password captured by stdout
> -------------------------------------------------
>
>                 Key: AIRFLOW-479
>                 URL: https://issues.apache.org/jira/browse/AIRFLOW-479
>             Project: Apache Airflow
>          Issue Type: Improvement
>          Components: security
>    Affects Versions: Airflow 1.7.1
>            Reporter: Jason DeCorte
>            Priority: Minor
>
> Currently Airflow configuration only allows certain properties to be assigned
> by the standard output of a command shell script. For security reasons, I
> would like the bind_password property for ldap configuration to be added to
> the list. This will enhance security by allowing the encryption of the
> password while stored and the decryption of it at run time.

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
[jira] [Resolved] (AIRFLOW-629) stop pinning lxml
[ https://issues.apache.org/jira/browse/AIRFLOW-629?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Siddharth Anand resolved AIRFLOW-629.
-------------------------------------
    Resolution: Fixed

> stop pinning lxml
> -----------------
>
>                 Key: AIRFLOW-629
>                 URL: https://issues.apache.org/jira/browse/AIRFLOW-629
>             Project: Apache Airflow
>          Issue Type: Improvement
>            Reporter: Haydn Dufrene
>            Assignee: Haydn Dufrene
>
> lxml version should not be a hard requirement in the setup file.

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
[jira] [Commented] (AIRFLOW-629) stop pinning lxml
[ https://issues.apache.org/jira/browse/AIRFLOW-629?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15666373#comment-15666373 ]

ASF subversion and git services commented on AIRFLOW-629:
---------------------------------------------------------

Commit 12e48b4c62bb764a38732f823708eb5c0ab2ac1f in incubator-airflow's branch
refs/heads/master from jedipi
[ https://git-wip-us.apache.org/repos/asf?p=incubator-airflow.git;h=12e48b4 ]

[AIRFLOW-629] stop pinning lxml

Closes #1882 from jedipi/improvement/stop-pinning-lxml

> stop pinning lxml
> -----------------
>
>                 Key: AIRFLOW-629
>                 URL: https://issues.apache.org/jira/browse/AIRFLOW-629
>             Project: Apache Airflow
>          Issue Type: Improvement
>            Reporter: Haydn Dufrene
>            Assignee: Haydn Dufrene
>
> lxml version should not be a hard requirement in the setup file.

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
incubator-airflow git commit: [AIRFLOW-629] stop pinning lxml
Repository: incubator-airflow
Updated Branches:
  refs/heads/master dd1f50e59 -> 12e48b4c6

[AIRFLOW-629] stop pinning lxml

Closes #1882 from jedipi/improvement/stop-pinning-lxml

Project: http://git-wip-us.apache.org/repos/asf/incubator-airflow/repo
Commit: http://git-wip-us.apache.org/repos/asf/incubator-airflow/commit/12e48b4c
Tree: http://git-wip-us.apache.org/repos/asf/incubator-airflow/tree/12e48b4c
Diff: http://git-wip-us.apache.org/repos/asf/incubator-airflow/diff/12e48b4c

Branch: refs/heads/master
Commit: 12e48b4c62bb764a38732f823708eb5c0ab2ac1f
Parents: dd1f50e
Author: jedipi
Authored: Mon Nov 14 23:26:25 2016 -0800
Committer: Siddharth Anand
Committed: Mon Nov 14 23:26:25 2016 -0800

----------------------------------------------------------------------
 setup.py | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)
----------------------------------------------------------------------

http://git-wip-us.apache.org/repos/asf/incubator-airflow/blob/12e48b4c/setup.py
----------------------------------------------------------------------
diff --git a/setup.py b/setup.py
index 3a75168..baa496a 100644
--- a/setup.py
+++ b/setup.py
@@ -205,7 +205,7 @@ def do_setup():
         'tabulate>=0.7.5, <0.8.0',
         'thrift>=0.9.2, <0.10',
         'zope.deprecation>=4.0, <5.0',
-        'lxml==3.6.0',
+        'lxml>=3.6.0, <4.0',
     ],
     extras_require={
         'all': devel_all,
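The change relaxes the exact pin `lxml==3.6.0` to the range `lxml>=3.6.0, <4.0`. The practical difference can be checked with the third-party `packaging` library (assumed installed; it ships with modern pip and setuptools):

```python
from packaging.specifiers import SpecifierSet
from packaging.version import Version

pinned = SpecifierSet("==3.6.0")        # old requirement: 'lxml==3.6.0'
ranged = SpecifierSet(">=3.6.0, <4.0")  # new requirement: 'lxml>=3.6.0, <4.0'

# The exact pin rejects every release except 3.6.0 itself, even
# compatible patch releases; the range accepts the whole 3.x line
# from 3.6.0 onward while still blocking a 4.0 major bump.
pin_accepts_patch = Version("3.6.4") in pinned
range_accepts_patch = Version("3.6.4") in ranged
range_accepts_major = Version("4.0.0") in ranged
```

This is the usual argument against hard pins in a library's `install_requires`: an `==` pin forces the whole dependency graph onto one exact version and conflicts with any other package that needs a different one.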
[jira] [Resolved] (AIRFLOW-464) Add setdefault method to Variable Object
[ https://issues.apache.org/jira/browse/AIRFLOW-464?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Siddharth Anand resolved AIRFLOW-464.
-------------------------------------
    Resolution: Fixed

> Add setdefault method to Variable Object
> ----------------------------------------
>
>                 Key: AIRFLOW-464
>                 URL: https://issues.apache.org/jira/browse/AIRFLOW-464
>             Project: Apache Airflow
>          Issue Type: Improvement
>            Reporter: Ben Tallman
>            Assignee: Ben Tallman
>            Priority: Trivial
>
> In order to assist with environment migrations, we added a setdefault method
> to the Variable object. This allows default variables to be created (and then
> edited) with less chance for typos/copy+paste bugs.
>
> Variable.setdefault(key, default, deserialize_json=[True|False]) returns
> either the value stored in Variable(key) or sets Variable(key) = default and
> returns default.

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
[jira] [Commented] (AIRFLOW-464) Add setdefault method to Variable Object
[ https://issues.apache.org/jira/browse/AIRFLOW-464?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15666329#comment-15666329 ]

ASF subversion and git services commented on AIRFLOW-464:
---------------------------------------------------------

Commit dd1f50e59008e95f457ce47ba877f631d5187530 in incubator-airflow's branch
refs/heads/master from [~btall...@gmail.com]
[ https://git-wip-us.apache.org/repos/asf?p=incubator-airflow.git;h=dd1f50e ]

[AIRFLOW-464] Add setdefault method to Variable

In order to assist with environment migrations, we added a setdefault method
to the Variable object. This allows default variables to be created (and then
edited) with less chance for typos/copy+paste bugs. Also changed Variable.get
ValueError exception to a KeyError.

Variable.setdefault(key, default, deserialize_json=[True|False]) returns
either the value stored in Variable(key) or sets Variable(key) = default and
returns default.

Dear Airflow Maintainers,

Please accept this PR that addresses the following issues:
- _https://issues.apache.org/jira/browse/AIRFLOW-464_

This was changed from adding a create_if_none flag to Variable.get based on
feedback on this PR.

Testing Done:
- Added a test to test/core.py to cover this functionality

Closes #1765 from btallman/CreateIfNone_feature

> Add setdefault method to Variable Object
> ----------------------------------------
>
>                 Key: AIRFLOW-464
>                 URL: https://issues.apache.org/jira/browse/AIRFLOW-464
>             Project: Apache Airflow
>          Issue Type: Improvement
>            Reporter: Ben Tallman
>            Assignee: Ben Tallman
>            Priority: Trivial
>
> In order to assist with environment migrations, we added a setdefault method
> to the Variable object. This allows default variables to be created (and then
> edited) with less chance for typos/copy+paste bugs.
>
> Variable.setdefault(key, default, deserialize_json=[True|False]) returns
> either the value stored in Variable(key) or sets Variable(key) = default and
> returns default.

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
incubator-airflow git commit: [AIRFLOW-464] Add setdefault method to Variable
Repository: incubator-airflow
Updated Branches:
  refs/heads/master c6dd4d457 -> dd1f50e59

[AIRFLOW-464] Add setdefault method to Variable

In order to assist with environment migrations, we added a setdefault method
to the Variable object. This allows default variables to be created (and then
edited) with less chance for typos/copy+paste bugs. Also changed Variable.get
ValueError exception to a KeyError.

Variable.setdefault(key, default, deserialize_json=[True|False]) returns
either the value stored in Variable(key) or sets Variable(key) = default and
returns default.

Dear Airflow Maintainers,

Please accept this PR that addresses the following issues:
- _https://issues.apache.org/jira/browse/AIRFLOW-464_

This was changed from adding a create_if_none flag to Variable.get based on
feedback on this PR.

Testing Done:
- Added a test to test/core.py to cover this functionality

Closes #1765 from btallman/CreateIfNone_feature

Project: http://git-wip-us.apache.org/repos/asf/incubator-airflow/repo
Commit: http://git-wip-us.apache.org/repos/asf/incubator-airflow/commit/dd1f50e5
Tree: http://git-wip-us.apache.org/repos/asf/incubator-airflow/tree/dd1f50e5
Diff: http://git-wip-us.apache.org/repos/asf/incubator-airflow/diff/dd1f50e5

Branch: refs/heads/master
Commit: dd1f50e59008e95f457ce47ba877f631d5187530
Parents: c6dd4d4
Author: Benjamin Tallman
Authored: Mon Nov 14 23:04:04 2016 -0800
Committer: Siddharth Anand
Committed: Mon Nov 14 23:04:21 2016 -0800

----------------------------------------------------------------------
 airflow/models.py | 32 +++-
 tests/core.py     | 12
 2 files changed, 43 insertions(+), 1 deletion(-)
----------------------------------------------------------------------

http://git-wip-us.apache.org/repos/asf/incubator-airflow/blob/dd1f50e5/airflow/models.py
----------------------------------------------------------------------
diff --git a/airflow/models.py b/airflow/models.py
index 3509a0c..e64d04f 100755
--- a/airflow/models.py
+++ b/airflow/models.py
@@ -3366,6 +3366,36 @@ class Variable(Base):
             descriptor=property(cls.get_val, cls.set_val))
 
     @classmethod
+    def setdefault(cls, key, default, deserialize_json=False):
+        """
+        Like a Python builtin dict object, setdefault returns the current value
+        for a key, and if it isn't there, stores the default value and returns it.
+
+        :param key: Dict key for this Variable
+        :type key: String
+        :param: default: Default value to set and return if the variable
+            isn't already in the DB
+        :type: default: Mixed
+        :param: deserialize_json: Store this as a JSON encoded value in the DB
+            and un-encode it when retrieving a value
+        :return: Mixed
+        """
+        default_sentinel = object()
+        obj = Variable.get(key, default_var=default_sentinel, deserialize_json=False)
+        if obj is default_sentinel:
+            if default is not None:
+                Variable.set(key, default, serialize_json=deserialize_json)
+                return default
+            else:
+                raise ValueError('Default Value must be set')
+        else:
+            if deserialize_json:
+                return json.loads(obj.val)
+            else:
+                return obj.val
+
+    @classmethod
     @provide_session
     def get(cls, key, default_var=None, deserialize_json=False, session=None):
         obj = session.query(cls).filter(cls.key == key).first()
@@ -3373,7 +3403,7 @@ class Variable(Base):
             if default_var is not None:
                 return default_var
             else:
-                raise ValueError('Variable {} does not exist'.format(key))
+                raise KeyError('Variable {} does not exist'.format(key))
         else:
             if deserialize_json:
                 return json.loads(obj.val)

http://git-wip-us.apache.org/repos/asf/incubator-airflow/blob/dd1f50e5/tests/core.py
----------------------------------------------------------------------
diff --git a/tests/core.py b/tests/core.py
index effc63d..b52945a 100644
--- a/tests/core.py
+++ b/tests/core.py
@@ -691,6 +691,18 @@ class CoreTest(unittest.TestCase):
             default_var=default_value,
             deserialize_json=True)
 
+    def test_variable_setdefault_round_trip(self):
+        key = "tested_var_setdefault_1_id"
+        value = "Monday morning breakfast in Paris"
+        Variable.setdefault(key, value)
+        assert value == Variable.get(key)
+
+    def test_variable_setdefault_round_trip_json(self):
+        key = "tested_var_setdefault_2_id"
+        value = {"city": 'Paris', "Hapiness": True}
+        Variable.setdefault(key, value, deserialize_json=True)
+        assert value == Variable.get(key, deserialize_json=True)
+
     def
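The semantics being added mirror `dict.setdefault`: return the stored value if the key exists, otherwise store the default and return it, with `deserialize_json` controlling a JSON round-trip. A self-contained sketch against an in-memory store (plain functions and a module-level dict standing in for the SQLAlchemy-backed `Variable` model; all names here are illustrative):

```python
import json

_store = {}  # stand-in for the variable table (hypothetical)


def var_set(key, value, serialize_json=False):
    # Optionally JSON-encode before storing, like Variable.set.
    _store[key] = json.dumps(value) if serialize_json else value


def var_get(key, deserialize_json=False):
    # Raises KeyError for a missing key, matching the patched Variable.get.
    if key not in _store:
        raise KeyError('Variable {} does not exist'.format(key))
    val = _store[key]
    return json.loads(val) if deserialize_json else val


def var_setdefault(key, default, deserialize_json=False):
    # Like dict.setdefault: return the stored value, or store the
    # default and return it.
    try:
        return var_get(key, deserialize_json=deserialize_json)
    except KeyError:
        if default is None:
            raise ValueError('Default Value must be set')
        var_set(key, default, serialize_json=deserialize_json)
        return default
```

Note how the `ValueError` → `KeyError` change in `Variable.get` matters here: `setdefault` needs a distinguishable "key missing" signal, and `KeyError` is the conventional one.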
[jira] [Commented] (AIRFLOW-626) HTML Content does not show up when sending email with attachment
[ https://issues.apache.org/jira/browse/AIRFLOW-626?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15665976#comment-15665976 ]

ASF subversion and git services commented on AIRFLOW-626:
---------------------------------------------------------

Commit c6dd4d4578918364da1cd3d5655a8d41a65871b5 in incubator-airflow's branch
refs/heads/master from Siddharth Anand
[ https://git-wip-us.apache.org/repos/asf?p=incubator-airflow.git;h=c6dd4d4 ]

Revert "[AIRFLOW-626] HTML Content does not show up when sending email with attachment"

This reverts commit 55af3e04f8aa2062715370c8feec10308938715e.

Master is currently broken as shown on
https://travis-ci.org/apache/incubator-airflow/jobs/175858834

======================================================================
FAIL: test_custom_backend (tests.EmailTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/home/travis/build/apache/incubator-airflow/.tox/py27-cdh-airflow_backend_sqlite/lib/python2.7/site-packages/mock/mock.py", line 1305, in patched
    return func(*args, **keywargs)
  File "/home/travis/build/apache/incubator-airflow/tests/core.py", line 1927, in test_custom_backend
    send_email_test.assert_called_with('to', 'subject', 'content', files=None, dryrun=False, cc=None, bcc=None)
  File "/home/travis/build/apache/incubator-airflow/.tox/py27-cdh-airflow_backend_sqlite/lib/python2.7/site-packages/mock/mock.py", line 937, in assert_called_with
    six.raise_from(AssertionError(_error_message(cause)), cause)
  File "/home/travis/build/apache/incubator-airflow/.tox/py27-cdh-airflow_backend_sqlite/lib/python2.7/site-packages/six.py", line 718, in raise_from
    raise value
AssertionError: Expected call: mock('to', 'subject', 'content', bcc=None, cc=None, dryrun=False, files=None)
Actual call: mock('to', 'subject', 'content', bcc=None, cc=None, dryrun=False, files=None, mime_subtype=u'mixed')

> HTML Content does not show up when sending email with attachment
> ----------------------------------------------------------------
>
>                 Key: AIRFLOW-626
>                 URL: https://issues.apache.org/jira/browse/AIRFLOW-626
>             Project: Apache Airflow
>          Issue Type: Bug
>            Reporter: Ilya Rakoshes
>            Assignee: Ilya Rakoshes
>            Priority: Minor
>             Fix For: Airflow 1.8
>
> When the send_email function in airflow.utils is used to send an email with
> both an email body in {{html_content}} and attachments in {{files}}, the
> email comes through without the body. This impacts EmailOperator.

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
incubator-airflow git commit: Revert "[AIRFLOW-626] HTML Content does not show up when sending email with attachment"
Repository: incubator-airflow
Updated Branches:
  refs/heads/master 55af3e04f -> c6dd4d457

Revert "[AIRFLOW-626] HTML Content does not show up when sending email with attachment"

This reverts commit 55af3e04f8aa2062715370c8feec10308938715e.

Master is currently broken as shown on
https://travis-ci.org/apache/incubator-airflow/jobs/175858834

======================================================================
FAIL: test_custom_backend (tests.EmailTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/home/travis/build/apache/incubator-airflow/.tox/py27-cdh-airflow_backend_sqlite/lib/python2.7/site-packages/mock/mock.py", line 1305, in patched
    return func(*args, **keywargs)
  File "/home/travis/build/apache/incubator-airflow/tests/core.py", line 1927, in test_custom_backend
    send_email_test.assert_called_with('to', 'subject', 'content', files=None, dryrun=False, cc=None, bcc=None)
  File "/home/travis/build/apache/incubator-airflow/.tox/py27-cdh-airflow_backend_sqlite/lib/python2.7/site-packages/mock/mock.py", line 937, in assert_called_with
    six.raise_from(AssertionError(_error_message(cause)), cause)
  File "/home/travis/build/apache/incubator-airflow/.tox/py27-cdh-airflow_backend_sqlite/lib/python2.7/site-packages/six.py", line 718, in raise_from
    raise value
AssertionError: Expected call: mock('to', 'subject', 'content', bcc=None, cc=None, dryrun=False, files=None)
Actual call: mock('to', 'subject', 'content', bcc=None, cc=None, dryrun=False, files=None, mime_subtype=u'mixed')

Project: http://git-wip-us.apache.org/repos/asf/incubator-airflow/repo
Commit: http://git-wip-us.apache.org/repos/asf/incubator-airflow/commit/c6dd4d45
Tree: http://git-wip-us.apache.org/repos/asf/incubator-airflow/tree/c6dd4d45
Diff: http://git-wip-us.apache.org/repos/asf/incubator-airflow/diff/c6dd4d45

Branch: refs/heads/master
Commit: c6dd4d4578918364da1cd3d5655a8d41a65871b5
Parents: 55af3e0
Author: Siddharth Anand
Authored: Mon Nov 14 19:57:08 2016 -0800
Committer: Siddharth Anand
Committed: Mon Nov 14 19:57:08 2016 -0800

----------------------------------------------------------------------
 airflow/operators/email_operator.py | 4 +---
 airflow/utils/email.py              | 8
 2 files changed, 5 insertions(+), 7 deletions(-)
----------------------------------------------------------------------

http://git-wip-us.apache.org/repos/asf/incubator-airflow/blob/c6dd4d45/airflow/operators/email_operator.py
----------------------------------------------------------------------
diff --git a/airflow/operators/email_operator.py b/airflow/operators/email_operator.py
index 5167a7a..76cc56b 100644
--- a/airflow/operators/email_operator.py
+++ b/airflow/operators/email_operator.py
@@ -49,7 +49,6 @@ class EmailOperator(BaseOperator):
             files=None,
             cc=None,
             bcc=None,
-            mime_subtype='mixed',
             *args, **kwargs):
         super(EmailOperator, self).__init__(*args, **kwargs)
         self.to = to
@@ -58,7 +57,6 @@ class EmailOperator(BaseOperator):
         self.files = files or []
         self.cc = cc
         self.bcc = bcc
-        self.mime_subtype = mime_subtype
 
     def execute(self, context):
-        send_email(self.to, self.subject, self.html_content, files=self.files, cc=self.cc, bcc=self.bcc, mime_subtype=self.mime_subtype)
+        send_email(self.to, self.subject, self.html_content, files=self.files, cc=self.cc, bcc=self.bcc)

http://git-wip-us.apache.org/repos/asf/incubator-airflow/blob/c6dd4d45/airflow/utils/email.py
----------------------------------------------------------------------
diff --git a/airflow/utils/email.py b/airflow/utils/email.py
index c4906fd..6fe8662 100644
--- a/airflow/utils/email.py
+++ b/airflow/utils/email.py
@@ -33,17 +33,17 @@ from email.utils import formatdate
 
 from airflow import configuration
 
 
-def send_email(to, subject, html_content, files=None, dryrun=False, cc=None, bcc=None, mime_subtype='mixed'):
+def send_email(to, subject, html_content, files=None, dryrun=False, cc=None, bcc=None):
     """
     Send email using backend specified in EMAIL_BACKEND.
     """
     path, attr = configuration.get('email', 'EMAIL_BACKEND').rsplit('.', 1)
     module = importlib.import_module(path)
     backend = getattr(module, attr)
-    return backend(to, subject, html_content, files=files, dryrun=dryrun, cc=cc, bcc=bcc, mime_subtype=mime_subtype)
+    return backend(to, subject, html_content, files=files, dryrun=dryrun, cc=cc, bcc=bcc)
 
 
-def send_email_smtp(to, subject, html_content, files=None, dryrun=False, cc=None, bcc=None, mime_subtype='mixed'):
+def send_email_smtp(to, subject, html_content, files=None, dryrun=False, cc=None, bcc=None):
     """
     Send an email with html content
 
@@ -53,7 +53,7 @@ def send_email_smtp(to, subject, html_content, files=None,
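The revert was triggered not by the feature itself but by test strictness: forwarding the new `mime_subtype` keyword, even at its default value, changed the exact call that `mock.assert_called_with` compares against. A minimal reproduction of that failure mode (function and variable names here are illustrative):

```python
from unittest import mock

backend = mock.Mock()


def send_email(to, subject, content, mime_subtype='mixed'):
    # Forwarding the new keyword explicitly, even at its default,
    # changes the exact call recorded on the mock.
    backend(to, subject, content, mime_subtype=mime_subtype)


send_email('to', 'subject', 'content')

# The pre-existing test asserted the old call shape; that assertion now
# fails because the actual call carries mime_subtype='mixed'.
try:
    backend.assert_called_with('to', 'subject', 'content')
    strict_check_failed = False
except AssertionError:
    strict_check_failed = True
```

`assert_called_with` compares positional and keyword arguments exactly, so a test pinned to the old signature must be updated in the same change that adds a forwarded keyword.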
[jira] [Updated] (AIRFLOW-627) Tasks getting Queued when Pool is full sometimes never run
[ https://issues.apache.org/jira/browse/AIRFLOW-627?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Ben Tallman updated AIRFLOW-627:
--------------------------------
    Description:

Log data when this happens:

[2016-11-14 10:54:04,174] {models.py:168} INFO - Filling up the DagBag from /opt/efs/airflow/dags/crawl_traffic_prod.py
[2016-11-14 10:54:07,562] {base_hook.py:67} INFO - Using connection to: db.xyz.com
[2016-11-14 10:54:07,667] {base_hook.py:67} INFO - Using connection to: db.xyz.com
[2016-11-14 10:54:27,214] {models.py:168} INFO - Filling up the DagBag from /opt/efs/airflow/dags/crawl_traffic_prod.py
[2016-11-14 10:54:30,150] {base_hook.py:67} INFO - Using connection to: db.xyz.com
[2016-11-14 10:54:30,311] {base_hook.py:67} INFO - Using connection to: db.xyz.com
[2016-11-14 10:54:32,438] {models.py:1072} INFO - Dependencies all met for
[2016-11-14 10:54:32,700] {models.py:1069} WARNING - Dependencies not met for , dependency 'DAG's Pool Has Space' FAILED: Task's pool 'prod_pod_crawler' is full.

depends_on_past False
deps set([, , ])

    was:

[2016-11-14 10:54:04,174] {models.py:168} INFO - Filling up the DagBag from /opt/efs/airflow/dags/crawl_traffic_prod.py
[2016-11-14 10:54:07,562] {base_hook.py:67} INFO - Using connection to: db.xyz.com
[2016-11-14 10:54:07,667] {base_hook.py:67} INFO - Using connection to: db.xyz.com
[2016-11-14 10:54:27,214] {models.py:168} INFO - Filling up the DagBag from /opt/efs/airflow/dags/crawl_traffic_prod.py
[2016-11-14 10:54:30,150] {base_hook.py:67} INFO - Using connection to: db.xyz.com
[2016-11-14 10:54:30,311] {base_hook.py:67} INFO - Using connection to: db.xyz.com
[2016-11-14 10:54:32,438] {models.py:1072} INFO - Dependencies all met for
[2016-11-14 10:54:32,700] {models.py:1069} WARNING - Dependencies not met for , dependency 'DAG's Pool Has Space' FAILED: Task's pool 'prod_pod_crawler' is full.

> Tasks getting Queued when Pool is full sometimes never run
> ----------------------------------------------------------
>
>                 Key: AIRFLOW-627
>                 URL: https://issues.apache.org/jira/browse/AIRFLOW-627
>             Project: Apache Airflow
>          Issue Type: Bug
>          Components: scheduler
>    Affects Versions: Airflow 1.8
>         Environment: Celery Executor, Master Branch, Postgres
>            Reporter: Ben Tallman
>
> Log data when this happens:
> [2016-11-14 10:54:04,174] {models.py:168} INFO - Filling up the DagBag from
> /opt/efs/airflow/dags/crawl_traffic_prod.py
> [2016-11-14 10:54:07,562] {base_hook.py:67} INFO - Using connection to:
> db.xyz.com
> [2016-11-14 10:54:07,667] {base_hook.py:67} INFO - Using connection to:
> db.xyz.com
> [2016-11-14 10:54:27,214] {models.py:168} INFO - Filling up the DagBag from
> /opt/efs/airflow/dags/crawl_traffic_prod.py
> [2016-11-14 10:54:30,150] {base_hook.py:67} INFO - Using connection to:
> db.xyz.com
> [2016-11-14 10:54:30,311] {base_hook.py:67} INFO - Using connection to:
> db.xyz.com
> [2016-11-14 10:54:32,438] {models.py:1072} INFO - Dependencies all met for
> 01:00:00 [queued]>
> [2016-11-14 10:54:32,700] {models.py:1069} WARNING - Dependencies not met for
> 01:00:00 [queued]>, dependency 'DAG's Pool Has Space' FAILED: Task's pool
> 'prod_pod_crawler' is full.
> depends_on_past False
> deps set([ , , ])

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
[jira] [Updated] (AIRFLOW-627) Tasks getting Queued when Pool is full sometimes never run
[ https://issues.apache.org/jira/browse/AIRFLOW-627?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Ben Tallman updated AIRFLOW-627:
--------------------------------
    Description:

Log data when this happens:

[2016-11-14 10:54:04,174] {models.py:168} INFO - Filling up the DagBag from /opt/efs/airflow/dags/crawl_traffic_prod.py
[2016-11-14 10:54:07,562] {base_hook.py:67} INFO - Using connection to: db.xyz.com
[2016-11-14 10:54:07,667] {base_hook.py:67} INFO - Using connection to: db.xyz.com
[2016-11-14 10:54:27,214] {models.py:168} INFO - Filling up the DagBag from /opt/efs/airflow/dags/crawl_traffic_prod.py
[2016-11-14 10:54:30,150] {base_hook.py:67} INFO - Using connection to: db.xyz.com
[2016-11-14 10:54:30,311] {base_hook.py:67} INFO - Using connection to: db.xyz.com
[2016-11-14 10:54:32,438] {models.py:1072} INFO - Dependencies all met for
[2016-11-14 10:54:32,700] {models.py:1069} WARNING - Dependencies not met for , dependency 'DAG's Pool Has Space' FAILED: Task's pool 'prod_pod_crawler' is full.

Task Details info:
depends_on_past False
deps set([, , ])

    was:

Log data when this happens:

[2016-11-14 10:54:04,174] {models.py:168} INFO - Filling up the DagBag from /opt/efs/airflow/dags/crawl_traffic_prod.py
[2016-11-14 10:54:07,562] {base_hook.py:67} INFO - Using connection to: db.xyz.com
[2016-11-14 10:54:07,667] {base_hook.py:67} INFO - Using connection to: db.xyz.com
[2016-11-14 10:54:27,214] {models.py:168} INFO - Filling up the DagBag from /opt/efs/airflow/dags/crawl_traffic_prod.py
[2016-11-14 10:54:30,150] {base_hook.py:67} INFO - Using connection to: db.xyz.com
[2016-11-14 10:54:30,311] {base_hook.py:67} INFO - Using connection to: db.xyz.com
[2016-11-14 10:54:32,438] {models.py:1072} INFO - Dependencies all met for
[2016-11-14 10:54:32,700] {models.py:1069} WARNING - Dependencies not met for , dependency 'DAG's Pool Has Space' FAILED: Task's pool 'prod_pod_crawler' is full.

depends_on_past False
deps set([ , , ])

> Tasks getting Queued when Pool is full sometimes never run
> ----------------------------------------------------------
>
>                 Key: AIRFLOW-627
>                 URL: https://issues.apache.org/jira/browse/AIRFLOW-627
>             Project: Apache Airflow
>          Issue Type: Bug
>          Components: scheduler
>    Affects Versions: Airflow 1.8
>         Environment: Celery Executor, Master Branch, Postgres
>            Reporter: Ben Tallman
>
> Log data when this happens:
> [2016-11-14 10:54:04,174] {models.py:168} INFO - Filling up the DagBag from
> /opt/efs/airflow/dags/crawl_traffic_prod.py
> [2016-11-14 10:54:07,562] {base_hook.py:67} INFO - Using connection to:
> db.xyz.com
> [2016-11-14 10:54:07,667] {base_hook.py:67} INFO - Using connection to:
> db.xyz.com
> [2016-11-14 10:54:27,214] {models.py:168} INFO - Filling up the DagBag from
> /opt/efs/airflow/dags/crawl_traffic_prod.py
> [2016-11-14 10:54:30,150] {base_hook.py:67} INFO - Using connection to:
> db.xyz.com
> [2016-11-14 10:54:30,311] {base_hook.py:67} INFO - Using connection to:
> db.xyz.com
> [2016-11-14 10:54:32,438] {models.py:1072} INFO - Dependencies all met for
> 01:00:00 [queued]>
> [2016-11-14 10:54:32,700] {models.py:1069} WARNING - Dependencies not met for
> 01:00:00 [queued]>, dependency 'DAG's Pool Has Space' FAILED: Task's pool
> 'prod_pod_crawler' is full.
> Task Details info:
> depends_on_past False
> deps set([ , , ])

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
[jira] [Updated] (AIRFLOW-628) SalesforceHook
[ https://issues.apache.org/jira/browse/AIRFLOW-628?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Giovanni Briggs updated AIRFLOW-628:
------------------------------------
    External issue URL: https://github.com/apache/incubator-airflow/pull/1881

> SalesforceHook
> --------------
>
>                 Key: AIRFLOW-628
>                 URL: https://issues.apache.org/jira/browse/AIRFLOW-628
>             Project: Apache Airflow
>          Issue Type: New Feature
>            Reporter: Giovanni Briggs
>            Assignee: Giovanni Briggs
>            Priority: Minor
>
> Created a SalesforceHook that allows you to connect to Salesforce, retrieve
> data from it, and write that data to a file. The output file can then be used
> with other hooks and operators to move the data to another data warehouse.
> This is the original GitHub repository I was using to house the hook and an
> operator: https://github.com/Jalepeno112/airflow-salesforce.

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
[jira] [Created] (AIRFLOW-628) SalesforceHook
Giovanni Briggs created AIRFLOW-628:
---------------------------------------

             Summary: SalesforceHook
                 Key: AIRFLOW-628
                 URL: https://issues.apache.org/jira/browse/AIRFLOW-628
             Project: Apache Airflow
          Issue Type: New Feature
            Reporter: Giovanni Briggs
            Assignee: Giovanni Briggs
            Priority: Minor

Created a SalesforceHook that allows you to connect to Salesforce, retrieve
data from it, and write that data to a file. The output file can then be used
with other hooks and operators to move the data to another data warehouse.

This is the original GitHub repository I was using to house the hook and an
operator: https://github.com/Jalepeno112/airflow-salesforce.

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
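The "retrieve data and write it to a file" half of such a hook reduces to flattening query records into tabular output. A minimal sketch of that step, using stubbed records shaped like Salesforce REST query results (the function name and record shape are illustrative; the real hook's output format may differ):

```python
import csv
import io


def records_to_csv(records, fileobj):
    """Write Salesforce-style query records (a list of dicts) as CSV.

    Sketch only: Salesforce REST responses attach an 'attributes'
    metadata dict to every record, which is dropped before writing.
    """
    rows = [{k: v for k, v in rec.items() if k != 'attributes'}
            for rec in records]
    writer = csv.DictWriter(fileobj, fieldnames=sorted(rows[0]))
    writer.writeheader()
    writer.writerows(rows)


# Usage with a stubbed query result instead of a live Salesforce call.
buf = io.StringIO()
records_to_csv(
    [{'attributes': {'type': 'Account'}, 'Id': '001', 'Name': 'Acme'}],
    buf,
)
```

The resulting file can then be handed to a loading operator for whatever warehouse sits downstream.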
[jira] [Closed] (AIRFLOW-626) HTML Content does not show up when sending email with attachment
[ https://issues.apache.org/jira/browse/AIRFLOW-626?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Chris Riccomini closed AIRFLOW-626.
-----------------------------------
       Resolution: Fixed
    Fix Version/s: Airflow 1.8

> HTML Content does not show up when sending email with attachment
> ----------------------------------------------------------------
>
>                 Key: AIRFLOW-626
>                 URL: https://issues.apache.org/jira/browse/AIRFLOW-626
>             Project: Apache Airflow
>          Issue Type: Bug
>            Reporter: Ilya Rakoshes
>            Assignee: Ilya Rakoshes
>            Priority: Minor
>             Fix For: Airflow 1.8
>
> When the send_email function in airflow.utils is used to send an email with
> both an email body in {{html_content}} and attachments in {{files}}, the
> email comes through without the body. This impacts EmailOperator.

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
[jira] [Commented] (AIRFLOW-626) HTML Content does not show up when sending email with attachment
[ https://issues.apache.org/jira/browse/AIRFLOW-626?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15665132#comment-15665132 ]

ASF subversion and git services commented on AIRFLOW-626:
---------------------------------------------------------

Commit 55af3e04f8aa2062715370c8feec10308938715e in incubator-airflow's branch
refs/heads/master from [~illop]
[ https://git-wip-us.apache.org/repos/asf?p=incubator-airflow.git;h=55af3e0 ]

[AIRFLOW-626][AIRFLOW-1] HTML Content does not show up when sending email with attachment

Closes #1880 from illop/send_email_mimetype

> HTML Content does not show up when sending email with attachment
> ----------------------------------------------------------------
>
>                 Key: AIRFLOW-626
>                 URL: https://issues.apache.org/jira/browse/AIRFLOW-626
>             Project: Apache Airflow
>          Issue Type: Bug
>            Reporter: Ilya Rakoshes
>            Assignee: Ilya Rakoshes
>            Priority: Minor
>             Fix For: Airflow 1.8
>
> When the send_email function in airflow.utils is used to send an email with
> both an email body in {{html_content}} and attachments in {{files}}, the
> email comes through without the body. This impacts EmailOperator.

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
[jira] [Commented] (AIRFLOW-1) Migrate GitHub code to Apache git
[ https://issues.apache.org/jira/browse/AIRFLOW-1?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15665133#comment-15665133 ]

ASF subversion and git services commented on AIRFLOW-1:
-------------------------------------------------------

Commit 55af3e04f8aa2062715370c8feec10308938715e in incubator-airflow's branch
refs/heads/master from [~illop]
[ https://git-wip-us.apache.org/repos/asf?p=incubator-airflow.git;h=55af3e0 ]

[AIRFLOW-626][AIRFLOW-1] HTML Content does not show up when sending email with attachment

Closes #1880 from illop/send_email_mimetype

> Migrate GitHub code to Apache git
> ---------------------------------
>
>                 Key: AIRFLOW-1
>                 URL: https://issues.apache.org/jira/browse/AIRFLOW-1
>             Project: Apache Airflow
>          Issue Type: Improvement
>          Components: project-management
>            Reporter: Maxime Beauchemin
>            Assignee: Maxime Beauchemin

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
incubator-airflow git commit: [AIRFLOW-626][AIRFLOW-1] HTML Content does not show up when sending email with attachment
Repository: incubator-airflow
Updated Branches:
  refs/heads/master 4609f689c -> 55af3e04f


[AIRFLOW-626][AIRFLOW-1] HTML Content does not show up when sending email with attachment

Closes #1880 from illop/send_email_mimetype


Project: http://git-wip-us.apache.org/repos/asf/incubator-airflow/repo
Commit: http://git-wip-us.apache.org/repos/asf/incubator-airflow/commit/55af3e04
Tree: http://git-wip-us.apache.org/repos/asf/incubator-airflow/tree/55af3e04
Diff: http://git-wip-us.apache.org/repos/asf/incubator-airflow/diff/55af3e04

Branch: refs/heads/master
Commit: 55af3e04f8aa2062715370c8feec10308938715e
Parents: 4609f68
Author: Ilya Rakoshes
Authored: Mon Nov 14 13:44:46 2016 -0800
Committer: Chris Riccomini
Committed: Mon Nov 14 13:44:46 2016 -0800

--
 airflow/operators/email_operator.py | 4 +++-
 airflow/utils/email.py              | 8 ++++----
 2 files changed, 7 insertions(+), 5 deletions(-)
--

http://git-wip-us.apache.org/repos/asf/incubator-airflow/blob/55af3e04/airflow/operators/email_operator.py
--
diff --git a/airflow/operators/email_operator.py b/airflow/operators/email_operator.py
index 76cc56b..5167a7a 100644
--- a/airflow/operators/email_operator.py
+++ b/airflow/operators/email_operator.py
@@ -49,6 +49,7 @@ class EmailOperator(BaseOperator):
             files=None,
             cc=None,
             bcc=None,
+            mime_subtype='mixed',
             *args, **kwargs):
         super(EmailOperator, self).__init__(*args, **kwargs)
         self.to = to
@@ -57,6 +58,7 @@
         self.files = files or []
         self.cc = cc
         self.bcc = bcc
+        self.mime_subtype = mime_subtype

     def execute(self, context):
-        send_email(self.to, self.subject, self.html_content, files=self.files, cc=self.cc, bcc=self.bcc)
+        send_email(self.to, self.subject, self.html_content, files=self.files, cc=self.cc, bcc=self.bcc, mime_subtype=self.mime_subtype)

http://git-wip-us.apache.org/repos/asf/incubator-airflow/blob/55af3e04/airflow/utils/email.py
--
diff --git a/airflow/utils/email.py b/airflow/utils/email.py
index 6fe8662..c4906fd 100644
--- a/airflow/utils/email.py
+++ b/airflow/utils/email.py
@@ -33,17 +33,17 @@ from email.utils import formatdate

 from airflow import configuration

-def send_email(to, subject, html_content, files=None, dryrun=False, cc=None, bcc=None):
+def send_email(to, subject, html_content, files=None, dryrun=False, cc=None, bcc=None, mime_subtype='mixed'):
     """
     Send email using backend specified in EMAIL_BACKEND.
     """
     path, attr = configuration.get('email', 'EMAIL_BACKEND').rsplit('.', 1)
     module = importlib.import_module(path)
     backend = getattr(module, attr)
-    return backend(to, subject, html_content, files=files, dryrun=dryrun, cc=cc, bcc=bcc)
+    return backend(to, subject, html_content, files=files, dryrun=dryrun, cc=cc, bcc=bcc, mime_subtype=mime_subtype)

-def send_email_smtp(to, subject, html_content, files=None, dryrun=False, cc=None, bcc=None):
+def send_email_smtp(to, subject, html_content, files=None, dryrun=False, cc=None, bcc=None, mime_subtype='mixed'):
     """
     Send an email with html content
@@ -53,7 +53,7 @@
     to = get_email_address_list(to)

-    msg = MIMEMultipart('alternative')
+    msg = MIMEMultipart(mime_subtype)
     msg['Subject'] = subject
     msg['From'] = SMTP_MAIL_FROM
     msg['To'] = ", ".join(to)
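The patch above works because of how mail clients treat MIME container subtypes: a multipart/alternative container tells the client to display only one of its parts, so an HTML body competes with the attachment, while multipart/mixed displays the body and lists the attachments. A minimal stdlib sketch of the structure the patched send_email_smtp builds (the addresses and file content here are made up for illustration, not taken from Airflow):

```python
from email.mime.application import MIMEApplication
from email.mime.multipart import MIMEMultipart
from email.mime.text import MIMEText

# 'mixed' container: clients render the HTML part AND show the attachment.
# With 'alternative' (the pre-patch hardcoded value), clients pick a single
# part to display, which is why the HTML body disappeared.
msg = MIMEMultipart('mixed')
msg['Subject'] = 'daily report'
msg['From'] = 'airflow@example.com'   # illustrative address
msg['To'] = 'user@example.com'        # illustrative address

# HTML body as one part of the 'mixed' container
msg.attach(MIMEText('<h1>Report body</h1>', 'html'))

# A file attachment as a sibling part
attachment = MIMEApplication(b'col1,col2\n1,2\n', Name='report.csv')
attachment['Content-Disposition'] = 'attachment; filename="report.csv"'
msg.attach(attachment)

print(msg.get_content_type())  # multipart/mixed
print([part.get_content_type() for part in msg.get_payload()])
```

Both the text/html part and the application part survive side by side in the container, which is the behavior the EmailOperator's new mime_subtype='mixed' default selects.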
[jira] [Commented] (AIRFLOW-121) Documenting dag doc_md feature
[ https://issues.apache.org/jira/browse/AIRFLOW-121?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15663784#comment-15663784 ] Flex Gao commented on AIRFLOW-121: -- Added it: [https://issues.apache.org/jira/browse/AIRFLOW-625] > Documenting dag doc_md feature > -- > > Key: AIRFLOW-121 > URL: https://issues.apache.org/jira/browse/AIRFLOW-121 > Project: Apache Airflow > Issue Type: Improvement > Components: docs >Reporter: dud >Assignee: dud >Priority: Trivial > > Dear Airflow Maintainers, > I added a note about DAG documentation. > I'd be glad if my PR would be merged : > https://github.com/apache/incubator-airflow/pull/1493 > Regards > dud
[jira] [Updated] (AIRFLOW-625) doc_md in concepts document seems wrong
[ https://issues.apache.org/jira/browse/AIRFLOW-625?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Flex Gao updated AIRFLOW-625:
-
Description:
In [https://github.com/apache/incubator-airflow/blob/master/docs/concepts.rst] it says *doc_md* is an attribute of a task, but using it gives an error on the webserver *Graph* tab.

Example DAG file:
{code}
#!/usr/bin/env python
# -*- coding: utf-8 -*-
from airflow import DAG
from airflow.operators.bash_operator import BashOperator
from datetime import datetime, timedelta

SCRIPTS_PATH = '/var/lib/airflow/scripts'

default_args = {
    'depends_on_past': False,
    'start_date': datetime(2016, 11, 9, 0, 55),
}

dag = DAG('elasticsearch', default_args=default_args,
          schedule_interval=timedelta(days=1))

t1 = BashOperator(
    task_id='daily_index_delete',
    bash_command='%s/es_clean.py' % SCRIPTS_PATH,
    dag=dag)

t1.doc_md = """\
Task Documentation
Clean ES Indices Every Day
"""
{code}

This gives the traceback *AttributeError: 'NoneType' object has no attribute 'strip'*. But if I change *t1.doc_md* to *dag.doc_md*, everything is OK.
was:
In [https://github.com/apache/incubator-airflow/blob/master/docs/concepts.rst] it says *doc_md* is an attribute of a task, but using it gives an error on the webserver *Graph* tab. The same example DAG file followed (in a {code:python} block rather than {code}), producing the traceback *AttributeError: 'NoneType' object has no attribute 'strip'*; changing *t1.doc_md* to *dag.doc_md* resolved it.


> doc_md in concepts document seems wrong
> ---
>
> Key: AIRFLOW-625
> URL: https://issues.apache.org/jira/browse/AIRFLOW-625
> Project: Apache Airflow
> Issue Type: Bug
> Affects Versions: Airflow 1.7.1
> Environment: CentOS 7.2
> Reporter: Flex Gao
>
> In [https://github.com/apache/incubator-airflow/blob/master/docs/concepts.rst] it said *doc_md* is an attribute of a task, but this will give an error on webserver *Graph* tab
> Example DAG file:
> {code}
> #!/usr/bin/env python
> # -*- coding: utf-8 -*-
> from airflow import DAG
> from airflow.operators.bash_operator import BashOperator
> from datetime import datetime, timedelta
>
> SCRIPTS_PATH = '/var/lib/airflow/scripts'
>
> default_args = {
>     'depends_on_past': False,
>     'start_date': datetime(2016, 11, 9, 0, 55),
> }
>
> dag = DAG('elasticsearch', default_args=default_args,
>           schedule_interval=timedelta(days=1))
>
> t1 = BashOperator(
>     task_id='daily_index_delete',
>     bash_command='%s/es_clean.py' % SCRIPTS_PATH,
>     dag=dag)
>
> t1.doc_md = """\
> Task Documentation
> Clean ES Indices Every Day
> """
> {code}
> This will give a traceback:
> *AttributeError: 'NoneType' object has no attribute 'strip'*
> But if I change *t1.doc_md* to *dag.doc_md*, everything is OK.
[jira] [Created] (AIRFLOW-625) doc_md in concepts document seems wrong
Flex Gao created AIRFLOW-625:


Summary: doc_md in concepts document seems wrong
Key: AIRFLOW-625
URL: https://issues.apache.org/jira/browse/AIRFLOW-625
Project: Apache Airflow
Issue Type: Bug
Affects Versions: Airflow 1.7.1
Environment: CentOS 7.2
Reporter: Flex Gao


In [https://github.com/apache/incubator-airflow/blob/master/docs/concepts.rst] it says *doc_md* is an attribute of a task, but using it gives an error on the webserver *Graph* tab.

Example DAG file:
{code:python}
#!/usr/bin/env python
# -*- coding: utf-8 -*-
from airflow import DAG
from airflow.operators.bash_operator import BashOperator
from datetime import datetime, timedelta

SCRIPTS_PATH = '/var/lib/airflow/scripts'

default_args = {
    'depends_on_past': False,
    'start_date': datetime(2016, 11, 9, 0, 55),
}

dag = DAG('elasticsearch', default_args=default_args,
          schedule_interval=timedelta(days=1))

t1 = BashOperator(
    task_id='daily_index_delete',
    bash_command='%s/es_clean.py' % SCRIPTS_PATH,
    dag=dag)

t1.doc_md = """\
Task Documentation
Clean ES Indices Every Day
"""
{code}

This gives the traceback *AttributeError: 'NoneType' object has no attribute 'strip'*. But if I change *t1.doc_md* to *dag.doc_md*, everything is OK.
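The AttributeError quoted above suggests the Graph view ends up calling .strip() on a doc attribute that is None. A minimal stdlib sketch of that failure mode and the obvious guard (render_doc is a hypothetical helper for illustration, not Airflow's actual rendering code):

```python
def render_doc(doc_md):
    """Hypothetical stand-in for the webserver's markdown-rendering step."""
    if doc_md is None:
        # Without this guard, .strip() on None raises exactly the reported
        # error: AttributeError: 'NoneType' object has no attribute 'strip'
        return ''
    return doc_md.strip()

assert render_doc(None) == ''
assert render_doc('  Task Documentation  ') == 'Task Documentation'
```

This is consistent with the reporter's observation that moving the docstring to dag.doc_md avoids the crash: the page then renders an attribute that is actually set, instead of one left at None.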
[jira] [Commented] (AIRFLOW-121) Documenting dag doc_md feature
[ https://issues.apache.org/jira/browse/AIRFLOW-121?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15663602#comment-15663602 ] dud commented on AIRFLOW-121: - Hello. I think you should file a new bug issue with an example DAG that triggers this bug. Please also specify which version you are running. > Documenting dag doc_md feature > -- > > Key: AIRFLOW-121 > URL: https://issues.apache.org/jira/browse/AIRFLOW-121 > Project: Apache Airflow > Issue Type: Improvement > Components: docs >Reporter: dud >Assignee: dud >Priority: Trivial > > Dear Airflow Maintainers, > I added a note about DAG documentation. > I'd be glad if my PR would be merged : > https://github.com/apache/incubator-airflow/pull/1493 > Regards > dud
[jira] [Commented] (AIRFLOW-121) Documenting dag doc_md feature
[ https://issues.apache.org/jira/browse/AIRFLOW-121?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15663500#comment-15663500 ] Flex Gao commented on AIRFLOW-121: -- I followed this merged doc and found that the "Graph" tab in the webserver crashes with: "markdown AttributeError: 'NoneType' object has no attribute 'strip'". But if I change "t.doc_md" to "dag.doc_md", everything is OK. > Documenting dag doc_md feature > -- > > Key: AIRFLOW-121 > URL: https://issues.apache.org/jira/browse/AIRFLOW-121 > Project: Apache Airflow > Issue Type: Improvement > Components: docs >Reporter: dud >Assignee: dud >Priority: Trivial > > Dear Airflow Maintainers, > I added a note about DAG documentation. > I'd be glad if my PR would be merged : > https://github.com/apache/incubator-airflow/pull/1493 > Regards > dud
[jira] [Commented] (AIRFLOW-573) Jinja2 template failing env argument in BashOperator if string ends in .sh or .bash
[ https://issues.apache.org/jira/browse/AIRFLOW-573?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15663411#comment-15663411 ] Flex Gao commented on AIRFLOW-573: -- I also hit this problem. The documentation says the bash operator can use a script file and that "the file location is relative to the directory containing the pipeline file", but I found no code implementing this.

> Jinja2 template failing env argument in BashOperator if string ends in .sh or .bash
> ---
>
> Key: AIRFLOW-573
> URL: https://issues.apache.org/jira/browse/AIRFLOW-573
> Project: Apache Airflow
> Issue Type: Bug
> Components: operators
> Affects Versions: Airflow 2.0
> Reporter: Michael Shire
>
> A dot/period in a templated string appears to cause Jinja template errors in Airflow.
> Example: I constructed a dictionary to pass as an environment into BashOperator, i.e.
> envvars = {'use_this_script':'/Users/mshire/x.sh'}
> task4 = BashOperator(
>     bash_command="env",
>     task_id="test_env_variables",
>     env=envvars,
>     dag=dag
> )
> If I remove the dot/period (i.e. ".") in "x.sh" then it works. Otherwise I get the error below. I have tried it directly in Python with jinja2 and it works, but not in Airflow.
>
> airflow test scratch test_env_variables 2016-10-01
> [2016-10-16 00:39:32,981] {__init__.py:36} INFO - Using executor SequentialExecutor
> [2016-10-16 00:39:33,374] {models.py:154} INFO - Filling up the DagBag from /Users/mshire/airflow/dags
> envvars= {'use_this_script': '/Users/mshire/x.sh'}
> [2016-10-16 00:39:33,479] {models.py:1196} INFO - Starting attempt 1 of 1
> [2016-10-16 00:39:33,480] {models.py:1219} INFO - Executing on 2016-10-01 00:00:00
> [2016-10-16 00:39:33,489] {models.py:1286} ERROR - /Users/mshire/x.sh
> Traceback (most recent call last):
>   File "/usr/local/lib/python2.7/site-packages/airflow/models.py", line 1233, in run
>     self.render_templates()
>   File "/usr/local/lib/python2.7/site-packages/airflow/models.py", line 1409, in render_templates
>     rendered_content = rt(attr, content, jinja_context)
>   File "/usr/local/lib/python2.7/site-packages/airflow/models.py", line 2019, in render_template
>     return self.render_template_from_field(attr, content, context, jinja_env)
>   File "/usr/local/lib/python2.7/site-packages/airflow/models.py", line 1995, in render_template_from_field
>     for k, v in list(content.items())}
>   File "/usr/local/lib/python2.7/site-packages/airflow/models.py", line 1995, in
>     for k, v in list(content.items())}
>   File "/usr/local/lib/python2.7/site-packages/airflow/models.py", line 2017, in render_template
>     return jinja_env.get_template(content).render(**context)
>   File "/usr/local/lib/python2.7/site-packages/jinja2/environment.py", line 812, in get_template
>     return self._load_template(name, self.make_globals(globals))
>   File "/usr/local/lib/python2.7/site-packages/jinja2/environment.py", line 774, in _load_template
>     cache_key = self.loader.get_source(self, name)[1]
>   File "/usr/local/lib/python2.7/site-packages/jinja2/loaders.py", line 187, in get_source
>     raise TemplateNotFound(template)
> TemplateNotFound: /Users/mshire/x.sh
> [2016-10-16 00:39:33,490] {models.py:1306} INFO - Marking task as
FAILED.
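The TemplateNotFound above is consistent with how templated operator fields dispatch on file extensions: BashOperator declares .sh and .bash as template extensions, so any templated string ending in one of them is treated as a template *file* to load from the DAG folder rather than rendered inline, and that dispatch applies recursively to dict fields like env. A simplified stdlib sketch of that behavior (render and fake_loader are illustrative stand-ins, not Airflow's actual code); a commonly cited workaround of appending a trailing space makes the string no longer match the extension check:

```python
TEMPLATE_EXT = ('.sh', '.bash')  # BashOperator's template extensions


def render(value, loader):
    """Simplified model of templated-field rendering (illustration only)."""
    if isinstance(value, dict):
        # Dict fields such as env= are rendered value by value.
        return {k: render(v, loader) for k, v in value.items()}
    if isinstance(value, str) and value.endswith(TEMPLATE_EXT):
        # Treated as a template file name: Jinja tries to load it from
        # the DAG folder and raises TemplateNotFound if it is absent.
        return loader(value)
    return value  # rendered inline as a template string in the real code


def fake_loader(name):
    # Stand-in for jinja2's TemplateNotFound on a failed file lookup.
    raise FileNotFoundError(name)


# Fails: the value ends in .sh, so a file lookup is attempted.
try:
    render({'use_this_script': '/Users/mshire/x.sh'}, fake_loader)
except FileNotFoundError as exc:
    print('template lookup attempted for:', exc)

# Workaround: a trailing space defeats the extension check,
# so the value passes through untouched.
print(render({'use_this_script': '/Users/mshire/x.sh '}, fake_loader))
```

This also explains why removing the dot fixes it: without ".sh" the string no longer matches a template extension and is rendered inline as the reporter expected.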