[jira] [Work started] (AIRFLOW-2109) scheduler stopped picking up jobs suddenly

2018-02-14 Thread rahul (JIRA)

 [ 
https://issues.apache.org/jira/browse/AIRFLOW-2109?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Work on AIRFLOW-2109 started by rahul.
--
> scheduler stopped picking up jobs suddenly
> --
>
> Key: AIRFLOW-2109
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2109
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: scheduler
>Affects Versions: Airflow 1.8
> Environment: AWS EC2 server
>Reporter: rahul
>Assignee: rahul
>Priority: Major
>
> Hi,
>  
> The scheduler suddenly stopped picking up jobs. When I ran airflow resetdb, it started 
> picking them up properly again.
>  
> Could you please let me know why airflow resetdb resolved this issue?



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Created] (AIRFLOW-2109) scheduler stopped picking up jobs suddenly

2018-02-14 Thread rahul (JIRA)
rahul created AIRFLOW-2109:
--

 Summary: scheduler stopped picking up jobs suddenly
 Key: AIRFLOW-2109
 URL: https://issues.apache.org/jira/browse/AIRFLOW-2109
 Project: Apache Airflow
  Issue Type: Bug
  Components: scheduler
Affects Versions: Airflow 1.8
 Environment: AWS EC2 server
Reporter: rahul
Assignee: Alex Lumpov


Hi,

 

The scheduler suddenly stopped picking up jobs. When I ran airflow resetdb, it started 
picking them up properly again.

 

Could you please let me know why airflow resetdb resolved this issue?



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (AIRFLOW-2108) BashOperator discards process indentation

2018-02-14 Thread Chris Bandy (JIRA)

[ 
https://issues.apache.org/jira/browse/AIRFLOW-2108?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16364816#comment-16364816
 ] 

Chris Bandy commented on AIRFLOW-2108:
--

If I understand correctly, this could be fixed by replacing {{line.strip()}} 
with {{line.rstrip()}}.

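A minimal, hedged sketch of the kind of streaming loop this touches (not the literal BashOperator source), showing why {{rstrip()}} preserves the indentation that {{strip()}} discards:

{code:python}
import subprocess

# Stream a child process's output line by line, roughly the way BashOperator
# logs it. rstrip() drops only trailing whitespace/newlines, so leading
# indentation such as pgloader's summary columns survives; strip() removes it.
proc = subprocess.Popen(
    ["bash", "-c", "echo 'report summary'; echo '    fetch meta data   524'"],
    stdout=subprocess.PIPE,
)
for raw in iter(proc.stdout.readline, b""):
    print("INFO - " + raw.decode("utf-8").rstrip())
proc.wait()
{code}
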
> BashOperator discards process indentation
> -
>
> Key: AIRFLOW-2108
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2108
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: operators
>Affects Versions: 1.9.0
>Reporter: Chris Bandy
>Priority: Minor
>
> When the BashOperator logs every line of output from the executing process, 
> it strips leading whitespace, which makes it difficult to interpret output 
> that was formatted with indentation.
> For example, I'm executing [PGLoader|http://pgloader.readthedocs.io/] through 
> this operator. When it finishes, it prints a summary which appears in the 
> logs like so:
> {noformat}
> [2018-02-14 07:31:44,524] {bash_operator.py:101} INFO - 
> 2018-02-14T07:31:44.115000Z LOG report summary reset
> [2018-02-14 07:31:44,564] {bash_operator.py:101} INFO - table name errors 
>   read   imported  bytes  total time   read  write
> [2018-02-14 07:31:44,564] {bash_operator.py:101} INFO - 
> --  -  -  -  -  
> --  -  -
> [2018-02-14 07:31:44,567] {bash_operator.py:101} INFO - fetch meta data   
>0524524 1.438s
> [2018-02-14 07:31:44,567] {bash_operator.py:101} INFO - Create Schemas
>   0  0  0 0.161s
> [2018-02-14 07:31:44,567] {bash_operator.py:101} INFO - Create SQL Types  
> 0 19 1920.413s
> [2018-02-14 07:31:44,567] {bash_operator.py:101} INFO - Create tables 
>  0310310   3m2.316s
> [2018-02-14 07:31:44,567] {bash_operator.py:101} INFO - Set Table OIDs
>   0155155 0.458s
> [2018-02-14 07:31:44,567] {bash_operator.py:101} INFO - 
> --  -  -  -  -  
> --  -  -
> [2018-02-14 07:31:44,567] {bash_operator.py:101} INFO - 
> --  -  -  -  -  
> --  -  -
> [2018-02-14 07:31:44,567] {bash_operator.py:101} INFO - Index Build 
> Completion  0353353  1m37.323s
> [2018-02-14 07:31:44,567] {bash_operator.py:101} INFO - Create Indexes
>   0353353  3m25.929s
> [2018-02-14 07:31:44,567] {bash_operator.py:101} INFO - Reset Sequences   
>0  0  0 2.677s
> [2018-02-14 07:31:44,567] {bash_operator.py:101} INFO - Primary Keys  
> 0147147  1m21.091s
> [2018-02-14 07:31:44,567] {bash_operator.py:101} INFO - Create Foreign Keys   
>0 16 16 8.283s
> [2018-02-14 07:31:44,567] {bash_operator.py:101} INFO - Create Triggers   
>0  0  0 0.339s
> [2018-02-14 07:31:44,567] {bash_operator.py:101} INFO - Install Comments  
> 0  0  0 0.000s
> [2018-02-14 07:31:44,567] {bash_operator.py:101} INFO - 
> --  -  -  -  -  
> --  -  -
> [2018-02-14 07:31:44,567] {bash_operator.py:101} INFO - Total import time 
>  ∞  0  0  6m35.642s
> {noformat}
> Ideally, the leading whitespace would be retained, so the logs look like this:
> {noformat}
> [2018-02-14 07:31:44,524] {bash_operator.py:101} INFO - 
> 2018-02-14T07:31:44.115000Z LOG report summary reset
> [2018-02-14 07:31:44,564] {bash_operator.py:101} INFO - table 
> name errors   read   imported  bytes  total time   read   
>write
> [2018-02-14 07:31:44,564] {bash_operator.py:101} INFO - 
> --  -  -  -  -  
> --  -  -
> [2018-02-14 07:31:44,567] {bash_operator.py:101} INFO -fetch meta 
> data  0524524 1.438s
> [2018-02-14 07:31:44,567] {bash_operator.py:101} INFO - Create 
> Schemas  0  0  0 0.161s
> [2018-02-14 07:31:44,567] {bash_operator.py:101} INFO -   Create SQL 
> Types  0 19 1920.413s
> [2018-02-14 07:31:44,567] {bash_operator.py:101} INFO -  Create 
> tables  0310310   3m2.316

[jira] [Updated] (AIRFLOW-2108) BashOperator discards process indentation

2018-02-14 Thread Chris Bandy (JIRA)

 [ 
https://issues.apache.org/jira/browse/AIRFLOW-2108?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Chris Bandy updated AIRFLOW-2108:
-
Description: 
When the BashOperator logs every line of output from the executing process, it 
strips leading whitespace, which makes it difficult to interpret output that was 
formatted with indentation.

For example, I'm executing [PGLoader|http://pgloader.readthedocs.io/] through 
this operator. When it finishes, it prints a summary which appears in the logs 
like so:
{noformat}
[2018-02-14 07:31:44,524] {bash_operator.py:101} INFO - 
2018-02-14T07:31:44.115000Z LOG report summary reset
[2018-02-14 07:31:44,564] {bash_operator.py:101} INFO - table name errors   
read   imported  bytes  total time   read  write
[2018-02-14 07:31:44,564] {bash_operator.py:101} INFO - --  
-  -  -  -  --  -  -
[2018-02-14 07:31:44,567] {bash_operator.py:101} INFO - fetch meta data 
 0524524 1.438s
[2018-02-14 07:31:44,567] {bash_operator.py:101} INFO - Create Schemas  
0  0  0 0.161s
[2018-02-14 07:31:44,567] {bash_operator.py:101} INFO - Create SQL Types
  0 19 1920.413s
[2018-02-14 07:31:44,567] {bash_operator.py:101} INFO - Create tables  
0310310   3m2.316s
[2018-02-14 07:31:44,567] {bash_operator.py:101} INFO - Set Table OIDs  
0155155 0.458s
[2018-02-14 07:31:44,567] {bash_operator.py:101} INFO - --  
-  -  -  -  --  -  -
[2018-02-14 07:31:44,567] {bash_operator.py:101} INFO - --  
-  -  -  -  --  -  -
[2018-02-14 07:31:44,567] {bash_operator.py:101} INFO - Index Build Completion  
0353353  1m37.323s
[2018-02-14 07:31:44,567] {bash_operator.py:101} INFO - Create Indexes  
0353353  3m25.929s
[2018-02-14 07:31:44,567] {bash_operator.py:101} INFO - Reset Sequences 
 0  0  0 2.677s
[2018-02-14 07:31:44,567] {bash_operator.py:101} INFO - Primary Keys  0 
   147147  1m21.091s
[2018-02-14 07:31:44,567] {bash_operator.py:101} INFO - Create Foreign Keys 
 0 16 16 8.283s
[2018-02-14 07:31:44,567] {bash_operator.py:101} INFO - Create Triggers 
 0  0  0 0.339s
[2018-02-14 07:31:44,567] {bash_operator.py:101} INFO - Install Comments
  0  0  0 0.000s
[2018-02-14 07:31:44,567] {bash_operator.py:101} INFO - --  
-  -  -  -  --  -  -
[2018-02-14 07:31:44,567] {bash_operator.py:101} INFO - Total import time   
   ∞  0  0  6m35.642s
{noformat}
Ideally, the leading whitespace would be retained, so the logs look like this:
{noformat}
[2018-02-14 07:31:44,524] {bash_operator.py:101} INFO - 
2018-02-14T07:31:44.115000Z LOG report summary reset
[2018-02-14 07:31:44,564] {bash_operator.py:101} INFO - table name  
   errors   read   imported  bytes  total time   read  write
[2018-02-14 07:31:44,564] {bash_operator.py:101} INFO - --  
-  -  -  -  --  -  -
[2018-02-14 07:31:44,567] {bash_operator.py:101} INFO -fetch meta data  
0524524 1.438s
[2018-02-14 07:31:44,567] {bash_operator.py:101} INFO - Create Schemas  
0  0  0 0.161s
[2018-02-14 07:31:44,567] {bash_operator.py:101} INFO -   Create SQL Types  
0 19 1920.413s
[2018-02-14 07:31:44,567] {bash_operator.py:101} INFO -  Create tables  
0310310   3m2.316s
[2018-02-14 07:31:44,567] {bash_operator.py:101} INFO - Set Table OIDs  
0155155 0.458s
[2018-02-14 07:31:44,567] {bash_operator.py:101} INFO - --  
-  -  -  -  --  -  -
[2018-02-14 07:31:44,567] {bash_operator.py:101} INFO - --  
-  -  -  -  --  -  -
[2018-02-14 07:31:44,567] {bash_operator.py:101} INFO - Index Build Completion  
0353353  1m37.323s
[2018-02-14 07:31:44,567] {bash_operator.py:101} INFO - Create 

[jira] [Created] (AIRFLOW-2108) BashOperator discards process indentation

2018-02-14 Thread Chris Bandy (JIRA)
Chris Bandy created AIRFLOW-2108:


 Summary: BashOperator discards process indentation
 Key: AIRFLOW-2108
 URL: https://issues.apache.org/jira/browse/AIRFLOW-2108
 Project: Apache Airflow
  Issue Type: Bug
  Components: operators
Affects Versions: 1.9.0
Reporter: Chris Bandy


When the BashOperator logs every line of output from the executing process, it 
strips leading whitespace, which makes it difficult to interpret output that was 
formatted with indentation.

For example, I'm executing [PGLoader|http://pgloader.readthedocs.io/] through 
this operator. When it finishes, it prints a summary which appears in the logs 
like so:

{noformat}
[2018-02-14 07:31:44,524] {bash_operator.py:101} INFO - 
2018-02-14T07:31:44.115000Z LOG report summary reset
[2018-02-14 07:31:44,564] {bash_operator.py:101} INFO - table name errors   
read   imported  bytes  total time   read  write
[2018-02-14 07:31:44,564] {bash_operator.py:101} INFO - --  
-  -  -  -  --  -  -
[2018-02-14 07:31:44,567] {bash_operator.py:101} INFO - fetch meta data 
 0524524 1.438s
[2018-02-14 07:31:44,567] {bash_operator.py:101} INFO - Create Schemas  
0  0  0 0.161s
[2018-02-14 07:31:44,567] {bash_operator.py:101} INFO - Create SQL Types
  0 19 1920.413s
[2018-02-14 07:31:44,567] {bash_operator.py:101} INFO - Create tables  
0310310   3m2.316s
[2018-02-14 07:31:44,567] {bash_operator.py:101} INFO - Set Table OIDs  
0155155 0.458s
[2018-02-14 07:31:44,567] {bash_operator.py:101} INFO - --  
-  -  -  -  --  -  -
[2018-02-14 07:31:44,567] {bash_operator.py:101} INFO - --  
-  -  -  -  --  -  -
[2018-02-14 07:31:44,567] {bash_operator.py:101} INFO - Index Build Completion  
0353353  1m37.323s
[2018-02-14 07:31:44,567] {bash_operator.py:101} INFO - Create Indexes  
0353353  3m25.929s
[2018-02-14 07:31:44,567] {bash_operator.py:101} INFO - Reset Sequences 
 0  0  0 2.677s
[2018-02-14 07:31:44,567] {bash_operator.py:101} INFO - Primary Keys  0 
   147147  1m21.091s
[2018-02-14 07:31:44,567] {bash_operator.py:101} INFO - Create Foreign Keys 
 0 16 16 8.283s
[2018-02-14 07:31:44,567] {bash_operator.py:101} INFO - Create Triggers 
 0  0  0 0.339s
[2018-02-14 07:31:44,567] {bash_operator.py:101} INFO - Install Comments
  0  0  0 0.000s
[2018-02-14 07:31:44,567] {bash_operator.py:101} INFO - --  
-  -  -  -  --  -  -
[2018-02-14 07:31:44,567] {bash_operator.py:101} INFO - Total import time   
   ∞  0  0  6m35.642s
{noformat}

Ideally, the leading whitespace would be retained, so the output looks like 
this:

{noformat}
[2018-02-14 07:31:44,524] {bash_operator.py:101} INFO - 
2018-02-14T07:31:44.115000Z LOG report summary reset
[2018-02-14 07:31:44,564] {bash_operator.py:101} INFO - table name  
   errors   read   imported  bytes  total time   read  write
[2018-02-14 07:31:44,564] {bash_operator.py:101} INFO - --  
-  -  -  -  --  -  -
[2018-02-14 07:31:44,567] {bash_operator.py:101} INFO -fetch meta data  
0524524 1.438s
[2018-02-14 07:31:44,567] {bash_operator.py:101} INFO - Create Schemas  
0  0  0 0.161s
[2018-02-14 07:31:44,567] {bash_operator.py:101} INFO -   Create SQL Types  
0 19 1920.413s
[2018-02-14 07:31:44,567] {bash_operator.py:101} INFO -  Create tables  
0310310   3m2.316s
[2018-02-14 07:31:44,567] {bash_operator.py:101} INFO - Set Table OIDs  
0155155 0.458s
[2018-02-14 07:31:44,567] {bash_operator.py:101} INFO - --  
-  -  -  -  --  -  -
[2018-02-14 07:31:44,567] {bash_operator.py:101} INFO - --  
-  -  -  -  --  -  -
[2018-02-14 07:31

[jira] [Commented] (AIRFLOW-1667) Remote log handlers don't upload logs on task finish

2018-02-14 Thread Josh Bacon (JIRA)

[ 
https://issues.apache.org/jira/browse/AIRFLOW-1667?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16364775#comment-16364775
 ] 

Josh Bacon commented on AIRFLOW-1667:
-

Thanks for the explanation of the internals, [~ashb]. I made some incorrect 
assumptions; my issue was unrelated and is now resolved, and logs are shipping 
correctly.

> Remote log handlers don't upload logs on task finish
> 
>
> Key: AIRFLOW-1667
> URL: https://issues.apache.org/jira/browse/AIRFLOW-1667
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: logging
>Affects Versions: 1.9.0, 1.10.0
>Reporter: Arthur Vigil
>Priority: Major
>
> AIRFLOW-1385 revised logging for configurability, but the provided remote log 
> handlers (S3TaskHandler and GCSTaskHandler) only upload on close (flush is 
> left at the default implementation provided by `logging.FileHandler`). A 
> handler will be closed on process exit by `logging.shutdown()`, but depending 
> on the Executor used, worker processes may not regularly shut down and can 
> very likely persist between tasks. This means that during normal execution, 
> log files are never uploaded.
> Need to find a way to flush remote log handlers in a timely manner, but 
> without hitting the target resources unnecessarily.
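
A hedged sketch of the flush-time upload the description asks for; {{upload_fn}} is a hypothetical callback standing in for the S3/GCS copy, and this is not the actual S3TaskHandler/GCSTaskHandler code:

{code:python}
import logging

class EagerRemoteFileHandler(logging.FileHandler):
    """Sketch: a FileHandler that pushes its local log file to remote storage
    on every flush(), not only when close() runs at interpreter shutdown."""

    def __init__(self, filename, upload_fn):
        super().__init__(filename)
        self.upload_fn = upload_fn  # hypothetical callable doing the remote copy

    def flush(self):
        super().flush()
        # Uploading on every flush can hit the remote store very often; a real
        # implementation would throttle or batch these calls.
        self.upload_fn(self.baseFilename)
{code}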



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Resolved] (AIRFLOW-908) Airflow run should print worker name at top of log

2018-02-14 Thread Dan Davydov (JIRA)

 [ 
https://issues.apache.org/jira/browse/AIRFLOW-908?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dan Davydov resolved AIRFLOW-908.
-
Resolution: Fixed

> Airflow run should print worker name at top of log
> --
>
> Key: AIRFLOW-908
> URL: https://issues.apache.org/jira/browse/AIRFLOW-908
> Project: Apache Airflow
>  Issue Type: New Feature
>Reporter: Dan Davydov
>Assignee: Dan Davydov
>Priority: Major
>  Labels: beginner, starter
>
> Airflow run should log the worker hostname at the top of the log.
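
A minimal sketch of the kind of line this asks for (hypothetical wording, not the actual implementation):

{code:python}
import logging
import socket

logging.basicConfig(level=logging.INFO)
log = logging.getLogger(__name__)

# Emit the executing worker's hostname as the very first line of the task log.
log.info("Running on host %s", socket.getfqdn())
{code}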



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Created] (AIRFLOW-2107) add time_partitioning to run_query

2018-02-14 Thread Ben Marengo (JIRA)
Ben Marengo created AIRFLOW-2107:


 Summary: add time_partitioning to run_query
 Key: AIRFLOW-2107
 URL: https://issues.apache.org/jira/browse/AIRFLOW-2107
 Project: Apache Airflow
  Issue Type: New Feature
  Components: contrib, gcp, hooks, operators
Reporter: Ben Marengo
Assignee: Ben Marengo


Google has added a time-partitioning field to the query part of their jobs API:

[https://cloud.google.com/bigquery/docs/reference/rest/v2/jobs#configuration.query.timePartitioning]

We should mirror that in Airflow (hook and operator), as sketched below.
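
For reference, a hedged sketch of a jobs.insert query configuration carrying the new field; the {{timePartitioning}} keys come from the BigQuery REST reference linked above, while the query text and table names are made up:

{code:python}
# Hypothetical configuration fragment for a BigQuery query job; the
# "timePartitioning" block is the part this issue asks the hook/operator
# to expose.
configuration = {
    "query": {
        "query": "SELECT event_ts, user_id FROM `my-project.my_dataset.events`",
        "useLegacySql": False,
        "destinationTable": {
            "projectId": "my-project",
            "datasetId": "my_dataset",
            "tableId": "events_by_day",
        },
        "timePartitioning": {
            "type": "DAY",                 # partition granularity
            "field": "event_ts",           # optional: partition on a column
            "expirationMs": "2592000000",  # optional: expire partitions after 30 days
        },
    }
}
{code}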



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (AIRFLOW-1027) Task details cannot be shown when PythonOperator calls a partial function

2018-02-14 Thread Matthew Revell (JIRA)

[ 
https://issues.apache.org/jira/browse/AIRFLOW-1027?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16363992#comment-16363992
 ] 

Matthew Revell commented on AIRFLOW-1027:
-

This ticket describes only one cause of this error; it can also occur for 
other reasons. A more complete solution is proposed in 
[AIRFLOW-2099|https://issues.apache.org/jira/browse/AIRFLOW-2099].

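For the {{functools.partial}} case described below, a hedged sketch of one possible guard around {{inspect.getsource}} (not the actual patch from either ticket):

{code:python}
import functools
import inspect

def render_callable_source(x):
    # Unwrap functools.partial before asking inspect for source, and fall back
    # to repr() for anything inspect.getsource cannot handle.
    if isinstance(x, functools.partial):
        x = x.func
    try:
        return inspect.getsource(x)
    except TypeError:
        return repr(x)
{code}
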
> Task details cannot be shown when PythonOperator calls a partial function
> -
>
> Key: AIRFLOW-1027
> URL: https://issues.apache.org/jira/browse/AIRFLOW-1027
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: ui
>Affects Versions: Airflow 1.7.1
>Reporter: Adrian Partl
>Assignee: Adrian Partl
>Priority: Minor
>
> Showing task details of a PythonOperator that uses a `functools.partial` as a 
> callable results in the following error:
> {noformat}
>   File "/usr/lib/python2.7/site-packages/airflow/www/views.py", line 909, in 
> task
> special_attrs_rendered[attr_name] = attr_renderer[attr_name](source)
>   File "/usr/lib/python2.7/site-packages/airflow/www/views.py", line 224, in 
> <lambda>
> inspect.getsource(x), lexers.PythonLexer),
>   File "/usr/lib64/python2.7/inspect.py", line 701, in getsource
> lines, lnum = getsourcelines(object)
>   File "/usr/lib64/python2.7/inspect.py", line 690, in getsourcelines
> lines, lnum = findsource(object)
>   File "/usr/lib64/python2.7/inspect.py", line 526, in findsource
> file = getfile(object)
>   File "/usr/lib64/python2.7/inspect.py", line 420, in getfile
> 'function, traceback, frame, or code object'.format(object))
> TypeError: <functools.partial object at 0x...> is not a module, class, 
> method, function, traceback, frame, or code object
> {noformat}
> A sample dag definition for this is:
> {noformat}
> def func_with_two_args(arg_1, arg_2):
> pass
> partial_func = functools.partial(func_with_two_args, arg_1=1)
> dag = DAG(dag_id='test_issue_1333_dag', default_args=default_args)
> dag_task1 = PythonOperator(
> task_id='test_dagrun_functool_partial',
> dag=dag,
> python_callable=partial_func)
> {noformat}
> I will provide a PR with a fix for this.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (AIRFLOW-1405) Airflow v 1.8.1 unable to properly initialize with MySQL

2018-02-14 Thread Yuliya Volkova (JIRA)

[ 
https://issues.apache.org/jira/browse/AIRFLOW-1405?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16363809#comment-16363809
 ] 

Yuliya Volkova commented on AIRFLOW-1405:
-

Maybe it's possible to close this task? It's not a bug; the documentation issue 
was resolved here: https://issues.apache.org/jira/browse/AIRFLOW-1405 

> Airflow v 1.8.1 unable to properly initialize with MySQL
> 
>
> Key: AIRFLOW-1405
> URL: https://issues.apache.org/jira/browse/AIRFLOW-1405
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: db
>Affects Versions: 1.8.1
> Environment: CentOS7
>Reporter: Aakash Bhardwaj
>Priority: Major
> Fix For: 1.8.1
>
> Attachments: error_log.txt
>
>
> While working on a CentOS7 system, I was trying to configure Airflow version 
> 1.8.1 to run with MySQL as the backend.
> I have installed Airflow in a virtual environment, and MySQL has a 
> database named airflow (the default).
> But on running the command -
> {code:none}
> airflow initdb
> {code}
> the following error is reported
> {noformat}
> [2017-07-12 13:22:36,558] {__init__.py:57} INFO - Using executor LocalExecutor
> DB: mysql://airflow:***@localhost/airflow
> [2017-07-12 13:22:37,218] {db.py:287} INFO - Creating tables
> INFO  [alembic.runtime.migration] Context impl MySQLImpl.
> INFO  [alembic.runtime.migration] Will assume non-transactional DDL.
> INFO  [alembic.runtime.migration] Running upgrade f2ca10b85618 -> 
> 4addfa1236f1, Add fractional seconds to mysql tables
> Traceback (most recent call last):
>   File "/opt/airflow_virtual_environment/airflow_venv/bin/airflow", line 28, 
> in <module>
> args.func(args)
>   File 
> "/opt/airflow_virtual_environment/airflow_venv/lib/python2.7/site-packages/airflow/bin/cli.py",
>  line 951, in initdb
> db_utils.initdb()
>   File 
> "/opt/airflow_virtual_environment/airflow_venv/lib/python2.7/site-packages/airflow/utils/db.py",
>  line 106, in initdb
> upgradedb()
>   File 
> "/opt/airflow_virtual_environment/airflow_venv/lib/python2.7/site-packages/airflow/utils/db.py",
>  line 294, in upgradedb
> command.upgrade(config, 'heads')
>   File 
> "/opt/airflow_virtual_environment/airflow_venv/lib/python2.7/site-packages/alembic/command.py",
>  line 174, in upgrade
> script.run_env()
>   File 
> "/opt/airflow_virtual_environment/airflow_venv/lib/python2.7/site-packages/alembic/script/base.py",
>  line 416, in run_env
> util.load_python_file(self.dir, 'env.py')
>   File 
> "/opt/airflow_virtual_environment/airflow_venv/lib/python2.7/site-packages/alembic/util/pyfiles.py",
>  line 93, in load_python_file
> module = load_module_py(module_id, path)
>   File 
> "/opt/airflow_virtual_environment/airflow_venv/lib/python2.7/site-packages/alembic/util/compat.py",
>  line 79, in load_module_py
> mod = imp.load_source(module_id, path, fp)
>   File 
> "/opt/airflow_virtual_environment/airflow_venv/lib/python2.7/site-packages/airflow/migrations/env.py",
>  line 86, in <module>
> run_migrations_online()
>   File 
> "/opt/airflow_virtual_environment/airflow_venv/lib/python2.7/site-packages/airflow/migrations/env.py",
>  line 81, in run_migrations_online
> context.run_migrations()
>   File "", line 8, in run_migrations
>   File 
> "/opt/airflow_virtual_environment/airflow_venv/lib/python2.7/site-packages/alembic/runtime/environment.py",
>  line 807, in run_migrations
> self.get_context().run_migrations(**kw)
>   File 
> "/opt/airflow_virtual_environment/airflow_venv/lib/python2.7/site-packages/alembic/runtime/migration.py",
>  line 321, in run_migrations
> step.migration_fn(**kw)
>   File 
> "/opt/airflow_virtual_environment/airflow_venv/lib/python2.7/site-packages/airflow/migrations/versions/4addfa1236f1_add_fractional_seconds_to_mysql_tables.py",
>  line 36, in upgrade
> op.alter_column(table_name='dag', column_name='last_scheduler_run', 
> type_=mysql.DATETIME(fsp=6))
>   File "", line 8, in alter_column
>   File "", line 3, in alter_column
>   File 
> "/opt/airflow_virtual_environment/airflow_venv/lib/python2.7/site-packages/alembic/operations/ops.py",
>  line 1420, in alter_column
> return operations.invoke(alt)
>   File 
> "/opt/airflow_virtual_environment/airflow_venv/lib/python2.7/site-packages/alembic/operations/base.py",
>  line 318, in invoke
> return fn(self, operation)
>   File 
> "/opt/airflow_virtual_environment/airflow_venv/lib/python2.7/site-packages/alembic/operations/toimpl.py",
>  line 53, in alter_column
> **operation.kw
>   File 
> "/opt/airflow_virtual_environment/airflow_venv/lib/python2.7/site-packages/alembic/ddl/mysql.py",
>  line 67, in alter_column
> else existing_autoincrement
>   File 
> "/opt/airflow_virtual_environment/airflow_venv/lib/python2.7/site-packages/alembic/ddl/impl.py",
>  l

[jira] [Created] (AIRFLOW-2106) Cannot pass sandbox argument to sales_force hook preventing sandbox connection

2018-02-14 Thread Rob Liddle (JIRA)
Rob Liddle created AIRFLOW-2106:
---

 Summary: Cannot pass sandbox argument to sales_force hook 
preventing sandbox connection
 Key: AIRFLOW-2106
 URL: https://issues.apache.org/jira/browse/AIRFLOW-2106
 Project: Apache Airflow
  Issue Type: Improvement
  Components: contrib, hooks
Affects Versions: 1.8.0, 1.9.0
 Environment: any
Reporter: Rob Liddle


In the salesforce_hook we pass a number of variables on to the 
simple-salesforce library.

We do not pass, or allow passing of, the sandbox value, which means it defaults 
to False inside the simple-salesforce library.

This value is required in order to connect to a Salesforce sandbox testing 
environment. Without it, the connection fails with an invalid-credentials error 
(even when the credentials are correct).

Link to hook section:

[https://github.com/apache/incubator-airflow/blob/a592e732356427b4fd1ed27c890cf82e9cf495f0/airflow/contrib/hooks/salesforce_hook.py#L78]

Link to simple-salesforce args:

[https://github.com/simple-salesforce/simple-salesforce/blob/364c889abe9e22b7145524f2dc422e41c5f0639b/simple_salesforce/login.py#L24]
 
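
For reference, a hedged sketch of the direct simple-salesforce call the hook cannot currently reproduce; the credentials are placeholders and {{sandbox}} is the keyword shown at the login.py link above:

{code:python}
from simple_salesforce import Salesforce

# Hypothetical credentials; sandbox=True routes the login to the Salesforce
# test endpoint, which is the flag the hook has no way to pass through today.
sf = Salesforce(
    username="user@example.com.devsandbox",
    password="my-password",
    security_token="my-security-token",
    sandbox=True,
)
{code}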



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)