[jira] [Created] (AIRFLOW-7001) Mysql 5.7 handles timezone-aware timestamps differently than 5.6

2020-03-06 Thread Jarek Potiuk (Jira)
Jarek Potiuk created AIRFLOW-7001:
-

 Summary: Mysql 5.7 handles timezone-aware timestamps differently 
than 5.6
 Key: AIRFLOW-7001
 URL: https://issues.apache.org/jira/browse/AIRFLOW-7001
 Project: Apache Airflow
  Issue Type: Improvement
  Components: database, mysql
Affects Versions: 1.10.9, 2.0.0
Reporter: Jarek Potiuk


In Airflow, when UtcDateTime is used, a timezone is always required and is 
added if missing.

For example, when the utcnow() function is used to get the current time, we get 
a timestamp in the form '2020-03-07 07:32:34.121705+00:00'.

When such a value - with a timezone - is used in MySQL 5.6, the timezone part 
is IGNORED:

 
{code:java}
mysql> create table test (a timestamp(6));
Query OK, 0 rows affected (0.01 sec)
mysql> insert into test values ('2020-03-07 07:32:34.121705+00:00');
Query OK, 1 row affected, 1 warning (0.00 sec)
mysql> insert into test values ('2020-03-07 07:32:34.121705+01:00');
Query OK, 1 row affected, 1 warning (0.00 sec)
mysql> select * from test;
+----------------------------+
| a                          |
+----------------------------+
| 2020-03-07 07:32:34.121705 |
| 2020-03-07 07:32:34.121705 |
+----------------------------+
2 rows in set (0.00 sec)
mysql> SHOW VARIABLES LIKE "%version%";
+-------------------------+------------------------------+
| Variable_name           | Value                        |
+-------------------------+------------------------------+
| innodb_version          | 5.6.47                       |
| protocol_version        | 10                           |
| slave_type_conversions  |                              |
| version                 | 5.6.47                       |
| version_comment         | MySQL Community Server (GPL) |
| version_compile_machine | x86_64                       |
| version_compile_os      | Linux                        |
+-------------------------+------------------------------+
7 rows in set (0.00 sec)
{code}
 

The same insert in 5.7 results in error:

 
{code:java}
mysql> create table test(a TIMESTAMP(6));
Query OK, 0 rows affected (0.00 sec)
mysql> insert into test values ('2020-03-07 07:32:34.121705+01:00');
ERROR 1292 (22007): Incorrect datetime value: '2020-03-07 07:32:34.121705+01:00' for column 'a' at row 1
mysql> SHOW VARIABLES LIKE "%version%";
+-------------------------+------------------------------+
| Variable_name           | Value                        |
+-------------------------+------------------------------+
| innodb_version          | 5.7.29                       |
| protocol_version        | 10                           |
| slave_type_conversions  |                              |
| tls_version             | TLSv1,TLSv1.1,TLSv1.2        |
| version                 | 5.7.29                       |
| version_comment         | MySQL Community Server (GPL) |
| version_compile_machine | x86_64                       |
| version_compile_os      | Linux                        |
+-------------------------+------------------------------+
8 rows in set (0.00 sec)
{code}
It seems that neither MySQL 5.6 (which silently ignores the timezone) nor 5.7 
(which rejects the value with an error) handles timezone-aware timestamps 
properly.
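
One possible client-side mitigation (a minimal sketch, assuming SQLAlchemy sits 
between Airflow and MySQL, as it does for the metadata DB; the class name is 
hypothetical) is to normalize aware datetimes to naive UTC before they are 
bound, so both 5.6 and 5.7 receive a value they accept:

{code:python}
import datetime

from sqlalchemy import types


class NaiveUtcTimestamp(types.TypeDecorator):
    """Hypothetical decorator: store timezone-aware datetimes as naive UTC."""

    impl = types.TIMESTAMP(timezone=False)

    def process_bind_param(self, value, dialect):
        # Turn '2020-03-07 07:32:34.121705+01:00'-style values into naive
        # UTC so MySQL never sees (or mis-handles) the timezone suffix.
        if value is not None and value.tzinfo is not None:
            value = value.astimezone(datetime.timezone.utc).replace(tzinfo=None)
        return value
{code}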

 



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Assigned] (AIRFLOW-5822) CSRF_ENABLED was changed to WTF_CSRF_ENABLED

2020-03-06 Thread Rohit S S (Jira)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-5822?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Rohit S S reassigned AIRFLOW-5822:
--

Assignee: Rohit S S

> CSRF_ENABLED was changed to WTF_CSRF_ENABLED
> 
>
> Key: AIRFLOW-5822
> URL: https://issues.apache.org/jira/browse/AIRFLOW-5822
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: configuration
>Affects Versions: 1.10.5
>Reporter: Rustam
>Assignee: Rohit S S
>Priority: Critical
>
> The variable CSRF_ENABLED was renamed to WTF_CSRF_ENABLED, but it was not 
> changed in webserver_config_template.
> We also need to be able to change it via environment variables, with something 
> like conf.get('csrf', 'WTF_CSRF_ENABLED').
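>
> A minimal sketch of what the template entry could look like under the new name 
> (the 'csrf' section simply mirrors the conf.get() suggestion above and is an 
> assumption, not the template's actual layout):
> {code:python}
> from airflow import configuration as conf
>
> # Flask-WTF reads WTF_CSRF_ENABLED, not the old CSRF_ENABLED name.
> # Reading it from Airflow config lets it be overridden via env vars
> # (e.g. AIRFLOW__CSRF__WTF_CSRF_ENABLED, under the assumed section name).
> WTF_CSRF_ENABLED = conf.getboolean('csrf', 'WTF_CSRF_ENABLED')
> {code}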



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Assigned] (AIRFLOW-6960) Airflow Celery worker : command returned non-zero exit status 2

2020-03-06 Thread Rohit S S (Jira)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-6960?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Rohit S S reassigned AIRFLOW-6960:
--

Assignee: (was: Rohit S S)

> Airflow Celery worker : command returned non-zero exit status 2
> ---
>
> Key: AIRFLOW-6960
> URL: https://issues.apache.org/jira/browse/AIRFLOW-6960
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: celery
>Affects Versions: 2.0.0, 1.10.9
>Reporter: Uragalage Thilanka Mahesh Perera
>Priority: Blocker
>
> I am getting the below error and have been trying to fix it for hours without 
> any luck. The logs below are from the airflow celery worker.
> {code:java}
> airflow command error: argument subcommand: invalid choice: 'tasks' (choose 
> from 'backfill', 'list_dag_runs', 'list_tasks', 'clear', 'pause', 'unpause', 
> 'trigger_dag', 'delete_dag', 'show_dag', 'pool', 'variables', 'kerberos', 
> 'render', 'run', 'initdb', 'list_dags', 'dag_state', 'task_failed_deps', 
> 'task_state', 'serve_logs', 'test', 'webserver', 'resetdb', 'upgradedb', 
> 'checkdb', 'shell', 'scheduler', 'worker', 'flower', 'version', 
> 'connections', 'create_user', 'delete_user', 'list_users', 'sync_perm', 
> 'next_execution', 'rotate_fernet_key'), see help above.
> usage: airflow [-h] 
> {backfill,list_dag_runs,list_tasks,clear,pause,unpause,trigger_dag,delete_dag,show_dag,pool,variables,kerberos,render,run,initdb,list_dags,dag_state,task_failed_deps,task_state,serve_logs,test,webserver,resetdb,upgradedb,checkdb,shell,scheduler,worker,flower,version,connections,create_user,delete_user,list_users,sync_perm,next_execution,rotate_fernet_key}
>  ...
> positional arguments:
>   {backfill,list_dag_runs,list_tasks,clear,pause,unpause,trigger_dag,delete_dag,show_dag,pool,variables,kerberos,render,run,initdb,list_dags,dag_state,task_failed_deps,task_state,serve_logs,test,webserver,resetdb,upgradedb,checkdb,shell,scheduler,worker,flower,version,connections,create_user,delete_user,list_users,sync_perm,next_execution,rotate_fernet_key}
>   sub-command help
>     backfill           Run subsections of a DAG for a specified date range. If reset_dag_run option is used, backfill will first prompt users whether airflow should clear all the previous dag_run and task_instances within the backfill date range. If rerun_failed_tasks is used, backfill will auto re-run the previous failed task instances within the backfill date range.
>     list_dag_runs      List dag runs given a DAG id. If state option is given, it will only search for all the dagruns with the given state. If no_backfill option is given, it will filter out all backfill dagruns for given dag id.
>     list_tasks         List the tasks within a DAG
>     clear              Clear a set of task instance, as if they never ran
>     pause              Pause a DAG
>     unpause            Resume a paused DAG
>     trigger_dag        Trigger a DAG run
>     delete_dag         Delete all DB records related to the specified DAG
>     show_dag           Displays DAG's tasks with their dependencies
>     pool               CRUD operations on pools
>     variables          CRUD operations on variables
>     kerberos           Start a kerberos ticket renewer
>     render             Render a task instance's template(s)
>     run                Run a single task instance
>     initdb             Initialize the metadata database
>     list_dags          List all the DAGs
>     dag_state          Get the status of a dag run
>     task_failed_deps   Returns the unmet dependencies for a task instance from the perspective of the scheduler. In other words, why a task instance doesn't get scheduled and then queued by the scheduler, and then run by an executor).
>     task_state         Get the status of a task instance
>     serve_logs         Serve logs generate by worker
>     test               Test a task instance. This will run a task without checking for dependencies or recording its state in the database.
>     webserver          Start a Airflow webserver instance
>     resetdb            Burn down and rebuild the metadata database
>     upgradedb          Upgrade the metadata database to latest version
>     checkdb            Check if the database can be reached.
>     shell              Runs a shell to access the database
>     scheduler          Start a scheduler instance
>     worker             Start a Celery worker node
>     flower             Start a Celery Flower
>     version            Show the version
>     connections        List/Add/Delete connections
>     create_user        Create an account for the Web UI (FAB-based)
>     delete_user        Delete an account for the Web UI
>     list_users         List accounts for the Web UI
>     sync_perm          Update permissions for existing roles and DAGs.
>     next_execution     Get the next execution datetime of a DAG.
>     rotate_fernet_key  Rotate all encrypted connection credentials and variables; see https://airflow.readthedocs.io/en/stable/howto/secure-connections.html#rotating-encryption-keys.
> optional arguments:
>   -h, --help           show this help message and exit
> airflow command error: argument subcommand: invalid choice: 'tasks' (choose 
> from 'backfill', 'list_dag_runs', 'list_tasks', 'clear', 'pause', 'unpause', 

[GitHub] [airflow] Sharadh commented on issue #6317: [AIRFLOW-5644] Simplify TriggerDagRunOperator usage

2020-03-06 Thread GitBox
Sharadh commented on issue #6317: [AIRFLOW-5644] Simplify TriggerDagRunOperator 
usage
URL: https://github.com/apache/airflow/pull/6317#issuecomment-596043828
 
 
   @BasPH @Fokko not sure what y'alls settings on notifications for closed PRs 
are, so pinging you to make sure :)


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] Sharadh commented on issue #6317: [AIRFLOW-5644] Simplify TriggerDagRunOperator usage

2020-03-06 Thread GitBox
Sharadh commented on issue #6317: [AIRFLOW-5644] Simplify TriggerDagRunOperator 
usage
URL: https://github.com/apache/airflow/pull/6317#issuecomment-596043701
 
 
   Greetings! I'm curious to know whether you folks were aware that this change 
reduces functionality. Specifically, we have workflows where the 
`python_callable` was useful for two things:
   
   1. Dynamically generating the `conf` required for the `trigger_dag` call
   2. Returning a false-y value so the `trigger_dag` call does not take place
   
   I am not sure how this can be done after the change.
   
   In general, having the convenience of an arbitrary python callable to hook 
into and modify behavior based on incoming conf is very valuable. For a 
practical example, this task would trigger a dag only if a flag was set in the 
conf; this flag can vary between dag runs, but the same dag can model both 
behaviors:
   
   ```py
   step1 = SomeOperator()

   step2 = AnotherOperator()

   def check_and_trigger(context, dag_run_obj):
       payload = context["dag_run"].conf

       if not payload["should_trigger"]:
           return False

       dag_run_obj.payload = payload["downstream_payload"]
       return dag_run_obj

   maybe_trigger_bar_dag = TriggerDagRunOperator(
       task_id="maybe_trigger_bar_dag",  # task_id added; required by BaseOperator
       trigger_dag_id="bar",
       python_callable=check_and_trigger,
   )

   step1 >> step2 >> maybe_trigger_bar_dag
   ```
   In our use case, the DAG itself is static but takes in a few parameters via 
`conf`, which come in via the experimental API or from another scheduled DAG. It 
helps us reuse DAGs without getting into the gnarly world of sub-DAGs. One 
possible workaround after the change is sketched below.
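   
   A rough sketch of that workaround (assuming the new operator takes a static 
`conf` and no callable): move the condition into an upstream 
`ShortCircuitOperator` and let it skip the trigger task entirely:
   
   ```py
   from airflow.operators.dagrun_operator import TriggerDagRunOperator
   from airflow.operators.python_operator import ShortCircuitOperator

   def should_trigger(**context):
       # Skip everything downstream (including the trigger) when the flag is off.
       return context["dag_run"].conf.get("should_trigger", False)

   check = ShortCircuitOperator(
       task_id="check_flag",
       python_callable=should_trigger,
       provide_context=True,
   )

   trigger = TriggerDagRunOperator(
       task_id="trigger_bar",
       trigger_dag_id="bar",
       conf={"some_key": "some_value"},  # hypothetical static payload
   )

   check >> trigger
   ```
   This still loses the ability to compute the downstream payload dynamically 
from the incoming conf, which is the gap described above.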
   
   Please let me know if I can explain things further. I was unable to find the 
motivation for this change apart from the linked Jira ticket, so please do 
point me to more reading if it exists, so I can gain context.


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Assigned] (AIRFLOW-6604) Add support for capacityProviderStrategy argument in ECS Operator

2020-03-06 Thread Armando Martinez (Jira)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-6604?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Armando Martinez reassigned AIRFLOW-6604:
-

Assignee: Armando Martinez

> Add support for capacityProviderStrategy argument in ECS Operator
> -
>
> Key: AIRFLOW-6604
> URL: https://issues.apache.org/jira/browse/AIRFLOW-6604
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: operators
>Affects Versions: 1.10.7
>Reporter: Andrey Kateshov
>Assignee: Armando Martinez
>Priority: Major
>
> In December, AWS added a major new feature to ECS: capacity providers. This is 
> set using a new argument in the boto3 ECS run_task method called 
> capacityProviderStrategy 
> (https://boto3.amazonaws.com/v1/documentation/api/latest/reference/services/ecs.html#ECS.Client.run_task).
>  However, ECSOperator does not currently pass this argument through to the 
> boto3 client method invocation. This is a very important feature and will be 
> adopted quickly by all users of ECS.
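>
> A minimal sketch of the boto3 call the operator would need to support (cluster, 
> task definition, and provider values are illustrative placeholders):
> {code:python}
> import boto3
>
> client = boto3.client("ecs")
>
> # capacityProviderStrategy is passed instead of launchType; the two are
> # mutually exclusive in the run_task API.
> response = client.run_task(
>     cluster="my-cluster",
>     taskDefinition="my-task:1",
>     capacityProviderStrategy=[
>         {"capacityProvider": "FARGATE_SPOT", "weight": 1, "base": 0},
>     ],
> )
> {code}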



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Commented] (AIRFLOW-6497) Scheduler creates DagBag in the same process with outdated info

2020-03-06 Thread ASF GitHub Bot (Jira)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-6497?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17053728#comment-17053728
 ] 

ASF GitHub Bot commented on AIRFLOW-6497:
-

mik-laj commented on pull request #7597: [AIRFLOW-6497] Avoid loading DAGs in 
the main scheduler loop
URL: https://github.com/apache/airflow/pull/7597
 
 
   
 

This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


> Scheduler creates DagBag in the same process with outdated info
> ---
>
> Key: AIRFLOW-6497
> URL: https://issues.apache.org/jira/browse/AIRFLOW-6497
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: scheduler
>Affects Versions: 1.10.7
>Reporter: Qian Yu
>Priority: Major
>
> The following code in scheduler_job.py seems to be called in the same process 
> as the scheduler. It creates a DagBag. But since the scheduler is a 
> long-running process, it does not pick up the latest changes made to DAGs. For 
> example, changes to the retries count, on_failure_callback, newly added tasks, 
> etc. are not reflected.
>  
> {code:python}
> if ti.try_number == try_number and ti.state == State.QUEUED:
>     msg = ("Executor reports task instance {} finished ({}) "
>            "although the task says its {}. Was the task "
>            "killed externally?".format(ti, state, ti.state))
>     Stats.incr('scheduler.tasks.killed_externally')
>     self.log.error(msg)
>     try:
>         simple_dag = simple_dag_bag.get_dag(dag_id)
>         dagbag = models.DagBag(simple_dag.full_filepath)
>         dag = dagbag.get_dag(dag_id)
>         ti.task = dag.get_task(task_id)
>         ti.handle_failure(msg)
>     except Exception:
>         self.log.error("Cannot load the dag bag to handle failure for %s"
>                        ". Setting task to FAILED without callbacks or "
>                        "retries. Do you have enough resources?", ti)
>         ti.state = State.FAILED
>         session.merge(ti)
>         session.commit()
> {code}
> This causes errors such as AttributeError due to stale code being hit. E.g. 
> when someone added a .join attribute to CustomOperator without bouncing the 
> scheduler, this is what they would get after a CeleryWorker timeout error 
> causes this line to be hit:
> {code}
> [2020-01-05 22:25:45,951] {dagbag.py:207} ERROR - Failed to import: 
> /dags/dag1.py
> Traceback (most recent call last):
>   File "/lib/python3.6/site-packages/airflow/models/dagbag.py", line 204, in process_file
>     m = imp.load_source(mod_name, filepath)
>   File "/usr/lib/python3.6/imp.py", line 172, in load_source
>     module = _load(spec)
>   File "<frozen importlib._bootstrap>", line 684, in _load
>   File "<frozen importlib._bootstrap>", line 665, in _load_unlocked
>   File "<frozen importlib._bootstrap_external>", line 678, in exec_module
>   File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
>   File "/dags/dag1.py", line 280, in <module>
>     task1 >> task2.join
> AttributeError: 'CustomOperator' object has no attribute 'join'
> [2020-01-05 22:25:45,951] {scheduler_job.py:1314} ERROR - Cannot load the dag 
> bag to handle failure for <TaskInstance: ... [queued]>. Setting task to FAILED 
> without callbacks or retries. Do you have enough resou
> {code}



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Commented] (AIRFLOW-6960) Airflow Celery worker : command returned non-zero exit status 2

2020-03-06 Thread Rohit S S (Jira)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-6960?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17053750#comment-17053750
 ] 

Rohit S S commented on AIRFLOW-6960:


* Sir, please note that you are trying to install a later version (2.0) on top 
of a (1.10) release.
 * You cannot do that.
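
For reference (assuming the mismatch is between 2.0-style and 1.10-style CLI 
syntax, as the invalid 'tasks' choice in the log suggests):

{code}
# Airflow 2.0-style CLI (what is being invoked):
airflow tasks run <dag_id> <task_id> <execution_date>

# Airflow 1.10.x equivalent (what this worker understands):
airflow run <dag_id> <task_id> <execution_date>
{code}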

> Airflow Celery worker : command returned non-zero exit status 2
> ---
>
> Key: AIRFLOW-6960
> URL: https://issues.apache.org/jira/browse/AIRFLOW-6960
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: celery
>Affects Versions: 2.0.0, 1.10.9
>Reporter: Uragalage Thilanka Mahesh Perera
>Assignee: Rohit S S
>Priority: Blocker
>
> I am getting the below error and have been trying to fix it for hours without 
> any luck. The logs below are from the airflow celery worker.
> {code:java}
> airflow command error: argument subcommand: invalid choice: 'tasks' (choose 
> from 'backfill', 'list_dag_runs', 'list_tasks', 'clear', 'pause', 'unpause', 
> 'trigger_dag', 'delete_dag', 'show_dag', 'pool', 'variables', 'kerberos', 
> 'render', 'run', 'initdb', 'list_dags', 'dag_state', 'task_failed_deps', 
> 'task_state', 'serve_logs', 'test', 'webserver', 'resetdb', 'upgradedb', 
> 'checkdb', 'shell', 'scheduler', 'worker', 'flower', 'version', 
> 'connections', 'create_user', 'delete_user', 'list_users', 'sync_perm', 
> 'next_execution', 'rotate_fernet_key'), see help above.
> usage: airflow [-h] 
> {backfill,list_dag_runs,list_tasks,clear,pause,unpause,trigger_dag,delete_dag,show_dag,pool,variables,kerberos,render,run,initdb,list_dags,dag_state,task_failed_deps,task_state,serve_logs,test,webserver,resetdb,upgradedb,checkdb,shell,scheduler,worker,flower,version,connections,create_user,delete_user,list_users,sync_perm,next_execution,rotate_fernet_key}
>  ...
> positional arguments:
>   {backfill,list_dag_runs,list_tasks,clear,pause,unpause,trigger_dag,delete_dag,show_dag,pool,variables,kerberos,render,run,initdb,list_dags,dag_state,task_failed_deps,task_state,serve_logs,test,webserver,resetdb,upgradedb,checkdb,shell,scheduler,worker,flower,version,connections,create_user,delete_user,list_users,sync_perm,next_execution,rotate_fernet_key}
>   sub-command help
>     backfill           Run subsections of a DAG for a specified date range. If reset_dag_run option is used, backfill will first prompt users whether airflow should clear all the previous dag_run and task_instances within the backfill date range. If rerun_failed_tasks is used, backfill will auto re-run the previous failed task instances within the backfill date range.
>     list_dag_runs      List dag runs given a DAG id. If state option is given, it will only search for all the dagruns with the given state. If no_backfill option is given, it will filter out all backfill dagruns for given dag id.
>     list_tasks         List the tasks within a DAG
>     clear              Clear a set of task instance, as if they never ran
>     pause              Pause a DAG
>     unpause            Resume a paused DAG
>     trigger_dag        Trigger a DAG run
>     delete_dag         Delete all DB records related to the specified DAG
>     show_dag           Displays DAG's tasks with their dependencies
>     pool               CRUD operations on pools
>     variables          CRUD operations on variables
>     kerberos           Start a kerberos ticket renewer
>     render             Render a task instance's template(s)
>     run                Run a single task instance
>     initdb             Initialize the metadata database
>     list_dags          List all the DAGs
>     dag_state          Get the status of a dag run
>     task_failed_deps   Returns the unmet dependencies for a task instance from the perspective of the scheduler. In other words, why a task instance doesn't get scheduled and then queued by the scheduler, and then run by an executor).
>     task_state         Get the status of a task instance
>     serve_logs         Serve logs generate by worker
>     test               Test a task instance. This will run a task without checking for dependencies or recording its state in the database.
>     webserver          Start a Airflow webserver instance
>     resetdb            Burn down and rebuild the metadata database
>     upgradedb          Upgrade the metadata database to latest version
>     checkdb            Check if the database can be reached.
>     shell              Runs a shell to access the database
>     scheduler          Start a scheduler instance
>     worker             Start a Celery worker node
>     flower             Start a Celery Flower
>     version            Show the version
>     connections        List/Add/Delete connections
>     create_user        Create an account for the Web UI (FAB-based)
>     delete_user        Delete an account for the Web UI
>     list_users         List accounts for the Web UI
>     sync_perm          Update permissions for existing roles and DAGs.
>     next_execution     Get the next execution datetime of a DAG.
>     rotate_fernet_key  Rotate all encrypted connection credentials and variables; see https://airflow.readthedocs.io/en/stable/howto/secure-connections.html#rotating-encryption-keys.
> optional arguments:
>   -h, --help           show this help message and exit
> airflow command 

[jira] [Reopened] (AIRFLOW-6497) Scheduler creates DagBag in the same process with outdated info

2020-03-06 Thread Kamil Bregula (Jira)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-6497?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Kamil Bregula reopened AIRFLOW-6497:


> Scheduler creates DagBag in the same process with outdated info
> ---
>
> Key: AIRFLOW-6497
> URL: https://issues.apache.org/jira/browse/AIRFLOW-6497
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: scheduler
>Affects Versions: 1.10.7
>Reporter: Qian Yu
>Priority: Major
> Fix For: 2.0.0
>
>
> The following code in scheduler_job.py seems to be called in the same process 
> as the scheduler. It creates a DagBag. But since the scheduler is a 
> long-running process, it does not pick up the latest changes made to DAGs. For 
> example, changes to the retries count, on_failure_callback, newly added tasks, 
> etc. are not reflected.
>  
> {code:python}
> if ti.try_number == try_number and ti.state == State.QUEUED:
>     msg = ("Executor reports task instance {} finished ({}) "
>            "although the task says its {}. Was the task "
>            "killed externally?".format(ti, state, ti.state))
>     Stats.incr('scheduler.tasks.killed_externally')
>     self.log.error(msg)
>     try:
>         simple_dag = simple_dag_bag.get_dag(dag_id)
>         dagbag = models.DagBag(simple_dag.full_filepath)
>         dag = dagbag.get_dag(dag_id)
>         ti.task = dag.get_task(task_id)
>         ti.handle_failure(msg)
>     except Exception:
>         self.log.error("Cannot load the dag bag to handle failure for %s"
>                        ". Setting task to FAILED without callbacks or "
>                        "retries. Do you have enough resources?", ti)
>         ti.state = State.FAILED
>         session.merge(ti)
>         session.commit()
> {code}
> This causes errors such as AttributeError due to stale code being hit. E.g. 
> when someone added a .join attribute to CustomOperator without bouncing the 
> scheduler, this is what they would get after a CeleryWorker timeout error 
> causes this line to be hit:
> {code}
> [2020-01-05 22:25:45,951] {dagbag.py:207} ERROR - Failed to import: 
> /dags/dag1.py
> Traceback (most recent call last):
>   File "/lib/python3.6/site-packages/airflow/models/dagbag.py", line 204, in process_file
>     m = imp.load_source(mod_name, filepath)
>   File "/usr/lib/python3.6/imp.py", line 172, in load_source
>     module = _load(spec)
>   File "<frozen importlib._bootstrap>", line 684, in _load
>   File "<frozen importlib._bootstrap>", line 665, in _load_unlocked
>   File "<frozen importlib._bootstrap_external>", line 678, in exec_module
>   File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
>   File "/dags/dag1.py", line 280, in <module>
>     task1 >> task2.join
> AttributeError: 'CustomOperator' object has no attribute 'join'
> [2020-01-05 22:25:45,951] {scheduler_job.py:1314} ERROR - Cannot load the dag 
> bag to handle failure for <TaskInstance: ... [queued]>. Setting task to FAILED 
> without callbacks or retries. Do you have enough resou
> {code}



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Created] (AIRFLOW-6999) Use check_output to capture the stderr in celery executor

2020-03-06 Thread Ping Zhang (Jira)
Ping Zhang created AIRFLOW-6999:
---

 Summary: Use check_output to capture the stderr in celery executor
 Key: AIRFLOW-6999
 URL: https://issues.apache.org/jira/browse/AIRFLOW-6999
 Project: Apache Airflow
  Issue Type: Improvement
  Components: worker
Affects Versions: 1.10.9
Reporter: Ping Zhang


so that the airflow celery worker can capture the error output
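
A minimal sketch of the idea (not necessarily the PR's exact code): 
subprocess.check_output with stderr merged into stdout attaches the failing 
command's output to the raised CalledProcessError, where the worker can log it:

{code:python}
import subprocess


def execute_command(command):
    try:
        # Unlike check_call, check_output captures the command's output;
        # merging stderr into stdout makes it available on failure too.
        subprocess.check_output(command, stderr=subprocess.STDOUT)
    except subprocess.CalledProcessError as e:
        print("Celery command failed; captured output:")
        print(e.output)
        raise
{code}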



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Closed] (AIRFLOW-6497) Scheduler creates DagBag in the same process with outdated info

2020-03-06 Thread Kamil Bregula (Jira)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-6497?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Kamil Bregula closed AIRFLOW-6497.
--
Fix Version/s: 2.0.0
   Resolution: Fixed

> Scheduler creates DagBag in the same process with outdated info
> ---
>
> Key: AIRFLOW-6497
> URL: https://issues.apache.org/jira/browse/AIRFLOW-6497
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: scheduler
>Affects Versions: 1.10.7
>Reporter: Qian Yu
>Priority: Major
> Fix For: 2.0.0
>
>
> The following code in scheduler_job.py seems to be called in the same process 
> as the scheduler. It creates a DagBag. But since the scheduler is a 
> long-running process, it does not pick up the latest changes made to DAGs. For 
> example, changes to the retries count, on_failure_callback, newly added tasks, 
> etc. are not reflected.
>  
> {code:python}
> if ti.try_number == try_number and ti.state == State.QUEUED:
>     msg = ("Executor reports task instance {} finished ({}) "
>            "although the task says its {}. Was the task "
>            "killed externally?".format(ti, state, ti.state))
>     Stats.incr('scheduler.tasks.killed_externally')
>     self.log.error(msg)
>     try:
>         simple_dag = simple_dag_bag.get_dag(dag_id)
>         dagbag = models.DagBag(simple_dag.full_filepath)
>         dag = dagbag.get_dag(dag_id)
>         ti.task = dag.get_task(task_id)
>         ti.handle_failure(msg)
>     except Exception:
>         self.log.error("Cannot load the dag bag to handle failure for %s"
>                        ". Setting task to FAILED without callbacks or "
>                        "retries. Do you have enough resources?", ti)
>         ti.state = State.FAILED
>         session.merge(ti)
>         session.commit()
> {code}
> This causes errors such as AttributeError due to stale code being hit. E.g. 
> when someone added a .join attribute to CustomOperator without bouncing the 
> scheduler, this is what they would get after a CeleryWorker timeout error 
> causes this line to be hit:
> {code}
> [2020-01-05 22:25:45,951] {dagbag.py:207} ERROR - Failed to import: 
> /dags/dag1.py
> Traceback (most recent call last):
>   File "/lib/python3.6/site-packages/airflow/models/dagbag.py", line 204, in process_file
>     m = imp.load_source(mod_name, filepath)
>   File "/usr/lib/python3.6/imp.py", line 172, in load_source
>     module = _load(spec)
>   File "<frozen importlib._bootstrap>", line 684, in _load
>   File "<frozen importlib._bootstrap>", line 665, in _load_unlocked
>   File "<frozen importlib._bootstrap_external>", line 678, in exec_module
>   File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
>   File "/dags/dag1.py", line 280, in <module>
>     task1 >> task2.join
> AttributeError: 'CustomOperator' object has no attribute 'join'
> [2020-01-05 22:25:45,951] {scheduler_job.py:1314} ERROR - Cannot load the dag 
> bag to handle failure for <TaskInstance: ... [queued]>. Setting task to FAILED 
> without callbacks or retries. Do you have enough resou
> {code}



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Resolved] (AIRFLOW-6497) Scheduler creates DagBag in the same process with outdated info

2020-03-06 Thread Kamil Bregula (Jira)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-6497?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Kamil Bregula resolved AIRFLOW-6497.

Resolution: Fixed

> Scheduler creates DagBag in the same process with outdated info
> ---
>
> Key: AIRFLOW-6497
> URL: https://issues.apache.org/jira/browse/AIRFLOW-6497
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: scheduler
>Affects Versions: 1.10.7
>Reporter: Qian Yu
>Priority: Major
> Fix For: 2.0.0
>
>
> The following code in scheduler_job.py seems to be called in the same process 
> as the scheduler. It creates a DagBag. But since the scheduler is a 
> long-running process, it does not pick up the latest changes made to DAGs. For 
> example, changes to the retries count, on_failure_callback, newly added tasks, 
> etc. are not reflected.
>  
> {code:python}
> if ti.try_number == try_number and ti.state == State.QUEUED:
>     msg = ("Executor reports task instance {} finished ({}) "
>            "although the task says its {}. Was the task "
>            "killed externally?".format(ti, state, ti.state))
>     Stats.incr('scheduler.tasks.killed_externally')
>     self.log.error(msg)
>     try:
>         simple_dag = simple_dag_bag.get_dag(dag_id)
>         dagbag = models.DagBag(simple_dag.full_filepath)
>         dag = dagbag.get_dag(dag_id)
>         ti.task = dag.get_task(task_id)
>         ti.handle_failure(msg)
>     except Exception:
>         self.log.error("Cannot load the dag bag to handle failure for %s"
>                        ". Setting task to FAILED without callbacks or "
>                        "retries. Do you have enough resources?", ti)
>         ti.state = State.FAILED
>         session.merge(ti)
>         session.commit()
> {code}
> This causes errors such as AttributeError due to stale code being hit. E.g. 
> when someone added a .join attribute to CustomOperator without bouncing the 
> scheduler, this is what they would get after a CeleryWorker timeout error 
> causes this line to be hit:
> {code}
> [2020-01-05 22:25:45,951] {dagbag.py:207} ERROR - Failed to import: 
> /dags/dag1.py
> Traceback (most recent call last):
>   File "/lib/python3.6/site-packages/airflow/models/dagbag.py", line 204, in process_file
>     m = imp.load_source(mod_name, filepath)
>   File "/usr/lib/python3.6/imp.py", line 172, in load_source
>     module = _load(spec)
>   File "<frozen importlib._bootstrap>", line 684, in _load
>   File "<frozen importlib._bootstrap>", line 665, in _load_unlocked
>   File "<frozen importlib._bootstrap_external>", line 678, in exec_module
>   File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
>   File "/dags/dag1.py", line 280, in <module>
>     task1 >> task2.join
> AttributeError: 'CustomOperator' object has no attribute 'join'
> [2020-01-05 22:25:45,951] {scheduler_job.py:1314} ERROR - Cannot load the dag 
> bag to handle failure for <TaskInstance: ... [queued]>. Setting task to FAILED 
> without callbacks or retries. Do you have enough resou
> {code}



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Commented] (AIRFLOW-6999) Use check_output to capture the stderr in celery executor

2020-03-06 Thread ASF GitHub Bot (Jira)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-6999?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17053843#comment-17053843
 ] 

ASF GitHub Bot commented on AIRFLOW-6999:
-

pingzh commented on pull request #7638: [AIRFLOW-6999] use check_output in 
celery app task
URL: https://github.com/apache/airflow/pull/7638
 
 
   so that the celery worker can log the stderr when the command fails
   
   ---
   Issue link: WILL BE INSERTED BY 
[boring-cyborg](https://github.com/kaxil/boring-cyborg)
   
   Make sure to mark the boxes below before creating PR: [x]
   
   - [x] Description above provides context of the change
   - [x] Commit message/PR title starts with `[AIRFLOW-NNNN]`. AIRFLOW-NNNN = 
JIRA ID*
   - [x] Unit tests coverage for changes (not needed for documentation changes)
   - [x] Commits follow "[How to write a good git commit 
message](http://chris.beams.io/posts/git-commit/)"
   - [x] Relevant documentation is updated including usage instructions.
   - [x] I will engage committers as explained in [Contribution Workflow 
Example](https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst#contribution-workflow-example).
   
   * For document-only changes commit message can start with 
`[AIRFLOW-XXXX]`.
   
   ---
   In case of fundamental code change, Airflow Improvement Proposal 
([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals))
 is needed.
   In case of a new dependency, check compliance with the [ASF 3rd Party 
License Policy](https://www.apache.org/legal/resolved.html#category-x).
   In case of backwards incompatible changes please leave a note in 
[UPDATING.md](https://github.com/apache/airflow/blob/master/UPDATING.md).
   Read the [Pull Request 
Guidelines](https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst#pull-request-guidelines)
 for more information.
   
 

This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


> Use check_output to capture the stderr in celery executor
> -
>
> Key: AIRFLOW-6999
> URL: https://issues.apache.org/jira/browse/AIRFLOW-6999
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: worker
>Affects Versions: 1.10.9
>Reporter: Ping Zhang
>Priority: Minor
>
> so that the airflow celery worker can capture the error output



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Assigned] (AIRFLOW-6497) Scheduler creates DagBag in the same process with outdated info

2020-03-06 Thread Kamil Bregula (Jira)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-6497?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Kamil Bregula reassigned AIRFLOW-6497:
--

Assignee: Kamil Bregula

> Scheduler creates DagBag in the same process with outdated info
> ---
>
> Key: AIRFLOW-6497
> URL: https://issues.apache.org/jira/browse/AIRFLOW-6497
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: scheduler
>Affects Versions: 1.10.7
>Reporter: Qian Yu
>Assignee: Kamil Bregula
>Priority: Major
> Fix For: 2.0.0
>
>
> The following code in scheduler_job.py seems to be called in the same process 
> as the scheduler. It creates a DagBag. But since the scheduler is a 
> long-running process, it does not pick up the latest changes made to DAGs. For 
> example, changes to the retries count, on_failure_callback, newly added tasks, 
> etc. are not reflected.
>  
> {code:python}
> if ti.try_number == try_number and ti.state == State.QUEUED:
>     msg = ("Executor reports task instance {} finished ({}) "
>            "although the task says its {}. Was the task "
>            "killed externally?".format(ti, state, ti.state))
>     Stats.incr('scheduler.tasks.killed_externally')
>     self.log.error(msg)
>     try:
>         simple_dag = simple_dag_bag.get_dag(dag_id)
>         dagbag = models.DagBag(simple_dag.full_filepath)
>         dag = dagbag.get_dag(dag_id)
>         ti.task = dag.get_task(task_id)
>         ti.handle_failure(msg)
>     except Exception:
>         self.log.error("Cannot load the dag bag to handle failure for %s"
>                        ". Setting task to FAILED without callbacks or "
>                        "retries. Do you have enough resources?", ti)
>         ti.state = State.FAILED
>         session.merge(ti)
>         session.commit()
> {code}
> This causes errors such as AttributeError due to stale code being hit. E.g. 
> when someone added a .join attribute to CustomOperator without bouncing the 
> scheduler, this is what they would get after a CeleryWorker timeout error 
> causes this line to be hit:
> {code}
> [2020-01-05 22:25:45,951] {dagbag.py:207} ERROR - Failed to import: 
> /dags/dag1.py
> Traceback (most recent call last):
>   File "/lib/python3.6/site-packages/airflow/models/dagbag.py", line 204, in process_file
>     m = imp.load_source(mod_name, filepath)
>   File "/usr/lib/python3.6/imp.py", line 172, in load_source
>     module = _load(spec)
>   File "<frozen importlib._bootstrap>", line 684, in _load
>   File "<frozen importlib._bootstrap>", line 665, in _load_unlocked
>   File "<frozen importlib._bootstrap_external>", line 678, in exec_module
>   File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
>   File "/dags/dag1.py", line 280, in <module>
>     task1 >> task2.join
> AttributeError: 'CustomOperator' object has no attribute 'join'
> [2020-01-05 22:25:45,951] {scheduler_job.py:1314} ERROR - Cannot load the dag 
> bag to handle failure for <TaskInstance: ... [queued]>. Setting task to FAILED 
> without callbacks or retries. Do you have enough resou
> {code}



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Commented] (AIRFLOW-7000) Allow passing in env var JSON dict in task_test

2020-03-06 Thread ASF GitHub Bot (Jira)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-7000?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17053856#comment-17053856
 ] 

ASF GitHub Bot commented on AIRFLOW-7000:
-

KevinYang21 commented on pull request #7639: [AIRFLOW-7000] Allow passing in 
env var JSON dict in task_test
URL: https://github.com/apache/airflow/pull/7639
 
 
   ---
   Issue link: WILL BE INSERTED BY 
[boring-cyborg](https://github.com/kaxil/boring-cyborg)
   
   Make sure to mark the boxes below before creating PR: [x]
   
   - [ ] Description above provides context of the change
   - [ ] Commit message/PR title starts with `[AIRFLOW-NNNN]`. AIRFLOW-NNNN = 
JIRA ID*
   - [ ] Unit tests coverage for changes (not needed for documentation changes)
   - [ ] Commits follow "[How to write a good git commit 
message](http://chris.beams.io/posts/git-commit/)"
   - [ ] Relevant documentation is updated including usage instructions.
   - [ ] I will engage committers as explained in [Contribution Workflow 
Example](https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst#contribution-workflow-example).
   
   * For document-only changes commit message can start with 
`[AIRFLOW-XXXX]`.
   
   ---
   In case of fundamental code change, Airflow Improvement Proposal 
([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals))
 is needed.
   In case of a new dependency, check compliance with the [ASF 3rd Party 
License Policy](https://www.apache.org/legal/resolved.html#category-x).
   In case of backwards incompatible changes please leave a note in 
[UPDATING.md](https://github.com/apache/airflow/blob/master/UPDATING.md).
   Read the [Pull Request 
Guidelines](https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst#pull-request-guidelines)
 for more information.
   
 

This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


> Allow passing in env var JSON dict in task_test
> ---
>
> Key: AIRFLOW-7000
> URL: https://issues.apache.org/jira/browse/AIRFLOW-7000
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: cli
>Affects Versions: 1.10.9
>Reporter: Kevin Yang
>Assignee: Kevin Yang
>Priority: Trivial
>
> Add a convenient way to batch-add env vars, e.g. {"test_mode": True, 
> "write_to_production": True}



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[GitHub] [airflow] KevinYang21 opened a new pull request #7639: [AIRFLOW-7000] Allow passing in env var JSON dict in task_test

2020-03-06 Thread GitBox
KevinYang21 opened a new pull request #7639: [AIRFLOW-7000] Allow passing in 
env var JSON dict in task_test
URL: https://github.com/apache/airflow/pull/7639
 
 
   ---
   Issue link: WILL BE INSERTED BY 
[boring-cyborg](https://github.com/kaxil/boring-cyborg)
   
   Make sure to mark the boxes below before creating PR: [x]
   
   - [ ] Description above provides context of the change
   - [ ] Commit message/PR title starts with `[AIRFLOW-NNNN]`. AIRFLOW-NNNN = 
JIRA ID*
   - [ ] Unit tests coverage for changes (not needed for documentation changes)
   - [ ] Commits follow "[How to write a good git commit 
message](http://chris.beams.io/posts/git-commit/)"
   - [ ] Relevant documentation is updated including usage instructions.
   - [ ] I will engage committers as explained in [Contribution Workflow 
Example](https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst#contribution-workflow-example).
   
   * For document-only changes commit message can start with 
`[AIRFLOW-XXXX]`.
   
   ---
   In case of fundamental code change, Airflow Improvement Proposal 
([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals))
 is needed.
   In case of a new dependency, check compliance with the [ASF 3rd Party 
License Policy](https://www.apache.org/legal/resolved.html#category-x).
   In case of backwards incompatible changes please leave a note in 
[UPDATING.md](https://github.com/apache/airflow/blob/master/UPDATING.md).
   Read the [Pull Request 
Guidelines](https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst#pull-request-guidelines)
 for more information.
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Commented] (AIRFLOW-6998) System test is failing in Travis

2020-03-06 Thread ASF GitHub Bot (Jira)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-6998?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17053726#comment-17053726
 ] 

ASF GitHub Bot commented on AIRFLOW-6998:
-

potiuk commented on pull request #7636: [AIRFLOW-6998] Fix failing system tests 
in CI
URL: https://github.com/apache/airflow/pull/7636
 
 
   ---
   Issue link: WILL BE INSERTED BY 
[boring-cyborg](https://github.com/kaxil/boring-cyborg)
   
   Make sure to mark the boxes below before creating PR: [x]
   
   - [x] Description above provides context of the change
   - [x] Commit message/PR title starts with `[AIRFLOW-NNNN]`. AIRFLOW-NNNN = 
JIRA ID*
   - [x] Unit tests coverage for changes (not needed for documentation changes)
   - [x] Commits follow "[How to write a good git commit 
message](http://chris.beams.io/posts/git-commit/)"
   - [x] Relevant documentation is updated including usage instructions.
   - [x] I will engage committers as explained in [Contribution Workflow 
Example](https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst#contribution-workflow-example).
   
   * For document-only changes commit message can start with 
`[AIRFLOW-XXXX]`.
   
   ---
   In case of fundamental code change, Airflow Improvement Proposal 
([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals))
 is needed.
   In case of a new dependency, check compliance with the [ASF 3rd Party 
License Policy](https://www.apache.org/legal/resolved.html#category-x).
   In case of backwards incompatible changes please leave a note in 
[UPDATING.md](https://github.com/apache/airflow/blob/master/UPDATING.md).
   Read the [Pull Request 
Guidelines](https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst#pull-request-guidelines)
 for more information.
   
 

This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


> System test is failing in Travis
> 
>
> Key: AIRFLOW-6998
> URL: https://issues.apache.org/jira/browse/AIRFLOW-6998
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: ci
>Affects Versions: 2.0.0
>Reporter: Jarek Potiuk
>Priority: Major
>




--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Updated] (AIRFLOW-6994) SparkSubmitOperator re launches spark driver even when original driver still running

2020-03-06 Thread t oo (Jira)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-6994?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

t oo updated AIRFLOW-6994:
--
Fix Version/s: 1.10.10

> SparkSubmitOperator re launches spark driver even when original driver still 
> running
> 
>
> Key: AIRFLOW-6994
> URL: https://issues.apache.org/jira/browse/AIRFLOW-6994
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: scheduler
>Affects Versions: 1.10.8, 1.10.9
>Reporter: t oo
>Assignee: t oo
>Priority: Major
> Fix For: 1.10.10
>
>
> https://issues.apache.org/jira/browse/AIRFLOW-6229 introduced a bug.
> Due to a temporary network blip in the connection to Spark, the state goes to 
> UNKNOWN (as no tags are found in the curl response), which forces a retry.
> Fix in spark_submit_hook.py:
>   
> {code:python}
>     def _process_spark_status_log(self, itr):
>         """
>         parses the logs of the spark driver status query process
>
>         :param itr: An iterator which iterates over the input of the subprocess
>         """
>         response_found = False
>         driver_found = False
>         # Consume the iterator
>         for line in itr:
>             line = line.strip()
>
>             if "submissionId" in line:
>                 response_found = True
>
>             # Check if the log line is about the driver status and extract the status.
>             if "driverState" in line:
>                 self._driver_status = line.split(' : ')[1] \
>                     .replace(',', '').replace('\"', '').strip()
>                 driver_found = True
>
>             self.log.debug("spark driver status log: {}".format(line))
>
>         if response_found and not driver_found:
>             self._driver_status = "UNKNOWN"
> {code}



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Work started] (AIRFLOW-6994) SparkSubmitOperator re launches spark driver even when original driver still running

2020-03-06 Thread t oo (Jira)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-6994?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Work on AIRFLOW-6994 started by t oo.
-
> SparkSubmitOperator re launches spark driver even when original driver still 
> running
> 
>
> Key: AIRFLOW-6994
> URL: https://issues.apache.org/jira/browse/AIRFLOW-6994
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: scheduler
>Affects Versions: 1.10.8, 1.10.9
>Reporter: t oo
>Assignee: t oo
>Priority: Major
>
> https://issues.apache.org/jira/browse/AIRFLOW-6229 introduced a bug.
> Due to a temporary network blip in the connection to Spark, the state goes to 
> UNKNOWN (as no tags are found in the curl response), which forces a retry.
> Fix in spark_submit_hook.py:
>   
> {code:python}
>     def _process_spark_status_log(self, itr):
>         """
>         parses the logs of the spark driver status query process
>
>         :param itr: An iterator which iterates over the input of the subprocess
>         """
>         response_found = False
>         driver_found = False
>         # Consume the iterator
>         for line in itr:
>             line = line.strip()
>
>             if "submissionId" in line:
>                 response_found = True
>
>             # Check if the log line is about the driver status and extract the status.
>             if "driverState" in line:
>                 self._driver_status = line.split(' : ')[1] \
>                     .replace(',', '').replace('\"', '').strip()
>                 driver_found = True
>
>             self.log.debug("spark driver status log: {}".format(line))
>
>         if response_found and not driver_found:
>             self._driver_status = "UNKNOWN"
> {code}



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Created] (AIRFLOW-6998) System test is failing in Travis

2020-03-06 Thread Jarek Potiuk (Jira)
Jarek Potiuk created AIRFLOW-6998:
-

 Summary: System test is failing in Travis
 Key: AIRFLOW-6998
 URL: https://issues.apache.org/jira/browse/AIRFLOW-6998
 Project: Apache Airflow
  Issue Type: Improvement
  Components: ci
Affects Versions: 2.0.0
Reporter: Jarek Potiuk






--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Commented] (AIRFLOW-6994) SparkSubmitOperator re launches spark driver even when original driver still running

2020-03-06 Thread ASF GitHub Bot (Jira)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-6994?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17053777#comment-17053777
 ] 

ASF GitHub Bot commented on AIRFLOW-6994:
-

tooptoop4 commented on pull request #7637: [AIRFLOW-6994] SparkSubmitOperator 
re-launches spark driver even when original driver still running
URL: https://github.com/apache/airflow/pull/7637
 
 
   ---
   Issue link: WILL BE INSERTED BY 
[boring-cyborg](https://github.com/kaxil/boring-cyborg)
   
   Make sure to mark the boxes below before creating PR: [x]
   
   - [x] Description above provides context of the change
   - [x] Commit message/PR title starts with `[AIRFLOW-NNNN]`. AIRFLOW-NNNN = 
JIRA ID*
   - [x] Unit tests coverage for changes (not needed for documentation changes)
   - [x] Commits follow "[How to write a good git commit 
message](http://chris.beams.io/posts/git-commit/)"
   - [x] Relevant documentation is updated including usage instructions.
   - [x] I will engage committers as explained in [Contribution Workflow 
Example](https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst#contribution-workflow-example).
   
   * For document-only changes commit message can start with 
`[AIRFLOW-XXXX]`.
   
   ---
   In case of fundamental code change, Airflow Improvement Proposal 
([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals))
 is needed.
   In case of a new dependency, check compliance with the [ASF 3rd Party 
License Policy](https://www.apache.org/legal/resolved.html#category-x).
   In case of backwards incompatible changes please leave a note in 
[UPDATING.md](https://github.com/apache/airflow/blob/master/UPDATING.md).
   Read the [Pull Request 
Guidelines](https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst#pull-request-guidelines)
 for more information.
   
 

This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


> SparkSubmitOperator re launches spark driver even when original driver still 
> running
> 
>
> Key: AIRFLOW-6994
> URL: https://issues.apache.org/jira/browse/AIRFLOW-6994
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: scheduler
>Affects Versions: 1.10.8, 1.10.9
>Reporter: t oo
>Assignee: t oo
>Priority: Major
>
> https://issues.apache.org/jira/browse/AIRFLOW-6229 introduced a bug.
> Due to a temporary network blip in the connection to Spark, the state goes to 
> UNKNOWN (as no tags are found in the curl response), which forces a retry.
> Fix in spark_submit_hook.py:
>   
> {code:python}
>     def _process_spark_status_log(self, itr):
>         """
>         parses the logs of the spark driver status query process
>
>         :param itr: An iterator which iterates over the input of the subprocess
>         """
>         response_found = False
>         driver_found = False
>         # Consume the iterator
>         for line in itr:
>             line = line.strip()
>
>             if "submissionId" in line:
>                 response_found = True
>
>             # Check if the log line is about the driver status and extract the status.
>             if "driverState" in line:
>                 self._driver_status = line.split(' : ')[1] \
>                     .replace(',', '').replace('\"', '').strip()
>                 driver_found = True
>
>             self.log.debug("spark driver status log: {}".format(line))
>
>         if response_found and not driver_found:
>             self._driver_status = "UNKNOWN"
> {code}



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Resolved] (AIRFLOW-6998) System test is failing in Travis

2020-03-06 Thread Kaxil Naik (Jira)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-6998?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Kaxil Naik resolved AIRFLOW-6998.
-
Fix Version/s: 2.0.0
   Resolution: Fixed

> System test is failing in Travis
> 
>
> Key: AIRFLOW-6998
> URL: https://issues.apache.org/jira/browse/AIRFLOW-6998
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: ci
>Affects Versions: 2.0.0
>Reporter: Jarek Potiuk
>Priority: Major
> Fix For: 2.0.0
>
>




--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Commented] (AIRFLOW-6998) System test is failing in Travis

2020-03-06 Thread ASF GitHub Bot (Jira)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-6998?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17053768#comment-17053768
 ] 

ASF GitHub Bot commented on AIRFLOW-6998:
-

kaxil commented on pull request #7636: [AIRFLOW-6998] Fix failing system tests 
in CI
URL: https://github.com/apache/airflow/pull/7636
 
 
   
 

This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


> System test is failing in Travis
> 
>
> Key: AIRFLOW-6998
> URL: https://issues.apache.org/jira/browse/AIRFLOW-6998
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: ci
>Affects Versions: 2.0.0
>Reporter: Jarek Potiuk
>Priority: Major
> Fix For: 2.0.0
>
>




--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Commented] (AIRFLOW-5647) Airflow UI should also display dag_concurrency reached

2020-03-06 Thread Akbar Habeeb B (Jira)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-5647?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17053797#comment-17053797
 ] 

Akbar Habeeb B commented on AIRFLOW-5647:
-

[~basph] Can you please help me with it?

> Airflow UI should also display dag_concurrency reached
> --
>
> Key: AIRFLOW-5647
> URL: https://issues.apache.org/jira/browse/AIRFLOW-5647
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: ui
>Affects Versions: 1.10.5
>Reporter: Bas Harenslak
>Assignee: Akbar Habeeb B
>Priority: Major
>  Labels: gsoc, gsoc2020, mentor
>
> Currently, in the main view, the schedule column box is highlighted in red if 
> the max. number of DAG runs is reached. In this case no more DAG runs can be 
> started until a DAG run completes.
> I think it should also display in red when the dag_concurrency (i.e. max 
> concurrent tasks) is reached. In this case too, no more tasks can be started 
> until a task completes. However, there is currently nothing in the UI showing 
> that (currently running 1.10.5).



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Created] (AIRFLOW-7000) Allow passing in env var JSON dict in task_test

2020-03-06 Thread Kevin Yang (Jira)
Kevin Yang created AIRFLOW-7000:
---

 Summary: Allow passing in env var JSON dict in task_test
 Key: AIRFLOW-7000
 URL: https://issues.apache.org/jira/browse/AIRFLOW-7000
 Project: Apache Airflow
  Issue Type: Improvement
  Components: cli
Affects Versions: 1.10.9
Reporter: Kevin Yang
Assignee: Kevin Yang


Add a convenient way to batch-add env vars, e.g. {"test_mode": True, 
"write_to_production": True}



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[GitHub] [airflow] pingzh opened a new pull request #7638: [AIRFLOW-6999] use check_output in celery app task

2020-03-06 Thread GitBox
pingzh opened a new pull request #7638: [AIRFLOW-6999] use check_output in 
celery app task
URL: https://github.com/apache/airflow/pull/7638
 
 
   so that the celery worker can log the stderr when the command fails
   
   ---
   Issue link: WILL BE INSERTED BY 
[boring-cyborg](https://github.com/kaxil/boring-cyborg)
   
   Make sure to mark the boxes below before creating PR: [x]
   
   - [x] Description above provides context of the change
   - [x] Commit message/PR title starts with `[AIRFLOW-NNNN]`. AIRFLOW-NNNN = 
JIRA ID*
   - [x] Unit tests coverage for changes (not needed for documentation changes)
   - [x] Commits follow "[How to write a good git commit 
message](http://chris.beams.io/posts/git-commit/)"
   - [x] Relevant documentation is updated including usage instructions.
   - [x] I will engage committers as explained in [Contribution Workflow 
Example](https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst#contribution-workflow-example).
   
   * For document-only changes commit message can start with 
`[AIRFLOW-XXXX]`.
   
   ---
   In case of fundamental code change, Airflow Improvement Proposal 
([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals))
 is needed.
   In case of a new dependency, check compliance with the [ASF 3rd Party 
License Policy](https://www.apache.org/legal/resolved.html#category-x).
   In case of backwards incompatible changes please leave a note in 
[UPDATING.md](https://github.com/apache/airflow/blob/master/UPDATING.md).
   Read the [Pull Request 
Guidelines](https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst#pull-request-guidelines)
 for more information.
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] dstandish commented on a change in pull request #6376: [AIRFLOW-5705] Add creds backend and support for AWS SSM

2020-03-06 Thread GitBox
dstandish commented on a change in pull request #6376: [AIRFLOW-5705] Add creds 
backend and support for AWS SSM
URL: https://github.com/apache/airflow/pull/6376#discussion_r389203393
 
 

 ##
 File path: airflow/creds/__init__.py
 ##
 @@ -0,0 +1,74 @@
+# -*- coding: utf-8 -*-
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+"""
+Creds framework provides means of getting connection objects from various 
sources, e.g. the following:
+* Environment variables
+* Metastore database
+* AWS SSM Parameter store
+"""
+
+from abc import ABC, abstractmethod
+from typing import List
+
+from airflow import AirflowException, conf
+from airflow.models import Connection
+from airflow.utils.module_loading import import_string
+
+CONN_ENV_PREFIX = "AIRFLOW_CONN_"
+
+
+class BaseCredsBackend(ABC):
+    """
+    Abstract base class to provide connection objects given a conn_id
 
 Review comment:
   i spoke to the thinking [in this 
comment](https://github.com/apache/airflow/pull/6376#discussion_r389202690).  i 
am not attached to the name.  does the group feel `BaseConnectionBackend` is the 
right name?
   
   alternatively, i can update the description to be more compatible,
   e.g. 
   ```
   Abstract base class to retrieve creds given a conn_id and construct a 
Connection object
   ```
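   
   for anyone skimming, a rough sketch of a concrete subclass under this interface 
(this assumes the abstract method is `get_connections`, which the truncated diff 
does not show; the env-var parsing is illustrative):
   
   ```python
   import os
   from typing import List
   
   from airflow.models import Connection
   
   CONN_ENV_PREFIX = "AIRFLOW_CONN_"
   
   
   class EnvironmentVariablesCredsBackend(BaseCredsBackend):
       """Build Connection objects from AIRFLOW_CONN_* environment variables."""
   
       def get_connections(self, conn_id: str) -> List[Connection]:
           uri = os.environ.get(CONN_ENV_PREFIX + conn_id.upper())
           if not uri:
               return []
           return [Connection(conn_id=conn_id, uri=uri)]
   ```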
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] dstandish commented on a change in pull request #6376: [AIRFLOW-5705] Add creds backend and support for AWS SSM

2020-03-06 Thread GitBox
dstandish commented on a change in pull request #6376: [AIRFLOW-5705] Add creds 
backend and support for AWS SSM
URL: https://github.com/apache/airflow/pull/6376#discussion_r389202690
 
 

 ##
 File path: airflow/config_templates/config.yml
 ##
 @@ -463,6 +463,15 @@
       type: string
       example: ~
       default: "task"
+    - name: creds_backend
+      description: |
+        Classes to use for creds backend
+      version_added: ~
+      type: string
+      example: ~
+      default: >-
+        airflow.creds.environment_variables.EnvironmentVariablesCredsBackend,
+        airflow.creds.metastore.MetastoreCredsBackend
 
 Review comment:
   so... i struggled with this a bit myself.
   
   i can call it `BaseConnectionBackend` if that's the general consensus -- not 
firmly attached to _creds_.
   
   as to why though: i thought creds was a more apt description of what this 
does.  this is not really fetching connections per se but connection _meta_ or 
connection _info_ or plainly _creds_.  it is a backend for getting creds from an 
arbitrary creds store, and it provides a way to yield connections from those creds.
   
   and why creds instead of _credentials_? 
   
   when you can have economy _and_ clarity at the same time, why not?
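   
   to make the ordering concrete, a sketch of how the comma-separated list could be 
resolved (assumes each backend exposes a `get_connections` method; this helper is 
hypothetical, not code from the PR):
   
   ```python
   from airflow import AirflowException
   from airflow.utils.module_loading import import_string
   
   
   def get_connections(conn_id, backend_classes):
       """Try each configured backend in order; first non-empty result wins (sketch)."""
       for path in backend_classes.split(","):
           backend = import_string(path.strip())()
           connections = backend.get_connections(conn_id)
           if connections:
               return connections
       raise AirflowException("conn_id `%s` not found in any creds backend" % conn_id)
   ```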
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] codecov-io commented on issue #7637: [AIRFLOW-6994] SparkSubmitOperator re-launches spark driver even when original driver still running

2020-03-06 Thread GitBox
codecov-io commented on issue #7637: [AIRFLOW-6994] SparkSubmitOperator 
re-launches spark driver even when original driver still running
URL: https://github.com/apache/airflow/pull/7637#issuecomment-596013967
 
 
   # [Codecov](https://codecov.io/gh/apache/airflow/pull/7637?src=pr&el=h1) Report
   > Merging [#7637](https://codecov.io/gh/apache/airflow/pull/7637?src=pr&el=desc) into [master](https://codecov.io/gh/apache/airflow/commit/f3abd340826289dec23e96b79a6ed9b6a1955027?src=pr&el=desc) will **decrease** coverage by `0.27%`.
   > The diff coverage is `100%`.
   
   [![Impacted file tree graph](https://codecov.io/gh/apache/airflow/pull/7637/graphs/tree.svg?width=650&token=WdLKlKHOAU&height=150&src=pr)](https://codecov.io/gh/apache/airflow/pull/7637?src=pr&el=tree)
   
   ```diff
   @@            Coverage Diff             @@
   ##           master    #7637      +/-   ##
   ==========================================
   - Coverage   86.83%   86.55%   -0.28%     
   ==========================================
     Files         897      897              
     Lines       42805    42808       +3     
   ==========================================
   - Hits        37169    37054     -115     
   - Misses       5636     5754     +118
   ```
   
   | [Impacted Files](https://codecov.io/gh/apache/airflow/pull/7637?src=pr&el=tree) | Coverage Δ | |
   |---|---|---|
   | [...rflow/providers/apache/spark/hooks/spark\_submit.py](https://codecov.io/gh/apache/airflow/pull/7637/diff?src=pr&el=tree#diff-YWlyZmxvdy9wcm92aWRlcnMvYXBhY2hlL3NwYXJrL2hvb2tzL3NwYXJrX3N1Ym1pdC5weQ==) | `84.84% <100%> (+0.17%)` | :arrow_up: |
   | [airflow/kubernetes/volume\_mount.py](https://codecov.io/gh/apache/airflow/pull/7637/diff?src=pr&el=tree#diff-YWlyZmxvdy9rdWJlcm5ldGVzL3ZvbHVtZV9tb3VudC5weQ==) | `44.44% <0%> (-55.56%)` | :arrow_down: |
   | [airflow/kubernetes/volume.py](https://codecov.io/gh/apache/airflow/pull/7637/diff?src=pr&el=tree#diff-YWlyZmxvdy9rdWJlcm5ldGVzL3ZvbHVtZS5weQ==) | `52.94% <0%> (-47.06%)` | :arrow_down: |
   | [airflow/kubernetes/pod\_launcher.py](https://codecov.io/gh/apache/airflow/pull/7637/diff?src=pr&el=tree#diff-YWlyZmxvdy9rdWJlcm5ldGVzL3BvZF9sYXVuY2hlci5weQ==) | `47.18% <0%> (-45.08%)` | :arrow_down: |
   | [...viders/cncf/kubernetes/operators/kubernetes\_pod.py](https://codecov.io/gh/apache/airflow/pull/7637/diff?src=pr&el=tree#diff-YWlyZmxvdy9wcm92aWRlcnMvY25jZi9rdWJlcm5ldGVzL29wZXJhdG9ycy9rdWJlcm5ldGVzX3BvZC5weQ==) | `69.69% <0%> (-25.26%)` | :arrow_down: |
   | [airflow/kubernetes/refresh\_config.py](https://codecov.io/gh/apache/airflow/pull/7637/diff?src=pr&el=tree#diff-YWlyZmxvdy9rdWJlcm5ldGVzL3JlZnJlc2hfY29uZmlnLnB5) | `50.98% <0%> (-23.53%)` | :arrow_down: |
   | [airflow/jobs/backfill\_job.py](https://codecov.io/gh/apache/airflow/pull/7637/diff?src=pr&el=tree#diff-YWlyZmxvdy9qb2JzL2JhY2tmaWxsX2pvYi5weQ==) | `92.15% <0%> (+0.28%)` | :arrow_up: |
   
   --
   
   [Continue to review full report at Codecov](https://codecov.io/gh/apache/airflow/pull/7637?src=pr&el=continue).
   > **Legend** - [Click here to learn more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data`
   > Powered by [Codecov](https://codecov.io/gh/apache/airflow/pull/7637?src=pr&el=footer). Last update [f3abd34...3c3eee7](https://codecov.io/gh/apache/airflow/pull/7637?src=pr&el=lastupdated). Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] tooptoop4 opened a new pull request #7637: [AIRFLOW-6994] SparkSubmitOperator re-launches spark driver even when original driver still running

2020-03-06 Thread GitBox
tooptoop4 opened a new pull request #7637: [AIRFLOW-6994] SparkSubmitOperator 
re-launches spark driver even when original driver still running
URL: https://github.com/apache/airflow/pull/7637
 
 
   ---
   Issue link: WILL BE INSERTED BY 
[boring-cyborg](https://github.com/kaxil/boring-cyborg)
   
   Make sure to mark the boxes below before creating PR: [x]
   
   - [x] Description above provides context of the change
   - [x] Commit message/PR title starts with `[AIRFLOW-NNNN]`. AIRFLOW-NNNN = 
JIRA ID*
   - [x] Unit tests coverage for changes (not needed for documentation changes)
   - [x] Commits follow "[How to write a good git commit 
message](http://chris.beams.io/posts/git-commit/)"
   - [x] Relevant documentation is updated including usage instructions.
   - [x] I will engage committers as explained in [Contribution Workflow 
Example](https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst#contribution-workflow-example).
   
   * For document-only changes commit message can start with 
`[AIRFLOW-XXXX]`.
   
   ---
   In case of fundamental code change, Airflow Improvement Proposal 
([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals))
 is needed.
   In case of a new dependency, check compliance with the [ASF 3rd Party 
License Policy](https://www.apache.org/legal/resolved.html#category-x).
   In case of backwards incompatible changes please leave a note in 
[UPDATING.md](https://github.com/apache/airflow/blob/master/UPDATING.md).
   Read the [Pull Request 
Guidelines](https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst#pull-request-guidelines)
 for more information.
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] kaxil merged pull request #7636: [AIRFLOW-6998] Fix failing system tests in CI

2020-03-06 Thread GitBox
kaxil merged pull request #7636: [AIRFLOW-6998] Fix failing system tests in CI
URL: https://github.com/apache/airflow/pull/7636
 
 
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] codecov-io edited a comment on issue #7636: [AIRFLOW-6998] Fix failing system tests in CI

2020-03-06 Thread GitBox
codecov-io edited a comment on issue #7636: [AIRFLOW-6998] Fix failing system 
tests in CI
URL: https://github.com/apache/airflow/pull/7636#issuecomment-595976912
 
 
   # [Codecov](https://codecov.io/gh/apache/airflow/pull/7636?src=pr&el=h1) Report
   > Merging [#7636](https://codecov.io/gh/apache/airflow/pull/7636?src=pr&el=desc) into [master](https://codecov.io/gh/apache/airflow/commit/1e3cdddcd87be3c0f11b43efea11cdbddaff4470?src=pr&el=desc) will **decrease** coverage by `0.17%`.
   > The diff coverage is `n/a`.
   
   [![Impacted file tree graph](https://codecov.io/gh/apache/airflow/pull/7636/graphs/tree.svg?width=650&token=WdLKlKHOAU&height=150&src=pr)](https://codecov.io/gh/apache/airflow/pull/7636?src=pr&el=tree)
   
   ```diff
   @@            Coverage Diff             @@
   ##           master    #7636      +/-   ##
   ==========================================
   - Coverage   86.73%   86.55%   -0.18%     
   ==========================================
     Files         897      897              
     Lines       42747    42805      +58     
   ==========================================
   - Hits        37075    37051      -24     
   - Misses       5672     5754      +82
   ```
   
   | [Impacted Files](https://codecov.io/gh/apache/airflow/pull/7636?src=pr&el=tree) | Coverage Δ | |
   |---|---|---|
   | [...w/providers/apache/hive/operators/mysql\_to\_hive.py](https://codecov.io/gh/apache/airflow/pull/7636/diff?src=pr&el=tree#diff-YWlyZmxvdy9wcm92aWRlcnMvYXBhY2hlL2hpdmUvb3BlcmF0b3JzL215c3FsX3RvX2hpdmUucHk=) | `100% <0%> (ø)` | :arrow_up: |
   | [airflow/kubernetes/volume\_mount.py](https://codecov.io/gh/apache/airflow/pull/7636/diff?src=pr&el=tree#diff-YWlyZmxvdy9rdWJlcm5ldGVzL3ZvbHVtZV9tb3VudC5weQ==) | `44.44% <0%> (-55.56%)` | :arrow_down: |
   | [airflow/kubernetes/volume.py](https://codecov.io/gh/apache/airflow/pull/7636/diff?src=pr&el=tree#diff-YWlyZmxvdy9rdWJlcm5ldGVzL3ZvbHVtZS5weQ==) | `52.94% <0%> (-47.06%)` | :arrow_down: |
   | [airflow/security/kerberos.py](https://codecov.io/gh/apache/airflow/pull/7636/diff?src=pr&el=tree#diff-YWlyZmxvdy9zZWN1cml0eS9rZXJiZXJvcy5weQ==) | `76.08% <0%> (ø)` | :arrow_up: |
   | [airflow/kubernetes/pod\_launcher.py](https://codecov.io/gh/apache/airflow/pull/7636/diff?src=pr&el=tree#diff-YWlyZmxvdy9rdWJlcm5ldGVzL3BvZF9sYXVuY2hlci5weQ==) | `47.18% <0%> (-45.08%)` | :arrow_down: |
   | [airflow/providers/mysql/operators/mysql.py](https://codecov.io/gh/apache/airflow/pull/7636/diff?src=pr&el=tree#diff-YWlyZmxvdy9wcm92aWRlcnMvbXlzcWwvb3BlcmF0b3JzL215c3FsLnB5) | `100% <0%> (ø)` | :arrow_up: |
   | [...viders/cncf/kubernetes/operators/kubernetes\_pod.py](https://codecov.io/gh/apache/airflow/pull/7636/diff?src=pr&el=tree#diff-YWlyZmxvdy9wcm92aWRlcnMvY25jZi9rdWJlcm5ldGVzL29wZXJhdG9ycy9rdWJlcm5ldGVzX3BvZC5weQ==) | `69.69% <0%> (-25.26%)` | :arrow_down: |
   | [airflow/kubernetes/refresh\_config.py](https://codecov.io/gh/apache/airflow/pull/7636/diff?src=pr&el=tree#diff-YWlyZmxvdy9rdWJlcm5ldGVzL3JlZnJlc2hfY29uZmlnLnB5) | `50.98% <0%> (-23.53%)` | :arrow_down: |
   | [airflow/providers/apache/hive/hooks/hive.py](https://codecov.io/gh/apache/airflow/pull/7636/diff?src=pr&el=tree#diff-YWlyZmxvdy9wcm92aWRlcnMvYXBhY2hlL2hpdmUvaG9va3MvaGl2ZS5weQ==) | `77.55% <0%> (ø)` | :arrow_up: |
   | [airflow/providers/google/cloud/operators/tasks.py](https://codecov.io/gh/apache/airflow/pull/7636/diff?src=pr&el=tree#diff-YWlyZmxvdy9wcm92aWRlcnMvZ29vZ2xlL2Nsb3VkL29wZXJhdG9ycy90YXNrcy5weQ==) | `99.14% <0%> (-0.86%)` | :arrow_down: |
   | ... and [19 more](https://codecov.io/gh/apache/airflow/pull/7636/diff?src=pr&el=tree-more) | |
   
   --
   
   [Continue to review full report at Codecov](https://codecov.io/gh/apache/airflow/pull/7636?src=pr&el=continue).
   > **Legend** - [Click here to learn more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data`
   > Powered by [Codecov](https://codecov.io/gh/apache/airflow/pull/7636?src=pr&el=footer). Last update [1e3cddd...493a9e0](https://codecov.io/gh/apache/airflow/pull/7636?src=pr&el=lastupdated). Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] codecov-io commented on issue #7636: [AIRFLOW-6998] Fix failing system tests in CI

2020-03-06 Thread GitBox
codecov-io commented on issue #7636: [AIRFLOW-6998] Fix failing system tests in 
CI
URL: https://github.com/apache/airflow/pull/7636#issuecomment-595976912
 
 
   # [Codecov](https://codecov.io/gh/apache/airflow/pull/7636?src=pr&el=h1) Report
   > Merging [#7636](https://codecov.io/gh/apache/airflow/pull/7636?src=pr&el=desc) into [master](https://codecov.io/gh/apache/airflow/commit/1e3cdddcd87be3c0f11b43efea11cdbddaff4470?src=pr&el=desc) will **decrease** coverage by `0.34%`.
   > The diff coverage is `n/a`.
   
   [![Impacted file tree graph](https://codecov.io/gh/apache/airflow/pull/7636/graphs/tree.svg?width=650&token=WdLKlKHOAU&height=150&src=pr)](https://codecov.io/gh/apache/airflow/pull/7636?src=pr&el=tree)
   
   ```diff
   @@            Coverage Diff             @@
   ##           master    #7636      +/-   ##
   ==========================================
   - Coverage   86.73%   86.38%   -0.35%     
   ==========================================
     Files         897      897              
     Lines       42747    42805      +58     
   ==========================================
   - Hits        37075    36979      -96     
   - Misses       5672     5826     +154
   ```
   
   | [Impacted Files](https://codecov.io/gh/apache/airflow/pull/7636?src=pr&el=tree) | Coverage Δ | |
   |---|---|---|
   | [...w/providers/apache/hive/operators/mysql\_to\_hive.py](https://codecov.io/gh/apache/airflow/pull/7636/diff?src=pr&el=tree#diff-YWlyZmxvdy9wcm92aWRlcnMvYXBhY2hlL2hpdmUvb3BlcmF0b3JzL215c3FsX3RvX2hpdmUucHk=) | `35.84% <0%> (-64.16%)` | :arrow_down: |
   | [airflow/kubernetes/volume\_mount.py](https://codecov.io/gh/apache/airflow/pull/7636/diff?src=pr&el=tree#diff-YWlyZmxvdy9rdWJlcm5ldGVzL3ZvbHVtZV9tb3VudC5weQ==) | `44.44% <0%> (-55.56%)` | :arrow_down: |
   | [airflow/kubernetes/volume.py](https://codecov.io/gh/apache/airflow/pull/7636/diff?src=pr&el=tree#diff-YWlyZmxvdy9rdWJlcm5ldGVzL3ZvbHVtZS5weQ==) | `52.94% <0%> (-47.06%)` | :arrow_down: |
   | [airflow/security/kerberos.py](https://codecov.io/gh/apache/airflow/pull/7636/diff?src=pr&el=tree#diff-YWlyZmxvdy9zZWN1cml0eS9rZXJiZXJvcy5weQ==) | `30.43% <0%> (-45.66%)` | :arrow_down: |
   | [airflow/kubernetes/pod\_launcher.py](https://codecov.io/gh/apache/airflow/pull/7636/diff?src=pr&el=tree#diff-YWlyZmxvdy9rdWJlcm5ldGVzL3BvZF9sYXVuY2hlci5weQ==) | `47.18% <0%> (-45.08%)` | :arrow_down: |
   | [airflow/providers/mysql/operators/mysql.py](https://codecov.io/gh/apache/airflow/pull/7636/diff?src=pr&el=tree#diff-YWlyZmxvdy9wcm92aWRlcnMvbXlzcWwvb3BlcmF0b3JzL215c3FsLnB5) | `55% <0%> (-45%)` | :arrow_down: |
   | [...viders/cncf/kubernetes/operators/kubernetes\_pod.py](https://codecov.io/gh/apache/airflow/pull/7636/diff?src=pr&el=tree#diff-YWlyZmxvdy9wcm92aWRlcnMvY25jZi9rdWJlcm5ldGVzL29wZXJhdG9ycy9rdWJlcm5ldGVzX3BvZC5weQ==) | `69.69% <0%> (-25.26%)` | :arrow_down: |
   | [airflow/kubernetes/refresh\_config.py](https://codecov.io/gh/apache/airflow/pull/7636/diff?src=pr&el=tree#diff-YWlyZmxvdy9rdWJlcm5ldGVzL3JlZnJlc2hfY29uZmlnLnB5) | `50.98% <0%> (-23.53%)` | :arrow_down: |
   | [airflow/providers/apache/hive/hooks/hive.py](https://codecov.io/gh/apache/airflow/pull/7636/diff?src=pr&el=tree#diff-YWlyZmxvdy9wcm92aWRlcnMvYXBhY2hlL2hpdmUvaG9va3MvaGl2ZS5weQ==) | `76.02% <0%> (-1.54%)` | :arrow_down: |
   | [airflow/providers/google/cloud/operators/tasks.py](https://codecov.io/gh/apache/airflow/pull/7636/diff?src=pr&el=tree#diff-YWlyZmxvdy9wcm92aWRlcnMvZ29vZ2xlL2Nsb3VkL29wZXJhdG9ycy90YXNrcy5weQ==) | `99.14% <0%> (-0.86%)` | :arrow_down: |
   | ... and [19 more](https://codecov.io/gh/apache/airflow/pull/7636/diff?src=pr&el=tree-more) | |
   
   --
   
   [Continue to review full report at Codecov](https://codecov.io/gh/apache/airflow/pull/7636?src=pr&el=continue).
   > **Legend** - [Click here to learn more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data`
   > Powered by [Codecov](https://codecov.io/gh/apache/airflow/pull/7636?src=pr&el=footer). Last update [1e3cddd...493a9e0](https://codecov.io/gh/apache/airflow/pull/7636?src=pr&el=lastupdated). Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] mik-laj merged pull request #7597: [AIRFLOW-6497] Avoid loading DAGs in the main scheduler loop

2020-03-06 Thread GitBox
mik-laj merged pull request #7597: [AIRFLOW-6497] Avoid loading DAGs in the 
main scheduler loop
URL: https://github.com/apache/airflow/pull/7597
 
 
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] mik-laj commented on issue #7597: [AIRFLOW-6497] Avoid loading DAGs in the main scheduler loop

2020-03-06 Thread GitBox
mik-laj commented on issue #7597: [AIRFLOW-6497] Avoid loading DAGs in the main 
scheduler loop
URL: https://github.com/apache/airflow/pull/7597#issuecomment-595933565
 
 
   Travis is green.


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] potiuk opened a new pull request #7636: [AIRFLOW-6998] Fix failing system tests in CI

2020-03-06 Thread GitBox
potiuk opened a new pull request #7636: [AIRFLOW-6998] Fix failing system tests 
in CI
URL: https://github.com/apache/airflow/pull/7636
 
 
   ---
   Issue link: WILL BE INSERTED BY 
[boring-cyborg](https://github.com/kaxil/boring-cyborg)
   
   Make sure to mark the boxes below before creating PR: [x]
   
   - [x] Description above provides context of the change
   - [x] Commit message/PR title starts with `[AIRFLOW-NNNN]`. AIRFLOW-NNNN = 
JIRA ID*
   - [x] Unit tests coverage for changes (not needed for documentation changes)
   - [x] Commits follow "[How to write a good git commit 
message](http://chris.beams.io/posts/git-commit/)"
   - [x] Relevant documentation is updated including usage instructions.
   - [x] I will engage committers as explained in [Contribution Workflow 
Example](https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst#contribution-workflow-example).
   
   * For document-only changes commit message can start with 
`[AIRFLOW-XXXX]`.
   
   ---
   In case of fundamental code change, Airflow Improvement Proposal 
([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals))
 is needed.
   In case of a new dependency, check compliance with the [ASF 3rd Party 
License Policy](https://www.apache.org/legal/resolved.html#category-x).
   In case of backwards incompatible changes please leave a note in 
[UPDATING.md](https://github.com/apache/airflow/blob/master/UPDATING.md).
   Read the [Pull Request 
Guidelines](https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst#pull-request-guidelines)
 for more information.
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] ad-m commented on issue #7549: [AIRFLOW-6929][DONT-MERGE] Add OpenAPI spec

2020-03-06 Thread GitBox
ad-m commented on issue #7549: [AIRFLOW-6929][DONT-MERGE] Add OpenAPI spec
URL: https://github.com/apache/airflow/pull/7549#issuecomment-595928731
 
 
   @mik-laj, the changes look good. 
   
   @nuclearpinguin, I don't know the full process of accepting changes to this 
project. What can we do to move this forward?


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] ad-m commented on a change in pull request #7549: [AIRFLOW-6929][DONT-MERGE] Add OpenAPI spec

2020-03-06 Thread GitBox
ad-m commented on a change in pull request #7549: [AIRFLOW-6929][DONT-MERGE] 
Add OpenAPI spec
URL: https://github.com/apache/airflow/pull/7549#discussion_r389103575
 
 

 ##
 File path: openapi.yaml
 ##
 @@ -0,0 +1,2094 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+---
+openapi: 3.0.3
+
+info:
+  title: "Airflow API (Stable)"
+  description: |
+    Apache Airflow management API
+  version: '1.0.0'
+  license:
+    name: Apache 2.0
+    url: http://www.apache.org/licenses/LICENSE-2.0.html
+
+servers:
+  - url: /api/v1
+    description: Airflow Stable API
+
+paths:
+  # Database entities
+  /connections/:
 
 Review comment:
   Thank you.


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] ad-m commented on a change in pull request #7549: [AIRFLOW-6929][DONT-MERGE] Add OpenAPI spec

2020-03-06 Thread GitBox
ad-m commented on a change in pull request #7549: [AIRFLOW-6929][DONT-MERGE] 
Add OpenAPI spec
URL: https://github.com/apache/airflow/pull/7549#discussion_r389101039
 
 

 ##
 File path: openapi.yaml
 ##
 @@ -0,0 +1,2094 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+---
+openapi: 3.0.3
+
+info:
+  title: "Airflow API (Stable)"
+  description: |
+    Apache Airflow management API
+  version: '1.0.0'
+  license:
+    name: Apache 2.0
+    url: http://www.apache.org/licenses/LICENSE-2.0.html
+
+servers:
+  - url: /api/v1
+    description: Airflow Stable API
+
+paths:
+  # Database entities
+  /connections/:
+    get:
+      summary: Get all connection entries
+      tags: [Connection]
+      parameters:
+        - $ref: '#/components/parameters/PageLimit'
+        - $ref: '#/components/parameters/PageOffset'
+      responses:
+        '200':
+          description: List of connection entries
+          content:
+            application/json:
+              schema:
+                allOf:
+                  - $ref: '#/components/schemas/ConnectionCollection'
+                  - $ref: '#/components/schemas/CollectionInfo'
+        '401':
+          $ref: '#/components/responses/Unauthenticated'
+        '403':
+          $ref: '#/components/responses/PermissionDenied'
+
+    post:
+      summary: Create connection entry
+      tags: [Connection]
+      requestBody:
+        required: true
+        content:
+          application/json:
+            schema:
+              $ref: '#/components/schemas/Connection'
+      responses:
+        '200':
+          description: Successful response
+          content:
+            application/json:
+              schema:
+                $ref: '#/components/schemas/Connection'
+        '400':
+          $ref: '#/components/responses/BadRequest'
+        '401':
+          $ref: '#/components/responses/Unauthenticated'
+        '403':
+          $ref: '#/components/responses/PermissionDenied'
+
+  /connections/{connection_id}:
+    parameters:
+      - $ref: '#/components/parameters/ConnectionID'
+
+    get:
+      summary: Get connection entry
+      tags: [Connection]
+      responses:
+        '200':
+          description: Successful response
+          content:
+            application/json:
+              schema:
+                $ref: '#/components/schemas/Connection'
+        '401':
+          $ref: '#/components/responses/Unauthenticated'
+        '403':
+          $ref: '#/components/responses/PermissionDenied'
+        '404':
+          $ref: '#/components/responses/NotFound'
+
+    patch:
+      summary: Update a connection entry
+      tags: [Connection]
+      parameters:
+        - $ref: '#/components/parameters/UpdateMask'
+      requestBody:
+        required: true
+        content:
+          application/json:
+            schema:
+              $ref: '#/components/schemas/Connection'
+
+      responses:
+        '200':
+          description: Successful response
+          content:
+            application/json:
+              schema:
+                $ref: '#/components/schemas/Connection'
+        '400':
+          $ref: '#/components/responses/BadRequest'
+        '401':
+          $ref: '#/components/responses/Unauthenticated'
+        '403':
+          $ref: '#/components/responses/PermissionDenied'
+        '404':
+          $ref: '#/components/responses/NotFound'
+
+    delete:
+      summary: Delete connection entry
+      tags: [Connection]
+      responses:
+        '204':
+          description: No content
+        '400':
+          $ref: '#/components/responses/BadRequest'
+        '401':
+          $ref: '#/components/responses/Unauthenticated'
+        '403':
+          $ref: '#/components/responses/PermissionDenied'
+
+  /dags/:
+    get:
+      summary: Get all DAGs
+      tags: [DAG]
+      parameters:
+        - $ref: '#/components/parameters/PageLimit'
+        - $ref: '#/components/parameters/PageOffset'
+      responses:
+        '200':
+          description: List of DAGs
+          content:
+            application/json:
+              schema:
+                allOf:
+                  - $ref: '#/components/schemas/DAGCollection'
+                  - $ref: 

[GitHub] [airflow] feluelle commented on a change in pull request #6576: [AIRFLOW-5922] Add option to specify the mysql client library used in MySqlHook

2020-03-06 Thread GitBox
feluelle commented on a change in pull request #6576: [AIRFLOW-5922] Add option 
to specify the mysql client library used in MySqlHook
URL: https://github.com/apache/airflow/pull/6576#discussion_r389087612
 
 

 ##
 File path: airflow/providers/mysql/hooks/mysql.py
 ##
 @@ -113,8 +107,44 @@ def get_conn(self):
             conn_config['unix_socket'] = conn.extra_dejson['unix_socket']
         if local_infile:
             conn_config["local_infile"] = 1
-        conn = MySQLdb.connect(**conn_config)
-        return conn
+        return conn_config
+
+    def _get_conn_config_mysql_connector_python(self, conn):
+        conn_config = {
+            'user': conn.login,
+            'password': conn.password or '',
+            'host': conn.host or 'localhost',
+            'database': self.schema or conn.schema or '',
+            'port': int(conn.port) if conn.port else 3306
+        }
+
+        if conn.extra_dejson.get('allow_local_infile', False):
+            conn_config["allow_local_infile"] = True
+
+        return conn_config
+
+    def get_conn(self):
+        """
+        Establishes a connection to a mysql database
+        by extracting the connection configuration from the Airflow connection.
+
+        .. note:: By default it connects to the database via the mysqlclient library.
+            But you can also choose the mysql-connector-python library which lets you
+            connect through ssl without any further ssl parameters required.
+
+        :return: a mysql connection object
+        """
+        conn = self.connection or self.get_connection(self.mysql_conn_id)  # pylint: disable=no-member
+
+        client_name = conn.extra_dejson.get('client', 'mysqlclient')
 
 Review comment:
   Please take a look at my latest changes :)
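   
   For anyone trying this out, the selection happens through the connection's 
`extra` field; a hedged usage sketch (the connection values below are placeholders):
   
   ```python
   import json
   
   from airflow.models import Connection
   
   # Hypothetical connection whose extra selects the alternative client library.
   conn = Connection(
       conn_id="mysql_ssl",
       conn_type="mysql",
       host="db.example.com",
       login="airflow",
       password="secret",
       extra=json.dumps({"client": "mysql-connector-python"}),
   )
   
   # The hook falls back to mysqlclient when no client is specified.
   client_name = conn.extra_dejson.get("client", "mysqlclient")
   assert client_name == "mysql-connector-python"
   ```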


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] codecov-io edited a comment on issue #7030: [AIRFLOW-6440][AIP-29] Add AWS Fargate Executor

2020-03-06 Thread GitBox
codecov-io edited a comment on issue #7030: [AIRFLOW-6440][AIP-29] Add AWS 
Fargate Executor
URL: https://github.com/apache/airflow/pull/7030#issuecomment-581103197
 
 
   # [Codecov](https://codecov.io/gh/apache/airflow/pull/7030?src=pr&el=h1) Report
   > Merging [#7030](https://codecov.io/gh/apache/airflow/pull/7030?src=pr&el=desc) into [master](https://codecov.io/gh/apache/airflow/commit/ab6bb0012c38740b76e864d42d299c5c7a9972a3?src=pr&el=desc) will **decrease** coverage by `54.71%`.
   > The diff coverage is `31.39%`.
   
   [![Impacted file tree graph](https://codecov.io/gh/apache/airflow/pull/7030/graphs/tree.svg?width=650&token=WdLKlKHOAU&height=150&src=pr)](https://codecov.io/gh/apache/airflow/pull/7030?src=pr&el=tree)
   
   ```diff
   @@             Coverage Diff             @@
   ##           master    #7030       +/-   ##
   ===========================================
   - Coverage   86.74%   32.02%   -54.72%     
   ===========================================
     Files         897      898        +1     
     Lines       42753    42963      +210     
   ===========================================
   - Hits        37084    13758    -23326     
   - Misses       5669    29205    +23536
   ```
   
   | [Impacted Files](https://codecov.io/gh/apache/airflow/pull/7030?src=pr&el=tree) | Coverage Δ | |
   |---|---|---|
   | [airflow/config\_templates/default\_aws\_ecs.py](https://codecov.io/gh/apache/airflow/pull/7030/diff?src=pr&el=tree#diff-YWlyZmxvdy9jb25maWdfdGVtcGxhdGVzL2RlZmF1bHRfYXdzX2Vjcy5weQ==) | `0% <0%> (ø)` | |
   | [airflow/executors/executor\_loader.py](https://codecov.io/gh/apache/airflow/pull/7030/diff?src=pr&el=tree#diff-YWlyZmxvdy9leGVjdXRvcnMvZXhlY3V0b3JfbG9hZGVyLnB5) | `52.5% <100%> (-47.5%)` | :arrow_down: |
   | [airflow/executors/aws\_ecs\_executor.py](https://codecov.io/gh/apache/airflow/pull/7030/diff?src=pr&el=tree#diff-YWlyZmxvdy9leGVjdXRvcnMvYXdzX2Vjc19leGVjdXRvci5weQ==) | `32.09% <32.09%> (ø)` | |
   | [...low/contrib/operators/wasb\_delete\_blob\_operator.py](https://codecov.io/gh/apache/airflow/pull/7030/diff?src=pr&el=tree#diff-YWlyZmxvdy9jb250cmliL29wZXJhdG9ycy93YXNiX2RlbGV0ZV9ibG9iX29wZXJhdG9yLnB5) | `0% <0%> (-100%)` | :arrow_down: |
   | [...ing\_platform/example\_dags/example\_display\_video.py](https://codecov.io/gh/apache/airflow/pull/7030/diff?src=pr&el=tree#diff-YWlyZmxvdy9wcm92aWRlcnMvZ29vZ2xlL21hcmtldGluZ19wbGF0Zm9ybS9leGFtcGxlX2RhZ3MvZXhhbXBsZV9kaXNwbGF5X3ZpZGVvLnB5) | `0% <0%> (-100%)` | :arrow_down: |
   | [airflow/contrib/hooks/vertica\_hook.py](https://codecov.io/gh/apache/airflow/pull/7030/diff?src=pr&el=tree#diff-YWlyZmxvdy9jb250cmliL2hvb2tzL3ZlcnRpY2FfaG9vay5weQ==) | `0% <0%> (-100%)` | :arrow_down: |
   | [airflow/contrib/sensors/\_\_init\_\_.py](https://codecov.io/gh/apache/airflow/pull/7030/diff?src=pr&el=tree#diff-YWlyZmxvdy9jb250cmliL3NlbnNvcnMvX19pbml0X18ucHk=) | `0% <0%> (-100%)` | :arrow_down: |
   | [airflow/hooks/mssql\_hook.py](https://codecov.io/gh/apache/airflow/pull/7030/diff?src=pr&el=tree#diff-YWlyZmxvdy9ob29rcy9tc3NxbF9ob29rLnB5) | `0% <0%> (-100%)` | :arrow_down: |
   | [...viders/docker/example\_dags/example\_docker\_swarm.py](https://codecov.io/gh/apache/airflow/pull/7030/diff?src=pr&el=tree#diff-YWlyZmxvdy9wcm92aWRlcnMvZG9ja2VyL2V4YW1wbGVfZGFncy9leGFtcGxlX2RvY2tlcl9zd2FybS5weQ==) | `0% <0%> (-100%)` | :arrow_down: |
   | [airflow/hooks/webhdfs\_hook.py](https://codecov.io/gh/apache/airflow/pull/7030/diff?src=pr&el=tree#diff-YWlyZmxvdy9ob29rcy93ZWJoZGZzX2hvb2sucHk=) | `0% <0%> (-100%)` | :arrow_down: |
   | ... and [774 more](https://codecov.io/gh/apache/airflow/pull/7030/diff?src=pr&el=tree-more) | |
   
   --
   
   [Continue to review full report at Codecov](https://codecov.io/gh/apache/airflow/pull/7030?src=pr&el=continue).
   > **Legend** - [Click here to learn more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data`
   > Powered by [Codecov](https://codecov.io/gh/apache/airflow/pull/7030?src=pr&el=footer). Last update [ab6bb00...2bcad14](https://codecov.io/gh/apache/airflow/pull/7030?src=pr&el=lastupdated). Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow-on-k8s-operator] RohitR1 opened a new issue #12: E2E tests are failing

2020-03-06 Thread GitBox
RohitR1 opened a new issue #12: E2E tests are failing 
URL: https://github.com/apache/airflow-on-k8s-operator/issues/12
 
 
   make e2e-test is failing
   ```bash
   make e2e-test
   
   Your active configuration is: [rahul]
   kubectl get namespace airflowop-system || kubectl create namespace 
airflowop-system
   Error from server (NotFound): namespaces "airflowop-system" not found
   namespace/airflowop-system created
   go test -v -timeout 20m test/e2e/base/base_test.go --namespace 
airflowop-system
   === RUN   Test
   Running Suite: AirflowBase Suite
   
   Random Seed: 1583520410
   Will run 2 of 2 specs
   
   STEP: creating a new AirflowBase: 
   • Failure [0.289 seconds]
   AirflowBase controller tests
   /Users/rahul/airflow-on-k8s-operator/test/e2e/base/base_test.go:70
 creating a AirflowBase with mysql [It]
 /Users/rahul/airflow-on-k8s-operator/test/e2e/base/base_test.go:76
   
 failed to create CR : AirflowBase.airflow.apache.org "" is invalid: 
metadata.name: Required value: name or generateName is required
 Unexpected error:
 <*errors.StatusError | 0xc00023e1e0>: {
 ErrStatus: {
 TypeMeta: {Kind: "", APIVersion: ""},
 ListMeta: {
 SelfLink: "",
 ResourceVersion: "",
 Continue: "",
 RemainingItemCount: nil,
 },
 Status: "Failure",
 Message: "AirflowBase.airflow.apache.org \"\" is invalid: 
metadata.name: Required value: name or generateName is required",
 Reason: "Invalid",
 Details: {
 Name: "",
 Group: "airflow.apache.org",
 Kind: "AirflowBase",
 UID: "",
 Causes: [
 {
 Type: "FieldValueRequired",
 Message: "Required value: name or generateName is 
required",
 Field: "metadata.name",
 },
 ],
 RetryAfterSeconds: 0,
 },
 Code: 422,
 },
 }
 AirflowBase.airflow.apache.org "" is invalid: metadata.name: Required 
value: name or generateName is required
 occurred
   
 
/Users/rahul/airflow-on-k8s-operator/vendor/sigs.k8s.io/controller-reconciler/pkg/test/framework.go:178
   --
   STEP: creating a new AirflowBase: 
   • Failure [0.094 seconds]
   AirflowBase controller tests
   /Users/rahul/airflow-on-k8s-operator/test/e2e/base/base_test.go:70
 creating a AirflowBase with postgres [It]
 /Users/rahul/airflow-on-k8s-operator/test/e2e/base/base_test.go:88
   
 failed to create CR : AirflowBase.airflow.apache.org "" is invalid: 
metadata.name: Required value: name or generateName is required
 Unexpected error:
 <*errors.StatusError | 0xc00023ebe0>: {
 ErrStatus: {
 TypeMeta: {Kind: "", APIVersion: ""},
 ListMeta: {
 SelfLink: "",
 ResourceVersion: "",
 Continue: "",
 RemainingItemCount: nil,
 },
 Status: "Failure",
 Message: "AirflowBase.airflow.apache.org \"\" is invalid: 
metadata.name: Required value: name or generateName is required",
 Reason: "Invalid",
 Details: {
 Name: "",
 Group: "airflow.apache.org",
 Kind: "AirflowBase",
 UID: "",
 Causes: [
 {
 Type: "FieldValueRequired",
 Message: "Required value: name or generateName is 
required",
 Field: "metadata.name",
 },
 ],
 RetryAfterSeconds: 0,
 },
 Code: 422,
 },
 }
 AirflowBase.airflow.apache.org "" is invalid: metadata.name: Required 
value: name or generateName is required
 occurred
   
 
/Users/rahul/airflow-on-k8s-operator/vendor/sigs.k8s.io/controller-reconciler/pkg/test/framework.go:178
   --
   Failure [0.000 seconds]
   [AfterSuite] AfterSuite 
   /Users/rahul/airflow-on-k8s-operator/test/e2e/base/base_test.go:56
   
 failed to delete CR : resource name may not be empty
 Unexpected error:
 <*errors.errorString | 0xc000261260>: {
 s: "resource name may not be empty",
 }
 resource name may not be empty
 occurred
   
 
/Users/rahul/airflow-on-k8s-operator/vendor/sigs.k8s.io/controller-reconciler/pkg/test/framework.go:132
   --
   
   
   Summarizing 2 Failures:

[jira] [Comment Edited] (AIRFLOW-6960) Airflow Celery worker : command returned non-zero exit status 2

2020-03-06 Thread Rohit S S (Jira)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-6960?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17053690#comment-17053690
 ] 

Rohit S S edited comment on AIRFLOW-6960 at 3/6/20, 6:39 PM:
-

* Request you to give elaborate steps to recreate the bug.
 * It doesn't look like a bug because if you use ./breeze everything looks fine.
 * Detailed steps to recreate this bug would definitely help me solve the issue.
 * Also, what do you mean by running the worker separately? (Please be a little 
more specific.)


was (Author: randr97):
* Request you to give elaborate steps to recreate the bug.
 * It doesn't look like a bug because if you use ./breeze everything looks fine.
 * Detailed steps to recreate this bug would definitely help me solve the issue.

> Airflow Celery worker : command returned non-zero exit status 2
> ---
>
> Key: AIRFLOW-6960
> URL: https://issues.apache.org/jira/browse/AIRFLOW-6960
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: celery
>Affects Versions: 2.0.0, 1.10.9
>Reporter: Uragalage Thilanka Mahesh Perera
>Assignee: Rohit S S
>Priority: Blocker
>
> I am getting the below error and have been trying to fix it for hours without 
> any luck. 
> Below logs are from the airflow celery worker.
> {code:java}
>   airflow command error: argument subcommand: invalid choice: 'tasks' (choose 
> from 'backfill', 'list_dag_runs', 'list_tasks', 'clear', 'pause', 'unpause', 
> 'trigger_dag', 'delete_dag', 'show_dag', 'pool', 'variables', 'kerberos', 
> 'render', 'run', 'initdb', 'list_dags', 'dag_state', 'task_failed_deps', 
> 'task_state', 'serve_logs', 'test', 'webserver', 'resetdb', 'upgradedb', 
> 'checkdb', 'shell', 'scheduler', 'worker', 'flower', 'version', 
> 'connections', 'create_user', 'delete_user', 'list_users', 'sync_perm', 
> 'next_execution', 'rotate_fernet_key'), see help above. usage: airflow [-h] 
> {backfill,list_dag_runs,list_tasks,clear,pause,unpause,trigger_dag,delete_dag,show_dag,pool,variables,kerberos,render,run,initdb,list_dags,dag_state,task_failed_deps,task_state,serve_logs,test,webserver,resetdb,upgradedb,checkdb,shell,scheduler,worker,flower,version,connections,create_user,delete_user,list_users,sync_perm,next_execution,rotate_fernet_key}
>  ... positional arguments: 
> {backfill,list_dag_runs,list_tasks,clear,pause,unpause,trigger_dag,delete_dag,show_dag,pool,variables,kerberos,render,run,initdb,list_dags,dag_state,task_failed_deps,task_state,serve_logs,test,webserver,resetdb,upgradedb,checkdb,shell,scheduler,worker,flower,version,connections,create_user,delete_user,list_users,sync_perm,next_execution,rotate_fernet_key}
>  sub-command help backfill Run subsections of a DAG for a specified date 
> range. If reset_dag_run option is used, backfill will first prompt users 
> whether airflow should clear all the previous dag_run and task_instances 
> within the backfill date range. If rerun_failed_tasks is used, backfill will 
> auto re-run the previous failed task instances within the backfill date 
> range. list_dag_runs List dag runs given a DAG id. If state option is given, 
> it will onlysearch for all the dagruns with the given state. If no_backfill 
> option is given, it will filter outall backfill dagruns for given dag id. 
> list_tasks List the tasks within a DAG clear Clear a set of task instance, as 
> if they never ran pause Pause a DAG unpause Resume a paused DAG trigger_dag 
> Trigger a DAG run delete_dag Delete all DB records related to the specified 
> DAG show_dag Displays DAG's tasks with their dependencies pool CRUD 
> operations on pools variables CRUD operations on variables kerberos Start a 
> kerberos ticket renewer render Render a task instance's template(s) run Run a 
> single task instance initdb Initialize the metadata database list_dags List 
> all the DAGs dag_state Get the status of a dag run task_failed_deps Returns 
> the unmet dependencies for a task instance from the perspective of the 
> scheduler. In other words, why a task instance doesn't get scheduled and then 
> queued by the scheduler, and then run by an executor). task_state Get the 
> status of a task instance serve_logs Serve logs generate by worker test Test 
> a task instance. This will run a task without checking for dependencies or 
> recording its state in the database. webserver Start a Airflow webserver 
> instance resetdb Burn down and rebuild the metadata database upgradedb 
> Upgrade the metadata database to latest version checkdb Check if the database 
> can be reached. shell Runs a shell to access the database scheduler Start a 
> scheduler instance worker Start a Celery worker node flower Start a Celery 
> Flower version Show the version connections List/Add/Delete connections 
> create_user Create an account for the Web UI 

[jira] [Commented] (AIRFLOW-6960) Airflow Celery worker : command returned non-zero exit status 2

2020-03-06 Thread Rohit S S (Jira)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-6960?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17053690#comment-17053690
 ] 

Rohit S S commented on AIRFLOW-6960:


* Request you to give elaborate steps to recreate the bug.
 * It doesn't look like a bug because if you use ./breeze everything looks fine.
 * Detailed steps to recreate this bug would definitely help me solve the issue.

> Airflow Celery worker : command returned non-zero exit status 2
> ---
>
> Key: AIRFLOW-6960
> URL: https://issues.apache.org/jira/browse/AIRFLOW-6960
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: celery
>Affects Versions: 2.0.0, 1.10.9
>Reporter: Uragalage Thilanka Mahesh Perera
>Assignee: Rohit S S
>Priority: Blocker
>
> I am getting the below error and have been trying to fix it for hours without 
> any luck. 
> Below logs are from the airflow celery worker.
> {code:java}
>   airflow command error: argument subcommand: invalid choice: 'tasks' (choose 
> from 'backfill', 'list_dag_runs', 'list_tasks', 'clear', 'pause', 'unpause', 
> 'trigger_dag', 'delete_dag', 'show_dag', 'pool', 'variables', 'kerberos', 
> 'render', 'run', 'initdb', 'list_dags', 'dag_state', 'task_failed_deps', 
> 'task_state', 'serve_logs', 'test', 'webserver', 'resetdb', 'upgradedb', 
> 'checkdb', 'shell', 'scheduler', 'worker', 'flower', 'version', 
> 'connections', 'create_user', 'delete_user', 'list_users', 'sync_perm', 
> 'next_execution', 'rotate_fernet_key'), see help above. usage: airflow [-h] 
> {backfill,list_dag_runs,list_tasks,clear,pause,unpause,trigger_dag,delete_dag,show_dag,pool,variables,kerberos,render,run,initdb,list_dags,dag_state,task_failed_deps,task_state,serve_logs,test,webserver,resetdb,upgradedb,checkdb,shell,scheduler,worker,flower,version,connections,create_user,delete_user,list_users,sync_perm,next_execution,rotate_fernet_key}
>  ... positional arguments: 
> {backfill,list_dag_runs,list_tasks,clear,pause,unpause,trigger_dag,delete_dag,show_dag,pool,variables,kerberos,render,run,initdb,list_dags,dag_state,task_failed_deps,task_state,serve_logs,test,webserver,resetdb,upgradedb,checkdb,shell,scheduler,worker,flower,version,connections,create_user,delete_user,list_users,sync_perm,next_execution,rotate_fernet_key}
>  sub-command help backfill Run subsections of a DAG for a specified date 
> range. If reset_dag_run option is used, backfill will first prompt users 
> whether airflow should clear all the previous dag_run and task_instances 
> within the backfill date range. If rerun_failed_tasks is used, backfill will 
> auto re-run the previous failed task instances within the backfill date 
> range. list_dag_runs List dag runs given a DAG id. If state option is given, 
> it will onlysearch for all the dagruns with the given state. If no_backfill 
> option is given, it will filter outall backfill dagruns for given dag id. 
> list_tasks List the tasks within a DAG clear Clear a set of task instance, as 
> if they never ran pause Pause a DAG unpause Resume a paused DAG trigger_dag 
> Trigger a DAG run delete_dag Delete all DB records related to the specified 
> DAG show_dag Displays DAG's tasks with their dependencies pool CRUD 
> operations on pools variables CRUD operations on variables kerberos Start a 
> kerberos ticket renewer render Render a task instance's template(s) run Run a 
> single task instance initdb Initialize the metadata database list_dags List 
> all the DAGs dag_state Get the status of a dag run task_failed_deps Returns 
> the unmet dependencies for a task instance from the perspective of the 
> scheduler. In other words, why a task instance doesn't get scheduled and then 
> queued by the scheduler, and then run by an executor). task_state Get the 
> status of a task instance serve_logs Serve logs generate by worker test Test 
> a task instance. This will run a task without checking for dependencies or 
> recording its state in the database. webserver Start a Airflow webserver 
> instance resetdb Burn down and rebuild the metadata database upgradedb 
> Upgrade the metadata database to latest version checkdb Check if the database 
> can be reached. shell Runs a shell to access the database scheduler Start a 
> scheduler instance worker Start a Celery worker node flower Start a Celery 
> Flower version Show the version connections List/Add/Delete connections 
> create_user Create an account for the Web UI (FAB-based) delete_user Delete 
> an account for the Web UI list_users List accounts for the Web UI sync_perm 
> Update permissions for existing roles and DAGs. next_execution Get the next 
> execution datetime of a DAG. rotate_fernet_key Rotate all encrypted 
> connection credentials and variables; see 
> https://airflow.readthedocs.io/en/stable/howto/secure- 
> 

[GitHub] [airflow-on-k8s-operator] RohitR1 opened a new issue #11: Why is the architecture different from the configmap?

2020-03-06 Thread GitBox
RohitR1 opened a new issue #11: Why is the architecture different from the configmap?
URL: https://github.com/apache/airflow-on-k8s-operator/issues/11
 
 
   As you can see 
https://github.com/apache/airflow-on-k8s-operator/blob/master/templates/airflow-configmap.yaml#L37
 `executor = KubernetesExecutor`
   
   But the diagram shows CeleryExecutor: 
https://github.com/apache/airflow-on-k8s-operator/blob/master/docs/airflow-cluster.png
   
   Lots of confusion here.
   
   Basic questions: 
   - Does this operator support KubernetesExecutor?
   
   If yes, where are the docs on installing multi-node Airflow with the 
KubernetesExecutor?
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Updated] (AIRFLOW-6997) Kubernetes git-sync init container for worker pods is not able to sync git behind a proxy

2020-03-06 Thread Kris Geusebroek (Jira)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-6997?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Kris Geusebroek updated AIRFLOW-6997:
-
Summary: Kubernetes git-sync init container for worker pods is not able to 
sync git behind a proxy  (was: Kubernetes git-sync init container fro worker 
pods is not able to sync git behind a proxy)

> Kubernetes git-sync init container for worker pods is not able to sync git 
> behind a proxy
> -
>
> Key: AIRFLOW-6997
> URL: https://issues.apache.org/jira/browse/AIRFLOW-6997
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: configuration, executor-kubernetes, worker
>Affects Versions: 1.10.9
>Reporter: Kris Geusebroek
>Assignee: Daniel Imberman
>Priority: Major
>
> The git-sync-clone initcontainer used for the worker pods to sync the dags 
> only gets the GIT_SYNC environment variables. Any environment variable 
> configured in the `kubernetes_environment_variables` section of the airflow 
> config is ignored (only in the init container, not in the worker pod).
> The HTTPS_PROXY configured there is thus ignored and all worker pods fail with 
> an error.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Created] (AIRFLOW-6997) Kubernetes git-sync init container fro worker pods is not able to sync git behind a proxy

2020-03-06 Thread Kris Geusebroek (Jira)
Kris Geusebroek created AIRFLOW-6997:


 Summary: Kubernetes git-sync init container fro worker pods is not 
able to sync git behind a proxy
 Key: AIRFLOW-6997
 URL: https://issues.apache.org/jira/browse/AIRFLOW-6997
 Project: Apache Airflow
  Issue Type: Bug
  Components: configuration, executor-kubernetes, worker
Affects Versions: 1.10.9
Reporter: Kris Geusebroek
Assignee: Daniel Imberman


The git-sync-clone initcontainer used for the worker pods to sync the dags only 
gets the GIT_SYNC environment variables. Any environment variable configured in 
the `kubernetes_environment_variables` section of the airflow config is 
ignored (only in the init container, not in the worker pod).

The HTTPS_PROXY configured there is thus ignored and all worker pods fail with 
an error.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Commented] (AIRFLOW-5062) Allow ACL header in S3 Hook

2020-03-06 Thread ASF GitHub Bot (Jira)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-5062?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17053646#comment-17053646
 ] 

ASF GitHub Bot commented on AIRFLOW-5062:
-

retornam commented on pull request #7635: [AIRFLOW-5062]  Allow ACL Header in 
S3Hook
URL: https://github.com/apache/airflow/pull/7635
 
 
   Allow passing in the ACL header in the AWS S3 Hook
   
   Signed-off-by: Raymond Etornam 
   
   ---
   Issue link: WILL BE INSERTED BY 
[boring-cyborg](https://github.com/kaxil/boring-cyborg)
   
   Make sure to mark the boxes below before creating PR: [x]
   
   - [x] Description above provides context of the change
   - [x] Commit message/PR title starts with `[AIRFLOW-NNNN]`. AIRFLOW-NNNN = 
JIRA ID*
   - [x] Unit tests coverage for changes (not needed for documentation changes)
   - [x] Commits follow "[How to write a good git commit 
message](http://chris.beams.io/posts/git-commit/)"
   - [x] Relevant documentation is updated including usage instructions.
   - [x] I will engage committers as explained in [Contribution Workflow 
Example](https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst#contribution-workflow-example).
   
   * For document-only changes commit message can start with 
`[AIRFLOW-XXXX]`.
   
   ---
   In case of fundamental code change, Airflow Improvement Proposal 
([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals))
 is needed.
   In case of a new dependency, check compliance with the [ASF 3rd Party 
License Policy](https://www.apache.org/legal/resolved.html#category-x).
   In case of backwards incompatible changes please leave a note in 
[UPDATING.md](https://github.com/apache/airflow/blob/master/UPDATING.md).
   Read the [Pull Request 
Guidelines](https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst#pull-request-guidelines)
 for more information.
   
 

This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


> Allow ACL header in S3 Hook
> ---
>
> Key: AIRFLOW-5062
> URL: https://issues.apache.org/jira/browse/AIRFLOW-5062
> Project: Apache Airflow
>  Issue Type: New Feature
>  Components: aws
>Affects Versions: 1.10.3
>Reporter: Lester kim
>Assignee: Lester kim
>Priority: Critical
>
> Definition of Done
>  * Add ACL field in S3Hook when uploading a file



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[GitHub] [airflow] retornam opened a new pull request #7635: [AIRFLOW-5062] Allow ACL Header in S3Hook

2020-03-06 Thread GitBox
retornam opened a new pull request #7635: [AIRFLOW-5062]  Allow ACL Header in 
S3Hook
URL: https://github.com/apache/airflow/pull/7635
 
 
   Allow passing in the ACL header in the AWS S3 Hook
   
   Signed-off-by: Raymond Etornam 
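   
   For reference, boto3 accepts the ACL through `ExtraArgs` on uploads; a minimal 
standalone sketch of the underlying call (bucket, key, and file are placeholders):
   
   ```python
   import boto3
   
   s3 = boto3.client("s3")
   s3.upload_file(
       Filename="/tmp/data.csv",
       Bucket="example-bucket",
       Key="data/data.csv",
       # Grant the bucket owner full control over the uploaded object.
       ExtraArgs={"ACL": "bucket-owner-full-control"},
   )
   ```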
   
   ---
   Issue link: WILL BE INSERTED BY 
[boring-cyborg](https://github.com/kaxil/boring-cyborg)
   
   Make sure to mark the boxes below before creating PR: [x]
   
   - [x] Description above provides context of the change
   - [x] Commit message/PR title starts with `[AIRFLOW-NNNN]`. AIRFLOW-NNNN = 
JIRA ID*
   - [x] Unit tests coverage for changes (not needed for documentation changes)
   - [x] Commits follow "[How to write a good git commit 
message](http://chris.beams.io/posts/git-commit/)"
   - [x] Relevant documentation is updated including usage instructions.
   - [x] I will engage committers as explained in [Contribution Workflow 
Example](https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst#contribution-workflow-example).
   
   * For document-only changes commit message can start with 
`[AIRFLOW-XXXX]`.
   
   ---
   In case of fundamental code change, Airflow Improvement Proposal 
([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals))
 is needed.
   In case of a new dependency, check compliance with the [ASF 3rd Party 
License Policy](https://www.apache.org/legal/resolved.html#category-x).
   In case of backwards incompatible changes please leave a note in 
[UPDATING.md](https://github.com/apache/airflow/blob/master/UPDATING.md).
   Read the [Pull Request 
Guidelines](https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst#pull-request-guidelines)
 for more information.
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] kaxil commented on issue #6342: [AIRFLOW-5662] fix incorrect naming for scheduler used slot metric

2020-03-06 Thread GitBox
kaxil commented on issue #6342: [AIRFLOW-5662] fix incorrect naming for 
scheduler used slot metric
URL: https://github.com/apache/airflow/pull/6342#issuecomment-595885701
 
 
   Can we update the name of the PR, please? We do a lot more than just fix the 
name :) 


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] stale[bot] removed a comment on issue #6342: [AIRFLOW-5662] fix incorrect naming for scheduler used slot metric

2020-03-06 Thread GitBox
stale[bot] removed a comment on issue #6342: [AIRFLOW-5662] fix incorrect 
naming for scheduler used slot metric
URL: https://github.com/apache/airflow/pull/6342#issuecomment-569323488
 
 
   This issue has been automatically marked as stale because it has not had 
recent activity. It will be closed if no further activity occurs. Thank you for 
your contributions.
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] tooptoop4 commented on issue #6912: [AIRFLOW-6352] security - ui - add login timeout

2020-03-06 Thread GitBox
tooptoop4 commented on issue #6912: [AIRFLOW-6352] security - ui - add login 
timeout
URL: https://github.com/apache/airflow/pull/6912#issuecomment-595847453
 
 
   @mik-laj any idea how this works? These lines are missing from app.py in 
1.10.9:
   import datetime
   import flask
   import flask_login
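   
   For context, a minimal sketch of what those three imports enable, assuming 
the PR's approach of expiring the Flask session after a period of inactivity 
(the helper name and the 30-minute default are illustrative):
   ```python
   # Sketch only: assumes a session lifetime is wired into the Flask app.
   # Once the permanent session expires, flask_login treats the user as
   # logged out on the next request.
   import datetime

   import flask


   def init_login_timeout(app, minutes=30):
       app.config['PERMANENT_SESSION_LIFETIME'] = datetime.timedelta(minutes=minutes)

       @app.before_request
       def make_session_permanent():
           # Flask applies PERMANENT_SESSION_LIFETIME only to "permanent" sessions.
           flask.session.permanent = True
   ```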


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] mik-laj commented on a change in pull request #6342: [AIRFLOW-5662] fix incorrect naming for scheduler used slot metric

2020-03-06 Thread GitBox
mik-laj commented on a change in pull request #6342: [AIRFLOW-5662] fix 
incorrect naming for scheduler used slot metric
URL: https://github.com/apache/airflow/pull/6342#discussion_r388996128
 
 

 ##
 File path: tests/models/test_pool.py
 ##
 @@ -62,7 +62,20 @@ def test_open_slots(self):
         self.assertEqual(3, pool.open_slots())  # pylint: disable=no-value-for-parameter
         self.assertEqual(1, pool.used_slots())  # pylint: disable=no-value-for-parameter
         self.assertEqual(1, pool.queued_slots())  # pylint: disable=no-value-for-parameter
-        self.assertEqual(2, pool.occupied_slots())  # pylint: disable=no-value-for-parameter
+        self.assertEqual({
 
 Review comment:
   ```suggestion
           self.assertEqual(2, pool.occupied_slots())  # pylint: disable=no-value-for-parameter
           self.assertEqual({
   ```
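   
   For context, a toy model of the slot accounting these assertions exercise, 
assuming occupied slots are simply the used (running) plus queued slots (the 
relationship the numbers above imply):
   ```python
   # Toy sketch: pool size consistent with the assertions above.
   total = 5
   used, queued = 1, 1
   occupied = used + queued         # occupied_slots(): running + queued == 2
   open_slots = total - occupied    # open_slots(): 5 - 2 == 3
   assert (open_slots, used, queued, occupied) == (3, 1, 1, 2)
   ```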


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] mik-laj commented on a change in pull request #7549: [AIRFLOW-6929][DONT-MERGE] Add OpenAPI spec

2020-03-06 Thread GitBox
mik-laj commented on a change in pull request #7549: [AIRFLOW-6929][DONT-MERGE] 
Add OpenAPI spec
URL: https://github.com/apache/airflow/pull/7549#discussion_r388992741
 
 

 ##
 File path: openapi.yaml
 ##
 @@ -0,0 +1,2094 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+---
+openapi: 3.0.3
+
+info:
+  title: "Airflow API (Stable)"
+  description: |
+    Apache Airflow management API
+  version: '1.0.0'
+  license:
+    name: Apache 2.0
+    url: http://www.apache.org/licenses/LICENSE-2.0.html
+
+servers:
+  - url: /api/v1
+    description: Airfow Stable API
+
+paths:
+  # Database entities
+  /connections/:
+    get:
+      summary: Get all conneciton entry
+      tags: [Connection]
+      parameters:
+        - $ref: '#/components/parameters/PageLimit'
+        - $ref: '#/components/parameters/PageOffset'
+      responses:
+        '200':
+          description: List of connection entry
+          content:
+            application/json:
+              schema:
+                allOf:
+                  - $ref: '#/components/schemas/ConnectionCollection'
+                  - $ref: '#/components/schemas/CollectionInfo'
+        '401':
+          $ref: '#/components/responses/Unauthenticated'
+        '403':
+          $ref: '#/components/responses/PermissionDenied'
+
+    post:
+      summary: Create conneciton entry
+      tags: [Connection]
+      requestBody:
+        required: true
+        content:
+          application/json:
+            schema:
+              $ref: '#/components/schemas/Connection'
+      responses:
+        '200':
+          description: Successfull response
+          content:
+            application/json:
+              schema:
+                $ref: '#/components/schemas/Connection'
+        '400':
+          $ref: '#/components/responses/BadRequest'
+        '401':
+          $ref: '#/components/responses/Unauthenticated'
+        '403':
+          $ref: '#/components/responses/PermissionDenied'
+
+  /connections/{connection_id}:
+    parameters:
+      - $ref: '#/components/parameters/ConnectionID'
+
+    get:
+      summary: Get connection entry
+      tags: [Connection]
+      responses:
+        '200':
+          description: Successfull response
+          content:
+            application/json:
+              schema:
+                $ref: '#/components/schemas/Connection'
+        '401':
+          $ref: '#/components/responses/Unauthenticated'
+        '403':
+          $ref: '#/components/responses/PermissionDenied'
+        '404':
+          $ref: '#/components/responses/NotFound'
+
+    patch:
+      summary: Update a connection entry
+      tags: [Connection]
+      parameters:
+        - $ref: '#/components/parameters/UpdateMask'
+      requestBody:
+        required: true
+        content:
+          application/json:
+            schema:
+              $ref: '#/components/schemas/Connection'
+
+      responses:
+        '200':
+          description: Successfull response
+          content:
+            application/json:
+              schema:
+                $ref: '#/components/schemas/Connection'
+        '400':
+          $ref: '#/components/responses/BadRequest'
+        '401':
+          $ref: '#/components/responses/Unauthenticated'
+        '403':
+          $ref: '#/components/responses/PermissionDenied'
+        '404':
+          $ref: '#/components/responses/NotFound'
+
+    delete:
+      summary: Delete connection entry
+      tags: [Connection]
+      responses:
+        '204':
+          description: No content
+        '400':
+          $ref: '#/components/responses/BadRequest'
+        '401':
+          $ref: '#/components/responses/Unauthenticated'
+        '403':
+          $ref: '#/components/responses/PermissionDenied'
+
+  /dags/:
+    get:
+      summary: Get all DAGs
+      tags: [DAG]
+      parameters:
+        - $ref: '#/components/parameters/PageLimit'
+        - $ref: '#/components/parameters/PageOffset'
+      responses:
+        '200':
+          description: List of DAGs
+          content:
+            application/json:
+              schema:
+                allOf:
+                  - $ref: '#/components/schemas/DAGCollection'
+                  - $ref: 

[GitHub] [airflow] mik-laj commented on a change in pull request #7549: [AIRFLOW-6929][DONT-MERGE] Add OpenAPI spec

2020-03-06 Thread GitBox
mik-laj commented on a change in pull request #7549: [AIRFLOW-6929][DONT-MERGE] 
Add OpenAPI spec
URL: https://github.com/apache/airflow/pull/7549#discussion_r388992810
 
 

 ##
 File path: openapi.yaml
 ##
 @@ -0,0 +1,2094 @@

[GitHub] [airflow] mik-laj commented on a change in pull request #7549: [AIRFLOW-6929][DONT-MERGE] Add OpenAPI spec

2020-03-06 Thread GitBox
mik-laj commented on a change in pull request #7549: [AIRFLOW-6929][DONT-MERGE] 
Add OpenAPI spec
URL: https://github.com/apache/airflow/pull/7549#discussion_r388992440
 
 

 ##
 File path: openapi.yaml
 ##
 @@ -0,0 +1,2094 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+---
+openapi: 3.0.3
+
+info:
+  title: "Airflow API (Stable)"
+  description: |
+    Apache Airflow management API
+  version: '1.0.0'
+  license:
+    name: Apache 2.0
+    url: http://www.apache.org/licenses/LICENSE-2.0.html
+
+servers:
+  - url: /api/v1
+    description: Airfow Stable API
+
+paths:
+  # Database entities
+  /connections/:
 
 Review comment:
   I removed all trailing slashes.


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services
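
As a usage illustration for the spec excerpt above, a hedged sketch of paging 
through the connections endpoint; the `limit`/`offset` query names and the 
`connections` response key are assumptions, since the excerpt only shows 
$refs to the PageLimit/PageOffset parameters:
```python
# Sketch only: endpoint shape and JSON keys are assumptions based on the
# spec excerpt, not a confirmed API.
import requests

BASE_URL = "http://localhost:8080/api/v1"  # assumed deployment URL


def iter_connections(limit=25):
    offset = 0
    while True:
        resp = requests.get(BASE_URL + "/connections",
                            params={"limit": limit, "offset": offset})
        resp.raise_for_status()
        page = resp.json().get("connections", [])
        if not page:
            break
        yield from page
        offset += limit
```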


[GitHub] [airflow] mik-laj commented on a change in pull request #7549: [AIRFLOW-6929][DONT-MERGE] Add OpenAPI spec

2020-03-06 Thread GitBox
mik-laj commented on a change in pull request #7549: [AIRFLOW-6929][DONT-MERGE] 
Add OpenAPI spec
URL: https://github.com/apache/airflow/pull/7549#discussion_r388992847
 
 

 ##
 File path: openapi.yaml
 ##
 @@ -0,0 +1,2094 @@

[GitHub] [airflow] mik-laj commented on a change in pull request #7549: [AIRFLOW-6929][DONT-MERGE] Add OpenAPI spec

2020-03-06 Thread GitBox
mik-laj commented on a change in pull request #7549: [AIRFLOW-6929][DONT-MERGE] 
Add OpenAPI spec
URL: https://github.com/apache/airflow/pull/7549#discussion_r388992206
 
 

 ##
 File path: openapi.yaml
 ##
 @@ -0,0 +1,2094 @@

[GitHub] [airflow] mik-laj commented on a change in pull request #7549: [AIRFLOW-6929][DONT-MERGE] Add OpenAPI spec

2020-03-06 Thread GitBox
mik-laj commented on a change in pull request #7549: [AIRFLOW-6929][DONT-MERGE] 
Add OpenAPI spec
URL: https://github.com/apache/airflow/pull/7549#discussion_r388992043
 
 

 ##
 File path: openapi.yaml
 ##
 @@ -0,0 +1,2094 @@

[GitHub] [airflow] mik-laj commented on a change in pull request #7549: [AIRFLOW-6929][DONT-MERGE] Add OpenAPI spec

2020-03-06 Thread GitBox
mik-laj commented on a change in pull request #7549: [AIRFLOW-6929][DONT-MERGE] 
Add OpenAPI spec
URL: https://github.com/apache/airflow/pull/7549#discussion_r388991810
 
 

 ##
 File path: openapi.yaml
 ##
 @@ -0,0 +1,2094 @@

[GitHub] [airflow] mik-laj commented on a change in pull request #7549: [AIRFLOW-6929][DONT-MERGE] Add OpenAPI spec

2020-03-06 Thread GitBox
mik-laj commented on a change in pull request #7549: [AIRFLOW-6929][DONT-MERGE] 
Add OpenAPI spec
URL: https://github.com/apache/airflow/pull/7549#discussion_r388989806
 
 

 ##
 File path: openapi.yaml
 ##
 @@ -0,0 +1,2094 @@

[GitHub] [airflow] mik-laj commented on a change in pull request #7549: [AIRFLOW-6929][DONT-MERGE] Add OpenAPI spec

2020-03-06 Thread GitBox
mik-laj commented on a change in pull request #7549: [AIRFLOW-6929][DONT-MERGE] 
Add OpenAPI spec
URL: https://github.com/apache/airflow/pull/7549#discussion_r388987692
 
 

 ##
 File path: openapi.yaml
 ##
 @@ -0,0 +1,2094 @@

[GitHub] [airflow] mik-laj commented on a change in pull request #7549: [AIRFLOW-6929][DONT-MERGE] Add OpenAPI spec

2020-03-06 Thread GitBox
mik-laj commented on a change in pull request #7549: [AIRFLOW-6929][DONT-MERGE] 
Add OpenAPI spec
URL: https://github.com/apache/airflow/pull/7549#discussion_r388987518
 
 

 ##
 File path: openapi.yaml
 ##
 @@ -0,0 +1,2094 @@

[GitHub] [airflow] mik-laj commented on a change in pull request #7549: [AIRFLOW-6929][DONT-MERGE] Add OpenAPI spec

2020-03-06 Thread GitBox
mik-laj commented on a change in pull request #7549: [AIRFLOW-6929][DONT-MERGE] 
Add OpenAPI spec
URL: https://github.com/apache/airflow/pull/7549#discussion_r388987239
 
 

 ##
 File path: openapi.yaml
 ##
 @@ -0,0 +1,2094 @@

[GitHub] [airflow] msb217 commented on a change in pull request #7549: [AIRFLOW-6929][DONT-MERGE] Add OpenAPI spec

2020-03-06 Thread GitBox
msb217 commented on a change in pull request #7549: [AIRFLOW-6929][DONT-MERGE] 
Add OpenAPI spec
URL: https://github.com/apache/airflow/pull/7549#discussion_r388578794
 
 

 ##
 File path: openapi.yaml
 ##
 @@ -0,0 +1,2118 @@

[jira] [Created] (AIRFLOW-6996) Control KubernetesExecutor delete_worker_pods setting per task

2020-03-06 Thread Barend (Jira)
Barend created AIRFLOW-6996:
---

 Summary: Control KubernetesExecutor delete_worker_pods setting per 
task
 Key: AIRFLOW-6996
 URL: https://issues.apache.org/jira/browse/AIRFLOW-6996
 Project: Apache Airflow
  Issue Type: Improvement
  Components: executor-kubernetes
Affects Versions: 1.10.9, 2.0.0
Reporter: Barend
Assignee: Daniel Imberman


h3. Context

The global {{airflow.cfg}} defines a boolean flag that controls whether the 
KubernetesExecutor should delete worker pods:
{code:none}
[kubernetes]
# If True (default), worker pods will be deleted upon termination
delete_worker_pods = True
{code}
You generally want this flag to be {{True}}, unless diagnosing specific kinds 
of task launch problems.

It's currently an all-or-nothing setting that affects all tasks in all DAGs.
h3. Improvement

When including a task in a DAG, I have the option to provide executor 
configuration specifically for that task:
{code:python}
sometask = DummyOperator(
task_id="example",
executor_config={ "KubernetesExecutor": { "image": "..." }  }
)
{code}
This does not currently (v1.10.9) give me the option to override pod deletion 
for that specific task. I think this would be an improvement for two reasons 
(see the sketch below):
 # you're generally stopping deletion to debug the launch of a specific task, 
making this the tightest possible scope where you'd want to control this 
behaviour
 # you can control the setting without restarting Airflow, by reloading the DAG
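
A hypothetical sketch of what such an override could look like; the 
{{delete_worker_pods}} key inside {{executor_config}} does not exist in 
v1.10.9 and its name here is purely illustrative:
{code:python}
from airflow.operators.dummy_operator import DummyOperator

debug_task = DummyOperator(
    task_id="example",
    executor_config={
        "KubernetesExecutor": {
            "image": "...",
            # Hypothetical per-task override of the global
            # [kubernetes] delete_worker_pods = True setting.
            "delete_worker_pods": False,
        }
    },
)
{code}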



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[GitHub] [airflow] inytar commented on a change in pull request #7621: [AIRFLOW-6982] add native python exasol support

2020-03-06 Thread GitBox
inytar commented on a change in pull request #7621: [AIRFLOW-6982] add native 
python exasol support
URL: https://github.com/apache/airflow/pull/7621#discussion_r388968617
 
 

 ##
 File path: airflow/providers/exasol/hooks/exasol.py
 ##
 @@ -0,0 +1,179 @@
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+from contextlib import closing
+
+import pyexasol
+from past.builtins import basestring
+
+from airflow.hooks.dbapi_hook import DbApiHook
+
+
+class ExasolHook(DbApiHook):
+    """
+    Interact with Exasol.
+    You can specify the pyexasol ``compression``, ``encryption``, ``json_lib``
+    and ``client_name`` parameters in the extra field of your connection
+    as ``{"compression": True, "json_lib": "rapidjson", etc}``.
+    See `pyexasol reference
+    <https://github.com/badoo/pyexasol/blob/master/docs/REFERENCE.md>`_
+    for more details.
+    """
+    conn_name_attr = 'exasol_conn_id'
+    default_conn_name = 'exasol_default'
+    supports_autocommit = True
+
+    def __init__(self, *args, **kwargs):
+        super(ExasolHook, self).__init__(*args, **kwargs)
+        self.schema = kwargs.pop("schema", None)
+
+    def get_conn(self):
+        conn_id = getattr(self, self.conn_name_attr)
+        conn = self.get_connection(conn_id)
+        conn_args = dict(
+            dsn='%s:%s' % (conn.host, conn.port),
+            user=conn.login,
+            password=conn.password,
+            schema=self.schema or conn.schema)
+        # check for parameters in conn.extra
+        for arg_name, arg_val in conn.extra_dejson.items():
+            if arg_name in ['compression', 'encryption', 'json_lib', 'client_name']:
+                conn_args[arg_name] = arg_val
+
+        conn = pyexasol.connect(**conn_args)
+        return conn
+
+    def get_pandas_df(self, sql, parameters=None):
+        """
+        Executes the sql and returns a pandas dataframe.
+
+        :param sql: the sql statement to be executed (str) or a list of
+            sql statements to execute
+        :type sql: str or list
+        :param parameters: The parameters to render the SQL query with.
+        :type parameters: mapping or iterable
+        """
+        with closing(self.get_conn()) as conn:
+            return conn.export_to_pandas(sql, query_params=parameters)
+
+    def get_records(self, sql, parameters=None):
+        """
+        Executes the sql and returns a set of records.
+
+        :param sql: the sql statement to be executed (str) or a list of
+            sql statements to execute
+        :type sql: str or list
+        :param parameters: The parameters to render the SQL query with.
+        :type parameters: mapping or iterable
+        """
+        with closing(self.get_conn()) as conn:
+            with closing(conn.execute(sql, parameters)) as cur:
+                return cur.fetchall()
+
+    def get_first(self, sql, parameters=None):
+        """
+        Executes the sql and returns the first resulting row.
+
+        :param sql: the sql statement to be executed (str) or a list of
+            sql statements to execute
+        :type sql: str or list
+        :param parameters: The parameters to render the SQL query with.
+        :type parameters: mapping or iterable
+        """
+        with closing(self.get_conn()) as conn:
+            with closing(conn.execute(sql, parameters)) as cur:
+                return cur.fetchone()
+
+    def run(self, sql, autocommit=False, parameters=None):
+        """
+        Runs a command or a list of commands. Pass a list of sql
+        statements to the sql parameter to get them to execute
+        sequentially.
+
+        :param sql: the sql statement to be executed (str) or a list of
+            sql statements to execute
+        :type sql: str or list
+        :param autocommit: What to set the connection's autocommit setting to
+            before executing the query.
+        :type autocommit: bool
+        :param parameters: The parameters to render the SQL query with.
+        :type parameters: mapping or iterable
+        """
+        if isinstance(sql, basestring):
+            sql = [sql]
+
+        with closing(self.get_conn()) as conn:
+            if self.supports_autocommit:
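For orientation, a minimal usage sketch of the hook above (it assumes an Airflow connection named exasol_default whose Extra field carries the pyexasol options, e.g. {"compression": true}; the schema and SQL are illustrative):
```
from airflow.providers.exasol.hooks.exasol import ExasolHook

# host, port and credentials come from the "exasol_default" connection;
# whitelisted keys in its Extra field are forwarded to pyexasol.connect()
hook = ExasolHook(exasol_conn_id='exasol_default', schema='RETAIL')
rows = hook.get_records('SELECT 1')
hook.run('TRUNCATE TABLE staging_sales', autocommit=True)
```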

[jira] [Created] (AIRFLOW-6995) Recents tasks / DAG runs stuck loading

2020-03-06 Thread Philipp (Jira)
Philipp created AIRFLOW-6995:


 Summary: Recents tasks / DAG runs stuck loading
 Key: AIRFLOW-6995
 URL: https://issues.apache.org/jira/browse/AIRFLOW-6995
 Project: Apache Airflow
  Issue Type: Bug
  Components: DagRun
Affects Versions: 1.10.7
Reporter: Philipp


Hi,

we had just finished deploying most of our jobs when the "Recent Tasks" and "DAG 
Runs" columns stopped loading. The browser reports the following failing requests:
 * last_dagruns/dag_ids=
 * dag_stats/dag_ids=
 * task_stats/dag_ids= 

 
{code:java}
Bad Request
Request Line is too large (4295 > 4094)
{code}
 

 

Seems like Airflow has a hard time handling "too" many jobs with "too" long 
names after all?
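The 4094-byte limit in that error is gunicorn's default request-line size, and the failing URLs carry every visible dag_id in the query string. As a stopgap, assuming the webserver workers run under gunicorn, the limit can be raised through gunicorn's environment variable (a sketch, not a verified fix):
{code:bash}
# 16384 is an arbitrary ceiling comfortably above the 4295 bytes reported above
GUNICORN_CMD_ARGS="--limit-request-line 16384" airflow webserver
{code}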

 



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Updated] (AIRFLOW-6994) SparkSubmitOperator re launches spark driver even when original driver still running

2020-03-06 Thread t oo (Jira)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-6994?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

t oo updated AIRFLOW-6994:
--
Affects Version/s: (was: 1.10.6)
   1.10.8
   1.10.9

> SparkSubmitOperator re launches spark driver even when original driver still 
> running
> 
>
> Key: AIRFLOW-6994
> URL: https://issues.apache.org/jira/browse/AIRFLOW-6994
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: scheduler
>Affects Versions: 1.10.8, 1.10.9
>Reporter: t oo
>Assignee: t oo
>Priority: Major
>
> https://issues.apache.org/jira/browse/AIRFLOW-6229 introduced a bug
> Due to temporary network blip in connection to spark the state goes to 
> unknown (as no tags found in curl response) and forces retry
> fix in spark_submit_hook.py:
>   
> {code:java}
>     def _process_spark_status_log(self, itr):
>         """
>         parses the logs of the spark driver status query process
>
>         :param itr: An iterator which iterates over the input of the
>             subprocess
>         """
>         response_found = False
>         driver_found = False
>         # Consume the iterator
>         for line in itr:
>             line = line.strip()
>
>             if "submissionId" in line:
>                 response_found = True
>
>             # Check if the log line is about the driver status and extract
>             # the status.
>             if "driverState" in line:
>                 self._driver_status = line.split(' : ')[1] \
>                     .replace(',', '').replace('\"', '').strip()
>                 driver_found = True
>
>             self.log.debug("spark driver status log: {}".format(line))
>
>         if response_found and not driver_found:
>             self._driver_status = "UNKNOWN"
> {code}



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Updated] (AIRFLOW-6994) SparkSubmitOperator re launches spark driver even when original driver still running

2020-03-06 Thread t oo (Jira)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-6994?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

t oo updated AIRFLOW-6994:
--
Fix Version/s: (was: 1.10.8)

> SparkSubmitOperator re launches spark driver even when original driver still 
> running
> 
>
> Key: AIRFLOW-6994
> URL: https://issues.apache.org/jira/browse/AIRFLOW-6994
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: scheduler
>Affects Versions: 1.10.6
>Reporter: t oo
>Assignee: t oo
>Priority: Major
>
> https://issues.apache.org/jira/browse/AIRFLOW-6229 introduced a bug
> Due to temporary network blip in connection to spark the state goes to 
> unknown (as no tags found in curl response) and forces retry
> fix in spark_submit_hook.py:
>   
> {code:java}
>     def _process_spark_status_log(self, itr):
>         """
>         parses the logs of the spark driver status query process
>
>         :param itr: An iterator which iterates over the input of the
>             subprocess
>         """
>         response_found = False
>         driver_found = False
>         # Consume the iterator
>         for line in itr:
>             line = line.strip()
>
>             if "submissionId" in line:
>                 response_found = True
>
>             # Check if the log line is about the driver status and extract
>             # the status.
>             if "driverState" in line:
>                 self._driver_status = line.split(' : ')[1] \
>                     .replace(',', '').replace('\"', '').strip()
>                 driver_found = True
>
>             self.log.debug("spark driver status log: {}".format(line))
>
>         if response_found and not driver_found:
>             self._driver_status = "UNKNOWN"
> {code}



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Updated] (AIRFLOW-6994) SparkSubmitOperator re launches spark driver even when original driver still running

2020-03-06 Thread t oo (Jira)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-6994?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

t oo updated AIRFLOW-6994:
--
Description: 
https://issues.apache.org/jira/browse/AIRFLOW-6229 introduced a bug

Due to a temporary network blip in the connection to spark, the state goes to 
unknown (as no driverState tag is found in the curl response) and forces a retry.

fix in spark_submit_hook.py:

  
{code:java}
    def _process_spark_status_log(self, itr):
        """
        parses the logs of the spark driver status query process

        :param itr: An iterator which iterates over the input of the
            subprocess
        """
        response_found = False
        driver_found = False
        # Consume the iterator
        for line in itr:
            line = line.strip()

            if "submissionId" in line:
                response_found = True

            # Check if the log line is about the driver status and extract
            # the status.
            if "driverState" in line:
                self._driver_status = line.split(' : ')[1] \
                    .replace(',', '').replace('\"', '').strip()
                driver_found = True

            self.log.debug("spark driver status log: {}".format(line))

        if response_found and not driver_found:
            self._driver_status = "UNKNOWN"
{code}


  was:
You click 'release' on a new spark cluster while the prior spark cluster is 
still processing some spark submits from airflow. Airflow is then never able 
to finish the sparksubmit task, as it polls for status on the new spark 
cluster build, which can't find the status because the submit happened on the 
earlier spark cluster build, so the status loop goes on forever.

[https://github.com/apache/airflow/blob/1.10.6/airflow/contrib/hooks/spark_submit_hook.py#L446]

[https://github.com/apache/airflow/blob/1.10.6/airflow/contrib/hooks/spark_submit_hook.py#L489]

It loops forever if it can't find the driverState tag in the JSON response. 
Since the new build (pointed to by the released DNS name) doesn't know about 
the driver submitted on the previously released build, the second response 
below does not contain the driverState tag.

# response before clicking release on new build
[ec2-user@reda ~]$ curl http://dns:6066/v1/submissions/status/driver-20191202142207-
{
  "action" : "SubmissionStatusResponse",
  "driverState" : "RUNNING",
  "serverSparkVersion" : "2.3.4",
  "submissionId" : "driver-20191202142207-",
  "success" : true,
  "workerHostPort" : "reda:31489",
  "workerId" : "worker-20191202133526-reda-31489"
}

# response after clicking release on new build
[ec2-user@reda ~]$ curl http://dns:6066/v1/submissions/status/driver-20191202142207-
{
  "action" : "SubmissionStatusResponse",
  "serverSparkVersion" : "2.3.4",
  "submissionId" : "driver-20191202142207-",
  "success" : false
}

Definitely a defect in the current code. It can be fixed by modifying the 
_process_spark_status_log function to set the driver status to UNKNOWN if 
driverState is not in the response after iterating over all lines.

 


> SparkSubmitOperator re launches spark driver even when original driver still 
> running
> 
>
> Key: AIRFLOW-6994
> URL: https://issues.apache.org/jira/browse/AIRFLOW-6994
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: scheduler
>Affects Versions: 1.10.6
>Reporter: t oo
>Assignee: t oo
>Priority: Major
> Fix For: 1.10.8
>
>
> https://issues.apache.org/jira/browse/AIRFLOW-6229 introduced a bug
> Due to temporary network blip in connection to spark the state goes to 
> unknown (as no tags found in curl response) and forces retry
> fix in spark_submit_hook.py:
>   
> {code:java}
>     def _process_spark_status_log(self, itr):
>         """
>         parses the logs of the spark driver status query process
>
>         :param itr: An iterator which iterates over the input of the
>             subprocess
>         """
>         response_found = False
>         driver_found = False
>         # Consume the iterator
>         for line in itr:
>             line = line.strip()
>
>             if "submissionId" in line:
>                 response_found = True
>
>             # Check if the log line is about the driver status and extract
>             # the status.
>             if "driverState" in line:
>                 self._driver_status = line.split(' : ')[1] \
>                     .replace(',', '').replace('\"', '').strip()
>                 driver_found = True
>
>             self.log.debug("spark driver status log: {}".format(line))
>
>         if response_found and not driver_found:
>             self._driver_status = "UNKNOWN"
> {code}



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Created] (AIRFLOW-6994) SparkSubmitOperator re launches spark driver even when original driver still running

2020-03-06 Thread t oo (Jira)
t oo created AIRFLOW-6994:
-

 Summary: SparkSubmitOperator re launches spark driver even when 
original driver still running
 Key: AIRFLOW-6994
 URL: https://issues.apache.org/jira/browse/AIRFLOW-6994
 Project: Apache Airflow
  Issue Type: Bug
  Components: scheduler
Affects Versions: 1.10.6
Reporter: t oo
Assignee: t oo
 Fix For: 1.10.8


You click 'release' on a new spark cluster while the prior spark cluster is 
still processing some spark submits from airflow. Airflow is then never able 
to finish the sparksubmit task, as it polls for status on the new spark 
cluster build, which can't find the status because the submit happened on the 
earlier spark cluster build, so the status loop goes on forever.

[https://github.com/apache/airflow/blob/1.10.6/airflow/contrib/hooks/spark_submit_hook.py#L446]

[https://github.com/apache/airflow/blob/1.10.6/airflow/contrib/hooks/spark_submit_hook.py#L489]

It loops forever if it can't find the driverState tag in the JSON response. 
Since the new build (pointed to by the released DNS name) doesn't know about 
the driver submitted on the previously released build, the second response 
below does not contain the driverState tag.

# response before clicking release on new build
[ec2-user@reda ~]$ curl http://dns:6066/v1/submissions/status/driver-20191202142207-
{
  "action" : "SubmissionStatusResponse",
  "driverState" : "RUNNING",
  "serverSparkVersion" : "2.3.4",
  "submissionId" : "driver-20191202142207-",
  "success" : true,
  "workerHostPort" : "reda:31489",
  "workerId" : "worker-20191202133526-reda-31489"
}

# response after clicking release on new build
[ec2-user@reda ~]$ curl http://dns:6066/v1/submissions/status/driver-20191202142207-
{
  "action" : "SubmissionStatusResponse",
  "serverSparkVersion" : "2.3.4",
  "submissionId" : "driver-20191202142207-",
  "success" : false
}

Definitely a defect in the current code. It can be fixed by modifying the 
_process_spark_status_log function to set the driver status to UNKNOWN if 
driverState is not in the response after iterating over all lines.
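A quick sanity check of the proposed fix (a hypothetical harness; it feeds the second response above, line by line, into the patched method, and the connection id is illustrative):
{code:python}
hook = SparkSubmitHook(conn_id='spark_default')
status_lines = [
    '"action" : "SubmissionStatusResponse",',
    '"serverSparkVersion" : "2.3.4",',
    '"submissionId" : "driver-20191202142207-",',
    '"success" : false',
]
hook._process_spark_status_log(iter(status_lines))
assert hook._driver_status == "UNKNOWN"  # no driverState tag, so no endless polling
{code}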

 



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[GitHub] [airflow] mik-laj commented on a change in pull request #7621: [AIRFLOW-6982] add native python exasol support

2020-03-06 Thread GitBox
mik-laj commented on a change in pull request #7621: [AIRFLOW-6982] add native 
python exasol support
URL: https://github.com/apache/airflow/pull/7621#discussion_r388952345
 
 

 ##
 File path: airflow/providers/exasol/hooks/exasol.py
 ##
 @@ -0,0 +1,179 @@
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+from contextlib import closing
+
+import pyexasol
+from past.builtins import basestring
+
+from airflow.hooks.dbapi_hook import DbApiHook
+
+
+class ExasolHook(DbApiHook):
+    """
+    Interact with Exasol.
+    You can specify the pyexasol ``compression``, ``encryption``, ``json_lib``
+    and ``client_name`` parameters in the extra field of your connection
+    as ``{"compression": True, "json_lib": "rapidjson", etc}``.
+    See `pyexasol reference
+    <https://github.com/badoo/pyexasol/blob/master/docs/REFERENCE.md>`_
+    for more details.
+    """
+    conn_name_attr = 'exasol_conn_id'
+    default_conn_name = 'exasol_default'
+    supports_autocommit = True
+
+    def __init__(self, *args, **kwargs):
+        super(ExasolHook, self).__init__(*args, **kwargs)
+        self.schema = kwargs.pop("schema", None)
+
+    def get_conn(self):
+        conn_id = getattr(self, self.conn_name_attr)
+        conn = self.get_connection(conn_id)
+        conn_args = dict(
+            dsn='%s:%s' % (conn.host, conn.port),
+            user=conn.login,
+            password=conn.password,
+            schema=self.schema or conn.schema)
+        # check for parameters in conn.extra
+        for arg_name, arg_val in conn.extra_dejson.items():
+            if arg_name in ['compression', 'encryption', 'json_lib', 'client_name']:
+                conn_args[arg_name] = arg_val
+
+        conn = pyexasol.connect(**conn_args)
+        return conn
+
+    def get_pandas_df(self, sql, parameters=None):
+        """
+        Executes the sql and returns a pandas dataframe.
+
+        :param sql: the sql statement to be executed (str) or a list of
+            sql statements to execute
+        :type sql: str or list
+        :param parameters: The parameters to render the SQL query with.
+        :type parameters: mapping or iterable
+        """
+        with closing(self.get_conn()) as conn:
+            return conn.export_to_pandas(sql, query_params=parameters)
+
+    def get_records(self, sql, parameters=None):
+        """
+        Executes the sql and returns a set of records.
+
+        :param sql: the sql statement to be executed (str) or a list of
+            sql statements to execute
+        :type sql: str or list
+        :param parameters: The parameters to render the SQL query with.
+        :type parameters: mapping or iterable
+        """
+        with closing(self.get_conn()) as conn:
+            with closing(conn.execute(sql, parameters)) as cur:
+                return cur.fetchall()
+
+    def get_first(self, sql, parameters=None):
+        """
+        Executes the sql and returns the first resulting row.
+
+        :param sql: the sql statement to be executed (str) or a list of
+            sql statements to execute
+        :type sql: str or list
+        :param parameters: The parameters to render the SQL query with.
+        :type parameters: mapping or iterable
+        """
+        with closing(self.get_conn()) as conn:
+            with closing(conn.execute(sql, parameters)) as cur:
+                return cur.fetchone()
+
+    def run(self, sql, autocommit=False, parameters=None):
+        """
+        Runs a command or a list of commands. Pass a list of sql
+        statements to the sql parameter to get them to execute
+        sequentially.
+
+        :param sql: the sql statement to be executed (str) or a list of
+            sql statements to execute
+        :type sql: str or list
+        :param autocommit: What to set the connection's autocommit setting to
+            before executing the query.
+        :type autocommit: bool
+        :param parameters: The parameters to render the SQL query with.
+        :type parameters: mapping or iterable
+        """
+        if isinstance(sql, basestring):
+            sql = [sql]
+
+        with closing(self.get_conn()) as conn:
+            if self.supports_autocommit:

[GitHub] [airflow] inytar commented on a change in pull request #7621: [AIRFLOW-6982] add native python exasol support

2020-03-06 Thread GitBox
inytar commented on a change in pull request #7621: [AIRFLOW-6982] add native 
python exasol support
URL: https://github.com/apache/airflow/pull/7621#discussion_r388949599
 
 

 ##
 File path: airflow/providers/exasol/hooks/exasol.py
 ##
 @@ -0,0 +1,179 @@
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+from contextlib import closing
+
+import pyexasol
+from past.builtins import basestring
+
+from airflow.hooks.dbapi_hook import DbApiHook
+
+
+class ExasolHook(DbApiHook):
+    """
+    Interact with Exasol.
+    You can specify the pyexasol ``compression``, ``encryption``, ``json_lib``
+    and ``client_name`` parameters in the extra field of your connection
+    as ``{"compression": True, "json_lib": "rapidjson", etc}``.
+    See `pyexasol reference
+    <https://github.com/badoo/pyexasol/blob/master/docs/REFERENCE.md>`_
+    for more details.
+    """
+    conn_name_attr = 'exasol_conn_id'
+    default_conn_name = 'exasol_default'
+    supports_autocommit = True
+
+    def __init__(self, *args, **kwargs):
+        super(ExasolHook, self).__init__(*args, **kwargs)
+        self.schema = kwargs.pop("schema", None)
+
+    def get_conn(self):
+        conn_id = getattr(self, self.conn_name_attr)
+        conn = self.get_connection(conn_id)
+        conn_args = dict(
+            dsn='%s:%s' % (conn.host, conn.port),
+            user=conn.login,
+            password=conn.password,
+            schema=self.schema or conn.schema)
+        # check for parameters in conn.extra
+        for arg_name, arg_val in conn.extra_dejson.items():
+            if arg_name in ['compression', 'encryption', 'json_lib', 'client_name']:
+                conn_args[arg_name] = arg_val
+
+        conn = pyexasol.connect(**conn_args)
+        return conn
+
+    def get_pandas_df(self, sql, parameters=None):
+        """
+        Executes the sql and returns a pandas dataframe.
+
+        :param sql: the sql statement to be executed (str) or a list of
+            sql statements to execute
+        :type sql: str or list
+        :param parameters: The parameters to render the SQL query with.
+        :type parameters: mapping or iterable
+        """
+        with closing(self.get_conn()) as conn:
+            return conn.export_to_pandas(sql, query_params=parameters)
+
+    def get_records(self, sql, parameters=None):
+        """
+        Executes the sql and returns a set of records.
+
+        :param sql: the sql statement to be executed (str) or a list of
+            sql statements to execute
+        :type sql: str or list
+        :param parameters: The parameters to render the SQL query with.
+        :type parameters: mapping or iterable
+        """
+        with closing(self.get_conn()) as conn:
+            with closing(conn.execute(sql, parameters)) as cur:
+                return cur.fetchall()
+
+    def get_first(self, sql, parameters=None):
+        """
+        Executes the sql and returns the first resulting row.
+
+        :param sql: the sql statement to be executed (str) or a list of
+            sql statements to execute
+        :type sql: str or list
+        :param parameters: The parameters to render the SQL query with.
+        :type parameters: mapping or iterable
+        """
+        with closing(self.get_conn()) as conn:
+            with closing(conn.execute(sql, parameters)) as cur:
+                return cur.fetchone()
+
+    def run(self, sql, autocommit=False, parameters=None):
+        """
+        Runs a command or a list of commands. Pass a list of sql
+        statements to the sql parameter to get them to execute
+        sequentially.
+
+        :param sql: the sql statement to be executed (str) or a list of
+            sql statements to execute
+        :type sql: str or list
+        :param autocommit: What to set the connection's autocommit setting to
+            before executing the query.
+        :type autocommit: bool
+        :param parameters: The parameters to render the SQL query with.
+        :type parameters: mapping or iterable
+        """
+        if isinstance(sql, basestring):
+            sql = [sql]
+
+        with closing(self.get_conn()) as conn:
+            if self.supports_autocommit:

[jira] [Resolved] (AIRFLOW-6990) Improve system tests for Google Marketing Platform

2020-03-06 Thread Tomasz Urbaszek (Jira)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-6990?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Tomasz Urbaszek resolved AIRFLOW-6990.
--
Fix Version/s: 2.0.0
   Resolution: Done

> Improve system tests for Google Marketing Platform
> --
>
> Key: AIRFLOW-6990
> URL: https://issues.apache.org/jira/browse/AIRFLOW-6990
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: gcp, tests
>Affects Versions: 2.0.0
>Reporter: Tomasz Urbaszek
>Priority: Major
> Fix For: 2.0.0
>
>




--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Commented] (AIRFLOW-6990) Improve system tests for Google Marketing Platform

2020-03-06 Thread ASF subversion and git services (Jira)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-6990?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17053480#comment-17053480
 ] 

ASF subversion and git services commented on AIRFLOW-6990:
--

Commit 6b65038fb409ba1040e70305444816d8f5cfdc47 in airflow's branch 
refs/heads/master from Tomasz Urbaszek
[ https://gitbox.apache.org/repos/asf?p=airflow.git;h=6b65038 ]

[AIRFLOW-6990] Improve system tests for Google Marketing Platform (#7631)



> Improve system tests for Google Marketing Platform
> --
>
> Key: AIRFLOW-6990
> URL: https://issues.apache.org/jira/browse/AIRFLOW-6990
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: gcp, tests
>Affects Versions: 2.0.0
>Reporter: Tomasz Urbaszek
>Priority: Major
> Fix For: 2.0.0
>
>




--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Commented] (AIRFLOW-6990) Improve system tests for Google Marketing Platform

2020-03-06 Thread ASF GitHub Bot (Jira)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-6990?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17053478#comment-17053478
 ] 

ASF GitHub Bot commented on AIRFLOW-6990:
-

nuclearpinguin commented on pull request #7631: [AIRFLOW-6990] Improve system 
tests for Google Marketing Platform
URL: https://github.com/apache/airflow/pull/7631
 
 
   
 

This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


> Improve system tests for Google Marketing Platform
> --
>
> Key: AIRFLOW-6990
> URL: https://issues.apache.org/jira/browse/AIRFLOW-6990
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: gcp, tests
>Affects Versions: 2.0.0
>Reporter: Tomasz Urbaszek
>Priority: Major
>




--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[GitHub] [airflow] nuclearpinguin merged pull request #7631: [AIRFLOW-6990] Improve system tests for Google Marketing Platform

2020-03-06 Thread GitBox
nuclearpinguin merged pull request #7631: [AIRFLOW-6990] Improve system tests 
for Google Marketing Platform
URL: https://github.com/apache/airflow/pull/7631
 
 
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Resolved] (AIRFLOW-6984) Improve setup/teardown in SFTP-GCS system tests

2020-03-06 Thread Tomasz Urbaszek (Jira)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-6984?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Tomasz Urbaszek resolved AIRFLOW-6984.
--
Fix Version/s: 2.0.0
   Resolution: Done

> Improve setup/teardown in SFTP-GCS system tests
> ---
>
> Key: AIRFLOW-6984
> URL: https://issues.apache.org/jira/browse/AIRFLOW-6984
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: gcp, tests
>Affects Versions: 2.0.0
>Reporter: Tomasz Urbaszek
>Priority: Minor
> Fix For: 2.0.0
>
>




--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Commented] (AIRFLOW-6984) Improve setup/teardown in SFTP-GCS system tests

2020-03-06 Thread ASF GitHub Bot (Jira)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-6984?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17053465#comment-17053465
 ] 

ASF GitHub Bot commented on AIRFLOW-6984:
-

nuclearpinguin commented on pull request #7623: [AIRFLOW-6984] Improve 
setup/teardown in SFTP-GCS system tests
URL: https://github.com/apache/airflow/pull/7623
 
 
   
 

This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


> Improve setup/teardown in SFTP-GCS system tests
> ---
>
> Key: AIRFLOW-6984
> URL: https://issues.apache.org/jira/browse/AIRFLOW-6984
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: gcp, tests
>Affects Versions: 2.0.0
>Reporter: Tomasz Urbaszek
>Priority: Minor
>




--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Commented] (AIRFLOW-6984) Improve setup/teardown in SFTP-GCS system tests

2020-03-06 Thread ASF subversion and git services (Jira)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-6984?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17053466#comment-17053466
 ] 

ASF subversion and git services commented on AIRFLOW-6984:
--

Commit ff38e15776b31f69f49fbcb1afbb416f21976e62 in airflow's branch 
refs/heads/master from Tomasz Urbaszek
[ https://gitbox.apache.org/repos/asf?p=airflow.git;h=ff38e15 ]

[AIRFLOW-6984] Improve setup/teardown in SFTP-GCS system tests (#7623)



> Improve setup/teardown in SFTP-GCS system tests
> ---
>
> Key: AIRFLOW-6984
> URL: https://issues.apache.org/jira/browse/AIRFLOW-6984
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: gcp, tests
>Affects Versions: 2.0.0
>Reporter: Tomasz Urbaszek
>Priority: Minor
> Fix For: 2.0.0
>
>




--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[GitHub] [airflow] nuclearpinguin merged pull request #7623: [AIRFLOW-6984] Improve setup/teardown in SFTP-GCS system tests

2020-03-06 Thread GitBox
nuclearpinguin merged pull request #7623: [AIRFLOW-6984] Improve setup/teardown 
in SFTP-GCS system tests
URL: https://github.com/apache/airflow/pull/7623
 
 
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] nuclearpinguin commented on a change in pull request #7624: [AIRFLOW-6973] Make GCSCreateBucketOperator idempotent (fix)

2020-03-06 Thread GitBox
nuclearpinguin commented on a change in pull request #7624: [AIRFLOW-6973] Make 
GCSCreateBucketOperator idempotent (fix)
URL: https://github.com/apache/airflow/pull/7624#discussion_r388935245
 
 

 ##
 File path: airflow/providers/google/cloud/hooks/gcs.py
 ##
 @@ -450,7 +450,6 @@ def get_md5hash(self, bucket_name, object_name):
 self.log.info('The md5Hash of %s is %s', object_name, blob_md5hash)
 return blob_md5hash
 
-@CloudBaseHook.catch_http_exception
 
 Review comment:
   That's true, but the original exception was hard to catch, e.g.:
   ```
   try:
       create_smth()
   except AlreadyExists:
       get_smth()
   ```
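   Concretely, removing the decorator lets an operator stay idempotent with roughly this pattern (a sketch; the hook call and bucket name are illustrative):
   ```
   from google.api_core.exceptions import AlreadyExists

   try:
       hook.create_bucket(bucket_name='my-bucket')
   except AlreadyExists:
       # the bucket already exists; reuse it instead of failing the task
       pass
   ```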


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] mik-laj commented on a change in pull request #7621: [AIRFLOW-6982] add native python exasol support

2020-03-06 Thread GitBox
mik-laj commented on a change in pull request #7621: [AIRFLOW-6982] add native 
python exasol support
URL: https://github.com/apache/airflow/pull/7621#discussion_r388920638
 
 

 ##
 File path: airflow/providers/exasol/hooks/exasol.py
 ##
 @@ -0,0 +1,179 @@
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+from contextlib import closing
+
+import pyexasol
+from past.builtins import basestring
+
+from airflow.hooks.dbapi_hook import DbApiHook
+
+
+class ExasolHook(DbApiHook):
+    """
+    Interact with Exasol.
+    You can specify the pyexasol ``compression``, ``encryption``, ``json_lib``
+    and ``client_name`` parameters in the extra field of your connection
+    as ``{"compression": True, "json_lib": "rapidjson", etc}``.
+    See `pyexasol reference
+    <https://github.com/badoo/pyexasol/blob/master/docs/REFERENCE.md>`_
+    for more details.
+    """
+    conn_name_attr = 'exasol_conn_id'
+    default_conn_name = 'exasol_default'
+    supports_autocommit = True
+
+    def __init__(self, *args, **kwargs):
+        super(ExasolHook, self).__init__(*args, **kwargs)
+        self.schema = kwargs.pop("schema", None)
+
+    def get_conn(self):
+        conn_id = getattr(self, self.conn_name_attr)
+        conn = self.get_connection(conn_id)
+        conn_args = dict(
+            dsn='%s:%s' % (conn.host, conn.port),
+            user=conn.login,
+            password=conn.password,
+            schema=self.schema or conn.schema)
+        # check for parameters in conn.extra
+        for arg_name, arg_val in conn.extra_dejson.items():
+            if arg_name in ['compression', 'encryption', 'json_lib', 'client_name']:
+                conn_args[arg_name] = arg_val
+
+        conn = pyexasol.connect(**conn_args)
+        return conn
+
+    def get_pandas_df(self, sql, parameters=None):
+        """
+        Executes the sql and returns a pandas dataframe.
+
+        :param sql: the sql statement to be executed (str) or a list of
+            sql statements to execute
+        :type sql: str or list
+        :param parameters: The parameters to render the SQL query with.
+        :type parameters: mapping or iterable
+        """
+        with closing(self.get_conn()) as conn:
+            return conn.export_to_pandas(sql, query_params=parameters)
+
+    def get_records(self, sql, parameters=None):
+        """
+        Executes the sql and returns a set of records.
+
+        :param sql: the sql statement to be executed (str) or a list of
+            sql statements to execute
+        :type sql: str or list
+        :param parameters: The parameters to render the SQL query with.
+        :type parameters: mapping or iterable
+        """
+        with closing(self.get_conn()) as conn:
+            with closing(conn.execute(sql, parameters)) as cur:
+                return cur.fetchall()
+
+    def get_first(self, sql, parameters=None):
+        """
+        Executes the sql and returns the first resulting row.
+
+        :param sql: the sql statement to be executed (str) or a list of
+            sql statements to execute
+        :type sql: str or list
+        :param parameters: The parameters to render the SQL query with.
+        :type parameters: mapping or iterable
+        """
+        with closing(self.get_conn()) as conn:
+            with closing(conn.execute(sql, parameters)) as cur:
+                return cur.fetchone()
+
+    def run(self, sql, autocommit=False, parameters=None):
+        """
+        Runs a command or a list of commands. Pass a list of sql
+        statements to the sql parameter to get them to execute
+        sequentially.
+
+        :param sql: the sql statement to be executed (str) or a list of
+            sql statements to execute
+        :type sql: str or list
+        :param autocommit: What to set the connection's autocommit setting to
+            before executing the query.
+        :type autocommit: bool
+        :param parameters: The parameters to render the SQL query with.
+        :type parameters: mapping or iterable
+        """
+        if isinstance(sql, basestring):
+            sql = [sql]
+
+        with closing(self.get_conn()) as conn:
+            if self.supports_autocommit:

[GitHub] [airflow] boring-cyborg[bot] commented on issue #7634: [AIRFLOW-XXXX] fix retry in HttpHook retry_args

2020-03-06 Thread GitBox
boring-cyborg[bot] commented on issue #7634: [AIRFLOW-XXXX] fix retry in 
HttpHook retry_args
URL: https://github.com/apache/airflow/pull/7634#issuecomment-595779836
 
 
   Congratulations on your first Pull Request and welcome to the Apache Airflow 
community! If you have any issues or are unsure about anything, please check 
our Contribution Guide 
(https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst)
   Here are some useful points:
   - Pay attention to the quality of your code (flake8, pylint and type 
annotations). Our [pre-commits]( 
https://github.com/apache/airflow/blob/master/STATIC_CODE_CHECKS.rst#prerequisites-for-pre-commit-hooks)
 will help you with that.
   - In case of a new feature add useful documentation (in docstrings or in 
`docs/` directory). Adding a new operator? Check this short 
[guide](https://github.com/apache/airflow/blob/master/docs/howto/custom-operator.rst).
 Consider adding an example DAG that shows how users should use it.
   - Consider using [Breeze 
environment](https://github.com/apache/airflow/blob/master/BREEZE.rst) for 
testing locally, it’s a heavy docker but it ships with a working Airflow and a 
lot of integrations.
   - Be patient and persistent. It might take some time to get a review or get 
the final approval from Committers.
   - Be sure to read the [Airflow Coding style]( 
https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst#coding-style-and-best-practices).
   Apache Airflow is a community-driven project and together we are making it 
better.
   In case of doubts contact the developers at:
   Mailing List: d...@airflow.apache.org
   Slack: https://apache-airflow-slack.herokuapp.com/
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] lidalei opened a new pull request #7634: [AIRFLOW-XXXX] fix retry in HttpHook retry_args

2020-03-06 Thread GitBox
lidalei opened a new pull request #7634: [AIRFLOW-XXXX] fix retry in HttpHook 
retry_args
URL: https://github.com/apache/airflow/pull/7634
 
 
   `retry` should be of type `retry_if_exception_type` instead of a bare exception type
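   For context, a sketch of the intended usage after this fix (the endpoint and connection id are illustrative):
   ```
   import tenacity
   import requests

   from airflow.providers.http.hooks.http import HttpHook

   hook = HttpHook('GET', http_conn_id='http_default')
   retry_args = dict(
       wait=tenacity.wait_exponential(),
       stop=tenacity.stop_after_attempt(10),
       # the fix: a tenacity retry strategy object, not a bare exception class
       retry=tenacity.retry_if_exception_type(requests.exceptions.ConnectionError),
   )
   hook.run_with_advanced_retry(endpoint='v1/test', _retry_args=retry_args)
   ```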


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Commented] (AIRFLOW-3404) Add support for Amazon SES

2020-03-06 Thread ASF GitHub Bot (Jira)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-3404?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17053406#comment-17053406
 ] 

ASF GitHub Bot commented on AIRFLOW-3404:
-

stale[bot] commented on pull request #4245: [AIRFLOW-3404] Add Amazon SES 
support
URL: https://github.com/apache/airflow/pull/4245
 
 
   
 

This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


> Add support for Amazon SES
> --
>
> Key: AIRFLOW-3404
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3404
> Project: Apache Airflow
>  Issue Type: New Feature
>Reporter: Loic Antoine-Gombeaud
>Assignee: Loic Antoine-Gombeaud
>Priority: Major
>
> Sendgrid is currently implemented for e-mail sending, it would be great to 
> have SES as well



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[GitHub] [airflow] stale[bot] closed pull request #4245: [AIRFLOW-3404] Add Amazon SES support

2020-03-06 Thread GitBox
stale[bot] closed pull request #4245: [AIRFLOW-3404] Add Amazon SES support
URL: https://github.com/apache/airflow/pull/4245
 
 
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Updated] (AIRFLOW-6993) Region Specific SQS endpoint option

2020-03-06 Thread Umesh Kant (Jira)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-6993?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Umesh Kant updated AIRFLOW-6993:

Description: 
Hi,

We need to use a region-specific endpoint for SQS in our Airflow-Celery setup 
and do not see a way to configure it through the existing 'broker_url' or 
'celery_broker_transport_options' settings. Can we please add an option to 
configure it so that we can route these requests to a VPC endpoint rather than 
over the public network.

Thanks,

Umesh

  was:
Hi,

We need to use region specific endpoint for SQS for Airflow-Celery setup and do 
not see an option to configure it. Existing configurations 'broker_url' or 
'celery_broker_transport_options'. Can we please add an option to configure it 
so that we can route these requests to VPC endpoint rather than going to 
network.

Thanks,

Umesh


> Region Specific SQS endpoint option
> ---
>
> Key: AIRFLOW-6993
> URL: https://issues.apache.org/jira/browse/AIRFLOW-6993
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: celery
>Affects Versions: 1.10.3
>Reporter: Umesh Kant
>Priority: Major
>
> Hi,
> We need to use region specific endpoint for SQS for Airflow-Celery setup and 
> do not see an option to configure it using existing configurations 
> 'broker_url' or 'celery_broker_transport_options'. Can we please add an 
> option to configure it so that we can route these requests to VPC endpoint 
> rather than going to network.
> Thanks,
> Umesh



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[GitHub] [airflow] inytar commented on issue #7621: [AIRFLOW-6982] add native python exasol support

2020-03-06 Thread GitBox
inytar commented on issue #7621: [AIRFLOW-6982] add native python exasol support
URL: https://github.com/apache/airflow/pull/7621#issuecomment-595756254
 
 
   I've looked into the lint & test issues. I've got Jenkins almost happy, 
except for one warning in the build-docs. I get the following message:
   ```
   /opt/airflow/docs/_api/airflow/providers/exasol/hooks/exasol/index.rst:122: 
WARNING: Field list ends without a blank line; unexpected unindent.
   ```
   I'm assuming it has to do with the doc-strings in 
[`airflow/providers/exasol/hooks/exasol.py`](https://github.com/apache/airflow/pull/7621/files#diff-d4d9616d2fe28929ca431e540d4b86e9),
 but I can't find what it is. I'd love some input.
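   One common trigger for that warning (an assumption, not verified against this PR) is a wrapped ``:param:`` line in a docstring whose continuation isn't indented, so Sphinx thinks the field list ended early. The fix is to indent the continuation, e.g.:
   ```
   :param sql: the sql statement to be executed (str) or a list of
       sql statements to execute
   :type sql: str or list
   ```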


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Created] (AIRFLOW-6993) Region Specific SQS endpoint option

2020-03-06 Thread Umesh Kant (Jira)
Umesh Kant created AIRFLOW-6993:
---

 Summary: Region Specific SQS endpoint option
 Key: AIRFLOW-6993
 URL: https://issues.apache.org/jira/browse/AIRFLOW-6993
 Project: Apache Airflow
  Issue Type: Improvement
  Components: celery
Affects Versions: 1.10.3
Reporter: Umesh Kant


Hi,

We need to use region specific endpoint for SQS for Airflow-Celery setup and do 
not see an option to configure it. Existing configurations 'broker_url' or 
'celery_broker_transport_options'. Can we please add an option to configure it 
so that we can route these requests to VPC endpoint rather than going to 
network.

Thanks,

Umesh



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[GitHub] [airflow] michalslowikowski00 commented on a change in pull request #7630: [AIRFLOW-6724] Add Google Analytics 360 Accounts Retrieve Operator

2020-03-06 Thread GitBox
michalslowikowski00 commented on a change in pull request #7630: [AIRFLOW-6724] 
Add Google Analytics 360 Accounts Retrieve Operator
URL: https://github.com/apache/airflow/pull/7630#discussion_r388868149
 
 

 ##
 File path: airflow/providers/google/marketing_platform/hooks/analytics.py
 ##
 @@ -0,0 +1,87 @@
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+
+from googleapiclient.discovery import Resource, build
+
+from airflow.providers.google.cloud.hooks.base import CloudBaseHook
+
+
+class GoogleAnalyticsHook(CloudBaseHook):
+    """
+    Hook for Google Analytics 360.
+    """
+
+    def __init__(
+        self,
+        api_version: str = "v3",
+        gcp_connection_id: str = "google_cloud_default",
+        *args,
+        **kwargs
+    ):
+        super().__init__(*args, **kwargs)
+        self.api_version = api_version
+        self.gcp_connection_id = gcp_connection_id
+        self._conn = None
+
+    def get_conn(self) -> Resource:
+        """
+        Retrieves connection to Google Analytics 360.
+        """
+        if not self._conn:
+            http_authorized = self._authorize()
+            self._conn = build(
+                "analytics",
+                self.api_version,
+                http=http_authorized,
+                cache_discovery=False,
+            )
+        return self._conn
+
+    # pylint: disable=unused-argument
+    def list_accounts(self, max_results: int, start_index: int) -> list:
+        """
+        Lists accounts from Google Analytics 360.
+
+        :param max_results: The maximum number of accounts to include in this
+            response.
+        :type max_results: int
+        :param start_index: An index of the first account to retrieve.
+            Use this parameter as a pagination mechanism along with the
+            max-results parameter.
 
 Review comment:
   Thank you :)
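   For illustration, a hypothetical pagination loop over ``list_accounts`` (this assumes the method returns just the current page of account items, which the truncated body above does not confirm):
   ```
   hook = GoogleAnalyticsHook(api_version="v3")

   accounts, start_index, page_size = [], 1, 100
   while True:
       page = hook.list_accounts(max_results=page_size, start_index=start_index)
       accounts.extend(page)
       if len(page) < page_size:  # a short page means the last page was reached
           break
       start_index += page_size
   ```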


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] mik-laj commented on a change in pull request #7630: [AIRFLOW-6724] Add Google Analytics 360 Accounts Retrieve Operator

2020-03-06 Thread GitBox
mik-laj commented on a change in pull request #7630: [AIRFLOW-6724] Add Google 
Analytics 360 Accounts Retrieve Operator
URL: https://github.com/apache/airflow/pull/7630#discussion_r388867299
 
 

 ##
 File path: airflow/providers/google/marketing_platform/hooks/analytics.py
 ##
 @@ -0,0 +1,87 @@
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+
+from googleapiclient.discovery import Resource, build
+
+from airflow.providers.google.cloud.hooks.base import CloudBaseHook
+
+
+class GoogleAnalyticsHook(CloudBaseHook):
+    """
+    Hook for Google Analytics 360.
+    """
+
+    def __init__(
+        self,
+        api_version: str = "v3",
+        gcp_connection_id: str = "google_cloud_default",
+        *args,
+        **kwargs
+    ):
+        super().__init__(*args, **kwargs)
+        self.api_version = api_version
+        self.gcp_connection_id = gcp_connection_id
+        self._conn = None
+
+    def get_conn(self) -> Resource:
+        """
+        Retrieves connection to Google Analytics 360.
+        """
+        if not self._conn:
+            http_authorized = self._authorize()
+            self._conn = build(
+                "analytics",
+                self.api_version,
+                http=http_authorized,
+                cache_discovery=False,
+            )
+        return self._conn
+
+    # pylint: disable=unused-argument
+    def list_accounts(self, max_results: int, start_index: int) -> list:
+        """
+        Lists accounts from Google Analytics 360.
+
+        :param max_results: The maximum number of accounts to include in this
+            response.
+        :type max_results: int
+        :param start_index: An index of the first account to retrieve.
+            Use this parameter as a pagination mechanism along with the
+            max-results parameter.
 
 Review comment:
   ```suggestion
   Use this parameter as a pagination mechanism along with the 
max-results parameter.
   ```


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Commented] (AIRFLOW-3097) Capability for nested SubDags

2020-03-06 Thread Juan RODRIGUEZ (Jira)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-3097?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17053285#comment-17053285
 ] 

Juan RODRIGUEZ commented on AIRFLOW-3097:
-

Hi [~brooked]!

I've been testing nested subdags with a depth of 2, and I have performance issues:
 * I create 50 subdags
 * some of them take 10 min to just print something
 * others fail randomly and don't create a log file.

Did you have any performance problems using subdags? Do you have a specific 
configuration (airflow, workers, subdag operator, scheduler ...) to enhance DAG 
execution performance?

Best Regards

Diego

> Capability for nested SubDags
> -
>
> Key: AIRFLOW-3097
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3097
> Project: Apache Airflow
>  Issue Type: New Feature
>  Components: operators
>Affects Versions: 1.8.0
>Reporter: John Longo
>Priority: Critical
>  Labels: subdag
>
> Unless I'm doing something incorrectly, it appears that you cannot nest 
> SubDags which would be a very helpful feature.  I've created a simple 
> pipeline to demonstrate the failure case below.  It produces the following in 
> Airflow:  Broken DAG: [/home/airflow/airflow/dags/test_dag.py] 'NoneType' 
> object has no attribute 'dag_id' 
> test_dag.py
> {code:java}
> from airflow import DAG
> from airflow.operators.subdag_operator import SubDagOperator
> from airflow.operators.dummy_operator import DummyOperator
> import datetime
> from datetime import timedelta
> from test_subdag1 import TestSubDag1
>
> startDate = '2018-09-20'
>
> default_args = {
>     'owner': 'airflow',
>     'depends_on_past': False,
>     'email': ['em...@airflow.com'],
>     'start_date': datetime.datetime(2018, 3, 20, 9, 0),
>     'email_on_failure': False,
>     'email_on_retry': False,
>     'retries': 5,
>     'retry_delay': timedelta(seconds=30),
>     'run_as_user': 'airflow'
> }
>
> Test_DAG = DAG('Test_DAG', default_args=default_args,
>                start_date=datetime.datetime(2018, 3, 20, 9, 0),
>                schedule_interval=None, catchup=False)
>
> test_subdag1 = SubDagOperator(
>     subdag=TestSubDag1('Test_DAG', 'test_subdag1', startDate),
>     task_id='test_subdag1',
>     dag=Test_DAG)
>
> TestDagConsolidateTask = DummyOperator(task_id='TestDag_Consolidate',
>                                        dag=Test_DAG)
>
> test_subdag1 >> TestDagConsolidateTask
> {code}
> test_subdag1.py
> {code:java}
> from airflow import DAG
> from airflow.operators.subdag_operator import SubDagOperator
> from airflow.operators.dummy_operator import DummyOperator
> from test_subdag2 import TestSubDag2
> import datetime
> from datetime import timedelta
>
> def TestSubDag1(parent_dag_name, child_dag_name, startDate):
>     subdag = DAG(
>         '%s.%s' % (parent_dag_name, child_dag_name),
>         schedule_interval=None,
>         start_date=startDate)
>
>     test_subdag2 = SubDagOperator(
>         subdag=TestSubDag2('%s.%s' % (parent_dag_name, child_dag_name),
>                            'test_subdag2', startDate),
>         task_id='test_subdag2',
>         dag=subdag)
>
>     Subdag1ConsolidateTask = DummyOperator(task_id='Subdag1_Consolidate',
>                                            dag=subdag)
>
>     test_subdag2 >> Subdag1ConsolidateTask
> {code}
>  
> test_subdag2.py
> {code:java}
> from airflow import DAG
> from airflow.operators.dummy_operator import DummyOperator
> import datetime
> from datetime import timedelta
>
> def TestSubDag2(parent_dag_name, child_dag_name, startDate):
>     subdag = DAG(
>         '%s.%s' % (parent_dag_name, child_dag_name),
>         schedule_interval=None,
>         start_date=startDate)
>
>     TestTask = DummyOperator(task_id='TestTask', dag=subdag)
>     Subdag2ConsolidateTask = DummyOperator(task_id='Subdag2_Consolidate',
>                                            dag=subdag)
>
>     TestTask >> Subdag2ConsolidateTask
> {code}
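A possible explanation (an assumption, not confirmed in this thread): the factory functions above never return the sub-DAG they build, so SubDagOperator receives None, which matches the reported "'NoneType' object has no attribute 'dag_id'" error exactly. A corrected factory would end with a return, e.g.:

{code:python}
def TestSubDag2(parent_dag_name, child_dag_name, startDate):
    subdag = DAG(
        '%s.%s' % (parent_dag_name, child_dag_name),
        schedule_interval=None,
        start_date=startDate)

    TestTask = DummyOperator(task_id='TestTask', dag=subdag)
    Subdag2ConsolidateTask = DummyOperator(task_id='Subdag2_Consolidate', dag=subdag)
    TestTask >> Subdag2ConsolidateTask

    return subdag  # without this, the factory returns None
{code}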



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Comment Edited] (AIRFLOW-3097) Capability for nested SubDags

2020-03-06 Thread Juan RODRIGUEZ (Jira)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-3097?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17053285#comment-17053285
 ] 

Juan RODRIGUEZ edited comment on AIRFLOW-3097 at 3/6/20, 10:45 AM:
---

Hi [~brooked]!

I've been testing nested subdags with a depth of 2, and I have performance issues:
 * I create 50 subdags
 * some of them take 10 min to just print something
 * others fail randomly and don't create a log file.

Did you have any performance problems using subdags? Do you have a specific 
configuration (airflow, workers, subdag operator, scheduler ...) to enhance DAG 
execution performance?

Best Regards


was (Author: jdrodriguez):
Hi [~brooked]!

I've been testing nested subdags with depth of 2, and I have perfomance issue :
 * I create 50 subdags
 * some of them takes 10 min to just print something
 * others fail randomly and don't create log file.

Did you have some perfomance problems using subdags? do you have a specific 
configuration (arflow, workers, subdag operator, scheduler ... ) to enhance dag 
execution performance?

Best Regards

Diego

> Capability for nested SubDags
> -
>
> Key: AIRFLOW-3097
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3097
> Project: Apache Airflow
>  Issue Type: New Feature
>  Components: operators
>Affects Versions: 1.8.0
>Reporter: John Longo
>Priority: Critical
>  Labels: subdag
>
> Unless I'm doing something incorrectly, it appears that you cannot nest 
> SubDags which would be a very helpful feature.  I've created a simple 
> pipeline to demonstrate the failure case below.  It produces the following in 
> Airflow:  Broken DAG: [/home/airflow/airflow/dags/test_dag.py] 'NoneType' 
> object has no attribute 'dag_id' 
> test_dag.py
> {code:java}
> from airflow import DAG
> from airflow.operators.subdag_operator import SubDagOperator
> from airflow.operators.dummy_operator import DummyOperator
> import datetime
> from datetime import timedelta
> from test_subdag1 import TestSubDag1
>
> startDate = '2018-09-20'
>
> default_args = {
>     'owner': 'airflow',
>     'depends_on_past': False,
>     'email': ['em...@airflow.com'],
>     'start_date': datetime.datetime(2018, 3, 20, 9, 0),
>     'email_on_failure': False,
>     'email_on_retry': False,
>     'retries': 5,
>     'retry_delay': timedelta(seconds=30),
>     'run_as_user': 'airflow'
> }
>
> Test_DAG = DAG('Test_DAG', default_args=default_args,
>                start_date=datetime.datetime(2018, 3, 20, 9, 0),
>                schedule_interval=None, catchup=False)
>
> test_subdag1 = SubDagOperator(
>     subdag=TestSubDag1('Test_DAG', 'test_subdag1', startDate),
>     task_id='test_subdag1',
>     dag=Test_DAG)
>
> TestDagConsolidateTask = DummyOperator(task_id='TestDag_Consolidate',
>                                        dag=Test_DAG)
>
> test_subdag1 >> TestDagConsolidateTask
> {code}
> test_subdag1.py
> {code:java}
> from airflow import DAG
> from airflow.operators.subdag_operator import SubDagOperator
> from airflow.operators.dummy_operator import DummyOperator
> from test_subdag2 import TestSubDag2
> import datetime
> from datetime import timedelta
>
> def TestSubDag1(parent_dag_name, child_dag_name, startDate):
>     subdag = DAG(
>         '%s.%s' % (parent_dag_name, child_dag_name),
>         schedule_interval=None,
>         start_date=startDate)
>
>     test_subdag2 = SubDagOperator(
>         subdag=TestSubDag2('%s.%s' % (parent_dag_name, child_dag_name),
>                            'test_subdag2', startDate),
>         task_id='test_subdag2',
>         dag=subdag)
>
>     Subdag1ConsolidateTask = DummyOperator(task_id='Subdag1_Consolidate',
>                                            dag=subdag)
>
>     test_subdag2 >> Subdag1ConsolidateTask
> {code}
>  
> test_subdag2.py
> {code:java}
> from airflow import DAG
> from airflow.operators.dummy_operator import DummyOperator
> import datetime
> from datetime import timedelta
>
> def TestSubDag2(parent_dag_name, child_dag_name, startDate):
>     subdag = DAG(
>         '%s.%s' % (parent_dag_name, child_dag_name),
>         schedule_interval=None,
>         start_date=startDate)
>
>     TestTask = DummyOperator(task_id='TestTask', dag=subdag)
>     Subdag2ConsolidateTask = DummyOperator(task_id='Subdag2_Consolidate',
>                                            dag=subdag)
>
>     TestTask >> Subdag2ConsolidateTask
> {code}



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Updated] (AIRFLOW-6992) Add MongoToGCSOperator

2020-03-06 Thread Tania Batieva (Jira)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-6992?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Tania Batieva updated AIRFLOW-6992:
---
Description: Add MongoToGCSOperator similar to 
[mssql_to_gcs|https://github.com/apache/airflow/blob/master/airflow/providers/google/cloud/operators/mssql_to_gcs.py]
 or [mongo_to_s3|https://github.com/apache/airflow/blob/master/airflow/providers/amazon/aws/operators/mongo_to_s3.py]
 to export data from Mongo collections to Google Cloud Storage.  (was: Add 
MongoToGSCOperator similar to 
[mssql_to_gcs|[https://github.com/apache/airflow/blob/master/airflow/providers/google/cloud/operators/mssql_to_gcs.py]
 or 
[mongo_to_s3|[https://github.com/apache/airflow/blob/master/airflow/providers/amazon/aws/operators/mongo_to_s3.py])

> Add MongoToGCSOperator
> --
>
> Key: AIRFLOW-6992
> URL: https://issues.apache.org/jira/browse/AIRFLOW-6992
> Project: Apache Airflow
>  Issue Type: New Feature
>  Components: contrib, operators
>Affects Versions: 1.10.9
>Reporter: Tania Batieva
>Assignee: Tania Batieva
>Priority: Minor
>  Labels: gsoc, outreachy2020
>
> Add MongoToGSCOperator similar to 
> [mssql_to_gcs|https://github.com/apache/airflow/blob/master/airflow/providers/google/cloud/operators/mssql_to_gcs.py]
>  or [mongo_to_s3 
> |https://github.com/apache/airflow/blob/master/airflow/providers/amazon/aws/operators/mongo_to_s3.py]to
>  export data from Mongo collections to Google Cloud Storage.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Updated] (AIRFLOW-6992) Add MongoToGCSOperator

2020-03-06 Thread Tania Batieva (Jira)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-6992?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Tania Batieva updated AIRFLOW-6992:
---
Description: Add MongoToGSCOperator similar to [mssql_to_gcs | 
[https://github.com/apache/airflow/blob/master/airflow/providers/google/cloud/operators/mssql_to_gcs.py]]
 or 
[mongo_to_s3|[https://github.com/apache/airflow/blob/master/airflow/providers/amazon/aws/operators/mongo_to_s3.py]]
  (was: Add MongoToGSCOperator similar to 
[mssql_to_gcs|[https://github.com/apache/airflow/blob/master/airflow/providers/google/cloud/operators/mssql_to_gcs.py]]
 or 
[mongo_to_s3|[https://github.com/apache/airflow/blob/master/airflow/providers/amazon/aws/operators/mongo_to_s3.py]])

> Add MongoToGCSOperator
> --
>
> Key: AIRFLOW-6992
> URL: https://issues.apache.org/jira/browse/AIRFLOW-6992
> Project: Apache Airflow
>  Issue Type: New Feature
>  Components: contrib, operators
>Affects Versions: 1.10.9
>Reporter: Tania Batieva
>Assignee: Tania Batieva
>Priority: Minor
>  Labels: gsoc, outreachy2020
>
> Add MongoToGSCOperator similar to [mssql_to_gcs | 
> [https://github.com/apache/airflow/blob/master/airflow/providers/google/cloud/operators/mssql_to_gcs.py]]
>  or 
> [mongo_to_s3|[https://github.com/apache/airflow/blob/master/airflow/providers/amazon/aws/operators/mongo_to_s3.py]]



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Created] (AIRFLOW-6992) Add MongoToGCSOperator

2020-03-06 Thread Tania Batieva (Jira)
Tania Batieva created AIRFLOW-6992:
--

 Summary: Add MongoToGCSOperator
 Key: AIRFLOW-6992
 URL: https://issues.apache.org/jira/browse/AIRFLOW-6992
 Project: Apache Airflow
  Issue Type: New Feature
  Components: contrib, operators
Affects Versions: 1.10.9
Reporter: Tania Batieva
Assignee: Tania Batieva


Add MongoToGCSOperator similar to 
[mssql_to_gcs|https://github.com/apache/airflow/blob/master/airflow/providers/google/cloud/operators/mssql_to_gcs.py]
 or 
[mongo_to_s3|https://github.com/apache/airflow/blob/master/airflow/providers/amazon/aws/operators/mongo_to_s3.py]
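
For context, a hedged usage sketch of how such an operator might be wired 
into a DAG, assuming the hypothetical signature sketched earlier in this 
thread (the import path and all names below are placeholders):

{code:python}
from datetime import datetime

from airflow import DAG

# Hypothetical import path; the operator is not yet part of Airflow.
from mongo_to_gcs import MongoToGCSOperator

with DAG(dag_id='mongo_export',
         start_date=datetime(2020, 3, 1),
         schedule_interval='@daily') as dag:
    export_events = MongoToGCSOperator(
        task_id='export_events',
        mongo_collection='events',
        bucket='my-data-lake',                  # illustrative bucket name
        filename='mongo/events/{{ ds }}.json',  # templated per-run object
    )
{code}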



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Updated] (AIRFLOW-6992) Add MongoToGCSOperator

2020-03-06 Thread Tania Batieva (Jira)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-6992?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Tania Batieva updated AIRFLOW-6992:
---
Description: Add MongoToGCSOperator similar to 
[mssql_to_gcs|https://github.com/apache/airflow/blob/master/airflow/providers/google/cloud/operators/mssql_to_gcs.py]
 or 
[mongo_to_s3|https://github.com/apache/airflow/blob/master/airflow/providers/amazon/aws/operators/mongo_to_s3.py]
  (was: Add MongoToGCSOperator similar to 
[mssql_to_gcs|[https://github.com/apache/airflow/blob/master/airflow/providers/google/cloud/operators/mssql_to_gcs.py]]
 or 
[mongo_to_s3|[https://github.com/apache/airflow/blob/master/airflow/providers/amazon/aws/operators/mongo_to_s3.py]])

> Add MongoToGCSOperator
> --
>
> Key: AIRFLOW-6992
> URL: https://issues.apache.org/jira/browse/AIRFLOW-6992
> Project: Apache Airflow
>  Issue Type: New Feature
>  Components: contrib, operators
>Affects Versions: 1.10.9
>Reporter: Tania Batieva
>Assignee: Tania Batieva
>Priority: Minor
>  Labels: gsoc, outreachy2020
>
> Add MongoToGCSOperator similar to 
> [mssql_to_gcs|https://github.com/apache/airflow/blob/master/airflow/providers/google/cloud/operators/mssql_to_gcs.py]
>  or 
> [mongo_to_s3|https://github.com/apache/airflow/blob/master/airflow/providers/amazon/aws/operators/mongo_to_s3.py]



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Updated] (AIRFLOW-6992) Add MongoToGCSOperator

2020-03-06 Thread Tania Batieva (Jira)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-6992?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Tania Batieva updated AIRFLOW-6992:
---
Description: Add MongoToGCSOperator similar to 
[mssql_to_gcs|https://github.com/apache/airflow/blob/master/airflow/providers/google/cloud/operators/mssql_to_gcs.py]
 or 
[mongo_to_s3|https://github.com/apache/airflow/blob/master/airflow/providers/amazon/aws/operators/mongo_to_s3.py]
  (was: Add MongoToGCSOperator similar to [mssql_to_gcs | 
[https://github.com/apache/airflow/blob/master/airflow/providers/google/cloud/operators/mssql_to_gcs.py]]
 or 
[mongo_to_s3|[https://github.com/apache/airflow/blob/master/airflow/providers/amazon/aws/operators/mongo_to_s3.py]])

> Add MongoToGCSOperator
> --
>
> Key: AIRFLOW-6992
> URL: https://issues.apache.org/jira/browse/AIRFLOW-6992
> Project: Apache Airflow
>  Issue Type: New Feature
>  Components: contrib, operators
>Affects Versions: 1.10.9
>Reporter: Tania Batieva
>Assignee: Tania Batieva
>Priority: Minor
>  Labels: gsoc, outreachy2020
>
> Add MongoToGCSOperator similar to 
> [mssql_to_gcs|https://github.com/apache/airflow/blob/master/airflow/providers/google/cloud/operators/mssql_to_gcs.py]
>  or 
> [mongo_to_s3|https://github.com/apache/airflow/blob/master/airflow/providers/amazon/aws/operators/mongo_to_s3.py]



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

