[jira] [Commented] (AIRFLOW-6588) json_format and write_stdout are boolean

2020-01-17 Thread ASF GitHub Bot (Jira)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-6588?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17018516#comment-17018516
 ] 

ASF GitHub Bot commented on AIRFLOW-6588:
-

pingzh commented on pull request #7199: [AIRFLOW-6588] write_stdout and 
json_format are boolean
URL: https://github.com/apache/airflow/pull/7199
 
 
   ---
   Issue link: WILL BE INSERTED BY 
[boring-cyborg](https://github.com/kaxil/boring-cyborg)
   
   Make sure to mark the boxes below before creating PR: [x]
   
   - [ ] Description above provides context of the change
   - [ ] Commit message/PR title starts with `[AIRFLOW-]`. AIRFLOW- = 
JIRA ID*
   - [ ] Unit tests coverage for changes (not needed for documentation changes)
   - [ ] Commits follow "[How to write a good git commit 
message](http://chris.beams.io/posts/git-commit/)"
   - [ ] Relevant documentation is updated including usage instructions.
   - [ ] I will engage committers as explained in [Contribution Workflow 
Example](https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst#contribution-workflow-example).
   
   * For document-only changes commit message can start with 
`[AIRFLOW-]`.
   
   ---
   In case of fundamental code change, Airflow Improvement Proposal 
([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals))
 is needed.
   In case of a new dependency, check compliance with the [ASF 3rd Party 
License Policy](https://www.apache.org/legal/resolved.html#category-x).
   In case of backwards incompatible changes please leave a note in 
[UPDATING.md](https://github.com/apache/airflow/blob/master/UPDATING.md).
   Read the [Pull Request 
Guidelines](https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst#pull-request-guidelines)
 for more information.
   
 

This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


> json_format and write_stdout are boolean
> 
>
> Key: AIRFLOW-6588
> URL: https://issues.apache.org/jira/browse/AIRFLOW-6588
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: logging
>Affects Versions: master
>Reporter: Ping Zhang
>Assignee: Ping Zhang
>Priority: Minor
>




--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[GitHub] [airflow] pingzh opened a new pull request #7199: [AIRFLOW-6588] write_stdout and json_format are boolean

2020-01-17 Thread GitBox
pingzh opened a new pull request #7199: [AIRFLOW-6588] write_stdout and 
json_format are boolean
URL: https://github.com/apache/airflow/pull/7199
 
 
   ---
   Issue link: WILL BE INSERTED BY 
[boring-cyborg](https://github.com/kaxil/boring-cyborg)
   
   Make sure to mark the boxes below before creating PR: [x]
   
   - [ ] Description above provides context of the change
   - [ ] Commit message/PR title starts with `[AIRFLOW-]`. AIRFLOW- = 
JIRA ID*
   - [ ] Unit tests coverage for changes (not needed for documentation changes)
   - [ ] Commits follow "[How to write a good git commit 
message](http://chris.beams.io/posts/git-commit/)"
   - [ ] Relevant documentation is updated including usage instructions.
   - [ ] I will engage committers as explained in [Contribution Workflow 
Example](https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst#contribution-workflow-example).
   
   * For document-only changes commit message can start with 
`[AIRFLOW-]`.
   
   ---
   In case of fundamental code change, Airflow Improvement Proposal 
([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals))
 is needed.
   In case of a new dependency, check compliance with the [ASF 3rd Party 
License Policy](https://www.apache.org/legal/resolved.html#category-x).
   In case of backwards incompatible changes please leave a note in 
[UPDATING.md](https://github.com/apache/airflow/blob/master/UPDATING.md).
   Read the [Pull Request 
Guidelines](https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst#pull-request-guidelines)
 for more information.
   




[jira] [Assigned] (AIRFLOW-6588) json_format and write_stdout are boolean

2020-01-17 Thread Ping Zhang (Jira)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-6588?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ping Zhang reassigned AIRFLOW-6588:
---

Assignee: Ping Zhang

> json_format and write_stdout are boolean
> 
>
> Key: AIRFLOW-6588
> URL: https://issues.apache.org/jira/browse/AIRFLOW-6588
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: logging
>Affects Versions: master
>Reporter: Ping Zhang
>Assignee: Ping Zhang
>Priority: Minor
>






[jira] [Created] (AIRFLOW-6588) json_format and write_stdout are boolean

2020-01-17 Thread Ping Zhang (Jira)
Ping Zhang created AIRFLOW-6588:
---

 Summary: json_format and write_stdout are boolean
 Key: AIRFLOW-6588
 URL: https://issues.apache.org/jira/browse/AIRFLOW-6588
 Project: Apache Airflow
  Issue Type: Bug
  Components: logging
Affects Versions: master
Reporter: Ping Zhang








[GitHub] [airflow] pingzh commented on a change in pull request #7141: [AIRFLOW-6544] add log_id to end-of-file mark and also add an index config for logs

2020-01-17 Thread GitBox
pingzh commented on a change in pull request #7141: [AIRFLOW-6544] add log_id 
to end-of-file mark and also add an index config for logs
URL: https://github.com/apache/airflow/pull/7141#discussion_r368208678
 
 

 ##
 File path: airflow/utils/log/es_task_handler.py
 ##
 @@ -53,9 +53,11 @@ class ElasticsearchTaskHandler(FileTaskHandler, LoggingMixin):
 PAGE = 0
 MAX_LINE_PER_PAGE = 1000
 
+# 16 is reasonable in this case
+# pylint: disable-msg=too-many-arguments
 def __init__(self, base_log_folder, filename_template,
  log_id_template, end_of_log_mark,
- write_stdout, json_format, json_fields,
 
 Review comment:
   I mean a default value in the class constructor. Although you have added an 
index in airflow_local_settings, users might already have their own 
customized airflow_local_settings. If you don't give `index` a default 
value, it will break their code, as the constructor now requires `index`.
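
The backward-compatibility concern above can be sketched as follows. This is a simplified stand-in, not the real `ElasticsearchTaskHandler` signature; the argument names and the `"airflow-logs"` default are illustrative assumptions.

```python
# Sketch of the reviewer's suggestion: give the new ``index`` argument a
# default so existing call sites that predate it keep working.
class TaskHandler:
    def __init__(self, base_log_folder, end_of_log_mark,
                 write_stdout=False, json_format=False,
                 index="airflow-logs"):  # default keeps old callers working
        self.base_log_folder = base_log_folder
        self.end_of_log_mark = end_of_log_mark
        self.write_stdout = write_stdout
        self.json_format = json_format
        self.index = index

# A call site written before ``index`` existed still constructs fine:
handler = TaskHandler("/logs", "end_of_log")
print(handler.index)  # "airflow-logs"
```

A customized airflow_local_settings that never mentions `index` therefore keeps working unchanged.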




[jira] [Commented] (AIRFLOW-6583) (BigQuery) Add query_params to templated_fields

2020-01-17 Thread ASF GitHub Bot (Jira)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-6583?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17018509#comment-17018509
 ] 

ASF GitHub Bot commented on AIRFLOW-6583:
-

jithin97 commented on pull request #7198: [AIRFLOW-6583] (BigQuery) Add 
query_params to templated_fields
URL: https://github.com/apache/airflow/pull/7198
 
 
   ### JIRA Issue
   https://issues.apache.org/jira/browse/AIRFLOW-6583
   
   ### Description
   Add **query_params** to _template_field_ in GCP's BigQuery Execute Query 
Operator.
   
   ---
   Issue link: WILL BE INSERTED BY 
[boring-cyborg](https://github.com/kaxil/boring-cyborg)
   
   Make sure to mark the boxes below before creating PR: [x]
   
   - [x] Description above provides context of the change
   - [x] Commit message/PR title starts with `[AIRFLOW-]`. AIRFLOW- = 
JIRA ID*
   - [x] Unit tests coverage for changes (not needed for documentation changes)
   - [x] Commits follow "[How to write a good git commit 
message](http://chris.beams.io/posts/git-commit/)"
   - [x] Relevant documentation is updated including usage instructions.
   - [x] I will engage committers as explained in [Contribution Workflow 
Example](https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst#contribution-workflow-example).
   
   * For document-only changes commit message can start with 
`[AIRFLOW-]`.
   
   ---
   In case of fundamental code change, Airflow Improvement Proposal 
([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals))
 is needed.
   In case of a new dependency, check compliance with the [ASF 3rd Party 
License Policy](https://www.apache.org/legal/resolved.html#category-x).
   In case of backwards incompatible changes please leave a note in 
[UPDATING.md](https://github.com/apache/airflow/blob/master/UPDATING.md).
   Read the [Pull Request 
Guidelines](https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst#pull-request-guidelines)
 for more information.
   
 



> (BigQuery) Add query_params to templated_fields
> ---
>
> Key: AIRFLOW-6583
> URL: https://issues.apache.org/jira/browse/AIRFLOW-6583
> Project: Apache Airflow
>  Issue Type: New Feature
>  Components: gcp
>Affects Versions: 1.10.7
>Reporter: Jithin Sukumar
>Assignee: Jithin Sukumar
>Priority: Minor
>
> To query time-partitioned tables, I am passing query_params like this:
> yesterday = Variable.get('yesterday', '{{ yesterday_ds }}')
> today = Variable.get('today', '{{ ds }}')
> ...
> query_params = [{'name': 'yesterday', 'parameterType': {'type': 'STRING'},
>                  'parameterValue': {'value': yesterday}},
>                 {'name': 'today', 'parameterType': {'type': 'STRING'},
>                  'parameterValue': {'value': today}}]
> query_params needs to be a template_field.
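
The requested change can be sketched as below. The operator class and the toy renderer are illustrative stand-ins, not the real BigQuery operator; the point is only that listing `query_params` in `template_fields` makes Jinja expressions like `{{ ds }}` inside parameter values get rendered.

```python
import re

# Minimal stand-in for a templated operator; not the real Airflow class.
class ExecuteQueryOperator:
    template_fields = ('sql', 'query_params')  # query_params added here

    def __init__(self, sql, query_params=None):
        self.sql = sql
        self.query_params = query_params or []

    def render(self, context):
        # Toy renderer: substitute {{ key }} tokens in every templated field.
        def sub(value):
            if isinstance(value, str):
                return re.sub(r'\{\{\s*(\w+)\s*\}\}',
                              lambda m: str(context.get(m.group(1), m.group(0))),
                              value)
            if isinstance(value, dict):
                return {k: sub(v) for k, v in value.items()}
            if isinstance(value, list):
                return [sub(v) for v in value]
            return value
        for field in self.template_fields:
            setattr(self, field, sub(getattr(self, field)))

op = ExecuteQueryOperator(
    sql="SELECT * FROM t WHERE d = @today",
    query_params=[{'name': 'today', 'parameterType': {'type': 'STRING'},
                   'parameterValue': {'value': '{{ ds }}'}}])
op.render({'ds': '2020-01-17'})
print(op.query_params[0]['parameterValue']['value'])  # "2020-01-17"
```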





[GitHub] [airflow] jithin97 opened a new pull request #7198: [AIRFLOW-6583] (BigQuery) Add query_params to templated_fields

2020-01-17 Thread GitBox
jithin97 opened a new pull request #7198: [AIRFLOW-6583] (BigQuery) Add 
query_params to templated_fields
URL: https://github.com/apache/airflow/pull/7198
 
 
   ### JIRA Issue
   https://issues.apache.org/jira/browse/AIRFLOW-6583
   
   ### Description
   Add **query_params** to _template_field_ in GCP's BigQuery Execute Query 
Operator.
   
   ---
   Issue link: WILL BE INSERTED BY 
[boring-cyborg](https://github.com/kaxil/boring-cyborg)
   
   Make sure to mark the boxes below before creating PR: [x]
   
   - [x] Description above provides context of the change
   - [x] Commit message/PR title starts with `[AIRFLOW-]`. AIRFLOW- = 
JIRA ID*
   - [x] Unit tests coverage for changes (not needed for documentation changes)
   - [x] Commits follow "[How to write a good git commit 
message](http://chris.beams.io/posts/git-commit/)"
   - [x] Relevant documentation is updated including usage instructions.
   - [x] I will engage committers as explained in [Contribution Workflow 
Example](https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst#contribution-workflow-example).
   
   * For document-only changes commit message can start with 
`[AIRFLOW-]`.
   
   ---
   In case of fundamental code change, Airflow Improvement Proposal 
([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals))
 is needed.
   In case of a new dependency, check compliance with the [ASF 3rd Party 
License Policy](https://www.apache.org/legal/resolved.html#category-x).
   In case of backwards incompatible changes please leave a note in 
[UPDATING.md](https://github.com/apache/airflow/blob/master/UPDATING.md).
   Read the [Pull Request 
Guidelines](https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst#pull-request-guidelines)
 for more information.
   




[GitHub] [airflow] mik-laj commented on a change in pull request #7138: [AIRFLOW-5912] Expose lineage API

2020-01-17 Thread GitBox
mik-laj commented on a change in pull request #7138: [AIRFLOW-5912] Expose 
lineage API
URL: https://github.com/apache/airflow/pull/7138#discussion_r368198224
 
 

 ##
 File path: airflow/api/client/api_client.py
 ##
 @@ -70,3 +70,12 @@ def delete_pool(self, name):
 :param name: pool name
 """
 raise NotImplementedError()
+
+def get_lineage(self, dag_id: str, execution_date: str):
 
 Review comment:
   In my opinion, this class is not intended for external use. Someone can use 
it, but it requires an Airflow installation. I think it would be better if we 
recommended using the REST API directly. This is described in detail in the 
documentation - 
[here](https://airflow.readthedocs.io/en/latest/usage-cli.html#set-up-connection-to-a-remote-airflow-instance).
 For me, this is an internal abstraction that allows us to execute some CLI 
commands using the REST API or using a direct database query. So if we add a 
method here, then we should also add the corresponding command in the CLI. We 
should not allow developers to use this class directly, because we have a lot 
of side effects when importing the `airflow` package, e.g. changing the 
logging configuration or importing the webserver classes.
   
   We cannot deprecate local_client, because only the CLI that uses local_client 
is fully functional, and we use this class in the default configuration. 
   
   What is a bit confusing is that this code is in the `api` package and not in 
the `cli` package, but the `cli` package was created later, and it is possible 
that we should move this code now.
   
   I would like to deal with the subject of the API in the near future. I've 
planned it for Q1 and will present an AIP soon.




[GitHub] [airflow] abdulbasitds edited a comment on issue #6007: Improved and Fixed Apache Glue hook/operator/sensor from [AIRFLOW-2310]

2020-01-17 Thread GitBox
abdulbasitds edited a comment on issue #6007: Improved and Fixed Apache Glue 
hook/operator/sensor from [AIRFLOW-2310] 
URL: https://github.com/apache/airflow/pull/6007#issuecomment-575844650
 
 
   I was asked to "rebase commits and resolve conflicts". Since I didn't know 
how to do that and haven't had time to look into what is needed on my side, I 
think this is pending on me.
   
   If anyone can do that at their end, I think we can finally have this 
operator.
   @letianw91  @ashb 




[GitHub] [airflow] abdulbasitds commented on issue #6007: Improved and Fixed Apache Glue hook/operator/sensor from [AIRFLOW-2310]

2020-01-17 Thread GitBox
abdulbasitds commented on issue #6007: Improved and Fixed Apache Glue 
hook/operator/sensor from [AIRFLOW-2310] 
URL: https://github.com/apache/airflow/pull/6007#issuecomment-575844650
 
 
   I was asked to "rebase commits and resolve conflicts". Since I didn't know 
how to do that and haven't had time to look into what is needed on my side, I 
think this is pending on me.
   
   If anyone can do that at their end, I think we can finally have this 
operator.
   @letianw91  




[GitHub] [airflow] letianw91 edited a comment on issue #6007: Improved and Fixed Apache Glue hook/operator/sensor from [AIRFLOW-2310]

2020-01-17 Thread GitBox
letianw91 edited a comment on issue #6007: Improved and Fixed Apache Glue 
hook/operator/sensor from [AIRFLOW-2310] 
URL: https://github.com/apache/airflow/pull/6007#issuecomment-575841159
 
 
   Hi, I've been watching this operator for a while. Is there anything I can do 
to help speed this up a bit? Keen to use it without having to copy and paste.




[GitHub] [airflow] letianw91 commented on issue #6007: Improved and Fixed Apache Glue hook/operator/sensor from [AIRFLOW-2310]

2020-01-17 Thread GitBox
letianw91 commented on issue #6007: Improved and Fixed Apache Glue 
hook/operator/sensor from [AIRFLOW-2310] 
URL: https://github.com/apache/airflow/pull/6007#issuecomment-575841159
 
 
   Hi, I've been watching this operator for a while. Is there anything I can do 
to help speed this up a bit? Keen to use it.




[jira] [Created] (AIRFLOW-6587) Docker operator to support PID mode

2020-01-17 Thread Harish (Jira)
Harish created AIRFLOW-6587:
---

 Summary: Docker operator to support PID mode
 Key: AIRFLOW-6587
 URL: https://issues.apache.org/jira/browse/AIRFLOW-6587
 Project: Apache Airflow
  Issue Type: Improvement
  Components: operators
Affects Versions: 2.0.0
Reporter: Harish


[https://docs.docker.com/engine/reference/run/#pid-settings---pid]

"PID namespace provides separation of processes. The PID Namespace removes the 
view of the system processes, and allows process ids to be reused including pid 
1."

I was able to get this working with the following change:

[https://github.com/apache/airflow/commit/fb4cdd74de270ed679a36a916b673474166e6efb]
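
The idea in the linked commit can be sketched roughly as follows. The class and method names here are illustrative stand-ins, not the real DockerOperator; the real implementation would pass `pid_mode` into docker-py's host config.

```python
# Sketch: thread a pid_mode argument through a Docker-style operator down to
# the container's host configuration.
class DockerishOperator:
    def __init__(self, image, command, pid_mode=None):
        self.image = image
        self.command = command
        self.pid_mode = pid_mode  # e.g. "host" shares the host PID namespace

    def _host_config(self):
        # Mirrors the kind of dict docker-py builds from
        # create_host_config(pid_mode=...).
        config = {}
        if self.pid_mode:
            config['pid_mode'] = self.pid_mode
        return config

op = DockerishOperator('busybox', 'ps aux', pid_mode='host')
print(op._host_config())  # {'pid_mode': 'host'}
```

Leaving `pid_mode` as `None` by default keeps existing DAGs unaffected.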

 





[GitHub] [airflow] lucafuji commented on issue #6870: [AIRFLOW-0578] Check return code

2020-01-17 Thread GitBox
lucafuji commented on issue #6870: [AIRFLOW-0578] Check return code
URL: https://github.com/apache/airflow/pull/6870#issuecomment-575817766
 
 
   Besides that, I kind of remember why Max recommended that I change base_job. 
   
   In Airflow, base_job.run calls _execute, and each subclass implements its 
own _execute function.
   It is _execute's job to run the actual command, but state management 
should be done in base_job.run, after _execute is done.
   This includes managing the state of the job and the task instance; we 
should not change state inside _execute.
   Therefore it might be better to implement an on_failure callback that is 
called after _execute. We can make the default behaviour do nothing and only 
implement the specific logic in local_task_job.py.
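
The callback pattern described above can be sketched like this. The classes are simplified stand-ins for BaseJob/LocalTaskJob; names and state values are illustrative.

```python
# run() owns state management and invokes a no-op on_failure hook after
# _execute; subclasses override only the hook.
class BaseJob:
    def __init__(self):
        self.state = "running"

    def _execute(self):
        raise NotImplementedError

    def on_failure(self):
        pass  # default behaviour: do nothing

    def run(self):
        try:
            self._execute()
            self.state = "success"
        except Exception:
            self.state = "failed"
            self.on_failure()  # state handling stays out of _execute

class LocalTaskJob(BaseJob):
    def __init__(self):
        super().__init__()
        self.handled = False

    def _execute(self):
        raise RuntimeError("task process returned non-zero")

    def on_failure(self):
        self.handled = True  # e.g. mark the task instance FAILED here

job = LocalTaskJob()
job.run()
print(job.state, job.handled)  # failed True
```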




[jira] [Commented] (AIRFLOW-6586) GCSUploadSessionCompleteSensor breaks in reschedule mode.

2020-01-17 Thread ASF GitHub Bot (Jira)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-6586?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17018358#comment-17018358
 ] 

ASF GitHub Bot commented on AIRFLOW-6586:
-

jaketf commented on pull request #7197: [AIRFLOW-6586] Improvements to gcs 
sensor
URL: https://github.com/apache/airflow/pull/7197
 
 
   refactors GoogleCloudStorageUploadSessionCompleteSensor to use set instead 
of number of objects
   
   add poke mode only decorator
   
   assert that poke_mode_only applied to child of BaseSensorOperator
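
The set-based refactor mentioned above can be sketched as follows. This is a simplified stand-in, not the real GCS sensor: tracking the set of object names (rather than a bare count) means replacements or renames between pokes reset the inactivity clock.

```python
# Sketch: detect end of an upload session by watching for a quiet period in
# which the set of observed objects stops changing.
class UploadSessionTracker:
    def __init__(self, inactivity_period=120):
        self.previous_objects = set()
        self.inactivity_seconds = 0
        self.inactivity_period = inactivity_period

    def is_session_complete(self, current_objects, poke_interval):
        current_objects = set(current_objects)
        if current_objects != self.previous_objects:
            # Activity detected: remember what we saw, reset the clock.
            self.previous_objects = current_objects
            self.inactivity_seconds = 0
            return False
        self.inactivity_seconds += poke_interval
        return self.inactivity_seconds >= self.inactivity_period

t = UploadSessionTracker(inactivity_period=120)
print(t.is_session_complete({"a.csv"}, 60))  # False: new object appeared
print(t.is_session_complete({"a.csv"}, 60))  # False: only 60s quiet
print(t.is_session_complete({"a.csv"}, 60))  # True: 120s quiet
```

Note that this state lives on the instance, which is exactly why it is lost in reschedule mode.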
   
   ---
   Issue link: WILL BE INSERTED BY 
[boring-cyborg](https://github.com/kaxil/boring-cyborg)
   
   Make sure to mark the boxes below before creating PR: [x]
   
   - [ ] Description above provides context of the change
   - [ ] Commit message/PR title starts with `[AIRFLOW-]`. AIRFLOW- = 
JIRA ID*
   - [ ] Unit tests coverage for changes (not needed for documentation changes)
   - [ ] Commits follow "[How to write a good git commit 
message](http://chris.beams.io/posts/git-commit/)"
   - [ ] Relevant documentation is updated including usage instructions.
   - [ ] I will engage committers as explained in [Contribution Workflow 
Example](https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst#contribution-workflow-example).
   
   * For document-only changes commit message can start with 
`[AIRFLOW-]`.
   
   ---
   In case of fundamental code change, Airflow Improvement Proposal 
([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals))
 is needed.
   In case of a new dependency, check compliance with the [ASF 3rd Party 
License Policy](https://www.apache.org/legal/resolved.html#category-x).
   In case of backwards incompatible changes please leave a note in 
[UPDATING.md](https://github.com/apache/airflow/blob/master/UPDATING.md).
   Read the [Pull Request 
Guidelines](https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst#pull-request-guidelines)
 for more information.
   
 



> GCSUploadSessionCompleteSensor breaks in reschedule mode.
> -
>
> Key: AIRFLOW-6586
> URL: https://issues.apache.org/jira/browse/AIRFLOW-6586
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: operators
>Affects Versions: 1.10.3
>Reporter: Jacob Ferriero
>Priority: Minor
>
> This sensor is stateful and loses state between reschedules. 
> We should: 
>  # Warn about this in docstring
>  # Add a `poke_mode_only` class decorator for sensors that aren't safe in 
> reschedule mode.





[GitHub] [airflow] jaketf opened a new pull request #7197: [AIRFLOW-6586] Improvements to gcs sensor

2020-01-17 Thread GitBox
jaketf opened a new pull request #7197: [AIRFLOW-6586] Improvements to gcs 
sensor
URL: https://github.com/apache/airflow/pull/7197
 
 
   refactors GoogleCloudStorageUploadSessionCompleteSensor to use set instead 
of number of objects
   
   add poke mode only decorator
   
   assert that poke_mode_only applied to child of BaseSensorOperator
   
   ---
   Issue link: WILL BE INSERTED BY 
[boring-cyborg](https://github.com/kaxil/boring-cyborg)
   
   Make sure to mark the boxes below before creating PR: [x]
   
   - [ ] Description above provides context of the change
   - [ ] Commit message/PR title starts with `[AIRFLOW-]`. AIRFLOW- = 
JIRA ID*
   - [ ] Unit tests coverage for changes (not needed for documentation changes)
   - [ ] Commits follow "[How to write a good git commit 
message](http://chris.beams.io/posts/git-commit/)"
   - [ ] Relevant documentation is updated including usage instructions.
   - [ ] I will engage committers as explained in [Contribution Workflow 
Example](https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst#contribution-workflow-example).
   
   * For document-only changes commit message can start with 
`[AIRFLOW-]`.
   
   ---
   In case of fundamental code change, Airflow Improvement Proposal 
([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals))
 is needed.
   In case of a new dependency, check compliance with the [ASF 3rd Party 
License Policy](https://www.apache.org/legal/resolved.html#category-x).
   In case of backwards incompatible changes please leave a note in 
[UPDATING.md](https://github.com/apache/airflow/blob/master/UPDATING.md).
   Read the [Pull Request 
Guidelines](https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst#pull-request-guidelines)
 for more information.
   




[GitHub] [airflow] lucafuji commented on issue #6870: [AIRFLOW-0578] Check return code

2020-01-17 Thread GitBox
lucafuji commented on issue #6870: [AIRFLOW-0578] Check return code
URL: https://github.com/apache/airflow/pull/6870#issuecomment-575816844
 
 
   I just added a Python operator that calls os._exit(1). My code does catch 
it and changes the status to FAILED; without my code, the failure is not 
caught.
   
   **However**, we probably don't want users to call os._exit() in a Python 
operator, because the Python operator runs in the same process as the task 
instance. When os._exit is called, some post-execution code in the task 
instance is never executed, which leaves the task instance in the RUNNING 
state forever. 
   
   Here is the post-execution code of task_instance:
   
https://github.com/apache/airflow/blob/master/airflow/models/taskinstance.py#L965




[jira] [Created] (AIRFLOW-6586) GCSUploadSessionCompleteSensor breaks in reschedule mode.

2020-01-17 Thread Jacob Ferriero (Jira)
Jacob Ferriero created AIRFLOW-6586:
---

 Summary: GCSUploadSessionCompleteSensor breaks in reschedule mode.
 Key: AIRFLOW-6586
 URL: https://issues.apache.org/jira/browse/AIRFLOW-6586
 Project: Apache Airflow
  Issue Type: Bug
  Components: operators
Affects Versions: 1.10.3
Reporter: Jacob Ferriero


This sensor is stateful and loses state between reschedules. 

We should: 
 # Warn about this in docstring
 # Add a `poke_mode_only` class decorator for sensors that aren't safe in 
reschedule mode.
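
A `poke_mode_only` class decorator along the lines proposed here might look like the sketch below. The classes are simplified stand-ins for BaseSensorOperator and the GCS sensor; the real decorator's behaviour may differ.

```python
# Sketch: refuse to construct a stateful sensor in reschedule mode, where
# instance state is lost between pokes.
def poke_mode_only(cls):
    original_init = cls.__init__

    def guarded_init(self, *args, mode="poke", **kwargs):
        if mode != "poke":
            raise ValueError(
                f"{cls.__name__} keeps state between pokes and only "
                f"supports mode='poke', got mode={mode!r}")
        original_init(self, *args, mode=mode, **kwargs)

    cls.__init__ = guarded_init
    return cls

class BaseSensor:  # stand-in for BaseSensorOperator
    def __init__(self, mode="poke"):
        self.mode = mode

@poke_mode_only
class StatefulSensor(BaseSensor):
    def __init__(self, mode="poke"):
        super().__init__(mode=mode)
        self.previous_objects = set()  # lost if the task is rescheduled

StatefulSensor(mode="poke")        # fine
# StatefulSensor(mode="reschedule")  # raises ValueError
```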





[GitHub] [airflow] tooptoop4 commented on issue #7186: [AIRFLOW-6473] Show conf in response of dag_state cli command

2020-01-17 Thread GitBox
tooptoop4 commented on issue #7186: [AIRFLOW-6473] Show conf in response of 
dag_state cli command
URL: https://github.com/apache/airflow/pull/7186#issuecomment-575812436
 
 
   @feluelle pls merge




[GitHub] [airflow] codecov-io commented on issue #7186: [AIRFLOW-6473] Show conf in response of dag_state cli command

2020-01-17 Thread GitBox
codecov-io commented on issue #7186: [AIRFLOW-6473] Show conf in response of 
dag_state cli command
URL: https://github.com/apache/airflow/pull/7186#issuecomment-575812154
 
 
   # [Codecov](https://codecov.io/gh/apache/airflow/pull/7186?src=pr=h1) 
Report
   > :exclamation: No coverage uploaded for pull request base 
(`master@92521aa`). [Click here to learn what that 
means](https://docs.codecov.io/docs/error-reference#section-missing-base-commit).
   > The diff coverage is `86.5%`.
   
   [![Impacted file tree 
graph](https://codecov.io/gh/apache/airflow/pull/7186/graphs/tree.svg?width=650=WdLKlKHOAU=150=pr)](https://codecov.io/gh/apache/airflow/pull/7186?src=pr=tree)
   
   ```diff
   @@    Coverage Diff    @@
   ##   master   #7186   +/-   ##
   ==============================
     Coverage      ?       85%
     Files         ?       753
     Lines         ?     39689
     Branches      ?         0
   ==============================
     Hits          ?     33739
     Misses        ?      5950
     Partials      ?         0
   ```
   
   
   | [Impacted 
Files](https://codecov.io/gh/apache/airflow/pull/7186?src=pr=tree) | 
Coverage Δ | |
   |---|---|---|
   | 
[airflow/models/pool.py](https://codecov.io/gh/apache/airflow/pull/7186/diff?src=pr=tree#diff-YWlyZmxvdy9tb2RlbHMvcG9vbC5weQ==)
 | `97.36% <ø> (ø)` | |
   | 
[airflow/contrib/operators/databricks\_operator.py](https://codecov.io/gh/apache/airflow/pull/7186/diff?src=pr=tree#diff-YWlyZmxvdy9jb250cmliL29wZXJhdG9ycy9kYXRhYnJpY2tzX29wZXJhdG9yLnB5)
 | `92.24% <ø> (ø)` | |
   | 
[.../providers/apache/spark/hooks/spark\_jdbc\_script.py](https://codecov.io/gh/apache/airflow/pull/7186/diff?src=pr=tree#diff-YWlyZmxvdy9wcm92aWRlcnMvYXBhY2hlL3NwYXJrL2hvb2tzL3NwYXJrX2pkYmNfc2NyaXB0LnB5)
 | `0% <ø> (ø)` | |
   | 
[airflow/bin/cli.py](https://codecov.io/gh/apache/airflow/pull/7186/diff?src=pr=tree#diff-YWlyZmxvdy9iaW4vY2xpLnB5)
 | `94.73% <ø> (ø)` | |
   | 
[...ders/google/cloud/example\_dags/example\_dataproc.py](https://codecov.io/gh/apache/airflow/pull/7186/diff?src=pr=tree#diff-YWlyZmxvdy9wcm92aWRlcnMvZ29vZ2xlL2Nsb3VkL2V4YW1wbGVfZGFncy9leGFtcGxlX2RhdGFwcm9jLnB5)
 | `0% <0%> (ø)` | |
   | 
[airflow/macros/hive.py](https://codecov.io/gh/apache/airflow/pull/7186/diff?src=pr=tree#diff-YWlyZmxvdy9tYWNyb3MvaGl2ZS5weQ==)
 | `38.7% <0%> (ø)` | |
   | 
[airflow/utils/log/gcs\_task\_handler.py](https://codecov.io/gh/apache/airflow/pull/7186/diff?src=pr=tree#diff-YWlyZmxvdy91dGlscy9sb2cvZ2NzX3Rhc2tfaGFuZGxlci5weQ==)
 | `0% <0%> (ø)` | |
   | 
[airflow/models/connection.py](https://codecov.io/gh/apache/airflow/pull/7186/diff?src=pr=tree#diff-YWlyZmxvdy9tb2RlbHMvY29ubmVjdGlvbi5weQ==)
 | `68.78% <0%> (ø)` | |
   | 
[airflow/contrib/operators/vertica\_to\_hive.py](https://codecov.io/gh/apache/airflow/pull/7186/diff?src=pr=tree#diff-YWlyZmxvdy9jb250cmliL29wZXJhdG9ycy92ZXJ0aWNhX3RvX2hpdmUucHk=)
 | `0% <0%> (ø)` | |
   | 
[airflow/gcp/operators/bigquery.py](https://codecov.io/gh/apache/airflow/pull/7186/diff?src=pr=tree#diff-YWlyZmxvdy9nY3Avb3BlcmF0b3JzL2JpZ3F1ZXJ5LnB5)
 | `91.59% <100%> (ø)` | |
   | ... and [98 
more](https://codecov.io/gh/apache/airflow/pull/7186/diff?src=pr=tree-more) 
| |
   
   --
   
   [Continue to review full report at 
Codecov](https://codecov.io/gh/apache/airflow/pull/7186?src=pr=continue).
   > **Legend** - [Click here to learn 
more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute  (impact)`, `ø = not affected`, `? = missing data`
   > Powered by 
[Codecov](https://codecov.io/gh/apache/airflow/pull/7186?src=pr=footer). 
Last update 
[92521aa...159735a](https://codecov.io/gh/apache/airflow/pull/7186?src=pr=lastupdated).
 Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   




[GitHub] [airflow] tooptoop4 commented on a change in pull request #7186: [AIRFLOW-6473] Show conf in response of dag_state cli command

2020-01-17 Thread GitBox
tooptoop4 commented on a change in pull request #7186: [AIRFLOW-6473] Show conf 
in response of dag_state cli command
URL: https://github.com/apache/airflow/pull/7186#discussion_r368140839
 
 

 ##
 File path: airflow/cli/commands/dag_command.py
 ##
 @@ -200,16 +200,22 @@ def dag_show(args):
 @cli_utils.action_logging
 def dag_state(args):
     """
-    Returns the state of a DagRun at the command line.
+    Returns the state (and conf if exists) of a DagRun at the command line.
     >>> airflow dags state tutorial 2015-01-01T00:00:00.00
     running
+    >>> airflow dags state a_dag_with_conf_passed 2015-01-01T00:00:00.00
+    failed, {"name": "bob", "age": "42"}
     """
     if args.subdir:
         dag = get_dag(args.subdir, args.dag_id)
     else:
         dag = get_dag_by_file_location(args.dag_id)
     dr = DagRun.find(dag.dag_id, execution_date=args.execution_date)
-    print(dr[0].state if len(dr) > 0 else None)  # pylint: disable=len-as-condition
+    out = dr[0].state if len(dr) > 0 else None  # pylint: disable=len-as-condition
 
 Review comment:
   fixed
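The output behavior being reviewed can be sketched as follows — a minimal, hypothetical stand-in for the CLI's output logic (`FakeDagRun` and `format_dag_state` are illustrative names, not Airflow's API):

```python
import json

class FakeDagRun:
    """Hypothetical stand-in for airflow.models.DagRun, for illustration only."""
    def __init__(self, state, conf=None):
        self.state = state
        self.conf = conf or {}

def format_dag_state(dag_runs):
    """Mimic the discussed CLI output: 'state' alone, or 'state, {conf-as-json}'."""
    if not dag_runs:
        return str(None)
    dr = dag_runs[0]
    if dr.conf:
        return "{}, {}".format(dr.state, json.dumps(dr.conf))
    return dr.state

print(format_dag_state([FakeDagRun("running")]))
print(format_dag_state([FakeDagRun("failed", {"name": "bob", "age": "42"})]))
```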




[GitHub] [airflow] tooptoop4 commented on issue #7157: [AIRFLOW-6251] add config for max tasks per dag

2020-01-17 Thread GitBox
tooptoop4 commented on issue #7157: [AIRFLOW-6251] add config for max tasks per 
dag
URL: https://github.com/apache/airflow/pull/7157#issuecomment-575794455
 
 
   Travis is happy




[GitHub] [airflow] lucafuji edited a comment on issue #6870: [AIRFLOW-0578] Check return code

2020-01-17 Thread GitBox
lucafuji edited a comment on issue #6870: [AIRFLOW-0578] Check return code
URL: https://github.com/apache/airflow/pull/6870#issuecomment-575732495
 
 
   > I'm not sure your test as it stands covers the new code - the BashOperator handles the return code for that command already.
   > 
   > I think what you need to test is a Python operator that calls `os._exit(1)` - but the real way of testing would be to run the test without calling your new on_failure, see it fail, then run it again with it enabled.
   
   On https://github.com/apache/airflow/blob/master/airflow/operators/bash_operator.py#L137, the BashOperator throws an exception if the return code is non-zero.
   
   However, on https://github.com/apache/airflow/blob/master/airflow/task/task_runner/standard_task_runner.py#L85, it catches this exception and returns a return code of 1, which is not handled properly by local_task_job. That is what this PR is trying to solve, and my test case does cover it.
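The point about return codes can be demonstrated with a small, self-contained sketch (illustrative only, not Airflow's task-runner code): a child process that exits via `os._exit(1)` bypasses normal exception handling, and the parent only learns of the failure through the exit status.

```python
import subprocess
import sys

# A child that calls os._exit(1) skips try/except, finally blocks, and
# atexit hooks; the parent sees only the non-zero return code.
child_code = "import os; os._exit(1)"
proc = subprocess.run([sys.executable, "-c", child_code])

# The supervising process (local_task_job in the discussion) must inspect
# this return code to notice the failure.
print("child return code:", proc.returncode)
```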
   




[GitHub] [airflow] potiuk commented on issue #6337: [AIRFLOW-5659] - Add support for ephemeral storage on KubernetesPodOp…

2020-01-17 Thread GitBox
potiuk commented on issue #6337: [AIRFLOW-5659] - Add support for ephemeral 
storage on KubernetesPodOp…
URL: https://github.com/apache/airflow/pull/6337#issuecomment-575792942
 
 
   Waiting for the Travis job to complete




[jira] [Comment Edited] (AIRFLOW-6560) db password leaks to logs

2020-01-17 Thread Jira


[ 
https://issues.apache.org/jira/browse/AIRFLOW-6560?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17018285#comment-17018285
 ] 

Marcin Jasiński edited comment on AIRFLOW-6560 at 1/17/20 8:32 PM:
---

[~rconroy293] Kombu version 4.6.3
{code:java}
>>> from kombu.connection import Connection
>>> c = Connection("sqla+mysql://airflow:airflow@airflow-mysql:3306/airflow")
>>> c.as_uri()
'sqla+mysql://airflow:airflow@airflow-mysql:3306/airflow'{code}
Kombu version 4.6.7
{code:java}
>>> from kombu.connection import Connection
>>> c = Connection("sqla+mysql://airflow:airflow@airflow-mysql:3306/airflow")
>>> c.as_uri()
'sqla+mysql://airflow:**@airflow-mysql:3306/airflow'{code}
You should bump (or remove, as on the master branch) the Kombu version :) 

[https://github.com/apache/airflow/blob/73bf718358c01cc41e5f23b914a8824a2665a28c/setup.py#L161]

After bumping:
{code:java}
[2020-01-17 21:28:32,313: INFO/MainProcess] Connected to 
sqla+mysql://airflow:**@airflow-mysql:3306/airflow
[2020-01-17 21:28:32,427: INFO/MainProcess] celery@8085c534fdac ready.{code}
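For readers without the newer Kombu at hand, the masking idea behind `as_uri()` can be approximated with the standard library alone; this is a rough sketch of the concept, not Kombu's actual implementation (`mask_password` is a made-up helper name):

```python
from urllib.parse import urlsplit, urlunsplit

def mask_password(url):
    """Replace the password in a broker URL with '**', as newer Kombu does."""
    parts = urlsplit(url)
    if parts.password is None:
        return url  # no credentials present, nothing to mask
    netloc = "{}:**@{}".format(parts.username, parts.hostname)
    if parts.port:
        netloc += ":{}".format(parts.port)
    return urlunsplit((parts.scheme, netloc, parts.path, parts.query, parts.fragment))

print(mask_password("sqla+mysql://airflow:airflow@airflow-mysql:3306/airflow"))
```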


was (Author: mkjasinski):
Kombu version 4.6.3
{code:java}
>>> from kombu.connection import Connection
>>> c = Connection("sqla+mysql://airflow:airflow@airflow-mysql:3306/airflow")
>>> c.as_uri()
'sqla+mysql://airflow:airflow@airflow-mysql:3306/airflow'{code}
Kombu version 4.6.7
{code:java}
>>> from kombu.connection import Connection
>>> c = Connection("sqla+mysql://airflow:airflow@airflow-mysql:3306/airflow")
>>> c.as_uri()
'sqla+mysql://airflow:**@airflow-mysql:3306/airflow'{code}
You should bump (or remove like on master branch) Kombu version :) 

[https://github.com/apache/airflow/blob/73bf718358c01cc41e5f23b914a8824a2665a28c/setup.py#L161]

After bumping:
{code:java}
[2020-01-17 21:28:32,313: INFO/MainProcess] Connected to 
sqla+mysql://airflow:**@airflow-mysql:3306/airflow
[2020-01-17 21:28:32,427: INFO/MainProcess] celery@8085c534fdac ready.{code}

> db password leaks to logs
> -
>
> Key: AIRFLOW-6560
> URL: https://issues.apache.org/jira/browse/AIRFLOW-6560
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: logging, security
>Affects Versions: 1.10.6
>Reporter: Marcin Jasiński
>Priority: Critical
>
> I have configured the Airflow metadata db as MySQL.
> {code:java}
> sql_alchemy_conn = sqla+mysql://airflow:airflow@localhost:3306/airflow{code}
> After that I used the initdb command:
> {code:java}
> airflow initdb{code}
> The tables in the airflow db have been created.
> Then I ran the command:
> {code:java}
> airflow worker{code}
> In the logs:
> {code:java}
> [2020-01-14 18:39:03,457: INFO/MainProcess] Connected to sqla+mysql://airflow:airflow@localhost:3306/airflow{code}
> The password appears as plain text.
> This probably comes from Celery - 
> [https://github.com/celery/celery/blob/master/celery/worker/consumer/connection.py#L24]



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Comment Edited] (AIRFLOW-6560) db password leaks to logs

2020-01-17 Thread Jira


[ 
https://issues.apache.org/jira/browse/AIRFLOW-6560?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17018285#comment-17018285
 ] 

Marcin Jasiński edited comment on AIRFLOW-6560 at 1/17/20 8:31 PM:
---

Kombu version 4.6.3
{code:java}
>>> from kombu.connection import Connection
>>> c = Connection("sqla+mysql://airflow:airflow@airflow-mysql:3306/airflow")
>>> c.as_uri()
'sqla+mysql://airflow:airflow@airflow-mysql:3306/airflow'{code}
Kombu version 4.6.7
{code:java}
>>> from kombu.connection import Connection
>>> c = Connection("sqla+mysql://airflow:airflow@airflow-mysql:3306/airflow")
>>> c.as_uri()
'sqla+mysql://airflow:**@airflow-mysql:3306/airflow'{code}
You should bump (or remove, as on the master branch) the Kombu version :) 

[https://github.com/apache/airflow/blob/73bf718358c01cc41e5f23b914a8824a2665a28c/setup.py#L161]

After bumping:
{code:java}
[2020-01-17 21:28:32,313: INFO/MainProcess] Connected to 
sqla+mysql://airflow:**@airflow-mysql:3306/airflow
[2020-01-17 21:28:32,427: INFO/MainProcess] celery@8085c534fdac ready.{code}


was (Author: mkjasinski):
Kombu version 4.6.3
{code:java}
>>> from kombu.connection import Connection
>>> c = Connection("sqla+mysql://airflow:airflow@airflow-mysql:3306/airflow")
>>> c.as_uri()
'sqla+mysql://airflow:airflow@airflow-mysql:3306/airflow'{code}
Kombu version 4.6.7
{code:java}
>>> from kombu.connection import Connection
>>> c = Connection("sqla+mysql://airflow:airflow@airflow-mysql:3306/airflow")
>>> c.as_uri()
'sqla+mysql://airflow:**@airflow-mysql:3306/airflow'{code}
You should bump (or remove like on master branch) Kombu version :) 

[https://github.com/apache/airflow/blob/73bf718358c01cc41e5f23b914a8824a2665a28c/setup.py#L161]

> db password leaks to logs
> -
>
> Key: AIRFLOW-6560
> URL: https://issues.apache.org/jira/browse/AIRFLOW-6560
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: logging, security
>Affects Versions: 1.10.6
>Reporter: Marcin Jasiński
>Priority: Critical
>
> I have configured the Airflow metadata db as MySQL.
> {code:java}
> sql_alchemy_conn = sqla+mysql://airflow:airflow@localhost:3306/airflow{code}
> After that I used the initdb command:
> {code:java}
> airflow initdb{code}
> The tables in the airflow db have been created.
> Then I ran the command:
> {code:java}
> airflow worker{code}
> In the logs:
> {code:java}
> [2020-01-14 18:39:03,457: INFO/MainProcess] Connected to sqla+mysql://airflow:airflow@localhost:3306/airflow{code}
> The password appears as plain text.
> This probably comes from Celery - 
> [https://github.com/celery/celery/blob/master/celery/worker/consumer/connection.py#L24]





[jira] [Comment Edited] (AIRFLOW-6560) db password leaks to logs

2020-01-17 Thread Jira


[ 
https://issues.apache.org/jira/browse/AIRFLOW-6560?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17018285#comment-17018285
 ] 

Marcin Jasiński edited comment on AIRFLOW-6560 at 1/17/20 8:24 PM:
---

Kombu version 4.6.3
{code:java}
>>> from kombu.connection import Connection
>>> c = Connection("sqla+mysql://airflow:airflow@airflow-mysql:3306/airflow")
>>> c.as_uri()
'sqla+mysql://airflow:airflow@airflow-mysql:3306/airflow'{code}
Kombu version 4.6.7
{code:java}
>>> from kombu.connection import Connection
>>> c = Connection("sqla+mysql://airflow:airflow@airflow-mysql:3306/airflow")
>>> c.as_uri()
'sqla+mysql://airflow:**@airflow-mysql:3306/airflow'{code}
You should bump (or remove, as on the master branch) the Kombu version :) 

[https://github.com/apache/airflow/blob/73bf718358c01cc41e5f23b914a8824a2665a28c/setup.py#L161]


was (Author: mkjasinski):
Kombu version 4.6.3
{code:java}
>>> from kombu.connection import Connection
>>> c = Connection("sqla+mysql://airflow:airflow@airflow-mysql:3306/airflow")
>>> c.as_uri()
'sqla+mysql://airflow:airflow@airflow-mysql:3306/airflow'{code}
Kombu version 4.6.7
{code:java}
>>> from kombu.connection import Connection
>>> c = Connection("sqla+mysql://airflow:airflow@airflow-mysql:3306/airflow")
>>> c.as_uri()
'sqla+mysql://airflow:**@airflow-mysql:3306/airflow'{code}
You should bump (or remove) Kombu version :) 

https://github.com/apache/airflow/blob/73bf718358c01cc41e5f23b914a8824a2665a28c/setup.py#L161

> db password leaks to logs
> -
>
> Key: AIRFLOW-6560
> URL: https://issues.apache.org/jira/browse/AIRFLOW-6560
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: logging, security
>Affects Versions: 1.10.6
>Reporter: Marcin Jasiński
>Priority: Critical
>
> I have configured the Airflow metadata db as MySQL.
> {code:java}
> sql_alchemy_conn = sqla+mysql://airflow:airflow@localhost:3306/airflow{code}
> After that I used the initdb command:
> {code:java}
> airflow initdb{code}
> The tables in the airflow db have been created.
> Then I ran the command:
> {code:java}
> airflow worker{code}
> In the logs:
> {code:java}
> [2020-01-14 18:39:03,457: INFO/MainProcess] Connected to sqla+mysql://airflow:airflow@localhost:3306/airflow{code}
> The password appears as plain text.
> This probably comes from Celery - 
> [https://github.com/celery/celery/blob/master/celery/worker/consumer/connection.py#L24]





[jira] [Commented] (AIRFLOW-6560) db password leaks to logs

2020-01-17 Thread Jira


[ 
https://issues.apache.org/jira/browse/AIRFLOW-6560?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17018285#comment-17018285
 ] 

Marcin Jasiński commented on AIRFLOW-6560:
--

Kombu version 4.6.3
{code:java}
>>> from kombu.connection import Connection
>>> c = Connection("sqla+mysql://airflow:airflow@airflow-mysql:3306/airflow")
>>> c.as_uri()
'sqla+mysql://airflow:airflow@airflow-mysql:3306/airflow'{code}
Kombu version 4.6.7
{code:java}
>>> from kombu.connection import Connection
>>> c = Connection("sqla+mysql://airflow:airflow@airflow-mysql:3306/airflow")
>>> c.as_uri()
'sqla+mysql://airflow:**@airflow-mysql:3306/airflow'{code}
You should bump (or remove) the Kombu version :) 

https://github.com/apache/airflow/blob/73bf718358c01cc41e5f23b914a8824a2665a28c/setup.py#L161

> db password leaks to logs
> -
>
> Key: AIRFLOW-6560
> URL: https://issues.apache.org/jira/browse/AIRFLOW-6560
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: logging, security
>Affects Versions: 1.10.6
>Reporter: Marcin Jasiński
>Priority: Critical
>
> I have configured the Airflow metadata db as MySQL.
> {code:java}
> sql_alchemy_conn = sqla+mysql://airflow:airflow@localhost:3306/airflow{code}
> After that I used the initdb command:
> {code:java}
> airflow initdb{code}
> The tables in the airflow db have been created.
> Then I ran the command:
> {code:java}
> airflow worker{code}
> In the logs:
> {code:java}
> [2020-01-14 18:39:03,457: INFO/MainProcess] Connected to sqla+mysql://airflow:airflow@localhost:3306/airflow{code}
> The password appears as plain text.
> This probably comes from Celery - 
> [https://github.com/celery/celery/blob/master/celery/worker/consumer/connection.py#L24]





[GitHub] [airflow] leonardoam commented on issue #6337: [AIRFLOW-5659] - Add support for ephemeral storage on KubernetesPodOp…

2020-01-17 Thread GitBox
leonardoam commented on issue #6337: [AIRFLOW-5659] - Add support for ephemeral 
storage on KubernetesPodOp…
URL: https://github.com/apache/airflow/pull/6337#issuecomment-575773177
 
 
   Done




[GitHub] [airflow] larryzhu2018 commented on a change in pull request #7141: [AIRFLOW-6544] add log_id to end-of-file mark and also add an index config for logs

2020-01-17 Thread GitBox
larryzhu2018 commented on a change in pull request #7141: [AIRFLOW-6544] add 
log_id to end-of-file mark and also add an index config for logs
URL: https://github.com/apache/airflow/pull/7141#discussion_r368107102
 
 

 ##
 File path: tests/utils/log/test_es_task_handler.py
 ##
 @@ -81,6 +85,17 @@ def setUp(self):
         self.ti.try_number = 1
         self.ti.state = State.RUNNING
         self.addCleanup(self.dag.clear)
+        self.index_name2 = "test_index2"
+        self.es_task_handler2 = ElasticsearchTaskHandler(
+            self.local_log_location,
 
 Review comment:
   Can you please suggest something more meaningful? I would be happy to 
accommodate. It is literally handler2 with test_index2. Because the index name 
is part of the handler, we need to use a separate log handler here in order to 
use a different index name.




[GitHub] [airflow] larryzhu2018 commented on a change in pull request #7141: [AIRFLOW-6544] add log_id to end-of-file mark and also add an index config for logs

2020-01-17 Thread GitBox
larryzhu2018 commented on a change in pull request #7141: [AIRFLOW-6544] add 
log_id to end-of-file mark and also add an index config for logs
URL: https://github.com/apache/airflow/pull/7141#discussion_r368107102
 
 

 ##
 File path: tests/utils/log/test_es_task_handler.py
 ##
 @@ -81,6 +85,17 @@ def setUp(self):
         self.ti.try_number = 1
         self.ti.state = State.RUNNING
         self.addCleanup(self.dag.clear)
+        self.index_name2 = "test_index2"
+        self.es_task_handler2 = ElasticsearchTaskHandler(
+            self.local_log_location,
 
 Review comment:
   Can you please suggest something more meaningful? I would be happy to 
accommodate. It is literally handler2 with test_index2. Because the index name 
is part of the handler, we need to use a separate log handler in order to 
use a different index name.




[GitHub] [airflow] larryzhu2018 commented on a change in pull request #7141: [AIRFLOW-6544] add log_id to end-of-file mark and also add an index config for logs

2020-01-17 Thread GitBox
larryzhu2018 commented on a change in pull request #7141: [AIRFLOW-6544] add 
log_id to end-of-file mark and also add an index config for logs
URL: https://github.com/apache/airflow/pull/7141#discussion_r368105431
 
 

 ##
 File path: tests/utils/log/test_es_task_handler.py
 ##
 @@ -262,21 +277,55 @@ def test_set_context_w_json_format_and_write_stdout(self):
         self.es_task_handler.json_format = True
         self.es_task_handler.set_context(self.ti)
 
-    def test_close(self):
-        formatter = logging.Formatter('%(asctime)s - %(name)s - %(levelname)s - %(message)s')
-        self.es_task_handler.formatter = formatter
+    def test_close_with_log_id(self):
+        es_task_handler = ElasticsearchTaskHandler(
+            self.local_log_location,
+            self.filename_template,
+            self.log_id_template,
+            self.end_of_log_mark,
+            self.write_stdout,
+            True,  # json_format
 
 Review comment:
   When you work with Elasticsearch, the JSON formatter is the way to go. We do 
not have a grok filter in the unit tests to parse non-JSON-format messages, 
index them, and look them up as I do in this test case. Realistically we 
should probably remove the non-JSON-format option; as far as I can tell, it is 
not practical to deploy it correctly.




[GitHub] [airflow] larryzhu2018 commented on a change in pull request #7141: [AIRFLOW-6544] add log_id to end-of-file mark and also add an index config for logs

2020-01-17 Thread GitBox
larryzhu2018 commented on a change in pull request #7141: [AIRFLOW-6544] add 
log_id to end-of-file mark and also add an index config for logs
URL: https://github.com/apache/airflow/pull/7141#discussion_r368107637
 
 

 ##
 File path: tests/utils/log/test_es_task_handler.py
 ##
 @@ -262,21 +277,55 @@ def test_set_context_w_json_format_and_write_stdout(self):
         self.es_task_handler.json_format = True
         self.es_task_handler.set_context(self.ti)
 
-    def test_close(self):
-        formatter = logging.Formatter('%(asctime)s - %(name)s - %(levelname)s - %(message)s')
 
 Review comment:
   As I mentioned, the non-JSON format does not really work in the context of 
Elasticsearch; I suspect no one can deploy it correctly.




[GitHub] [airflow] codecov-io edited a comment on issue #6850: [AIRFLOW-6296] add ODBC hook & deprecation warning for pymssql

2020-01-17 Thread GitBox
codecov-io edited a comment on issue #6850: [AIRFLOW-6296] add ODBC hook & 
deprecation warning for pymssql
URL: https://github.com/apache/airflow/pull/6850#issuecomment-567427016
 
 
   # [Codecov](https://codecov.io/gh/apache/airflow/pull/6850?src=pr=h1) Report
   > Merging [#6850](https://codecov.io/gh/apache/airflow/pull/6850?src=pr=desc) into [master](https://codecov.io/gh/apache/airflow/commit/5abce471e0690c6b8d06ca25685b0845c5fd270f?src=pr=desc) will **decrease** coverage by `0.14%`.
   > The diff coverage is `92.43%`.
   
   [![Impacted file tree graph](https://codecov.io/gh/apache/airflow/pull/6850/graphs/tree.svg?width=650=WdLKlKHOAU=150=pr)](https://codecov.io/gh/apache/airflow/pull/6850?src=pr=tree)
   
   ```diff
   @@            Coverage Diff             @@
   ##           master    #6850      +/-   ##
   ==========================================
   - Coverage   85.41%   85.27%   -0.15%     
   ==========================================
     Files         753      711      -42     
     Lines       39685    39503     -182     
   ==========================================
   - Hits        33898    33687     -211     
   - Misses       5787     5816      +29
   ```
   
   
   | [Impacted Files](https://codecov.io/gh/apache/airflow/pull/6850?src=pr=tree) | Coverage Δ | |
   |---|---|---|
   | [airflow/sensors/sql\_sensor.py](https://codecov.io/gh/apache/airflow/pull/6850/diff?src=pr=tree#diff-YWlyZmxvdy9zZW5zb3JzL3NxbF9zZW5zb3IucHk=) | `100% <ø> (ø)` | :arrow_up: |
   | [airflow/models/connection.py](https://codecov.io/gh/apache/airflow/pull/6850/diff?src=pr=tree#diff-YWlyZmxvdy9tb2RlbHMvY29ubmVjdGlvbi5weQ==) | `77.4% <100%> (+8.62%)` | :arrow_up: |
   | [airflow/hooks/mssql\_hook.py](https://codecov.io/gh/apache/airflow/pull/6850/diff?src=pr=tree#diff-YWlyZmxvdy9ob29rcy9tc3NxbF9ob29rLnB5) | `76.19% <100%> (+17.36%)` | :arrow_up: |
   | [airflow/utils/helpers.py](https://codecov.io/gh/apache/airflow/pull/6850/diff?src=pr=tree#diff-YWlyZmxvdy91dGlscy9oZWxwZXJzLnB5) | `82.5% <100%> (+0.8%)` | :arrow_up: |
   | [airflow/utils/log/json\_formatter.py](https://codecov.io/gh/apache/airflow/pull/6850/diff?src=pr=tree#diff-YWlyZmxvdy91dGlscy9sb2cvanNvbl9mb3JtYXR0ZXIucHk=) | `100% <100%> (ø)` | :arrow_up: |
   | [airflow/operators/mssql\_operator.py](https://codecov.io/gh/apache/airflow/pull/6850/diff?src=pr=tree#diff-YWlyZmxvdy9vcGVyYXRvcnMvbXNzcWxfb3BlcmF0b3IucHk=) | `90.62% <85.71%> (+90.62%)` | :arrow_up: |
   | [airflow/providers/odbc/hooks/odbc.py](https://codecov.io/gh/apache/airflow/pull/6850/diff?src=pr=tree#diff-YWlyZmxvdy9wcm92aWRlcnMvb2RiYy9ob29rcy9vZGJjLnB5) | `92.22% <92.22%> (ø)` | |
   | [airflow/kubernetes/volume\_mount.py](https://codecov.io/gh/apache/airflow/pull/6850/diff?src=pr=tree#diff-YWlyZmxvdy9rdWJlcm5ldGVzL3ZvbHVtZV9tb3VudC5weQ==) | `44.44% <0%> (-55.56%)` | :arrow_down: |
   | [airflow/kubernetes/volume.py](https://codecov.io/gh/apache/airflow/pull/6850/diff?src=pr=tree#diff-YWlyZmxvdy9rdWJlcm5ldGVzL3ZvbHVtZS5weQ==) | `52.94% <0%> (-47.06%)` | :arrow_down: |
   | [airflow/kubernetes/pod\_launcher.py](https://codecov.io/gh/apache/airflow/pull/6850/diff?src=pr=tree#diff-YWlyZmxvdy9rdWJlcm5ldGVzL3BvZF9sYXVuY2hlci5weQ==) | `45.25% <0%> (-46.72%)` | :arrow_down: |
   | ... and [160 more](https://codecov.io/gh/apache/airflow/pull/6850/diff?src=pr=tree-more) | |
   
   --
   
   [Continue to review full report at Codecov](https://codecov.io/gh/apache/airflow/pull/6850?src=pr=continue).
   > **Legend** - [Click here to learn more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data`
   > Powered by [Codecov](https://codecov.io/gh/apache/airflow/pull/6850?src=pr=footer). Last update [5abce47...2fed850](https://codecov.io/gh/apache/airflow/pull/6850?src=pr=lastupdated). Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   




[GitHub] [airflow] larryzhu2018 commented on a change in pull request #7141: [AIRFLOW-6544] add log_id to end-of-file mark and also add an index config for logs

2020-01-17 Thread GitBox
larryzhu2018 commented on a change in pull request #7141: [AIRFLOW-6544] add 
log_id to end-of-file mark and also add an index config for logs
URL: https://github.com/apache/airflow/pull/7141#discussion_r368107397
 
 

 ##
 File path: airflow/utils/log/es_task_handler.py
 ##
 @@ -53,9 +53,11 @@ class ElasticsearchTaskHandler(FileTaskHandler, LoggingMixin):
     PAGE = 0
     MAX_LINE_PER_PAGE = 1000
 
+    # 16 is reasonable in this case
+    # pylint: disable-msg=too-many-arguments
     def __init__(self, base_log_folder, filename_template,
                  log_id_template, end_of_log_mark,
-                 write_stdout, json_format, json_fields,
 
 Review comment:
   The default right now is "*", which has the same semantics as before and is 
backward compatible.




[GitHub] [airflow] larryzhu2018 commented on a change in pull request #7141: [AIRFLOW-6544] add log_id to end-of-file mark and also add an index config for logs

2020-01-17 Thread GitBox
larryzhu2018 commented on a change in pull request #7141: [AIRFLOW-6544] add 
log_id to end-of-file mark and also add an index config for logs
URL: https://github.com/apache/airflow/pull/7141#discussion_r368107102
 
 

 ##
 File path: tests/utils/log/test_es_task_handler.py
 ##
 @@ -81,6 +85,17 @@ def setUp(self):
         self.ti.try_number = 1
         self.ti.state = State.RUNNING
         self.addCleanup(self.dag.clear)
+        self.index_name2 = "test_index2"
+        self.es_task_handler2 = ElasticsearchTaskHandler(
+            self.local_log_location,
 
 Review comment:
   Can you please suggest something more meaningful? I would be happy to 
accommodate. It is literally handler2 with test_index2. Because the index name 
is part of the handler, we need to use a separate log handler in order to use a 
different index name.




[GitHub] [airflow] larryzhu2018 commented on a change in pull request #7141: [AIRFLOW-6544] add log_id to end-of-file mark and also add an index config for logs

2020-01-17 Thread GitBox
larryzhu2018 commented on a change in pull request #7141: [AIRFLOW-6544] add 
log_id to end-of-file mark and also add an index config for logs
URL: https://github.com/apache/airflow/pull/7141#discussion_r368105431
 
 

 ##
 File path: tests/utils/log/test_es_task_handler.py
 ##
 @@ -262,21 +277,55 @@ def test_set_context_w_json_format_and_write_stdout(self):
         self.es_task_handler.json_format = True
         self.es_task_handler.set_context(self.ti)
 
-    def test_close(self):
-        formatter = logging.Formatter('%(asctime)s - %(name)s - %(levelname)s - %(message)s')
-        self.es_task_handler.formatter = formatter
+    def test_close_with_log_id(self):
+        es_task_handler = ElasticsearchTaskHandler(
+            self.local_log_location,
+            self.filename_template,
+            self.log_id_template,
+            self.end_of_log_mark,
+            self.write_stdout,
+            True,  # json_format
 
 Review comment:
   When you work with Elasticsearch, the JSON formatter is the way to go. We do 
not have a grok filter in the unit tests to parse non-JSON-format messages, 
index them, and look them up as I do in this test case. Realistically we 
should probably remove the non-JSON-format option; as far as I can tell, it is 
not practical to deploy it correctly.




[GitHub] [airflow] larryzhu2018 commented on a change in pull request #7141: [AIRFLOW-6544] add log_id to end-of-file mark and also add an index config for logs

2020-01-17 Thread GitBox
larryzhu2018 commented on a change in pull request #7141: [AIRFLOW-6544] add 
log_id to end-of-file mark and also add an index config for logs
URL: https://github.com/apache/airflow/pull/7141#discussion_r368103893
 
 

 ##
 File path: airflow/utils/log/es_task_handler.py
 ##
 @@ -255,7 +256,9 @@ def close(self):
 
         # Mark the end of file using end of log mark,
         # so we know where to stop while auto-tailing.
-        self.handler.stream.write(self.end_of_log_mark)
+        if self.write_stdout:
 
 Review comment:
   We use the literal string "end_of_log_for_airflow_task_instance" as the mark 
in our deployment. As I mentioned earlier, the current code does not work if 
the mark contains whitespace.




[GitHub] [airflow] larryzhu2018 commented on a change in pull request #7141: [AIRFLOW-6544] add log_id to end-of-file mark and also add an index config for logs

2020-01-17 Thread GitBox
larryzhu2018 commented on a change in pull request #7141: [AIRFLOW-6544] add 
log_id to end-of-file mark and also add an index config for logs
URL: https://github.com/apache/airflow/pull/7141#discussion_r368101978
 
 

 ##
 File path: airflow/utils/log/es_task_handler.py
 ##
 @@ -255,7 +256,9 @@ def close(self):
 
         # Mark the end of file using end of log mark,
         # so we know where to stop while auto-tailing.
-        self.handler.stream.write(self.end_of_log_mark)
+        if self.write_stdout:
+            print()
+        self.handler.emit(logging.makeLogRecord({'msg': self.end_of_log_mark}))
 
 
 Review comment:
   This works as long as you do not put whitespace into your end_of_log_mark. 
I think you would just be asking for trouble by putting whitespace characters 
into the end-of-log mark.
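The pattern under discussion — sending the end-of-log mark through the handler so it passes through the configured formatter, instead of writing it to the stream raw — can be sketched with plain `logging` (a standalone illustration, not the actual Airflow handler code; the mark value is an assumption):

```python
import io
import logging

# Keep the mark free of whitespace, per the discussion above.
END_OF_LOG_MARK = "end_of_log"

stream = io.StringIO()
handler = logging.StreamHandler(stream)
handler.setFormatter(logging.Formatter("%(message)s"))

# Emit the mark as a regular log record so the formatter and the handler's
# line terminator are applied, rather than writing to the stream directly.
handler.emit(logging.makeLogRecord({"msg": END_OF_LOG_MARK}))
handler.flush()

print(stream.getvalue())
```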




[GitHub] [airflow] codecov-io edited a comment on issue #7146: [AIRFLOW-6541] Use EmrJobFlowSensor for other states

2020-01-17 Thread GitBox
codecov-io edited a comment on issue #7146: [AIRFLOW-6541] Use EmrJobFlowSensor 
for other states
URL: https://github.com/apache/airflow/pull/7146#issuecomment-573448185
 
 
   # [Codecov](https://codecov.io/gh/apache/airflow/pull/7146?src=pr=h1) Report
   > Merging [#7146](https://codecov.io/gh/apache/airflow/pull/7146?src=pr=desc) into [master](https://codecov.io/gh/apache/airflow/commit/5abce471e0690c6b8d06ca25685b0845c5fd270f?src=pr=desc) will **decrease** coverage by `0.28%`.
   > The diff coverage is `91.42%`.
   
   [![Impacted file tree graph](https://codecov.io/gh/apache/airflow/pull/7146/graphs/tree.svg?width=650=WdLKlKHOAU=150=pr)](https://codecov.io/gh/apache/airflow/pull/7146?src=pr=tree)
   
   ```diff
   @@            Coverage Diff             @@
   ##           master    #7146      +/-   ##
   ==========================================
   - Coverage   85.41%   85.13%   -0.29%     
   ==========================================
     Files         753      753              
     Lines       39685    39705      +20     
   ==========================================
   - Hits        33898    33803      -95     
   - Misses       5787     5902     +115
   ```
   
   
   | [Impacted Files](https://codecov.io/gh/apache/airflow/pull/7146?src=pr=tree) | Coverage Δ | |
   |---|---|---|
   | [airflow/providers/amazon/aws/sensors/emr\_step.py](https://codecov.io/gh/apache/airflow/pull/7146/diff?src=pr=tree#diff-YWlyZmxvdy9wcm92aWRlcnMvYW1hem9uL2F3cy9zZW5zb3JzL2Vtcl9zdGVwLnB5) | `100% <100%> (ø)` | :arrow_up: |
   | [...rflow/providers/amazon/aws/sensors/emr\_job\_flow.py](https://codecov.io/gh/apache/airflow/pull/7146/diff?src=pr=tree#diff-YWlyZmxvdy9wcm92aWRlcnMvYW1hem9uL2F3cy9zZW5zb3JzL2Vtcl9qb2JfZmxvdy5weQ==) | `96.29% <100%> (+0.84%)` | :arrow_up: |
   | [airflow/providers/amazon/aws/sensors/emr\_base.py](https://codecov.io/gh/apache/airflow/pull/7146/diff?src=pr=tree#diff-YWlyZmxvdy9wcm92aWRlcnMvYW1hem9uL2F3cy9zZW5zb3JzL2Vtcl9iYXNlLnB5) | `91.66% <80%> (-8.34%)` | :arrow_down: |
   | [airflow/kubernetes/volume\_mount.py](https://codecov.io/gh/apache/airflow/pull/7146/diff?src=pr=tree#diff-YWlyZmxvdy9rdWJlcm5ldGVzL3ZvbHVtZV9tb3VudC5weQ==) | `44.44% <0%> (-55.56%)` | :arrow_down: |
   | [airflow/kubernetes/volume.py](https://codecov.io/gh/apache/airflow/pull/7146/diff?src=pr=tree#diff-YWlyZmxvdy9rdWJlcm5ldGVzL3ZvbHVtZS5weQ==) | `52.94% <0%> (-47.06%)` | :arrow_down: |
   | [airflow/kubernetes/pod\_launcher.py](https://codecov.io/gh/apache/airflow/pull/7146/diff?src=pr=tree#diff-YWlyZmxvdy9rdWJlcm5ldGVzL3BvZF9sYXVuY2hlci5weQ==) | `45.25% <0%> (-46.72%)` | :arrow_down: |
   | [airflow/kubernetes/refresh\_config.py](https://codecov.io/gh/apache/airflow/pull/7146/diff?src=pr=tree#diff-YWlyZmxvdy9rdWJlcm5ldGVzL3JlZnJlc2hfY29uZmlnLnB5) | `50.98% <0%> (-23.53%)` | :arrow_down: |
   | [...rflow/contrib/operators/kubernetes\_pod\_operator.py](https://codecov.io/gh/apache/airflow/pull/7146/diff?src=pr=tree#diff-YWlyZmxvdy9jb250cmliL29wZXJhdG9ycy9rdWJlcm5ldGVzX3BvZF9vcGVyYXRvci5weQ==) | `76.47% <0%> (-21.18%)` | :arrow_down: |
   
   --
   
   [Continue to review full report at 
Codecov](https://codecov.io/gh/apache/airflow/pull/7146?src=pr=continue).
   > **Legend** - [Click here to learn 
more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data`
   > Powered by 
[Codecov](https://codecov.io/gh/apache/airflow/pull/7146?src=pr=footer). 
Last update 
[5abce47...cc340ea](https://codecov.io/gh/apache/airflow/pull/7146?src=pr=lastupdated).
 Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   




[GitHub] [airflow] codecov-io edited a comment on issue #7146: [AIRFLOW-6541] Use EmrJobFlowSensor for other states

2020-01-17 Thread GitBox
codecov-io edited a comment on issue #7146: [AIRFLOW-6541] Use EmrJobFlowSensor 
for other states
URL: https://github.com/apache/airflow/pull/7146#issuecomment-573448185
 
 
   # [Codecov](https://codecov.io/gh/apache/airflow/pull/7146?src=pr=h1) 
Report
   > Merging 
[#7146](https://codecov.io/gh/apache/airflow/pull/7146?src=pr=desc) into 
[master](https://codecov.io/gh/apache/airflow/commit/5abce471e0690c6b8d06ca25685b0845c5fd270f?src=pr=desc)
 will **decrease** coverage by `0.49%`.
   > The diff coverage is `91.42%`.
   
   [![Impacted file tree 
graph](https://codecov.io/gh/apache/airflow/pull/7146/graphs/tree.svg?width=650=WdLKlKHOAU=150=pr)](https://codecov.io/gh/apache/airflow/pull/7146?src=pr=tree)
   
   ```diff
    @@            Coverage Diff            @@
    ##           master   #7146      +/-   ##
    =========================================
    - Coverage   85.41%   84.92%     -0.5%
    =========================================
      Files         753     753
      Lines       39685   39705       +20
    =========================================
    - Hits        33898   33719      -179
    - Misses       5787    5986      +199
   ```
   
   
   | [Impacted 
Files](https://codecov.io/gh/apache/airflow/pull/7146?src=pr=tree) | 
Coverage Δ | |
   |---|---|---|
   | 
[airflow/providers/amazon/aws/sensors/emr\_step.py](https://codecov.io/gh/apache/airflow/pull/7146/diff?src=pr=tree#diff-YWlyZmxvdy9wcm92aWRlcnMvYW1hem9uL2F3cy9zZW5zb3JzL2Vtcl9zdGVwLnB5)
 | `100% <100%> (ø)` | :arrow_up: |
   | 
[...rflow/providers/amazon/aws/sensors/emr\_job\_flow.py](https://codecov.io/gh/apache/airflow/pull/7146/diff?src=pr=tree#diff-YWlyZmxvdy9wcm92aWRlcnMvYW1hem9uL2F3cy9zZW5zb3JzL2Vtcl9qb2JfZmxvdy5weQ==)
 | `96.29% <100%> (+0.84%)` | :arrow_up: |
   | 
[airflow/providers/amazon/aws/sensors/emr\_base.py](https://codecov.io/gh/apache/airflow/pull/7146/diff?src=pr=tree#diff-YWlyZmxvdy9wcm92aWRlcnMvYW1hem9uL2F3cy9zZW5zb3JzL2Vtcl9iYXNlLnB5)
 | `91.66% <80%> (-8.34%)` | :arrow_down: |
   | 
[airflow/operators/mysql\_operator.py](https://codecov.io/gh/apache/airflow/pull/7146/diff?src=pr=tree#diff-YWlyZmxvdy9vcGVyYXRvcnMvbXlzcWxfb3BlcmF0b3IucHk=)
 | `0% <0%> (-100%)` | :arrow_down: |
   | 
[airflow/operators/mysql\_to\_hive.py](https://codecov.io/gh/apache/airflow/pull/7146/diff?src=pr=tree#diff-YWlyZmxvdy9vcGVyYXRvcnMvbXlzcWxfdG9faGl2ZS5weQ==)
 | `0% <0%> (-100%)` | :arrow_down: |
   | 
[airflow/kubernetes/volume\_mount.py](https://codecov.io/gh/apache/airflow/pull/7146/diff?src=pr=tree#diff-YWlyZmxvdy9rdWJlcm5ldGVzL3ZvbHVtZV9tb3VudC5weQ==)
 | `44.44% <0%> (-55.56%)` | :arrow_down: |
   | 
[airflow/kubernetes/volume.py](https://codecov.io/gh/apache/airflow/pull/7146/diff?src=pr=tree#diff-YWlyZmxvdy9rdWJlcm5ldGVzL3ZvbHVtZS5weQ==)
 | `52.94% <0%> (-47.06%)` | :arrow_down: |
   | 
[airflow/kubernetes/pod\_launcher.py](https://codecov.io/gh/apache/airflow/pull/7146/diff?src=pr=tree#diff-YWlyZmxvdy9rdWJlcm5ldGVzL3BvZF9sYXVuY2hlci5weQ==)
 | `45.25% <0%> (-46.72%)` | :arrow_down: |
   | 
[airflow/kubernetes/refresh\_config.py](https://codecov.io/gh/apache/airflow/pull/7146/diff?src=pr=tree#diff-YWlyZmxvdy9rdWJlcm5ldGVzL3JlZnJlc2hfY29uZmlnLnB5)
 | `50.98% <0%> (-23.53%)` | :arrow_down: |
   | 
[...rflow/contrib/operators/kubernetes\_pod\_operator.py](https://codecov.io/gh/apache/airflow/pull/7146/diff?src=pr=tree#diff-YWlyZmxvdy9jb250cmliL29wZXJhdG9ycy9rdWJlcm5ldGVzX3BvZF9vcGVyYXRvci5weQ==)
 | `76.47% <0%> (-21.18%)` | :arrow_down: |
   | ... and [5 
more](https://codecov.io/gh/apache/airflow/pull/7146/diff?src=pr=tree-more) 
| |
   
   --
   
   [Continue to review full report at 
Codecov](https://codecov.io/gh/apache/airflow/pull/7146?src=pr=continue).
   > **Legend** - [Click here to learn 
more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data`
   > Powered by 
[Codecov](https://codecov.io/gh/apache/airflow/pull/7146?src=pr=footer). 
Last update 
[5abce47...cc340ea](https://codecov.io/gh/apache/airflow/pull/7146?src=pr=lastupdated).
 Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   




[GitHub] [airflow] pingzh commented on a change in pull request #7141: [AIRFLOW-6544] add log_id to end-of-file mark and also add an index config for logs

2020-01-17 Thread GitBox
pingzh commented on a change in pull request #7141: [AIRFLOW-6544] add log_id 
to end-of-file mark and also add an index config for logs
URL: https://github.com/apache/airflow/pull/7141#discussion_r368094796
 
 

 ##
 File path: airflow/utils/log/es_task_handler.py
 ##
 @@ -53,9 +53,11 @@ class ElasticsearchTaskHandler(FileTaskHandler, LoggingMixin):
     PAGE = 0
     MAX_LINE_PER_PAGE = 1000
 
+    # 16 is reasonable in this case
+    # pylint: disable-msg=too-many-arguments
     def __init__(self, base_log_folder, filename_template,
                  log_id_template, end_of_log_mark,
-                 write_stdout, json_format, json_fields,
 
 Review comment:
  We may want to add a default value for `index` to stay backward compatible. 
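The backward-compatibility point can be sketched like this (the class name, parameter order, and default value below are illustrative assumptions, not Airflow's actual signature):

```python
# Hypothetical handler: giving the new `index` parameter a default means
# call sites written before the parameter existed keep working unchanged.
class EsTaskHandlerSketch:
    def __init__(self, base_log_folder, end_of_log_mark,
                 write_stdout=False, json_format=False,
                 index='airflow-logs'):  # new arg; default keeps old calls valid
        self.base_log_folder = base_log_folder
        self.end_of_log_mark = end_of_log_mark
        self.write_stdout = write_stdout
        self.json_format = json_format
        self.index = index

# An old call site that never heard of `index` still constructs fine:
legacy = EsTaskHandlerSketch('/tmp/logs', u'\u0004\n')
assert legacy.index == 'airflow-logs'

# New call sites can opt in explicitly:
custom = EsTaskHandlerSketch('/tmp/logs', u'\u0004\n', index='my-logs')
assert custom.index == 'my-logs'
```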




[GitHub] [airflow] pingzh commented on a change in pull request #7141: [AIRFLOW-6544] add log_id to end-of-file mark and also add an index config for logs

2020-01-17 Thread GitBox
pingzh commented on a change in pull request #7141: [AIRFLOW-6544] add log_id 
to end-of-file mark and also add an index config for logs
URL: https://github.com/apache/airflow/pull/7141#discussion_r368094024
 
 

 ##
 File path: tests/utils/log/test_es_task_handler.py
 ##
 @@ -81,6 +85,17 @@ def setUp(self):
         self.ti.try_number = 1
         self.ti.state = State.RUNNING
         self.addCleanup(self.dag.clear)
+        self.index_name2 = "test_index2"
+        self.es_task_handler2 = ElasticsearchTaskHandler(
+            self.local_log_location,
 
 Review comment:
  Can you rename `es_task_handler2` to something more specific?




[GitHub] [airflow] pingzh commented on a change in pull request #7141: [AIRFLOW-6544] add log_id to end-of-file mark and also add an index config for logs

2020-01-17 Thread GitBox
pingzh commented on a change in pull request #7141: [AIRFLOW-6544] add log_id 
to end-of-file mark and also add an index config for logs
URL: https://github.com/apache/airflow/pull/7141#discussion_r368095723
 
 

 ##
 File path: tests/utils/log/test_es_task_handler.py
 ##
 @@ -262,21 +277,55 @@ def test_set_context_w_json_format_and_write_stdout(self):
         self.es_task_handler.json_format = True
         self.es_task_handler.set_context(self.ti)
 
-    def test_close(self):
-        formatter = logging.Formatter('%(asctime)s - %(name)s - %(levelname)s - %(message)s')
 
 Review comment:
  I think we should not remove this test, as it is still a valid test case with 
the formatter set.




[GitHub] [airflow] pingzh commented on a change in pull request #7141: [AIRFLOW-6544] add log_id to end-of-file mark and also add an index config for logs

2020-01-17 Thread GitBox
pingzh commented on a change in pull request #7141: [AIRFLOW-6544] add log_id 
to end-of-file mark and also add an index config for logs
URL: https://github.com/apache/airflow/pull/7141#discussion_r368096806
 
 

 ##
 File path: tests/utils/log/test_es_task_handler.py
 ##
 @@ -262,21 +277,55 @@ def test_set_context_w_json_format_and_write_stdout(self):
         self.es_task_handler.json_format = True
         self.es_task_handler.set_context(self.ti)
 
-    def test_close(self):
-        formatter = logging.Formatter('%(asctime)s - %(name)s - %(levelname)s - %(message)s')
 
 Review comment:
  Also, your test only focuses on `json_format`.




[GitHub] [airflow] lucafuji edited a comment on issue #6870: [AIRFLOW-0578] Check return code

2020-01-17 Thread GitBox
lucafuji edited a comment on issue #6870: [AIRFLOW-0578] Check return code
URL: https://github.com/apache/airflow/pull/6870#issuecomment-575732495
 
 
   > I'm not sure your test as it stands covers the new code - the BashOperator 
handles the return code for that command already.
   > 
   > I think what you need to test is a Python operator that calls 
`os._exit(1)` - but the real way of testing would be to run the test without 
calling your new on_failure, see it fail, then run it again with it enabled
   
   On 
https://github.com/apache/airflow/blob/master/airflow/operators/bash_operator.py#L137,
 the bash operator will throw an exception if the return code is non-zero.
   
   However, on 
https://github.com/apache/airflow/blob/master/airflow/task/task_runner/standard_task_runner.py#L85,
 it will catch this exception and return a return code of 1. This is what this 
PR is trying to solve. This return code is not handled properly by 
local_task_job. And my test case did cover this.
   
   But yep, I can still add another task with a PythonOperator to see whether 
it works.
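The failure mode under discussion can be reproduced with a minimal sketch (the child command is illustrative): `os._exit(1)` bypasses Python-level exception handling entirely, so the supervising process only ever sees the return code and must check it itself.

```python
import subprocess
import sys

# Child process that dies the way the PR discussion describes: os._exit()
# skips try/except blocks, atexit hooks, and normal interpreter teardown.
proc = subprocess.run([sys.executable, '-c', 'import os; os._exit(1)'])

# A supervisor (local_task_job in the discussion) only sees the code:
assert proc.returncode == 1

# Treating any non-zero code as task failure is the behavior being argued for:
task_failed = proc.returncode != 0
assert task_failed
```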




[GitHub] [airflow] pingzh commented on a change in pull request #7141: [AIRFLOW-6544] add log_id to end-of-file mark and also add an index config for logs

2020-01-17 Thread GitBox
pingzh commented on a change in pull request #7141: [AIRFLOW-6544] add log_id 
to end-of-file mark and also add an index config for logs
URL: https://github.com/apache/airflow/pull/7141#discussion_r368092985
 
 

 ##
 File path: airflow/utils/log/es_task_handler.py
 ##
 @@ -255,7 +256,9 @@ def close(self):
 
         # Mark the end of file using end of log mark,
         # so we know where to stop while auto-tailing.
-        self.handler.stream.write(self.end_of_log_mark)
+        if self.write_stdout:
+            print()
+        self.handler.emit(logging.makeLogRecord({'msg': self.end_of_log_mark}))
 
 
 Review comment:
   ```
else logs[-1].message == self.end_of_log_mark.strip()
   ```
   I think this will not work if the `end_of_log_mark` is wrapped as a log 
record, as the `message` will be in the format that Kevin mentioned, and it 
also depends on the log formatter.
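A small sketch of why that comparison breaks (illustrative; the formatter string is the one mentioned elsewhere in this thread): once the mark is emitted as a log record, the *formatted* line carries timestamp/name/level prefixes, so it no longer equals the bare mark.

```python
import logging

mark = u'\u0004\n'
record = logging.makeLogRecord({'msg': mark})
formatter = logging.Formatter(
    '%(asctime)s - %(name)s - %(levelname)s - %(message)s')

# format() prepends the asctime/name/levelname fields to the raw message:
formatted = formatter.format(record)

# The formatted line is "prefixes + mark", not the mark alone:
assert formatted != mark
assert formatted.endswith(mark)  # the mark still trails the formatted prefixes
```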




[GitHub] [airflow] pingzh commented on a change in pull request #7141: [AIRFLOW-6544] add log_id to end-of-file mark and also add an index config for logs

2020-01-17 Thread GitBox
pingzh commented on a change in pull request #7141: [AIRFLOW-6544] add log_id 
to end-of-file mark and also add an index config for logs
URL: https://github.com/apache/airflow/pull/7141#discussion_r368090344
 
 

 ##
 File path: tests/utils/log/test_es_task_handler.py
 ##
 @@ -262,21 +277,55 @@ def test_set_context_w_json_format_and_write_stdout(self):
         self.es_task_handler.json_format = True
         self.es_task_handler.set_context(self.ti)
 
-    def test_close(self):
-        formatter = logging.Formatter('%(asctime)s - %(name)s - %(levelname)s - %(message)s')
-        self.es_task_handler.formatter = formatter
+    def test_close_with_log_id(self):
+        es_task_handler = ElasticsearchTaskHandler(
+            self.local_log_location,
+            self.filename_template,
+            self.log_id_template,
+            self.end_of_log_mark,
+            self.write_stdout,
+            True,  # json_format
 
 Review comment:
  Can you also add another test with a non-JSON format, and set the formatter 
to `logging.Formatter('%(asctime)s - %(name)s - %(levelname)s - %(message)s')`?




[GitHub] [airflow] pingzh commented on a change in pull request #7141: [AIRFLOW-6544] add log_id to end-of-file mark and also add an index config for logs

2020-01-17 Thread GitBox
pingzh commented on a change in pull request #7141: [AIRFLOW-6544] add log_id 
to end-of-file mark and also add an index config for logs
URL: https://github.com/apache/airflow/pull/7141#discussion_r368087731
 
 

 ##
 File path: airflow/utils/log/es_task_handler.py
 ##
 @@ -255,7 +256,9 @@ def close(self):
 
         # Mark the end of file using end of log mark,
         # so we know where to stop while auto-tailing.
-        self.handler.stream.write(self.end_of_log_mark)
+        if self.write_stdout:
 
 Review comment:
  What is your end_of_log_mark? We are using `END_OF_LOG_MARK = u'\u0004\n'`; 
can you try this end-of-log mark?




[GitHub] [airflow] houqp commented on a change in pull request #7187: [AIRFLOW-6576] fix scheduler crash caused by deleted task with sla misses

2020-01-17 Thread GitBox
houqp commented on a change in pull request #7187: [AIRFLOW-6576] fix scheduler 
crash caused by deleted task with sla misses
URL: https://github.com/apache/airflow/pull/7187#discussion_r368085595
 
 

 ##
 File path: airflow/jobs/scheduler_job.py
 ##
 @@ -429,7 +429,15 @@ def manage_slas(self, dag, session=None):
 """.format(task_list=task_list, 
blocking_task_list=blocking_task_list,
bug=asciiart.bug)
 
-tasks_missed_sla = [dag.get_task(sla.task_id) for sla in slas]
+tasks_missed_sla = []
+for sla in slas:
+try:
+task = dag.get_task(sla.task_id)
+except AirflowException:
+# task already deleted from DAG, skip it
 
 Review comment:
  Yeah, that's the first thing that came to my mind; I expected that method to 
throw a more specific exception. I think we should have a clean-up PR to change 
that method's behavior.
   
   Ideally, I think it should just return None if the task is not found, just 
like a dictionary. Then the caller can explicitly check the return value and 
handle the edge case. This way, we won't run into issues like this going 
forward.
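A sketch of the suggested API change (assumed names; per the discussion, Airflow's `DAG.get_task` currently raises instead): returning `None` for a missing task, like `dict.get`, lets the SLA loop skip deleted tasks without a broad `except`.

```python
class DagSketch:
    """Illustrative stand-in for a DAG holding task_id -> task objects."""
    def __init__(self, tasks):
        self._task_dict = dict(tasks)

    def get_task(self, task_id):
        # dict.get semantics: None for a missing task instead of raising.
        return self._task_dict.get(task_id)

dag = DagSketch({'extract': 'extract-task', 'load': 'load-task'})

# The scheduler loop from the diff, rewritten against the None-returning API:
sla_task_ids = ['extract', 'deleted_task', 'load']
tasks_missed_sla = []
for task_id in sla_task_ids:
    task = dag.get_task(task_id)
    if task is None:
        # Task already deleted from the DAG: skip it instead of crashing.
        continue
    tasks_missed_sla.append(task)

assert tasks_missed_sla == ['extract-task', 'load-task']
```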




[GitHub] [airflow] codecov-io commented on issue #7138: [AIRFLOW-5912] Expose lineage API

2020-01-17 Thread GitBox
codecov-io commented on issue #7138: [AIRFLOW-5912] Expose lineage API
URL: https://github.com/apache/airflow/pull/7138#issuecomment-575737364
 
 
   # [Codecov](https://codecov.io/gh/apache/airflow/pull/7138?src=pr=h1) 
Report
   > Merging 
[#7138](https://codecov.io/gh/apache/airflow/pull/7138?src=pr=desc) into 
[master](https://codecov.io/gh/apache/airflow/commit/d654d69d7794a57c5c51685a8a96f1d7c38c2302?src=pr=desc)
 will **increase** coverage by `0.15%`.
   > The diff coverage is `77.02%`.
   
   [![Impacted file tree 
graph](https://codecov.io/gh/apache/airflow/pull/7138/graphs/tree.svg?width=650=WdLKlKHOAU=150=pr)](https://codecov.io/gh/apache/airflow/pull/7138?src=pr=tree)
   
   ```diff
    @@            Coverage Diff            @@
    ##           master   #7138      +/-   ##
    =========================================
    + Coverage   85.24%   85.4%     +0.15%
    =========================================
      Files         683     755        +72
      Lines       39155   39758      +603
    =========================================
    + Hits        33378   33954      +576
    - Misses       5777    5804       +27
   ```
   
   
   | [Impacted 
Files](https://codecov.io/gh/apache/airflow/pull/7138?src=pr=tree) | 
Coverage Δ | |
   |---|---|---|
   | 
[airflow/api/client/json\_client.py](https://codecov.io/gh/apache/airflow/pull/7138/diff?src=pr=tree#diff-YWlyZmxvdy9hcGkvY2xpZW50L2pzb25fY2xpZW50LnB5)
 | `0% <0%> (ø)` | :arrow_up: |
   | 
[airflow/www/api/experimental/endpoints.py](https://codecov.io/gh/apache/airflow/pull/7138/diff?src=pr=tree#diff-YWlyZmxvdy93d3cvYXBpL2V4cGVyaW1lbnRhbC9lbmRwb2ludHMucHk=)
 | `89.81% <100%> (+1.09%)` | :arrow_up: |
   | 
[airflow/lineage/\_\_init\_\_.py](https://codecov.io/gh/apache/airflow/pull/7138/diff?src=pr=tree#diff-YWlyZmxvdy9saW5lYWdlL19faW5pdF9fLnB5)
 | `93.84% <100%> (ø)` | :arrow_up: |
   | 
[airflow/api/client/local\_client.py](https://codecov.io/gh/apache/airflow/pull/7138/diff?src=pr=tree#diff-YWlyZmxvdy9hcGkvY2xpZW50L2xvY2FsX2NsaWVudC5weQ==)
 | `92% <50%> (-8%)` | :arrow_down: |
   | 
[airflow/api/client/api\_client.py](https://codecov.io/gh/apache/airflow/pull/7138/diff?src=pr=tree#diff-YWlyZmxvdy9hcGkvY2xpZW50L2FwaV9jbGllbnQucHk=)
 | `63.15% <50%> (-1.55%)` | :arrow_down: |
   | 
[airflow/example\_dags/example\_papermill\_operator.py](https://codecov.io/gh/apache/airflow/pull/7138/diff?src=pr=tree#diff-YWlyZmxvdy9leGFtcGxlX2RhZ3MvZXhhbXBsZV9wYXBlcm1pbGxfb3BlcmF0b3IucHk=)
 | `68.18% <68.18%> (ø)` | |
   | 
[airflow/api/common/experimental/get\_lineage.py](https://codecov.io/gh/apache/airflow/pull/7138/diff?src=pr=tree#diff-YWlyZmxvdy9hcGkvY29tbW9uL2V4cGVyaW1lbnRhbC9nZXRfbGluZWFnZS5weQ==)
 | `89.47% <89.47%> (ø)` | |
   | 
[...rflow/contrib/sensors/sagemaker\_training\_sensor.py](https://codecov.io/gh/apache/airflow/pull/7138/diff?src=pr=tree#diff-YWlyZmxvdy9jb250cmliL3NlbnNvcnMvc2FnZW1ha2VyX3RyYWluaW5nX3NlbnNvci5weQ==)
 | `0% <0%> (-100%)` | :arrow_down: |
   | 
[airflow/contrib/hooks/azure\_data\_lake\_hook.py](https://codecov.io/gh/apache/airflow/pull/7138/diff?src=pr=tree#diff-YWlyZmxvdy9jb250cmliL2hvb2tzL2F6dXJlX2RhdGFfbGFrZV9ob29rLnB5)
 | `0% <0%> (-93.11%)` | :arrow_down: |
   | 
[airflow/contrib/sensors/azure\_cosmos\_sensor.py](https://codecov.io/gh/apache/airflow/pull/7138/diff?src=pr=tree#diff-YWlyZmxvdy9jb250cmliL3NlbnNvcnMvYXp1cmVfY29zbW9zX3NlbnNvci5weQ==)
 | `0% <0%> (-81.25%)` | :arrow_down: |
   | ... and [228 
more](https://codecov.io/gh/apache/airflow/pull/7138/diff?src=pr=tree-more) 
| |
   
   --
   
   [Continue to review full report at 
Codecov](https://codecov.io/gh/apache/airflow/pull/7138?src=pr=continue).
   > **Legend** - [Click here to learn 
more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data`
   > Powered by 
[Codecov](https://codecov.io/gh/apache/airflow/pull/7138?src=pr=footer). 
Last update 
[d654d69...a55ee87](https://codecov.io/gh/apache/airflow/pull/7138?src=pr=lastupdated).
 Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   




[GitHub] [airflow] codecov-io edited a comment on issue #7160: [AIRFLOW-1467] Dynamic pooling via allowing tasks to use more than one pool slot (depending upon the need)

2020-01-17 Thread GitBox
codecov-io edited a comment on issue #7160: [AIRFLOW-1467] Dynamic pooling via 
allowing tasks to use more than one pool slot (depending upon the need)
URL: https://github.com/apache/airflow/pull/7160#issuecomment-574096290
 
 
   # [Codecov](https://codecov.io/gh/apache/airflow/pull/7160?src=pr=h1) 
Report
   > Merging 
[#7160](https://codecov.io/gh/apache/airflow/pull/7160?src=pr=desc) into 
[master](https://codecov.io/gh/apache/airflow/commit/5abce471e0690c6b8d06ca25685b0845c5fd270f?src=pr=desc)
 will **decrease** coverage by `0.99%`.
   > The diff coverage is `100%`.
   
   [![Impacted file tree 
graph](https://codecov.io/gh/apache/airflow/pull/7160/graphs/tree.svg?width=650=WdLKlKHOAU=150=pr)](https://codecov.io/gh/apache/airflow/pull/7160?src=pr=tree)
   
   ```diff
    @@            Coverage Diff            @@
    ##           master   #7160      +/-   ##
    =========================================
    - Coverage   85.41%   84.42%       -1%
    =========================================
      Files         753     753
      Lines       39685   39693        +8
    =========================================
    - Hits        33898   33509      -389
    - Misses       5787    6184      +397
   ```
   
   
   | [Impacted 
Files](https://codecov.io/gh/apache/airflow/pull/7160?src=pr=tree) | 
Coverage Δ | |
   |---|---|---|
   | 
[airflow/models/pool.py](https://codecov.io/gh/apache/airflow/pull/7160/diff?src=pr=tree#diff-YWlyZmxvdy9tb2RlbHMvcG9vbC5weQ==)
 | `97.36% <ø> (ø)` | :arrow_up: |
   | 
[airflow/models/baseoperator.py](https://codecov.io/gh/apache/airflow/pull/7160/diff?src=pr=tree#diff-YWlyZmxvdy9tb2RlbHMvYmFzZW9wZXJhdG9yLnB5)
 | `96.28% <100%> (+0.02%)` | :arrow_up: |
   | 
[airflow/models/taskinstance.py](https://codecov.io/gh/apache/airflow/pull/7160/diff?src=pr=tree#diff-YWlyZmxvdy9tb2RlbHMvdGFza2luc3RhbmNlLnB5)
 | `94.96% <100%> (+0.03%)` | :arrow_up: |
   | 
[airflow/ti\_deps/deps/pool\_slots\_available\_dep.py](https://codecov.io/gh/apache/airflow/pull/7160/diff?src=pr=tree#diff-YWlyZmxvdy90aV9kZXBzL2RlcHMvcG9vbF9zbG90c19hdmFpbGFibGVfZGVwLnB5)
 | `100% <100%> (ø)` | :arrow_up: |
   | 
[airflow/operators/mysql\_operator.py](https://codecov.io/gh/apache/airflow/pull/7160/diff?src=pr=tree#diff-YWlyZmxvdy9vcGVyYXRvcnMvbXlzcWxfb3BlcmF0b3IucHk=)
 | `0% <0%> (-100%)` | :arrow_down: |
   | 
[airflow/operators/mysql\_to\_hive.py](https://codecov.io/gh/apache/airflow/pull/7160/diff?src=pr=tree#diff-YWlyZmxvdy9vcGVyYXRvcnMvbXlzcWxfdG9faGl2ZS5weQ==)
 | `0% <0%> (-100%)` | :arrow_down: |
   | 
[...flow/providers/apache/cassandra/hooks/cassandra.py](https://codecov.io/gh/apache/airflow/pull/7160/diff?src=pr=tree#diff-YWlyZmxvdy9wcm92aWRlcnMvYXBhY2hlL2Nhc3NhbmRyYS9ob29rcy9jYXNzYW5kcmEucHk=)
 | `21.51% <0%> (-72.16%)` | :arrow_down: |
   | 
[airflow/kubernetes/volume\_mount.py](https://codecov.io/gh/apache/airflow/pull/7160/diff?src=pr=tree#diff-YWlyZmxvdy9rdWJlcm5ldGVzL3ZvbHVtZV9tb3VudC5weQ==)
 | `44.44% <0%> (-55.56%)` | :arrow_down: |
   | 
[airflow/api/auth/backend/kerberos\_auth.py](https://codecov.io/gh/apache/airflow/pull/7160/diff?src=pr=tree#diff-YWlyZmxvdy9hcGkvYXV0aC9iYWNrZW5kL2tlcmJlcm9zX2F1dGgucHk=)
 | `28.16% <0%> (-54.93%)` | :arrow_down: |
   | 
[...irflow/contrib/operators/redis\_publish\_operator.py](https://codecov.io/gh/apache/airflow/pull/7160/diff?src=pr=tree#diff-YWlyZmxvdy9jb250cmliL29wZXJhdG9ycy9yZWRpc19wdWJsaXNoX29wZXJhdG9yLnB5)
 | `50% <0%> (-50%)` | :arrow_down: |
   | ... and [16 
more](https://codecov.io/gh/apache/airflow/pull/7160/diff?src=pr=tree-more) 
| |
   
   --
   
   [Continue to review full report at 
Codecov](https://codecov.io/gh/apache/airflow/pull/7160?src=pr=continue).
   > **Legend** - [Click here to learn 
more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data`
   > Powered by 
[Codecov](https://codecov.io/gh/apache/airflow/pull/7160?src=pr=footer). 
Last update 
[5abce47...3a5351d](https://codecov.io/gh/apache/airflow/pull/7160?src=pr=lastupdated).
 Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   




[GitHub] [airflow] lucafuji commented on issue #6870: [AIRFLOW-0578] Check return code

2020-01-17 Thread GitBox
lucafuji commented on issue #6870: [AIRFLOW-0578] Check return code
URL: https://github.com/apache/airflow/pull/6870#issuecomment-575732495
 
 
   > I'm not sure your test as it stands covers the new code - the BashOperator 
handles the return code for that command already.
   > 
   > I think what you need to test is a Python operator that calls 
`os._exit(1)` - but the real way of testing would be to run the test without 
calling your new on_failure, see it fail, then run it again with it enabled
   
   On 
https://github.com/apache/airflow/blob/master/airflow/operators/bash_operator.py#L137,
 the bash operator will throw an exception if the return code is non-zero.
   
   However, on 
https://github.com/apache/airflow/blob/master/airflow/task/task_runner/standard_task_runner.py#L85,
 it will catch this exception and return a return code of 1. This is what this 
PR is trying to solve. This return code is not handled properly by 
local_task_job. 
   
   But yep, I can still add another task with a PythonOperator to see whether 
it works.




[GitHub] [airflow] kaxil commented on issue #7162: [AIRFLOW-6557] Add test for newly added fields in BaseOperator

2020-01-17 Thread GitBox
kaxil commented on issue #7162: [AIRFLOW-6557] Add test for newly added fields 
in BaseOperator
URL: https://github.com/apache/airflow/pull/7162#issuecomment-575729720
 
 
   @bolkedebruin
   
   I'd be curious to know why you feel that way. And what would you recommend 
as an alternative?




[GitHub] [airflow] potiuk commented on issue #7138: [AIRFLOW-5912] Expose lineage API

2020-01-17 Thread GitBox
potiuk commented on issue #7138: [AIRFLOW-5912] Expose lineage API
URL: https://github.com/apache/airflow/pull/7138#issuecomment-575726019
 
 
   Will do this weekend. It seems that the Travis errors are not completely 
gone - we have just one thing left (CoreDNS configuration in Kind) to fix in 
order to be ready to move to GitHub Actions.




[GitHub] [airflow] potiuk commented on issue #7162: [AIRFLOW-6557] Add test for newly added fields in BaseOperator

2020-01-17 Thread GitBox
potiuk commented on issue #7162: [AIRFLOW-6557] Add test for newly added fields 
in BaseOperator
URL: https://github.com/apache/airflow/pull/7162#issuecomment-575724850
 
 
   > Just dropped a thought. I am not happy with the choice for JSON Schema.
   
   Not that I am for JSON schema (I was barely involved in DAG serialisation). 
For me it's simply one of the options one could consider (alongside Protocol 
Buffers for example). But I am really curious what your arguments against it 
would be, @bolkedebruin - it seems you have a pretty strong opinion.




[GitHub] [airflow] vsoch commented on a change in pull request #7191: [AIRFLOW-4030] second attempt to add singularity to airflow

2020-01-17 Thread GitBox
vsoch commented on a change in pull request #7191: [AIRFLOW-4030] second 
attempt to add singularity to airflow
URL: https://github.com/apache/airflow/pull/7191#discussion_r368029609
 
 

 ##
 File path: airflow/contrib/operators/singularity_operator.py
 ##
 @@ -0,0 +1,176 @@
+# -*- coding: utf-8 -*-
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+from airflow.exceptions import AirflowException
+from airflow.models import BaseOperator
+from airflow.utils.decorators import apply_defaults
+from spython.main import Client
+import shutil
+import ast
+import os
+
+
+class SingularityOperator(BaseOperator):
+    """
+    Execute a command inside a Singularity container
+
+    Singularity has more seamless connection to the host than Docker, so
+    no special binds are needed to ensure binding content in the user $HOME
+    and temporary directories. If the user needs custom binds, this can
+    be done with --volumes
+
+    :param image: Singularity image or URI from which to create the container.
+    :type image: str
+    :param auto_remove: Delete the container when the process exits.
+        The default is False.
+    :type auto_remove: bool
+    :param command: Command to be run in the container. (templated)
+    :type command: str or list
+    :param start_command: start command to pass to the container instance
+    :type start_command: str or list
+    :param environment: Environment variables to set in the container. (templated)
+    :type environment: dict
+    :param force_pull: Pull the image on every run. Default is False.
+    :type force_pull: bool
+    :param volumes: List of volumes to mount into the container, e.g.
+        ``['/host/path:/container/path', '/host/path2:/container/path2']``.
+    :type volumes: list
+    :param options: other flags (list) to provide to the instance start
+    :type options: list
+    :param working_dir: Working directory to set on the container
+        (equivalent to the -w switch of the docker client)
+    :type working_dir: str
+    """
+    template_fields = ('command', 'environment',)
+    template_ext = ('.sh', '.bash',)
+
+    @apply_defaults
+    def __init__(
+            self,
+            image,
+            api_version=None,
+            command=None,
+            start_command=None,
+            environment=None,
+            pull_folder=None,
+            force_pull=False,
+            volumes=None,
+            options=None,
+            working_dir=None,
+            auto_remove=False,
 
 Review comment:
   I don't see examples for other contrib operators (just looked at Docker) and 
I don't have experience using them, so I need to ask for help here. I can see 
that the imports are generally `from typing import ...` but I'm not sure how to 
use them.
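As a hedged illustration of the question above (hypothetical class name, airflow imports deliberately omitted so the snippet is self-contained), the `from typing import ...` annotations could look roughly like this on a constructor of this shape:

```python
from typing import Any, Dict, List, Optional, Union


class SingularityOperatorSketch:
    """Illustrative stand-in for the operator; not the real airflow class."""

    def __init__(self,
                 image: str,
                 command: Optional[Union[str, List[str]]] = None,
                 environment: Optional[Dict[str, str]] = None,
                 volumes: Optional[List[str]] = None,
                 force_pull: bool = False,
                 auto_remove: bool = False,
                 **kwargs: Any) -> None:
        # Mirror the defaults used in the diff above: mutable defaults are
        # created inside __init__ rather than in the signature.
        self.image = image
        self.command = command
        self.environment = environment or {}
        self.volumes = volumes or []
        self.force_pull = force_pull
        self.auto_remove = auto_remove
```

The same pattern would extend to the remaining parameters (`start_command`, `options`, `working_dir`, and so on).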




[GitHub] [airflow] vsoch commented on a change in pull request #7191: [AIRFLOW-4030] second attempt to add singularity to airflow

2020-01-17 Thread GitBox
vsoch commented on a change in pull request #7191: [AIRFLOW-4030] second 
attempt to add singularity to airflow
URL: https://github.com/apache/airflow/pull/7191#discussion_r368025594
 
 

 ##
 File path: airflow/contrib/operators/singularity_operator.py
 ##
 @@ -0,0 +1,176 @@
+# -*- coding: utf-8 -*-
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+from airflow.exceptions import AirflowException
+from airflow.models import BaseOperator
+from airflow.utils.decorators import apply_defaults
+from spython.main import Client
+import shutil
+import ast
+import os
+
+
+class SingularityOperator(BaseOperator):
+    """
+    Execute a command inside a Singularity container
+
+    Singularity has more seamless connection to the host than Docker, so
+    no special binds are needed to ensure binding content in the user $HOME
+    and temporary directories. If the user needs custom binds, this can
+    be done with --volumes
+
+    :param image: Singularity image or URI from which to create the container.
+    :type image: str
+    :param auto_remove: Delete the container when the process exits.
+        The default is False.
+    :type auto_remove: bool
+    :param command: Command to be run in the container. (templated)
+    :type command: str or list
+    :param start_command: start command to pass to the container instance
+    :type start_command: str or list
+    :param environment: Environment variables to set in the container. (templated)
+    :type environment: dict
+    :param force_pull: Pull the image on every run. Default is False.
+    :type force_pull: bool
+    :param volumes: List of volumes to mount into the container, e.g.
+        ``['/host/path:/container/path', '/host/path2:/container/path2']``.
+    :type volumes: list
+    :param options: other flags (list) to provide to the instance start
+    :type options: list
+    :param working_dir: Working directory to set on the container
+        (equivalent to the -w switch of the docker client)
+    :type working_dir: str
+    """
+    template_fields = ('command', 'environment',)
+    template_ext = ('.sh', '.bash',)
+
+    @apply_defaults
+    def __init__(
+            self,
+            image,
+            api_version=None,
+            command=None,
+            start_command=None,
+            environment=None,
+            pull_folder=None,
+            force_pull=False,
+            volumes=None,
+            options=None,
+            working_dir=None,
+            auto_remove=False,
+            *args,
+            **kwargs):
+
+        super(SingularityOperator, self).__init__(*args, **kwargs)
+        self.api_version = api_version
+        self.auto_remove = auto_remove
+        self.command = command
+        self.start_command = start_command
+        self.environment = environment or {}
+        self.force_pull = force_pull
+        self.image = image
+        self.instance = None
+        self.options = options or []
+        self.pull_folder = pull_folder
+        self.volumes = volumes or []
+        self.working_dir = working_dir
+        self.cli = None
+        self.container = None
+
+    def execute(self, context):
+        self.log.info('Preparing Singularity container %s', self.image)
+        self.cli = Client
+
+        if not self.command:
+            raise AirflowException('You must define a command.')
+
+        # Pull the container if asked, and ensure not a binary file
+        if self.force_pull and not os.path.exists(self.image):
+            self.log.info('Pulling container %s', self.image)
+            image, lines = self.cli.pull(self.image, stream=True)
+            for line in lines:
+                self.log.info(line)
+
+            # Move the container to where it's desired
+            if self.pull_folder is not None:
+                self.image = os.path.join(self.pull_folder, os.path.basename(image))
+                shutil.move(image, self.image)
+
+        # Prepare list of binds
+        for bind in self.volumes:
+            self.options = self.options + ['--bind', bind]
+
+        # Does the user want a custom working directory?
+        if self.working_dir is not None:
+            self.options = self.options + ['--workdir', self.working_dir]
+
+ 
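For context, the bind/workdir handling at the end of the diff above can be exercised in isolation. A standalone sketch (hypothetical helper name, same logic as the quoted lines):

```python
def build_options(volumes=None, working_dir=None, options=None):
    """Assemble Singularity CLI options the way the quoted execute() does."""
    options = list(options or [])
    # Each volume becomes a --bind flag pair
    for bind in (volumes or []):
        options = options + ['--bind', bind]
    # Optional custom working directory
    if working_dir is not None:
        options = options + ['--workdir', working_dir]
    return options


print(build_options(volumes=['/data:/data'], working_dir='/tmp'))
# ['--bind', '/data:/data', '--workdir', '/tmp']
```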

[GitHub] [airflow] vsoch commented on a change in pull request #7191: [AIRFLOW-4030] second attempt to add singularity to airflow

2020-01-17 Thread GitBox
vsoch commented on a change in pull request #7191: [AIRFLOW-4030] second 
attempt to add singularity to airflow
URL: https://github.com/apache/airflow/pull/7191#discussion_r368025298
 
 

 ##
 File path: airflow/contrib/operators/singularity_operator.py
 ##
 @@ -0,0 +1,176 @@
+# -*- coding: utf-8 -*-
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+from airflow.exceptions import AirflowException
+from airflow.models import BaseOperator
+from airflow.utils.decorators import apply_defaults
+from spython.main import Client
+import shutil
+import ast
+import os
+
+
+class SingularityOperator(BaseOperator):
+    """
+    Execute a command inside a Singularity container
+
+    Singularity has more seamless connection to the host than Docker, so
+    no special binds are needed to ensure binding content in the user $HOME
+    and temporary directories. If the user needs custom binds, this can
+    be done with --volumes
+
+    :param image: Singularity image or URI from which to create the container.
+    :type image: str
+    :param auto_remove: Delete the container when the process exits.
+        The default is False.
+    :type auto_remove: bool
+    :param command: Command to be run in the container. (templated)
+    :type command: str or list
+    :param start_command: start command to pass to the container instance
+    :type start_command: str or list
+    :param environment: Environment variables to set in the container. (templated)
+    :type environment: dict
+    :param force_pull: Pull the image on every run. Default is False.
+    :type force_pull: bool
+    :param volumes: List of volumes to mount into the container, e.g.
+        ``['/host/path:/container/path', '/host/path2:/container/path2']``.
+    :type volumes: list
+    :param options: other flags (list) to provide to the instance start
+    :type options: list
+    :param working_dir: Working directory to set on the container
+        (equivalent to the -w switch of the docker client)
+    :type working_dir: str
+    """
+    template_fields = ('command', 'environment',)
+    template_ext = ('.sh', '.bash',)
+
+    @apply_defaults
+    def __init__(
+            self,
+            image,
+            api_version=None,
+            command=None,
+            start_command=None,
+            environment=None,
+            pull_folder=None,
+            force_pull=False,
+            volumes=None,
+            options=None,
+            working_dir=None,
+            auto_remove=False,
+            *args,
+            **kwargs):
+
+        super(SingularityOperator, self).__init__(*args, **kwargs)
+        self.api_version = api_version
+        self.auto_remove = auto_remove
+        self.command = command
+        self.start_command = start_command
+        self.environment = environment or {}
+        self.force_pull = force_pull
+        self.image = image
+        self.instance = None
+        self.options = options or []
+        self.pull_folder = pull_folder
+        self.volumes = volumes or []
+        self.working_dir = working_dir
+        self.cli = None
+        self.container = None
+
+    def execute(self, context):
+        self.log.info('Preparing Singularity container %s', self.image)
+        self.cli = Client
+
+        if not self.command:
+            raise AirflowException('You must define a command.')
+
+        # Pull the container if asked, and ensure not a binary file
+        if self.force_pull and not os.path.exists(self.image):
+            self.log.info('Pulling container %s', self.image)
+            image, lines = self.cli.pull(self.image, stream=True)
+            for line in lines:
+                self.log.info(line)
+
+            # Move the container to where it's desired
+            if self.pull_folder is not None:
+                self.image = os.path.join(self.pull_folder, os.path.basename(image))
+                shutil.move(image, self.image)
+
+        # Prepare list of binds
+        for bind in self.volumes:
+            self.options = self.options + ['--bind', bind]
+
+        # Does the user want a custom working directory?
+        if self.working_dir is not None:
+            self.options = self.options + ['--workdir', self.working_dir]
+
+ 

[GitHub] [airflow] vsoch commented on a change in pull request #7191: [AIRFLOW-4030] second attempt to add singularity to airflow

2020-01-17 Thread GitBox
vsoch commented on a change in pull request #7191: [AIRFLOW-4030] second 
attempt to add singularity to airflow
URL: https://github.com/apache/airflow/pull/7191#discussion_r368024809
 
 

 ##
 File path: airflow/contrib/operators/singularity_operator.py
 ##
 @@ -0,0 +1,176 @@
+# -*- coding: utf-8 -*-
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+from airflow.exceptions import AirflowException
+from airflow.models import BaseOperator
+from airflow.utils.decorators import apply_defaults
+from spython.main import Client
+import shutil
+import ast
+import os
+
+
+class SingularityOperator(BaseOperator):
+    """
+    Execute a command inside a Singularity container
+
+    Singularity has more seamless connection to the host than Docker, so
+    no special binds are needed to ensure binding content in the user $HOME
+    and temporary directories. If the user needs custom binds, this can
+    be done with --volumes
+
+    :param image: Singularity image or URI from which to create the container.
+    :type image: str
+    :param auto_remove: Delete the container when the process exits.
+        The default is False.
+    :type auto_remove: bool
+    :param command: Command to be run in the container. (templated)
+    :type command: str or list
+    :param start_command: start command to pass to the container instance
+    :type start_command: str or list
+    :param environment: Environment variables to set in the container. (templated)
+    :type environment: dict
+    :param force_pull: Pull the image on every run. Default is False.
+    :type force_pull: bool
+    :param volumes: List of volumes to mount into the container, e.g.
+        ``['/host/path:/container/path', '/host/path2:/container/path2']``.
+    :type volumes: list
+    :param options: other flags (list) to provide to the instance start
+    :type options: list
+    :param working_dir: Working directory to set on the container
+        (equivalent to the -w switch of the docker client)
+    :type working_dir: str
+    """
+    template_fields = ('command', 'environment',)
+    template_ext = ('.sh', '.bash',)
+
+    @apply_defaults
+    def __init__(
+            self,
+            image,
+            api_version=None,
+            command=None,
+            start_command=None,
+            environment=None,
+            pull_folder=None,
+            force_pull=False,
+            volumes=None,
+            options=None,
+            working_dir=None,
+            auto_remove=False,
+            *args,
+            **kwargs):
+
+        super(SingularityOperator, self).__init__(*args, **kwargs)
+        self.api_version = api_version
+        self.auto_remove = auto_remove
+        self.command = command
+        self.start_command = start_command
+        self.environment = environment or {}
+        self.force_pull = force_pull
+        self.image = image
+        self.instance = None
+        self.options = options or []
+        self.pull_folder = pull_folder
+        self.volumes = volumes or []
+        self.working_dir = working_dir
+        self.cli = None
+        self.container = None
+
+    def execute(self, context):
+        self.log.info('Preparing Singularity container %s', self.image)
+        self.cli = Client
+
+        if not self.command:
+            raise AirflowException('You must define a command.')
 
 Review comment:
   Singularity containers can usually serve multiple purposes, and especially 
for scientific use cases, they tend to be used with "exec" and custom entry 
points. So the thinking here was that the user should be able to instantiate 
the operator and then modify the command to do different things. This might be 
incorrect / uncommon usage of airflow (for example, if it would be expected 
to instantiate one container instance per command), in which case I agree, and 
we should remove the command=None default and the check for it.



[jira] [Commented] (AIRFLOW-6582) Dag_stats endpoint doesn't filter correctly

2020-01-17 Thread Robin Edwards (Jira)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-6582?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17018165#comment-17018165
 ] 

Robin Edwards commented on AIRFLOW-6582:


Aha, it seems AIRFLOW-6238 didn't get backported into 1.10.7, but the frontend 
does append the dag_ids parameter (most likely due to a follow-up PR of mine 
that did get backported).

> Dag_stats endpoint doesn't filter correctly
> ---
>
> Key: AIRFLOW-6582
> URL: https://issues.apache.org/jira/browse/AIRFLOW-6582
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: ui, webserver
>Affects Versions: 1.10.7
>Reporter: Robin Edwards
>Assignee: Robin Edwards
>Priority: Minor
>
> Apologies, my previous PR to restrict the dags returned from the dag_stats 
> endpoint via a GET parameter applied the filter after the group by, which had 
> no effect. So even if dag_ids was passed, all dags were still returned.
> A forthcoming PR fixes the issue by applying the filter before the group by.
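For illustration, the evaluation-order point can be shown with a toy sqlite3 table (hypothetical schema, not Airflow's real one): a WHERE clause restricts rows before GROUP BY aggregates them, so unrequested dag_ids never contribute groups.

```python
import sqlite3

conn = sqlite3.connect(':memory:')
conn.execute("CREATE TABLE dag_run (dag_id TEXT, state TEXT)")
conn.executemany("INSERT INTO dag_run VALUES (?, ?)",
                 [('dag_a', 'success'), ('dag_a', 'failed'), ('dag_b', 'success')])

# Correct shape: filter BEFORE the GROUP BY via WHERE.
rows = conn.execute(
    "SELECT dag_id, state, COUNT(*) FROM dag_run "
    "WHERE dag_id IN ('dag_a') "
    "GROUP BY dag_id, state ORDER BY dag_id, state").fetchall()
# dag_b rows are excluded before aggregation, so only dag_a groups remain.
```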



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Updated] (AIRFLOW-6582) Dag_stats endpoint doesn't filter correctly

2020-01-17 Thread Robin Edwards (Jira)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-6582?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Robin Edwards updated AIRFLOW-6582:
---
Affects Version/s: (was: master)
   (was: 2.0.0)

> Dag_stats endpoint doesn't filter correctly
> ---
>
> Key: AIRFLOW-6582
> URL: https://issues.apache.org/jira/browse/AIRFLOW-6582
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: ui, webserver
>Affects Versions: 1.10.7
>Reporter: Robin Edwards
>Assignee: Robin Edwards
>Priority: Minor
>
> Apologies, my previous PR to restrict the dags returned from the dag_stats 
> endpoint via a GET parameter applied the filter after the group by, which had 
> no effect. So even if dag_ids was passed, all dags were still returned.
> A forthcoming PR fixes the issue by applying the filter before the group by.





[jira] [Commented] (AIRFLOW-6582) Dag_stats endpoint doesn't filter correctly

2020-01-17 Thread ASF GitHub Bot (Jira)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-6582?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17018162#comment-17018162
 ] 

ASF GitHub Bot commented on AIRFLOW-6582:
-

robinedwards commented on pull request #7195: [AIRFLOW-6582] Dagstats not 
applying dag_id filter
URL: https://github.com/apache/airflow/pull/7195
 
 
   
 



> Dag_stats endpoint doesn't filter correctly
> ---
>
> Key: AIRFLOW-6582
> URL: https://issues.apache.org/jira/browse/AIRFLOW-6582
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: ui, webserver
>Affects Versions: 2.0.0, 1.10.7, master
>Reporter: Robin Edwards
>Assignee: Robin Edwards
>Priority: Minor
>
> Apologies, my previous PR to restrict the dags returned from the dag_stats 
> endpoint via a GET parameter applied the filter after the group by, which had 
> no effect. So even if dag_ids was passed, all dags were still returned.
> A forthcoming PR fixes the issue by applying the filter before the group by.





[GitHub] [airflow] robinedwards closed pull request #7195: [AIRFLOW-6582] Dagstats not applying dag_id filter

2020-01-17 Thread GitBox
robinedwards closed pull request #7195: [AIRFLOW-6582] Dagstats not applying 
dag_id filter
URL: https://github.com/apache/airflow/pull/7195
 
 
   




[GitHub] [airflow] robinedwards commented on issue #7195: [AIRFLOW-6582] Dagstats not applying dag_id filter

2020-01-17 Thread GitBox
robinedwards commented on issue #7195: [AIRFLOW-6582] Dagstats not applying 
dag_id filter
URL: https://github.com/apache/airflow/pull/7195#issuecomment-575695393
 
 
   Aha, I thought I was going insane. It seems AIRFLOW-6238 didn't make it into 
1.10.7, which is where this came from.




[GitHub] [airflow] bolkedebruin commented on a change in pull request #7162: [AIRFLOW-6557] Add test for newly added fields in BaseOperator

2020-01-17 Thread GitBox
bolkedebruin commented on a change in pull request #7162: [AIRFLOW-6557] Add 
test for newly added fields in BaseOperator
URL: https://github.com/apache/airflow/pull/7162#discussion_r368018186
 
 

 ##
 File path: tests/serialization/test_dag_serialization.py
 ##
 @@ -543,6 +543,66 @@ def test_dag_serialized_fields_with_schema(self):
         dag_params: set = set(dag_schema.keys()) - ignored_keys
         self.assertEqual(set(DAG.get_serialized_fields()), dag_params)
 
+    def test_no_new_fields_added_to_base_operator(self):
+        """
+        This test verifies that there are no new fields added to BaseOperator.
+        And reminds that tests should be added for it.
+        """
+        base_operator = BaseOperator(task_id="10")
+        fields = base_operator.__dict__
+        self.assertEqual({'_dag': None,
+                          '_downstream_task_ids': set(),
+                          '_inlets': [],
+                          '_log': base_operator.log,
+                          '_outlets': [],
+                          '_upstream_task_ids': set(),
+                          'depends_on_past': False,
+                          'do_xcom_push': True,
+                          'email': None,
+                          'email_on_failure': True,
+                          'email_on_retry': True,
+                          'end_date': None,
+                          'execution_timeout': None,
+                          'executor_config': {},
+                          'inlets': [],
+                          'max_retry_delay': None,
+                          'on_execute_callback': None,
+                          'on_failure_callback': None,
+                          'on_retry_callback': None,
+                          'on_success_callback': None,
+                          'outlets': [],
+                          'owner': 'airflow',
+                          'params': {},
+                          'pool': 'default_pool',
+                          'priority_weight': 1,
+                          'queue': 'default',
+                          'resources': None,
+                          'retries': 0,
+                          'retry_delay': timedelta(0, 300),
+                          'retry_exponential_backoff': False,
+                          'run_as_user': None,
+                          'sla': None,
+                          'start_date': None,
+                          'subdag': None,
+                          'task_concurrency': None,
+                          'task_id': '10',
+                          'trigger_rule': 'all_success',
+                          'wait_for_downstream': False,
+                          'weight_rule': 'downstream'}, fields,
+                         """
+!!!
+
+ACTION NEEDED! PLEASE READ THIS CAREFULLY AND CORRECT TESTS CAREFULLY
+
+Some fields were added to the BaseOperator! Please add them to the list above
+and make sure that you add support for DAG serialization - you should add the
+field to `airflow/serialization/schema.json` and add it in
+`serialized_simple_dag_ground_truth` above
 
 Review comment:
   Oh yes, and switch away from JSON Schema. Geez, it's 2020 :-P
   
   JSON is something you only use to interface *externally*. 
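Setting the serialization-format debate aside, the quoted test's guard pattern is straightforward to sketch in isolation (toy class and field set, not the real BaseOperator): snapshot the attribute names and fail when the set drifts.

```python
class Op:
    """Toy stand-in for BaseOperator with a frozen set of fields."""

    def __init__(self, task_id):
        self.task_id = task_id
        self.retries = 0


EXPECTED_FIELDS = {'task_id', 'retries'}


def unexpected_fields(obj):
    # Empty set means nothing drifted; any new attribute shows up here.
    return set(obj.__dict__) - EXPECTED_FIELDS
```

A new `self.foo = ...` in `__init__` would surface as `{'foo'}`, prompting the serialization-schema update the quoted test message demands.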






[GitHub] [airflow] bolkedebruin commented on issue #7138: [AIRFLOW-5912] Expose lineage API

2020-01-17 Thread GitBox
bolkedebruin commented on issue #7138: [AIRFLOW-5912] Expose lineage API
URL: https://github.com/apache/airflow/pull/7138#issuecomment-575687960
 
 
   Tests are passing (error seems transient). @mik-laj can you please review 
again? @potiuk maybe as well?




[GitHub] [airflow] bolkedebruin commented on a change in pull request #7138: [AIRFLOW-5912] Expose lineage API

2020-01-17 Thread GitBox
bolkedebruin commented on a change in pull request #7138: [AIRFLOW-5912] Expose 
lineage API
URL: https://github.com/apache/airflow/pull/7138#discussion_r368011741
 
 

 ##
 File path: airflow/api/client/api_client.py
 ##
 @@ -70,3 +70,12 @@ def delete_pool(self, name):
 :param name: pool name
 """
 raise NotImplementedError()
+
+def get_lineage(self, dag_id: str, execution_date: str):
 
 Review comment:
   I'm not sure what you mean. The CLI can just use one of the interfaces to the API: local and JSON. The "local_client" should eventually be deprecated, as it touches the database directly and that is a security concern.
   
   Other applications can use this package as well to interface with the API.
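As a rough illustration of the json-client side of such a `get_lineage` call, here is a sketch that only builds the request URL. The endpoint path and the idea of a hypothetical json `api_client` are assumptions for illustration, not the actual Airflow REST API:

```python
from urllib.parse import quote, urljoin

# Hypothetical endpoint path a json api_client might request.
LINEAGE_PATH = "/api/experimental/lineage/{dag_id}/{execution_date}"


def build_lineage_url(base_url, dag_id, execution_date):
    """Build the URL for a (hypothetical) lineage endpoint, URL-escaping
    path components such as the colons in an execution date."""
    path = LINEAGE_PATH.format(dag_id=quote(dag_id),
                               execution_date=quote(execution_date))
    return urljoin(base_url, path)


print(build_lineage_url("http://localhost:8080", "my_dag", "2020-01-17"))
```

A real json client would then issue an HTTP GET against this URL and return the decoded response body, which is what keeps the CLI away from direct database access.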




[GitHub] [airflow] jstern commented on a change in pull request #7187: [AIRFLOW-6576] fix scheduler crash caused by deleted task with sla misses

2020-01-17 Thread GitBox
jstern commented on a change in pull request #7187: [AIRFLOW-6576] fix 
scheduler crash caused by deleted task with sla misses
URL: https://github.com/apache/airflow/pull/7187#discussion_r368011208
 
 

 ##
 File path: airflow/jobs/scheduler_job.py
 ##
 @@ -429,7 +429,15 @@ def manage_slas(self, dag, session=None):
         """.format(task_list=task_list,
                    blocking_task_list=blocking_task_list,
                    bug=asciiart.bug)
 
-        tasks_missed_sla = [dag.get_task(sla.task_id) for sla in slas]
+        tasks_missed_sla = []
+        for sla in slas:
+            try:
+                task = dag.get_task(sla.task_id)
+            except AirflowException:
+                # task already deleted from DAG, skip it
 
 Review comment:
   I was going to suggest maybe catching a more specific exception subclass (like `AirflowNotFoundException`, `TaskNotFound`, or `TaskInstanceNotFound` as appropriate), but it looks like we really do just throw `AirflowException` in the method: https://github.com/apache/airflow/blob/master/airflow/models/dag.py#L1214. Changing the exception that the `get_task` method raises might be out of scope for this PR and could have other unanticipated consequences, but it's something to think about...




[GitHub] [airflow] mustafagok commented on a change in pull request #7146: [AIRFLOW-6541] Use EmrJobFlowSensor for other states

2020-01-17 Thread GitBox
mustafagok commented on a change in pull request #7146: [AIRFLOW-6541] Use 
EmrJobFlowSensor for other states
URL: https://github.com/apache/airflow/pull/7146#discussion_r367959255
 
 

 ##
 File path: airflow/contrib/sensors/emr_step_sensor.py
 ##
 @@ -16,52 +16,102 @@
 # KIND, either express or implied.  See the License for the
 # specific language governing permissions and limitations
 # under the License.
+
 from airflow.contrib.hooks.emr_hook import EmrHook
 from airflow.contrib.sensors.emr_base_sensor import EmrBaseSensor
 from airflow.utils.decorators import apply_defaults
 
 
 class EmrStepSensor(EmrBaseSensor):
 """
-Asks for the state of the step until it reaches a terminal state.
+Asks for the state of the step until it reaches any of the target states.
 If it fails the sensor errors, failing the task.
 
+With the default target states, the sensor waits for the step to be completed.
+
 :param job_flow_id: job_flow_id which contains the step check the state of
 :type job_flow_id: str
+
 
 Review comment:
   Some of the classes have this, and it seems better to me. But if you prefer, I will remove all of the empty lines between the param-type groups.




[GitHub] [airflow] mustafagok commented on a change in pull request #7146: [AIRFLOW-6541] Use EmrJobFlowSensor for other states

2020-01-17 Thread GitBox
mustafagok commented on a change in pull request #7146: [AIRFLOW-6541] Use 
EmrJobFlowSensor for other states
URL: https://github.com/apache/airflow/pull/7146#discussion_r367953646
 
 

 ##
 File path: airflow/contrib/sensors/emr_job_flow_sensor.py
 ##
 @@ -16,48 +16,96 @@
 # KIND, either express or implied.  See the License for the
 # specific language governing permissions and limitations
 # under the License.
+
 from airflow.contrib.hooks.emr_hook import EmrHook
 from airflow.contrib.sensors.emr_base_sensor import EmrBaseSensor
 from airflow.utils.decorators import apply_defaults
 
 
 class EmrJobFlowSensor(EmrBaseSensor):
 """
-Asks for the state of the JobFlow until it reaches a terminal state.
+Asks for the state of the EMR JobFlow (Cluster) until it reaches
+any of the target states.
 If it fails the sensor errors, failing the task.
 
+With the default target states, the sensor waits for the cluster to be terminated.
+When target_states is set to ['RUNNING', 'WAITING'], the sensor waits
+until the job flow is ready (after the 'STARTING' and 'BOOTSTRAPPING' states).
+
 :param job_flow_id: job_flow_id to check the state of
 :type job_flow_id: str
+
+:param target_states: the target states, sensor waits until
+job flow reaches any of these states
+:type target_states: list[str]
+
+:param failed_states: the failure states, sensor fails when
+job flow reaches any of these states
+:type failed_states: list[str]
 """
 
-NON_TERMINAL_STATES = ['STARTING', 'BOOTSTRAPPING', 'RUNNING',
-   'WAITING', 'TERMINATING']
-FAILED_STATE = ['TERMINATED_WITH_ERRORS']
-template_fields = ['job_flow_id']
+template_fields = ['job_flow_id', 'target_states', 'failed_states']
 template_ext = ()
 
 @apply_defaults
 def __init__(self,
  job_flow_id,
+ target_states=None,
+ failed_states=None,
  *args,
  **kwargs):
 super().__init__(*args, **kwargs)
 self.job_flow_id = job_flow_id
+if target_states is None:
+target_states = ['TERMINATED']
 
 Review comment:
   Actually I have reversed the state logic: previously it checked whether the state was in `NON_TERMINAL_STATES`, and if it was, it returned `False`. Now it checks whether the state is in `self.target_states`, and if it is, it returns `True`. I could call the list `self.non_target_states`, but that does not read well.
   I am sure that it is backward compatible with this list, and if it were not, the unit tests would fail. (I have also updated them to simplify: there were 4 almost identical classes, and I replaced them with 1 common class.)
   
   A one-liner is a good idea, I will do it.
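The reversed state logic and the one-liner default discussed above can be sketched with a toy sensor (names mirror the diff, but this is not the real EmrJobFlowSensor):

```python
class ToyEmrSensor:
    def __init__(self, get_state, target_states=None, failed_states=None):
        self.get_state = get_state
        # the one-liner default suggested in the review
        self.target_states = target_states or ['TERMINATED']
        self.failed_states = failed_states or ['TERMINATED_WITH_ERRORS']

    def poke(self):
        state = self.get_state()
        if state in self.failed_states:
            raise RuntimeError('EMR job flow failed: ' + state)
        # reversed logic: return True once a target state is reached,
        # instead of False while still in a non-terminal state
        return state in self.target_states


# Default target states: wait for cluster termination.
print(ToyEmrSensor(lambda: 'RUNNING').poke())   # False, keep poking
# Custom target states: treat a ready cluster as the goal.
print(ToyEmrSensor(lambda: 'RUNNING',
                   target_states=['RUNNING', 'WAITING']).poke())  # True
```

With `target_states=['TERMINATED']` the reversed check returns `False` for every former `NON_TERMINAL_STATES` member, which is why the behaviour stays backward compatible.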




[GitHub] [airflow] leonardoam commented on issue #6337: [AIRFLOW-5659] - Add support for ephemeral storage on KubernetesPodOp…

2020-01-17 Thread GitBox
leonardoam commented on issue #6337: [AIRFLOW-5659] - Add support for ephemeral 
storage on KubernetesPodOp…
URL: https://github.com/apache/airflow/pull/6337#issuecomment-575640561
 
 
   Nice, thanks @potiuk ! I will proceed with rebasing.




[GitHub] [airflow] shaikshakeel commented on a change in pull request #6469: [AIRFLOW-5816] S3 to snowflake operator

2020-01-17 Thread GitBox
shaikshakeel commented on a change in pull request #6469: [AIRFLOW-5816] S3 to 
snowflake operator
URL: https://github.com/apache/airflow/pull/6469#discussion_r367951609
 
 

 ##
 File path: airflow/providers/snowflake/operators/snowflake.py
 ##
 @@ -0,0 +1,90 @@
+# -*- coding: utf-8 -*-
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+from airflow.models import BaseOperator
+from airflow.providers.snowflake.hooks.snowflake import SnowflakeHook
+from airflow.utils.decorators import apply_defaults
+
+
+class SnowflakeOperator(BaseOperator):
+"""
+Executes sql code in a Snowflake database
+
+:param snowflake_conn_id: reference to specific snowflake connection id
+:type snowflake_conn_id: str
+:param sql: the sql code to be executed. (templated)
+:type sql: Can receive a str representing a sql statement,
+a list of str (sql statements), or reference to a template file.
+Template references are recognized by str ending in '.sql'
+:param autocommit: if True, each command is automatically committed.
+(default value: True)
+:type autocommit: bool
+:param parameters: (optional) the parameters to render the SQL query with.
+:type parameters: mapping or iterable
+:param warehouse: name of warehouse (will overwrite any warehouse
+defined in the connection's extra JSON)
+:type warehouse: str
+:param database: name of database (will overwrite database defined
+in connection)
+:type database: str
+:param schema: name of schema (will overwrite schema defined in
+connection)
+:type schema: str
+:param role: name of role (will overwrite any role defined in
+connection's extra JSON)
+:type role: str
+"""
+
+template_fields = ('sql',)
+template_ext = ('.sql',)
+ui_color = '#ededed'
+
+@apply_defaults
+def __init__(
+self, sql, snowflake_conn_id='snowflake_default', parameters=None,
+autocommit=True, warehouse=None, database=None, role=None,
+schema=None, *args, **kwargs):
+super(SnowflakeOperator, self).__init__(*args, **kwargs)
+self.snowflake_conn_id = snowflake_conn_id
+self.sql = sql
+self.autocommit = autocommit
+self.parameters = parameters
+self.warehouse = warehouse
+self.database = database
+self.role = role
+self.schema = schema
+
+def get_hook(self):
+"""
+Create and return SnowflakeHook.
+:return SnowflakeHook: A SnowflakeHook instance.
 
 Review comment:
   The line has already been removed.
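The `get_hook`/`execute` delegation shown in the diff can be sketched with a stub hook so it runs without Snowflake. The stub's `run()` signature mirrors the operator parameters in the diff (`sql`, `autocommit`, `parameters`) and is an assumption, not the real SnowflakeHook API:

```python
class StubSnowflakeHook:
    """Stand-in for SnowflakeHook; records statements instead of running them."""

    def __init__(self):
        self.executed = []

    def run(self, sql, autocommit=True, parameters=None):
        # accept a single statement or a list of statements, as the
        # docstring describes for the `sql` parameter
        statements = [sql] if isinstance(sql, str) else list(sql)
        self.executed.extend(statements)


class ToySnowflakeOperator:
    def __init__(self, sql, autocommit=True, parameters=None):
        self.sql = sql
        self.autocommit = autocommit
        self.parameters = parameters

    def get_hook(self):
        """Create and return a (stub) SnowflakeHook instance."""
        return StubSnowflakeHook()

    def execute(self, context=None):
        hook = self.get_hook()
        hook.run(self.sql, autocommit=self.autocommit,
                 parameters=self.parameters)
        return hook.executed


op = ToySnowflakeOperator(
    sql=["CREATE TABLE t (a INT)", "INSERT INTO t VALUES (1)"])
print(op.execute())  # both statements, in order
```

Keeping all connection handling inside `get_hook()` is what lets the operator stay a thin wrapper that only forwards its parameters.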




[GitHub] [airflow] potiuk commented on issue #6337: [AIRFLOW-5659] - Add support for ephemeral storage on KubernetesPodOp…

2020-01-17 Thread GitBox
potiuk commented on issue #6337: [AIRFLOW-5659] - Add support for ephemeral 
storage on KubernetesPodOp…
URL: https://github.com/apache/airflow/pull/6337#issuecomment-575638747
 
 
   @leonardoam @konpap94 -> We had some intermittent problems with Travis. They are now solved, so you can rebase and there should be no more problems caused by the environment. Apologies for the trouble; we've suffered from it quite a lot, but it's fixed now!
   
   I am happy to review it after it is rebased/green.




[GitHub] [airflow] ramkrishnan8994 commented on issue #6337: [AIRFLOW-5659] - Add support for ephemeral storage on KubernetesPodOp…

2020-01-17 Thread GitBox
ramkrishnan8994 commented on issue #6337: [AIRFLOW-5659] - Add support for 
ephemeral storage on KubernetesPodOp…
URL: https://github.com/apache/airflow/pull/6337#issuecomment-575632471
 
 
   @leonardoam @konpap94 - When will this be merged? This is causing issues for us, and this change might help us.




[jira] [Issue Comment Deleted] (AIRFLOW-6117) Rename mssql_to_gcs service

2020-01-17 Thread Jira


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-6117?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Michał Słowikowski updated AIRFLOW-6117:

Comment: was deleted

(was: Here is the correct task
https://issues.apache.org/jira/browse/AIRFLOW-6097  )

> Rename mssql_to_gcs service
> ---
>
> Key: AIRFLOW-6117
> URL: https://issues.apache.org/jira/browse/AIRFLOW-6117
> Project: Apache Airflow
>  Issue Type: Sub-task
>  Components: gcp
>Affects Versions: 1.10.6
>Reporter: Michał Słowikowski
>Assignee: Michał Słowikowski
>Priority: Minor
>
> Added these classes:



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Issue Comment Deleted] (AIRFLOW-6151) Rename adls_to_gcs service

2020-01-17 Thread Jira


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-6151?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Michał Słowikowski updated AIRFLOW-6151:

Comment: was deleted

(was: Here is the correct task
https://issues.apache.org/jira/browse/AIRFLOW-6097  )

> Rename adls_to_gcs service
> --
>
> Key: AIRFLOW-6151
> URL: https://issues.apache.org/jira/browse/AIRFLOW-6151
> Project: Apache Airflow
>  Issue Type: Sub-task
>  Components: gcp
>Affects Versions: 1.10.6
>Reporter: Michał Słowikowski
>Assignee: Michał Słowikowski
>Priority: Minor
>
> Added these classes:





[jira] [Issue Comment Deleted] (AIRFLOW-6152) Rename bigquery_to_gcs service

2020-01-17 Thread Jira


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-6152?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Michał Słowikowski updated AIRFLOW-6152:

Comment: was deleted

(was: Here is the correct task
https://issues.apache.org/jira/browse/AIRFLOW-6097  )

> Rename bigquery_to_gcs service
> --
>
> Key: AIRFLOW-6152
> URL: https://issues.apache.org/jira/browse/AIRFLOW-6152
> Project: Apache Airflow
>  Issue Type: Sub-task
>  Components: gcp
>Affects Versions: 1.10.6
>Reporter: Michał Słowikowski
>Assignee: Michał Słowikowski
>Priority: Minor
>
> Added these classes:





[jira] [Issue Comment Deleted] (AIRFLOW-6149) Rename postgres_to_gcs. service

2020-01-17 Thread Jira


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-6149?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Michał Słowikowski updated AIRFLOW-6149:

Comment: was deleted

(was: Here is the correct task
https://issues.apache.org/jira/browse/AIRFLOW-6097  )

> Rename postgres_to_gcs. service
> ---
>
> Key: AIRFLOW-6149
> URL: https://issues.apache.org/jira/browse/AIRFLOW-6149
> Project: Apache Airflow
>  Issue Type: Sub-task
>  Components: gcp
>Affects Versions: 1.10.6
>Reporter: Michał Słowikowski
>Assignee: Michał Słowikowski
>Priority: Minor
>
> Added these classes:





[jira] [Resolved] (AIRFLOW-6127) Rename text_to_speech service

2020-01-17 Thread Jira


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-6127?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Michał Słowikowski resolved AIRFLOW-6127.
-
Fix Version/s: 2.0.0
   Resolution: Fixed

> Rename text_to_speech service
> -
>
> Key: AIRFLOW-6127
> URL: https://issues.apache.org/jira/browse/AIRFLOW-6127
> Project: Apache Airflow
>  Issue Type: Sub-task
>  Components: gcp
>Affects Versions: 1.10.6
>Reporter: Michał Słowikowski
>Assignee: Michał Słowikowski
>Priority: Minor
> Fix For: 2.0.0
>
>
> Added these classes:





[jira] [Issue Comment Deleted] (AIRFLOW-6148) Rename gcs_to_gcs service

2020-01-17 Thread Jira


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-6148?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Michał Słowikowski updated AIRFLOW-6148:

Comment: was deleted

(was: Here is the correct task
https://issues.apache.org/jira/browse/AIRFLOW-6097 )

> Rename gcs_to_gcs service
> -
>
> Key: AIRFLOW-6148
> URL: https://issues.apache.org/jira/browse/AIRFLOW-6148
> Project: Apache Airflow
>  Issue Type: Sub-task
>  Components: gcp
>Affects Versions: 1.10.6
>Reporter: Michał Słowikowski
>Assignee: Michał Słowikowski
>Priority: Minor
>
> Added these classes:





[jira] [Reopened] (AIRFLOW-6127) Rename text_to_speech service

2020-01-17 Thread Jira


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-6127?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Michał Słowikowski reopened AIRFLOW-6127:
-

> Rename text_to_speech service
> -
>
> Key: AIRFLOW-6127
> URL: https://issues.apache.org/jira/browse/AIRFLOW-6127
> Project: Apache Airflow
>  Issue Type: Sub-task
>  Components: gcp
>Affects Versions: 1.10.6
>Reporter: Michał Słowikowski
>Assignee: Michał Słowikowski
>Priority: Minor
>
> Added these classes:





[jira] [Closed] (AIRFLOW-6127) Rename text_to_speech service

2020-01-17 Thread Jira


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-6127?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Michał Słowikowski closed AIRFLOW-6127.
---
Resolution: Fixed

> Rename text_to_speech service
> -
>
> Key: AIRFLOW-6127
> URL: https://issues.apache.org/jira/browse/AIRFLOW-6127
> Project: Apache Airflow
>  Issue Type: Sub-task
>  Components: gcp
>Affects Versions: 1.10.6
>Reporter: Michał Słowikowski
>Assignee: Michał Słowikowski
>Priority: Minor
>
> Added these classes:





[jira] [Issue Comment Deleted] (AIRFLOW-6127) Rename text_to_speech service

2020-01-17 Thread Jira


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-6127?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Michał Słowikowski updated AIRFLOW-6127:

Comment: was deleted

(was: Here is the correct task
https://issues.apache.org/jira/browse/AIRFLOW-6097 )

> Rename text_to_speech service
> -
>
> Key: AIRFLOW-6127
> URL: https://issues.apache.org/jira/browse/AIRFLOW-6127
> Project: Apache Airflow
>  Issue Type: Sub-task
>  Components: gcp
>Affects Versions: 1.10.6
>Reporter: Michał Słowikowski
>Assignee: Michał Słowikowski
>Priority: Minor
>
> Added these classes:





[jira] [Reopened] (AIRFLOW-6127) Rename text_to_speech service

2020-01-17 Thread Jira


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-6127?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Michał Słowikowski reopened AIRFLOW-6127:
-

> Rename text_to_speech service
> -
>
> Key: AIRFLOW-6127
> URL: https://issues.apache.org/jira/browse/AIRFLOW-6127
> Project: Apache Airflow
>  Issue Type: Sub-task
>  Components: gcp
>Affects Versions: 1.10.6
>Reporter: Michał Słowikowski
>Assignee: Michał Słowikowski
>Priority: Minor
>
> Added these classes:





[jira] [Reopened] (AIRFLOW-6149) Rename postgres_to_gcs. service

2020-01-17 Thread Jira


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-6149?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Michał Słowikowski reopened AIRFLOW-6149:
-

> Rename postgres_to_gcs. service
> ---
>
> Key: AIRFLOW-6149
> URL: https://issues.apache.org/jira/browse/AIRFLOW-6149
> Project: Apache Airflow
>  Issue Type: Sub-task
>  Components: gcp
>Affects Versions: 1.10.6
>Reporter: Michał Słowikowski
>Assignee: Michał Słowikowski
>Priority: Minor
>
> Added these classes:





[jira] [Reopened] (AIRFLOW-6151) Rename adls_to_gcs service

2020-01-17 Thread Jira


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-6151?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Michał Słowikowski reopened AIRFLOW-6151:
-

> Rename adls_to_gcs service
> --
>
> Key: AIRFLOW-6151
> URL: https://issues.apache.org/jira/browse/AIRFLOW-6151
> Project: Apache Airflow
>  Issue Type: Sub-task
>  Components: gcp
>Affects Versions: 1.10.6
>Reporter: Michał Słowikowski
>Assignee: Michał Słowikowski
>Priority: Minor
>
> Added these classes:





[jira] [Reopened] (AIRFLOW-6148) Rename gcs_to_gcs service

2020-01-17 Thread Jira


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-6148?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Michał Słowikowski reopened AIRFLOW-6148:
-

> Rename gcs_to_gcs service
> -
>
> Key: AIRFLOW-6148
> URL: https://issues.apache.org/jira/browse/AIRFLOW-6148
> Project: Apache Airflow
>  Issue Type: Sub-task
>  Components: gcp
>Affects Versions: 1.10.6
>Reporter: Michał Słowikowski
>Assignee: Michał Słowikowski
>Priority: Minor
>
> Added these classes:





[jira] [Reopened] (AIRFLOW-6152) Rename bigquery_to_gcs service

2020-01-17 Thread Jira


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-6152?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Michał Słowikowski reopened AIRFLOW-6152:
-

> Rename bigquery_to_gcs service
> --
>
> Key: AIRFLOW-6152
> URL: https://issues.apache.org/jira/browse/AIRFLOW-6152
> Project: Apache Airflow
>  Issue Type: Sub-task
>  Components: gcp
>Affects Versions: 1.10.6
>Reporter: Michał Słowikowski
>Assignee: Michał Słowikowski
>Priority: Minor
>
> Added these classes:





[jira] [Reopened] (AIRFLOW-6117) Rename mssql_to_gcs service

2020-01-17 Thread Jira


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-6117?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Michał Słowikowski reopened AIRFLOW-6117:
-

> Rename mssql_to_gcs service
> ---
>
> Key: AIRFLOW-6117
> URL: https://issues.apache.org/jira/browse/AIRFLOW-6117
> Project: Apache Airflow
>  Issue Type: Sub-task
>  Components: gcp
>Affects Versions: 1.10.6
>Reporter: Michał Słowikowski
>Assignee: Michał Słowikowski
>Priority: Minor
>
> Added these classes:





[jira] [Closed] (AIRFLOW-6114) Rename video_intelligence service

2020-01-17 Thread Jira


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-6114?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Michał Słowikowski closed AIRFLOW-6114.
---
Resolution: Invalid

> Rename video_intelligence service
> -
>
> Key: AIRFLOW-6114
> URL: https://issues.apache.org/jira/browse/AIRFLOW-6114
> Project: Apache Airflow
>  Issue Type: Sub-task
>  Components: gcp
>Affects Versions: 1.10.6
>Reporter: Michał Słowikowski
>Assignee: Michał Słowikowski
>Priority: Minor
>
> Added these classes:





[jira] [Issue Comment Deleted] (AIRFLOW-6113) Rename data_transfer service

2020-01-17 Thread Jira


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-6113?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Michał Słowikowski updated AIRFLOW-6113:

Comment: was deleted

(was: Here is the correct task
https://issues.apache.org/jira/browse/AIRFLOW-6097 )

> Rename data_transfer service
> 
>
> Key: AIRFLOW-6113
> URL: https://issues.apache.org/jira/browse/AIRFLOW-6113
> Project: Apache Airflow
>  Issue Type: Sub-task
>  Components: gcp
>Affects Versions: 1.10.6
>Reporter: Michał Słowikowski
>Assignee: Michał Słowikowski
>Priority: Minor
>
> Added these classes:





[jira] [Issue Comment Deleted] (AIRFLOW-6114) Rename video_intelligence service

2020-01-17 Thread Jira


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-6114?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Michał Słowikowski updated AIRFLOW-6114:

Comment: was deleted

(was: Here is the correct task
https://issues.apache.org/jira/browse/AIRFLOW-6097 )

> Rename video_intelligence service
> -
>
> Key: AIRFLOW-6114
> URL: https://issues.apache.org/jira/browse/AIRFLOW-6114
> Project: Apache Airflow
>  Issue Type: Sub-task
>  Components: gcp
>Affects Versions: 1.10.6
>Reporter: Michał Słowikowski
>Assignee: Michał Słowikowski
>Priority: Minor
>
> Added these classes:





[jira] [Reopened] (AIRFLOW-6114) Rename video_intelligence service

2020-01-17 Thread Jira


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-6114?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Michał Słowikowski reopened AIRFLOW-6114:
-

> Rename video_intelligence service
> -
>
> Key: AIRFLOW-6114
> URL: https://issues.apache.org/jira/browse/AIRFLOW-6114
> Project: Apache Airflow
>  Issue Type: Sub-task
>  Components: gcp
>Affects Versions: 1.10.6
>Reporter: Michał Słowikowski
>Assignee: Michał Słowikowski
>Priority: Minor
>
> Added these classes:





[jira] [Reopened] (AIRFLOW-6113) Rename data_transfer service

2020-01-17 Thread Jira


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-6113?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Michał Słowikowski reopened AIRFLOW-6113:
-

> Rename data_transfer service
> 
>
> Key: AIRFLOW-6113
> URL: https://issues.apache.org/jira/browse/AIRFLOW-6113
> Project: Apache Airflow
>  Issue Type: Sub-task
>  Components: gcp
>Affects Versions: 1.10.6
>Reporter: Michał Słowikowski
>Assignee: Michał Słowikowski
>Priority: Minor
>
> Added these classes:





[jira] [Resolved] (AIRFLOW-6113) Rename data_transfer service

2020-01-17 Thread Jira


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-6113?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Michał Słowikowski resolved AIRFLOW-6113.
-
Resolution: Fixed

> Rename data_transfer service
> 
>
> Key: AIRFLOW-6113
> URL: https://issues.apache.org/jira/browse/AIRFLOW-6113
> Project: Apache Airflow
>  Issue Type: Sub-task
>  Components: gcp
>Affects Versions: 1.10.6
>Reporter: Michał Słowikowski
>Assignee: Michał Słowikowski
>Priority: Minor
>
> Added these classes:





[jira] [Reopened] (AIRFLOW-6152) Rename bigquery_to_gcs service

2020-01-17 Thread Jira


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-6152?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Michał Słowikowski reopened AIRFLOW-6152:
-

> Rename bigquery_to_gcs service
> --
>
> Key: AIRFLOW-6152
> URL: https://issues.apache.org/jira/browse/AIRFLOW-6152
> Project: Apache Airflow
>  Issue Type: Sub-task
>  Components: gcp
>Affects Versions: 1.10.6
>Reporter: Michał Słowikowski
>Assignee: Michał Słowikowski
>Priority: Minor
>
> Added these classes:





[jira] [Closed] (AIRFLOW-6152) Rename bigquery_to_gcs service

2020-01-17 Thread Jira


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-6152?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Michał Słowikowski closed AIRFLOW-6152.
---
Resolution: Fixed

> Rename bigquery_to_gcs service
> --
>
> Key: AIRFLOW-6152
> URL: https://issues.apache.org/jira/browse/AIRFLOW-6152
> Project: Apache Airflow
>  Issue Type: Sub-task
>  Components: gcp
>Affects Versions: 1.10.6
>Reporter: Michał Słowikowski
>Assignee: Michał Słowikowski
>Priority: Minor
>
> Added these classes:




