[GitHub] [airflow] potiuk commented on issue #6601: [AIRFLOW-6010] Remove cyclic imports and pylint disables
potiuk commented on issue #6601: [AIRFLOW-6010] Remove cyclic imports and pylint disables URL: https://github.com/apache/airflow/pull/6601#issuecomment-558509515 All checks passed @kaxil :) This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: us...@infra.apache.org With regards, Apache Git Services
[jira] [Updated] (AIRFLOW-6068) [Airflow] Set coherence of KubernetesPodOperator args's behavior
[ https://issues.apache.org/jira/browse/AIRFLOW-6068?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] JungraeKim updated AIRFLOW-6068: Priority: Minor (was: Major) > [Airflow] Set coherence of KubernetesPodOperator args's behavior > > > Key: AIRFLOW-6068 > URL: https://issues.apache.org/jira/browse/AIRFLOW-6068 > Project: Apache Airflow > Issue Type: Wish > Components: operators > Affects Versions: 1.10.6 > Environment: 1.10 > Reporter: JungraeKim > Assignee: JungraeKim > Priority: Minor > > There are many classes used by KubernetesPodOperator: > - Port > - Resource > - PodRuntimeInfoEnv > - Secret > - Volume > - VolumeMount > Some of them provide conversion from a dict (a _set function), but the other classes > need to be copied. > At the very least, PodRuntimeInfoEnv could use a _set function together with a validation function. -- This message was sent by Atlassian Jira (v8.3.4#803005)
[jira] [Updated] (AIRFLOW-6068) [Airflow] Set coherence of KubernetesPodOperator args's behavior
[ https://issues.apache.org/jira/browse/AIRFLOW-6068?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] JungraeKim updated AIRFLOW-6068: Issue Type: Wish (was: Bug) > [Airflow] Set coherence of KubernetesPodOperator args's behavior > > > Key: AIRFLOW-6068 > URL: https://issues.apache.org/jira/browse/AIRFLOW-6068 > Project: Apache Airflow > Issue Type: Wish > Components: operators > Affects Versions: 1.10.6 > Environment: 1.10 > Reporter: JungraeKim > Assignee: JungraeKim > Priority: Major > > There are many classes used by KubernetesPodOperator: > - Port > - Resource > - PodRuntimeInfoEnv > - Secret > - Volume > - VolumeMount > Some of them provide conversion from a dict (a _set function), but the other classes > need to be copied. > At the very least, PodRuntimeInfoEnv could use a _set function together with a validation function.
[jira] [Closed] (AIRFLOW-6064) Graph view broken when starting at a specific node in graph (root)
[ https://issues.apache.org/jira/browse/AIRFLOW-6064?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Emmanuel Brard closed AIRFLOW-6064. --- Resolution: Duplicate > Graph view broken when starting at a specific node in graph (root) > -- > > Key: AIRFLOW-6064 > URL: https://issues.apache.org/jira/browse/AIRFLOW-6064 > Project: Apache Airflow > Issue Type: Bug > Components: webserver > Affects Versions: 1.10.6 > Reporter: Emmanuel Brard > Priority: Major > Attachments: Screenshot 2019-11-25 at 18.17.47.png > > > The graph view seems broken when using > {code:java} > ?root= > {code} > For example > https://airflow./admin/airflow/graph?dag_id=maintenance_ghost_dags&root=are_there_ghost_dags > gives: > {code}
> dagre-d3.js:3696 Uncaught Error: Node 'check_dag_path' is not in graph
>     at Digraph.BaseGraph._strictGetNode (dagre-d3.js:3696)
>     at Digraph.BaseGraph._addEdge (dagre-d3.js:3642)
>     at Digraph.addEdge (dagre-d3.js:4037)
>     at dagre-d3.js:4890
>     at Array.forEach ()
>     at Object.exports.decode (dagre-d3.js:4889)
>     at graph?dag_id=maintenance_ghost_dags&root=are_there_ghost_dags:950
> {code} > Maybe related to this [PR|https://github.com/apache/airflow/pull/5874/files]
[jira] [Commented] (AIRFLOW-6067) [Airflow] pod_runtime_info_envs doesn't apply PodRuntimeInfoEnv class
[ https://issues.apache.org/jira/browse/AIRFLOW-6067?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16982239#comment-16982239 ] JungraeKim commented on AIRFLOW-6067: - I didn't catch Optional in master branch > [Airflow] pod_runtime_info_envs doesn't apply PodRuntimeInfoEnv class > - > > Key: AIRFLOW-6067 > URL: https://issues.apache.org/jira/browse/AIRFLOW-6067 > Project: Apache Airflow > Issue Type: Bug > Components: operators > Affects Versions: 1.10.6 > Environment: Airflow 10.x > Python 2.x, 3.x > Reporter: JungraeKim > Assignee: JungraeKim > Priority: Major > > When using KubernetesPodOperator, *pod_runtime_info_envs* doesn't work > properly. > > There are two ways to solve it: > > # Use a PodRuntimeInfoEnv class, like _set_resources > # Use runtime_info["key"] instead of runtime_info.key > > Right now, you can work around this issue with a class converted from a dict, like the code below. > {code:python}
> class AttrDict(dict):
>     def __init__(self, *args, **kwargs):
>         super(AttrDict, self).__init__(*args, **kwargs)
>         self.__dict__ = self
> {code}
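A minimal usage sketch of the AttrDict workaround quoted above: it makes `runtime_info.key` and `runtime_info["key"]` interchangeable. The key names here are illustrative, not the operator's actual fields.

```python
class AttrDict(dict):
    """Dict whose keys are also readable as attributes (the workaround above)."""
    def __init__(self, *args, **kwargs):
        super(AttrDict, self).__init__(*args, **kwargs)
        self.__dict__ = self  # attribute lookups now hit the dict's own storage

# Illustrative field names, not the operator's real schema:
env = AttrDict({"name": "MY_ENV", "field_path": "metadata.name"})
print(env.name)           # attribute-style access works ...
print(env["field_path"])  # ... and plain dict access still does
```

Because the instance's `__dict__` is the dict itself, attribute writes also land in the mapping, so code expecting either style keeps working.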
[GitHub] [airflow] potiuk merged pull request #6662: [AIRFLOW-6066] Added pre-commit checks for accidental debug stmts
potiuk merged pull request #6662: [AIRFLOW-6066] Added pre-commit checks for accidental debug stmts URL: https://github.com/apache/airflow/pull/6662
[jira] [Commented] (AIRFLOW-6066) We need pre-commit check to check for pydevd
[ https://issues.apache.org/jira/browse/AIRFLOW-6066?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16982251#comment-16982251 ] ASF GitHub Bot commented on AIRFLOW-6066: - potiuk commented on pull request #6662: [AIRFLOW-6066] Added pre-commit checks for accidental debug stmts URL: https://github.com/apache/airflow/pull/6662 > We need pre-commit check to check for pydevd > > > Key: AIRFLOW-6066 > URL: https://issues.apache.org/jira/browse/AIRFLOW-6066 > Project: Apache Airflow > Issue Type: Improvement > Components: ci > Affects Versions: 2.0.0, 1.10.6 > Reporter: Jarek Potiuk > Priority: Major >
[jira] [Commented] (AIRFLOW-6066) We need pre-commit check to check for pydevd
[ https://issues.apache.org/jira/browse/AIRFLOW-6066?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16982252#comment-16982252 ] ASF subversion and git services commented on AIRFLOW-6066: -- Commit 0ff9e2307042ba95e69b32e37f2fc767a5fdc36d in airflow's branch refs/heads/master from Jarek Potiuk [ https://gitbox.apache.org/repos/asf?p=airflow.git;h=0ff9e23 ] [AIRFLOW-6066] Added pre-commit checks for accidental debug stmts (#6662) > We need pre-commit check to check for pydevd > > > Key: AIRFLOW-6066 > URL: https://issues.apache.org/jira/browse/AIRFLOW-6066 > Project: Apache Airflow > Issue Type: Improvement > Components: ci > Affects Versions: 2.0.0, 1.10.6 > Reporter: Jarek Potiuk > Priority: Major >
[jira] [Resolved] (AIRFLOW-6066) We need pre-commit check to check for pydevd
[ https://issues.apache.org/jira/browse/AIRFLOW-6066?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Jarek Potiuk resolved AIRFLOW-6066. --- Fix Version/s: 1.10.7 Resolution: Fixed > We need pre-commit check to check for pydevd > > > Key: AIRFLOW-6066 > URL: https://issues.apache.org/jira/browse/AIRFLOW-6066 > Project: Apache Airflow > Issue Type: Improvement > Components: ci > Affects Versions: 2.0.0, 1.10.6 > Reporter: Jarek Potiuk > Priority: Major > Fix For: 1.10.7 > >
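As a rough illustration of what such a "debug statements" pre-commit check can do (a hedged sketch, not the actual hook added in #6662), one can walk a file's AST and flag imports of interactive debuggers such as pydevd:

```python
import ast

# Modules a debug-statement check might flag; the exact list is an assumption.
DEBUGGER_MODULES = {"pdb", "ipdb", "pudb", "pydevd"}

def find_debug_imports(source: str):
    """Return (lineno, module) pairs for imports of known debugger modules."""
    hits = []
    for node in ast.walk(ast.parse(source)):
        if isinstance(node, ast.Import):
            names = [alias.name for alias in node.names]
        elif isinstance(node, ast.ImportFrom):
            names = [node.module or ""]
        else:
            continue
        for name in names:
            if name.split(".")[0] in DEBUGGER_MODULES:
                hits.append((node.lineno, name))
    return hits

print(find_debug_imports("import os\nimport pydevd\n"))  # -> [(2, 'pydevd')]
```

A pre-commit hook would run such a scan over each staged file and fail the commit when any hit is reported.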
[jira] [Commented] (AIRFLOW-6066) We need pre-commit check to check for pydevd
[ https://issues.apache.org/jira/browse/AIRFLOW-6066?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16982259#comment-16982259 ] ASF subversion and git services commented on AIRFLOW-6066: -- Commit ff2ffa1b999ca35ec1831cac975c51666390fae1 in airflow's branch refs/heads/v1-10-test from Jarek Potiuk [ https://gitbox.apache.org/repos/asf?p=airflow.git;h=ff2ffa1 ] [AIRFLOW-6066] Added pre-commit checks for accidental debug stmts (#6662) (cherry picked from commit 0ff9e2307042ba95e69b32e37f2fc767a5fdc36d) > We need pre-commit check to check for pydevd > > > Key: AIRFLOW-6066 > URL: https://issues.apache.org/jira/browse/AIRFLOW-6066 > Project: Apache Airflow > Issue Type: Improvement > Components: ci > Affects Versions: 2.0.0, 1.10.6 > Reporter: Jarek Potiuk > Priority: Major > Fix For: 1.10.7 > >
[jira] [Commented] (AIRFLOW-5915) Add support for the new documentation theme
[ https://issues.apache.org/jira/browse/AIRFLOW-5915?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16982258#comment-16982258 ] ASF subversion and git services commented on AIRFLOW-5915: -- Commit 2478f0cb1d7adf2731b2b0db1dc809c84b8eeac4 in airflow's branch refs/heads/v1-10-test from Kamil Breguła [ https://gitbox.apache.org/repos/asf?p=airflow.git;h=2478f0c ] [AIRFLOW-5915] Add support for the new documentation theme (#6563) (cherry picked from commit 63fcc73ee17c031bf0c7973cef70e521602e78f5) > Add support for the new documentation theme > --- > > Key: AIRFLOW-5915 > URL: https://issues.apache.org/jira/browse/AIRFLOW-5915 > Project: Apache Airflow > Issue Type: Improvement > Components: documentation > Affects Versions: 1.10.6 > Reporter: Kamil Bregula > Assignee: Kamil Bregula > Priority: Major > Fix For: 1.10.7 > >
[GitHub] [airflow] potiuk commented on issue #6596: [AIRFLOW-6004] Untangle Executors class to avoid cyclic imports. Depends on [AIRFLOW-6010]
potiuk commented on issue #6596: [AIRFLOW-6004] Untangle Executors class to avoid cyclic imports. Depends on [AIRFLOW-6010] URL: https://github.com/apache/airflow/pull/6596#issuecomment-558527405 Hey @kaxil -> wait for the next push. I am fixing a lot of those comments and have all the tests passing, but I am also cleaning up taskinstance.py and scheduler_job.py (making them pylint-compliant and adding type annotations, since I am already touching those files quite a lot).
[GitHub] [airflow] codecov-io edited a comment on issue #6396: [AIRFLOW-5726] Delete table as file name in RedshiftToS3Transfer
codecov-io edited a comment on issue #6396: [AIRFLOW-5726] Delete table as file name in RedshiftToS3Transfer URL: https://github.com/apache/airflow/pull/6396#issuecomment-546875908 # [Codecov](https://codecov.io/gh/apache/airflow/pull/6396?src=pr&el=h1) Report > Merging [#6396](https://codecov.io/gh/apache/airflow/pull/6396?src=pr&el=desc) into [master](https://codecov.io/gh/apache/airflow/commit/74d2a0d9e77cf90b85654f65d1adba0875e0fb1f?src=pr&el=desc) will **increase** coverage by `2.91%`. > The diff coverage is `100%`. [![Impacted file tree graph](https://codecov.io/gh/apache/airflow/pull/6396/graphs/tree.svg?width=650&token=WdLKlKHOAU&height=150&src=pr)](https://codecov.io/gh/apache/airflow/pull/6396?src=pr&el=tree)
```diff
@@            Coverage Diff             @@
##           master    #6396      +/-   ##
==========================================
+ Coverage   80.59%   83.51%   +2.91%
==========================================
  Files         626      672      +46
  Lines       36243    37594    +1351
==========================================
+ Hits        29211    31396    +2185
+ Misses       7032     6198     -834
```
| [Impacted Files](https://codecov.io/gh/apache/airflow/pull/6396?src=pr&el=tree) | Coverage Δ | |
|---|---|---|
| [airflow/operators/redshift\_to\_s3\_operator.py](https://codecov.io/gh/apache/airflow/pull/6396/diff?src=pr&el=tree#diff-YWlyZmxvdy9vcGVyYXRvcnMvcmVkc2hpZnRfdG9fczNfb3BlcmF0b3IucHk=) | `97.29% <100%> (+0.15%)` | :arrow_up: |
| [airflow/contrib/sensors/cassandra\_record\_sensor.py](https://codecov.io/gh/apache/airflow/pull/6396/diff?src=pr&el=tree#diff-YWlyZmxvdy9jb250cmliL3NlbnNvcnMvY2Fzc2FuZHJhX3JlY29yZF9zZW5zb3IucHk=) | `0% <0%> (-100%)` | :arrow_down: |
| [airflow/contrib/hooks/aws\_firehose\_hook.py](https://codecov.io/gh/apache/airflow/pull/6396/diff?src=pr&el=tree#diff-YWlyZmxvdy9jb250cmliL2hvb2tzL2F3c19maXJlaG9zZV9ob29rLnB5) | `0% <0%> (-100%)` | :arrow_down: |
| [airflow/contrib/sensors/cassandra\_table\_sensor.py](https://codecov.io/gh/apache/airflow/pull/6396/diff?src=pr&el=tree#diff-YWlyZmxvdy9jb250cmliL3NlbnNvcnMvY2Fzc2FuZHJhX3RhYmxlX3NlbnNvci5weQ==) | `0% <0%> (-100%)` | :arrow_down: |
| 
[...low/contrib/sensors/aws\_redshift\_cluster\_sensor.py](https://codecov.io/gh/apache/airflow/pull/6396/diff?src=pr&el=tree#diff-YWlyZmxvdy9jb250cmliL3NlbnNvcnMvYXdzX3JlZHNoaWZ0X2NsdXN0ZXJfc2Vuc29yLnB5) | `0% <0%> (-100%)` | :arrow_down: | | [airflow/contrib/hooks/redshift\_hook.py](https://codecov.io/gh/apache/airflow/pull/6396/diff?src=pr&el=tree#diff-YWlyZmxvdy9jb250cmliL2hvb2tzL3JlZHNoaWZ0X2hvb2sucHk=) | `0% <0%> (-75%)` | :arrow_down: | | [airflow/kubernetes/volume\_mount.py](https://codecov.io/gh/apache/airflow/pull/6396/diff?src=pr&el=tree#diff-YWlyZmxvdy9rdWJlcm5ldGVzL3ZvbHVtZV9tb3VudC5weQ==) | `44.44% <0%> (-55.56%)` | :arrow_down: | | [airflow/kubernetes/volume.py](https://codecov.io/gh/apache/airflow/pull/6396/diff?src=pr&el=tree#diff-YWlyZmxvdy9rdWJlcm5ldGVzL3ZvbHVtZS5weQ==) | `52.94% <0%> (-47.06%)` | :arrow_down: | | [airflow/kubernetes/pod\_launcher.py](https://codecov.io/gh/apache/airflow/pull/6396/diff?src=pr&el=tree#diff-YWlyZmxvdy9rdWJlcm5ldGVzL3BvZF9sYXVuY2hlci5weQ==) | `45.25% <0%> (-46.72%)` | :arrow_down: | | [airflow/example\_dags/example\_python\_operator.py](https://codecov.io/gh/apache/airflow/pull/6396/diff?src=pr&el=tree#diff-YWlyZmxvdy9leGFtcGxlX2RhZ3MvZXhhbXBsZV9weXRob25fb3BlcmF0b3IucHk=) | `63.33% <0%> (-31.12%)` | :arrow_down: | | ... and [243 more](https://codecov.io/gh/apache/airflow/pull/6396/diff?src=pr&el=tree-more) | | -- [Continue to review full report at Codecov](https://codecov.io/gh/apache/airflow/pull/6396?src=pr&el=continue). > **Legend** - [Click here to learn more](https://docs.codecov.io/docs/codecov-delta) > `Δ = absolute (impact)`, `ø = not affected`, `? = missing data` > Powered by [Codecov](https://codecov.io/gh/apache/airflow/pull/6396?src=pr&el=footer). Last update [74d2a0d...297f06f](https://codecov.io/gh/apache/airflow/pull/6396?src=pr&el=lastupdated). Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments). This is an automated message from the Apache Git Service. 
To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: us...@infra.apache.org With regards, Apache Git Services
[GitHub] [airflow-site] kgabryje opened a new pull request #189: Content/texts
kgabryje opened a new pull request #189: Content/texts URL: https://github.com/apache/airflow-site/pull/189
[GitHub] [airflow-site] kgabryje opened a new pull request #190: Content/blogposts
kgabryje opened a new pull request #190: Content/blogposts URL: https://github.com/apache/airflow-site/pull/190
[GitHub] [airflow] tyg03485 opened a new pull request #6665: [AIRFLOW-6068] Create func for apply class based K8SModel
tyg03485 opened a new pull request #6665: [AIRFLOW-6068] Create func for apply class based K8SModel URL: https://github.com/apache/airflow/pull/6665 Make sure you have checked _all_ steps below. ### Jira - [ ] My PR addresses the following [Airflow Jira](https://issues.apache.org/jira/browse/AIRFLOW/) issues and references them in the PR title. For example, "\[AIRFLOW-XXX\] My Airflow PR" - https://issues.apache.org/jira/browse/AIRFLOW-6068 ### Description - [ ] Here are some details about my PR, including screenshots of any UI changes: ### Tests - [ ] My PR adds the following unit tests __OR__ does not need testing for this extremely good reason: ### Commits - [ ] My commits all reference Jira issues in their subject lines, and I have squashed multiple commits if they address the same issue. In addition, my commits follow the guidelines from "[How to write a good git commit message](http://chris.beams.io/posts/git-commit/)": 1. Subject is separated from body by a blank line 1. Subject is limited to 50 characters (not including Jira issue reference) 1. Subject does not end with a period 1. Subject uses the imperative mood ("add", not "adding") 1. Body wraps at 72 characters 1. Body explains "what" and "why", not "how" ### Documentation - [ ] In case of new functionality, my PR adds documentation that describes how to use it. - All the public functions and classes in the PR contain docstrings that explain what they do - If you implement backwards-incompatible changes, please leave a note in [Updating.md](https://github.com/apache/airflow/blob/master/UPDATING.md) so we can assign it to an appropriate release
[jira] [Commented] (AIRFLOW-6068) [Airflow] Set coherence of KubernetesPodOperator args's behavior
[ https://issues.apache.org/jira/browse/AIRFLOW-6068?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16982270#comment-16982270 ] ASF GitHub Bot commented on AIRFLOW-6068: - tyg03485 commented on pull request #6665: [AIRFLOW-6068] Create func for apply class based K8SModel URL: https://github.com/apache/airflow/pull/6665 Make sure you have checked _all_ steps below. ### Jira - [ ] My PR addresses the following [Airflow Jira](https://issues.apache.org/jira/browse/AIRFLOW/) issues and references them in the PR title. For example, "\[AIRFLOW-XXX\] My Airflow PR" - https://issues.apache.org/jira/browse/AIRFLOW-6068 ### Description - [ ] Here are some details about my PR, including screenshots of any UI changes: ### Tests - [ ] My PR adds the following unit tests __OR__ does not need testing for this extremely good reason: ### Commits - [ ] My commits all reference Jira issues in their subject lines, and I have squashed multiple commits if they address the same issue. In addition, my commits follow the guidelines from "[How to write a good git commit message](http://chris.beams.io/posts/git-commit/)": 1. Subject is separated from body by a blank line 1. Subject is limited to 50 characters (not including Jira issue reference) 1. Subject does not end with a period 1. Subject uses the imperative mood ("add", not "adding") 1. Body wraps at 72 characters 1. Body explains "what" and "why", not "how" ### Documentation - [ ] In case of new functionality, my PR adds documentation that describes how to use it. - All the public functions and the classes in the PR contain docstrings that explain what it does - If you implement backwards incompatible changes, please leave a note in the [Updating.md](https://github.com/apache/airflow/blob/master/UPDATING.md) so we can assign it to a appropriate release This is an automated message from the Apache Git Service. 
To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: us...@infra.apache.org > [Airflow] Set coherence of KubernetesPodOperator args's behavior > > > Key: AIRFLOW-6068 > URL: https://issues.apache.org/jira/browse/AIRFLOW-6068 > Project: Apache Airflow > Issue Type: Wish > Components: operators >Affects Versions: 1.10.6 > Environment: 1.10 >Reporter: JungraeKim >Assignee: JungraeKim >Priority: Minor > > There are many class used by KubernetesPodOperator. > - Port > - Resource > - PodRuntimeInfoEnv > - Secret > - Volume > - VolumeMount > Some of them provide convert (_set function) from dict type. But other class > need to be copied. > At least, PodRuntimeInfoEnv can used _set function with validate function -- This message was sent by Atlassian Jira (v8.3.4#803005)
[jira] [Updated] (AIRFLOW-6068) [Airflow] Set coherence of KubernetesPodOperator args's behavior
[ https://issues.apache.org/jira/browse/AIRFLOW-6068?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] JungraeKim updated AIRFLOW-6068: Description: There are many classes used by KubernetesPodOperator: - Port - Resource - PodRuntimeInfoEnv - Secret - Volume - VolumeMount Some of them provide conversion from a dict (a _set function), but the other classes need to be copied. So, create a general function for converting a dict into a class instance. was: There are many class used by KubernetesPodOperator. - Port - Resource - PodRuntimeInfoEnv - Secret - Volume - VolumeMount Some of them provide convert (_set function) from dict type. But other class need to be copied. At least, PodRuntimeInfoEnv can used _set function with validate function > [Airflow] Set coherence of KubernetesPodOperator args's behavior > > > Key: AIRFLOW-6068 > URL: https://issues.apache.org/jira/browse/AIRFLOW-6068 > Project: Apache Airflow > Issue Type: Wish > Components: operators > Affects Versions: 1.10.6 > Environment: 1.10 > Reporter: JungraeKim > Assignee: JungraeKim > Priority: Minor > > There are many classes used by KubernetesPodOperator: > - Port > - Resource > - PodRuntimeInfoEnv > - Secret > - Volume > - VolumeMount > Some of them provide conversion from a dict (a _set function), but the other classes > need to be copied. > So, create a general function for converting a dict into a class instance.
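The "general function for converting a dict into a class instance" proposed above could look roughly like this sketch: a converter that validates the dict's keys against the target class's constructor before instantiating it. The class and field names below are illustrative stand-ins, not Airflow's actual kubernetes models.

```python
import inspect

def from_dict(cls, data):
    """Build an instance of ``cls`` from a plain dict, rejecting unknown keys."""
    params = inspect.signature(cls.__init__).parameters
    allowed = {p for p in params if p != "self"}
    unknown = set(data) - allowed
    if unknown:
        raise ValueError("Unknown fields for %s: %s" % (cls.__name__, sorted(unknown)))
    return cls(**data)

class PodRuntimeInfoEnv:  # illustrative stand-in for the Airflow class
    def __init__(self, name, field_path):
        self.name = name
        self.field_path = field_path

env = from_dict(PodRuntimeInfoEnv, {"name": "POD_NAME", "field_path": "metadata.name"})
print(env.field_path)  # -> metadata.name
```

One such helper would give all the KubernetesPodOperator argument classes the same dict-accepting behavior instead of each needing its own _set function.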
[jira] [Commented] (AIRFLOW-6066) We need pre-commit check to check for pydevd
[ https://issues.apache.org/jira/browse/AIRFLOW-6066?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16982281#comment-16982281 ] ASF subversion and git services commented on AIRFLOW-6066: -- Commit ca1ae8e8646910a19d51a9634fdafa566cfc0b9c in airflow's branch refs/heads/v1-10-test from Jarek Potiuk [ https://gitbox.apache.org/repos/asf?p=airflow.git;h=ca1ae8e ] [AIRFLOW-6066] Added pre-commit checks for accidental debug stmts (#6662) (cherry picked from commit 0ff9e2307042ba95e69b32e37f2fc767a5fdc36d) > We need pre-commit check to check for pydevd > > > Key: AIRFLOW-6066 > URL: https://issues.apache.org/jira/browse/AIRFLOW-6066 > Project: Apache Airflow > Issue Type: Improvement > Components: ci > Affects Versions: 2.0.0, 1.10.6 > Reporter: Jarek Potiuk > Priority: Major > Fix For: 1.10.7 > >
[jira] [Created] (AIRFLOW-6069) Host Travis build should use always python 3.6
Jarek Potiuk created AIRFLOW-6069: - Summary: Host Travis build should use always python 3.6 Key: AIRFLOW-6069 URL: https://issues.apache.org/jira/browse/AIRFLOW-6069 Project: Apache Airflow Issue Type: Improvement Components: ci Affects Versions: 1.10.6, 2.0.0 Reporter: Jarek Potiuk The host Python version for Travis can be standardized to Python 3.6, which will make the pre-commit checks and scripts more robust. PYTHON_VERSION, in contrast, sets the version used inside the CI containers to run the tests.
[GitHub] [airflow] potiuk opened a new pull request #6666: [AIRFLOW-6069] Python host version in travis is set to 3.6 always
potiuk opened a new pull request #: [AIRFLOW-6069] Python host version in travis is set to 3.6 always URL: https://github.com/apache/airflow/pull/ This will make the scripts more "stable" - no problems with features missing in 3.5 for host scripts. Python version for all tests in container is controlled via PYTHON_VERSION variable. Make sure you have checked _all_ steps below. ### Jira - [x] My PR addresses the following [Airflow Jira](https://issues.apache.org/jira/browse/AIRFLOW/) issues and references them in the PR title. For example, "\[AIRFLOW-XXX\] My Airflow PR" - https://issues.apache.org/jira/browse/AIRFLOW-6069 ### Description - [x] Here are some details about my PR, including screenshots of any UI changes: ### Tests - [x] My PR adds the following unit tests __OR__ does not need testing for this extremely good reason: ### Commits - [x] My commits all reference Jira issues in their subject lines, and I have squashed multiple commits if they address the same issue. In addition, my commits follow the guidelines from "[How to write a good git commit message](http://chris.beams.io/posts/git-commit/)": 1. Subject is separated from body by a blank line 1. Subject is limited to 50 characters (not including Jira issue reference) 1. Subject does not end with a period 1. Subject uses the imperative mood ("add", not "adding") 1. Body wraps at 72 characters 1. Body explains "what" and "why", not "how" ### Documentation - [x] In case of new functionality, my PR adds documentation that describes how to use it. - All the public functions and the classes in the PR contain docstrings that explain what it does - If you implement backwards incompatible changes, please leave a note in the [Updating.md](https://github.com/apache/airflow/blob/master/UPDATING.md) so we can assign it to a appropriate release This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. 
For queries about this service, please contact Infrastructure at: us...@infra.apache.org With regards, Apache Git Services
[GitHub] [airflow] potiuk commented on issue #6666: [AIRFLOW-6069] Python host version in travis is set to 3.6 always
potiuk commented on issue #: [AIRFLOW-6069] Python host version in travis is set to 3.6 always URL: https://github.com/apache/airflow/pull/#issuecomment-558539894 @feluelle -> it turned out that the debug-statement pre-commit hook requires Python 3.6+, so I set Python 3.6 as the host Python version on both master and v1-10-test. On v1-10-test it's already added.
[jira] [Commented] (AIRFLOW-6069) Host Travis build should use always python 3.6
[ https://issues.apache.org/jira/browse/AIRFLOW-6069?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16982282#comment-16982282 ] ASF GitHub Bot commented on AIRFLOW-6069: - potiuk commented on pull request #: [AIRFLOW-6069] Python host version in travis is set to 3.6 always URL: https://github.com/apache/airflow/pull/ This will make the scripts more "stable" - no problems with features missing in 3.5 for host scripts. Python version for all tests in container is controlled via PYTHON_VERSION variable. Make sure you have checked _all_ steps below. ### Jira - [x] My PR addresses the following [Airflow Jira](https://issues.apache.org/jira/browse/AIRFLOW/) issues and references them in the PR title. For example, "\[AIRFLOW-XXX\] My Airflow PR" - https://issues.apache.org/jira/browse/AIRFLOW-6069 ### Description - [x] Here are some details about my PR, including screenshots of any UI changes: ### Tests - [x] My PR adds the following unit tests __OR__ does not need testing for this extremely good reason: ### Commits - [x] My commits all reference Jira issues in their subject lines, and I have squashed multiple commits if they address the same issue. In addition, my commits follow the guidelines from "[How to write a good git commit message](http://chris.beams.io/posts/git-commit/)": 1. Subject is separated from body by a blank line 1. Subject is limited to 50 characters (not including Jira issue reference) 1. Subject does not end with a period 1. Subject uses the imperative mood ("add", not "adding") 1. Body wraps at 72 characters 1. Body explains "what" and "why", not "how" ### Documentation - [x] In case of new functionality, my PR adds documentation that describes how to use it. 
- All the public functions and the classes in the PR contain docstrings that explain what it does - If you implement backwards incompatible changes, please leave a note in the [Updating.md](https://github.com/apache/airflow/blob/master/UPDATING.md) so we can assign it to a appropriate release This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: us...@infra.apache.org > Host Travis build should use always python 3.6 > -- > > Key: AIRFLOW-6069 > URL: https://issues.apache.org/jira/browse/AIRFLOW-6069 > Project: Apache Airflow > Issue Type: Improvement > Components: ci >Affects Versions: 2.0.0, 1.10.6 >Reporter: Jarek Potiuk >Priority: Major > > Host python version for Travis can be standardized to python 3.6 - which will > make the pre-commit checks and scripts more robust. PYTHON_VERSION sets the > version used inside CI containers to run the tests instead. -- This message was sent by Atlassian Jira (v8.3.4#803005)
[jira] [Commented] (AIRFLOW-6066) We need pre-commit check to check for pydevd
[ https://issues.apache.org/jira/browse/AIRFLOW-6066?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16982287#comment-16982287 ] ASF subversion and git services commented on AIRFLOW-6066: -- Commit fd06f18e299234b3640e71745b4e3369b4ee09d3 in airflow's branch refs/heads/v1-10-test from Jarek Potiuk [ https://gitbox.apache.org/repos/asf?p=airflow.git;h=fd06f18 ] [AIRFLOW-6066] Added pre-commit checks for accidental debug stmts (#6662) (cherry picked from commit 0ff9e2307042ba95e69b32e37f2fc767a5fdc36d) > We need pre-commit check to check for pydevd > > > Key: AIRFLOW-6066 > URL: https://issues.apache.org/jira/browse/AIRFLOW-6066 > Project: Apache Airflow > Issue Type: Improvement > Components: ci > Affects Versions: 2.0.0, 1.10.6 > Reporter: Jarek Potiuk > Priority: Major > Fix For: 1.10.7 > >
[GitHub] [airflow] kaxil commented on issue #6601: [AIRFLOW-6010] Remove cyclic imports and pylint disables
kaxil commented on issue #6601: [AIRFLOW-6010] Remove cyclic imports and pylint disables URL: https://github.com/apache/airflow/pull/6601#issuecomment-558548281 @potiuk - We have merge conflicts!
[jira] [Resolved] (AIRFLOW-5726) Delete table as file name in RedshiftToS3Transfer
[ https://issues.apache.org/jira/browse/AIRFLOW-5726?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Kaxil Naik resolved AIRFLOW-5726. - Fix Version/s: 1.10.7 2.0.0 Resolution: Fixed > Delete table as file name in RedshiftToS3Transfer > - > > Key: AIRFLOW-5726 > URL: https://issues.apache.org/jira/browse/AIRFLOW-5726 > Project: Apache Airflow > Issue Type: Improvement > Components: operators >Affects Versions: 1.10.5 >Reporter: Javier Lopez Tomas >Assignee: Javier Lopez Tomas >Priority: Minor > Labels: redshift, s3 > Fix For: 2.0.0, 1.10.7 > > > Right now, if you want to unload a Redshift table called 'people' to S3, in a > bucket X, key Y, with the name 'bad_people', you can't: the directory is always set to > s3://X/Y/people_ > This limits your freedom to save the table under the name you want
[GitHub] [airflow] kaxil merged pull request #6396: [AIRFLOW-5726] Delete table as file name in RedshiftToS3Transfer
kaxil merged pull request #6396: [AIRFLOW-5726] Delete table as file name in RedshiftToS3Transfer URL: https://github.com/apache/airflow/pull/6396
[GitHub] [airflow] daphshez opened a new pull request #6667: Clarified a grammatically incorrect sentence
daphshez opened a new pull request #6667: Clarified a grammatically incorrect sentence URL: https://github.com/apache/airflow/pull/6667

Make sure you have checked _all_ steps below.

### Jira
- [ ] My PR addresses the following [Airflow Jira](https://issues.apache.org/jira/browse/AIRFLOW/) issues and references them in the PR title. For example, "\[AIRFLOW-XXX\] My Airflow PR"
  - https://issues.apache.org/jira/browse/AIRFLOW-XXX
  - In case you are fixing a typo in the documentation you can prepend your commit with \[AIRFLOW-XXX\]; code changes always need a Jira issue.
  - In case you are proposing a fundamental code change, you need to create an Airflow Improvement Proposal ([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals)).
  - In case you are adding a dependency, check if the license complies with the [ASF 3rd Party License Policy](https://www.apache.org/legal/resolved.html#category-x).

### Description
- [ ] Here are some details about my PR, including screenshots of any UI changes:

### Tests
- [ ] My PR adds the following unit tests __OR__ does not need testing for this extremely good reason:

### Commits
- [ ] My commits all reference Jira issues in their subject lines, and I have squashed multiple commits if they address the same issue. In addition, my commits follow the guidelines from "[How to write a good git commit message](http://chris.beams.io/posts/git-commit/)":
  1. Subject is separated from body by a blank line
  1. Subject is limited to 50 characters (not including Jira issue reference)
  1. Subject does not end with a period
  1. Subject uses the imperative mood ("add", not "adding")
  1. Body wraps at 72 characters
  1. Body explains "what" and "why", not "how"

### Documentation
- [ ] In case of new functionality, my PR adds documentation that describes how to use it.
- All the public functions and classes in the PR contain docstrings that explain what they do - If you implement backwards-incompatible changes, please leave a note in the [Updating.md](https://github.com/apache/airflow/blob/master/UPDATING.md) so we can assign it to an appropriate release
[jira] [Commented] (AIRFLOW-5726) Delete table as file name in RedshiftToS3Transfer
[ https://issues.apache.org/jira/browse/AIRFLOW-5726?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16982292#comment-16982292 ] ASF GitHub Bot commented on AIRFLOW-5726: - kaxil commented on pull request #6396: [AIRFLOW-5726] Delete table as file name in RedshiftToS3Transfer URL: https://github.com/apache/airflow/pull/6396 > Delete table as file name in RedshiftToS3Transfer > - > > Key: AIRFLOW-5726 > URL: https://issues.apache.org/jira/browse/AIRFLOW-5726 > Project: Apache Airflow > Issue Type: Improvement > Components: operators >Affects Versions: 1.10.5 >Reporter: Javier Lopez Tomas >Assignee: Javier Lopez Tomas >Priority: Minor > Labels: redshift, s3 > > Right now, if you want to unload a Redshift table called 'people' to S3, in a > bucket X, key Y, with the name 'bad_people', you can't: the directory is always set to > s3://X/Y/people_ > This limits your freedom to save the table under the name you want
[jira] [Commented] (AIRFLOW-5726) Delete table as file name in RedshiftToS3Transfer
[ https://issues.apache.org/jira/browse/AIRFLOW-5726?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16982293#comment-16982293 ] ASF subversion and git services commented on AIRFLOW-5726: -- Commit 4a17bca9e7f3c64ccce11b9d0144249b1259f667 in airflow's branch refs/heads/master from JavierLopezT [ https://gitbox.apache.org/repos/asf?p=airflow.git;h=4a17bca ] [AIRFLOW-5726] Allow custom filename in RedshiftToS3Transfer (#6396) > Delete table as file name in RedshiftToS3Transfer > - > > Key: AIRFLOW-5726 > URL: https://issues.apache.org/jira/browse/AIRFLOW-5726 > Project: Apache Airflow > Issue Type: Improvement > Components: operators >Affects Versions: 1.10.5 >Reporter: Javier Lopez Tomas >Assignee: Javier Lopez Tomas >Priority: Minor > Labels: redshift, s3 > Fix For: 2.0.0, 1.10.7 > > > Right now, if you want to unload a Redshift table called 'people' to S3, in a > bucket X, key Y, with the name 'bad_people', you can't: the directory is always set to > s3://X/Y/people_ > This limits your freedom to save the table under the name you want
[GitHub] [airflow] nuclearpinguin commented on a change in pull request #6472: [AIRFLOW-6058] Running tests with pytest
nuclearpinguin commented on a change in pull request #6472: [AIRFLOW-6058] Running tests with pytest URL: https://github.com/apache/airflow/pull/6472#discussion_r350637482 ## File path: tests/pytest.ini ## @@ -0,0 +1,37 @@ +# Licensed to the Apache Software Foundation (ASF) under one +# or more contributor license agreements. See the NOTICE file +# distributed with this work for additional information +# regarding copyright ownership. The ASF licenses this file +# to you under the Apache License, Version 2.0 (the +# "License"); you may not use this file except in compliance +# with the License. You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, +# software distributed under the License is distributed on an +# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY +# KIND, either express or implied. See the License for the +# specific language governing permissions and limitations +# under the License. + +[pytest] +addopts = +-rasl +--cov=airflow/ +--cov-report html:airflow/www/static/coverage/ +--ignore=tests/dags/test_dag_serialization.py +;This will treat all tests as flaky +;--force-flaky +norecursedirs = +tests/dags_with_system_exit +tests/test_utils +tests/dags_corrupted +faulthandler_timeout=180 +log_print = True +log_level = INFO +filterwarnings = +ignore::DeprecationWarning +ignore::PendingDeprecationWarning +ignore::RuntimeWarning Review comment: I see no `RuntimeWarning` now. What do you think about filtering out other warnings? @potiuk @mik-laj @ashb @feluelle This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: us...@infra.apache.org With regards, Apache Git Services
[GitHub] [airflow] nuclearpinguin commented on a change in pull request #6472: [AIRFLOW-6058] Running tests with pytest
nuclearpinguin commented on a change in pull request #6472: [AIRFLOW-6058] Running tests with pytest URL: https://github.com/apache/airflow/pull/6472#discussion_r350637795
## File path: tests/__init__.py ##
@@ -20,4 +20,4 @@
 # flake8: noqa
 from .api import *  # type: ignore
-from .core import *  # type: ignore
+from .test_core import *  # type: ignore
Review comment: It seems that everything can be removed from this `__init__` ;)
[GitHub] [airflow] nuclearpinguin commented on a change in pull request #6472: [AIRFLOW-6058] Running tests with pytest
nuclearpinguin commented on a change in pull request #6472: [AIRFLOW-6058] Running tests with pytest URL: https://github.com/apache/airflow/pull/6472#discussion_r350638827 ## File path: tests/conftest.py ## @@ -0,0 +1,47 @@ +# Licensed to the Apache Software Foundation (ASF) under one +# or more contributor license agreements. See the NOTICE file +# distributed with this work for additional information +# regarding copyright ownership. The ASF licenses this file +# to you under the Apache License, Version 2.0 (the +# "License"); you may not use this file except in compliance +# with the License. You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, +# software distributed under the License is distributed on an +# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY +# KIND, either express or implied. See the License for the +# specific language governing permissions and limitations +# under the License. + +import os +from copy import deepcopy + +import pytest + +from airflow.utils import db + + +@pytest.fixture(autouse=True) +def reset_environment(): +""" +Resets env variables. +""" +init_env = deepcopy(os.environ) Review comment: I checked all those occurrences and fixed `test_hive_hook` in other places it's used in helper or handled properly. This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: us...@infra.apache.org With regards, Apache Git Services
[GitHub] [airflow-site] kgabryje opened a new pull request #191: center icon in accordion
kgabryje opened a new pull request #191: center icon in accordion URL: https://github.com/apache/airflow-site/pull/191
[jira] [Commented] (AIRFLOW-6066) We need pre-commit check to check for pydevd
[ https://issues.apache.org/jira/browse/AIRFLOW-6066?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16982300#comment-16982300 ] ASF subversion and git services commented on AIRFLOW-6066: -- Commit 01b59e291f2e80d017a87a0b8a4be72118f0248f in airflow's branch refs/heads/v1-10-test from Jarek Potiuk [ https://gitbox.apache.org/repos/asf?p=airflow.git;h=01b59e2 ] [AIRFLOW-6066] Added pre-commit checks for accidental debug stmts (#6662) (cherry picked from commit 0ff9e2307042ba95e69b32e37f2fc767a5fdc36d) > We need pre-commit check to check for pydevd > > > Key: AIRFLOW-6066 > URL: https://issues.apache.org/jira/browse/AIRFLOW-6066 > Project: Apache Airflow > Issue Type: Improvement > Components: ci >Affects Versions: 2.0.0, 1.10.6 >Reporter: Jarek Potiuk >Priority: Major > Fix For: 1.10.7 > > -- This message was sent by Atlassian Jira (v8.3.4#803005)
[GitHub] [airflow] potiuk commented on issue #6601: [AIRFLOW-6010] Remove cyclic imports and pylint disables
potiuk commented on issue #6601: [AIRFLOW-6010] Remove cyclic imports and pylint disables URL: https://github.com/apache/airflow/pull/6601#issuecomment-558553486 @kaxil -> thanks! Rebased.
[GitHub] [airflow-site] kgabryje opened a new pull request #192: move Aizhamal and Kevin from commiters to pmcs
kgabryje opened a new pull request #192: move Aizhamal and Kevin from commiters to pmcs URL: https://github.com/apache/airflow-site/pull/192
[jira] [Created] (AIRFLOW-6071) On the admin dashboard, the recent tasks column has no tooltip for tasks in 'null' state
Adam Hopkinson created AIRFLOW-6071: --- Summary: On the admin dashboard, the recent tasks column has no tooltip for tasks in 'null' state Key: AIRFLOW-6071 URL: https://issues.apache.org/jira/browse/AIRFLOW-6071 Project: Apache Airflow Issue Type: Bug Components: ui Affects Versions: 1.10.2 Environment: GCP Composer composer-1.7.5-airflow-1.10.2 Reporter: Adam Hopkinson Attachments: image-2019-11-26-10-04-27-133.png On the DAGS listing template, the circles in the _Recent Tasks_ column all have a tooltip apart from the second to last - which is for tasks with state = `null` !image-2019-11-26-10-04-27-133.png|width=261,height=37! I believe this is happening in [this line of code|https://github.com/apache/airflow/blob/0ff9e2307042ba95e69b32e37f2fc767a5fdc36d/airflow/www/templates/airflow/dags.html#L447], which is: {{.attr('title', function(d) \{return d.state || 'none'})}} I'm not sure why it's not falling back to 'none' - I think it's possibly seeing the value of d.state as the text value 'null' rather than a true null, but then putting that into the title as true null. I'm using GCP Composer, so don't have a local instance of Airflow that I can test. -- This message was sent by Atlassian Jira (v8.3.4#803005)
[jira] [Created] (AIRFLOW-6070) On the admin dashboard, the recent tasks column has no tooltip for tasks in 'null' state
Adam Hopkinson created AIRFLOW-6070: --- Summary: On the admin dashboard, the recent tasks column has no tooltip for tasks in 'null' state Key: AIRFLOW-6070 URL: https://issues.apache.org/jira/browse/AIRFLOW-6070 Project: Apache Airflow Issue Type: Bug Components: ui Affects Versions: 1.10.2 Environment: GCP Composer composer-1.7.5-airflow-1.10.2 Reporter: Adam Hopkinson Attachments: image-2019-11-26-10-03-40-377.png On the DAGS listing template, the circles in the _Recent Tasks_ column all have a tooltip apart from the second to last - which is for tasks with state = `null` !image-2019-11-26-10-03-40-377.png|width=261,height=37! I believe this is happening in [this line of code|https://github.com/apache/airflow/blob/0ff9e2307042ba95e69b32e37f2fc767a5fdc36d/airflow/www/templates/airflow/dags.html#L447], which is: {{.attr('title', function(d) \{return d.state || 'none'})}} I'm not sure why it's not falling back to 'none' - I think it's possibly seeing the value of d.state as the text value 'null' rather than a true null, but then putting that into the title as true null. I'm using GCP Composer, so don't have a local instance of Airflow that I can test. -- This message was sent by Atlassian Jira (v8.3.4#803005)
[jira] [Closed] (AIRFLOW-6071) On the admin dashboard, the recent tasks column has no tooltip for tasks in 'null' state
[ https://issues.apache.org/jira/browse/AIRFLOW-6071?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Adam Hopkinson closed AIRFLOW-6071. --- Resolution: Duplicate Apologies, issue submitted twice > On the admin dashboard, the recent tasks column has no tooltip for tasks in > 'null' state > > > Key: AIRFLOW-6071 > URL: https://issues.apache.org/jira/browse/AIRFLOW-6071 > Project: Apache Airflow > Issue Type: Bug > Components: ui >Affects Versions: 1.10.2 > Environment: GCP Composer composer-1.7.5-airflow-1.10.2 >Reporter: Adam Hopkinson >Priority: Trivial > Attachments: image-2019-11-26-10-04-27-133.png > > > On the DAGS listing template, the circles in the _Recent Tasks_ column all > have a tooltip apart from the second to last - which is for tasks with state > = `null` > !image-2019-11-26-10-04-27-133.png|width=261,height=37! > I believe this is happening in [this line of > code|https://github.com/apache/airflow/blob/0ff9e2307042ba95e69b32e37f2fc767a5fdc36d/airflow/www/templates/airflow/dags.html#L447], > which is: > {{.attr('title', function(d) \{return d.state || 'none'})}} > I'm not sure why it's not falling back to 'none' - I think it's possibly > seeing the value of d.state as the text value 'null' rather than a true null, > but then putting that into the title as true null. > I'm using GCP Composer, so don't have a local instance of Airflow that I can > test. -- This message was sent by Atlassian Jira (v8.3.4#803005)
[GitHub] [airflow] potiuk commented on a change in pull request #6472: [AIRFLOW-6058] Running tests with pytest
potiuk commented on a change in pull request #6472: [AIRFLOW-6058] Running tests with pytest URL: https://github.com/apache/airflow/pull/6472#discussion_r350648473
## File path: airflow/www/views.py ##
@@ -1029,7 +1029,6 @@ def blocked(self, session=None):
 for dag_id, active_dag_runs in dags:
 max_active_runs = 0
 dag = dagbag.get_dag(dag_id)
-max_active_runs = dagbag.dags[dag_id].max_active_runs
Review comment: 👍
[GitHub] [airflow-site] kgabryje opened a new pull request #193: temporarily replace link to install page with link to docs
kgabryje opened a new pull request #193: temporarily replace link to install page with link to docs URL: https://github.com/apache/airflow-site/pull/193 When the texts on Install page are ready, please revert this commit.
[GitHub] [airflow] potiuk commented on a change in pull request #6472: [AIRFLOW-6058] Running tests with pytest
potiuk commented on a change in pull request #6472: [AIRFLOW-6058] Running tests with pytest URL: https://github.com/apache/airflow/pull/6472#discussion_r350648236 ## File path: tests/conftest.py ## @@ -0,0 +1,47 @@ +# Licensed to the Apache Software Foundation (ASF) under one +# or more contributor license agreements. See the NOTICE file +# distributed with this work for additional information +# regarding copyright ownership. The ASF licenses this file +# to you under the Apache License, Version 2.0 (the +# "License"); you may not use this file except in compliance +# with the License. You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, +# software distributed under the License is distributed on an +# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY +# KIND, either express or implied. See the License for the +# specific language governing permissions and limitations +# under the License. + +import os +from copy import deepcopy + +import pytest + +from airflow.utils import db + + +@pytest.fixture(autouse=True) +def reset_environment(): +""" +Resets env variables. +""" +init_env = deepcopy(os.environ) Review comment: @nuclearpinguin ^^ monkeypatch seems nicer and pytest-dedicated way of temporary modifying the environment (and other things). It is - far less code and it restores environment automatically. So if we decide to optimise the speed and get-rid of auto-fixture, I think monkeypatch is better solution. This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: us...@infra.apache.org With regards, Apache Git Services
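For illustration, the monkeypatch-based alternative could look like the sketch below. It uses pytest's public `MonkeyPatch` API (the context-manager form, available in modern pytest); the environment variable name is made up for the example:

```python
# Sketch of the monkeypatch alternative to deep-copying os.environ:
# pytest undoes every change registered on a MonkeyPatch automatically,
# so no manual save/restore is needed. The variable name is illustrative.
import os

import pytest  # MonkeyPatch is public API in modern pytest

def read_with_temp_env():
    with pytest.MonkeyPatch.context() as mp:
        mp.setenv("AIRFLOW_DEMO_VAR", "42")
        inside = os.environ["AIRFLOW_DEMO_VAR"]   # change visible inside the block
    outside = os.environ.get("AIRFLOW_DEMO_VAR")  # removed again on exit
    return inside, outside
```

In a test function you would simply request the `monkeypatch` fixture and call `monkeypatch.setenv(...)`; pytest restores the environment when the test finishes.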
[GitHub] [airflow] potiuk commented on a change in pull request #6472: [AIRFLOW-6058] Running tests with pytest
potiuk commented on a change in pull request #6472: [AIRFLOW-6058] Running tests with pytest URL: https://github.com/apache/airflow/pull/6472#discussion_r350650581 ## File path: tests/pytest.ini ## @@ -0,0 +1,37 @@ +# Licensed to the Apache Software Foundation (ASF) under one +# or more contributor license agreements. See the NOTICE file +# distributed with this work for additional information +# regarding copyright ownership. The ASF licenses this file +# to you under the Apache License, Version 2.0 (the +# "License"); you may not use this file except in compliance +# with the License. You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, +# software distributed under the License is distributed on an +# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY +# KIND, either express or implied. See the License for the +# specific language governing permissions and limitations +# under the License. + +[pytest] +addopts = +-rasl +--cov=airflow/ +--cov-report html:airflow/www/static/coverage/ +--ignore=tests/dags/test_dag_serialization.py +;This will treat all tests as flaky +;--force-flaky +norecursedirs = +tests/dags_with_system_exit +tests/test_utils +tests/dags_corrupted +faulthandler_timeout=180 +log_print = True +log_level = INFO +filterwarnings = +ignore::DeprecationWarning +ignore::PendingDeprecationWarning +ignore::RuntimeWarning Review comment: I am all ok for filtering Deprecation warnings but only when we run in CI. If we run tests locally, we should see the warnings. I think there is very little value in warnings displayed in CI for regular PRs. Usually those will be the "other" warnings that you are not really interested in. 
Maybe, what could be done instead: we have the CRON build run daily, where we already have more comprehensive tests (for example, we build the whole Docker image from scratch and we always run the Kubernetes tests no matter whether any of the Kubernetes files changed). Maybe we could change it so that Deprecation warnings are still displayed when you run tests locally and when you run the CRON CI job, but not when you run a regular PR.
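The split suggested above could be sketched as a small conftest.py helper that installs the filters only when running on CI. This is illustrative, not Airflow's actual configuration; it assumes the `CI` environment variable that Travis (and most CI systems) sets:

```python
# Sketch: ignore noisy warning categories only on CI, keep them visible
# in local runs. Relies on the CI env var set by Travis; illustrative only.
import os
import warnings

def configure_warning_filters(env=None):
    """Prepend 'ignore' filters for deprecation warnings when on CI."""
    env = os.environ if env is None else env
    if env.get("CI") == "true":
        warnings.simplefilter("ignore", DeprecationWarning)
        warnings.simplefilter("ignore", PendingDeprecationWarning)
```

Calling this from a conftest.py hook keeps pytest.ini free of hard-coded ignores, so local runs still surface the warnings that nudge contributors to fix them.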
[GitHub] [airflow] potiuk commented on a change in pull request #6472: [AIRFLOW-6058] Running tests with pytest
potiuk commented on a change in pull request #6472: [AIRFLOW-6058] Running tests with pytest URL: https://github.com/apache/airflow/pull/6472#discussion_r350651531
## File path: tests/task/task_runner/test_standard_task_runner.py ##
@@ -126,7 +126,11 @@ def test_on_kill(self):
 runner.terminate()
 # Wait some time for the result
-time.sleep(40)
+for _ in range(5):
Review comment: Much better!
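The retry loop in that diff is the classic poll-with-timeout pattern: instead of always sleeping for the worst case, check the condition periodically and stop as soon as it holds. A generic sketch (names and defaults are illustrative, not taken from Airflow's tests):

```python
# Generic poll-with-timeout sketch replacing a fixed time.sleep(40):
# check the condition on each attempt and return early on success.
import time

def wait_for(condition, attempts=5, interval=2.0):
    """Call `condition` up to `attempts` times, sleeping `interval`
    seconds between tries; True as soon as it succeeds, else False."""
    for attempt in range(attempts):
        if condition():
            return True
        if attempt < attempts - 1:
            time.sleep(interval)
    return False
```

In the test above the condition would be "the task state has been recorded", so a passing run finishes in a couple of seconds instead of always waiting 40.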
[GitHub] [airflow] potiuk commented on a change in pull request #6472: [AIRFLOW-6058] Running tests with pytest
potiuk commented on a change in pull request #6472: [AIRFLOW-6058] Running tests with pytest URL: https://github.com/apache/airflow/pull/6472#discussion_r350651117
## File path: setup.py ##
@@ -301,7 +301,6 @@ def write_version(filename: str = os.path.join(*["airflow", "git_version"])):
 'pytest-cov==2.8.1',
 'pywinrm',
 'qds-sdk>=1.9.6',
-'rednose',
Review comment: Right :).
[GitHub] [airflow] potiuk commented on a change in pull request #6472: [AIRFLOW-6058] Running tests with pytest
potiuk commented on a change in pull request #6472: [AIRFLOW-6058] Running tests with pytest URL: https://github.com/apache/airflow/pull/6472#discussion_r350651423
## File path: tests/pytest.ini ##
@@ -20,7 +20,6 @@ addopts =
 -rasl
 --cov=airflow/
 --cov-report html:airflow/www/static/coverage/
---ignore=tests/dags/test_dag_serialization.py
Review comment: Good :)
[GitHub] [airflow-site] kgabryje closed pull request #193: temporarily replace link to install page with link to docs
kgabryje closed pull request #193: temporarily replace link to install page with link to docs URL: https://github.com/apache/airflow-site/pull/193
[GitHub] [airflow] potiuk commented on a change in pull request #6472: [AIRFLOW-6058] Running tests with pytest
potiuk commented on a change in pull request #6472: [AIRFLOW-6058] Running tests with pytest URL: https://github.com/apache/airflow/pull/6472#discussion_r350650581 ## File path: tests/pytest.ini ## @@ -0,0 +1,37 @@ +# Licensed to the Apache Software Foundation (ASF) under one +# or more contributor license agreements. See the NOTICE file +# distributed with this work for additional information +# regarding copyright ownership. The ASF licenses this file +# to you under the Apache License, Version 2.0 (the +# "License"); you may not use this file except in compliance +# with the License. You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, +# software distributed under the License is distributed on an +# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY +# KIND, either express or implied. See the License for the +# specific language governing permissions and limitations +# under the License. + +[pytest] +addopts = +-rasl +--cov=airflow/ +--cov-report html:airflow/www/static/coverage/ +--ignore=tests/dags/test_dag_serialization.py +;This will treat all tests as flaky +;--force-flaky +norecursedirs = +tests/dags_with_system_exit +tests/test_utils +tests/dags_corrupted +faulthandler_timeout=180 +log_print = True +log_level = INFO +filterwarnings = +ignore::DeprecationWarning +ignore::PendingDeprecationWarning +ignore::RuntimeWarning Review comment: I am all ok for filtering Deprecation warnings but only when we run regular PRs in CI. If we run tests locally, we should still see the warnings (that encourages to fix the warnings). I think there is very little value in warnings displayed in CI for regular PRs. Usually those will be the "other" warnings that you are not really interested in. 
Maybe, what could be done instead - we have the CRON build run daily where we already have a more comprehensive tests (for example we build the whole Docker image from the scratch and we always run Kubernetes tests no matter if any of kubernetes files changed). Maybe we could change it that Deprecation warnings are still displayed when you run tests locally and when you run CRON CI job, but not when you run regular PR. This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: us...@infra.apache.org With regards, Apache Git Services
[GitHub] [airflow] feluelle commented on a change in pull request #6666: [AIRFLOW-6069] Python host version in travis is set to 3.6 always
feluelle commented on a change in pull request #: [AIRFLOW-6069] Python host version in travis is set to 3.6 always URL: https://github.com/apache/airflow/pull/#discussion_r350655874 ## File path: .travis.yml ## @@ -42,29 +41,25 @@ jobs: - name: "Build documentation" env: >- PYTHON_VERSION=3.6 - python: "3.6" stage: pre-test script: ./scripts/ci/ci_docs.sh - name: "Tests postgres python 3.6" env: >- BACKEND=postgres ENV=docker PYTHON_VERSION=3.6 - python: "3.6" stage: test - name: "Tests sqlite python 3.6" env: BACKEND=sqlite ENV=docker PYTHON_VERSION=3.6 - python: "3.6" stage: test - name: "Tests mysql python 3.7" env: BACKEND=mysql ENV=docker PYTHON_VERSION=3.7 - python: "3.7" Review comment: So is this the only major difference: that this will run on a Python 3.6 host in a Python 3.7 Docker env?
[GitHub] [airflow-site] kgabryje opened a new pull request #194: temporarily hide unfinished content
kgabryje opened a new pull request #194: temporarily hide unfinished content URL: https://github.com/apache/airflow-site/pull/194
[GitHub] [airflow] feluelle commented on a change in pull request #6472: [AIRFLOW-6058] Running tests with pytest
feluelle commented on a change in pull request #6472: [AIRFLOW-6058] Running tests with pytest URL: https://github.com/apache/airflow/pull/6472#discussion_r350658156 ## File path: tests/pytest.ini ## @@ -0,0 +1,37 @@ +# Licensed to the Apache Software Foundation (ASF) under one +# or more contributor license agreements. See the NOTICE file +# distributed with this work for additional information +# regarding copyright ownership. The ASF licenses this file +# to you under the Apache License, Version 2.0 (the +# "License"); you may not use this file except in compliance +# with the License. You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, +# software distributed under the License is distributed on an +# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY +# KIND, either express or implied. See the License for the +# specific language governing permissions and limitations +# under the License. + +[pytest] +addopts = +-rasl +--cov=airflow/ +--cov-report html:airflow/www/static/coverage/ +--ignore=tests/dags/test_dag_serialization.py +;This will treat all tests as flaky +;--force-flaky +norecursedirs = +tests/dags_with_system_exit +tests/test_utils +tests/dags_corrupted +faulthandler_timeout=180 +log_print = True +log_level = INFO +filterwarnings = +ignore::DeprecationWarning +ignore::PendingDeprecationWarning +ignore::RuntimeWarning Review comment: I think we should differentiate between `DeprecationWarning`s in our Airflow code base and those from external modules. IMO we should always try to fix deprecation warnings in our code base, and we can ignore the external ones.
[GitHub] [airflow] feluelle commented on a change in pull request #6650: [AIRFLOW-6051] Make DAG optional during displaying the log
feluelle commented on a change in pull request #6650: [AIRFLOW-6051] Make DAG optional during displaying the log URL: https://github.com/apache/airflow/pull/6650#discussion_r350660913 ## File path: airflow/www/views.py ## @@ -578,7 +578,8 @@ def _get_logs_with_metadata(try_number, metadata): try: if ti is not None: dag = dagbag.get_dag(dag_id) -ti.task = dag.get_task(ti.task_id) +if dag: Review comment: I got it. 👍
[jira] [Created] (AIRFLOW-6072) aws_hook: Ability to set outbound proxy
Bjorn Olsen created AIRFLOW-6072: Summary: aws_hook: Ability to set outbound proxy Key: AIRFLOW-6072 URL: https://issues.apache.org/jira/browse/AIRFLOW-6072 Project: Apache Airflow Issue Type: Improvement Components: aws Affects Versions: 1.10.6 Reporter: Bjorn Olsen Assignee: Bjorn Olsen The boto3 connection used by aws_hook does not respect outbound http_proxy settings (even if these are set system-wide). The way to configure a proxy is to pass a botocore.config.Config object to boto3 when creating a client (according to this SO post). [https://stackoverflow.com/questions/33480108/how-do-you-use-an-http-https-proxy-with-boto3] While the aws_hook get_client_type() method is used extensively by AWS Operators, the "config" argument is not used by any operator. Adding a check in aws_hook for "config" in the "extra_config" of the Airflow Connection could allow us to pass kwargs there that build the Config object automatically when the hook is created. Otherwise we have to update every AWS Operator to also take a "config" parameter. -- This message was sent by Atlassian Jira (v8.3.4#803005)
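A minimal sketch of what the proposed hook change could look like: parse an optional entry from the connection's extra JSON and turn it into kwargs for botocore.config.Config. The key name "config_kwargs" and the helper function are hypothetical illustrations, not an existing aws_hook API.

```python
import json


def extract_botocore_config_kwargs(extra_json: str) -> dict:
    """Hypothetical helper: pull Config kwargs out of an Airflow
    connection's 'extra' field so the hook can build the client config."""
    extra = json.loads(extra_json) if extra_json else {}
    return extra.get("config_kwargs", {})


extra = '{"config_kwargs": {"proxies": {"https": "http://proxy.example.com:8080"}}}'
kwargs = extract_botocore_config_kwargs(extra)
print(kwargs["proxies"]["https"])  # http://proxy.example.com:8080

# The hook could then do something like (boto3/botocore assumed available):
#   from botocore.config import Config
#   client = boto3.client("s3", config=Config(**kwargs))
```

Building the Config inside the hook from the connection extras is what avoids touching every AWS operator, since operators never see the Config object directly.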
[GitHub] [airflow] potiuk commented on a change in pull request #6666: [AIRFLOW-6069] Python host version in travis is set to 3.6 always
potiuk commented on a change in pull request #: [AIRFLOW-6069] Python host version in travis is set to 3.6 always URL: https://github.com/apache/airflow/pull/#discussion_r350662057 ## File path: .travis.yml ## @@ -42,29 +41,25 @@ jobs: - name: "Build documentation" env: >- PYTHON_VERSION=3.6 - python: "3.6" stage: pre-test script: ./scripts/ci/ci_docs.sh - name: "Tests postgres python 3.6" env: >- BACKEND=postgres ENV=docker PYTHON_VERSION=3.6 - python: "3.6" stage: test - name: "Tests sqlite python 3.6" env: BACKEND=sqlite ENV=docker PYTHON_VERSION=3.6 - python: "3.6" stage: test - name: "Tests mysql python 3.7" env: BACKEND=mysql ENV=docker PYTHON_VERSION=3.7 - python: "3.7" Review comment: Exactly. This makes a difference, for example, for static checks which do not use a container (for example the debug statements check). It's not relevant for the code tests/mypy/pylint/flake8 (they all use containers with PYTHON_VERSION). BTW, see here for all the tests with "Breeze" * -> those all use PYTHON_VERSION and containers: https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst#supported-pre-commit-hooks
[GitHub] [airflow] feluelle commented on a change in pull request #6472: [AIRFLOW-6058] Running tests with pytest
feluelle commented on a change in pull request #6472: [AIRFLOW-6058] Running tests with pytest URL: https://github.com/apache/airflow/pull/6472#discussion_r350661917 ## File path: tests/pytest.ini ## @@ -0,0 +1,37 @@ +# Licensed to the Apache Software Foundation (ASF) under one +# or more contributor license agreements. See the NOTICE file +# distributed with this work for additional information +# regarding copyright ownership. The ASF licenses this file +# to you under the Apache License, Version 2.0 (the +# "License"); you may not use this file except in compliance +# with the License. You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, +# software distributed under the License is distributed on an +# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY +# KIND, either express or implied. See the License for the +# specific language governing permissions and limitations +# under the License. + +[pytest] +addopts = +-rasl +--cov=airflow/ +--cov-report html:airflow/www/static/coverage/ +--ignore=tests/dags/test_dag_serialization.py +;This will treat all tests as flaky +;--force-flaky +norecursedirs = +tests/dags_with_system_exit +tests/test_utils +tests/dags_corrupted +faulthandler_timeout=180 +log_print = True +log_level = INFO +filterwarnings = +ignore::DeprecationWarning +ignore::PendingDeprecationWarning +ignore::RuntimeWarning Review comment: > And we can ignore the external ones. Because sometimes those `DeprecationWarnings` of external modules cannot be fixed by us.
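The idea of treating in-house and third-party deprecation warnings differently can be sketched with the standard library's warnings filters. The "airflow" module pattern below is illustrative, not the project's actual configuration:

```python
import warnings

# Silence DeprecationWarnings in general (e.g. from third-party modules),
# but escalate those raised from our own code base to errors so they get
# fixed rather than silently accumulate.
warnings.resetwarnings()
warnings.simplefilter("ignore", DeprecationWarning)
warnings.filterwarnings("error", category=DeprecationWarning, module=r"airflow")

# A warning raised from anywhere else (here: __main__) is silently ignored.
warnings.warn("old third-party API", DeprecationWarning)
print("external deprecation warning ignored")
```

In pytest the same filters can be expressed under the `filterwarnings =` ini option using the `action::Category:module_regex` syntax, with the stricter entry listed after the blanket ignore since pytest gives precedence to later filters.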
[jira] [Updated] (AIRFLOW-6072) aws_hook: Ability to set outbound proxy
[ https://issues.apache.org/jira/browse/AIRFLOW-6072?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Bjorn Olsen updated AIRFLOW-6072: - Priority: Minor (was: Major) > aws_hook: Ability to set outbound proxy > --- > > Key: AIRFLOW-6072 > URL: https://issues.apache.org/jira/browse/AIRFLOW-6072 > Project: Apache Airflow > Issue Type: Improvement > Components: aws >Affects Versions: 1.10.6 >Reporter: Bjorn Olsen >Assignee: Bjorn Olsen >Priority: Minor > > The boto3 connection used by aws_hook does not respect outbound http_proxy > settings (even if these are set in system wide). > > The way to configure a proxy is to pass a botocore.config.Config object to > boto3 when creating a client (according to this SO post). > [https://stackoverflow.com/questions/33480108/how-do-you-use-an-http-https-proxy-with-boto3] > While the aws_hook get_client_type() method is used extensively by AWS > Operators, the "config" argument is not used by any operator. > Adding a check to aws_hook for "config" in the "extra_config" of the Airflow > Connection, could allow us to pass kwargs there that build the Config object > automatically by the hook is created. > Otherwise we have to update every AWS Operator to also take a "config" > parameter.
[GitHub] [airflow] stale[bot] commented on issue #5517: [AIRFLOW-4292] Cleanup and improve SLA code
stale[bot] commented on issue #5517: [AIRFLOW-4292] Cleanup and improve SLA code URL: https://github.com/apache/airflow/pull/5517#issuecomment-558572967 This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.
[GitHub] [airflow-site] kgabryje opened a new pull request #195: [depends on #194] Page 404
kgabryje opened a new pull request #195: [depends on #194] Page 404 URL: https://github.com/apache/airflow-site/pull/195
[jira] [Commented] (AIRFLOW-6069) Host Travis build should use always python 3.6
[ https://issues.apache.org/jira/browse/AIRFLOW-6069?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16982351#comment-16982351 ] ASF GitHub Bot commented on AIRFLOW-6069: - potiuk commented on pull request #: [AIRFLOW-6069] Python host version in travis is set to 3.6 always URL: https://github.com/apache/airflow/pull/ > Host Travis build should use always python 3.6 > -- > > Key: AIRFLOW-6069 > URL: https://issues.apache.org/jira/browse/AIRFLOW-6069 > Project: Apache Airflow > Issue Type: Improvement > Components: ci >Affects Versions: 2.0.0, 1.10.6 >Reporter: Jarek Potiuk >Priority: Major > > Host python version for Travis can be standardized to python 3.6 - which will > make the pre-commit checks and scripts more robust. PYTHON_VERSION sets the > version used inside CI containers to run the tests instead.
[GitHub] [airflow] potiuk merged pull request #6666: [AIRFLOW-6069] Python host version in travis is set to 3.6 always
potiuk merged pull request #: [AIRFLOW-6069] Python host version in travis is set to 3.6 always URL: https://github.com/apache/airflow/pull/
[jira] [Commented] (AIRFLOW-6069) Host Travis build should use always python 3.6
[ https://issues.apache.org/jira/browse/AIRFLOW-6069?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16982352#comment-16982352 ] ASF subversion and git services commented on AIRFLOW-6069: -- Commit c3358524c4943909eaea8884fe199adc0c8dd908 in airflow's branch refs/heads/master from Jarek Potiuk [ https://gitbox.apache.org/repos/asf?p=airflow.git;h=c335852 ] [AIRFLOW-6069] Python host version in travis is set to 3.6 always (#) This will make the scripts more "stable" - no problems with features missing in 3.5 for host scripts. Python version for all tests in container is controlled via PYTHON_VERSION variable. > Host Travis build should use always python 3.6 > -- > > Key: AIRFLOW-6069 > URL: https://issues.apache.org/jira/browse/AIRFLOW-6069 > Project: Apache Airflow > Issue Type: Improvement > Components: ci >Affects Versions: 2.0.0, 1.10.6 >Reporter: Jarek Potiuk >Priority: Major > > Host python version for Travis can be standardized to python 3.6 - which will > make the pre-commit checks and scripts more robust. PYTHON_VERSION sets the > version used inside CI containers to run the tests instead.
[GitHub] [airflow-site] mik-laj merged pull request #189: Update copy of website
mik-laj merged pull request #189: Update copy of website URL: https://github.com/apache/airflow-site/pull/189
[airflow-site] branch aip-11 updated (943783f -> 2d231f4)
This is an automated email from the ASF dual-hosted git repository. kamilbregula pushed a change to branch aip-11 in repository https://gitbox.apache.org/repos/asf/airflow-site.git. from 943783f Set hollow button background to white (#188) add 2d231f4 Update copy of website (#189) No new revisions were added by this update. Summary of changes: .../site/assets/scss/_text-with-icon.scss | 2 +- .../site/content/en/community/_index.html | 125 - landing-pages/site/layouts/community/list.html | 3 +- landing-pages/site/layouts/index.html | 2 +- landing-pages/site/layouts/meetups/list.html | 8 +- .../site/layouts/partials/text-with-icon.html | 4 +- .../site/layouts/shortcodes/accordion.html | 8 +- 7 files changed, 85 insertions(+), 67 deletions(-)
[GitHub] [airflow-site] mik-laj merged pull request #190: Add new blog post and remove dummy data
mik-laj merged pull request #190: Add new blog post and remove dummy data URL: https://github.com/apache/airflow-site/pull/190
[airflow-site] branch aip-11 updated (2d231f4 -> e96522b)
This is an automated email from the ASF dual-hosted git repository. kamilbregula pushed a change to branch aip-11 in repository https://gitbox.apache.org/repos/asf/airflow-site.git. from 2d231f4 Update copy of website (#189) add e96522b Add new blog post and remove dummy data (#190) No new revisions were added by this update. Summary of changes: .../site/assets/scss/_markdown-content.scss| 1 + landing-pages/site/content/en/blog/Grumpy-cat.md | 47 -- .../blog/Its-a-breeze-to-develop-apache-airflow.md | 18 + .../Its-a-breeze-to-develop-apache-airflow2.md | 38 - .../Its-a-breeze-to-develop-apache-airflow3.md | 38 - .../Its-a-breeze-to-develop-apache-airflow4.md | 38 - .../Its-a-breeze-to-develop-apache-airflow5.md | 38 - ...9-thoughts-and-insights-by-airflow-commiters.md | 14 +++ 8 files changed, 33 insertions(+), 199 deletions(-) delete mode 100644 landing-pages/site/content/en/blog/Grumpy-cat.md create mode 100644 landing-pages/site/content/en/blog/Its-a-breeze-to-develop-apache-airflow.md delete mode 100644 landing-pages/site/content/en/blog/Its-a-breeze-to-develop-apache-airflow2.md delete mode 100644 landing-pages/site/content/en/blog/Its-a-breeze-to-develop-apache-airflow3.md delete mode 100644 landing-pages/site/content/en/blog/Its-a-breeze-to-develop-apache-airflow4.md delete mode 100644 landing-pages/site/content/en/blog/Its-a-breeze-to-develop-apache-airflow5.md create mode 100644 landing-pages/site/content/en/blog/apache-con-europe-2019-thoughts-and-insights-by-airflow-commiters.md
[GitHub] [airflow] codecov-io commented on issue #6472: [AIRFLOW-6058] Running tests with pytest
codecov-io commented on issue #6472: [AIRFLOW-6058] Running tests with pytest URL: https://github.com/apache/airflow/pull/6472#issuecomment-558580149 # [Codecov](https://codecov.io/gh/apache/airflow/pull/6472?src=pr&el=h1) Report > Merging [#6472](https://codecov.io/gh/apache/airflow/pull/6472?src=pr&el=desc) into [master](https://codecov.io/gh/apache/airflow/commit/172b46df183df9563efdc767f63653d76ddfe335?src=pr&el=desc) will **increase** coverage by `0.97%`. > The diff coverage is `57.14%`. [![Impacted file tree graph](https://codecov.io/gh/apache/airflow/pull/6472/graphs/tree.svg?width=650&token=WdLKlKHOAU&height=150&src=pr)](https://codecov.io/gh/apache/airflow/pull/6472?src=pr&el=tree) ```diff @@Coverage Diff @@ ## master#6472 +/- ## == + Coverage 83.75% 84.73% +0.97% == Files 672 673 +1 Lines 3759237610 +18 == + Hits3148431867 +383 + Misses 6108 5743 -365 ``` | [Impacted Files](https://codecov.io/gh/apache/airflow/pull/6472?src=pr&el=tree) | Coverage Δ | | |---|---|---| | [airflow/www/views.py](https://codecov.io/gh/apache/airflow/pull/6472/diff?src=pr&el=tree#diff-YWlyZmxvdy93d3cvdmlld3MucHk=) | `75.74% <ø> (-0.86%)` | :arrow_down: | | [airflow/utils/log/colored\_log.py](https://codecov.io/gh/apache/airflow/pull/6472/diff?src=pr&el=tree#diff-YWlyZmxvdy91dGlscy9sb2cvY29sb3JlZF9sb2cucHk=) | `89.58% <57.14%> (-3.6%)` | :arrow_down: | | [airflow/executors/sequential\_executor.py](https://codecov.io/gh/apache/airflow/pull/6472/diff?src=pr&el=tree#diff-YWlyZmxvdy9leGVjdXRvcnMvc2VxdWVudGlhbF9leGVjdXRvci5weQ==) | `85.71% <0%> (-14.29%)` | :arrow_down: | | [...low/ti\_deps/deps/exec\_date\_after\_start\_date\_dep.py](https://codecov.io/gh/apache/airflow/pull/6472/diff?src=pr&el=tree#diff-YWlyZmxvdy90aV9kZXBzL2RlcHMvZXhlY19kYXRlX2FmdGVyX3N0YXJ0X2RhdGVfZGVwLnB5) | `80% <0%> (-10%)` | :arrow_down: | | [airflow/utils/log/es\_task\_handler.py](https://codecov.io/gh/apache/airflow/pull/6472/diff?src=pr&el=tree#diff-YWlyZmxvdy91dGlscy9sb2cvZXNfdGFza19oYW5kbGVyLnB5) | 
`88.07% <0%> (-3.67%)` | :arrow_down: | | [airflow/jobs/backfill\_job.py](https://codecov.io/gh/apache/airflow/pull/6472/diff?src=pr&el=tree#diff-YWlyZmxvdy9qb2JzL2JhY2tmaWxsX2pvYi5weQ==) | `89.9% <0%> (-1.53%)` | :arrow_down: | | [airflow/utils/log/s3\_task\_handler.py](https://codecov.io/gh/apache/airflow/pull/6472/diff?src=pr&el=tree#diff-YWlyZmxvdy91dGlscy9sb2cvczNfdGFza19oYW5kbGVyLnB5) | `98.5% <0%> (-1.5%)` | :arrow_down: | | [airflow/executors/base\_executor.py](https://codecov.io/gh/apache/airflow/pull/6472/diff?src=pr&el=tree#diff-YWlyZmxvdy9leGVjdXRvcnMvYmFzZV9leGVjdXRvci5weQ==) | `95.65% <0%> (-1.45%)` | :arrow_down: | | [airflow/bin/airflow](https://codecov.io/gh/apache/airflow/pull/6472/diff?src=pr&el=tree#diff-YWlyZmxvdy9iaW4vYWlyZmxvdw==) | `84.61% <0%> (ø)` | | | ... and [26 more](https://codecov.io/gh/apache/airflow/pull/6472/diff?src=pr&el=tree-more) | | -- [Continue to review full report at Codecov](https://codecov.io/gh/apache/airflow/pull/6472?src=pr&el=continue). > **Legend** - [Click here to learn more](https://docs.codecov.io/docs/codecov-delta) > `Δ = absolute (impact)`, `ø = not affected`, `? = missing data` > Powered by [Codecov](https://codecov.io/gh/apache/airflow/pull/6472?src=pr&el=footer). Last update [172b46d...e578180](https://codecov.io/gh/apache/airflow/pull/6472?src=pr&el=lastupdated). Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
[GitHub] [airflow-site] mik-laj merged pull request #192: Move Aizhamal and Kevin from Commiters to PMCs
mik-laj merged pull request #192: Move Aizhamal and Kevin from Commiters to PMCs URL: https://github.com/apache/airflow-site/pull/192
[airflow-site] branch aip-11 updated (e96522b -> 08bb23d)
This is an automated email from the ASF dual-hosted git repository. kamilbregula pushed a change to branch aip-11 in repository https://gitbox.apache.org/repos/asf/airflow-site.git. from e96522b Add new blog post and remove dummy data (#190) add 08bb23d Move Aizhamal and Kevin from Commiters to PMCs (#192) No new revisions were added by this update. Summary of changes: landing-pages/site/data/commiters.json | 12 landing-pages/site/data/pmc.json | 12 2 files changed, 12 insertions(+), 12 deletions(-)
[GitHub] [airflow] feluelle commented on a change in pull request #6659: [AIRFLOW-6063] Remove astroid dependency
feluelle commented on a change in pull request #6659: [AIRFLOW-6063] Remove astroid dependency URL: https://github.com/apache/airflow/pull/6659#discussion_r350679751 ## File path: setup.py ## @@ -277,7 +277,6 @@ def write_version(filename: str = os.path.join(*["airflow", "git_version"])): # DEPENDENCIES_EPOCH_NUMBER in the Dockerfile devel = [ -'astroid~=2.2.5', # to be removed after pylint solves this: https://github.com/PyCQA/pylint/issues/3123 Review comment: I wanted to create a separate PR for that due to a huge amount of changes when upgrading pylint to >= 2.4
[GitHub] [airflow] potiuk commented on issue #6667: Clarified a grammatically incorrect sentence
potiuk commented on issue #6667: Clarified a grammatically incorrect sentence URL: https://github.com/apache/airflow/pull/6667#issuecomment-558585935 Thanks @daphshez !
[GitHub] [airflow] potiuk merged pull request #6667: Clarified a grammatically incorrect sentence
potiuk merged pull request #6667: Clarified a grammatically incorrect sentence URL: https://github.com/apache/airflow/pull/6667
[GitHub] [airflow] potiuk merged pull request #6659: [AIRFLOW-6063] Remove astroid dependency
potiuk merged pull request #6659: [AIRFLOW-6063] Remove astroid dependency URL: https://github.com/apache/airflow/pull/6659
[jira] [Commented] (AIRFLOW-6063) Remove astroid dependency
[ https://issues.apache.org/jira/browse/AIRFLOW-6063?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16982378#comment-16982378 ] ASF GitHub Bot commented on AIRFLOW-6063: - potiuk commented on pull request #6659: [AIRFLOW-6063] Remove astroid dependency URL: https://github.com/apache/airflow/pull/6659 > Remove astroid dependency > - > > Key: AIRFLOW-6063 > URL: https://issues.apache.org/jira/browse/AIRFLOW-6063 > Project: Apache Airflow > Issue Type: Improvement > Components: ci >Affects Versions: 2.0.0, 1.10.6 >Reporter: Felix Uellendall >Assignee: Felix Uellendall >Priority: Major > > Note this dependency was only meant to temporarily fix an issue with pylint > and astroid. > See https://github.com/PyCQA/pylint/issues/3123 for more information.
[GitHub] [airflow] potiuk commented on a change in pull request #6659: [AIRFLOW-6063] Remove astroid dependency
potiuk commented on a change in pull request #6659: [AIRFLOW-6063] Remove astroid dependency URL: https://github.com/apache/airflow/pull/6659#discussion_r350683949 ## File path: setup.py ## @@ -277,7 +277,6 @@ def write_version(filename: str = os.path.join(*["airflow", "git_version"])): # DEPENDENCIES_EPOCH_NUMBER in the Dockerfile devel = [ -'astroid~=2.2.5', # to be removed after pylint solves this: https://github.com/PyCQA/pylint/issues/3123 Review comment: OK then :).
[jira] [Resolved] (AIRFLOW-6063) Remove astroid dependency
[ https://issues.apache.org/jira/browse/AIRFLOW-6063?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Jarek Potiuk resolved AIRFLOW-6063. --- Fix Version/s: 2.0.0 Resolution: Fixed > Remove astroid dependency > - > > Key: AIRFLOW-6063 > URL: https://issues.apache.org/jira/browse/AIRFLOW-6063 > Project: Apache Airflow > Issue Type: Improvement > Components: ci >Affects Versions: 2.0.0, 1.10.6 >Reporter: Felix Uellendall >Assignee: Felix Uellendall >Priority: Major > Fix For: 2.0.0 > > > Note this dependency was only meant to temporarily fix an issue with pylint > and astroid. > See https://github.com/PyCQA/pylint/issues/3123 for more information.
[jira] [Commented] (AIRFLOW-6063) Remove astroid dependency
[ https://issues.apache.org/jira/browse/AIRFLOW-6063?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16982379#comment-16982379 ] ASF subversion and git services commented on AIRFLOW-6063: -- Commit e51e1c770dad235b3fd8fdc330e44b83df8dcc4a in airflow's branch refs/heads/master from Felix Uellendall [ https://gitbox.apache.org/repos/asf?p=airflow.git;h=e51e1c7 ] [AIRFLOW-6063] Remove astroid dependency (#6659) Note this dependency was only meant to temporarily fix an issue with pylint and astroid. See https://github.com/PyCQA/pylint/issues/3123 for more information. > Remove astroid dependency > - > > Key: AIRFLOW-6063 > URL: https://issues.apache.org/jira/browse/AIRFLOW-6063 > Project: Apache Airflow > Issue Type: Improvement > Components: ci >Affects Versions: 2.0.0, 1.10.6 >Reporter: Felix Uellendall >Assignee: Felix Uellendall >Priority: Major > > Note this dependency was only meant to temporarily fix an issue with pylint > and astroid. > See https://github.com/PyCQA/pylint/issues/3123 for more information.
[jira] [Updated] (AIRFLOW-6063) Remove astroid dependency
[ https://issues.apache.org/jira/browse/AIRFLOW-6063?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Jarek Potiuk updated AIRFLOW-6063: -- Affects Version/s: (was: 1.10.6) > Remove astroid dependency > - > > Key: AIRFLOW-6063 > URL: https://issues.apache.org/jira/browse/AIRFLOW-6063 > Project: Apache Airflow > Issue Type: Improvement > Components: ci >Affects Versions: 2.0.0 >Reporter: Felix Uellendall >Assignee: Felix Uellendall >Priority: Major > Fix For: 2.0.0 > > > Note this dependency was only meant to temporarily fix an issue with pylint > and astroid. > See https://github.com/PyCQA/pylint/issues/3123 for more information.
[GitHub] [airflow] ashb commented on issue #6647: [AIRFLOW-6049] set propagate True when needed in airflow test
ashb commented on issue #6647: [AIRFLOW-6049] set propagate True when needed in airflow test URL: https://github.com/apache/airflow/pull/6647#issuecomment-558587771 @pingzh When/how do you get this logging config set like this?
[GitHub] [airflow] mik-laj commented on a change in pull request #6612: [AIRFLOW-6018] Display task instance in table during backfilling
mik-laj commented on a change in pull request #6612: [AIRFLOW-6018] Display task instance in table during backfilling URL: https://github.com/apache/airflow/pull/6612#discussion_r350693661 ## File path: airflow/models/taskinstance.py ## @@ -122,6 +122,9 @@ def clear_task_instances(tis, dr.start_date = timezone.utcnow() +TaskInstanceKey = Tuple[str, str, datetime, int] Review comment: I will add documentation, but in the other places I would prefer not to fix it for now. Jarek is currently working on this, and checking each place thoroughly is very difficult.
[GitHub] [airflow] mik-laj commented on a change in pull request #6612: [AIRFLOW-6018] Display task instance in table during backfilling
mik-laj commented on a change in pull request #6612: [AIRFLOW-6018] Display task instance in table during backfilling URL: https://github.com/apache/airflow/pull/6612#discussion_r350694826 ## File path: airflow/models/taskinstance.py ## @@ -122,6 +122,9 @@ def clear_task_instances(tis, dr.start_date = timezone.utcnow() +TaskInstanceKey = Tuple[str, str, datetime, int] Review comment: We want to introduce a named tuple, but first we have to solve the problems with cyclic imports.
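The plain `Tuple` alias above and the named tuple the reviewers want can be contrasted with a short sketch. This is illustrative only: the field names below are assumptions inferred from the four element types, not Airflow's eventual definition.

```python
from datetime import datetime
from typing import NamedTuple


# Hypothetical NamedTuple form of the TaskInstanceKey alias; the field names
# (dag_id, task_id, execution_date, try_number) are assumed from context.
class TaskInstanceKey(NamedTuple):
    dag_id: str
    task_id: str
    execution_date: datetime
    try_number: int


key = TaskInstanceKey("example_bash_operator", "run_this", datetime(2019, 11, 26), 1)

# Fields become self-documenting at call sites...
assert key.dag_id == "example_bash_operator"
# ...while the object still unpacks and compares like an ordinary tuple,
# so code that indexes the key positionally keeps working unchanged.
assert key == ("example_bash_operator", "run_this", datetime(2019, 11, 26), 1)
```

Because a `NamedTuple` is still a tuple, migrating from the alias would not break callers that unpack or index the key positionally.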
[jira] [Created] (AIRFLOW-6073) Move Qubole Operator Link class to qubole_operator.py
Kaxil Naik created AIRFLOW-6073: --- Summary: Move Qubole Operator Link class to qubole_operator.py Key: AIRFLOW-6073 URL: https://issues.apache.org/jira/browse/AIRFLOW-6073 Project: Apache Airflow Issue Type: Sub-task Components: contrib Affects Versions: 2.0.0 Reporter: Kaxil Naik Assignee: Kaxil Naik Fix For: 2.0.0 The OperatorLink should be in the file where Operator is defined for consistency
[GitHub] [airflow] kaxil opened a new pull request #6668: [AIRFLOW-6073] Move Qubole Operator Link class to qubole_operator.py
kaxil opened a new pull request #6668: [AIRFLOW-6073] Move Qubole Operator Link class to qubole_operator.py URL: https://github.com/apache/airflow/pull/6668 Make sure you have checked _all_ steps below. ### Jira - [x] My PR addresses the following [Airflow Jira](https://issues.apache.org/jira/browse/AIRFLOW/) issues and references them in the PR title. - https://issues.apache.org/jira/browse/AIRFLOW-6073 ### Description - [x] Here are some details about my PR, including screenshots of any UI changes: The OperatorLink should be in the file where Operator is defined for consistency ### Tests - [x] My PR adds the following unit tests __OR__ does not need testing for this extremely good reason: Current tests cover it ### Commits - [x] My commits all reference Jira issues in their subject lines, and I have squashed multiple commits if they address the same issue. In addition, my commits follow the guidelines from "[How to write a good git commit message](http://chris.beams.io/posts/git-commit/)": 1. Subject is separated from body by a blank line 1. Subject is limited to 50 characters (not including Jira issue reference) 1. Subject does not end with a period 1. Subject uses the imperative mood ("add", not "adding") 1. Body wraps at 72 characters 1. Body explains "what" and "why", not "how" ### Documentation - [x] In case of new functionality, my PR adds documentation that describes how to use it. - All the public functions and the classes in the PR contain docstrings that explain what it does - If you implement backwards incompatible changes, please leave a note in the [Updating.md](https://github.com/apache/airflow/blob/master/UPDATING.md) so we can assign it to a appropriate release This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: us...@infra.apache.org With regards, Apache Git Services
[jira] [Commented] (AIRFLOW-6073) Move Qubole Operator Link class to qubole_operator.py
[ https://issues.apache.org/jira/browse/AIRFLOW-6073?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16982407#comment-16982407 ] ASF GitHub Bot commented on AIRFLOW-6073: - kaxil commented on pull request #6668: [AIRFLOW-6073] Move Qubole Operator Link class to qubole_operator.py URL: https://github.com/apache/airflow/pull/6668 Make sure you have checked _all_ steps below. ### Jira - [x] My PR addresses the following [Airflow Jira](https://issues.apache.org/jira/browse/AIRFLOW/) issues and references them in the PR title. - https://issues.apache.org/jira/browse/AIRFLOW-6073 ### Description - [x] Here are some details about my PR, including screenshots of any UI changes: The OperatorLink should be in the file where Operator is defined for consistency ### Tests - [x] My PR adds the following unit tests __OR__ does not need testing for this extremely good reason: Current tests cover it ### Commits - [x] My commits all reference Jira issues in their subject lines, and I have squashed multiple commits if they address the same issue. In addition, my commits follow the guidelines from "[How to write a good git commit message](http://chris.beams.io/posts/git-commit/)": 1. Subject is separated from body by a blank line 1. Subject is limited to 50 characters (not including Jira issue reference) 1. Subject does not end with a period 1. Subject uses the imperative mood ("add", not "adding") 1. Body wraps at 72 characters 1. Body explains "what" and "why", not "how" ### Documentation - [x] In case of new functionality, my PR adds documentation that describes how to use it. - All the public functions and the classes in the PR contain docstrings that explain what it does - If you implement backwards incompatible changes, please leave a note in the [Updating.md](https://github.com/apache/airflow/blob/master/UPDATING.md) so we can assign it to a appropriate release This is an automated message from the Apache Git Service. 
To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: us...@infra.apache.org > Move Qubole Operator Link class to qubole_operator.py > - > > Key: AIRFLOW-6073 > URL: https://issues.apache.org/jira/browse/AIRFLOW-6073 > Project: Apache Airflow > Issue Type: Sub-task > Components: contrib >Affects Versions: 2.0.0 >Reporter: Kaxil Naik >Assignee: Kaxil Naik >Priority: Minor > Fix For: 2.0.0 > > > The OperatorLink should be in the file where Operator is defined for > consistency -- This message was sent by Atlassian Jira (v8.3.4#803005)
[GitHub] [airflow-site] kgabryje opened a new pull request #196: set hugo env to production
kgabryje opened a new pull request #196: set hugo env to production URL: https://github.com/apache/airflow-site/pull/196
[jira] [Created] (AIRFLOW-6074) Logging to Azure Blob Storage
david diaz created AIRFLOW-6074: --- Summary: Logging to Azure Blob Storage Key: AIRFLOW-6074 URL: https://issues.apache.org/jira/browse/AIRFLOW-6074 Project: Apache Airflow Issue Type: Bug Components: logging Affects Versions: 1.10.6 Reporter: david diaz Attachments: image-2019-11-26-13-20-13-271.png The template in airflow/airflow/config_templates/airflow_local_settings.py contains a hard-coded name for the wasb_container attribute in REMOTE_HANDLERS !image-2019-11-26-13-20-13-271.png! The Azure blob hook uses that name to look up the container in which to place the logs, but if it fails because that container does not exist, it logs that the container doesn't exist using the name from the environment variable REMOTE_BASE_LOG_FOLDER. Either the logging or the hard-coded value should be changed.
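One way to resolve the mismatch the report describes is to derive the container name from the configured remote log folder instead of hard-coding it, so the upload target and the error message can never disagree. The sketch below is illustrative only: the environment-variable name, default folder, and handler entry shape are assumptions, not a proposed patch.

```python
import os

# Hypothetical: read the remote log folder from configuration; the variable
# name and default value here are assumptions for illustration.
REMOTE_BASE_LOG_FOLDER = os.environ.get(
    "AIRFLOW__CORE__REMOTE_BASE_LOG_FOLDER", "wasb://airflow-logs/dags"
)

# Derive the container from the configured folder ("wasb://<container>/<path>")
# rather than hard-coding it in the template.
wasb_container = REMOTE_BASE_LOG_FOLDER.replace("wasb://", "").split("/")[0]

# Illustrative handler entry shaped like the 1.10 config template.
REMOTE_HANDLERS = {
    "task": {
        "class": "airflow.utils.log.wasb_task_handler.WasbTaskHandler",
        "wasb_container": wasb_container,
        "wasb_log_folder": REMOTE_BASE_LOG_FOLDER,
    }
}
```

With this shape, any "container does not exist" error would name the same container the handler actually writes to.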
[GitHub] [airflow] nuclearpinguin commented on a change in pull request #6472: [AIRFLOW-6058] Running tests with pytest
nuclearpinguin commented on a change in pull request #6472: [AIRFLOW-6058] Running tests with pytest URL: https://github.com/apache/airflow/pull/6472#discussion_r350710976 ## File path: tests/pytest.ini ## @@ -0,0 +1,37 @@ +# Licensed to the Apache Software Foundation (ASF) under one +# or more contributor license agreements. See the NOTICE file +# distributed with this work for additional information +# regarding copyright ownership. The ASF licenses this file +# to you under the Apache License, Version 2.0 (the +# "License"); you may not use this file except in compliance +# with the License. You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, +# software distributed under the License is distributed on an +# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY +# KIND, either express or implied. See the License for the +# specific language governing permissions and limitations +# under the License. + +[pytest] +addopts = +-rasl +--cov=airflow/ +--cov-report html:airflow/www/static/coverage/ +--ignore=tests/dags/test_dag_serialization.py +;This will treat all tests as flaky +;--force-flaky +norecursedirs = +tests/dags_with_system_exit +tests/test_utils +tests/dags_corrupted +faulthandler_timeout=180 +log_print = True +log_level = INFO +filterwarnings = +ignore::DeprecationWarning +ignore::PendingDeprecationWarning +ignore::RuntimeWarning Review comment: I agree with both of you @feluelle and @potiuk. Not filtering the deprecation warnings adds around 800 lines to the CI log. So I would do as @potiuk suggested: filter the warnings only when running on CI.
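The compromise the reviewers converge on (suppress the noisy warning categories only on CI, keep them visible locally) cannot be expressed in `pytest.ini`, since ini-style `filterwarnings` cannot branch on the environment, but it can be sketched in a `conftest.py` hook. The `CI` variable name below is an assumption for illustration.

```python
# conftest.py -- hypothetical sketch of CI-only warning suppression.
import os
import warnings


def pytest_configure(config):
    """Suppress the noisy warning categories, but only when running on CI."""
    if os.environ.get("CI") == "true":  # the "CI" variable name is an assumption
        for category in (DeprecationWarning, PendingDeprecationWarning, RuntimeWarning):
            warnings.filterwarnings("ignore", category=category)
```

Run locally, the hook does nothing, so deprecations still surface during development; on CI the log stays short.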
[GitHub] [airflow] yuqian90 commented on issue #6633: [AIRFLOW-2279] Clear tasks across DAGs if marked by ExternalTaskMarker
yuqian90 commented on issue #6633: [AIRFLOW-2279] Clear tasks across DAGs if marked by ExternalTaskMarker URL: https://github.com/apache/airflow/pull/6633#issuecomment-558607500 > @yuqian90 there is a problem with the order of your imports the test is failing due to it Thanks @OmerJog. Sorry, I missed that static check in my own testing. I'll fix the import order in a week's time, after my vacation. In the meantime, if there are any comments on the idea or the code changes, please let me know and I'll address those too.
[jira] [Updated] (AIRFLOW-6072) aws_hook: Ability to set outbound proxy
[ https://issues.apache.org/jira/browse/AIRFLOW-6072?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Bjorn Olsen updated AIRFLOW-6072: - Description: The boto3 connection used by aws_hook does not respect outbound http_proxy settings (even if these are set in system wide). The way to configure a proxy is to pass a botocore.config.Config object to boto3 when creating a client (according to this SO post). [https://stackoverflow.com/questions/33480108/how-do-you-use-an-http-https-proxy-with-boto3] While the aws_hook get_client_type() method is used extensively by AWS Operators, the "config" argument is not used by any operator. Adding a check to aws_hook for "config" in the "extra_config" of the Airflow Connection, could allow us to pass kwargs there that build the Config object automatically by the hook. Otherwise we have to update every AWS Operator to also take a "config" parameter. To set an outbound proxy is then as simple as adding this to your extra_config: #{ .. , #"config":{"proxies": { #"http": "http://myproxy:8080";, #"https": "http://myproxy:8080"}}, # ..} was: The boto3 connection used by aws_hook does not respect outbound http_proxy settings (even if these are set in system wide). The way to configure a proxy is to pass a botocore.config.Config object to boto3 when creating a client (according to this SO post). [https://stackoverflow.com/questions/33480108/how-do-you-use-an-http-https-proxy-with-boto3] While the aws_hook get_client_type() method is used extensively by AWS Operators, the "config" argument is not used by any operator. Adding a check to aws_hook for "config" in the "extra_config" of the Airflow Connection, could allow us to pass kwargs there that build the Config object automatically by the hook is created. Otherwise we have to update every AWS Operator to also take a "config" parameter. 
> aws_hook: Ability to set outbound proxy > --- > > Key: AIRFLOW-6072 > URL: https://issues.apache.org/jira/browse/AIRFLOW-6072 > Project: Apache Airflow > Issue Type: Improvement > Components: aws >Affects Versions: 1.10.6 >Reporter: Bjorn Olsen >Assignee: Bjorn Olsen >Priority: Minor > > The boto3 connection used by aws_hook does not respect outbound http_proxy > settings (even if these are set in system wide). > > The way to configure a proxy is to pass a botocore.config.Config object to > boto3 when creating a client (according to this SO post). > [https://stackoverflow.com/questions/33480108/how-do-you-use-an-http-https-proxy-with-boto3] > While the aws_hook get_client_type() method is used extensively by AWS > Operators, the "config" argument is not used by any operator. > Adding a check to aws_hook for "config" in the "extra_config" of the Airflow > Connection, could allow us to pass kwargs there that build the Config object > automatically by the hook. > Otherwise we have to update every AWS Operator to also take a "config" > parameter. > > To set an outbound proxy is then as simple as adding this to your > extra_config: > #{ .. , > #"config":{"proxies": { > #"http": "http://myproxy:8080";, > #"https": "http://myproxy:8080"}}, > # ..} -- This message was sent by Atlassian Jira (v8.3.4#803005)
[jira] [Updated] (AIRFLOW-6072) aws_hook: Ability to set outbound proxy
[ https://issues.apache.org/jira/browse/AIRFLOW-6072?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Bjorn Olsen updated AIRFLOW-6072: - Description: The boto3 connection used by aws_hook does not respect outbound http_proxy settings (even if these are set in system wide). The way to configure a proxy is to pass a botocore.config.Config object to boto3 when creating a client (according to this SO post). [https://stackoverflow.com/questions/33480108/how-do-you-use-an-http-https-proxy-with-boto3] While the aws_hook get_client_type() method is used extensively by AWS Operators, the "config" argument is not used by any operator. Adding a check to aws_hook for "config" in the "extra_config" of the Airflow Connection, could allow us to pass kwargs there that build the Config object automatically by the hook. Otherwise we have to update every AWS Operator to also take a "config" parameter. To set an outbound proxy is then as simple as adding this to your extra_config: {code:java} { .. , "config":{ "proxies": { "http": "http://myproxy:8080";, "https": "http://myproxy:8080"; }}, .. } {code} This needs to work both for the main boto3 clients that do task work, but also during the assume_role process which also uses a boto3 client. was: The boto3 connection used by aws_hook does not respect outbound http_proxy settings (even if these are set in system wide). The way to configure a proxy is to pass a botocore.config.Config object to boto3 when creating a client (according to this SO post). [https://stackoverflow.com/questions/33480108/how-do-you-use-an-http-https-proxy-with-boto3] While the aws_hook get_client_type() method is used extensively by AWS Operators, the "config" argument is not used by any operator. Adding a check to aws_hook for "config" in the "extra_config" of the Airflow Connection, could allow us to pass kwargs there that build the Config object automatically by the hook. 
Otherwise we have to update every AWS Operator to also take a "config" parameter. To set an outbound proxy is then as simple as adding this to your extra_config: #{ .. , #"config":{"proxies": { #"http": "http://myproxy:8080";, #"https": "http://myproxy:8080"}}, # ..} > aws_hook: Ability to set outbound proxy > --- > > Key: AIRFLOW-6072 > URL: https://issues.apache.org/jira/browse/AIRFLOW-6072 > Project: Apache Airflow > Issue Type: Improvement > Components: aws >Affects Versions: 1.10.6 >Reporter: Bjorn Olsen >Assignee: Bjorn Olsen >Priority: Minor > > The boto3 connection used by aws_hook does not respect outbound http_proxy > settings (even if these are set in system wide). > > The way to configure a proxy is to pass a botocore.config.Config object to > boto3 when creating a client (according to this SO post). > [https://stackoverflow.com/questions/33480108/how-do-you-use-an-http-https-proxy-with-boto3] > While the aws_hook get_client_type() method is used extensively by AWS > Operators, the "config" argument is not used by any operator. > Adding a check to aws_hook for "config" in the "extra_config" of the Airflow > Connection, could allow us to pass kwargs there that build the Config object > automatically by the hook. > Otherwise we have to update every AWS Operator to also take a "config" > parameter. > > To set an outbound proxy is then as simple as adding this to your > extra_config: > {code:java} > { .. , > "config":{ "proxies": { > "http": "http://myproxy:8080";, > "https": "http://myproxy:8080"; }}, > .. } > {code} > > This needs to work both for the main boto3 clients that do task work, but > also during the assume_role process which also uses a boto3 client. -- This message was sent by Atlassian Jira (v8.3.4#803005)
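The extras-based mechanism described above can be sketched with the standard library alone. The parsing step below is illustrative, not the actual hook code; with botocore installed, the extracted kwargs would be handed to the client as `Config(**config_kwargs)`.

```python
import json

# A connection "extra" field shaped as the Jira description suggests.
extra = (
    '{"config": {"proxies": {'
    '"http": "http://myproxy:8080", '
    '"https": "http://myproxy:8080"}}}'
)

# The hook would parse the extras and pull out the optional "config" kwargs...
config_kwargs = json.loads(extra).get("config", {})

# ...and, with botocore available, pass them when creating the client, e.g.:
#   from botocore.config import Config
#   client = boto3.client("s3", config=Config(**config_kwargs))
# Per the description, the same Config must also reach the STS client used
# during assume_role, or proxied environments will fail at that step instead.
assert config_kwargs["proxies"]["https"] == "http://myproxy:8080"
```

Keeping the whole `Config` as free-form kwargs in the extras means new botocore options (timeouts, retries, and so on) work without touching any operator.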
[GitHub] [airflow] feluelle commented on a change in pull request #6573: [AIRFLOW-5904] "Trigger DAG" should redirect to the "dag_default_view" page
feluelle commented on a change in pull request #6573: [AIRFLOW-5904] "Trigger DAG" should redirect to the "dag_default_view" page URL: https://github.com/apache/airflow/pull/6573#discussion_r350731708 ## File path: tests/www/test_views.py ## @@ -1744,12 +1744,23 @@ def setUp(self): super().setUp() self.session = Session() models.DagBag().get_dag("example_bash_operator").sync_to_db(session=self.session) +self.graph_endpoint = '/graph?dag_id=example_bash_operator' +self.trigger_url = 'origin=%2F{}%3Fdag_id%3Dexample_bash_operator' def test_trigger_dag_button_normal_exist(self): resp = self.client.get('/', follow_redirects=True) self.assertIn('/trigger?dag_id=example_bash_operator', resp.data.decode('utf-8')) self.assertIn("return confirmDeleteDag(this, 'example_bash_operator')", resp.data.decode('utf-8')) +def test_trigger_dag_default_view(self): +resp = self.client.get(self.graph_endpoint, follow_redirects=False) +self.assertIn(self.trigger_url.format('tree'), resp.data.decode('utf-8')) + +def test_trigger_dag_graph_view(self): +conf.set("webserver", "dag_default_view", "graph") Review comment: ```suggestion ``` It is safer to use the conf context manager: ```python with conf_vars({('webserver', 'dag_default_view'): 'graph'}): ... ``` so that the value will be reset afterwards. You can also use it as a function decorator `@conf_vars({('webserver', 'dag_default_view'): 'graph'})`, which in my opinion is even cleaner.
[GitHub] [airflow] OmerJog commented on issue #4751: [AIRFLOW-3607] collected trigger rule dep check per dag run
OmerJog commented on issue #4751: [AIRFLOW-3607] collected trigger rule dep check per dag run URL: https://github.com/apache/airflow/pull/4751#issuecomment-558623032 @amichai07 still there are errors: ``` 43) ERROR: test_backfill_examples_0_example_branch_operator (tests.jobs.test_backfill_job.TestBackfillJob) -- Traceback (most recent call last): /usr/local/lib/python3.6/site-packages/parameterized/parameterized.py line 518 in standalone_func return func(*(a + p.args), **p.kwargs) tests/jobs/test_backfill_job.py line 274 in test_backfill_examples job.run() airflow/jobs/base_job.py line 217 in run self._execute() airflow/utils/db.py line 68 in wrapper return func(*args, **kwargs) airflow/jobs/backfill_job.py line 766 in _execute session=session) airflow/utils/db.py line 64 in wrapper return func(*args, **kwargs) airflow/jobs/backfill_job.py line 696 in _execute_for_run_dates session=session) airflow/utils/db.py line 64 in wrapper return func(*args, **kwargs) airflow/jobs/backfill_job.py line 582 in _process_backfill_task_instances _per_task_process(task, key, ti) airflow/utils/db.py line 68 in wrapper return func(*args, **kwargs) airflow/jobs/backfill_job.py line 475 in _per_task_process verbose=self.verbose): airflow/utils/db.py line 64 in wrapper return func(*args, **kwargs) airflow/models/taskinstance.py line 603 in are_dependencies_met session=session): airflow/models/taskinstance.py line 627 in get_failed_dep_statuses dep_context): airflow/ti_deps/deps/base_ti_dep.py line 106 in get_dep_statuses yield from self._get_dep_statuses(ti, session, dep_context) airflow/ti_deps/deps/trigger_rule_dep.py line 68 in _get_dep_statuses state=State.finished() + [State.UPSTREAM_FAILED], session=session)\ AttributeError: 'list' object has no attribute 'options' >> begin captured logging << tests.jobs.test_backfill_job: INFO: *** Running example DAG: example_branch_operator tests.executors.test_executor.TestExecutor: INFO: Adding to queue: ['airflow', 'tasks', 'run', 
'example_branch_operator', 'run_this_first', '2016-01-01T00:00:00+00:00', '--pickle', '4', '-I', '--local', '--pool', 'default_pool'] - >> end captured logging << - ``` ``` == 48) FAIL: test_backfill_max_limit_check (tests.jobs.test_backfill_job.TestBackfillJob) -- Traceback (most recent call last): tests/jobs/test_backfill_job.py line 972 in test_backfill_max_limit_check self.assertEqual(3, len(dagruns)) # 2 from backfill + 1 existing AssertionError: 3 != 2 >> begin captured logging << tests.executors.test_executor.TestExecutor: INFO: Adding to queue: ['airflow', 'tasks', 'run', 'test_backfill_max_limit_check', 'leave1', '2015-12-31T23:00:00+00:00', '--local', '--pool', 'default_pool'] - >> end captured logging << - ```
[GitHub] [airflow] baolsen opened a new pull request #6669: [AIRFLOW-6072] aws_hook: Ability to set outbound proxy
baolsen opened a new pull request #6669: [AIRFLOW-6072] aws_hook: Ability to set outbound proxy URL: https://github.com/apache/airflow/pull/6669 Make sure you have checked _all_ steps below. ### Jira - [X] My PR addresses the following [Airflow Jira](https://issues.apache.org/jira/browse/AIRFLOW/) issues and references them in the PR title. For example, "\[AIRFLOW-XXX\] My Airflow PR" - https://issues.apache.org/jira/browse/AIRFLOW-XXX - In case you are fixing a typo in the documentation you can prepend your commit with \[AIRFLOW-XXX\], code changes always need a Jira issue. - In case you are proposing a fundamental code change, you need to create an Airflow Improvement Proposal ([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals)). - In case you are adding a dependency, check if the license complies with the [ASF 3rd Party License Policy](https://www.apache.org/legal/resolved.html#category-x). ### Description - [X] Here are some details about my PR, including screenshots of any UI changes: Description on JIRA ### Tests - [X] My PR adds the following unit tests __OR__ does not need testing for this extremely good reason: Not a major change ### Commits - [X] My commits all reference Jira issues in their subject lines, and I have squashed multiple commits if they address the same issue. In addition, my commits follow the guidelines from "[How to write a good git commit message](http://chris.beams.io/posts/git-commit/)": 1. Subject is separated from body by a blank line 1. Subject is limited to 50 characters (not including Jira issue reference) 1. Subject does not end with a period 1. Subject uses the imperative mood ("add", not "adding") 1. Body wraps at 72 characters 1. Body explains "what" and "why", not "how" ### Documentation - [X] In case of new functionality, my PR adds documentation that describes how to use it. 
- All the public functions and the classes in the PR contain docstrings that explain what it does - If you implement backwards incompatible changes, please leave a note in the [Updating.md](https://github.com/apache/airflow/blob/master/UPDATING.md) so we can assign it to an appropriate release There are code comments for users who need to go down the proxy rabbit hole. However, the whole AWS Hook should be documented better: the code is quite confusing, and I often resorted to adding logging/debug statements just to figure out what was going on. I'll create a new JIRA for that.
[jira] [Commented] (AIRFLOW-6072) aws_hook: Ability to set outbound proxy
[ https://issues.apache.org/jira/browse/AIRFLOW-6072?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16982468#comment-16982468 ] ASF GitHub Bot commented on AIRFLOW-6072: - baolsen commented on pull request #6669: [AIRFLOW-6072] aws_hook: Ability to set outbound proxy URL: https://github.com/apache/airflow/pull/6669 Make sure you have checked _all_ steps below. ### Jira - [X] My PR addresses the following [Airflow Jira](https://issues.apache.org/jira/browse/AIRFLOW/) issues and references them in the PR title. For example, "\[AIRFLOW-XXX\] My Airflow PR" - https://issues.apache.org/jira/browse/AIRFLOW-XXX - In case you are fixing a typo in the documentation you can prepend your commit with \[AIRFLOW-XXX\], code changes always need a Jira issue. - In case you are proposing a fundamental code change, you need to create an Airflow Improvement Proposal ([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals)). - In case you are adding a dependency, check if the license complies with the [ASF 3rd Party License Policy](https://www.apache.org/legal/resolved.html#category-x). ### Description - [X] Here are some details about my PR, including screenshots of any UI changes: Description on JIRA ### Tests - [X] My PR adds the following unit tests __OR__ does not need testing for this extremely good reason: Not a major change ### Commits - [X] My commits all reference Jira issues in their subject lines, and I have squashed multiple commits if they address the same issue. In addition, my commits follow the guidelines from "[How to write a good git commit message](http://chris.beams.io/posts/git-commit/)": 1. Subject is separated from body by a blank line 1. Subject is limited to 50 characters (not including Jira issue reference) 1. Subject does not end with a period 1. Subject uses the imperative mood ("add", not "adding") 1. Body wraps at 72 characters 1. 
Body explains "what" and "why", not "how" ### Documentation - [X] In case of new functionality, my PR adds documentation that describes how to use it. - All the public functions and the classes in the PR contain docstrings that explain what it does - If you implement backwards incompatible changes, please leave a note in the [Updating.md](https://github.com/apache/airflow/blob/master/UPDATING.md) so we can assign it to a appropriate release There are code comments for users who need to go down the proxy rabbit hole. However the whole AWS Hook should be documented better as the code is quite confusing and I often resorted to adding logging/debug statements just to figure out what was going on. I'll create a new JIRA for that. This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: us...@infra.apache.org > aws_hook: Ability to set outbound proxy > --- > > Key: AIRFLOW-6072 > URL: https://issues.apache.org/jira/browse/AIRFLOW-6072 > Project: Apache Airflow > Issue Type: Improvement > Components: aws >Affects Versions: 1.10.6 >Reporter: Bjorn Olsen >Assignee: Bjorn Olsen >Priority: Minor > > The boto3 connection used by aws_hook does not respect outbound http_proxy > settings (even if these are set in system wide). > > The way to configure a proxy is to pass a botocore.config.Config object to > boto3 when creating a client (according to this SO post). > [https://stackoverflow.com/questions/33480108/how-do-you-use-an-http-https-proxy-with-boto3] > While the aws_hook get_client_type() method is used extensively by AWS > Operators, the "config" argument is not used by any operator. > Adding a check to aws_hook for "config" in the "extra_config" of the Airflow > Connection, could allow us to pass kwargs there that build the Config object > automatically by the hook. 
> Otherwise we have to update every AWS Operator to also take a "config" > parameter. > > Setting an outbound proxy is then as simple as adding this to your > extra_config: > {code:java} > { .. , > "config": { "proxies": { > "http": "http://myproxy:8080", > "https": "http://myproxy:8080" }}, > .. } > {code} > > This needs to work both for the main boto3 clients that do task work and > during the assume_role process, which also uses a boto3 client. -- This message was sent by Atlassian Jira (v8.3.4#803005)
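The extra_config idea described in the issue can be sketched in a few lines. This is an illustrative sketch only: the helper name `config_kwargs_from_extra` is an assumption, not the actual aws_hook implementation, and botocore itself is left out so the sketch stays self-contained; the returned kwargs are what would be passed as `botocore.config.Config(**kwargs)`.

```python
import json

# Hypothetical helper (not aws_hook code): pull the "config" kwargs out of an
# Airflow connection's extra JSON so the hook could later build
# botocore.config.Config(**kwargs) from them.
def config_kwargs_from_extra(extra_json: str) -> dict:
    extra = json.loads(extra_json)
    return extra.get("config", {})

extra = json.dumps({
    "config": {
        "proxies": {
            "http": "http://myproxy:8080",
            "https": "http://myproxy:8080",
        }
    }
})

kwargs = config_kwargs_from_extra(extra)
print(kwargs["proxies"]["http"])  # http://myproxy:8080
```

Parsing the extras inside the hook, as sketched here, is what lets the proxy setting live on the Connection rather than forcing a new "config" parameter onto every AWS operator.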
[jira] [Created] (AIRFLOW-6075) AWS Hook: Improve documentation
Bjorn Olsen created AIRFLOW-6075: Summary: AWS Hook: Improve documentation Key: AIRFLOW-6075 URL: https://issues.apache.org/jira/browse/AIRFLOW-6075 Project: Apache Airflow Issue Type: Improvement Components: aws Affects Versions: 1.10.6 Reporter: Bjorn Olsen The AWS Hook logic is confusing and could do with some extra logging. This would help users to correctly set up their AWS connections for the first time.
[jira] [Commented] (AIRFLOW-203) Scheduler fails to reliably schedule tasks when many dag runs are triggered
[ https://issues.apache.org/jira/browse/AIRFLOW-203?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16982469#comment-16982469 ] Shlomi Cohen commented on AIRFLOW-203: -- Hi, I know this post is old, but three years later we face the same problem with version 1.10.5. We have one DAG and want to launch hundreds of dag runs for it (manually, with different configurations); even taking the example dag and launching 100 runs for it gets the scheduler stuck, and it needs to be restarted. Any help would be appreciated, because I have played with every possible configuration Airflow has to offer and still run into this problem. Thanks, Shlomi > Scheduler fails to reliably schedule tasks when many dag runs are triggered > --- > > Key: AIRFLOW-203 > URL: https://issues.apache.org/jira/browse/AIRFLOW-203 > Project: Apache Airflow > Issue Type: Bug > Components: scheduler >Affects Versions: 1.7.1.2 >Reporter: Sergei Iakhnin >Priority: Major > Attachments: airflow.cfg, airflow_scheduler_non_working.log, > airflow_scheduler_working.log > > > Using Airflow with Celery, Rabbitmq, and Postgres backend. Running 1 master > node and 115 worker nodes, each with 8 cores. The workflow consists of a series > of 27 tasks, some of which are nearly instantaneous and some take hours to > complete. Dag runs are manually triggered, about 3000 at a time, resulting in > roughly 75 000 tasks. > My observations are that the scheduling behaviour is extremely inconsistent, > i.e. about 1000 tasks get scheduled and executed and then no new tasks get > scheduled after that. Sometimes it is enough to restart the scheduler for new > tasks to get scheduled, sometimes the scheduler and worker services need to > be restarted multiple times to get any progress. When I look at the scheduler > output it seems to be chugging away at trying to schedule tasks with messages > like: > "2016-06-01 11:28:25,908] {base_executor.py:34} INFO - Adding to queue: > airflow run ..."
> However, these tasks do not show up in queued status on the UI and don't > actually get scheduled out to the workers (nor make it into the rabbitmq > queue, or the task_instance table). > It is unclear what may be causing this behaviour as no errors are produced > anywhere. The impact is especially high when short-running tasks are > concerned because the cluster should be able to blow through them within a > couple of minutes, but instead it takes hours of manual restarts to get > through them. > I'm happy to share logs or any other useful debug output as desired. > Thanks in advance. > Sergei.
[GitHub] [airflow] mik-laj commented on a change in pull request #6644: [AIRFLOW-6047] Simplify the logging configuration template
mik-laj commented on a change in pull request #6644: [AIRFLOW-6047] Simplify the logging configuration template URL: https://github.com/apache/airflow/pull/6644#discussion_r350742530
## File path: airflow/config_templates/airflow_local_settings.py
##
@@ -191,32 +215,6 @@
         'json_format': ELASTICSEARCH_JSON_FORMAT,
         'json_fields': ELASTICSEARCH_JSON_FIELDS
     },
-},
-}
-
-REMOTE_LOGGING = conf.getboolean('core', 'remote_logging')
-
-# Only update the handlers and loggers when CONFIG_PROCESSOR_MANAGER_LOGGER is set.
-# This is to avoid exceptions when initializing RotatingFileHandler multiple times
-# in multiple processes.
-if os.environ.get('CONFIG_PROCESSOR_MANAGER_LOGGER') == 'True':
-    DEFAULT_LOGGING_CONFIG['handlers'] \
-        .update(DEFAULT_DAG_PARSING_LOGGING_CONFIG['handlers'])
-    DEFAULT_LOGGING_CONFIG['loggers'] \
-        .update(DEFAULT_DAG_PARSING_LOGGING_CONFIG['loggers'])
-
-    # Manually create log directory for processor_manager handler as RotatingFileHandler
-    # will only create file but not the directory.
-    processor_manager_handler_config = DEFAULT_DAG_PARSING_LOGGING_CONFIG['handlers'][
-        'processor_manager']
-    directory = os.path.dirname(processor_manager_handler_config['filename'])
-    mkdirs(directory, 0o755)
+}
-
-if REMOTE_LOGGING and REMOTE_BASE_LOG_FOLDER.startswith('s3://'):
-    DEFAULT_LOGGING_CONFIG['handlers'].update(REMOTE_HANDLERS['s3'])
-elif REMOTE_LOGGING and REMOTE_BASE_LOG_FOLDER.startswith('gs://'):
-    DEFAULT_LOGGING_CONFIG['handlers'].update(REMOTE_HANDLERS['gcs'])
-elif REMOTE_LOGGING and REMOTE_BASE_LOG_FOLDER.startswith('wasb'):
-    DEFAULT_LOGGING_CONFIG['handlers'].update(REMOTE_HANDLERS['wasb'])
-elif REMOTE_LOGGING and ELASTICSEARCH_HOST:
-    DEFAULT_LOGGING_CONFIG['handlers'].update(REMOTE_HANDLERS['elasticsearch'])
+DEFAULT_LOGGING_CONFIG['handlers'].update(ELASTIC_REMOTE_HANDLERS)
Review comment: I added an else statement at the end. Does it look good to you?
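The if/elif chain removed in the diff above dispatches a remote log handler on the scheme of REMOTE_BASE_LOG_FOLDER. A minimal standalone sketch of that dispatch logic, assuming the same handler keys seen in the diff (the function name and signature are illustrative, not Airflow's API):

```python
# Sketch: pick a remote log handler key from the configured base log folder,
# mirroring the prefix checks in airflow_local_settings.py's removed if/elif
# chain. The handler names match the REMOTE_HANDLERS keys in the diff.
def select_remote_handler(remote_base_log_folder: str, elasticsearch_host: str = "") -> str:
    if remote_base_log_folder.startswith("s3://"):
        return "s3"
    if remote_base_log_folder.startswith("gs://"):
        return "gcs"
    if remote_base_log_folder.startswith("wasb"):
        return "wasb"
    if elasticsearch_host:
        return "elasticsearch"
    raise ValueError("remote logging enabled but no handler matches the configuration")

print(select_remote_handler("s3://my-bucket/airflow-logs"))  # s3
```

Collapsing this chain into the template's DEFAULT_LOGGING_CONFIG, as the PR does, trades explicit branching for a simpler config file; the raising "else" case discussed in the review comment is what surfaces misconfiguration instead of silently logging locally.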