[GitHub] [airflow] codecov-io edited a comment on issue #7163: [AIRFLOW-6542] add spark-on-k8s operator/hook/sensor

2020-02-27 Thread GitBox
codecov-io edited a comment on issue #7163: [AIRFLOW-6542] add spark-on-k8s 
operator/hook/sensor
URL: https://github.com/apache/airflow/pull/7163#issuecomment-574641714
 
 
   # [Codecov](https://codecov.io/gh/apache/airflow/pull/7163?src=pr=h1) 
Report
   > Merging 
[#7163](https://codecov.io/gh/apache/airflow/pull/7163?src=pr=desc) into 
[master](https://codecov.io/gh/apache/airflow/commit/2cc8d20fcfff64717b492ed34f2808bf4a5c85c9?src=pr=desc)
 will **decrease** coverage by `53.92%`.
   > The diff coverage is `32.52%`.
   
   [![Impacted file tree 
graph](https://codecov.io/gh/apache/airflow/pull/7163/graphs/tree.svg?width=650=WdLKlKHOAU=150=pr)](https://codecov.io/gh/apache/airflow/pull/7163?src=pr=tree)
   
   ```diff
   @@             Coverage Diff              @@
   ##           master     #7163       +/-   ##
   =============================================
   - Coverage   86.82%    32.89%    -53.93%    
   =============================================
     Files         896       899         +3    
     Lines       42635     42745       +110    
   =============================================
   - Hits        37017     14062     -22955    
   - Misses       5618     28683     +23065    
   ```
   
   
   | [Impacted 
Files](https://codecov.io/gh/apache/airflow/pull/7163?src=pr=tree) | 
Coverage Δ | |
   |---|---|---|
   | 
[airflow/models/connection.py](https://codecov.io/gh/apache/airflow/pull/7163/diff?src=pr=tree#diff-YWlyZmxvdy9tb2RlbHMvY29ubmVjdGlvbi5weQ==)
 | `53.52% <ø> (-41.55%)` | :arrow_down: |
   | 
[airflow/www/views.py](https://codecov.io/gh/apache/airflow/pull/7163/diff?src=pr=tree#diff-YWlyZmxvdy93d3cvdmlld3MucHk=)
 | `25.74% <0%> (-50.5%)` | :arrow_down: |
   | 
[.../example\_dags/example\_spark\_kubernetes\_operator.py](https://codecov.io/gh/apache/airflow/pull/7163/diff?src=pr=tree#diff-YWlyZmxvdy9wcm92aWRlcnMvY25jZi9rdWJlcm5ldGVzL2V4YW1wbGVfZGFncy9leGFtcGxlX3NwYXJrX2t1YmVybmV0ZXNfb3BlcmF0b3IucHk=)
 | `0% <0%> (ø)` | |
   | 
[airflow/www/forms.py](https://codecov.io/gh/apache/airflow/pull/7163/diff?src=pr=tree#diff-YWlyZmxvdy93d3cvZm9ybXMucHk=)
 | `93.1% <100%> (-6.9%)` | :arrow_down: |
   | 
[...flow/providers/cncf/kubernetes/hooks/kubernetes.py](https://codecov.io/gh/apache/airflow/pull/7163/diff?src=pr=tree#diff-YWlyZmxvdy9wcm92aWRlcnMvY25jZi9rdWJlcm5ldGVzL2hvb2tzL2t1YmVybmV0ZXMucHk=)
 | `28.57% <28.57%> (ø)` | |
   | 
[...viders/cncf/kubernetes/sensors/spark\_kubernetes.py](https://codecov.io/gh/apache/airflow/pull/7163/diff?src=pr=tree#diff-YWlyZmxvdy9wcm92aWRlcnMvY25jZi9rdWJlcm5ldGVzL3NlbnNvcnMvc3Bhcmtfa3ViZXJuZXRlcy5weQ==)
 | `33.33% <33.33%> (ø)` | |
   | 
[...ders/cncf/kubernetes/operators/spark\_kubernetes.py](https://codecov.io/gh/apache/airflow/pull/7163/diff?src=pr=tree#diff-YWlyZmxvdy9wcm92aWRlcnMvY25jZi9rdWJlcm5ldGVzL29wZXJhdG9ycy9zcGFya19rdWJlcm5ldGVzLnB5)
 | `39.47% <39.47%> (ø)` | |
   | 
[...low/contrib/operators/wasb\_delete\_blob\_operator.py](https://codecov.io/gh/apache/airflow/pull/7163/diff?src=pr=tree#diff-YWlyZmxvdy9jb250cmliL29wZXJhdG9ycy93YXNiX2RlbGV0ZV9ibG9iX29wZXJhdG9yLnB5)
 | `0% <0%> (-100%)` | :arrow_down: |
   | 
[...ing\_platform/example\_dags/example\_display\_video.py](https://codecov.io/gh/apache/airflow/pull/7163/diff?src=pr=tree#diff-YWlyZmxvdy9wcm92aWRlcnMvZ29vZ2xlL21hcmtldGluZ19wbGF0Zm9ybS9leGFtcGxlX2RhZ3MvZXhhbXBsZV9kaXNwbGF5X3ZpZGVvLnB5)
 | `0% <0%> (-100%)` | :arrow_down: |
   | 
[airflow/contrib/hooks/vertica\_hook.py](https://codecov.io/gh/apache/airflow/pull/7163/diff?src=pr=tree#diff-YWlyZmxvdy9jb250cmliL2hvb2tzL3ZlcnRpY2FfaG9vay5weQ==)
 | `0% <0%> (-100%)` | :arrow_down: |
   | ... and [774 
more](https://codecov.io/gh/apache/airflow/pull/7163/diff?src=pr=tree-more) 
| |
   
   --
   
   [Continue to review full report at 
Codecov](https://codecov.io/gh/apache/airflow/pull/7163?src=pr=continue).
   > **Legend** - [Click here to learn 
more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data`
   > Powered by 
[Codecov](https://codecov.io/gh/apache/airflow/pull/7163?src=pr=footer). 
Last update 
[2cc8d20...b479022](https://codecov.io/gh/apache/airflow/pull/7163?src=pr=lastupdated).
 Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Created] (AIRFLOW-6948) Remove ASCII Airflow from version command

2020-02-27 Thread Tomasz Urbaszek (Jira)
Tomasz Urbaszek created AIRFLOW-6948:


 Summary: Remove ASCII Airflow from version command
 Key: AIRFLOW-6948
 URL: https://issues.apache.org/jira/browse/AIRFLOW-6948
 Project: Apache Airflow
  Issue Type: Improvement
  Components: cli
Affects Versions: 2.0.0
Reporter: Tomasz Urbaszek






--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[GitHub] [airflow] konpap94 commented on issue #6337: [AIRFLOW-5659] - Add support for ephemeral storage on KubernetesPodOp…

2020-02-27 Thread GitBox
konpap94 commented on issue #6337: [AIRFLOW-5659] - Add support for ephemeral 
storage on KubernetesPodOp…
URL: https://github.com/apache/airflow/pull/6337#issuecomment-592023535
 
 
   




[GitHub] [airflow] roitvt commented on issue #7163: [AIRFLOW-6542] add spark-on-k8s operator/hook/sensor

2020-02-27 Thread GitBox
roitvt commented on issue #7163: [AIRFLOW-6542] add spark-on-k8s 
operator/hook/sensor
URL: https://github.com/apache/airflow/pull/7163#issuecomment-592021693
 
 
   @kaxil and @ashb I added tests and fixed the things from your last review. Could you check again?
   Thanks 




[GitHub] [airflow] roitvt commented on issue #7163: [AIRFLOW-6542] add spark-on-k8s operator/hook/sensor

2020-02-27 Thread GitBox
roitvt commented on issue #7163: [AIRFLOW-6542] add spark-on-k8s 
operator/hook/sensor
URL: https://github.com/apache/airflow/pull/7163#issuecomment-592021378
 
 
   @kaxil and @ashb I added tests and fixed the things from your last review. Could you check again?
   Thanks 




[GitHub] [airflow] zhongjiajie commented on issue #7343: [AIRFLOW-6719] Introduce pyupgrade to enforce latest syntax

2020-02-27 Thread GitBox
zhongjiajie commented on issue #7343: [AIRFLOW-6719] Introduce pyupgrade to 
enforce latest syntax
URL: https://github.com/apache/airflow/pull/7343#issuecomment-592015585
 
 
   @mik-laj Sure, but how should I do that? Submit a new PR to this repo, or create a PR against PolideaInternal:AIRFLOW-6719-pyupgrade?




[GitHub] [airflow] mik-laj commented on a change in pull request #7565: [AIRFLOW-6941][WIP] Add queries count test for create_dagrun

2020-02-27 Thread GitBox
mik-laj commented on a change in pull request #7565: [AIRFLOW-6941][WIP] Add 
queries count test for create_dagrun
URL: https://github.com/apache/airflow/pull/7565#discussion_r385176259
 
 

 ##
 File path: tests/models/test_dag.py
 ##
 @@ -1341,3 +1344,26 @@ class DAGsubclass(DAG):
 self.assertEqual(hash(dag_eq), hash(dag))
 self.assertNotEqual(hash(dag_diff_name), hash(dag))
 self.assertNotEqual(hash(dag_subclass), hash(dag))
+
+
+class TestPerformance(unittest.TestCase):
 
 Review comment:
   Work on other performance tests will start soon, and then we should plan how to mark these tests.




[jira] [Commented] (AIRFLOW-6939) Executor configuration via import path

2020-02-27 Thread ASF subversion and git services (Jira)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-6939?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17046714#comment-17046714
 ] 

ASF subversion and git services commented on AIRFLOW-6939:
--

Commit 37a8f6a91075207f56bb856661fb75b1c814f2de in airflow's branch 
refs/heads/master from Kamil Breguła
[ https://gitbox.apache.org/repos/asf?p=airflow.git;h=37a8f6a ]

[AIRFLOW-6939] Executor configuration via import path (#7563)



> Executor configuration via import path
> --
>
> Key: AIRFLOW-6939
> URL: https://issues.apache.org/jira/browse/AIRFLOW-6939
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: executors
>Affects Versions: 1.10.9
>Reporter: Kamil Bregula
>Priority: Major
> Fix For: 2.0.0
>
>




--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[GitHub] [airflow] ryw commented on a change in pull request #7553: [AIRFLOW-XXXX] Update LICENSE versions and remove old licenses

2020-02-27 Thread GitBox
ryw commented on a change in pull request #7553: [AIRFLOW-XXXX] Update LICENSE versions and remove old licenses
URL: https://github.com/apache/airflow/pull/7553#discussion_r385171624
 
 

 ##
 File path: LICENSE
 ##
 @@ -229,36 +229,22 @@ MIT licenses
 The following components are provided under the MIT License. See project link 
for details.
 The text of each license is also included at licenses/LICENSE-[project].txt.
 
-(MIT License) jquery v2.1.4 (https://jquery.org/license/)
-(MIT License) dagre-d3 v0.6.1 (https://github.com/cpettitt/dagre-d3)
+(MIT License) jquery v3.4.1 (https://jquery.org/license/)
+(MIT License) dagre-d3 v0.8.5 (https://github.com/cpettitt/dagre-d3)
 
 Review comment:
   Good catch, I was looking at the current version of dagre rather than dagre-d3.




[jira] [Commented] (AIRFLOW-6939) Executor configuration via import path

2020-02-27 Thread ASF GitHub Bot (Jira)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-6939?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17046712#comment-17046712
 ] 

ASF GitHub Bot commented on AIRFLOW-6939:
-

mik-laj commented on pull request #7563: [AIRFLOW-6939] Executor configuration 
via import path
URL: https://github.com/apache/airflow/pull/7563
 
 
   
 



> Executor configuration via import path
> --
>
> Key: AIRFLOW-6939
> URL: https://issues.apache.org/jira/browse/AIRFLOW-6939
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: executors
>Affects Versions: 1.10.9
>Reporter: Kamil Bregula
>Priority: Major
> Fix For: 2.0.0
>
>




--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[GitHub] [airflow] mik-laj merged pull request #7563: [AIRFLOW-6939] Executor configuration via import path

2020-02-27 Thread GitBox
mik-laj merged pull request #7563: [AIRFLOW-6939] Executor configuration via 
import path
URL: https://github.com/apache/airflow/pull/7563
 
 
   




[GitHub] [airflow] mik-laj commented on a change in pull request #7565: [AIRFLOW-6941][WIP] Add queries count test for create_dagrun

2020-02-27 Thread GitBox
mik-laj commented on a change in pull request #7565: [AIRFLOW-6941][WIP] Add 
queries count test for create_dagrun
URL: https://github.com/apache/airflow/pull/7565#discussion_r385172573
 
 

 ##
 File path: tests/models/test_dag.py
 ##
 @@ -1341,3 +1344,26 @@ class DAGsubclass(DAG):
 self.assertEqual(hash(dag_eq), hash(dag))
 self.assertNotEqual(hash(dag_diff_name), hash(dag))
 self.assertNotEqual(hash(dag_subclass), hash(dag))
+
+
+class TestPerformance(unittest.TestCase):
 
 Review comment:
   I don't think that's needed. These are very simple and small tests. In the future, when we introduce integration tests and tests for components, not just functions, we can do it. For me, this is a unit test, although it has slightly different assertions. Performance tests have metrics that vary with the environment; these tests do not.




[jira] [Resolved] (AIRFLOW-6939) Executor configuration via import path

2020-02-27 Thread Kamil Bregula (Jira)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-6939?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Kamil Bregula resolved AIRFLOW-6939.

Fix Version/s: 2.0.0
   Resolution: Fixed

> Executor configuration via import path
> --
>
> Key: AIRFLOW-6939
> URL: https://issues.apache.org/jira/browse/AIRFLOW-6939
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: executors
>Affects Versions: 1.10.9
>Reporter: Kamil Bregula
>Priority: Major
> Fix For: 2.0.0
>
>




--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[GitHub] [airflow] mik-laj commented on a change in pull request #7565: [AIRFLOW-6941][WIP] Add queries count test for create_dagrun

2020-02-27 Thread GitBox
mik-laj commented on a change in pull request #7565: [AIRFLOW-6941][WIP] Add 
queries count test for create_dagrun
URL: https://github.com/apache/airflow/pull/7565#discussion_r385170832
 
 

 ##
 File path: tests/models/test_dag.py
 ##
 @@ -1341,3 +1344,26 @@ class DAGsubclass(DAG):
 self.assertEqual(hash(dag_eq), hash(dag))
 self.assertNotEqual(hash(dag_diff_name), hash(dag))
 self.assertNotEqual(hash(dag_subclass), hash(dag))
+
+
+class TestPerformance(unittest.TestCase):
 
 Review comment:
   I don't think that's needed. These are very simple and small tests. In the future, when we introduce integration tests and tests for components, not just functions, we can do it. For me, this is a unit test, although it has slightly different assertions. Performance tests have metrics that vary with the environment; these tests do not.




[GitHub] [airflow] feluelle commented on issue #7536: [AIRFLOW-6918] don't use 'is' in if conditions comparing STATE

2020-02-27 Thread GitBox
feluelle commented on issue #7536: [AIRFLOW-6918] don't use 'is' in if 
conditions comparing STATE
URL: https://github.com/apache/airflow/pull/7536#issuecomment-592005586
 
 
   Can you try a rebase?




[GitHub] [airflow] tooptoop4 commented on issue #7536: [AIRFLOW-6918] don't use 'is' in if conditions comparing STATE

2020-02-27 Thread GitBox
tooptoop4 commented on issue #7536: [AIRFLOW-6918] don't use 'is' in if 
conditions comparing STATE
URL: https://github.com/apache/airflow/pull/7536#issuecomment-592000630
 
 
   @feluelle any idea why Travis is unhappy? Maybe the current code (before my PR) was always evaluating to false.
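   
   For context, a minimal sketch of the `is` vs `==` pitfall this PR addresses (illustrative only; `SUCCESS` stands in for Airflow's `State.SUCCESS`, which is a plain string):
   
   ```python
   SUCCESS = "success"  # stand-in for State.SUCCESS, which is just a str
   
   state = "".join(["succ", "ess"])  # equal in value, but a distinct str object
   print(state == SUCCESS)  # True  -- value comparison, what the fixed code uses
   print(state is SUCCESS)  # typically False -- identity comparison on strings
   ```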




[GitHub] [airflow] nuclearpinguin commented on a change in pull request #7565: [AIRFLOW-6941][WIP] Add queries count test for create_dagrun

2020-02-27 Thread GitBox
nuclearpinguin commented on a change in pull request #7565: [AIRFLOW-6941][WIP] 
Add queries count test for create_dagrun
URL: https://github.com/apache/airflow/pull/7565#discussion_r385154503
 
 

 ##
 File path: tests/models/test_dag.py
 ##
 @@ -1341,3 +1344,26 @@ class DAGsubclass(DAG):
 self.assertEqual(hash(dag_eq), hash(dag))
 self.assertNotEqual(hash(dag_diff_name), hash(dag))
 self.assertNotEqual(hash(dag_subclass), hash(dag))
+
+
+class TestPerformance(unittest.TestCase):
 
 Review comment:
   I would like to suggest adding a new `performance` marker. Then we don't have to run this test multiple times. What do you think, @mik-laj?
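   
   A sketch of what that could look like (the marker name follows the suggestion above; the registration and CI invocations are assumptions):
   
   ```python
   import pytest
   
   
   @pytest.mark.performance  # hypothetical marker, registered in pytest.ini
   class TestPerformance:
       def test_queries_count_for_create_dagrun(self):
           ...
   
   # CI could then split the suites, e.g.:
   #   pytest -m performance          # only the performance tests
   #   pytest -m "not performance"    # everything else
   ```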




[GitHub] [airflow] ashb commented on a change in pull request #7559: [AIRFLOW-6935] Pass the SchedulerJob configuration using the constructor

2020-02-27 Thread GitBox
ashb commented on a change in pull request #7559: [AIRFLOW-6935] Pass the 
SchedulerJob configuration using the constructor
URL: https://github.com/apache/airflow/pull/7559#discussion_r385146211
 
 

 ##
 File path: airflow/jobs/scheduler_job.py
 ##
 @@ -991,10 +1057,13 @@ def is_alive(self, grace_multiplier=None):
 if grace_multiplier is not None:
 # Accept the same behaviour as superclass
 return super().is_alive(grace_multiplier=grace_multiplier)
-scheduler_health_check_threshold = conf.getint('scheduler', 
'scheduler_health_check_threshold')
+# The object can be retrieved from the database, so it does not 
contain all the attributes.
+self.scheduler_health_check_threshold = conf.getint(
 
 Review comment:
   None of that changes the fact that you have created a `scheduler_health_check_threshold` constructor kwarg that doesn't ever do anything, but looks like it does.
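   
   A condensed sketch of the pitfall being pointed out (simplified from the diff; the `conf` import is as in the real module):
   
   ```python
   from airflow.configuration import conf
   
   
   class SchedulerJob:  # simplified
       def __init__(self, scheduler_health_check_threshold=None):
           self.scheduler_health_check_threshold = scheduler_health_check_threshold
   
       def is_alive(self):
           # This unconditionally overwrites whatever the constructor received,
           # so the kwarg never has any effect, although it looks like it does.
           self.scheduler_health_check_threshold = conf.getint(
               'scheduler', 'scheduler_health_check_threshold')
   ```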




[GitHub] [airflow] ashb commented on a change in pull request #7559: [AIRFLOW-6935] Pass the SchedulerJob configuration using the constructor

2020-02-27 Thread GitBox
ashb commented on a change in pull request #7559: [AIRFLOW-6935] Pass the 
SchedulerJob configuration using the constructor
URL: https://github.com/apache/airflow/pull/7559#discussion_r385146089
 
 

 ##
 File path: airflow/jobs/scheduler_job.py
 ##
 @@ -991,10 +1057,13 @@ def is_alive(self, grace_multiplier=None):
 if grace_multiplier is not None:
 # Accept the same behaviour as superclass
 return super().is_alive(grace_multiplier=grace_multiplier)
-scheduler_health_check_threshold = conf.getint('scheduler', 
'scheduler_health_check_threshold')
+# The object can be retrieved from the database, so it does not 
contain all the attributes.
+self.scheduler_health_check_threshold = conf.getint(
 
 Review comment:
   None of that changes the fact that you have created a `scheduler_health_check_threshold` constructor kwarg that doesn't ever do anything, but looks like it does.




[GitHub] [airflow] zhongjiajie commented on issue #7148: [AIRFLOW-6472] Correct short option in cli

2020-02-27 Thread GitBox
zhongjiajie commented on issue #7148: [AIRFLOW-6472] Correct short option in cli
URL: https://github.com/apache/airflow/pull/7148#issuecomment-591988120
 
 
   @potiuk and @mik-laj CI is green, could you please review this PR? Don't forget we still have an open discussion at https://github.com/apache/airflow/pull/7148#issuecomment-591951660




[GitHub] [airflow] nuclearpinguin commented on a change in pull request #6576: [AIRFLOW-5922] Add option to specify the mysql client library used in MySqlHook

2020-02-27 Thread GitBox
nuclearpinguin commented on a change in pull request #6576: [AIRFLOW-5922] Add 
option to specify the mysql client library used in MySqlHook
URL: https://github.com/apache/airflow/pull/6576#discussion_r385135203
 
 

 ##
 File path: airflow/providers/mysql/hooks/mysql.py
 ##
 @@ -113,8 +107,44 @@ def get_conn(self):
 conn_config['unix_socket'] = conn.extra_dejson['unix_socket']
 if local_infile:
 conn_config["local_infile"] = 1
-conn = MySQLdb.connect(**conn_config)
-return conn
+return conn_config
+
+def _get_conn_config_mysql_connector_python(self, conn):
+conn_config = {
+'user': conn.login,
+'password': conn.password or '',
+'host': conn.host or 'localhost',
+'database': self.schema or conn.schema or '',
+'port': int(conn.port) if conn.port else 3306
+}
+
+if conn.extra_dejson.get('allow_local_infile', False):
+conn_config["allow_local_infile"] = True
+
+return conn_config
+
+def get_conn(self):
+"""
+Establishes a connection to a mysql database
+by extracting the connection configuration from the Airflow connection.
+
+.. note:: By default it connects to the database via the mysqlclient 
library.
+But you can also choose the mysql-connector-python library which 
lets you connect through ssl
+without any further ssl parameters required.
+
+:return: a mysql connection object
+"""
+conn = self.connection or self.get_connection(self.mysql_conn_id)  # 
pylint: disable=no-member
 
 Review comment:
   Why `#no-member` here?




[GitHub] [airflow] feluelle commented on a change in pull request #7541: [AIRFLOW-6822] AWS hooks should cache boto3 client

2020-02-27 Thread GitBox
feluelle commented on a change in pull request #7541: [AIRFLOW-6822] AWS hooks 
should cache boto3 client
URL: https://github.com/apache/airflow/pull/7541#discussion_r385142816
 
 

 ##
 File path: tests/providers/amazon/aws/operators/test_ecs.py
 ##
 @@ -48,12 +48,10 @@
 ]
 }
 
-
+# pylint: disable=unused-argument
+@mock.patch('airflow.providers.amazon.aws.operators.ecs.AwsBaseHook')
 
 Review comment:
   That's weird. It should work.
   
   1. `@mock.patch('airflow.providers.amazon.aws.operators.ecs.AwsBaseHook')` 
   2. `self.aws_hook_mock = aws_hook_mock`
   3.  Access the mocked hook `self.aws_hook_mock` in the tests
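   
   A minimal sketch of that pattern (class and method names are illustrative, not the actual test file):
   
   ```python
   import unittest
   from unittest import mock
   
   
   @mock.patch('airflow.providers.amazon.aws.operators.ecs.AwsBaseHook')
   class TestEcsOperator(unittest.TestCase):
       # With a class-level patch, every test method receives the mock
       # as an extra positional argument.
       def test_hook_is_mocked(self, aws_hook_mock):
           self.aws_hook_mock = aws_hook_mock  # step 2: keep a handle on it
           self.aws_hook_mock.return_value.get_conn.return_value = "fake-client"
           client = self.aws_hook_mock.return_value.get_conn()  # step 3: use it
           self.assertEqual(client, "fake-client")
   ```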




[GitHub] [airflow] codecov-io commented on issue #7547: [AIRFLOW-6926] Fix Google Tasks operators return types and idempotency

2020-02-27 Thread GitBox
codecov-io commented on issue #7547: [AIRFLOW-6926] Fix Google Tasks operators 
return types and idempotency
URL: https://github.com/apache/airflow/pull/7547#issuecomment-591984953
 
 
   # [Codecov](https://codecov.io/gh/apache/airflow/pull/7547?src=pr=h1) 
Report
   > Merging 
[#7547](https://codecov.io/gh/apache/airflow/pull/7547?src=pr=desc) into 
[master](https://codecov.io/gh/apache/airflow/commit/bb552b2d9fd595cc3eb1b3a2f637f29b814878d7?src=pr=desc)
 will **decrease** coverage by `0.31%`.
   > The diff coverage is `92.85%`.
   
   [![Impacted file tree 
graph](https://codecov.io/gh/apache/airflow/pull/7547/graphs/tree.svg?width=650=WdLKlKHOAU=150=pr)](https://codecov.io/gh/apache/airflow/pull/7547?src=pr=tree)
   
   ```diff
   @@            Coverage Diff             @@
   ##           master    #7547      +/-   ##
   ==========================================
   - Coverage   86.86%   86.54%     -0.32%   
   ==========================================
     Files         896      896              
     Lines       42638    42651       +13    
   ==========================================
   - Hits        37036    36912      -124    
   - Misses       5602     5739      +137    
   ```
   
   
   | [Impacted 
Files](https://codecov.io/gh/apache/airflow/pull/7547?src=pr=tree) | 
Coverage Δ | |
   |---|---|---|
   | 
[...oviders/google/cloud/example\_dags/example\_tasks.py](https://codecov.io/gh/apache/airflow/pull/7547/diff?src=pr=tree#diff-YWlyZmxvdy9wcm92aWRlcnMvZ29vZ2xlL2Nsb3VkL2V4YW1wbGVfZGFncy9leGFtcGxlX3Rhc2tzLnB5)
 | `100% <100%> (ø)` | :arrow_up: |
   | 
[airflow/providers/google/cloud/operators/tasks.py](https://codecov.io/gh/apache/airflow/pull/7547/diff?src=pr=tree#diff-YWlyZmxvdy9wcm92aWRlcnMvZ29vZ2xlL2Nsb3VkL29wZXJhdG9ycy90YXNrcy5weQ==)
 | `99.14% <92.59%> (-0.86%)` | :arrow_down: |
   | 
[airflow/kubernetes/volume\_mount.py](https://codecov.io/gh/apache/airflow/pull/7547/diff?src=pr=tree#diff-YWlyZmxvdy9rdWJlcm5ldGVzL3ZvbHVtZV9tb3VudC5weQ==)
 | `44.44% <0%> (-55.56%)` | :arrow_down: |
   | 
[airflow/kubernetes/volume.py](https://codecov.io/gh/apache/airflow/pull/7547/diff?src=pr=tree#diff-YWlyZmxvdy9rdWJlcm5ldGVzL3ZvbHVtZS5weQ==)
 | `52.94% <0%> (-47.06%)` | :arrow_down: |
   | 
[airflow/kubernetes/pod\_launcher.py](https://codecov.io/gh/apache/airflow/pull/7547/diff?src=pr=tree#diff-YWlyZmxvdy9rdWJlcm5ldGVzL3BvZF9sYXVuY2hlci5weQ==)
 | `47.18% <0%> (-45.08%)` | :arrow_down: |
   | 
[...viders/cncf/kubernetes/operators/kubernetes\_pod.py](https://codecov.io/gh/apache/airflow/pull/7547/diff?src=pr=tree#diff-YWlyZmxvdy9wcm92aWRlcnMvY25jZi9rdWJlcm5ldGVzL29wZXJhdG9ycy9rdWJlcm5ldGVzX3BvZC5weQ==)
 | `69.69% <0%> (-25.26%)` | :arrow_down: |
   | 
[airflow/kubernetes/refresh\_config.py](https://codecov.io/gh/apache/airflow/pull/7547/diff?src=pr=tree#diff-YWlyZmxvdy9rdWJlcm5ldGVzL3JlZnJlc2hfY29uZmlnLnB5)
 | `50.98% <0%> (-23.53%)` | :arrow_down: |
   | 
[airflow/utils/helpers.py](https://codecov.io/gh/apache/airflow/pull/7547/diff?src=pr=tree#diff-YWlyZmxvdy91dGlscy9oZWxwZXJzLnB5)
 | `71.31% <0%> (-11.3%)` | :arrow_down: |
   | 
[airflow/utils/dag\_processing.py](https://codecov.io/gh/apache/airflow/pull/7547/diff?src=pr=tree#diff-YWlyZmxvdy91dGlscy9kYWdfcHJvY2Vzc2luZy5weQ==)
 | `86.51% <0%> (-1.83%)` | :arrow_down: |
   | 
[airflow/models/\_\_init\_\_.py](https://codecov.io/gh/apache/airflow/pull/7547/diff?src=pr=tree#diff-YWlyZmxvdy9tb2RlbHMvX19pbml0X18ucHk=)
 | `90.9% <0%> (-0.4%)` | :arrow_down: |
   | ... and [22 
more](https://codecov.io/gh/apache/airflow/pull/7547/diff?src=pr=tree-more) 
| |
   
   --
   
   [Continue to review full report at 
Codecov](https://codecov.io/gh/apache/airflow/pull/7547?src=pr=continue).
   > **Legend** - [Click here to learn 
more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data`
   > Powered by 
[Codecov](https://codecov.io/gh/apache/airflow/pull/7547?src=pr=footer). 
Last update 
[bb552b2...2a1df2a](https://codecov.io/gh/apache/airflow/pull/7547?src=pr=lastupdated).
 Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   




[GitHub] [airflow] feluelle commented on a change in pull request #7541: [AIRFLOW-6822] AWS hooks should cache boto3 client

2020-02-27 Thread GitBox
feluelle commented on a change in pull request #7541: [AIRFLOW-6822] AWS hooks 
should cache boto3 client
URL: https://github.com/apache/airflow/pull/7541#discussion_r385139258
 
 

 ##
 File path: tests/providers/amazon/aws/operators/test_ecs.py
 ##
 @@ -171,7 +166,7 @@ def test_execute_with_failures(self):
 }
 )
 
-def test_wait_end_tasks(self):
+def test_wait_end_tasks(self, aws_hook_mock):
 
 Review comment:
   Okay, but can you please add the pylint disable rule only to these lines, so that other unused arguments won't be suppressed as well.
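   
   A sketch of the per-line form being asked for (the test name is from the diff; the pragma scopes the suppression to this one definition):
   
   ```python
   def test_wait_end_tasks(self, aws_hook_mock):  # pylint: disable=unused-argument
       ...
   ```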




[GitHub] [airflow] ashb commented on a change in pull request #7563: [AIRFLOW-6939] Executor configuration via import path

2020-02-27 Thread GitBox
ashb commented on a change in pull request #7563: [AIRFLOW-6939] Executor 
configuration via import path
URL: https://github.com/apache/airflow/pull/7563#discussion_r385131154
 
 

 ##
 File path: UPDATING.md
 ##
 @@ -61,6 +61,29 @@ https://developers.google.com/style/inclusive-documentation
 
 -->
 
+### Custom executors is loaded using full import path
+
+In previous versions of Airflow it was possible to use plugins to load custom 
executors. It is still
+possible, but the configuration has changed. Now you don't have to create a 
plugin to configure a
+custom executor, but you need to provide the full path to the module in the 
`executor` option
+in the `core` section. The purpose of this change is to simplify the plugin 
mechanism and make
+it easier to configure executor.
+
+If your module was in the path `my_acme_company.executors.MyCustomExecutor`  
and the plugin was
+called `my_plugin` then your configuration looks like this
+
+```ini
+[core]
+executor = my_plguin.MyCustomExecutor
 
 Review comment:
   Oh wow missed that one totally!




[GitHub] [airflow] feluelle commented on issue #6576: [AIRFLOW-5922] Add option to specify the mysql client library used in MySqlHook

2020-02-27 Thread GitBox
feluelle commented on issue #6576: [AIRFLOW-5922] Add option to specify the 
mysql client library used in MySqlHook
URL: https://github.com/apache/airflow/pull/6576#issuecomment-591974065
 
 
   It is green.




[GitHub] [airflow] ashb commented on a change in pull request #7437: [AIRFLOW-2325] Add CloudwatchTaskHandler option for remote task loggi…

2020-02-27 Thread GitBox
ashb commented on a change in pull request #7437: [AIRFLOW-2325] Add 
CloudwatchTaskHandler option for remote task loggi…
URL: https://github.com/apache/airflow/pull/7437#discussion_r385123141
 
 

 ##
 File path: airflow/utils/log/cloudwatch_task_handler.py
 ##
 @@ -0,0 +1,114 @@
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+import watchtower
+from cached_property import cached_property
+
+from airflow.configuration import conf
+from airflow.utils.log.file_task_handler import FileTaskHandler
+from airflow.utils.log.logging_mixin import LoggingMixin
+
+
+class CloudwatchTaskHandler(FileTaskHandler, LoggingMixin):
+"""
+CloudwatchTaskHandler is a python log handler that handles and reads task 
instance logs.
+
+It extends airflow FileTaskHandler and uploads to and reads from 
Cloudwatch.
+
+:param base_log_folder: base folder to store logs locally
+:type base_log_folder: str
+:param log_group_arn: ARN of the Cloudwatch log group for remote log 
storage
+:type log_group_arn: str
+:param filename_template: template for file name (local storage) or log 
stream name (remote)
+:type filename_template: str
+"""
+def __init__(self, base_log_folder, log_group_arn, filename_template):
+super().__init__(base_log_folder, filename_template)
+split_arn = log_group_arn.split(':')
 
 Review comment:
   Could you add an example ARN here in a comment? Makes it easier to 
tell/sanity check what the 4th and 7th parts are.
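   
   For reference, CloudWatch log group ARNs have a documented shape, so the requested comment could look like this (the account and region below are made up):
   
   ```python
   # Example: arn:aws:logs:us-east-1:123456789012:log-group:airflow-task-logs
   # indices:  0   1   2   3          4            5         6
   split_arn = log_group_arn.split(':')
   region = split_arn[3]          # 4th part
   log_group_name = split_arn[6]  # 7th part
   ```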




[GitHub] [airflow] ashb commented on a change in pull request #7437: [AIRFLOW-2325] Add CloudwatchTaskHandler option for remote task loggi…

2020-02-27 Thread GitBox
ashb commented on a change in pull request #7437: [AIRFLOW-2325] Add 
CloudwatchTaskHandler option for remote task loggi…
URL: https://github.com/apache/airflow/pull/7437#discussion_r382901838
 
 

 ##
 File path: setup.py
 ##
 @@ -157,6 +157,7 @@ def write_version(filename: str = 
os.path.join(*["airflow", "git_version"])):
 ]
 aws = [
 'boto3~=1.10',
+'watchtower>=0.7.3',
 
 Review comment:
   `~=0.7.3` please
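   
   For context, `~=` is PEP 440's "compatible release" operator, so the requested pin would read (a sketch of the setup.py extra shown in the diff):
   
   ```python
   aws = [
       'boto3~=1.10',
       'watchtower~=0.7.3',  # equivalent to >=0.7.3, <0.8.0
   ]
   ```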




[GitHub] [airflow] ashb commented on a change in pull request #7437: [AIRFLOW-2325] Add CloudwatchTaskHandler option for remote task loggi…

2020-02-27 Thread GitBox
ashb commented on a change in pull request #7437: [AIRFLOW-2325] Add 
CloudwatchTaskHandler option for remote task loggi…
URL: https://github.com/apache/airflow/pull/7437#discussion_r385128092
 
 

 ##
 File path: docs/howto/write-logs.rst
 ##
 @@ -115,6 +115,29 @@ To configure it, you must additionally set the endpoint 
url to point to your loc
 You can do this via the Connection Extra ``host`` field.
 For example, ``{"host": "http://localstack:4572"}``
 
+.. _write-logs-amazon-cloudwatch:
+
+Writing Logs to Amazon Cloudwatch
+-
+
+
+Enabling remote logging
+'''
+
+To enable this feature, ``airflow.cfg`` must be configured as follows:
+
+.. code-block:: ini
+
+[logging]
+# Airflow can store logs remotely in AWS Cloudwatch. Users must supply a 
log group
+# ARN (starting with 'cloudwatch://...') and an Airflow connection
+# id that provides write and read access to the log location.
+remote_logging = True
+remote_base_log_folder = cloudwatch://arn:aws:logs:::log-group::*
 
 Review comment:
   What is the `*` on the end here for? What other possible values would it 
have?




[GitHub] [airflow] boring-cyborg[bot] commented on issue #7571: [AIRFLOW-XXXX] Fix typos in best-practices.rst, yandexcloud.rst, and concepts.rst

2020-02-27 Thread GitBox
boring-cyborg[bot] commented on issue #7571: [AIRFLOW-XXXX] Fix typos in 
best-practices.rst, yandexcloud.rst, and concepts.rst
URL: https://github.com/apache/airflow/pull/7571#issuecomment-591972889
 
 
   Awesome work, congrats on your first merged pull request!
   




[GitHub] [airflow] kaxil merged pull request #7571: [AIRFLOW-XXXX] Fix typos in best-practices.rst, yandexcloud.rst, and concepts.rst

2020-02-27 Thread GitBox
kaxil merged pull request #7571: [AIRFLOW-XXXX] Fix typos in 
best-practices.rst, yandexcloud.rst, and concepts.rst
URL: https://github.com/apache/airflow/pull/7571
 
 
   




[GitHub] [airflow] ryanahamilton opened a new pull request #7571: [AIRFLOW-XXXX] Fix typos in best-practices.rst, yandexcloud.rst, and concepts.rst

2020-02-27 Thread GitBox
ryanahamilton opened a new pull request #7571: [AIRFLOW-XXXX] Fix typos in 
best-practices.rst, yandexcloud.rst, and concepts.rst
URL: https://github.com/apache/airflow/pull/7571
 
 
   ---
   Issue link: WILL BE INSERTED BY 
[boring-cyborg](https://github.com/kaxil/boring-cyborg)
   
   Make sure to mark the boxes below before creating PR: [x]
   
   - [x] Description above provides context of the change
   - [x] Commit message/PR title starts with `[AIRFLOW-NNNN]`. AIRFLOW-NNNN = JIRA ID*
   - [x] Unit tests coverage for changes (not needed for documentation changes)
   - [x] Commits follow "[How to write a good git commit 
message](http://chris.beams.io/posts/git-commit/)"
   - [x] Relevant documentation is updated including usage instructions.
   - [x] I will engage committers as explained in [Contribution Workflow 
Example](https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst#contribution-workflow-example).
   
   * For document-only changes commit message can start with `[AIRFLOW-XXXX]`.
   
   ---
   In case of fundamental code change, Airflow Improvement Proposal 
([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals))
 is needed.
   In case of a new dependency, check compliance with the [ASF 3rd Party 
License Policy](https://www.apache.org/legal/resolved.html#category-x).
   In case of backwards incompatible changes please leave a note in 
[UPDATING.md](https://github.com/apache/airflow/blob/master/UPDATING.md).
   Read the [Pull Request 
Guidelines](https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst#pull-request-guidelines)
 for more information.
   




[GitHub] [airflow] boring-cyborg[bot] commented on issue #7571: [AIRFLOW-XXXX] Fix typos in best-practices.rst, yandexcloud.rst, and concepts.rst

2020-02-27 Thread GitBox
boring-cyborg[bot] commented on issue #7571: [AIRFLOW-XXXX] Fix typos in 
best-practices.rst, yandexcloud.rst, and concepts.rst
URL: https://github.com/apache/airflow/pull/7571#issuecomment-591972078
 
 
   Congratulations on your first Pull Request and welcome to the Apache Airflow community! If you have any issues or are unsure about anything, please check our Contribution Guide (https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst)
   Here are some useful points:
   - Pay attention to the quality of your code (flake8, pylint and type 
annotations). Our [pre-commits]( 
https://github.com/apache/airflow/blob/master/STATIC_CODE_CHECKS.rst#prerequisites-for-pre-commit-hooks)
 will help you with that.
   - In case of a new feature add useful documentation (in docstrings or in the `docs/` directory). Adding a new operator? Check this short [guide](https://github.com/apache/airflow/blob/master/docs/howto/custom-operator.rst). Consider adding an example DAG that shows how users should use it.
   - Consider using the [Breeze environment](https://github.com/apache/airflow/blob/master/BREEZE.rst) for testing locally; it's a heavy Docker setup, but it ships with a working Airflow and a lot of integrations.
   - Be patient and persistent. It might take some time to get a review or get 
the final approval from Committers.
   - Be sure to read the [Airflow Coding style]( 
https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst#coding-style-and-best-practices).
   Apache Airflow is a community-driven project and together we are making it better.
   In case of doubts contact the developers at:
   Mailing List: d...@airflow.apache.org
   Slack: https://apache-airflow-slack.herokuapp.com/
   




[jira] [Assigned] (AIRFLOW-6768) Graph view rendering angular edges

2020-02-27 Thread Nathan Hadfield (Jira)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-6768?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Nathan Hadfield reassigned AIRFLOW-6768:


Assignee: Ry Walker

> Graph view rendering angular edges
> --
>
> Key: AIRFLOW-6768
> URL: https://issues.apache.org/jira/browse/AIRFLOW-6768
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: ui
>Affects Versions: 1.10.8, 1.10.9
>Reporter: Nathan Hadfield
>Assignee: Ry Walker
>Priority: Minor
> Fix For: 2.0.0, 1.10.10
>
> Attachments: Screenshot 2020-02-10 at 08.51.02.png, Screenshot 
> 2020-02-10 at 08.51.20.png
>
>
> Since the release of v1.10.8 the DAG graph view is rendering the edges 
> between nodes with angular lines rather than nice smooth curves.
> Seems to have been caused by a bump of dagre-d3.
> [https://github.com/apache/airflow/pull/7280]
> [https://github.com/dagrejs/dagre-d3/issues/305]
>  
>  



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Assigned] (AIRFLOW-6768) Graph view rendering angular edges

2020-02-27 Thread Nathan Hadfield (Jira)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-6768?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Nathan Hadfield reassigned AIRFLOW-6768:


Assignee: (was: Nathan Hadfield)

> Graph view rendering angular edges
> --
>
> Key: AIRFLOW-6768
> URL: https://issues.apache.org/jira/browse/AIRFLOW-6768
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: ui
>Affects Versions: 1.10.8, 1.10.9
>Reporter: Nathan Hadfield
>Priority: Minor
> Fix For: 2.0.0, 1.10.10
>
> Attachments: Screenshot 2020-02-10 at 08.51.02.png, Screenshot 
> 2020-02-10 at 08.51.20.png
>
>
> Since the release of v1.10.8 the DAG graph view is rendering the edges 
> between nodes with angular lines rather than nice smooth curves.
> Seems to have been caused by a bump of dagre-d3.
> [https://github.com/apache/airflow/pull/7280]
> [https://github.com/dagrejs/dagre-d3/issues/305]
>  
>  



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[GitHub] [airflow] mik-laj commented on a change in pull request #7559: [AIRFLOW-6935] Pass the SchedulerJob configuration using the constructor

2020-02-27 Thread GitBox
mik-laj commented on a change in pull request #7559: [AIRFLOW-6935] Pass the 
SchedulerJob configuration using the constructor
URL: https://github.com/apache/airflow/pull/7559#discussion_r385123975
 
 

 ##
 File path: airflow/jobs/scheduler_job.py
 ##
 @@ -991,10 +1057,13 @@ def is_alive(self, grace_multiplier=None):
 if grace_multiplier is not None:
 # Accept the same behaviour as superclass
 return super().is_alive(grace_multiplier=grace_multiplier)
-scheduler_health_check_threshold = conf.getint('scheduler', 
'scheduler_health_check_threshold')
+# The object can be retrieved from the database, so it does not 
contain all the attributes.
+self.scheduler_health_check_threshold = conf.getint(
 
 Review comment:
   Combining these two aspects in one class is also hell with circular imports.




[GitHub] [airflow] mik-laj commented on a change in pull request #7559: [AIRFLOW-6935] Pass the SchedulerJob configuration using the constructor

2020-02-27 Thread GitBox
mik-laj commented on a change in pull request #7559: [AIRFLOW-6935] Pass the 
SchedulerJob configuration using the constructor
URL: https://github.com/apache/airflow/pull/7559#discussion_r385123579
 
 

 ##
 File path: airflow/jobs/scheduler_job.py
 ##
 @@ -991,10 +1057,13 @@ def is_alive(self, grace_multiplier=None):
 if grace_multiplier is not None:
 # Accept the same behaviour as superclass
 return super().is_alive(grace_multiplier=grace_multiplier)
-scheduler_health_check_threshold = conf.getint('scheduler', 
'scheduler_health_check_threshold')
+# The object can be retrieved from the database, so it does not 
contain all the attributes.
+self.scheduler_health_check_threshold = conf.getint(
 
 Review comment:
   The problem is that we use database entities as building blocks of the application, not only as a place to store data. In most cases SchedulerJob is an application-logic object and only in rare cases a database object. I would like to separate the two, but to do this most of the code must be used correctly, i.e. configuration parameters passed via the constructor. If we divide these two aspects in the code, changes become much easier, e.g. replacing SQLAlchemy with Redis.
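   
   A minimal sketch of the direction described above (simplified; treating the config read as a constructor default is an assumption for illustration):
   
   ```python
   from airflow.configuration import conf
   
   
   class SchedulerJob:  # simplified
       def __init__(self, scheduler_health_check_threshold=None):
           # Application logic receives its configuration explicitly via the
           # constructor; reading from `conf` is only the default, not hidden
           # inside methods like is_alive().
           if scheduler_health_check_threshold is None:
               scheduler_health_check_threshold = conf.getint(
                   'scheduler', 'scheduler_health_check_threshold')
           self.scheduler_health_check_threshold = scheduler_health_check_threshold
   ```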




[GitHub] [airflow] ashb commented on a change in pull request #7492: [AIRFLOW-6871] optimize tree view for large DAGs

2020-02-27 Thread GitBox
ashb commented on a change in pull request #7492: [AIRFLOW-6871] optimize tree 
view for large DAGs
URL: https://github.com/apache/airflow/pull/7492#discussion_r385121966
 
 

 ##
 File path: airflow/www/views.py
 ##
 @@ -1374,90 +1376,115 @@ def tree(self):
 .all()
 )
 dag_runs = {
-dr.execution_date: alchemy_to_dict(dr) for dr in dag_runs}
+dr.execution_date: alchemy_to_dict(dr) for dr in dag_runs
+}
 
 dates = sorted(list(dag_runs.keys()))
 max_date = max(dates) if dates else None
 min_date = min(dates) if dates else None
 
 tis = dag.get_task_instances(start_date=min_date, end_date=base_date)
-task_instances = {}
+task_instances: Dict[Tuple[str, datetime], models.TaskInstance] = {}
 for ti in tis:
-tid = alchemy_to_dict(ti)
-dr = dag_runs.get(ti.execution_date)
-tid['external_trigger'] = dr['external_trigger'] if dr else False
-task_instances[(ti.task_id, ti.execution_date)] = tid
+task_instances[(ti.task_id, ti.execution_date)] = ti
 
-expanded = []
+expanded = set()
 # The default recursion traces every path so that tree view has full
 # expand/collapse functionality. After 5,000 nodes we stop and fall
 # back on a quick DFS search for performance. See PR #320.
-node_count = [0]
+node_count = 0
 node_limit = 5000 / max(1, len(dag.leaves))
 
+def encode_ti(ti: Optional[models.TaskInstance]) -> Optional[List]:
+if not ti:
+return None
+
+# NOTE: order of entry is important here because client JS relies 
on it for
+# tree node reconstruction. Remember to change JS code in tree.html
+# whenever order is altered.
+data = [
+ti.state,
+ti.try_number,
+None,  # start_ts
+None,  # duration
+]
+
+if ti.start_date:
+# round to seconds to reduce payload size
+data[2] = int(ti.start_date.timestamp())
+if ti.duration is not None:
+data[3] = int(ti.duration)
+
+return data
+
 def recurse_nodes(task, visited):
+nonlocal node_count
+node_count += 1
 visited.add(task)
-node_count[0] += 1
-
-children = [
-recurse_nodes(t, visited) for t in task.downstream_list
-if node_count[0] < node_limit or t not in visited]
-
-# D3 tree uses children vs _children to define what is
-# expanded or not. The following block makes it such that
-# repeated nodes are collapsed by default.
-children_key = 'children'
-if task.task_id not in expanded:
-expanded.append(task.task_id)
-elif children:
-children_key = "_children"
-
-def set_duration(tid):
-if (isinstance(tid, dict) and tid.get("state") == 
State.RUNNING and
-tid["start_date"] is not None):
-d = timezone.utcnow() - timezone.parse(tid["start_date"])
-tid["duration"] = d.total_seconds()
-return tid
-
-return {
+task_id = task.task_id
+
+node = {
 'name': task.task_id,
 'instances': [
-set_duration(task_instances.get((task.task_id, d))) or {
-'execution_date': d.isoformat(),
-'task_id': task.task_id
-}
-for d in dates],
-children_key: children,
+encode_ti(task_instances.get((task_id, d)))
+for d in dates
+],
 'num_dep': len(task.downstream_list),
 'operator': task.task_type,
 'retries': task.retries,
 'owner': task.owner,
-'start_date': task.start_date,
-'end_date': task.end_date,
-'depends_on_past': task.depends_on_past,
 'ui_color': task.ui_color,
-'extra_links': task.extra_links,
 }
 
+if task.downstream_list:
+children = [
+recurse_nodes(t, visited) for t in task.downstream_list
+if node_count < node_limit or t not in visited]
+
+# D3 tree uses children vs _children to define what is
+# expanded or not. The following block makes it such that
+# repeated nodes are collapsed by default.
+if task.task_id not in expanded:
+children_key = 'children'
+expanded.add(task.task_id)
+else:
+  

[GitHub] [airflow] ashb commented on a change in pull request #7492: [AIRFLOW-6871] optimize tree view for large DAGs

2020-02-27 Thread GitBox
ashb commented on a change in pull request #7492: [AIRFLOW-6871] optimize tree 
view for large DAGs
URL: https://github.com/apache/airflow/pull/7492#discussion_r385121446
 
 

 ##
 File path: airflow/www/views.py
 ##
 @@ -1371,90 +1374,115 @@ def tree(self):
 .all()
 )
 dag_runs = {
-dr.execution_date: alchemy_to_dict(dr) for dr in dag_runs}
+dr.execution_date: alchemy_to_dict(dr) for dr in dag_runs
+}
 
 dates = sorted(list(dag_runs.keys()))
 max_date = max(dates) if dates else None
 min_date = min(dates) if dates else None
 
 tis = dag.get_task_instances(start_date=min_date, end_date=base_date)
-task_instances = {}
+task_instances: Dict[Tuple[str, datetime], models.TaskInstance] = {}
 for ti in tis:
-tid = alchemy_to_dict(ti)
-dr = dag_runs.get(ti.execution_date)
-tid['external_trigger'] = dr['external_trigger'] if dr else False
-task_instances[(ti.task_id, ti.execution_date)] = tid
+task_instances[(ti.task_id, ti.execution_date)] = ti
 
-expanded = []
+expanded = set()
 # The default recursion traces every path so that tree view has full
 # expand/collapse functionality. After 5,000 nodes we stop and fall
 # back on a quick DFS search for performance. See PR #320.
-node_count = [0]
+node_count = 0
 node_limit = 5000 / max(1, len(dag.leaves))
 
+def encode_ti(ti: Optional[models.TaskInstance]) -> Optional[List]:
+if not ti:
+return None
+
+# NOTE: order of entry is important here because client JS relies 
on it for
+# tree node reconstruction. Remember to change JS code in tree.html
+# whenever order is altered.
+data = [
+ti.state,
+ti.try_number,
+None,  # start_ts
+None,  # duration
+]
+
+if ti.start_date:
+# round to seconds to reduce payload size
+data[2] = int(ti.start_date.timestamp())
+if ti.duration is not None:
+data[3] = int(ti.duration)
+
+return data
+
 def recurse_nodes(task, visited):
+nonlocal node_count
+node_count += 1
 visited.add(task)
-node_count[0] += 1
-
-children = [
-recurse_nodes(t, visited) for t in task.downstream_list
-if node_count[0] < node_limit or t not in visited]
-
-# D3 tree uses children vs _children to define what is
-# expanded or not. The following block makes it such that
-# repeated nodes are collapsed by default.
-children_key = 'children'
-if task.task_id not in expanded:
-expanded.append(task.task_id)
-elif children:
-children_key = "_children"
-
-def set_duration(tid):
-if (isinstance(tid, dict) and tid.get("state") == 
State.RUNNING and
-tid["start_date"] is not None):
-d = timezone.utcnow() - timezone.parse(tid["start_date"])
-tid["duration"] = d.total_seconds()
-return tid
-
-return {
+task_id = task.task_id
+
+node = {
 'name': task.task_id,
 'instances': [
-set_duration(task_instances.get((task.task_id, d))) or {
-'execution_date': d.isoformat(),
-'task_id': task.task_id
-}
-for d in dates],
-children_key: children,
+encode_ti(task_instances.get((task_id, d)))
+for d in dates
+],
 'num_dep': len(task.downstream_list),
 'operator': task.task_type,
 'retries': task.retries,
 'owner': task.owner,
-'start_date': task.start_date,
-'end_date': task.end_date,
-'depends_on_past': task.depends_on_past,
 'ui_color': task.ui_color,
-'extra_links': task.extra_links,
 }
 
+if task.downstream_list:
+children = [
+recurse_nodes(t, visited) for t in task.downstream_list
+if node_count < node_limit or t not in visited]
+
+# D3 tree uses children vs _children to define what is
+# expanded or not. The following block makes it such that
+# repeated nodes are collapsed by default.
+if task.task_id not in expanded:
+children_key = 'children'
+expanded.add(task.task_id)
+else:
+  

[GitHub] [airflow] ashb commented on a change in pull request #7492: [AIRFLOW-6871] optimize tree view for large DAGs

2020-02-27 Thread GitBox
ashb commented on a change in pull request #7492: [AIRFLOW-6871] optimize tree 
view for large DAGs
URL: https://github.com/apache/airflow/pull/7492#discussion_r385120657
 
 

 ##
 File path: airflow/www/views.py
 ##
 @@ -1374,90 +1376,115 @@ def tree(self):
 .all()
 )
 dag_runs = {
-dr.execution_date: alchemy_to_dict(dr) for dr in dag_runs}
+dr.execution_date: alchemy_to_dict(dr) for dr in dag_runs
+}
 
 dates = sorted(list(dag_runs.keys()))
 max_date = max(dates) if dates else None
 min_date = min(dates) if dates else None
 
 tis = dag.get_task_instances(start_date=min_date, end_date=base_date)
-task_instances = {}
+task_instances: Dict[Tuple[str, datetime], models.TaskInstance] = {}
 for ti in tis:
-tid = alchemy_to_dict(ti)
-dr = dag_runs.get(ti.execution_date)
-tid['external_trigger'] = dr['external_trigger'] if dr else False
-task_instances[(ti.task_id, ti.execution_date)] = tid
+task_instances[(ti.task_id, ti.execution_date)] = ti
 
-expanded = []
+expanded = set()
 # The default recursion traces every path so that tree view has full
 # expand/collapse functionality. After 5,000 nodes we stop and fall
 # back on a quick DFS search for performance. See PR #320.
-node_count = [0]
+node_count = 0
 node_limit = 5000 / max(1, len(dag.leaves))
 
+def encode_ti(ti: Optional[models.TaskInstance]) -> Optional[List]:
+if not ti:
+return None
+
+# NOTE: order of entry is important here because client JS relies on it for
+# tree node reconstruction. Remember to change JS code in tree.html
+# whenever order is altered.
+data = [
+ti.state,
+ti.try_number,
+None,  # start_ts
+None,  # duration
+]
+
+if ti.start_date:
+# round to seconds to reduce payload size
+data[2] = int(ti.start_date.timestamp())
+if ti.duration is not None:
+data[3] = int(ti.duration)
+
+return data
+
 def recurse_nodes(task, visited):
+nonlocal node_count
+node_count += 1
 visited.add(task)
-node_count[0] += 1
-
-children = [
-recurse_nodes(t, visited) for t in task.downstream_list
-if node_count[0] < node_limit or t not in visited]
-
-# D3 tree uses children vs _children to define what is
-# expanded or not. The following block makes it such that
-# repeated nodes are collapsed by default.
-children_key = 'children'
-if task.task_id not in expanded:
-expanded.append(task.task_id)
-elif children:
-children_key = "_children"
-
-def set_duration(tid):
-if (isinstance(tid, dict) and tid.get("state") == State.RUNNING and
-tid["start_date"] is not None):
-d = timezone.utcnow() - timezone.parse(tid["start_date"])
-tid["duration"] = d.total_seconds()
-return tid
-
-return {
+task_id = task.task_id
+
+node = {
 'name': task.task_id,
 'instances': [
-set_duration(task_instances.get((task.task_id, d))) or {
-'execution_date': d.isoformat(),
-'task_id': task.task_id
-}
-for d in dates],
-children_key: children,
+encode_ti(task_instances.get((task_id, d)))
+for d in dates
+],
 'num_dep': len(task.downstream_list),
 'operator': task.task_type,
 'retries': task.retries,
 'owner': task.owner,
-'start_date': task.start_date,
-'end_date': task.end_date,
-'depends_on_past': task.depends_on_past,
 'ui_color': task.ui_color,
-'extra_links': task.extra_links,
 }
 
+if task.downstream_list:
+children = [
+recurse_nodes(t, visited) for t in task.downstream_list
+if node_count < node_limit or t not in visited]
+
+# D3 tree uses children vs _children to define what is
+# expanded or not. The following block makes it such that
+# repeated nodes are collapsed by default.
+if task.task_id not in expanded:
+children_key = 'children'
+expanded.add(task.task_id)
+else:
+  

[GitHub] [airflow] ashb commented on a change in pull request #7559: [AIRFLOW-6935] Pass the SchedulerJob configuration using the constructor

2020-02-27 Thread GitBox
ashb commented on a change in pull request #7559: [AIRFLOW-6935] Pass the 
SchedulerJob configuration using the constructor
URL: https://github.com/apache/airflow/pull/7559#discussion_r385118219
 
 

 ##
 File path: airflow/jobs/scheduler_job.py
 ##
 @@ -991,10 +1057,13 @@ def is_alive(self, grace_multiplier=None):
 if grace_multiplier is not None:
 # Accept the same behaviour as superclass
 return super().is_alive(grace_multiplier=grace_multiplier)
-scheduler_health_check_threshold = conf.getint('scheduler', 'scheduler_health_check_threshold')
+# The object can be retrieved from the database, so it does not contain all the attributes.
+self.scheduler_health_check_threshold = conf.getint(
 
 Review comment:
  The comment is helpful, but as a result there is no point storing this as an 
attribute; a local variable makes more sense.
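
For readers following along, a minimal sketch of that suggestion; the `conf.getint` call is the one from the diff above, while the helper name and the comparison are illustrative, not the actual patch:

```python
from airflow.configuration import conf


def scheduler_heartbeat_is_recent(heartbeat_age_seconds: float) -> bool:
    # The threshold is only needed for this one check, so a local variable
    # is enough; storing it on self would add state that nothing else reads.
    scheduler_health_check_threshold = conf.getint(
        'scheduler', 'scheduler_health_check_threshold')
    return heartbeat_age_seconds < scheduler_health_check_threshold
```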




[GitHub] [airflow] ashb commented on a change in pull request #7559: [AIRFLOW-6935] Pass the SchedulerJob configuration using the constructor

2020-02-27 Thread GitBox
ashb commented on a change in pull request #7559: [AIRFLOW-6935] Pass the 
SchedulerJob configuration using the constructor
URL: https://github.com/apache/airflow/pull/7559#discussion_r385119442
 
 

 ##
 File path: airflow/jobs/scheduler_job.py
 ##
 @@ -991,10 +1057,13 @@ def is_alive(self, grace_multiplier=None):
 if grace_multiplier is not None:
 # Accept the same behaviour as superclass
 return super().is_alive(grace_multiplier=grace_multiplier)
-scheduler_health_check_threshold = conf.getint('scheduler', 'scheduler_health_check_threshold')
+# The object can be retrieved from the database, so it does not contain all the attributes.
+self.scheduler_health_check_threshold = conf.getint(
 
 Review comment:
   Which also means we shouldn't accept it as an argument in the constructor -- 
it won't ever be used/do anything.




[jira] [Commented] (AIRFLOW-6857) Bulk sync DAGs

2020-02-27 Thread ASF subversion and git services (Jira)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-6857?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17046608#comment-17046608
 ] 

ASF subversion and git services commented on AIRFLOW-6857:
--

Commit 031b4b73e5ebd656b512628051c21a8e83803b4c in airflow's branch 
refs/heads/master from Kamil Breguła
[ https://gitbox.apache.org/repos/asf?p=airflow.git;h=031b4b7 ]

[AIRFLOW-6857] Bulk sync DAGs (#7477)



> Bulk sync DAGs
> --
>
> Key: AIRFLOW-6857
> URL: https://issues.apache.org/jira/browse/AIRFLOW-6857
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: scheduler
>Affects Versions: 1.10.9
>Reporter: Kamil Bregula
>Priority: Major
>






[jira] [Commented] (AIRFLOW-6857) Bulk sync DAGs

2020-02-27 Thread ASF GitHub Bot (Jira)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-6857?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17046607#comment-17046607
 ] 

ASF GitHub Bot commented on AIRFLOW-6857:
-

mik-laj commented on pull request #7477: [AIRFLOW-6857] Bulk sync DAGs
URL: https://github.com/apache/airflow/pull/7477
 
 
   
 



> Bulk sync DAGs
> --
>
> Key: AIRFLOW-6857
> URL: https://issues.apache.org/jira/browse/AIRFLOW-6857
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: scheduler
>Affects Versions: 1.10.9
>Reporter: Kamil Bregula
>Priority: Major
>






[GitHub] [airflow] mik-laj merged pull request #7477: [AIRFLOW-6857] Bulk sync DAGs

2020-02-27 Thread GitBox
mik-laj merged pull request #7477: [AIRFLOW-6857] Bulk sync DAGs
URL: https://github.com/apache/airflow/pull/7477
 
 
   




[GitHub] [airflow] abdulbasitds commented on issue #6007: [AIRFLOW-2310] Enable AWS Glue Job Integration

2020-02-27 Thread GitBox
abdulbasitds commented on issue #6007: [AIRFLOW-2310] Enable AWS Glue Job 
Integration
URL: https://github.com/apache/airflow/pull/6007#issuecomment-591959953
 
 
   @feluelle deleted 'tests/sensors/test_aws_glue_job_sensor.py'
   
   Not sure how to exclude the integration change; you asked to delete the 
changes to integration.rst that this pull request made.




[GitHub] [airflow] baolsen commented on a change in pull request #7541: [AIRFLOW-6822] AWS hooks should cache boto3 client

2020-02-27 Thread GitBox
baolsen commented on a change in pull request #7541: [AIRFLOW-6822] AWS hooks 
should cache boto3 client
URL: https://github.com/apache/airflow/pull/7541#discussion_r38577
 
 

 ##
 File path: tests/providers/amazon/aws/operators/test_ecs.py
 ##
 @@ -48,12 +48,10 @@
 ]
 }
 
-
+# pylint: disable=unused-argument
+@mock.patch('airflow.providers.amazon.aws.operators.ecs.AwsBaseHook')
 
 Review comment:
   No luck. I can't mock the hook properly if I move it back above the 
setUp, regardless of how I patch it. I'm not sure how it was working before...
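
   For context, a sketch of the class-level patching pattern under discussion, based on the decorator shown in the diff above (class and method names are illustrative):
   
   ```python
   import unittest
   from unittest import mock


   # Patching at class level decorates every test_* method, so each method
   # must accept the extra mock argument even when it does not use it.
   @mock.patch('airflow.providers.amazon.aws.operators.ecs.AwsBaseHook')
   class TestECSOperatorSketch(unittest.TestCase):

       def test_something(self, aws_hook_mock):
           # aws_hook_mock stands in for AwsBaseHook for this test's duration
           self.assertIsNotNone(aws_hook_mock)
   ```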
   




[GitHub] [airflow] abdulbasitds commented on a change in pull request #6007: [AIRFLOW-2310] Enable AWS Glue Job Integration

2020-02-27 Thread GitBox
abdulbasitds commented on a change in pull request #6007: [AIRFLOW-2310] Enable 
AWS Glue Job Integration
URL: https://github.com/apache/airflow/pull/6007#discussion_r385110737
 
 

 ##
 File path: docs/integration.rst
 ##
 @@ -17,7 +17,6 @@
 
 Integration
 ===
-
 
 Review comment:
   Thanks. How can I exclude it? There is a 'delete file' option, but will it 
delete the file from the repo?




[GitHub] [airflow] baolsen commented on a change in pull request #7541: [AIRFLOW-6822] AWS hooks should cache boto3 client

2020-02-27 Thread GitBox
baolsen commented on a change in pull request #7541: [AIRFLOW-6822] AWS hooks 
should cache boto3 client
URL: https://github.com/apache/airflow/pull/7541#discussion_r385110474
 
 

 ##
 File path: tests/providers/amazon/aws/operators/test_ecs.py
 ##
 @@ -171,7 +166,7 @@ def test_execute_with_failures(self):
 }
 )
 
-def test_wait_end_tasks(self):
+def test_wait_end_tasks(self, aws_hook_mock):
 
 Review comment:
   After moving the mock from setUp to above the class, all methods need to 
have this added (even if unused)




[GitHub] [airflow] feluelle commented on a change in pull request #6007: [AIRFLOW-2310] Enable AWS Glue Job Integration

2020-02-27 Thread GitBox
feluelle commented on a change in pull request #6007: [AIRFLOW-2310] Enable AWS 
Glue Job Integration
URL: https://github.com/apache/airflow/pull/6007#discussion_r385107981
 
 

 ##
 File path: docs/integration.rst
 ##
 @@ -17,7 +17,6 @@
 
 Integration
 ===
-
 
 Review comment:
   Please do not include this change.




[GitHub] [airflow] abdulbasitds commented on issue #6007: [AIRFLOW-2310] Enable AWS Glue Job Integration

2020-02-27 Thread GitBox
abdulbasitds commented on issue #6007: [AIRFLOW-2310] Enable AWS Glue Job 
Integration
URL: https://github.com/apache/airflow/pull/6007#issuecomment-591954863
 
 
   @feluelle I have made all changes and have rebased, does it look okay now?




[GitHub] [airflow] zhongjiajie commented on issue #7148: [AIRFLOW-6472] Correct short option in cli

2020-02-27 Thread GitBox
zhongjiajie commented on issue #7148: [AIRFLOW-6472] Correct short option in cli
URL: https://github.com/apache/airflow/pull/7148#issuecomment-591951660
 
 
   I noticed our Breeze subcommands are also named in kebab-case style
   ![](https://i.loli.net/2020/02/27/IsqAGHgeMRbD4iE.png)
   But the Airflow CLI still uses snake_case style
   ![](https://i.loli.net/2020/02/27/z4VBXWElHaNtGRj.png)
   
   So, should we change our subcommands? @potiuk @mik-laj 




[jira] [Commented] (AIRFLOW-6938) Don't read dag_directory in Scheduler

2020-02-27 Thread ASF subversion and git services (Jira)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-6938?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17046579#comment-17046579
 ] 

ASF subversion and git services commented on AIRFLOW-6938:
--

Commit 764aab63de5bb907b088a5bae5955d2869c33aa7 in airflow's branch 
refs/heads/master from Kamil Breguła
[ https://gitbox.apache.org/repos/asf?p=airflow.git;h=764aab6 ]

[AIRFLOW-6938] Don't read dag_directory in SchedulerJob (#7562)



> Don't read dag_directory in Scheduler
> -
>
> Key: AIRFLOW-6938
> URL: https://issues.apache.org/jira/browse/AIRFLOW-6938
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: scheduler
>Affects Versions: 1.10.9
>Reporter: Kamil Bregula
>Priority: Major
>






[jira] [Commented] (AIRFLOW-6938) Don't read dag_directory in Scheduler

2020-02-27 Thread ASF GitHub Bot (Jira)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-6938?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17046577#comment-17046577
 ] 

ASF GitHub Bot commented on AIRFLOW-6938:
-

mik-laj commented on pull request #7562: [AIRFLOW-6938] Don't read 
dag_directory in SchedulerJob
URL: https://github.com/apache/airflow/pull/7562
 
 
   
 



> Don't read dag_directory in Scheduler
> -
>
> Key: AIRFLOW-6938
> URL: https://issues.apache.org/jira/browse/AIRFLOW-6938
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: scheduler
>Affects Versions: 1.10.9
>Reporter: Kamil Bregula
>Priority: Major
>






[GitHub] [airflow] mik-laj merged pull request #7562: [AIRFLOW-6938] Don't read dag_directory in SchedulerJob

2020-02-27 Thread GitBox
mik-laj merged pull request #7562: [AIRFLOW-6938] Don't read dag_directory in 
SchedulerJob
URL: https://github.com/apache/airflow/pull/7562
 
 
   




[GitHub] [airflow] mik-laj commented on issue #7343: [AIRFLOW-6719] Introduce pyupgrade to enforce latest syntax

2020-02-27 Thread GitBox
mik-laj commented on issue #7343: [AIRFLOW-6719] Introduce pyupgrade to enforce 
latest syntax
URL: https://github.com/apache/airflow/pull/7343#issuecomment-591949011
 
 
   @zhongjiajie I don't have time to finish this, because I've taken care of 
the scheduler performance. You want to take over?




[GitHub] [airflow] mik-laj edited a comment on issue #7343: [AIRFLOW-6719] Introduce pyupgrade to enforce latest syntax

2020-02-27 Thread GitBox
mik-laj edited a comment on issue #7343: [AIRFLOW-6719] Introduce pyupgrade to 
enforce latest syntax
URL: https://github.com/apache/airflow/pull/7343#issuecomment-591949011
 
 
   @zhongjiajie I don't have time to finish this, because I've taken care of 
the scheduler performance.  Would you like to take over?




[jira] [Updated] (AIRFLOW-6947) UTF8mb4 encoding for mysql does not work in Airflow 2.0

2020-02-27 Thread Jarek Potiuk (Jira)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-6947?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Jarek Potiuk updated AIRFLOW-6947:
--
Description: 
The problem is with how MySQL handles different encodings, especially UTF8. 
UTF8 in MySQL - the default utf8 encoding - does not handle all UTF8 characters 
(only those encoded in 3 bytes); the 4-byte ones do not work (there is an 
error - "Incorrect string value: '\xF0' for column 'description' at row 1") 
when you try to insert a DAG with a 4-byte unicode character.

This is a problem mainly with the DAG description that is stored in the 
database. One of our customers had this very issue with their database, whose 
encoding is utf8. The current utf8 behaviour is that it is an alias for utf8mb3 
[https://dev.mysql.com/doc/refman/5.7/en/charset-unicode-utf8.html], which means 
it does not handle all characters (mostly Emojis, but also some Chinese 
characters 
[https://stackoverflow.com/questions/17680237/mysql-four-byte-chinese-characters-support]). 
In some future version of MySQL, UTF8 will become an alias for utf8mb4 
([https://dev.mysql.com/doc/refman/8.0/en/charset-unicode-utf8.html]), which 
supports the full range of UTF-encoded characters. It is strongly advised to use 
utf8mb4 directly as the default encoding.

I decided to see how it works with utf8mb4 encoding and, unfortunately, it 
turns out that if we switch to it, the migration scripts for Airflow fail 
because the row size for at least one of the indexes exceeds the maximum row 
size:

"Specified key was too long; max key length is 3072 bytes" when the XCOM key is 
created:

ALTER TABLE xcom ADD CONSTRAINT pk_xcom PRIMARY KEY (dag_id, task_id, `key`, 
execution_date)

In Airflow 1.10 the primary key for an xcom was an integer; in 2.0 it is a 
compound index of dag_id, task_id, execution_date and key, which together make 
the row too big for utf8mb4 (in utf8mb4 encoding the text fields take 4x the 
number of characters).

In our CI we have so far used the default mysql encoding (which, for the 
uninitiated, is latin1_swedish_ci (!)). I switched it to utf8mb4 so that you can 
see the behaviour - I created a PR here 
[https://github.com/apache/airflow/pull/7570] and a failed test is here:

[https://travis-ci.org/apache/airflow/jobs/655733996?utm_medium=notification_source=github_status]
 

Note that a similar problem occurs in 1.10 with MySQL 5.6 - if I change the 
charset to utf8mb4 and choose MySQL 5.6, it will fail because there the max key 
length was half the size (1536 characters).

There is even an issue for it in our JIRA, 
https://issues.apache.org/jira/browse/AIRFLOW-3786 - for a different index. The 
workaround was to use UTF8 (UTF8mb3) or to switch to MySQL 5.7.

  was:
The problem is with how MySQL handles different encodings, especially UTF8. 
UTF8 in MySQL - the default utf8 encoding - does not handle all UTF8 characters 
(only those encoded in 3 bytes); the 4-byte ones do not work (there is an 
error - "Incorrect string value: '\xF0' for column 'description' at row 1") 
when you try to insert a DAG with a 4-byte unicode character.

This is a problem mainly with the DAG description that is stored in the 
database. One of our customers had this very issue with their database, whose 
encoding is utf8. The current utf8 behaviour is that it is an alias for utf8mb3 
[https://dev.mysql.com/doc/refman/5.7/en/charset-unicode-utf8.html], which means 
it does not handle all characters (mostly Emojis, but also some Chinese 
characters 
[https://stackoverflow.com/questions/17680237/mysql-four-byte-chinese-characters-support]). 
In some future version of MySQL, UTF8 will become an alias for utf8mb4 
([https://dev.mysql.com/doc/refman/8.0/en/charset-unicode-utf8.html]), which 
supports the full range of UTF-encoded characters. It is strongly advised to use 
utf8mb4 directly as the default encoding.

I decided to see how it works with utf8mb4 encoding and, unfortunately, it 
turns out that if we switch to it, the migration scripts for Airflow fail 
because the row size for at least one of the indexes exceeds the maximum row 
size:

"Specified key was too long; max key length is 3072 bytes" when the XCOM key is 
created:

ALTER TABLE xcom ADD CONSTRAINT pk_xcom PRIMARY KEY (dag_id, task_id, `key`, 
execution_date)

In Airflow 1.10 the primary key was an integer; in 2.0 it is a compound index 
of dag_id, task_id, execution_date and key, which together make the row too big 
for utf8mb4 (in utf8mb4 encoding the text fields take 4x the number of 
characters).

In our CI we have so far used the default mysql encoding (which, for the 
uninitiated, is latin1_swedish_ci (!)). I switched it to utf8mb4 so that you can 
see the behaviour - I created a PR here 
[https://github.com/apache/airflow/pull/7570] and a failed test is here:

[https://travis-ci.org/apache/airflow/jobs/655733996?utm_medium=notification_source=github_status]
 

Note that a similar problem occurs in 1.10 with MySQL 5.6 - if I change the 
charset to utf8mb4 and choose MySQL 5.6, it will fail because there the max key 
length was half the size (1536 characters).

[jira] [Updated] (AIRFLOW-6947) UTF8mb4 encoding for mysql does not work in Airflow 2.0

2020-02-27 Thread Jarek Potiuk (Jira)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-6947?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Jarek Potiuk updated AIRFLOW-6947:
--
Description: 
The problem is with how MySQL handles different encodings, especially UTF8. 
UTF8 in MySQL - the default utf8 encoding - does not handle all UTF8 characters 
(only those encoded in 3 bytes); the 4-byte ones do not work (there is an 
error - "Incorrect string value: '\xF0' for column 'description' at row 1") 
when you try to insert a DAG with a 4-byte unicode character.

This is a problem mainly with the DAG description that is stored in the 
database. One of our customers had this very issue with their database, whose 
encoding is utf8. The current utf8 behaviour is that it is an alias for utf8mb3 
[https://dev.mysql.com/doc/refman/5.7/en/charset-unicode-utf8.html], which means 
it does not handle all characters (mostly Emojis, but also some Chinese 
characters 
[https://stackoverflow.com/questions/17680237/mysql-four-byte-chinese-characters-support]). 
In some future version of MySQL, UTF8 will become an alias for utf8mb4 
([https://dev.mysql.com/doc/refman/8.0/en/charset-unicode-utf8.html]), which 
supports the full range of UTF-encoded characters. It is strongly advised to use 
utf8mb4 directly as the default encoding.

I decided to see how it works with utf8mb4 encoding and, unfortunately, it 
turns out that if we switch to it, the migration scripts for Airflow fail 
because the row size for at least one of the indexes exceeds the maximum row 
size:

"Specified key was too long; max key length is 3072 bytes" when the XCOM key is 
created:

ALTER TABLE xcom ADD CONSTRAINT pk_xcom PRIMARY KEY (dag_id, task_id, `key`, 
execution_date)

In Airflow 1.10 the primary key was an integer; in 2.0 it is a compound index 
of dag_id, task_id, execution_date and key, which together make the row too big 
for utf8mb4 (in utf8mb4 encoding the text fields take 4x the number of 
characters).

In our CI we have so far used the default mysql encoding (which, for the 
uninitiated, is latin1_swedish_ci (!)). I switched it to utf8mb4 so that you can 
see the behaviour - I created a PR here 
[https://github.com/apache/airflow/pull/7570] and a failed test is here:

[https://travis-ci.org/apache/airflow/jobs/655733996?utm_medium=notification_source=github_status]
 

Note that a similar problem occurs in 1.10 with MySQL 5.6 - if I change the 
charset to utf8mb4 and choose MySQL 5.6, it will fail because there the max key 
length was half the size (1536 characters).

There is even an issue for it in our JIRA, 
https://issues.apache.org/jira/browse/AIRFLOW-3786 - for a different index. The 
workaround was to use UTF8 (UTF8mb3) or to switch to MySQL 5.7.

  was:
The problem is with how MySQL handles different encodings, especially UTF8. 
UTF8 in MySQL - the default utf8 encoding - does not handle all UTF8 characters 
(only those encoded in 3 bytes); the 4-byte ones do not work (there is an 
error - "Incorrect string value: '\xF0' for column 'description' at row 1") 
when you try to insert a DAG with a 4-byte unicode character.

This is a problem mainly with the DAG description that is stored in the 
database. One of our customers had this very issue with their database, whose 
encoding is utf8. The current utf8 behaviour is that it is an alias for utf8mb3 
[https://dev.mysql.com/doc/refman/5.7/en/charset-unicode-utf8.html], which means 
it does not handle all characters (mostly Emojis, but also some Chinese 
characters 
[https://stackoverflow.com/questions/17680237/mysql-four-byte-chinese-characters-support]). 
In some future version of MySQL, UTF8 will become an alias for utf8mb4 
([https://dev.mysql.com/doc/refman/8.0/en/charset-unicode-utf8.html]), which 
supports the full range of UTF-encoded characters. It is strongly advised to use 
utf8mb4 directly as the default encoding.

I decided to see how it works with utf8mb4 encoding and, unfortunately, it 
turns out that if we switch to it, the migration scripts for Airflow fail 
because the row size for at least one of the indexes exceeds the maximum row 
size:

"Specified key was too long; max key length is 3072 bytes" when the XCOM key is 
created:

ALTER TABLE xcom ADD CONSTRAINT pk_xcom PRIMARY KEY (dag_id, task_id, `key`, 
execution_date)

Apparently the increased size of some columns (key?) makes the row too big for 
utf8mb4 (in utf8mb4 encoding the text fields take 4x the number of characters).

In our CI we have so far used the default mysql encoding (which, for the 
uninitiated, is latin1_swedish_ci (!)). I switched it to utf8mb4 so that you can 
see the behaviour - I created a PR here 
[https://github.com/apache/airflow/pull/7570] and a failed test is here:

[https://travis-ci.org/apache/airflow/jobs/655733996?utm_medium=notification_source=github_status]
 

Note that a similar problem occurs in 1.10 with MySQL 5.6 - if I change the 
charset to utf8mb4 and choose MySQL 5.6, it will fail because there the max key 
length was half the size (1536 characters).

[GitHub] [airflow] zhongjiajie commented on a change in pull request #7343: [AIRFLOW-6719] Introduce pyupgrade to enforce latest syntax

2020-02-27 Thread GitBox
zhongjiajie commented on a change in pull request #7343: [AIRFLOW-6719] 
Introduce pyupgrade to enforce latest syntax
URL: https://github.com/apache/airflow/pull/7343#discussion_r385092821
 
 

 ##
 File path: airflow/providers/amazon/aws/operators/s3_delete_objects.py
 ##
 @@ -74,4 +74,16 @@ def __init__(
 
 def execute(self, context):
 s3_hook = S3Hook(aws_conn_id=self.aws_conn_id, verify=self.verify)
+<<<<<<< HEAD
 s3_hook.delete_objects(bucket=self.bucket, keys=self.keys)
+=======
+
+response = s3_hook.delete_objects(bucket=self.bucket, keys=self.keys)
+
+deleted_keys = [x['Key'] for x in response.get("Deleted", [])]
+self.log.info("Deleted: %s", deleted_keys)
+
+if "Errors" in response:
+errors_keys = [x['Key'] for x in response.get("Errors", [])]
+raise AirflowException(f"Errors when deleting: {errors_keys}")
+>>>>>>> b18ba328b... [AIRFLOW-6719] Introduce pyupgrade to enforce latest syntax
 
 Review comment:
   It fails as you said; not that easy. But this is a rebase mistake.
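
   For reference, a sketch of what the resolved method presumably looks like once the conflict markers are removed and only the new block is kept; the body is taken from the diff above, with imports and the surrounding class elided:
   
   ```python
   def execute(self, context):
       s3_hook = S3Hook(aws_conn_id=self.aws_conn_id, verify=self.verify)

       response = s3_hook.delete_objects(bucket=self.bucket, keys=self.keys)

       deleted_keys = [x['Key'] for x in response.get("Deleted", [])]
       self.log.info("Deleted: %s", deleted_keys)

       if "Errors" in response:
           errors_keys = [x['Key'] for x in response.get("Errors", [])]
           raise AirflowException(f"Errors when deleting: {errors_keys}")
   ```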




[GitHub] [airflow] zhongjiajie commented on a change in pull request #7343: [AIRFLOW-6719] Introduce pyupgrade to enforce latest syntax

2020-02-27 Thread GitBox
zhongjiajie commented on a change in pull request #7343: [AIRFLOW-6719] 
Introduce pyupgrade to enforce latest syntax
URL: https://github.com/apache/airflow/pull/7343#discussion_r385093470
 
 

 ##
 File path: tests/providers/amazon/aws/operators/test_sftp_to_s3.py
 ##
 @@ -81,7 +81,7 @@ def test_sftp_to_s3_operation(self):
 create_file_task = SSHOperator(
 task_id="test_create_file",
 ssh_hook=self.hook,
-command="echo '{0}' > {1}".format(test_remote_file_content,
+command="echo '{}' > {}".format(test_remote_file_content,
   self.sftp_path),
 
 Review comment:
   ```suggestion
   self.sftp_path),
   ```
   ```log
   tests/providers/amazon/aws/operators/test_sftp_to_s3.py:85:47: E127 
continuation line over-indented for visual indent
   tests/models/test_taskinstance.py:318:45: E128 continuation line 
under-indented for visual indent
   airflow/ti_deps/deps/runnable_exec_date_dep.py:38:47: E127 continuation line 
over-indented for visual indent
   tests/providers/sftp/operators/test_sftp.py:214:47: E127 continuation line 
over-indented for visual indent
   tests/providers/sftp/operators/test_sftp.py:252:47: E127 continuation line 
over-indented for visual indent
   tests/providers/sftp/operators/test_sftp.py:291:47: E127 continuation line 
over-indented for visual indent
   tests/providers/sftp/operators/test_sftp.py:327:47: E127 continuation line 
over-indented for visual indent
   airflow/cli/commands/dag_command.py:85:50: E127 continuation line 
over-indented for visual indent
   airflow/ti_deps/deps/dag_ti_slots_available_dep.py:33:57: E127 continuation 
line over-indented for visual indent
   airflow/providers/sftp/operators/sftp.py:139:57: E127 continuation line 
over-indented for visual indent
   airflow/providers/sftp/operators/sftp.py:150:57: E127 continuation line 
over-indented for visual indent
   airflow/ti_deps/deps/not_in_retry_period_dep.py:52:41: E127 continuation 
line over-indented for visual indent
   ```




[GitHub] [airflow] baolsen commented on a change in pull request #7541: [AIRFLOW-6822] AWS hooks should cache boto3 client

2020-02-27 Thread GitBox
baolsen commented on a change in pull request #7541: [AIRFLOW-6822] AWS hooks 
should cache boto3 client
URL: https://github.com/apache/airflow/pull/7541#discussion_r385091513
 
 

 ##
 File path: airflow/providers/amazon/aws/sensors/athena.py
 ##
 @@ -57,13 +57,11 @@ def __init__(self,
 super().__init__(*args, **kwargs)
 self.aws_conn_id = aws_conn_id
 self.query_execution_id = query_execution_id
-self.hook = None
 self.sleep_time = sleep_time
 self.max_retires = max_retires
+self.hook = AWSAthenaHook(self.aws_conn_id, self.sleep_time)
 
 Review comment:
   Thanks - much of the existing sensor code was doing it during __init__, so I 
thought it was the convention. I will change it to happen during execution, 
which makes more sense.




[GitHub] [airflow] baolsen commented on a change in pull request #7541: [AIRFLOW-6822] AWS hooks should cache boto3 client

2020-02-27 Thread GitBox
baolsen commented on a change in pull request #7541: [AIRFLOW-6822] AWS hooks 
should cache boto3 client
URL: https://github.com/apache/airflow/pull/7541#discussion_r385091780
 
 

 ##
 File path: tests/providers/amazon/aws/operators/test_ecs.py
 ##
 @@ -48,12 +48,10 @@
 ]
 }
 
-
+# pylint: disable=unused-argument
+@mock.patch('airflow.providers.amazon.aws.operators.ecs.AwsBaseHook')
 
 Review comment:
   I couldn't get it to work correctly with setUp, but I will try again just in case




[jira] [Updated] (AIRFLOW-6947) UTF8mb4 encoding for mysql does not work in Airflow 2.0

2020-02-27 Thread Jarek Potiuk (Jira)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-6947?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Jarek Potiuk updated AIRFLOW-6947:
--
Description: 
The problem is with how MySQL handles different encodings, especially UTF8. 
UTF8 in MySQL - the default utf8 encoding - does not handle all UTF8 characters 
(only those encoded in 3 bytes); the 4-byte ones do not work (there is an 
error - "Incorrect string value: '\xF0' for column 'description' at row 1") 
when you try to insert a DAG with a 4-byte unicode character.

This is a problem mainly with the DAG description that is stored in the 
database. One of our customers had this very issue with their database, whose 
encoding is utf8. The current utf8 behaviour is that it is an alias for utf8mb3 
[https://dev.mysql.com/doc/refman/5.7/en/charset-unicode-utf8.html], which means 
it does not handle all characters (mostly Emojis, but also some Chinese 
characters 
[https://stackoverflow.com/questions/17680237/mysql-four-byte-chinese-characters-support]). 
In some future version of MySQL, UTF8 will become an alias for utf8mb4 
([https://dev.mysql.com/doc/refman/8.0/en/charset-unicode-utf8.html]), which 
supports the full range of UTF-encoded characters. It is strongly advised to use 
utf8mb4 directly as the default encoding.

I decided to see how it works with utf8mb4 encoding and, unfortunately, it 
turns out that if we switch to it, the migration scripts for Airflow fail 
because the row size for at least one of the indexes exceeds the maximum row 
size:

"Specified key was too long; max key length is 3072 bytes" when the XCOM key is 
created:

ALTER TABLE xcom ADD CONSTRAINT pk_xcom PRIMARY KEY (dag_id, task_id, `key`, 
execution_date)

Apparently the increased size of some columns (key?) makes the row too big for 
utf8mb4 (in utf8mb4 encoding the text fields take 4x the number of characters).

In our CI we have so far used the default mysql encoding (which, for the 
uninitiated, is latin1_swedish_ci (!)). I switched it to utf8mb4 so that you can 
see the behaviour - I created a PR here 
[https://github.com/apache/airflow/pull/7570] and a failed test is here:

[https://travis-ci.org/apache/airflow/jobs/655733996?utm_medium=notification_source=github_status]
 

Note that a similar problem occurs in 1.10 with MySQL 5.6 - if I change the 
charset to utf8mb4 and choose MySQL 5.6, it will fail because there the max key 
length was half the size (1536 characters).

There is even an issue for it in our JIRA, 
https://issues.apache.org/jira/browse/AIRFLOW-3786. The workaround was to use 
UTF8 (UTF8mb3) or to switch to MySQL 5.7.

  was:
The problem is with how MySQL handles different encodings, especially UTF8. 
UTF8 in MySQL - the default utf8 encoding - does not handle all UTF8 characters 
(only those encoded in 3 bytes); the 4-byte ones do not work (there is an 
error - "Incorrect string value: '\xF0' for column 'description' at row 1") 
when you try to insert a DAG with a 4-byte unicode character.

This is a problem, for example, with the DAG description that is stored in the 
database. One of our customers had this very issue with their database, whose 
encoding is utf8. The current utf8 behaviour is that it is an alias for utf8mb3 
(https://dev.mysql.com/doc/refman/5.7/en/charset-unicode-utf8.html), which means 
it does not handle all characters (mostly Emojis). In some future version of 
MySQL, UTF8 will become an alias for utf8mb4 
(https://dev.mysql.com/doc/refman/8.0/en/charset-unicode-utf8.html), which 
supports the full range of UTF-encoded characters. It is strongly advised to use 
utf8mb4 directly as the default encoding.

I decided to see how it works with utf8mb4 encoding and, unfortunately, it 
turns out that if we switch to it, the migration scripts for Airflow fail 
because the row size for at least one of the indexes exceeds the maximum row 
size:

"Specified key was too long; max key length is 3072 bytes" when the XCOM key is 
created:

ALTER TABLE xcom ADD CONSTRAINT pk_xcom PRIMARY KEY (dag_id, task_id, `key`, 
execution_date)

Apparently the increased size of some columns (key?) makes the row too big for 
utf8mb4 (in utf8mb4 encoding the text fields take 4x the number of characters).

In our CI we have so far used the default mysql encoding (which, for the 
uninitiated, is latin1_swedish_ci (!)). I switched it to utf8mb4 so that you can 
see the behaviour - I created a PR here https://github.com/apache/airflow/pull/7570 
and a failed test is here:

https://travis-ci.org/apache/airflow/jobs/655733996?utm_medium=notification_source=github_status
 

Note that a similar problem occurs in 1.10 with MySQL 5.6 - if I change the 
charset to utf8mb4 and choose MySQL 5.6, it will fail because there the max key 
length was half the size (1536 characters).

There is even an issue for it in our JIRA, 
https://issues.apache.org/jira/browse/AIRFLOW-3786. The workaround was to use 
UTF8 (UTF8mb3) or to switch to MySQL 5.7.


> UTF8mb4 encoding for mysql does not work in Airflow 2.0

[jira] [Created] (AIRFLOW-6947) UTF8mb4 encoding for mysql does not work in Airflow 2.0

2020-02-27 Thread Jarek Potiuk (Jira)
Jarek Potiuk created AIRFLOW-6947:
-

 Summary: UTF8mb4 encoding for mysql does not work in Airflow 2.0
 Key: AIRFLOW-6947
 URL: https://issues.apache.org/jira/browse/AIRFLOW-6947
 Project: Apache Airflow
  Issue Type: Improvement
  Components: mysql, database
Affects Versions: 2.0.0
Reporter: Jarek Potiuk


The problem is with how MySQL handles different encodings, especially UTF8. 
UTF8 in MySQL - the default utf8 encoding - does not handle all UTF8 characters 
(only those encoded in 3 bytes); the 4-byte ones do not work (there is an 
error - "Incorrect string value: '\xF0' for column 'description' at row 1") 
when you try to insert a DAG with a 4-byte unicode character.

This is a problem, for example, with the DAG description that is stored in the 
database. One of our customers had this very issue with their database, whose 
encoding is utf8. The current utf8 behaviour is that it is an alias for utf8mb3 
(https://dev.mysql.com/doc/refman/5.7/en/charset-unicode-utf8.html), which means 
it does not handle all characters (mostly Emojis). In some future version of 
MySQL, UTF8 will become an alias for utf8mb4 
(https://dev.mysql.com/doc/refman/8.0/en/charset-unicode-utf8.html), which 
supports the full range of UTF-encoded characters. It is strongly advised to use 
utf8mb4 directly as the default encoding.

I decided to see how it works with utf8mb4 encoding and, unfortunately, it 
turns out that if we switch to it, the migration scripts for Airflow fail 
because the row size for at least one of the indexes exceeds the maximum row 
size:

"Specified key was too long; max key length is 3072 bytes" when the XCOM key is 
created:

ALTER TABLE xcom ADD CONSTRAINT pk_xcom PRIMARY KEY (dag_id, task_id, `key`, 
execution_date)

Apparently the increased size of some columns (key?) makes the row too big for 
utf8mb4 (in utf8mb4 encoding the text fields take 4x the number of characters).

In our CI we have so far used the default mysql encoding (which, for the 
uninitiated, is latin1_swedish_ci (!)). I switched it to utf8mb4 so that you can 
see the behaviour - I created a PR here https://github.com/apache/airflow/pull/7570 
and a failed test is here:

https://travis-ci.org/apache/airflow/jobs/655733996?utm_medium=notification_source=github_status
 

Note that a similar problem occurs in 1.10 with MySQL 5.6 - if I change the 
charset to utf8mb4 and choose MySQL 5.6, it will fail because there the max key 
length was half the size (1536 characters).

There is even an issue for it in our JIRA, 
https://issues.apache.org/jira/browse/AIRFLOW-3786. The workaround was to use 
UTF8 (UTF8mb3) or to switch to MySQL 5.7.
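
To make the key-length arithmetic concrete, a rough sketch: the 3072-byte limit 
comes from the error above, the bytes-per-character figures from the MySQL docs 
linked above, and the column lengths are illustrative rather than Airflow's 
exact schema:

```python
# InnoDB limits a single index key to 3072 bytes (with large index prefixes).
MAX_KEY_BYTES = 3072

# Worst-case bytes per character for each MySQL charset.
BYTES_PER_CHAR = {'utf8mb3': 3, 'utf8mb4': 4}

# Illustrative VARCHAR lengths for the compound primary key columns.
PK_VARCHAR_CHARS = {'dag_id': 250, 'task_id': 250, 'key': 512}

for charset, width in BYTES_PER_CHAR.items():
    key_bytes = sum(PK_VARCHAR_CHARS.values()) * width
    verdict = 'fits' if key_bytes <= MAX_KEY_BYTES else 'too long'
    print(f'{charset}: {key_bytes} bytes -> {verdict}')

# utf8mb3: 3036 bytes -> fits
# utf8mb4: 4048 bytes -> too long, so the migration fails
```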





[GitHub] [airflow] feluelle commented on a change in pull request #7541: [AIRFLOW-6822] AWS hooks should cache boto3 client

2020-02-27 Thread GitBox
feluelle commented on a change in pull request #7541: [AIRFLOW-6822] AWS hooks 
should cache boto3 client
URL: https://github.com/apache/airflow/pull/7541#discussion_r385081131
 
 

 ##
 File path: airflow/providers/amazon/aws/sensors/sqs.py
 ##
 @@ -56,6 +56,7 @@ def __init__(self,
 self.aws_conn_id = aws_conn_id
 self.max_messages = max_messages
 self.wait_time_seconds = wait_time_seconds
+self.hook = SQSHook(aws_conn_id=self.aws_conn_id)
 
 Review comment:
   Here, too.




[GitHub] [airflow] feluelle commented on a change in pull request #7541: [AIRFLOW-6822] AWS hooks should cache boto3 client

2020-02-27 Thread GitBox
feluelle commented on a change in pull request #7541: [AIRFLOW-6822] AWS hooks 
should cache boto3 client
URL: https://github.com/apache/airflow/pull/7541#discussion_r385082524
 
 

 ##
 File path: tests/providers/amazon/aws/operators/test_ecs.py
 ##
 @@ -48,12 +48,10 @@
 ]
 }
 
-
+# pylint: disable=unused-argument
+@mock.patch('airflow.providers.amazon.aws.operators.ecs.AwsBaseHook')
 
 Review comment:
   What is the point of moving it up to class level instead of `setUp`?




[GitHub] [airflow] feluelle commented on a change in pull request #7541: [AIRFLOW-6822] AWS hooks should cache boto3 client

2020-02-27 Thread GitBox
feluelle commented on a change in pull request #7541: [AIRFLOW-6822] AWS hooks 
should cache boto3 client
URL: https://github.com/apache/airflow/pull/7541#discussion_r385080404
 
 

 ##
 File path: airflow/providers/amazon/aws/sensors/athena.py
 ##
 @@ -57,13 +57,11 @@ def __init__(self,
 super().__init__(*args, **kwargs)
 self.aws_conn_id = aws_conn_id
 self.query_execution_id = query_execution_id
-self.hook = None
 self.sleep_time = sleep_time
 self.max_retires = max_retires
+self.hook = AWSAthenaHook(self.aws_conn_id, self.sleep_time)
 
 Review comment:
   Please don't instantiate a Hook in an Operator's `__init__`, because 1. the 
`__init__` will be called more often / every time the scheduler reads the DAG 
file, and 2. the Hook will also be instantiated even if the Operator doesn't execute.
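
   A minimal sketch of the alternative being asked for, reusing the `AWSAthenaHook` call from the diff above; the class body is abbreviated and the import paths reflect the master layout at the time:
   
   ```python
   from airflow.providers.amazon.aws.hooks.athena import AWSAthenaHook
   from airflow.sensors.base_sensor_operator import BaseSensorOperator


   class AthenaSensorSketch(BaseSensorOperator):
       """Illustrative only: the hook is built lazily, not in __init__."""

       def __init__(self, query_execution_id, aws_conn_id='aws_default',
                    sleep_time=10, *args, **kwargs):
           super().__init__(*args, **kwargs)
           self.aws_conn_id = aws_conn_id
           self.query_execution_id = query_execution_id
           self.sleep_time = sleep_time
           self.hook = None  # deliberately not created here

       def poke(self, context):
           # Created only when the sensor actually runs, so scheduler
           # parsing of the DAG file never touches boto3.
           if self.hook is None:
               self.hook = AWSAthenaHook(self.aws_conn_id, self.sleep_time)
           # ... query self.hook for the Athena execution state here ...
           return True
   ```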




[GitHub] [airflow] feluelle commented on a change in pull request #7541: [AIRFLOW-6822] AWS hooks should cache boto3 client

2020-02-27 Thread GitBox
feluelle commented on a change in pull request #7541: [AIRFLOW-6822] AWS hooks 
should cache boto3 client
URL: https://github.com/apache/airflow/pull/7541#discussion_r385087185
 
 

 ##
 File path: airflow/providers/amazon/aws/hooks/base_aws.py
 ##
 @@ -47,11 +48,29 @@ class AwsBaseHook(BaseHook):
 :param verify: Whether or not to verify SSL certificates.
 
https://boto3.amazonaws.com/v1/documentation/api/latest/reference/core/session.html
 :type verify: str or bool
+:param str region_name: AWS Region name to use. If this is None then the default boto3
+behaviour is used.
+:param str client_type: boto3 client_type used when creating boto3.client(). For
+example, 's3', 'emr', etc. Provided by specific hooks for these clients which
+subclass AwsBaseHook.
+:param str resource_type: boto3 resource_type used when creating boto3.resource(). For
+example, 's3'. Provided by specific hooks for these resources which
+subclass AwsBaseHook.
 
 Review comment:
   Can you use `:param` + `:type`?
   
   Also, the type is `Optional[str]`, not `str`, because it allows the string 
to be None. 
   You can also define the types directly next to the argument, like this: 
`region_name: Optional[str] = None`. You only need to import `typing`. Type 
hints are an official PEP (https://www.python.org/dev/peps/pep-0484/) in Python >=3.5.
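
   A short sketch of the two styles side by side; the parameter is the one from the diff, the function itself is hypothetical:
   
   ```python
   from typing import Optional


   def make_session(region_name: Optional[str] = None):
       """Create a session.

       :param region_name: AWS Region name to use. If this is None then
           the default boto3 behaviour is used.
       :type region_name: Optional[str]
       """
       # The annotation above says the same thing as the :type: line;
       # either style documents that None is an accepted value.
       return region_name
   ```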




[GitHub] [airflow] feluelle commented on a change in pull request #7541: [AIRFLOW-6822] AWS hooks should cache boto3 client

2020-02-27 Thread GitBox
feluelle commented on a change in pull request #7541: [AIRFLOW-6822] AWS hooks 
should cache boto3 client
URL: https://github.com/apache/airflow/pull/7541#discussion_r385083228
 
 

 ##
 File path: tests/providers/amazon/aws/operators/test_ecs.py
 ##
 @@ -171,7 +166,7 @@ def test_execute_with_failures(self):
 }
 )
 
-def test_wait_end_tasks(self):
+def test_wait_end_tasks(self, aws_hook_mock):
 
 Review comment:
   (Why) Is this necessary?




[GitHub] [airflow] feluelle commented on a change in pull request #7541: [AIRFLOW-6822] AWS hooks should cache boto3 client

2020-02-27 Thread GitBox
feluelle commented on a change in pull request #7541: [AIRFLOW-6822] AWS hooks 
should cache boto3 client
URL: https://github.com/apache/airflow/pull/7541#discussion_r385080772
 
 

 ##
 File path: airflow/providers/amazon/aws/sensors/redshift.py
 ##
 @@ -43,9 +43,9 @@ def __init__(self,
 self.cluster_identifier = cluster_identifier
 self.target_status = target_status
 self.aws_conn_id = aws_conn_id
+self.hook = RedshiftHook(aws_conn_id=self.aws_conn_id)
 
 Review comment:
   Same here.




[GitHub] [airflow] feluelle commented on a change in pull request #7541: [AIRFLOW-6822] AWS hooks should cache boto3 client

2020-02-27 Thread GitBox
feluelle commented on a change in pull request #7541: [AIRFLOW-6822] AWS hooks 
should cache boto3 client
URL: https://github.com/apache/airflow/pull/7541#discussion_r385089326
 
 

 ##
 File path: airflow/providers/amazon/aws/hooks/base_aws.py
 ##
 @@ -232,6 +251,38 @@ def get_resource_type(self, resource_type, region_name=None, config=None):
 resource_type, endpoint_url=endpoint_url, config=config, verify=self.verify
 )
 
 +@cached_property
 +def conn(self):
 +"""Get the underlying boto3 client (cached).
 +
 +The return value from this method is cached for efficiency.
 +
 +:return: boto3.client or boto3.resource for the current
 +client/resource type and region
 +:rtype: boto3.client() or boto3.resource()
 +:raises AirflowException: self.client_type or self.resource_type are not
 +populated. These are usually specified to this class, by a subclass
 +__init__ method.
 +"""
 +if self.client_type:
 +return self.get_client_type(self.client_type, region_name=self.region_name)
 +elif self.resource_type:
 +return self.get_resource_type(self.resource_type, region_name=self.region_name)
+else:
+raise AirflowException(
+'Either self.client_type or self.resource_type'
+' must be specified in the subclass')
 
 Review comment:
   I think it would be better to do the verification in the `__init__`.
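
   A sketch of what moving the verification into `__init__` could look like, keeping the cached `conn` property and reusing the error message from the diff; the class name and constructor arguments are illustrative:
   
   ```python
   from cached_property import cached_property

   from airflow.exceptions import AirflowException


   class AwsBaseHookSketch:
       """Illustrative only: fail fast on misconfiguration."""

       def __init__(self, client_type=None, resource_type=None, region_name=None):
           if not client_type and not resource_type:
               # Validated once at construction instead of on first .conn access.
               raise AirflowException(
                   'Either self.client_type or self.resource_type'
                   ' must be specified in the subclass')
           self.client_type = client_type
           self.resource_type = resource_type
           self.region_name = region_name

       @cached_property
       def conn(self):
           # get_client_type / get_resource_type as defined on the real hook.
           if self.client_type:
               return self.get_client_type(self.client_type, region_name=self.region_name)
           return self.get_resource_type(self.resource_type, region_name=self.region_name)
   ```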




[GitHub] [airflow] feluelle commented on a change in pull request #7541: [AIRFLOW-6822] AWS hooks should cache boto3 client

2020-02-27 Thread GitBox
feluelle commented on a change in pull request #7541: [AIRFLOW-6822] AWS hooks 
should cache boto3 client
URL: https://github.com/apache/airflow/pull/7541#discussion_r385084222
 
 

 ##
 File path: airflow/providers/amazon/aws/hooks/athena.py
 ##
 @@ -28,8 +28,12 @@ class AWSAthenaHook(AwsBaseHook):
 """
 Interact with AWS Athena to run, poll queries and return query results
 
-:param aws_conn_id: aws connection to use.
-:type aws_conn_id: str
+Additional arguments (such as ``aws_conn_id``) may be specified and
+are passed down to the underlying AwsBaseHook.
+
+.. seealso::
+:class:`~airflow.providers.amazon.aws.hooks.base_aws.AwsBaseHook`
 
 Review comment:
   I like this one  




[GitHub] [airflow] feluelle commented on a change in pull request #7541: [AIRFLOW-6822] AWS hooks should cache boto3 client

2020-02-27 Thread GitBox
feluelle commented on a change in pull request #7541: [AIRFLOW-6822] AWS hooks 
should cache boto3 client
URL: https://github.com/apache/airflow/pull/7541#discussion_r385080961
 
 

 ##
 File path: airflow/providers/amazon/aws/sensors/s3_prefix.py
 ##
 @@ -69,12 +70,11 @@ def __init__(self,
 self.full_url = "s3://" + bucket_name + '/' + prefix
 self.aws_conn_id = aws_conn_id
 self.verify = verify
+self.hook = S3Hook(aws_conn_id=self.aws_conn_id, verify=self.verify)
 
 Review comment:
   And here. Please check again all files you changed :)




[GitHub] [airflow] feluelle commented on a change in pull request #7541: [AIRFLOW-6822] AWS hooks should cache boto3 client

2020-02-27 Thread GitBox
feluelle commented on a change in pull request #7541: [AIRFLOW-6822] AWS hooks 
should cache boto3 client
URL: https://github.com/apache/airflow/pull/7541#discussion_r385083943
 
 

 ##
 File path: tests/providers/amazon/aws/sensors/test_sqs.py
 ##
 @@ -18,8 +18,9 @@
 
 
 import unittest
-from unittest.mock import MagicMock, patch
+from unittest.mock import MagicMock
 
+import mock
 
 Review comment:
   Please use `unittest.mock`




[GitHub] [airflow] mik-laj commented on a change in pull request #7563: [AIRFLOW-6939] Executor configuration via import path

2020-02-27 Thread GitBox
mik-laj commented on a change in pull request #7563: [AIRFLOW-6939] Executor 
configuration via import path
URL: https://github.com/apache/airflow/pull/7563#discussion_r385085663
 
 

 ##
 File path: UPDATING.md
 ##
 @@ -61,6 +61,29 @@ https://developers.google.com/style/inclusive-documentation
 
 -->
 
+### Custom executors are loaded using full import path
+
+In previous versions of Airflow it was possible to use plugins to load custom 
executors. It is still
+possible, but the configuration has changed. Now you don't have to create a 
plugin to configure a
+custom executor, but you need to provide the full path to the module in the 
`executor` option
+in the `core` section. The purpose of this change is to simplify the plugin 
mechanism and make
+it easier to configure an executor.
+
+If your module was in the path `my_acme_company.executors.MyCustomExecutor`  
and the plugin was
+called `my_plugin` then your configuration looks like this
+
+```ini
+[core]
+executor = my_plugin.MyCustomExecutor
 
 Review comment:
   I haven't tested it, but the code looked like it had a chance to work. In 
the documentation, we have this information:
   https://airflow.readthedocs.io/en/1.10.6/plugins.html




[GitHub] [airflow] mik-laj commented on a change in pull request #7562: [AIRFLOW-6938] Don't read dag_directory in SchedulerJob

2020-02-27 Thread GitBox
mik-laj commented on a change in pull request #7562: [AIRFLOW-6938] Don't read 
dag_directory in SchedulerJob
URL: https://github.com/apache/airflow/pull/7562#discussion_r385081097
 
 

 ##
 File path: airflow/utils/dag_processing.py
 ##
 @@ -577,7 +566,7 @@ def __init__(self,
 
         self._last_zombie_query_time = None
         # Last time that the DAG dir was traversed to look for files
-        self.last_dag_dir_refresh_time = timezone.utcnow()
+        self.last_dag_dir_refresh_time = timezone.make_aware(datetime.fromtimestamp(0))
 
 Review comment:
   We want the files to be read during the first loop cycle.
   
https://github.com/apache/airflow/blob/master/airflow/utils/dag_processing.py#L717-L718
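   The effect is easier to see in isolation: seeding the "last refresh" time with the epoch makes the very first interval check pass, whereas `utcnow()` would delay the first scan by one full interval. A minimal sketch (the interval value is illustrative):

```python
from datetime import datetime, timedelta, timezone

dag_dir_list_interval = timedelta(seconds=300)             # illustrative setting
last_refresh = datetime.fromtimestamp(0, tz=timezone.utc)  # the epoch

# Elapsed time since the epoch always exceeds the interval, so the DAG
# directory is scanned on the first loop cycle rather than after 300s.
if datetime.now(timezone.utc) - last_refresh >= dag_dir_list_interval:
    print("scan DAG directory now")
```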


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] kaxil commented on a change in pull request #7562: [AIRFLOW-6938] Don't read dag_directory in SchedulerJob

2020-02-27 Thread GitBox
kaxil commented on a change in pull request #7562: [AIRFLOW-6938] Don't read 
dag_directory in SchedulerJob
URL: https://github.com/apache/airflow/pull/7562#discussion_r385076833
 
 

 ##
 File path: airflow/utils/dag_processing.py
 ##
 @@ -577,7 +566,7 @@ def __init__(self,
 
         self._last_zombie_query_time = None
         # Last time that the DAG dir was traversed to look for files
-        self.last_dag_dir_refresh_time = timezone.utcnow()
+        self.last_dag_dir_refresh_time = timezone.make_aware(datetime.fromtimestamp(0))
 
 Review comment:
   Why this change?


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] ashb commented on a change in pull request #7563: [AIRFLOW-6939] Executor configuration via import path

2020-02-27 Thread GitBox
ashb commented on a change in pull request #7563: [AIRFLOW-6939] Executor 
configuration via import path
URL: https://github.com/apache/airflow/pull/7563#discussion_r385076603
 
 

 ##
 File path: UPDATING.md
 ##
 @@ -61,6 +61,29 @@ https://developers.google.com/style/inclusive-documentation
 
 -->
 
+### Custom executors are loaded using the full import path
+
+In previous versions of Airflow it was possible to use plugins to load custom executors. It is still possible, but the configuration has changed. You no longer have to create a plugin to configure a custom executor; instead, you provide the full path to the module in the `executor` option in the `core` section. The purpose of this change is to simplify the plugin mechanism and make it easier to configure an executor.
+
+If your module was in the path `my_acme_company.executors.MyCustomExecutor` and the plugin was called `my_plugin`, then your configuration looks like this:
+
+```ini
+[core]
+executor = my_plugin.MyCustomExecutor
 
 Review comment:
   More importantly for updating -- did we ever publish executors-via-plugin in a 1.10 release? If not, these updating instructions don't make sense.


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] feluelle commented on issue #6007: [AIRFLOW-2310] Enable AWS Glue Job Integration

2020-02-27 Thread GitBox
feluelle commented on issue #6007: [AIRFLOW-2310] Enable AWS Glue Job 
Integration
URL: https://github.com/apache/airflow/pull/6007#issuecomment-591929366
 
 
   Yes, please rebase first. And yes, you are right - only remove your changes to the file.
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] ashb commented on a change in pull request #7563: [AIRFLOW-6939] Executor configuration via import path

2020-02-27 Thread GitBox
ashb commented on a change in pull request #7563: [AIRFLOW-6939] Executor 
configuration via import path
URL: https://github.com/apache/airflow/pull/7563#discussion_r385075833
 
 

 ##
 File path: UPDATING.md
 ##
 @@ -61,6 +61,29 @@ https://developers.google.com/style/inclusive-documentation
 
 -->
 
+### Custom executors are loaded using the full import path
+
+In previous versions of Airflow it was possible to use plugins to load custom executors. It is still possible, but the configuration has changed. You no longer have to create a plugin to configure a custom executor; instead, you provide the full path to the module in the `executor` option in the `core` section. The purpose of this change is to simplify the plugin mechanism and make it easier to configure an executor.
+
+If your module was in the path `my_acme_company.executors.MyCustomExecutor` and the plugin was called `my_plugin`, then your configuration looks like this:
+
+```ini
+[core]
+executor = my_plugin.MyCustomExecutor
 
 Review comment:
   Did this even work before? I thought we only looked at a specific hard-coded 
list?


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] blcksrx commented on issue #7422: [AIRFLOW-6809] Test for presto operators

2020-02-27 Thread GitBox
blcksrx commented on issue #7422: [AIRFLOW-6809] Test for presto operators
URL: https://github.com/apache/airflow/pull/7422#issuecomment-591923291
 
 
   @feluelle Done!


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] ashb commented on a change in pull request #7553: [AIRFLOW-XXXX] Update LICENSE versions and remove old licenses

2020-02-27 Thread GitBox
ashb commented on a change in pull request #7553: [AIRFLOW-XXXX] Update LICENSE 
versions and remove old licenses
URL: https://github.com/apache/airflow/pull/7553#discussion_r385066087
 
 

 ##
 File path: LICENSE
 ##
 @@ -229,36 +229,22 @@ MIT licenses
 The following components are provided under the MIT License. See project link 
for details.
 The text of each license is also included at licenses/LICENSE-[project].txt.
 
-(MIT License) jquery v2.1.4 (https://jquery.org/license/)
-(MIT License) dagre-d3 v0.6.1 (https://github.com/cpettitt/dagre-d3)
+(MIT License) jquery v3.4.1 (https://jquery.org/license/)
+(MIT License) dagre-d3 v0.8.5 (https://github.com/cpettitt/dagre-d3)
 
 Review comment:
   Yarn.lock has this as 0.6.4


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] ashb commented on issue #7553: [AIRFLOW-XXXX] Update LICENSE versions and remove old licenses

2020-02-27 Thread GitBox
ashb commented on issue #7553: [AIRFLOW-XXXX] Update LICENSE versions and 
remove old licenses
URL: https://github.com/apache/airflow/pull/7553#issuecomment-591919253
 
 
   Twistlock really "detects" versions of code in use from what is mentioned in 
a license file? o_O


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] baolsen commented on a change in pull request #7541: [AIRFLOW-6822] AWS hooks should cache boto3 client

2020-02-27 Thread GitBox
baolsen commented on a change in pull request #7541: [AIRFLOW-6822] AWS hooks 
should cache boto3 client
URL: https://github.com/apache/airflow/pull/7541#discussion_r385061503
 
 

 ##
 File path: airflow/providers/amazon/aws/hooks/base_aws.py
 ##
 @@ -49,9 +50,19 @@ class AwsBaseHook(BaseHook):
     :type verify: str or bool
     """
 
-    def __init__(self, aws_conn_id="aws_default", verify=None):
+    def __init__(
+        self,
+        aws_conn_id="aws_default",
+        verify=None,
+        region_name=None,
+        client_type=None,
+        resource_type=None
 
 Review comment:
   @feluelle Thanks for the feedback, please take another look. I've updated 
the docstrings on all the hooks accordingly. There's only a "how to" document 
for the DataSync operator, no other separate docs for AWS - at least that I 
could see. Let me know if there's anywhere else you can think of :)
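   A rough sketch of the caching idea under review, under assumed names (`AwsBaseHookSketch`, the `cached_property` backport) rather than the PR's exact implementation:

```python
import boto3
from cached_property import cached_property  # third-party backport, pre-Python 3.8

class AwsBaseHookSketch:
    """Illustrative only: one lazily built boto3 client per hook instance."""

    def __init__(self, aws_conn_id="aws_default", verify=None,
                 region_name=None, client_type=None, resource_type=None):
        self.aws_conn_id = aws_conn_id
        self.verify = verify
        self.region_name = region_name
        self.client_type = client_type      # e.g. "s3"
        self.resource_type = resource_type  # e.g. "dynamodb"

    @cached_property
    def conn(self):
        # Cached after the first call, so repeated get_conn() calls
        # reuse a single client instead of rebuilding it each time.
        return boto3.client(self.client_type,
                            region_name=self.region_name,
                            verify=self.verify)

    def get_conn(self):
        return self.conn
```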


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] codecov-io commented on issue #7541: [AIRFLOW-6822] AWS hooks should cache boto3 client

2020-02-27 Thread GitBox
codecov-io commented on issue #7541: [AIRFLOW-6822] AWS hooks should cache 
boto3 client
URL: https://github.com/apache/airflow/pull/7541#issuecomment-591909170
 
 
   # [Codecov](https://codecov.io/gh/apache/airflow/pull/7541?src=pr=h1) 
Report
   > Merging 
[#7541](https://codecov.io/gh/apache/airflow/pull/7541?src=pr=desc) into 
[master](https://codecov.io/gh/apache/airflow/commit/3f293001d6cf4674edab36d9981ad8407ecc9043?src=pr=desc)
 will **decrease** coverage by `0.33%`.
   > The diff coverage is `97.8%`.
   
   [![Impacted file tree 
graph](https://codecov.io/gh/apache/airflow/pull/7541/graphs/tree.svg?width=650=WdLKlKHOAU=150=pr)](https://codecov.io/gh/apache/airflow/pull/7541?src=pr=tree)
   
   ```diff
   @@            Coverage Diff             @@
   ##           master    #7541      +/-   ##
   ==========================================
   - Coverage   86.86%   86.52%   -0.34%
   ==========================================
     Files         896      896
     Lines       42649    42581      -68
   ==========================================
   - Hits        37046    36845     -201
   - Misses       5603     5736     +133
   ```
   
   
   | [Impacted 
Files](https://codecov.io/gh/apache/airflow/pull/7541?src=pr=tree) | 
Coverage Δ | |
   |---|---|---|
   | 
[airflow/providers/amazon/aws/hooks/datasync.py](https://codecov.io/gh/apache/airflow/pull/7541/diff?src=pr=tree#diff-YWlyZmxvdy9wcm92aWRlcnMvYW1hem9uL2F3cy9ob29rcy9kYXRhc3luYy5weQ==)
 | `16.66% <0%> (-0.12%)` | :arrow_down: |
   | 
[airflow/providers/amazon/aws/hooks/s3.py](https://codecov.io/gh/apache/airflow/pull/7541/diff?src=pr=tree#diff-YWlyZmxvdy9wcm92aWRlcnMvYW1hem9uL2F3cy9ob29rcy9zMy5weQ==)
 | `96.58% <100%> (-0.02%)` | :arrow_down: |
   | 
[airflow/providers/amazon/aws/sensors/redshift.py](https://codecov.io/gh/apache/airflow/pull/7541/diff?src=pr=tree#diff-YWlyZmxvdy9wcm92aWRlcnMvYW1hem9uL2F3cy9zZW5zb3JzL3JlZHNoaWZ0LnB5)
 | `100% <100%> (ø)` | :arrow_up: |
   | 
[...flow/providers/amazon/aws/hooks/lambda\_function.py](https://codecov.io/gh/apache/airflow/pull/7541/diff?src=pr=tree#diff-YWlyZmxvdy9wcm92aWRlcnMvYW1hem9uL2F3cy9ob29rcy9sYW1iZGFfZnVuY3Rpb24ucHk=)
 | `100% <100%> (ø)` | :arrow_up: |
   | 
[airflow/providers/amazon/aws/hooks/sagemaker.py](https://codecov.io/gh/apache/airflow/pull/7541/diff?src=pr=tree#diff-YWlyZmxvdy9wcm92aWRlcnMvYW1hem9uL2F3cy9ob29rcy9zYWdlbWFrZXIucHk=)
 | `87.55% <100%> (+0.29%)` | :arrow_up: |
   | 
[airflow/providers/amazon/aws/hooks/redshift.py](https://codecov.io/gh/apache/airflow/pull/7541/diff?src=pr=tree#diff-YWlyZmxvdy9wcm92aWRlcnMvYW1hem9uL2F3cy9ob29rcy9yZWRzaGlmdC5weQ==)
 | `75% <100%> (-0.87%)` | :arrow_down: |
   | 
[...w/providers/amazon/aws/sensors/sagemaker\_tuning.py](https://codecov.io/gh/apache/airflow/pull/7541/diff?src=pr=tree#diff-YWlyZmxvdy9wcm92aWRlcnMvYW1hem9uL2F3cy9zZW5zb3JzL3NhZ2VtYWtlcl90dW5pbmcucHk=)
 | `100% <100%> (ø)` | :arrow_up: |
   | 
[airflow/providers/amazon/aws/sensors/sqs.py](https://codecov.io/gh/apache/airflow/pull/7541/diff?src=pr=tree#diff-YWlyZmxvdy9wcm92aWRlcnMvYW1hem9uL2F3cy9zZW5zb3JzL3Nxcy5weQ==)
 | `100% <100%> (ø)` | :arrow_up: |
   | 
[airflow/providers/amazon/aws/hooks/glue\_catalog.py](https://codecov.io/gh/apache/airflow/pull/7541/diff?src=pr=tree#diff-YWlyZmxvdy9wcm92aWRlcnMvYW1hem9uL2F3cy9ob29rcy9nbHVlX2NhdGFsb2cucHk=)
 | `100% <100%> (ø)` | :arrow_up: |
   | 
[airflow/providers/amazon/aws/sensors/emr\_step.py](https://codecov.io/gh/apache/airflow/pull/7541/diff?src=pr=tree#diff-YWlyZmxvdy9wcm92aWRlcnMvYW1hem9uL2F3cy9zZW5zb3JzL2Vtcl9zdGVwLnB5)
 | `100% <100%> (ø)` | :arrow_up: |
   | ... and [31 
more](https://codecov.io/gh/apache/airflow/pull/7541/diff?src=pr=tree-more) 
| |
   
   --
   
   [Continue to review full report at 
Codecov](https://codecov.io/gh/apache/airflow/pull/7541?src=pr=continue).
   > **Legend** - [Click here to learn 
more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data`
   > Powered by 
[Codecov](https://codecov.io/gh/apache/airflow/pull/7541?src=pr=footer). 
Last update 
[3f29300...f282ce5](https://codecov.io/gh/apache/airflow/pull/7541?src=pr=lastupdated).
 Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Resolved] (AIRFLOW-6940) Improve test isolation in test_views.py

2020-02-27 Thread Kaxil Naik (Jira)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-6940?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Kaxil Naik resolved AIRFLOW-6940.
-
Fix Version/s: 2.0.0
   Resolution: Fixed

> Improve test isolation in test_views.py
> ---
>
> Key: AIRFLOW-6940
> URL: https://issues.apache.org/jira/browse/AIRFLOW-6940
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: webserver
>Affects Versions: 1.10.9
>Reporter: Kamil Bregula
>Priority: Major
> Fix For: 2.0.0
>
>




--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Commented] (AIRFLOW-6940) Improve test isolation in test_views.py

2020-02-27 Thread ASF subversion and git services (Jira)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-6940?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17046464#comment-17046464
 ] 

ASF subversion and git services commented on AIRFLOW-6940:
--

Commit c9943ca834ce09a4c427a164d968d8f80401016e in airflow's branch 
refs/heads/master from Kamil Breguła
[ https://gitbox.apache.org/repos/asf?p=airflow.git;h=c9943ca ]

[AIRFLOW-6940] Improve test isolation in test_views.py (#7564)



> Improve test isolation in test_views.py
> ---
>
> Key: AIRFLOW-6940
> URL: https://issues.apache.org/jira/browse/AIRFLOW-6940
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: webserver
>Affects Versions: 1.10.9
>Reporter: Kamil Bregula
>Priority: Major
> Fix For: 2.0.0
>
>




--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Commented] (AIRFLOW-6940) Improve test isolation in test_views.py

2020-02-27 Thread ASF GitHub Bot (Jira)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-6940?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17046463#comment-17046463
 ] 

ASF GitHub Bot commented on AIRFLOW-6940:
-

kaxil commented on pull request #7564: [AIRFLOW-6940] Improve test isolation in 
test_views.py
URL: https://github.com/apache/airflow/pull/7564
 
 
   
 

This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


> Improve test isolation in test_views.py
> ---
>
> Key: AIRFLOW-6940
> URL: https://issues.apache.org/jira/browse/AIRFLOW-6940
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: webserver
>Affects Versions: 1.10.9
>Reporter: Kamil Bregula
>Priority: Major
>




--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Resolved] (AIRFLOW-5659) Add support for ephemeral storage on KubernetesPodOperator

2020-02-27 Thread Kaxil Naik (Jira)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-5659?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Kaxil Naik resolved AIRFLOW-5659.
-
Fix Version/s: 2.0.0
   Resolution: Fixed

> Add support for ephemeral storage on KubernetesPodOperator
> --
>
> Key: AIRFLOW-5659
> URL: https://issues.apache.org/jira/browse/AIRFLOW-5659
> Project: Apache Airflow
>  Issue Type: New Feature
>  Components: operators
>Affects Versions: 2.0.0
>Reporter: Leonardo Miguel
>Assignee: Leonardo Miguel
>Priority: Minor
> Fix For: 2.0.0
>
>
> KubernetesPodOperator currently doesn't support requests and limits for 
> resource 'ephemeral-storage'.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[GitHub] [airflow] kaxil merged pull request #7564: [AIRFLOW-6940] Improve test isolation in test_views.py

2020-02-27 Thread GitBox
kaxil merged pull request #7564: [AIRFLOW-6940] Improve test isolation in 
test_views.py
URL: https://github.com/apache/airflow/pull/7564
 
 
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] kaxil merged pull request #6337: [AIRFLOW-5659] - Add support for ephemeral storage on KubernetesPodOp…

2020-02-27 Thread GitBox
kaxil merged pull request #6337: [AIRFLOW-5659] - Add support for ephemeral 
storage on KubernetesPodOp…
URL: https://github.com/apache/airflow/pull/6337
 
 
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Commented] (AIRFLOW-5659) Add support for ephemeral storage on KubernetesPodOperator

2020-02-27 Thread ASF subversion and git services (Jira)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-5659?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17046462#comment-17046462
 ] 

ASF subversion and git services commented on AIRFLOW-5659:
--

Commit dfb18adaf53fa12f1b10b1283321b1dd71211059 in airflow's branch 
refs/heads/master from Leonardo Alves Miguel
[ https://gitbox.apache.org/repos/asf?p=airflow.git;h=dfb18ad ]

[AIRFLOW-5659] Add support for ephemeral storage on KubernetesPodOperator 
(#6337)



> Add support for ephemeral storage on KubernetesPodOperator
> --
>
> Key: AIRFLOW-5659
> URL: https://issues.apache.org/jira/browse/AIRFLOW-5659
> Project: Apache Airflow
>  Issue Type: New Feature
>  Components: operators
>Affects Versions: 2.0.0
>Reporter: Leonardo Miguel
>Assignee: Leonardo Miguel
>Priority: Minor
>
> KubernetesPodOperator currently doesn't support requests and limits for 
> resource 'ephemeral-storage'.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Commented] (AIRFLOW-5659) Add support for ephemeral storage on KubernetesPodOperator

2020-02-27 Thread ASF GitHub Bot (Jira)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-5659?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17046461#comment-17046461
 ] 

ASF GitHub Bot commented on AIRFLOW-5659:
-

kaxil commented on pull request #6337: [AIRFLOW-5659] - Add support for 
ephemeral storage on KubernetesPodOp…
URL: https://github.com/apache/airflow/pull/6337
 
 
   
 

This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


> Add support for ephemeral storage on KubernetesPodOperator
> --
>
> Key: AIRFLOW-5659
> URL: https://issues.apache.org/jira/browse/AIRFLOW-5659
> Project: Apache Airflow
>  Issue Type: New Feature
>  Components: operators
>Affects Versions: 2.0.0
>Reporter: Leonardo Miguel
>Assignee: Leonardo Miguel
>Priority: Minor
>
> KubernetesPodOperator currently doesn't support requests and limits for 
> resource 'ephemeral-storage'.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[GitHub] [airflow] potiuk opened a new pull request #7570: [AIRFLOW-6946] [WIP] Switch to MySQL 5.7 in 2.0 as base

2020-02-27 Thread GitBox
potiuk opened a new pull request #7570: [AIRFLOW-6946] [WIP] Switch to MySQL 
5.7 in 2.0 as base
URL: https://github.com/apache/airflow/pull/7570
 
 
   Switch to MySQL 5.7 in tests.
   
   Also test utf8mb4 encoding
   
   ---
   Issue link: WILL BE INSERTED BY 
[boring-cyborg](https://github.com/kaxil/boring-cyborg)
   
   Make sure to mark the boxes below before creating PR: [x]
   
   - [x] Description above provides context of the change
   - [x] Commit message/PR title starts with `[AIRFLOW-NNNN]`. AIRFLOW-NNNN = 
JIRA ID*
   - [x] Unit tests coverage for changes (not needed for documentation changes)
   - [x] Commits follow "[How to write a good git commit 
message](http://chris.beams.io/posts/git-commit/)"
   - [x] Relevant documentation is updated including usage instructions.
   - [x] I will engage committers as explained in [Contribution Workflow 
Example](https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst#contribution-workflow-example).
   
   * For document-only changes commit message can start with 
`[AIRFLOW-XXXX]`.
   
   ---
   In case of fundamental code change, Airflow Improvement Proposal 
([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals))
 is needed.
   In case of a new dependency, check compliance with the [ASF 3rd Party 
License Policy](https://www.apache.org/legal/resolved.html#category-x).
   In case of backwards incompatible changes please leave a note in 
[UPDATING.md](https://github.com/apache/airflow/blob/master/UPDATING.md).
   Read the [Pull Request 
Guidelines](https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst#pull-request-guidelines)
 for more information.
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Commented] (AIRFLOW-6946) Switch to MySQL 5.7 in 2.0 as base

2020-02-27 Thread ASF GitHub Bot (Jira)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-6946?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17046451#comment-17046451
 ] 

ASF GitHub Bot commented on AIRFLOW-6946:
-

potiuk commented on pull request #7570: [AIRFLOW-6946] [WIP] Switch to MySQL 
5.7 in 2.0 as base
URL: https://github.com/apache/airflow/pull/7570
 
 
   Switch to MySQL 5.7 in tests.
   
   Also test utf8mb4 encoding
   
   ---
   Issue link: WILL BE INSERTED BY 
[boring-cyborg](https://github.com/kaxil/boring-cyborg)
   
   Make sure to mark the boxes below before creating PR: [x]
   
   - [x] Description above provides context of the change
   - [x] Commit message/PR title starts with `[AIRFLOW-NNNN]`. AIRFLOW-NNNN = 
JIRA ID*
   - [x] Unit tests coverage for changes (not needed for documentation changes)
   - [x] Commits follow "[How to write a good git commit 
message](http://chris.beams.io/posts/git-commit/)"
   - [x] Relevant documentation is updated including usage instructions.
   - [x] I will engage committers as explained in [Contribution Workflow 
Example](https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst#contribution-workflow-example).
   
   * For document-only changes commit message can start with 
`[AIRFLOW-XXXX]`.
   
   ---
   In case of fundamental code change, Airflow Improvement Proposal 
([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals))
 is needed.
   In case of a new dependency, check compliance with the [ASF 3rd Party 
License Policy](https://www.apache.org/legal/resolved.html#category-x).
   In case of backwards incompatible changes please leave a note in 
[UPDATING.md](https://github.com/apache/airflow/blob/master/UPDATING.md).
   Read the [Pull Request 
Guidelines](https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst#pull-request-guidelines)
 for more information.
   
 

This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


> Switch to MySQL 5.7 in 2.0 as base
> --
>
> Key: AIRFLOW-6946
> URL: https://issues.apache.org/jira/browse/AIRFLOW-6946
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: ci
>Affects Versions: 2.0.0
>Reporter: Jarek Potiuk
>Priority: Major
>
> Switch to MySQL 5.7 in tests. 
> Also test utf8mb4 encoding



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Updated] (AIRFLOW-6946) Switch to MySQL 5.7 in 2.0 as base

2020-02-27 Thread Jarek Potiuk (Jira)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-6946?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Jarek Potiuk updated AIRFLOW-6946:
--
Description: 
Switch to MySQL 5.7 in tests. 

Also test utf8mb4 encoding

  was:Switch to MySQL 5.7 in tests. 


> Switch to MySQL 5.7 in 2.0 as base
> --
>
> Key: AIRFLOW-6946
> URL: https://issues.apache.org/jira/browse/AIRFLOW-6946
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: ci
>Affects Versions: 2.0.0
>Reporter: Jarek Potiuk
>Priority: Major
>
> Switch to MySQL 5.7 in tests. 
> Also test utf8mb4 encoding



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Created] (AIRFLOW-6946) Switch to MySQL 5.7 in 2.0 as base

2020-02-27 Thread Jarek Potiuk (Jira)
Jarek Potiuk created AIRFLOW-6946:
-

 Summary: Switch to MySQL 5.7 in 2.0 as base
 Key: AIRFLOW-6946
 URL: https://issues.apache.org/jira/browse/AIRFLOW-6946
 Project: Apache Airflow
  Issue Type: Improvement
  Components: ci
Affects Versions: 2.0.0
Reporter: Jarek Potiuk


Switch to MySQL 5.7 in tests. 



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

