[GitHub] [airflow] rootcss commented on a change in pull request #10081: Add CLI for connections export (#9856)

2020-08-06 Thread GitBox


rootcss commented on a change in pull request #10081:
URL: https://github.com/apache/airflow/pull/10081#discussion_r466850548



##
File path: airflow/cli/cli_parser.py
##
@@ -1104,6 +1113,12 @@ class GroupCommand(NamedTuple):
 
             func=lazy_load_command('airflow.cli.commands.connection_command.connections_delete'),
             args=(ARG_CONN_ID,),
         ),
+        ActionCommand(
+            name='export',
+            help='Export all connections',
+            func=lazy_load_command('airflow.cli.commands.connection_command.connections_export'),

Review comment:
   Got it.





This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] mik-laj commented on a change in pull request #10081: Add CLI for connections export (#9856)

2020-08-06 Thread GitBox


mik-laj commented on a change in pull request #10081:
URL: https://github.com/apache/airflow/pull/10081#discussion_r466848114



##
File path: airflow/cli/cli_parser.py
##
@@ -1104,6 +1113,12 @@ class GroupCommand(NamedTuple):
 
             func=lazy_load_command('airflow.cli.commands.connection_command.connections_delete'),
             args=(ARG_CONN_ID,),
         ),
+        ActionCommand(
+            name='export',
+            help='Export all connections',
+            func=lazy_load_command('airflow.cli.commands.connection_command.connections_export'),

Review comment:
   The `help` parameter appears in the list of commands, but the `description` parameter is what is displayed in the command's own help output. For example, see `airflow dags show --help`.
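   For instance, a rough sketch of what that could look like for this command (illustrative only - the example calls and the empty `args` placeholder are assumptions, not the final PR code):
   
   ```python
   # `help` is the one-line summary shown in the list of `airflow connections` sub-commands;
   # `description` is the longer text shown by `airflow connections export --help`.
   ActionCommand(
       name='export',
       help='Export all connections',
       description=(
           'Export all connections, for example:\n'
           '  airflow connections export connections.json\n'
           '  airflow connections export /tmp/connections --format yaml\n'
           '  airflow connections export -   # write to STDOUT'
       ),
       func=lazy_load_command('airflow.cli.commands.connection_command.connections_export'),
       args=(),  # placeholder - the real ARG_* tuple lives elsewhere in cli_parser.py
   ),
   ```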





This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] mik-laj commented on a change in pull request #10081: Add CLI for connections export (#9856)

2020-08-06 Thread GitBox


mik-laj commented on a change in pull request #10081:
URL: https://github.com/apache/airflow/pull/10081#discussion_r466847733



##
File path: docs/howto/connection/index.rst
##
@@ -88,6 +88,93 @@ Alternatively you may specify each parameter individually:
 --conn-schema 'schema' \
 ...
 
+.. _connection/export:
+
+Exporting Connections from the CLI
+----------------------------------
+
+You may export connections from the database using the CLI. The supported 
formats are ``json``, ``yaml`` and ``env``.
+
+You may specify the target file as a parameter:
+
+.. code-block:: bash
+
+airflow connections export connections.json
+
+Alternatively you may specify the ``--format`` parameter to override the file format:
+
+.. code-block:: bash
+
+airflow connections export /tmp/connections --format yaml
+
+You may also specify ``-`` for STDOUT:
+
+.. code-block:: bash
+
+airflow connections export -
+
+The JSON format contains an object where the key is the connection ID and the value is the definition of the connection. In this format, the connection is defined as a JSON object. The following is a sample JSON file.
+
+.. code-block:: json
+
+{
+"CONN_A": {
+"conn_type": "mysql",
+"host": "mysql",
+"login": "root",
+"password": "plainpassword",
+"schema": "airflow",
+"port": null,
+"extra": null,
+"is_encrypted": false,
+"is_extra_encrypted": false
+},
+"CONN_B": {
+"conn_type": "druid",
+"host": "druid-broker",
+"login": null,
+"password": null,
+"schema": null,
+"port": 8082,
+"extra": "{\"endpoint\": \"druid/v2/sql\"}",
+"is_encrypted": false,
+"is_extra_encrypted": false
+}
+}
+
+The YAML file structure is similar to that of JSON: each key-value pair maps a connection ID to the definition of a connection. In this format, the connection is defined as a YAML object. The following is a sample YAML file.
+
+.. code-block:: yaml
+
+CONN_A:
+  conn_type: mysql
+  extra:
+  host: mysql
+  is_encrypted: false
+  is_extra_encrypted: false
+  login: root
+  password: plainpassword
+  port:
+  schema: airflow
+
+CONN_B:
+  conn_type: druid
+  extra: '{"endpoint": "druid/v2/sql"}'
+  host: druid-broker
+  is_encrypted: false
+  is_extra_encrypted: false
+  login:
+  password:
+  port: 8082
+  schema:
+
+You may also export connections in ``.env`` format. The key is the connection 
ID, and the value describes the connection using the URI. If the connection ID 
is repeated, all values will be returned. The following is a sample ENV file.
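
A minimal sketch of what such a file could look like for the two sample connections above, assuming Airflow's standard connection URI form (illustration only; the exact URI encoding of the ``extra`` field is an assumption):

.. code-block:: text

CONN_A=mysql://root:plainpassword@mysql/airflow
CONN_B=druid://druid-broker:8082/?endpoint=druid%2Fv2%2Fsql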

Review comment:
   In Airflow 1.10, the connection ID is not unique. In Airflow 2.0, the connection ID is unique. See: 
https://github.com/apache/airflow/pull/9067





This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] rootcss commented on a change in pull request #10081: Add CLI for connections export (#9856)

2020-08-06 Thread GitBox


rootcss commented on a change in pull request #10081:
URL: https://github.com/apache/airflow/pull/10081#discussion_r466846563



##
File path: docs/howto/connection/index.rst
##
@@ -88,6 +88,93 @@ Alternatively you may specify each parameter individually:
 --conn-schema 'schema' \
 ...
 
+.. _connection/export:
+
+Exporting Connections from the CLI
+----------------------------------
+
+You may export connections from the database using the CLI. The supported 
formats are ``json``, ``yaml`` and ``env``.
+
+You may specify the target file as a parameter:
+
+.. code-block:: bash
+
+airflow connections export connections.json
+
+Alternatively you may specify the ``--format`` parameter to override the file format:
+
+.. code-block:: bash
+
+airflow connections export /tmp/connections --format yaml
+
+You may also specify ``-`` for STDOUT:
+
+.. code-block:: bash
+
+airflow connections export -
+
+The JSON format contains an object where the key is the connection ID and the value is the definition of the connection. In this format, the connection is defined as a JSON object. The following is a sample JSON file.
+
+.. code-block:: json
+
+{
+"CONN_A": {
+"conn_type": "mysql",
+"host": "mysql",
+"login": "root",
+"password": "plainpassword",
+"schema": "airflow",
+"port": null,
+"extra": null,
+"is_encrypted": false,
+"is_extra_encrypted": false
+},
+"CONN_B": {
+"conn_type": "druid",
+"host": "druid-broker",
+"login": null,
+"password": null,
+"schema": null,
+"port": 8082,
+"extra": "{\"endpoint\": \"druid/v2/sql\"}",
+"is_encrypted": false,
+"is_extra_encrypted": false
+}
+}
+
+The YAML file structure is similar to that of JSON: each key-value pair maps a connection ID to the definition of a connection. In this format, the connection is defined as a YAML object. The following is a sample YAML file.
+
+.. code-block:: yaml
+
+CONN_A:
+  conn_type: mysql
+  extra:
+  host: mysql
+  is_encrypted: false
+  is_extra_encrypted: false
+  login: root
+  password: plainpassword
+  port:
+  schema: airflow
+
+CONN_B:
+  conn_type: druid
+  extra: '{"endpoint": "druid/v2/sql"}'
+  host: druid-broker
+  is_encrypted: false
+  is_extra_encrypted: false
+  login:
+  password:
+  port: 8082
+  schema:
+
+You may also export connections in ``.env`` format. The key is the connection 
ID, and the value describes the connection using the URI. If the connection ID 
is repeated, all values will be returned. The following is a sample ENV file.

Review comment:
   Correct. I was confused about this because various documentation mentions that there could be multiple connections with the same connection ID, which didn't seem to be the case with the `Connection` model. Will change this.





This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] mik-laj commented on a change in pull request #10081: Add CLI for connections export (#9856)

2020-08-06 Thread GitBox


mik-laj commented on a change in pull request #10081:
URL: https://github.com/apache/airflow/pull/10081#discussion_r466846158



##
File path: airflow/cli/cli_parser.py
##
@@ -1104,6 +1113,12 @@ class GroupCommand(NamedTuple):
 
             func=lazy_load_command('airflow.cli.commands.connection_command.connections_delete'),
             args=(ARG_CONN_ID,),
         ),
+        ActionCommand(
+            name='export',
+            help='Export all connections',
+            func=lazy_load_command('airflow.cli.commands.connection_command.connections_export'),

Review comment:
   Can you add examples of calls for this command in the command description (the `description` parameter, not the `help` parameter)?





This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] mik-laj commented on a change in pull request #10081: Add CLI for connections export (#9856)

2020-08-06 Thread GitBox


mik-laj commented on a change in pull request #10081:
URL: https://github.com/apache/airflow/pull/10081#discussion_r466844596



##
File path: docs/howto/connection/index.rst
##
@@ -88,6 +88,93 @@ Alternatively you may specify each parameter individually:
 --conn-schema 'schema' \
 ...
 
+.. _connection/export:
+
+Exporting Connections from the CLI
+----------------------------------
+
+You may export connections from the database using the CLI. The supported 
formats are ``json``, ``yaml`` and ``env``.
+
+You may specify the target file as a parameter:
+
+.. code-block:: bash
+
+airflow connections export connections.json
+
+Alternatively you may specify the ``--format`` parameter to override the file format:
+
+.. code-block:: bash
+
+airflow connections export /tmp/connections --format yaml
+
+You may also specify ``-`` for STDOUT:
+
+.. code-block:: bash
+
+airflow connections export -
+
+The JSON format contains an object where the key is the connection ID and the value is the definition of the connection. In this format, the connection is defined as a JSON object. The following is a sample JSON file.
+
+.. code-block:: json
+
+{
+"CONN_A": {
+"conn_type": "mysql",
+"host": "mysql",
+"login": "root",
+"password": "plainpassword",
+"schema": "airflow",
+"port": null,
+"extra": null,
+"is_encrypted": false,
+"is_extra_encrypted": false
+},
+"CONN_B": {
+"conn_type": "druid",
+"host": "druid-broker",
+"login": null,
+"password": null,
+"schema": null,
+"port": 8082,
+"extra": "{\"endpoint\": \"druid/v2/sql\"}",
+"is_encrypted": false,
+"is_extra_encrypted": false
+}
+}
+
+The YAML file structure is similar to that of JSON: each key-value pair maps a connection ID to the definition of a connection. In this format, the connection is defined as a YAML object. The following is a sample YAML file.
+
+.. code-block:: yaml
+
+CONN_A:
+  conn_type: mysql
+  extra:
+  host: mysql
+  is_encrypted: false
+  is_extra_encrypted: false
+  login: root
+  password: plainpassword
+  port:
+  schema: airflow
+
+CONN_B:
+  conn_type: druid
+  extra: '{"endpoint": "druid/v2/sql"}'
+  host: druid-broker
+  is_encrypted: false
+  is_extra_encrypted: false
+  login:
+  password:
+  port: 8082
+  schema:
+
+You may also export connections in ``.env`` format. The key is the connection 
ID, and the value describes the connection using the URI. If the connection ID 
is repeated, all values will be returned. The following is a sample ENV file.

Review comment:
   Connection ID is unique
   
https://github.com/apache/airflow/blob/master/UPDATING.md#unique-conn_id-in-connection-table





This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] rootcss commented on a change in pull request #10081: Add CLI for connections export (#9856)

2020-08-06 Thread GitBox


rootcss commented on a change in pull request #10081:
URL: https://github.com/apache/airflow/pull/10081#discussion_r466843622



##
File path: docs/howto/connection/index.rst
##
@@ -88,6 +88,31 @@ Alternatively you may specify each parameter individually:
 --conn-schema 'schema' \
 ...
 
+.. _connection/export:
+
+Exporting Connections from the CLI
+----------------------------------
+
+You may export connections from the database using the CLI. The supported 
formats are ``json``, ``yaml`` and ``env``.

Review comment:
   @mik-laj Please have a look now. I've added the examples.





This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[jira] [Commented] (AIRFLOW-5071) Thousand os Executor reports task instance X finished (success) although the task says its queued. Was the task killed externally?

2020-08-06 Thread Szymon Grzemski (Jira)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-5071?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17172919#comment-17172919
 ] 

Szymon Grzemski commented on AIRFLOW-5071:
--

I think I might have an idea how to replicate it. I'll post it here once I have 
some outcomes.

> Thousand os Executor reports task instance X finished (success) although the 
> task says its queued. Was the task killed externally?
> --
>
> Key: AIRFLOW-5071
> URL: https://issues.apache.org/jira/browse/AIRFLOW-5071
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: DAG, scheduler
>Affects Versions: 1.10.3
>Reporter: msempere
>Priority: Critical
> Fix For: 1.10.12
>
> Attachments: image-2020-01-27-18-10-29-124.png, 
> image-2020-07-08-07-58-42-972.png
>
>
> I'm opening this issue because since I update to 1.10.3 I'm seeing thousands 
> of daily messages like the following in the logs:
>  
> ```
>  {{__init__.py:1580}} ERROR - Executor reports task instance  2019-07-29 00:00:00+00:00 [queued]> finished (success) although the task says 
> its queued. Was the task killed externally?
> {{jobs.py:1484}} ERROR - Executor reports task instance  2019-07-29 00:00:00+00:00 [queued]> finished (success) although the task says 
> its queued. Was the task killed externally?
> ```
> -And looks like this is triggering also thousand of daily emails because the 
> flag to send email in case of failure is set to True.-
> I have Airflow setup to use Celery and Redis as a backend queue service.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[GitHub] [airflow] turbaszek commented on a change in pull request #10084: Fix more PodMutationHook issues for backwards compatibility

2020-08-06 Thread GitBox


turbaszek commented on a change in pull request #10084:
URL: https://github.com/apache/airflow/pull/10084#discussion_r466839460



##
File path: airflow/kubernetes/pod_generator.py
##
@@ -36,6 +36,9 @@
 import kubernetes.client.models as k8s
 import yaml
 from kubernetes.client.api_client import ApiClient
+from ..contrib.kubernetes.pod import (

Review comment:
   Hm, I checked out this code and in both cases, `..` and `airflow.contrib`, I see the warning `Access to a protected member _extract_volume_mounts of a module`. So I don't think this is a problem. If there's any particular reason to have such an import, I would be happy to have a comment on why we do it this way.





This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] turbaszek commented on a change in pull request #10084: Fix more PodMutationHook issues for backwards compatibility

2020-08-06 Thread GitBox


turbaszek commented on a change in pull request #10084:
URL: https://github.com/apache/airflow/pull/10084#discussion_r466839460



##
File path: airflow/kubernetes/pod_generator.py
##
@@ -36,6 +36,9 @@
 import kubernetes.client.models as k8s
 import yaml
 from kubernetes.client.api_client import ApiClient
+from ..contrib.kubernetes.pod import (

Review comment:
   Hm, I checked out this code and in both cases, `..` and `airflow.contrib`, I see the warning `Access to a protected member _extract_volume_mounts of a module`. So I don't think this is a problem.





This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] dimberman commented on a change in pull request #10084: Fix more PodMutationHook issues for backwards compatibility

2020-08-06 Thread GitBox


dimberman commented on a change in pull request #10084:
URL: https://github.com/apache/airflow/pull/10084#discussion_r466837581



##
File path: airflow/kubernetes/pod_generator.py
##
@@ -36,6 +36,9 @@
 import kubernetes.client.models as k8s
 import yaml
 from kubernetes.client.api_client import ApiClient
+from ..contrib.kubernetes.pod import (

Review comment:
   @turbaszek it seems you need to use relative paths if you want to use 
private functions





This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] mik-laj opened a new pull request #10214: Add airflow connections get command

2020-08-06 Thread GitBox


mik-laj opened a new pull request #10214:
URL: https://github.com/apache/airflow/pull/10214


   
   
   ---
   **^ Add meaningful description above**
   
   Read the **[Pull Request 
Guidelines](https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst#pull-request-guidelines)**
 for more information.
   In case of fundamental code change, Airflow Improvement Proposal 
([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals))
 is needed.
   In case of a new dependency, check compliance with the [ASF 3rd Party 
License Policy](https://www.apache.org/legal/resolved.html#category-x).
   In case of backwards incompatible changes please leave a note in 
[UPDATING.md](https://github.com/apache/airflow/blob/master/UPDATING.md).
   



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] turbaszek commented on a change in pull request #10084: Fix more PodMutationHook issues for backwards compatibility

2020-08-06 Thread GitBox


turbaszek commented on a change in pull request #10084:
URL: https://github.com/apache/airflow/pull/10084#discussion_r466834542



##
File path: airflow/kubernetes/pod_generator.py
##
@@ -36,6 +36,9 @@
 import kubernetes.client.models as k8s
 import yaml
 from kubernetes.client.api_client import ApiClient
+from ..contrib.kubernetes.pod import (

Review comment:
   Why not use full path?





This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow-site] turbaszek commented on issue #275: The buttons for the Use Cases feel reversed

2020-08-06 Thread GitBox


turbaszek commented on issue #275:
URL: https://github.com/apache/airflow-site/issues/275#issuecomment-670336486


   There are two buttons after the use case: `previous` and `next`. I think it's quite intuitive that those refer to navigation between the other use cases. Happy to hear other opinions; if needed, we can ask the UX designer behind this design :) 



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[jira] [Comment Edited] (AIRFLOW-5071) Thousand os Executor reports task instance X finished (success) although the task says its queued. Was the task killed externally?

2020-08-06 Thread Tomasz Urbaszek (Jira)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-5071?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17172902#comment-17172902
 ] 

Tomasz Urbaszek edited comment on AIRFLOW-5071 at 8/7/20, 5:37 AM:
---

[~antontimenko] I'm running 100 DAGs of the following type and no failure 

class TestSensor(BaseSensorOperator):
    def __init__(self, **kwargs):
        super().__init__(**kwargs)
        self.mode = "RESCHEDULE"

    def poke(self, context):
        return choice([True, False, False])

args = {"owner": "airflow", "start_date": days_ago(1)}

with DAG(
    dag_id="%s",
    is_paused_upon_creation=False,
    max_active_runs=1,
    default_args=args,
    schedule_interval="* * * * *",
) as dag:
    start = TestSensor(task_id="start")
    end = TestSensor(task_id="end")
    for i in range(100):
        next = TestSensor(task_id=f"next_{i}")
        start >> next >> end

 

I'm using Airflow 1.10.6 on Composer


was (Author: turbaszek):
[~antontimenko] I'm running 100 DAGs of the following type:

class TestSensor(BaseSensorOperator):
    def __init__(self, **kwargs):
        super().__init__(**kwargs)
        self.mode = "RESCHEDULE"

    def poke(self, context):
        return choice([True, False, False])

args = {"owner": "airflow", "start_date": days_ago(1)}

with DAG(
    dag_id="%s",
    is_paused_upon_creation=False,
    max_active_runs=1,
    default_args=args,
    schedule_interval="* * * * *",
) as dag:
    start = TestSensor(task_id="start")
    end = TestSensor(task_id="end")
    for i in range(100):
        next = TestSensor(task_id=f"next_{i}")
        start >> next >> end

> Thousand os Executor reports task instance X finished (success) although the 
> task says its queued. Was the task killed externally?
> --
>
> Key: AIRFLOW-5071
> URL: https://issues.apache.org/jira/browse/AIRFLOW-5071
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: DAG, scheduler
>Affects Versions: 1.10.3
>Reporter: msempere
>Priority: Critical
> Fix For: 1.10.12
>
> Attachments: image-2020-01-27-18-10-29-124.png, 
> image-2020-07-08-07-58-42-972.png
>
>
> I'm opening this issue because since I update to 1.10.3 I'm seeing thousands 
> of daily messages like the following in the logs:
>  
> ```
>  {{__init__.py:1580}} ERROR - Executor reports task instance  2019-07-29 00:00:00+00:00 [queued]> finished (success) although the task says 
> its queued. Was the task killed externally?
> {{jobs.py:1484}} ERROR - Executor reports task instance  2019-07-29 00:00:00+00:00 [queued]> finished (success) although the task says 
> its queued. Was the task killed externally?
> ```
> -And looks like this is triggering also thousand of daily emails because the 
> flag to send email in case of failure is set to True.-
> I have Airflow setup to use Celery and Redis as a backend queue service.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Commented] (AIRFLOW-5071) Thousand os Executor reports task instance X finished (success) although the task says its queued. Was the task killed externally?

2020-08-06 Thread Tomasz Urbaszek (Jira)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-5071?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17172902#comment-17172902
 ] 

Tomasz Urbaszek commented on AIRFLOW-5071:
--

[~antontimenko] I'm running 100 DAGs of the following type:

class TestSensor(BaseSensorOperator):
    def __init__(self, **kwargs):
        super().__init__(**kwargs)
        self.mode = "RESCHEDULE"

    def poke(self, context):
        return choice([True, False, False])

args = {"owner": "airflow", "start_date": days_ago(1)}

with DAG(
    dag_id="%s",
    is_paused_upon_creation=False,
    max_active_runs=1,
    default_args=args,
    schedule_interval="* * * * *",
) as dag:
    start = TestSensor(task_id="start")
    end = TestSensor(task_id="end")
    for i in range(100):
        next = TestSensor(task_id=f"next_{i}")
        start >> next >> end

> Thousand os Executor reports task instance X finished (success) although the 
> task says its queued. Was the task killed externally?
> --
>
> Key: AIRFLOW-5071
> URL: https://issues.apache.org/jira/browse/AIRFLOW-5071
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: DAG, scheduler
>Affects Versions: 1.10.3
>Reporter: msempere
>Priority: Critical
> Fix For: 1.10.12
>
> Attachments: image-2020-01-27-18-10-29-124.png, 
> image-2020-07-08-07-58-42-972.png
>
>
> I'm opening this issue because since I update to 1.10.3 I'm seeing thousands 
> of daily messages like the following in the logs:
>  
> ```
>  {{__init__.py:1580}} ERROR - Executor reports task instance  2019-07-29 00:00:00+00:00 [queued]> finished (success) although the task says 
> its queued. Was the task killed externally?
> {{jobs.py:1484}} ERROR - Executor reports task instance  2019-07-29 00:00:00+00:00 [queued]> finished (success) although the task says 
> its queued. Was the task killed externally?
> ```
> -And looks like this is triggering also thousand of daily emails because the 
> flag to send email in case of failure is set to True.-
> I have Airflow setup to use Celery and Redis as a backend queue service.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[GitHub] [airflow] TreeKat71 opened a new pull request #10213: Fix chart: parameterize namespace

2020-08-06 Thread GitBox


TreeKat71 opened a new pull request #10213:
URL: https://github.com/apache/airflow/pull/10213


   Replace fixed namespace "airflow" with variable {{ .Release.Namespace }}
   
   
   
   ---
   **^ Add meaningful description above**
   
   Read the **[Pull Request 
Guidelines](https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst#pull-request-guidelines)**
 for more information.
   In case of fundamental code change, Airflow Improvement Proposal 
([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals))
 is needed.
   In case of a new dependency, check compliance with the [ASF 3rd Party 
License Policy](https://www.apache.org/legal/resolved.html#category-x).
   In case of backwards incompatible changes please leave a note in 
[UPDATING.md](https://github.com/apache/airflow/blob/master/UPDATING.md).
   



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] boring-cyborg[bot] commented on pull request #10213: Fix chart: parameterize namespace

2020-08-06 Thread GitBox


boring-cyborg[bot] commented on pull request #10213:
URL: https://github.com/apache/airflow/pull/10213#issuecomment-670334642


   Congratulations on your first Pull Request and welcome to the Apache Airflow 
community! If you have any issues or are unsure about anything, please check 
our Contribution Guide 
(https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst)
   Here are some useful points:
   - Pay attention to the quality of your code (flake8, pylint and type 
annotations). Our [pre-commits]( 
https://github.com/apache/airflow/blob/master/STATIC_CODE_CHECKS.rst#prerequisites-for-pre-commit-hooks)
 will help you with that.
   - In case of a new feature add useful documentation (in docstrings or in 
`docs/` directory). Adding a new operator? Check this short 
[guide](https://github.com/apache/airflow/blob/master/docs/howto/custom-operator.rst).
 Consider adding an example DAG that shows how users should use it.
   - Consider using [Breeze 
environment](https://github.com/apache/airflow/blob/master/BREEZE.rst) for 
testing locally, it’s a heavy docker but it ships with a working Airflow and a 
lot of integrations.
   - Be patient and persistent. It might take some time to get a review or get 
the final approval from Committers.
   - Please follow [ASF Code of 
Conduct](https://www.apache.org/foundation/policies/conduct) for all 
communication including (but not limited to) comments on Pull Requests, Mailing 
list and Slack.
   - Be sure to read the [Airflow Coding style]( 
https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst#coding-style-and-best-practices).
   Apache Airflow is a community-driven project and together we are making it 
better 🚀.
   In case of doubts contact the developers at:
   Mailing List: d...@airflow.apache.org
   Slack: https://apache-airflow-slack.herokuapp.com/
   



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] turbaszek commented on a change in pull request #10207: Pylint checks should be way faster now

2020-08-06 Thread GitBox


turbaszek commented on a change in pull request #10207:
URL: https://github.com/apache/airflow/pull/10207#discussion_r466826926



##
File path: tests/airflow_pylint/disable_checks_for_tests.py
##
@@ -0,0 +1,60 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+
+from astroid import MANAGER, scoped_nodes
+from pylint.lint import PyLinter
+
+DISABLED_CHECKS_FOR_TESTS = \
+    "missing-docstring, no-self-use, too-many-public-methods, protected-access, do-not-use-asserts"
+
+
+def register(_: PyLinter):
+    """
+    Skip registering any plugin. This is not a real plugin - we only need it to register transform before
+    running pylint.
+
+    :param _:
+    :return:
+    """
+
+
+def transform(mod):
+    """
+    It's a small hack but one that gives us a lot of speedup in pylint tests. We are replacing the first
+    line of the file with pylint-disable (or update existing one) when file name start with `test_` or
+    (for providers) when it is the full path of the package (both cases occur in pylint)
+
+    :param mod: astroid module
+    :return: None
+    """
+    if mod.name.startswith("test_") or \
+            mod.name.startswith("tests.") or \
+            mod.name.startswith("kubernetes_tests."):
+        decoded_lines = mod.stream().read().decode("utf-8").split("\n")
+        if decoded_lines[0].startswith("# pylint: disable="):
+            decoded_lines[0] = decoded_lines[0] + " " + DISABLED_CHECKS_FOR_TESTS
+        elif decoded_lines[0].startswith("#") or decoded_lines[0].strip() == "":
+            decoded_lines[0] = "# pylint: disable=" + DISABLED_CHECKS_FOR_TESTS
+        else:
+            raise Exception(f"The first line of module {mod.name} is not a comment or empty. "
+                            f"Please make sure it is!")
+        # pylint will read from `.file_bytes` attribute later when tokenization
+        mod.file_bytes = "\n".join(decoded_lines).encode("utf-8")
+
+
+MANAGER.register_transform(scoped_nodes.Module, transform)

Review comment:
   I'm just wondering if we can first run pylint for the main sources and then just add additional disables to `pylintrc`? Not sure if this would be simpler.





This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] turbaszek commented on a change in pull request #10207: Pylint checks should be way faster now

2020-08-06 Thread GitBox


turbaszek commented on a change in pull request #10207:
URL: https://github.com/apache/airflow/pull/10207#discussion_r466826723



##
File path: tests/dags/test_logging_in_dag.py
##
@@ -25,6 +25,11 @@
 
 
 def test_logging_fn(**kwargs):
+"""
+Tests DAG logging.
+:param kwargs:
+:return:
+"""

Review comment:
   Is this related?





This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] turbaszek commented on a change in pull request #10209: You can sync your fork master with apache/airflow master via UI

2020-08-06 Thread GitBox


turbaszek commented on a change in pull request #10209:
URL: https://github.com/apache/airflow/pull/10209#discussion_r466825085



##
File path: CONTRIBUTING.rst
##
@@ -862,6 +865,19 @@ commands:
 # Check JS code in .js and .html files, report any errors/warnings and fix 
them if possible
 yarn run lint:fix
 
+How to sync your fork
+=====================
+
+When you have your fork, you should periodically synchronize the master of 
your fork with the
+Apache Airflow master. In order to do that you can ``git pull`` to your local 
git repository from

Review comment:
   ```suggestion
   Apache Airflow master. In order to do that you can ``git pull --rebase`` to 
your local git repository from
   ```
   Should we explicitly recommend fetch and rebase? Just in case someone reads this and skips the rest of the information.
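   For instance, a minimal sketch of such an explicit fetch-and-rebase flow (the remote names `apache` and `origin` are assumptions about the local setup):
   
   ```bash
   # fetch the upstream apache/airflow repository and rebase the fork's master on top of it
   git fetch apache
   git checkout master
   git rebase apache/master
   # update the fork on GitHub; --force-with-lease because rebase rewrites history
   git push --force-with-lease origin master
   ```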
   





This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] mik-laj opened a new pull request #10212: [WIP] Disable sentry integration by default

2020-08-06 Thread GitBox


mik-laj opened a new pull request #10212:
URL: https://github.com/apache/airflow/pull/10212


   Hello,
   
   Sentry imports a few libraries even when it is not in use, merely by being installed. Users may not even be aware of this.
   
   Best regards,
   Kamil Breguła
   
   ---
   **^ Add meaningful description above**
   
   Read the **[Pull Request 
Guidelines](https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst#pull-request-guidelines)**
 for more information.
   In case of fundamental code change, Airflow Improvement Proposal 
([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals))
 is needed.
   In case of a new dependency, check compliance with the [ASF 3rd Party 
License Policy](https://www.apache.org/legal/resolved.html#category-x).
   In case of backwards incompatible changes please leave a note in 
[UPDATING.md](https://github.com/apache/airflow/blob/master/UPDATING.md).
   



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] codecov-commenter commented on pull request #10211: Each secrets backend is on a separate page

2020-08-06 Thread GitBox


codecov-commenter commented on pull request #10211:
URL: https://github.com/apache/airflow/pull/10211#issuecomment-670298773


   # [Codecov](https://codecov.io/gh/apache/airflow/pull/10211?src=pr&el=h1) 
Report
   > Merging 
[#10211](https://codecov.io/gh/apache/airflow/pull/10211?src=pr&el=desc) into 
[master](https://codecov.io/gh/apache/airflow/commit/cdec3012542b45d23a05f62d69110944ba542e2a&el=desc)
 will **decrease** coverage by `54.32%`.
   > The diff coverage is `n/a`.
   
   [![Impacted file tree 
graph](https://codecov.io/gh/apache/airflow/pull/10211/graphs/tree.svg?width=650&height=150&src=pr&token=WdLKlKHOAU)](https://codecov.io/gh/apache/airflow/pull/10211?src=pr&el=tree)
   
   ```diff
   @@            Coverage Diff             @@
   ##           master   #10211       +/-   ##
   ============================================
   - Coverage   89.41%   35.08%   -54.33%
   ============================================
     Files        1037     1037
     Lines       50013    50013
   ============================================
   - Hits        44720    17549   -27171
   - Misses       5293    32464   +27171
   ```
   
   | Flag | Coverage Δ | |
   |---|---|---|
   | #kubernetes-tests-3.6-9.6 | `?` | |
   | #kubernetes-tests-image-3.6-v1.16.9 | `?` | |
   | #kubernetes-tests-image-3.6-v1.17.5 | `?` | |
   | #kubernetes-tests-image-3.6-v1.18.6 | `?` | |
   | #kubernetes-tests-image-3.7-v1.16.9 | `?` | |
   | #kubernetes-tests-image-3.7-v1.17.5 | `?` | |
   | #kubernetes-tests-image-3.7-v1.18.6 | `?` | |
   | #mysql-tests-Core-3.7-5.7 | `?` | |
   | #mysql-tests-Core-3.8-5.7 | `?` | |
   | #mysql-tests-Integration-3.7-5.7 | `?` | |
   | #mysql-tests-Integration-3.8-5.7 | `?` | |
   | #postgres-tests-Core-3.6-10 | `?` | |
   | #postgres-tests-Core-3.6-9.6 | `?` | |
   | #postgres-tests-Core-3.7-10 | `?` | |
   | #postgres-tests-Core-3.7-9.6 | `?` | |
   | #postgres-tests-Integration-3.6-10 | `34.73% <ø> (ø)` | |
   | #postgres-tests-Integration-3.6-9.6 | `34.73% <ø> (ø)` | |
   | #postgres-tests-Integration-3.7-10 | `34.73% <ø> (ø)` | |
   | #postgres-tests-Integration-3.7-9.6 | `34.73% <ø> (ø)` | |
   | #sqlite-tests-Core-3.6 | `?` | |
   | #sqlite-tests-Core-3.8 | `?` | |
   | #sqlite-tests-Integration-3.6 | `34.18% <ø> (ø)` | |
   | #sqlite-tests-Integration-3.8 | `34.44% <ø> (ø)` | |
   
   Flags with carried forward coverage won't be shown. [Click 
here](https://docs.codecov.io/docs/carryforward-flags#carryforward-flags-in-the-pull-request-comment)
 to find out more.
   
   | [Impacted 
Files](https://codecov.io/gh/apache/airflow/pull/10211?src=pr&el=tree) | 
Coverage Δ | |
   |---|---|---|
   | 
[airflow/hooks/S3\_hook.py](https://codecov.io/gh/apache/airflow/pull/10211/diff?src=pr&el=tree#diff-YWlyZmxvdy9ob29rcy9TM19ob29rLnB5)
 | `0.00% <0.00%> (-100.00%)` | :arrow_down: |
   | 
[airflow/hooks/pig\_hook.py](https://codecov.io/gh/apache/airflow/pull/10211/diff?src=pr&el=tree#diff-YWlyZmxvdy9ob29rcy9waWdfaG9vay5weQ==)
 | `0.00% <0.00%> (-100.00%)` | :arrow_down: |
   | 
[airflow/hooks/hdfs\_hook.py](https://codecov.io/gh/apache/airflow/pull/10211/diff?src=pr&el=tree#diff-YWlyZmxvdy9ob29rcy9oZGZzX2hvb2sucHk=)
 | `0.00% <0.00%> (-100.00%)` | :arrow_down: |
   | 
[airflow/hooks/http\_hook.py](https://codecov.io/gh/apache/airflow/pull/10211/diff?src=pr&el=tree#diff-YWlyZmxvdy9ob29rcy9odHRwX2hvb2sucHk=)
 | `0.00% <0.00%> (-100.00%)` | :arrow_down: |
   | 
[airflow/hooks/jdbc\_hook.py](https://codecov.io/gh/apache/airflow/pull/10211/diff?src=pr&el=tree#diff-YWlyZmxvdy9ob29rcy9qZGJjX2hvb2sucHk=)
 | `0.00% <0.00%> (-100.00%)` | :arrow_down: |
   | 
[airflow/contrib/\_\_init\_\_.py](https://codecov.io/gh/apache/airflow/pull/10211/diff?src=pr&el=tree#diff-YWlyZmxvdy9jb250cmliL19faW5pdF9fLnB5)
 | `0.00% <0.00%> (-100.00%)` | :arrow_down: |
   | 
[airflow/hooks/druid\_hook.py](https://codecov.io/gh/apache/airflow/pull/10211/diff?src=pr&el=tree#diff-YWlyZmxvdy9ob29rcy9kcnVpZF9ob29rLnB5)
 | `0.00% <0.00%> (-100.00%)` | :arrow_down: |
   | 
[airflow/hooks/hive\_hooks.py](https://codecov.io/gh/apache/airflow/pull/10211/diff?src=pr&el=tree#diff-YWlyZmxvdy9ob29rcy9oaXZlX2hvb2tzLnB5)
 | `0.00% <0.00%> (-100.00%)` | :arrow_down: |
   | 
[airflow/hooks/mssql\_hook.py](https://codecov.io/gh/apache/airflow/pull/10211/diff?src=pr&el=tree#diff-YWlyZmxvdy9ob29rcy9tc3NxbF9ob29rLnB5)
 | `0.00% <0.00%> (-100.00%)` | :arrow_down: |
   | 
[airflow/hooks/mysql\_hook.py](https://codecov.io/gh/apache/airflow/pull/10211/diff?src=pr&el=tree#diff-YWlyZmxvdy9ob29rcy9teXNxbF9ob29rLnB5)
 | `0.00% <0.00%> (-100.00%)` | :arrow_down: |
   | ... and [907 
more](https://codecov.io/gh/apache/airflow/pull/10211/diff?src=pr&el=tree-more) 
| |
   
   --
   
   [Continue to review full report at 
Codecov](https://codecov.io/gh/apache/airflow/pull/10211?src=pr&el=continue).
   > **Legend** - [Click here to learn 
more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute  (impact)`, `ø 

[GitHub] [airflow] mik-laj opened a new pull request #10211: Each secrets backend is on a separate page

2020-08-06 Thread GitBox


mik-laj opened a new pull request #10211:
URL: https://github.com/apache/airflow/pull/10211


   
   
   ---
   **^ Add meaningful description above**
   
   Read the **[Pull Request 
Guidelines](https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst#pull-request-guidelines)**
 for more information.
   In case of fundamental code change, Airflow Improvement Proposal 
([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals))
 is needed.
   In case of a new dependency, check compliance with the [ASF 3rd Party 
License Policy](https://www.apache.org/legal/resolved.html#category-x).
   In case of backwards incompatible changes please leave a note in 
[UPDATING.md](https://github.com/apache/airflow/blob/master/UPDATING.md).
   



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] mik-laj commented on issue #9708: Increase typing coverage

2020-08-06 Thread GitBox


mik-laj commented on issue #9708:
URL: https://github.com/apache/airflow/issues/9708#issuecomment-670275918


   In the first message, I included a command that allows you to check where type annotations are still missing.
   More info: 
https://mypy.readthedocs.io/en/stable/command_line.html?highlight=Report#report-generation
   
   When I am working on typing coverage I often use the HTML report.
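   
   For example, a minimal sketch of generating that HTML report (assuming mypy is installed together with `lxml`, which its report generation needs):
   
   ```bash
   # write an HTML typing-coverage report into the mypy-report/ directory
   mypy --html-report mypy-report airflow/
   # then open mypy-report/index.html to see per-file annotation coverage
   ```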



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] morrme commented on issue #10182: Move relevant information about First time contributor workshop as Airflow Blog

2020-08-06 Thread GitBox


morrme commented on issue #10182:
URL: https://github.com/apache/airflow/issues/10182#issuecomment-670274992


   @potiuk Should these be new articles? 



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] morrme commented on issue #9708: Increase typing coverage

2020-08-06 Thread GitBox


morrme commented on issue #9708:
URL: https://github.com/apache/airflow/issues/9708#issuecomment-670274611


   @mik-laj 👋🏾  is there an update on what files remain? 



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] yuqian90 commented on issue #8696: Skip task itself instead of all downstream tasks

2020-08-06 Thread GitBox


yuqian90 commented on issue #8696:
URL: https://github.com/apache/airflow/issues/8696#issuecomment-670249354


   Great thank you so much @j-y-matsubara 



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] boring-cyborg[bot] commented on issue #10210: SQSSensor Dag is not triggering whenever there is new message.

2020-08-06 Thread GitBox


boring-cyborg[bot] commented on issue #10210:
URL: https://github.com/apache/airflow/issues/10210#issuecomment-670248194


   Thanks for opening your first issue here! Be sure to follow the issue 
template!
   



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] prakash260 opened a new issue #10210: SQSSensor Dag is not triggering whenever there is new message.

2020-08-06 Thread GitBox


prakash260 opened a new issue #10210:
URL: https://github.com/apache/airflow/issues/10210


   I have created a DAG using SQSSensor to trigger whenever there is a new message in SQS.
   
   It is not triggering automatically whenever there is a new message.
   
   Below is my code:
   
   from __future__ import print_function
   from airflow import DAG
   from airflow.utils.dates import days_ago
   from airflow.operators.python_operator import PythonOperator
   from airflow.contrib.sensors.aws_sqs_sensor import SQSSensor

   default_args = {
       'owner': 'Airflow',
       'start_date': days_ago(1),
       'provide_context': True,
   }

   def pull_from_xcom(**context):
       val = context['ti'].xcom_pull(task_ids='sqs_get', key='messages')
       print(val)

   dag = DAG('sqs_test', default_args=default_args, schedule_interval='@daily')

   t1 = SQSSensor(
       dag=dag,
       task_id='sqs_get',
       sqs_queue='https://sqs.ap-southeast-2.amazonaws.com/accountid/test',
       aws_conn_id='aws_default',
       max_message=1,
       wait_time_seconds=1,
   )

   t2 = PythonOperator(
       task_id='xcom_pull',
       python_callable=pull_from_xcom,
       depends_on_past=False,
       dag=dag,
   )

   t1 >> t2
   
   
   



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] codecov-commenter commented on pull request #10207: Pylint checks should be way faster now

2020-08-06 Thread GitBox


codecov-commenter commented on pull request #10207:
URL: https://github.com/apache/airflow/pull/10207#issuecomment-670236880


   # [Codecov](https://codecov.io/gh/apache/airflow/pull/10207?src=pr&el=h1) 
Report
   > Merging 
[#10207](https://codecov.io/gh/apache/airflow/pull/10207?src=pr&el=desc) into 
[master](https://codecov.io/gh/apache/airflow/commit/d79e7221de76f01b5cd36c15224b59e8bb451c90&el=desc)
 will **decrease** coverage by `49.50%`.
   > The diff coverage is `n/a`.
   
   [![Impacted file tree 
graph](https://codecov.io/gh/apache/airflow/pull/10207/graphs/tree.svg?width=650&height=150&src=pr&token=WdLKlKHOAU)](https://codecov.io/gh/apache/airflow/pull/10207?src=pr&el=tree)
   
   ```diff
   @@            Coverage Diff             @@
   ##           master   #10207       +/-   ##
   ============================================
   - Coverage   89.41%   39.91%   -49.51%
   ============================================
     Files        1037     1037
     Lines       50011    49724     -287
   ============================================
   - Hits        44717    19846   -24871
   - Misses       5294    29878   +24584
   ```
   
   | Flag | Coverage Δ | |
   |---|---|---|
   | #kubernetes-tests-3.6-9.6 | `39.91% <ø> (+<0.01%)` | :arrow_up: |
   | #kubernetes-tests-image-3.6-v1.16.9 | `?` | |
   | #kubernetes-tests-image-3.6-v1.17.5 | `?` | |
   | #kubernetes-tests-image-3.6-v1.18.6 | `?` | |
   | #kubernetes-tests-image-3.7-v1.16.9 | `?` | |
   | #kubernetes-tests-image-3.7-v1.17.5 | `?` | |
   | #kubernetes-tests-image-3.7-v1.18.6 | `?` | |
   | #mysql-tests-Core-3.7-5.7 | `?` | |
   | #mysql-tests-Core-3.8-5.7 | `?` | |
   | #mysql-tests-Integration-3.7-5.7 | `?` | |
   | #mysql-tests-Integration-3.8-5.7 | `?` | |
   | #postgres-tests-Core-3.6-10 | `?` | |
   | #postgres-tests-Core-3.6-9.6 | `?` | |
   | #postgres-tests-Core-3.7-10 | `?` | |
   | #postgres-tests-Core-3.7-9.6 | `?` | |
   | #postgres-tests-Integration-3.6-10 | `?` | |
   | #postgres-tests-Integration-3.6-9.6 | `?` | |
   | #postgres-tests-Integration-3.7-10 | `?` | |
   | #postgres-tests-Integration-3.7-9.6 | `?` | |
   | #sqlite-tests-Core-3.6 | `?` | |
   | #sqlite-tests-Core-3.8 | `?` | |
   | #sqlite-tests-Integration-3.6 | `?` | |
   | #sqlite-tests-Integration-3.8 | `?` | |
   
   Flags with carried forward coverage won't be shown. [Click 
here](https://docs.codecov.io/docs/carryforward-flags#carryforward-flags-in-the-pull-request-comment)
 to find out more.
   
   | [Impacted 
Files](https://codecov.io/gh/apache/airflow/pull/10207?src=pr&el=tree) | 
Coverage Δ | |
   |---|---|---|
   | 
[airflow/hooks/S3\_hook.py](https://codecov.io/gh/apache/airflow/pull/10207/diff?src=pr&el=tree#diff-YWlyZmxvdy9ob29rcy9TM19ob29rLnB5)
 | `0.00% <0.00%> (-100.00%)` | :arrow_down: |
   | 
[airflow/hooks/pig\_hook.py](https://codecov.io/gh/apache/airflow/pull/10207/diff?src=pr&el=tree#diff-YWlyZmxvdy9ob29rcy9waWdfaG9vay5weQ==)
 | `0.00% <0.00%> (-100.00%)` | :arrow_down: |
   | 
[airflow/hooks/hdfs\_hook.py](https://codecov.io/gh/apache/airflow/pull/10207/diff?src=pr&el=tree#diff-YWlyZmxvdy9ob29rcy9oZGZzX2hvb2sucHk=)
 | `0.00% <0.00%> (-100.00%)` | :arrow_down: |
   | 
[airflow/hooks/http\_hook.py](https://codecov.io/gh/apache/airflow/pull/10207/diff?src=pr&el=tree#diff-YWlyZmxvdy9ob29rcy9odHRwX2hvb2sucHk=)
 | `0.00% <0.00%> (-100.00%)` | :arrow_down: |
   | 
[airflow/hooks/jdbc\_hook.py](https://codecov.io/gh/apache/airflow/pull/10207/diff?src=pr&el=tree#diff-YWlyZmxvdy9ob29rcy9qZGJjX2hvb2sucHk=)
 | `0.00% <0.00%> (-100.00%)` | :arrow_down: |
   | 
[airflow/contrib/\_\_init\_\_.py](https://codecov.io/gh/apache/airflow/pull/10207/diff?src=pr&el=tree#diff-YWlyZmxvdy9jb250cmliL19faW5pdF9fLnB5)
 | `0.00% <0.00%> (-100.00%)` | :arrow_down: |
   | 
[airflow/hooks/druid\_hook.py](https://codecov.io/gh/apache/airflow/pull/10207/diff?src=pr&el=tree#diff-YWlyZmxvdy9ob29rcy9kcnVpZF9ob29rLnB5)
 | `0.00% <0.00%> (-100.00%)` | :arrow_down: |
   | 
[airflow/hooks/hive\_hooks.py](https://codecov.io/gh/apache/airflow/pull/10207/diff?src=pr&el=tree#diff-YWlyZmxvdy9ob29rcy9oaXZlX2hvb2tzLnB5)
 | `0.00% <0.00%> (-100.00%)` | :arrow_down: |
   | 
[airflow/hooks/mssql\_hook.py](https://codecov.io/gh/apache/airflow/pull/10207/diff?src=pr&el=tree#diff-YWlyZmxvdy9ob29rcy9tc3NxbF9ob29rLnB5)
 | `0.00% <0.00%> (-100.00%)` | :arrow_down: |
   | 
[airflow/hooks/mysql\_hook.py](https://codecov.io/gh/apache/airflow/pull/10207/diff?src=pr&el=tree#diff-YWlyZmxvdy9ob29rcy9teXNxbF9ob29rLnB5)
 | `0.00% <0.00%> (-100.00%)` | :arrow_down: |
   | ... and [911 
more](https://codecov.io/gh/apache/airflow/pull/10207/diff?src=pr&el=tree-more) 
| |
   
   --
   
   [Continue to review full report at 
Codecov](https://codecov.io/gh/apache/airflow/pull/10207?src=pr&el=continue).
   > **Legend** - [Click here to learn 
more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute  (impact)`, `ø = not affected`, `? = missing data`
   > Powered

[airflow] branch constraints-master updated: Updating constraints. GH run id:198356852

2020-08-06 Thread github-bot
This is an automated email from the ASF dual-hosted git repository.

github-bot pushed a commit to branch constraints-master
in repository https://gitbox.apache.org/repos/asf/airflow.git


The following commit(s) were added to refs/heads/constraints-master by this 
push:
 new 1d498ec  Updating constraints. GH run id:198356852
1d498ec is described below

commit 1d498ecc46d1d8d46583bcd4cec59d4b5115343b
Author: Automated Github Actions commit 
AuthorDate: Thu Aug 6 23:00:21 2020 +

Updating constraints. GH run id:198356852

This update in constraints is automatically committed by the CI 
'constraints-push' step based on
HEAD of 'refs/heads/master' in 'apache/airflow'
with commit sha cdec3012542b45d23a05f62d69110944ba542e2a.

All tests passed in this build so we determined we can push the updated 
constraints.

See 
https://github.com/apache/airflow/blob/master/README.md#installing-from-pypi 
for details.
---
 constraints-3.6.txt | 6 +++---
 constraints-3.7.txt | 6 +++---
 constraints-3.8.txt | 6 +++---
 3 files changed, 9 insertions(+), 9 deletions(-)

diff --git a/constraints-3.6.txt b/constraints-3.6.txt
index 54c3c56..2b95805 100644
--- a/constraints-3.6.txt
+++ b/constraints-3.6.txt
@@ -72,9 +72,9 @@ beautifulsoup4==4.7.1
 billiard==3.6.3.0
 black==19.10b0
 blinker==1.4
-boto3==1.14.36
+boto3==1.14.37
 boto==2.49.0
-botocore==1.17.36
+botocore==1.17.37
 bowler==0.8.0
 cached-property==1.5.1
 cachetools==4.1.1
@@ -170,7 +170,7 @@ google-cloud-translate==3.0.0
 google-cloud-videointelligence==1.15.0
 google-cloud-vision==1.0.0
 google-crc32c==0.1.0
-google-resumable-media==0.7.0
+google-resumable-media==0.7.1
 googleapis-common-protos==1.52.0
 graphviz==0.14.1
 greenlet==0.4.16
diff --git a/constraints-3.7.txt b/constraints-3.7.txt
index 3d21e1f..8a08e22 100644
--- a/constraints-3.7.txt
+++ b/constraints-3.7.txt
@@ -72,9 +72,9 @@ beautifulsoup4==4.7.1
 billiard==3.6.3.0
 black==19.10b0
 blinker==1.4
-boto3==1.14.36
+boto3==1.14.37
 boto==2.49.0
-botocore==1.17.36
+botocore==1.17.37
 bowler==0.8.0
 cached-property==1.5.1
 cachetools==4.1.1
@@ -168,7 +168,7 @@ google-cloud-translate==3.0.0
 google-cloud-videointelligence==1.15.0
 google-cloud-vision==1.0.0
 google-crc32c==0.1.0
-google-resumable-media==0.7.0
+google-resumable-media==0.7.1
 googleapis-common-protos==1.52.0
 graphviz==0.14.1
 greenlet==0.4.16
diff --git a/constraints-3.8.txt b/constraints-3.8.txt
index 661414a..70d5891 100644
--- a/constraints-3.8.txt
+++ b/constraints-3.8.txt
@@ -72,9 +72,9 @@ beautifulsoup4==4.7.1
 billiard==3.6.3.0
 black==19.10b0
 blinker==1.4
-boto3==1.14.36
+boto3==1.14.37
 boto==2.49.0
-botocore==1.17.36
+botocore==1.17.37
 bowler==0.8.0
 cached-property==1.5.1
 cachetools==4.1.1
@@ -168,7 +168,7 @@ google-cloud-translate==3.0.0
 google-cloud-videointelligence==1.15.0
 google-cloud-vision==1.0.0
 google-crc32c==0.1.0
-google-resumable-media==0.7.0
+google-resumable-media==0.7.1
 googleapis-common-protos==1.52.0
 graphviz==0.14.1
 greenlet==0.4.16



[GitHub] [airflow] pcandoalmeida commented on pull request #10162: Add Airflow UI site_title configuration option

2020-08-06 Thread GitBox


pcandoalmeida commented on pull request #10162:
URL: https://github.com/apache/airflow/pull/10162#issuecomment-670224950


   Hi @mik-laj I've got this working for a couple of test views (there are quite a few!). Would I need to add unit tests for every view? I would assume so, but wanted to ask so I can do it as I go along.



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] codecov-commenter edited a comment on pull request #10209: You can now trigger apache/airflow sync via Github Web UI

2020-08-06 Thread GitBox


codecov-commenter edited a comment on pull request #10209:
URL: https://github.com/apache/airflow/pull/10209#issuecomment-670219050


   # [Codecov](https://codecov.io/gh/apache/airflow/pull/10209?src=pr&el=h1) 
Report
   > Merging 
[#10209](https://codecov.io/gh/apache/airflow/pull/10209?src=pr&el=desc) into 
[master](https://codecov.io/gh/apache/airflow/commit/d79e7221de76f01b5cd36c15224b59e8bb451c90&el=desc)
 will **decrease** coverage by `54.30%`.
   > The diff coverage is `n/a`.
   
   [![Impacted file tree 
graph](https://codecov.io/gh/apache/airflow/pull/10209/graphs/tree.svg?width=650&height=150&src=pr&token=WdLKlKHOAU)](https://codecov.io/gh/apache/airflow/pull/10209?src=pr&el=tree)
   
   ```diff
   @@            Coverage Diff             @@
   ##           master   #10209       +/-   ##
   ============================================
   - Coverage   89.41%   35.11%   -54.31%
   ============================================
     Files        1037     1037
     Lines       50011    50011
   ============================================
   - Hits        44717    17560   -27157
   - Misses       5294    32451   +27157
   ```
   
   | Flag | Coverage Δ | |
   |---|---|---|
   | #kubernetes-tests-3.6-9.6 | `?` | |
   | #kubernetes-tests-image-3.6-v1.16.9 | `?` | |
   | #kubernetes-tests-image-3.6-v1.17.5 | `?` | |
   | #kubernetes-tests-image-3.6-v1.18.6 | `?` | |
   | #kubernetes-tests-image-3.7-v1.16.9 | `?` | |
   | #kubernetes-tests-image-3.7-v1.17.5 | `?` | |
   | #kubernetes-tests-image-3.7-v1.18.6 | `?` | |
   | #mysql-tests-Core-3.7-5.7 | `?` | |
   | #mysql-tests-Core-3.8-5.7 | `?` | |
   | #mysql-tests-Integration-3.7-5.7 | `34.75% <ø> (ø)` | |
   | #mysql-tests-Integration-3.8-5.7 | `?` | |
   | #postgres-tests-Core-3.6-10 | `?` | |
   | #postgres-tests-Core-3.6-9.6 | `?` | |
   | #postgres-tests-Core-3.7-10 | `?` | |
   | #postgres-tests-Core-3.7-9.6 | `?` | |
   | #postgres-tests-Integration-3.6-10 | `34.73% <ø> (ø)` | |
   | #postgres-tests-Integration-3.6-9.6 | `?` | |
   | #postgres-tests-Integration-3.7-10 | `34.73% <ø> (ø)` | |
   | #postgres-tests-Integration-3.7-9.6 | `34.73% <ø> (ø)` | |
   | #sqlite-tests-Core-3.6 | `?` | |
   | #sqlite-tests-Core-3.8 | `?` | |
   | #sqlite-tests-Integration-3.6 | `34.18% <ø> (ø)` | |
   | #sqlite-tests-Integration-3.8 | `34.44% <ø> (ø)` | |
   
   Flags with carried forward coverage won't be shown. [Click 
here](https://docs.codecov.io/docs/carryforward-flags#carryforward-flags-in-the-pull-request-comment)
 to find out more.
   
   | [Impacted 
Files](https://codecov.io/gh/apache/airflow/pull/10209?src=pr&el=tree) | 
Coverage Δ | |
   |---|---|---|
   | 
[airflow/hooks/S3\_hook.py](https://codecov.io/gh/apache/airflow/pull/10209/diff?src=pr&el=tree#diff-YWlyZmxvdy9ob29rcy9TM19ob29rLnB5)
 | `0.00% <0.00%> (-100.00%)` | :arrow_down: |
   | 
[airflow/hooks/pig\_hook.py](https://codecov.io/gh/apache/airflow/pull/10209/diff?src=pr&el=tree#diff-YWlyZmxvdy9ob29rcy9waWdfaG9vay5weQ==)
 | `0.00% <0.00%> (-100.00%)` | :arrow_down: |
   | 
[airflow/hooks/hdfs\_hook.py](https://codecov.io/gh/apache/airflow/pull/10209/diff?src=pr&el=tree#diff-YWlyZmxvdy9ob29rcy9oZGZzX2hvb2sucHk=)
 | `0.00% <0.00%> (-100.00%)` | :arrow_down: |
   | 
[airflow/hooks/http\_hook.py](https://codecov.io/gh/apache/airflow/pull/10209/diff?src=pr&el=tree#diff-YWlyZmxvdy9ob29rcy9odHRwX2hvb2sucHk=)
 | `0.00% <0.00%> (-100.00%)` | :arrow_down: |
   | 
[airflow/hooks/jdbc\_hook.py](https://codecov.io/gh/apache/airflow/pull/10209/diff?src=pr&el=tree#diff-YWlyZmxvdy9ob29rcy9qZGJjX2hvb2sucHk=)
 | `0.00% <0.00%> (-100.00%)` | :arrow_down: |
   | 
[airflow/contrib/\_\_init\_\_.py](https://codecov.io/gh/apache/airflow/pull/10209/diff?src=pr&el=tree#diff-YWlyZmxvdy9jb250cmliL19faW5pdF9fLnB5)
 | `0.00% <0.00%> (-100.00%)` | :arrow_down: |
   | 
[airflow/hooks/druid\_hook.py](https://codecov.io/gh/apache/airflow/pull/10209/diff?src=pr&el=tree#diff-YWlyZmxvdy9ob29rcy9kcnVpZF9ob29rLnB5)
 | `0.00% <0.00%> (-100.00%)` | :arrow_down: |
   | 
[airflow/hooks/hive\_hooks.py](https://codecov.io/gh/apache/airflow/pull/10209/diff?src=pr&el=tree#diff-YWlyZmxvdy9ob29rcy9oaXZlX2hvb2tzLnB5)
 | `0.00% <0.00%> (-100.00%)` | :arrow_down: |
   | 
[airflow/hooks/mssql\_hook.py](https://codecov.io/gh/apache/airflow/pull/10209/diff?src=pr&el=tree#diff-YWlyZmxvdy9ob29rcy9tc3NxbF9ob29rLnB5)
 | `0.00% <0.00%> (-100.00%)` | :arrow_down: |
   | 
[airflow/hooks/mysql\_hook.py](https://codecov.io/gh/apache/airflow/pull/10209/diff?src=pr&el=tree#diff-YWlyZmxvdy9ob29rcy9teXNxbF9ob29rLnB5)
 | `0.00% <0.00%> (-100.00%)` | :arrow_down: |
   | ... and [905 
more](https://codecov.io/gh/apache/airflow/pull/10209/diff?src=pr&el=tree-more) 
| |
   
   --
   
   [Continue to review full report at 
Codecov](https://codecov.io/gh/apache/airflow/pull/10209?src=pr&el=continue).
   > **Legend** - [Click here to learn 
more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data`

[GitHub] [airflow] codecov-commenter edited a comment on pull request #10209: You can now trigger apache/airflow sync via Github Web UI

2020-08-06 Thread GitBox


codecov-commenter edited a comment on pull request #10209:
URL: https://github.com/apache/airflow/pull/10209#issuecomment-670219050


   # [Codecov](https://codecov.io/gh/apache/airflow/pull/10209?src=pr&el=h1) 
Report
   > Merging 
[#10209](https://codecov.io/gh/apache/airflow/pull/10209?src=pr&el=desc) into 
[master](https://codecov.io/gh/apache/airflow/commit/d79e7221de76f01b5cd36c15224b59e8bb451c90&el=desc)
 will **decrease** coverage by `54.30%`.
   > The diff coverage is `n/a`.
   
   [![Impacted file tree 
graph](https://codecov.io/gh/apache/airflow/pull/10209/graphs/tree.svg?width=650&height=150&src=pr&token=WdLKlKHOAU)](https://codecov.io/gh/apache/airflow/pull/10209?src=pr&el=tree)
   
   ```diff
   @@             Coverage Diff             @@
   ##           master   #10209       +/-   ##
   ============================================
   - Coverage   89.41%   35.11%   -54.31%     
   ============================================
     Files        1037     1037              
     Lines       50011    50011              
   ============================================
   - Hits        44717    17560    -27157     
   - Misses       5294    32451    +27157     
   ```
   
   | Flag | Coverage Δ | |
   |---|---|---|
   | #kubernetes-tests-3.6-9.6 | `?` | |
   | #kubernetes-tests-image-3.6-v1.16.9 | `?` | |
   | #kubernetes-tests-image-3.6-v1.17.5 | `?` | |
   | #kubernetes-tests-image-3.6-v1.18.6 | `?` | |
   | #kubernetes-tests-image-3.7-v1.16.9 | `?` | |
   | #kubernetes-tests-image-3.7-v1.17.5 | `?` | |
   | #kubernetes-tests-image-3.7-v1.18.6 | `?` | |
   | #mysql-tests-Core-3.7-5.7 | `?` | |
   | #mysql-tests-Core-3.8-5.7 | `?` | |
   | #mysql-tests-Integration-3.7-5.7 | `34.75% <ø> (ø)` | |
   | #mysql-tests-Integration-3.8-5.7 | `?` | |
   | #postgres-tests-Core-3.6-10 | `?` | |
   | #postgres-tests-Core-3.6-9.6 | `?` | |
   | #postgres-tests-Core-3.7-10 | `?` | |
   | #postgres-tests-Core-3.7-9.6 | `?` | |
   | #postgres-tests-Integration-3.6-10 | `?` | |
   | #postgres-tests-Integration-3.6-9.6 | `?` | |
   | #postgres-tests-Integration-3.7-10 | `?` | |
   | #postgres-tests-Integration-3.7-9.6 | `34.73% <ø> (ø)` | |
   | #sqlite-tests-Core-3.6 | `?` | |
   | #sqlite-tests-Core-3.8 | `?` | |
   | #sqlite-tests-Integration-3.6 | `34.18% <ø> (ø)` | |
   | #sqlite-tests-Integration-3.8 | `34.44% <ø> (ø)` | |
   
   Flags with carried forward coverage won't be shown. [Click 
here](https://docs.codecov.io/docs/carryforward-flags#carryforward-flags-in-the-pull-request-comment)
 to find out more.
   
   | [Impacted 
Files](https://codecov.io/gh/apache/airflow/pull/10209?src=pr&el=tree) | 
Coverage Δ | |
   |---|---|---|
   | 
[airflow/hooks/S3\_hook.py](https://codecov.io/gh/apache/airflow/pull/10209/diff?src=pr&el=tree#diff-YWlyZmxvdy9ob29rcy9TM19ob29rLnB5)
 | `0.00% <0.00%> (-100.00%)` | :arrow_down: |
   | 
[airflow/hooks/pig\_hook.py](https://codecov.io/gh/apache/airflow/pull/10209/diff?src=pr&el=tree#diff-YWlyZmxvdy9ob29rcy9waWdfaG9vay5weQ==)
 | `0.00% <0.00%> (-100.00%)` | :arrow_down: |
   | 
[airflow/hooks/hdfs\_hook.py](https://codecov.io/gh/apache/airflow/pull/10209/diff?src=pr&el=tree#diff-YWlyZmxvdy9ob29rcy9oZGZzX2hvb2sucHk=)
 | `0.00% <0.00%> (-100.00%)` | :arrow_down: |
   | 
[airflow/hooks/http\_hook.py](https://codecov.io/gh/apache/airflow/pull/10209/diff?src=pr&el=tree#diff-YWlyZmxvdy9ob29rcy9odHRwX2hvb2sucHk=)
 | `0.00% <0.00%> (-100.00%)` | :arrow_down: |
   | 
[airflow/hooks/jdbc\_hook.py](https://codecov.io/gh/apache/airflow/pull/10209/diff?src=pr&el=tree#diff-YWlyZmxvdy9ob29rcy9qZGJjX2hvb2sucHk=)
 | `0.00% <0.00%> (-100.00%)` | :arrow_down: |
   | 
[airflow/contrib/\_\_init\_\_.py](https://codecov.io/gh/apache/airflow/pull/10209/diff?src=pr&el=tree#diff-YWlyZmxvdy9jb250cmliL19faW5pdF9fLnB5)
 | `0.00% <0.00%> (-100.00%)` | :arrow_down: |
   | 
[airflow/hooks/druid\_hook.py](https://codecov.io/gh/apache/airflow/pull/10209/diff?src=pr&el=tree#diff-YWlyZmxvdy9ob29rcy9kcnVpZF9ob29rLnB5)
 | `0.00% <0.00%> (-100.00%)` | :arrow_down: |
   | 
[airflow/hooks/hive\_hooks.py](https://codecov.io/gh/apache/airflow/pull/10209/diff?src=pr&el=tree#diff-YWlyZmxvdy9ob29rcy9oaXZlX2hvb2tzLnB5)
 | `0.00% <0.00%> (-100.00%)` | :arrow_down: |
   | 
[airflow/hooks/mssql\_hook.py](https://codecov.io/gh/apache/airflow/pull/10209/diff?src=pr&el=tree#diff-YWlyZmxvdy9ob29rcy9tc3NxbF9ob29rLnB5)
 | `0.00% <0.00%> (-100.00%)` | :arrow_down: |
   | 
[airflow/hooks/mysql\_hook.py](https://codecov.io/gh/apache/airflow/pull/10209/diff?src=pr&el=tree#diff-YWlyZmxvdy9ob29rcy9teXNxbF9ob29rLnB5)
 | `0.00% <0.00%> (-100.00%)` | :arrow_down: |
   | ... and [905 
more](https://codecov.io/gh/apache/airflow/pull/10209/diff?src=pr&el=tree-more) 
| |
   
   --
   
   [Continue to review full report at 
Codecov](https://codecov.io/gh/apache/airflow/pull/10209?src=pr&el=continue).
   > **Legend** - [Click here to learn 
more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data`

[GitHub] [airflow] codecov-commenter commented on pull request #10209: You can now trigger apache/airflow sync via Github Web UI

2020-08-06 Thread GitBox


codecov-commenter commented on pull request #10209:
URL: https://github.com/apache/airflow/pull/10209#issuecomment-670219050


   # [Codecov](https://codecov.io/gh/apache/airflow/pull/10209?src=pr&el=h1) 
Report
   > Merging 
[#10209](https://codecov.io/gh/apache/airflow/pull/10209?src=pr&el=desc) into 
[master](https://codecov.io/gh/apache/airflow/commit/d79e7221de76f01b5cd36c15224b59e8bb451c90&el=desc)
 will **decrease** coverage by `54.30%`.
   > The diff coverage is `n/a`.
   
   [![Impacted file tree 
graph](https://codecov.io/gh/apache/airflow/pull/10209/graphs/tree.svg?width=650&height=150&src=pr&token=WdLKlKHOAU)](https://codecov.io/gh/apache/airflow/pull/10209?src=pr&el=tree)
   
   ```diff
   @@             Coverage Diff             @@
   ##           master   #10209       +/-   ##
   ============================================
   - Coverage   89.41%   35.11%   -54.31%     
   ============================================
     Files        1037     1037              
     Lines       50011    50011              
   ============================================
   - Hits        44717    17559    -27158     
   - Misses       5294    32452    +27158     
   ```
   
   | Flag | Coverage Δ | |
   |---|---|---|
   | #kubernetes-tests-3.6-9.6 | `?` | |
   | #kubernetes-tests-image-3.6-v1.16.9 | `?` | |
   | #kubernetes-tests-image-3.6-v1.17.5 | `?` | |
   | #kubernetes-tests-image-3.6-v1.18.6 | `?` | |
   | #kubernetes-tests-image-3.7-v1.16.9 | `?` | |
   | #kubernetes-tests-image-3.7-v1.17.5 | `?` | |
   | #kubernetes-tests-image-3.7-v1.18.6 | `?` | |
   | #mysql-tests-Core-3.7-5.7 | `?` | |
   | #mysql-tests-Core-3.8-5.7 | `?` | |
   | #mysql-tests-Integration-3.7-5.7 | `34.75% <ø> (ø)` | |
   | #mysql-tests-Integration-3.8-5.7 | `?` | |
   | #postgres-tests-Core-3.6-10 | `?` | |
   | #postgres-tests-Core-3.6-9.6 | `?` | |
   | #postgres-tests-Core-3.7-10 | `?` | |
   | #postgres-tests-Core-3.7-9.6 | `?` | |
   | #postgres-tests-Integration-3.6-10 | `?` | |
   | #postgres-tests-Integration-3.6-9.6 | `?` | |
   | #postgres-tests-Integration-3.7-10 | `?` | |
   | #postgres-tests-Integration-3.7-9.6 | `?` | |
   | #sqlite-tests-Core-3.6 | `?` | |
   | #sqlite-tests-Core-3.8 | `?` | |
   | #sqlite-tests-Integration-3.6 | `34.18% <ø> (ø)` | |
   | #sqlite-tests-Integration-3.8 | `34.44% <ø> (ø)` | |
   
   Flags with carried forward coverage won't be shown. [Click 
here](https://docs.codecov.io/docs/carryforward-flags#carryforward-flags-in-the-pull-request-comment)
 to find out more.
   
   | [Impacted 
Files](https://codecov.io/gh/apache/airflow/pull/10209?src=pr&el=tree) | 
Coverage Δ | |
   |---|---|---|
   | 
[airflow/hooks/S3\_hook.py](https://codecov.io/gh/apache/airflow/pull/10209/diff?src=pr&el=tree#diff-YWlyZmxvdy9ob29rcy9TM19ob29rLnB5)
 | `0.00% <0.00%> (-100.00%)` | :arrow_down: |
   | 
[airflow/hooks/pig\_hook.py](https://codecov.io/gh/apache/airflow/pull/10209/diff?src=pr&el=tree#diff-YWlyZmxvdy9ob29rcy9waWdfaG9vay5weQ==)
 | `0.00% <0.00%> (-100.00%)` | :arrow_down: |
   | 
[airflow/hooks/hdfs\_hook.py](https://codecov.io/gh/apache/airflow/pull/10209/diff?src=pr&el=tree#diff-YWlyZmxvdy9ob29rcy9oZGZzX2hvb2sucHk=)
 | `0.00% <0.00%> (-100.00%)` | :arrow_down: |
   | 
[airflow/hooks/http\_hook.py](https://codecov.io/gh/apache/airflow/pull/10209/diff?src=pr&el=tree#diff-YWlyZmxvdy9ob29rcy9odHRwX2hvb2sucHk=)
 | `0.00% <0.00%> (-100.00%)` | :arrow_down: |
   | 
[airflow/hooks/jdbc\_hook.py](https://codecov.io/gh/apache/airflow/pull/10209/diff?src=pr&el=tree#diff-YWlyZmxvdy9ob29rcy9qZGJjX2hvb2sucHk=)
 | `0.00% <0.00%> (-100.00%)` | :arrow_down: |
   | 
[airflow/contrib/\_\_init\_\_.py](https://codecov.io/gh/apache/airflow/pull/10209/diff?src=pr&el=tree#diff-YWlyZmxvdy9jb250cmliL19faW5pdF9fLnB5)
 | `0.00% <0.00%> (-100.00%)` | :arrow_down: |
   | 
[airflow/hooks/druid\_hook.py](https://codecov.io/gh/apache/airflow/pull/10209/diff?src=pr&el=tree#diff-YWlyZmxvdy9ob29rcy9kcnVpZF9ob29rLnB5)
 | `0.00% <0.00%> (-100.00%)` | :arrow_down: |
   | 
[airflow/hooks/hive\_hooks.py](https://codecov.io/gh/apache/airflow/pull/10209/diff?src=pr&el=tree#diff-YWlyZmxvdy9ob29rcy9oaXZlX2hvb2tzLnB5)
 | `0.00% <0.00%> (-100.00%)` | :arrow_down: |
   | 
[airflow/hooks/mssql\_hook.py](https://codecov.io/gh/apache/airflow/pull/10209/diff?src=pr&el=tree#diff-YWlyZmxvdy9ob29rcy9tc3NxbF9ob29rLnB5)
 | `0.00% <0.00%> (-100.00%)` | :arrow_down: |
   | 
[airflow/hooks/mysql\_hook.py](https://codecov.io/gh/apache/airflow/pull/10209/diff?src=pr&el=tree#diff-YWlyZmxvdy9ob29rcy9teXNxbF9ob29rLnB5)
 | `0.00% <0.00%> (-100.00%)` | :arrow_down: |
   | ... and [906 
more](https://codecov.io/gh/apache/airflow/pull/10209/diff?src=pr&el=tree-more) 
| |
   
   --
   
   [Continue to review full report at 
Codecov](https://codecov.io/gh/apache/airflow/pull/10209?src=pr&el=continue).
   > **Legend** - [Click here to learn 
more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data`
   

[airflow] branch master updated: Add correct signature to all operators and sensors (#10205)

2020-08-06 Thread potiuk
This is an automated email from the ASF dual-hosted git repository.

potiuk pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/airflow.git


The following commit(s) were added to refs/heads/master by this push:
 new cdec301  Add correct signature to all operators and sensors (#10205)
cdec301 is described below

commit cdec3012542b45d23a05f62d69110944ba542e2a
Author: Ephraim Anierobi 
AuthorDate: Thu Aug 6 23:17:01 2020 +0100

Add correct signature to all operators and sensors (#10205)

* add correct signature to operators in providers package

* add keyword only to operators and sensors outside provider package

* remove unused type ignore
---
 airflow/operators/email.py|  4 ++--
 airflow/operators/sql.py  |  8 
 airflow/providers/celery/sensors/celery_queue.py  |  2 +-
 airflow/providers/cncf/kubernetes/operators/kubernetes_pod.py |  1 +
 .../providers/cncf/kubernetes/operators/spark_kubernetes.py   |  2 +-
 airflow/providers/cncf/kubernetes/sensors/spark_kubernetes.py |  2 +-
 airflow/providers/databricks/operators/databricks.py  |  4 ++--
 airflow/providers/datadog/sensors/datadog.py  |  2 +-
 airflow/providers/dingding/operators/dingding.py  |  2 +-
 airflow/providers/discord/operators/discord_webhook.py|  2 +-
 airflow/providers/docker/operators/docker.py  |  2 +-
 airflow/providers/docker/operators/docker_swarm.py|  4 ++--
 airflow/providers/exasol/operators/exasol.py  |  2 +-
 airflow/providers/ftp/sensors/ftp.py  |  2 +-
 airflow/providers/google/ads/transfers/ads_to_gcs.py  |  2 +-
 airflow/providers/google/cloud/sensors/bigquery.py|  2 +-
 airflow/providers/google/cloud/sensors/bigquery_dts.py|  1 +
 airflow/providers/google/cloud/sensors/bigtable.py|  1 +
 .../google/cloud/sensors/cloud_storage_transfer_service.py|  2 +-
 airflow/providers/google/cloud/sensors/gcs.py |  2 +-
 airflow/providers/google/cloud/sensors/pubsub.py  |  2 +-
 airflow/providers/google/cloud/transfers/adls_to_gcs.py   |  2 +-
 .../providers/google/cloud/transfers/bigquery_to_bigquery.py  |  2 +-
 airflow/providers/google/cloud/transfers/bigquery_to_gcs.py   |  2 +-
 airflow/providers/google/cloud/transfers/bigquery_to_mysql.py |  2 +-
 airflow/providers/google/cloud/transfers/cassandra_to_gcs.py  |  2 +-
 .../providers/google/cloud/transfers/facebook_ads_to_gcs.py   |  2 +-
 airflow/providers/google/cloud/transfers/gcs_to_bigquery.py   |  2 +-
 airflow/providers/google/cloud/transfers/gcs_to_gcs.py|  2 +-
 airflow/providers/google/cloud/transfers/gcs_to_local.py  |  2 +-
 airflow/providers/google/cloud/transfers/gcs_to_sftp.py   |  2 +-
 airflow/providers/google/cloud/transfers/local_to_gcs.py  |  2 +-
 airflow/providers/google/cloud/transfers/mssql_to_gcs.py  |  2 +-
 airflow/providers/google/cloud/transfers/mysql_to_gcs.py  |  2 +-
 airflow/providers/google/cloud/transfers/postgres_to_gcs.py   |  2 +-
 airflow/providers/google/cloud/transfers/presto_to_gcs.py |  2 +-
 airflow/providers/google/cloud/transfers/s3_to_gcs.py |  2 +-
 airflow/providers/google/cloud/transfers/sftp_to_gcs.py   |  2 +-
 airflow/providers/google/cloud/transfers/sheets_to_gcs.py |  7 ---
 airflow/providers/google/cloud/transfers/sql_to_gcs.py|  2 +-
 .../google/marketing_platform/sensors/campaign_manager.py |  2 +-
 .../google/marketing_platform/sensors/display_video.py|  5 ++---
 .../providers/google/marketing_platform/sensors/search_ads.py |  2 +-
 airflow/providers/grpc/operators/grpc.py  |  2 +-
 airflow/providers/http/operators/http.py  |  2 +-
 airflow/providers/http/sensors/http.py|  2 +-
 airflow/providers/imap/sensors/imap_attachment.py |  2 +-
 airflow/providers/jdbc/operators/jdbc.py  |  2 +-
 airflow/providers/jenkins/operators/jenkins_job_trigger.py|  3 +--
 airflow/providers/jira/operators/jira.py  |  2 +-
 airflow/providers/jira/sensors/jira.py|  4 ++--
 airflow/providers/microsoft/azure/operators/adls_list.py  |  2 +-
 airflow/providers/microsoft/azure/operators/adx.py|  2 +-
 airflow/providers/microsoft/azure/operators/azure_batch.py|  2 +-
 .../microsoft/azure/operators/azure_container_instances.py|  2 +-
 airflow/providers/microsoft/azure/operators/azure_cosmos.py   |  2 +-
 .../providers/microsoft/azure/operators/wasb_delete_blob.py   |  2 +-
 airflow/providers/microsoft/azure/sensors/azure_cosmos.py |  2 +-
 airflow/providers/microsoft/azure/sensors/wasb.py |  4 ++--
 airflow/providers/microsoft/azure/transfers/file_to_wasb.py   |  2 +-
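
A hedged illustration (not taken from the commit above) of the keyword-only
signature pattern it applies: a bare `*` makes operator arguments keyword-only,
so positional calls fail loudly instead of silently binding to the wrong field.
The operator name and parameter below are made up.

```python
# Illustrative sketch only.
from airflow.models.baseoperator import BaseOperator
from airflow.utils.decorators import apply_defaults


class MyExampleOperator(BaseOperator):  # hypothetical operator
    @apply_defaults
    def __init__(self, *, my_param: str, **kwargs) -> None:
        super().__init__(**kwargs)
        self.my_param = my_param
```

With this pattern, `MyExampleOperator('value', task_id='t1')` raises a
`TypeError`, while `MyExampleOperator(my_param='value', task_id='t1')` keeps
working.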
 

[GitHub] [airflow] potiuk merged pull request #10205: Add correct signature to all operators and sensors

2020-08-06 Thread GitBox


potiuk merged pull request #10205:
URL: https://github.com/apache/airflow/pull/10205


   



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] potiuk closed issue #9942: Incorrect signature of operators

2020-08-06 Thread GitBox


potiuk closed issue #9942:
URL: https://github.com/apache/airflow/issues/9942


   



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] potiuk opened a new pull request #10209: You can now trigger apache/airflow sync via Github Web UI

2020-08-06 Thread GitBox


potiuk opened a new pull request #10209:
URL: https://github.com/apache/airflow/pull/10209


   We are using a newly added GitHub feature to add a manually triggered
   workflow that enables force-syncing your fork with apache/airflow.
   
   ---
   **^ Add meaningful description above**
   
   Read the **[Pull Request 
Guidelines](https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst#pull-request-guidelines)**
 for more information.
   In case of fundamental code change, Airflow Improvement Proposal 
([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals))
 is needed.
   In case of a new dependency, check compliance with the [ASF 3rd Party 
License Policy](https://www.apache.org/legal/resolved.html#category-x).
   In case of backwards incompatible changes please leave a note in 
[UPDATING.md](https://github.com/apache/airflow/blob/master/UPDATING.md).
   



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] kaxil commented on issue #10195: Grafana Dashboard

2020-08-06 Thread GitBox


kaxil commented on issue #10195:
URL: https://github.com/apache/airflow/issues/10195#issuecomment-670210718


   https://grafana.com/grafana/dashboards/9672



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] kaxil closed issue #10195: Grafana Dashboard

2020-08-06 Thread GitBox


kaxil closed issue #10195:
URL: https://github.com/apache/airflow/issues/10195


   



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] dependabot[bot] closed pull request #9880: Bump lodash from 4.17.15 to 4.17.19 in /airflow/www

2020-08-06 Thread GitBox


dependabot[bot] closed pull request #9880:
URL: https://github.com/apache/airflow/pull/9880


   



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] dependabot[bot] closed pull request #10086: Bump elliptic from 6.5.2 to 6.5.3 in /airflow/www

2020-08-06 Thread GitBox


dependabot[bot] closed pull request #10086:
URL: https://github.com/apache/airflow/pull/10086


   



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] dependabot[bot] commented on pull request #9880: Bump lodash from 4.17.15 to 4.17.19 in /airflow/www

2020-08-06 Thread GitBox


dependabot[bot] commented on pull request #9880:
URL: https://github.com/apache/airflow/pull/9880#issuecomment-670207616


   Looks like lodash is up-to-date now, so this is no longer needed.



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] dependabot[bot] commented on pull request #10086: Bump elliptic from 6.5.2 to 6.5.3 in /airflow/www

2020-08-06 Thread GitBox


dependabot[bot] commented on pull request #10086:
URL: https://github.com/apache/airflow/pull/10086#issuecomment-670207536


   Looks like elliptic is up-to-date now, so this is no longer needed.



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[airflow] branch master updated (d79e722 -> c920b1b)

2020-08-06 Thread kaxilnaik
This is an automated email from the ASF dual-hosted git repository.

kaxilnaik pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/airflow.git.


from d79e722  Type annotation for Docker operator (#9733)
 add c920b1b  Update JS packages to latest versions (#9811) (#9921)

No new revisions were added by this update.

Summary of changes:
 airflow/www/package.json  |   50 +-
 airflow/www/webpack.config.js |  109 +-
 airflow/www/yarn.lock | 2753 +++--
 3 files changed, 1669 insertions(+), 1243 deletions(-)



[GitHub] [airflow] kaxil merged pull request #9921: Update JS packages to latest versions (#9811)

2020-08-06 Thread GitBox


kaxil merged pull request #9921:
URL: https://github.com/apache/airflow/pull/9921


   



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] kaxil closed issue #9811: Update JS packages to latest versions

2020-08-06 Thread GitBox


kaxil closed issue #9811:
URL: https://github.com/apache/airflow/issues/9811


   



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] potiuk opened a new pull request #10208: Improves stability of reported coverage and makes it nicer

2020-08-06 Thread GitBox


potiuk opened a new pull request #10208:
URL: https://github.com/apache/airflow/pull/10208


   With this change we only upload the coverage report when all tests
   were successful and actually executed. This means the coverage report
   will not be uploaded when the job gets canceled.
   
   Currently a lot of the gathered coverage reports contain far less
   coverage, because when static checks or docs fail, some test jobs
   could already have submitted their coverage - resulting in partial
   coverage reports.
   
   With this change we also remove the comment from the coverage report
   and replace it with (for now) an informational status message published
   to GitHub. If we see that it works, we can change it to a
   PR-failing status if coverage drops for a given PR.
   
   This way we might get our coverage monotonically increasing :).
   
   ---
   **^ Add meaningful description above**
   
   Read the **[Pull Request 
Guidelines](https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst#pull-request-guidelines)**
 for more information.
   In case of fundamental code change, Airflow Improvement Proposal 
([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals))
 is needed.
   In case of a new dependency, check compliance with the [ASF 3rd Party 
License Policy](https://www.apache.org/legal/resolved.html#category-x).
   In case of backwards incompatible changes please leave a note in 
[UPDATING.md](https://github.com/apache/airflow/blob/master/UPDATING.md).
   



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] kaxil commented on pull request #9921: Update JS packages to latest versions (#9811)

2020-08-06 Thread GitBox


kaxil commented on pull request #9921:
URL: https://github.com/apache/airflow/pull/9921#issuecomment-670204880


   @retornam I still see the following errors in Console on my deployment: 
   
![image](https://user-images.githubusercontent.com/8811558/89585269-78d03a00-d835-11ea-8793-402051c050ac.png)
   



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] potiuk opened a new pull request #10207: Pylint checks should be way faster now

2020-08-06 Thread GitBox


potiuk opened a new pull request #10207:
URL: https://github.com/apache/airflow/pull/10207


   Instead of running separate pylint checks for tests and main sources,
   we now run a single check. This is possible thanks to a nice hack -
   we have a pylint plugin that injects the right "# pylint: disable="
   comment into all test files while astroid reads the file content
   (just before tokenization).
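   
   A minimal sketch of how such an injection can work (illustrative only, not
   necessarily the exact plugin in this PR; it assumes astroid exposes
   `MANAGER.register_transform` and reads the rewritten source from
   `Module.file_bytes`):
   
   ```python
   # Illustrative pylint plugin sketch.
   from astroid import MANAGER, scoped_nodes

   DISABLED_CHECKS_FOR_TESTS = "missing-docstring,no-self-use,protected-access"


   def register(linter):
       # Entry point required for every pylint plugin; the transform below is
       # registered at import time, so nothing has to happen here.
       pass


   def transform(mod):
       # Only rewrite modules that look like test files.
       if not (mod.name.startswith("test_") or mod.name.startswith("tests.")):
           return
       code = mod.stream().read().decode("utf-8")
       # Inject the disable comment so pylint sees it when it tokenizes the file.
       mod.file_bytes = (
           "# pylint: disable={}\n".format(DISABLED_CHECKS_FOR_TESTS) + code
       ).encode("utf-8")


   MANAGER.register_transform(scoped_nodes.Module, transform)
   ```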
   
   ---
   **^ Add meaningful description above**
   
   Read the **[Pull Request 
Guidelines](https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst#pull-request-guidelines)**
 for more information.
   In case of fundamental code change, Airflow Improvement Proposal 
([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals))
 is needed.
   In case of a new dependency, check compliance with the [ASF 3rd Party 
License Policy](https://www.apache.org/legal/resolved.html#category-x).
   In case of backwards incompatible changes please leave a note in 
[UPDATING.md](https://github.com/apache/airflow/blob/master/UPDATING.md).
   



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[jira] [Commented] (AIRFLOW-6786) Adding KafkaConsumerHook, KafkaProducerHook, and KafkaSensor

2020-08-06 Thread ASF GitHub Bot (Jira)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-6786?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17172674#comment-17172674
 ] 

ASF GitHub Bot commented on AIRFLOW-6786:
-

turbaszek commented on pull request #7407:
URL: https://github.com/apache/airflow/pull/7407#issuecomment-670198210


   What do you think about having a single `KafkaHook` that implements both 
consumer and producer methods? It seems to be a common approach in Airflow to 
have a single hook per external service.



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


> Adding KafkaConsumerHook, KafkaProducerHook, and KafkaSensor
> 
>
> Key: AIRFLOW-6786
> URL: https://issues.apache.org/jira/browse/AIRFLOW-6786
> Project: Apache Airflow
>  Issue Type: New Feature
>  Components: contrib, hooks
>Affects Versions: 1.10.9
>Reporter: Daniel Ferguson
>Assignee: Daniel Ferguson
>Priority: Minor
>
> Add the KafkaProducerHook.
>  Add the KafkaConsumerHook.
>  Add the KafkaSensor which listens to messages with a specific topic.
>  Related Issue:
>  #1311 (Pre-dates Jira Migration)
> Reminder to contributors:
> You must add an Apache License header to all new files
>  Please squash your commits when possible and follow the 7 rules of good Git 
> commits
>  I am new to the community, I am not sure the files are at the right place or 
> missing anything.
> The sensor could be used as the first node of a dag where the second node can 
> be a TriggerDagRunOperator. The messages are polled in a batch and the dag 
> runs are dynamically generated.
> Thanks!
> Note, as per denied PR [#1415|https://github.com/apache/airflow/pull/1415], 
> it is important to mention these integrations are not suitable for 
> low-latency/high-throughput/streaming. For reference, [#1415 
> (comment)|https://github.com/apache/airflow/pull/1415#issuecomment-484429806].
> Co-authored-by: Dan Ferguson 
> [dferguson...@gmail.com|mailto:dferguson...@gmail.com]
>  Co-authored-by: YuanfΞi Zhu



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[GitHub] [airflow] turbaszek commented on pull request #7407: [AIRFLOW-6786] Add KafkaConsumerHook, KafkaProduerHook and KafkaSensor

2020-08-06 Thread GitBox


turbaszek commented on pull request #7407:
URL: https://github.com/apache/airflow/pull/7407#issuecomment-670198210


   What do you think about having a single `KafkaHook` that implements both 
consumer and producer methods? It seems to be a common approach in Airflow to 
have a single hook per external service.
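   
   For illustration, a rough sketch of what such a combined hook could look
   like (not part of this PR; connection handling is simplified and the class
   name is just the one suggested above):
   
   ```python
   # Rough sketch only - a single hook handing out both kafka-python clients.
   from kafka import KafkaConsumer, KafkaProducer

   from airflow.hooks.base_hook import BaseHook


   class KafkaHook(BaseHook):
       """One hook for the Kafka service, exposing consumer and producer access."""

       def __init__(self, kafka_conn_id='kafka_default'):
           super().__init__(None)
           self.conn_id = kafka_conn_id

       def _bootstrap_servers(self):
           conn = self.get_connection(self.conn_id)
           return '{}:{}'.format(conn.host, conn.port)

       def get_consumer(self, topic, **options):
           # Extra keyword arguments go straight to kafka-python.
           return KafkaConsumer(topic, bootstrap_servers=self._bootstrap_servers(), **options)

       def get_producer(self, **options):
           return KafkaProducer(bootstrap_servers=self._bootstrap_servers(), **options)
   ```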



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[jira] [Commented] (AIRFLOW-6786) Adding KafkaConsumerHook, KafkaProducerHook, and KafkaSensor

2020-08-06 Thread ASF GitHub Bot (Jira)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-6786?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17172673#comment-17172673
 ] 

ASF GitHub Bot commented on AIRFLOW-6786:
-

turbaszek commented on a change in pull request #7407:
URL: https://github.com/apache/airflow/pull/7407#discussion_r466691730



##
File path: airflow/providers/apache/kafka/hooks/kafka_consumer_hook.py
##
@@ -0,0 +1,86 @@
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+
+from kafka import KafkaConsumer
+
+from airflow.hooks.base_hook import BaseHook
+
+
+class KafkaConsumerHook(BaseHook):
+    """
+    KafkaConsumerHook Class.
+    """
+    DEFAULT_HOST = 'kafka1'
+    DEFAULT_PORT = 9092
+
+    def __init__(self, topic, host=DEFAULT_HOST, port=DEFAULT_PORT, kafka_conn_id='kafka_default'):
+        super(KafkaConsumerHook, self).__init__(None)
+        self.conn_id = kafka_conn_id
+        self._conn = None
+        self.server = None
+        self.consumer = None
+        self.extra_dejson = {}
+        self.topic = topic
+        self.host = host
+        self.port = port
+
+    def get_conn(self) -> KafkaConsumer:
+        """
+        A Kafka Consumer object.
+
+        :returns: A Kafka Consumer object.
+        """
+        if not self._conn:
+            conn = self.get_connection(self.conn_id)
+            service_options = conn.extra_dejson
+            host = conn.host or self.DEFAULT_HOST
+            port = conn.port or self.DEFAULT_PORT
+
+            self.server = f"""{host}:{port}"""
+            self.consumer = KafkaConsumer(
+                self.topic,
+                bootstrap_servers=self.server,
+                **service_options
+            )
+        return self.consumer
+
+    def get_messages(self, timeout_ms=5000) -> dict:
+        """
+        Get all the messages that haven't been consumed; it doesn't
+        block by default, then commits the offset.
+
+        :param timeout_ms: Timeout in milliseconds
+        :returns: A list of messages
+        """
+        consumer = self.get_conn()
+        try:
+            messages = consumer.poll(timeout_ms)
+            # consumer.commit()
+        finally:
+            consumer.close()
+        return messages
+
+    def __repr__(self):
+        """
+        A pretty version of the connection string.
+        """
+        connected = self.consumer is not None
+        return '<KafkaConsumerHook connected? %s, server: %s, topic: %s>' % \
+            (connected, self.server, self.topic)

Review comment:
   Not sure if this will be ever used





This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


> Adding KafkaConsumerHook, KafkaProducerHook, and KafkaSensor
> 
>
> Key: AIRFLOW-6786
> URL: https://issues.apache.org/jira/browse/AIRFLOW-6786
> Project: Apache Airflow
>  Issue Type: New Feature
>  Components: contrib, hooks
>Affects Versions: 1.10.9
>Reporter: Daniel Ferguson
>Assignee: Daniel Ferguson
>Priority: Minor
>
> Add the KafkaProducerHook.
>  Add the KafkaConsumerHook.
>  Add the KafkaSensor which listens to messages with a specific topic.
>  Related Issue:
>  #1311 (Pre-dates Jira Migration)
> Reminder to contributors:
> You must add an Apache License header to all new files
>  Please squash your commits when possible and follow the 7 rules of good Git 
> commits
>  I am new to the community, I am not sure the files are at the right place or 
> missing anything.
> The sensor could be used as the first node of a dag where the second node can 
> be a TriggerDagRunOperator. The messages are polled in a batch and the dag 
> runs are dynamically generated.
> Thanks!
> Note, as per denied PR [#1415|https://github.com/apache/airflow/pull/1415], 
> it is important to mention these integrations are not suitable for 
> low-latency/high-throughput/st

[jira] [Commented] (AIRFLOW-6786) Adding KafkaConsumerHook, KafkaProducerHook, and KafkaSensor

2020-08-06 Thread ASF GitHub Bot (Jira)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-6786?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17172672#comment-17172672
 ] 

ASF GitHub Bot commented on AIRFLOW-6786:
-

turbaszek commented on a change in pull request #7407:
URL: https://github.com/apache/airflow/pull/7407#discussion_r466691419



##
File path: airflow/providers/apache/kafka/hooks/kafka_consumer_hook.py
##
@@ -0,0 +1,86 @@
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+
+from kafka import KafkaConsumer
+
+from airflow.hooks.base_hook import BaseHook
+
+
+class KafkaConsumerHook(BaseHook):
+    """
+    KafkaConsumerHook Class.
+    """
+    DEFAULT_HOST = 'kafka1'
+    DEFAULT_PORT = 9092
+
+    def __init__(self, topic, host=DEFAULT_HOST, port=DEFAULT_PORT, kafka_conn_id='kafka_default'):

Review comment:
   Would you mind adding type hints? There's already an ongoing effort to 
improve mypy coverage, see #9708.





This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


> Adding KafkaConsumerHook, KafkaProducerHook, and KafkaSensor
> 
>
> Key: AIRFLOW-6786
> URL: https://issues.apache.org/jira/browse/AIRFLOW-6786
> Project: Apache Airflow
>  Issue Type: New Feature
>  Components: contrib, hooks
>Affects Versions: 1.10.9
>Reporter: Daniel Ferguson
>Assignee: Daniel Ferguson
>Priority: Minor
>
> Add the KafkaProducerHook.
>  Add the KafkaConsumerHook.
>  Add the KafkaSensor which listens to messages with a specific topic.
>  Related Issue:
>  #1311 (Pre-dates Jira Migration)
> Reminder to contributors:
> You must add an Apache License header to all new files
>  Please squash your commits when possible and follow the 7 rules of good Git 
> commits
>  I am new to the community, I am not sure the files are at the right place or 
> missing anything.
> The sensor could be used as the first node of a dag where the second node can 
> be a TriggerDagRunOperator. The messages are polled in a batch and the dag 
> runs are dynamically generated.
> Thanks!
> Note, as per denied PR [#1415|https://github.com/apache/airflow/pull/1415], 
> it is important to mention these integrations are not suitable for 
> low-latency/high-throughput/streaming. For reference, [#1415 
> (comment)|https://github.com/apache/airflow/pull/1415#issuecomment-484429806].
> Co-authored-by: Dan Ferguson 
> [dferguson...@gmail.com|mailto:dferguson...@gmail.com]
>  Co-authored-by: YuanfΞi Zhu



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[GitHub] [airflow] turbaszek commented on a change in pull request #7407: [AIRFLOW-6786] Add KafkaConsumerHook, KafkaProduerHook and KafkaSensor

2020-08-06 Thread GitBox


turbaszek commented on a change in pull request #7407:
URL: https://github.com/apache/airflow/pull/7407#discussion_r466691730



##
File path: airflow/providers/apache/kafka/hooks/kafka_consumer_hook.py
##
@@ -0,0 +1,86 @@
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+
+from kafka import KafkaConsumer
+
+from airflow.hooks.base_hook import BaseHook
+
+
+class KafkaConsumerHook(BaseHook):
+    """
+    KafkaConsumerHook Class.
+    """
+    DEFAULT_HOST = 'kafka1'
+    DEFAULT_PORT = 9092
+
+    def __init__(self, topic, host=DEFAULT_HOST, port=DEFAULT_PORT, kafka_conn_id='kafka_default'):
+        super(KafkaConsumerHook, self).__init__(None)
+        self.conn_id = kafka_conn_id
+        self._conn = None
+        self.server = None
+        self.consumer = None
+        self.extra_dejson = {}
+        self.topic = topic
+        self.host = host
+        self.port = port
+
+    def get_conn(self) -> KafkaConsumer:
+        """
+        A Kafka Consumer object.
+
+        :returns: A Kafka Consumer object.
+        """
+        if not self._conn:
+            conn = self.get_connection(self.conn_id)
+            service_options = conn.extra_dejson
+            host = conn.host or self.DEFAULT_HOST
+            port = conn.port or self.DEFAULT_PORT
+
+            self.server = f"""{host}:{port}"""
+            self.consumer = KafkaConsumer(
+                self.topic,
+                bootstrap_servers=self.server,
+                **service_options
+            )
+        return self.consumer
+
+    def get_messages(self, timeout_ms=5000) -> dict:
+        """
+        Get all the messages that haven't been consumed; it doesn't
+        block by default, then commits the offset.
+
+        :param timeout_ms: Timeout in milliseconds
+        :returns: A list of messages
+        """
+        consumer = self.get_conn()
+        try:
+            messages = consumer.poll(timeout_ms)
+            # consumer.commit()
+        finally:
+            consumer.close()
+        return messages
+
+    def __repr__(self):
+        """
+        A pretty version of the connection string.
+        """
+        connected = self.consumer is not None
+        return '<KafkaConsumerHook connected? %s, server: %s, topic: %s>' % \
+            (connected, self.server, self.topic)

Review comment:
   Not sure if this will be ever used





This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] turbaszek commented on a change in pull request #7407: [AIRFLOW-6786] Add KafkaConsumerHook, KafkaProduerHook and KafkaSensor

2020-08-06 Thread GitBox


turbaszek commented on a change in pull request #7407:
URL: https://github.com/apache/airflow/pull/7407#discussion_r466691419



##
File path: airflow/providers/apache/kafka/hooks/kafka_consumer_hook.py
##
@@ -0,0 +1,86 @@
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+
+from kafka import KafkaConsumer
+
+from airflow.hooks.base_hook import BaseHook
+
+
+class KafkaConsumerHook(BaseHook):
+    """
+    KafkaConsumerHook Class.
+    """
+    DEFAULT_HOST = 'kafka1'
+    DEFAULT_PORT = 9092
+
+    def __init__(self, topic, host=DEFAULT_HOST, port=DEFAULT_PORT, kafka_conn_id='kafka_default'):

Review comment:
   Would you mind adding type hints? There's already an ongoing effort to 
improve mypy coverage, see #9708.
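   
   For illustration, the hinted signature could look roughly like this (only
   the signature matters here; the body is elided):
   
   ```python
   # Illustrative only - the __init__ from the diff above with type hints added.
   from airflow.hooks.base_hook import BaseHook


   class KafkaConsumerHook(BaseHook):
       DEFAULT_HOST = 'kafka1'
       DEFAULT_PORT = 9092

       def __init__(
           self,
           topic: str,
           host: str = DEFAULT_HOST,
           port: int = DEFAULT_PORT,
           kafka_conn_id: str = 'kafka_default',
       ) -> None:
           super().__init__(None)
           self.topic = topic
           self.host = host
           self.port = port
           self.conn_id = kafka_conn_id
   ```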





This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] amithmathew commented on issue #9461: Unclear documentation for the delegate_to parameter

2020-08-06 Thread GitBox


amithmathew commented on issue #9461:
URL: https://github.com/apache/airflow/issues/9461#issuecomment-670193295


   `delegate_to` would be used to impersonate a user account using a (specific) 
service account - it does look like `delegate_to` may be 
[used](https://developers.google.com/admin-sdk/reports/v1/guides/delegation) on 
the GSuite side of things.
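   
   For context, a hedged sketch of the underlying mechanism with google-auth
   (the key path, scope and user email are placeholders):
   
   ```python
   # Illustrative only: domain-wide delegation with google-auth, where a
   # service account credential impersonates ("delegates to") a GSuite user.
   from google.oauth2 import service_account

   SCOPES = ["https://www.googleapis.com/auth/admin.reports.audit.readonly"]  # placeholder scope

   credentials = service_account.Credentials.from_service_account_file(
       "/path/to/key.json",  # placeholder path
       scopes=SCOPES,
   )
   # Roughly the call that a `delegate_to` parameter boils down to:
   delegated_credentials = credentials.with_subject("user@example.com")  # placeholder user
   ```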



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] jaketf commented on issue #9461: Unclear documentation for the delegate_to parameter

2020-08-06 Thread GitBox


jaketf commented on issue #9461:
URL: https://github.com/apache/airflow/issues/9461#issuecomment-670188547


   Taking a look at the code now it seems we have this common 
[GoogleBaseHook](https://github.com/apache/airflow/blob/d79e7221de76f01b5cd36c15224b59e8bb451c90/airflow/providers/google/common/hooks/base_google.py#L125)
 used by hooks for gsuite and cloud. This `delegate_to` seems not really 
useful for cloud, and I don't think scenario 2 (delegating to a human 
user to impersonate a service account) is an advisable pattern / one worth 
supporting in airflow core. I think `delegate_to` should be removed / 
deprecated from the Google Cloud Hooks / Operators to avoid confusion.
   
   To play devil's advocate: there may be use cases where users expect 
`delegate_to` to attribute API calls (e.g. a BQ query) to the delegated human 
user. Again, I don't think I'd recommend this as an auditing posture, as anyone 
could throw j...@foo.com into `delegate_to` and bootstrap my IAM 
permissions. IMO this seems like something we shouldn't support.



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[airflow] branch v1-10-test updated: fixup! Pin fsspec<8.0.0 for Python <3.6 to fix Static Checks

2020-08-06 Thread kaxilnaik
This is an automated email from the ASF dual-hosted git repository.

kaxilnaik pushed a commit to branch v1-10-test
in repository https://gitbox.apache.org/repos/asf/airflow.git


The following commit(s) were added to refs/heads/v1-10-test by this push:
 new a513964  fixup! Pin fsspec<8.0.0 for Python <3.6 to fix Static Checks
a513964 is described below

commit a5139645da073480d32bf799afd98947c3c12b3d
Author: Kaxil Naik 
AuthorDate: Thu Aug 6 21:11:40 2020 +0100

fixup! Pin fsspec<8.0.0 for Python <3.6 to fix Static Checks
---
 setup.py | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/setup.py b/setup.py
index af6f5ab..e16b1cd 100644
--- a/setup.py
+++ b/setup.py
@@ -326,7 +326,7 @@ papermill = [
 'papermill[all]>=1.0.0',
 'nteract-scrapbook[all]>=0.2.1',
 'pyarrow<1.0.0',
-'fsspec<0.8.0;python_version<"3.6"'
+'fsspec<0.8.0;python_version=="3.5"'
 ]
 password = [
 'bcrypt>=2.0.0',



[GitHub] [airflow] kaxil commented on a change in pull request #9008: Get connections uri with AWS Secrets Manager backend

2020-08-06 Thread GitBox


kaxil commented on a change in pull request #9008:
URL: https://github.com/apache/airflow/pull/9008#discussion_r466649936



##
File path: airflow/providers/amazon/aws/secrets/secrets_manager.py
##
@@ -60,15 +59,23 @@ class SecretsManagerBackend(BaseSecretsBackend, 
LoggingMixin):
 
 def __init__(
 self,
-connections_prefix: str = 'airflow/connections',
-variables_prefix: str = 'airflow/variables',
-profile_name: Optional[str] = None,
-sep: str = "/",
+connections_prefix=None,

Review comment:
   The following should work in your case (can you try):
   
   ```ini
   [secrets]
   backend = airflow.contrib.secrets.aws_secrets_manager.SecretsManagerBackend
   backend_kwargs = {"connections_prefix": "", "variables_prefix": "", "sep": 
""}
   ```





This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] kaxil commented on a change in pull request #9008: Get connections uri with AWS Secrets Manager backend

2020-08-06 Thread GitBox


kaxil commented on a change in pull request #9008:
URL: https://github.com/apache/airflow/pull/9008#discussion_r466649936



##
File path: airflow/providers/amazon/aws/secrets/secrets_manager.py
##
@@ -60,15 +59,23 @@ class SecretsManagerBackend(BaseSecretsBackend, 
LoggingMixin):
 
 def __init__(
 self,
-connections_prefix: str = 'airflow/connections',
-variables_prefix: str = 'airflow/variables',
-profile_name: Optional[str] = None,
-sep: str = "/",
+connections_prefix=None,

Review comment:
   The following should work in your case (can you try):
   
   ```ini
   [secrets]
   backend = airflow.contrib.secrets.aws_secrets_manager.SecretsManagerBackend
   backend_kwargs = {"connections_prefix": "", "sep": ""}
   ```





This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[airflow] branch v1-10-test updated (2f4d872 -> badd86a)

2020-08-06 Thread kaxilnaik
This is an automated email from the ASF dual-hosted git repository.

kaxilnaik pushed a change to branch v1-10-test
in repository https://gitbox.apache.org/repos/asf/airflow.git.


omit 2f4d872  Pin fsspec<8.0.0 for Python <3.6 to fix Static Checks
 new badd86a  Pin fsspec<8.0.0 for Python <3.6 to fix Static Checks

This update added new revisions after undoing existing revisions.
That is to say, some revisions that were in the old version of the
branch are not in the new version.  This situation occurs
when a user --force pushes a change and generates a repository
containing something like this:

 * -- * -- B -- O -- O -- O   (2f4d872)
\
 N -- N -- N   refs/heads/v1-10-test (badd86a)

You should already have received notification emails for all of the O
revisions, and so the following emails describe only the N revisions
from the common base, B.

Any revisions marked "omit" are not gone; other references still
refer to them.  Any revisions marked "discard" are gone forever.

The 1 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.


Summary of changes:
 setup.py | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)



[airflow] 01/01: Pin fsspec<8.0.0 for Python <3.6 to fix Static Checks

2020-08-06 Thread kaxilnaik
This is an automated email from the ASF dual-hosted git repository.

kaxilnaik pushed a commit to branch v1-10-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit badd86ae4ed9b002265f474674bad9371e22f9a0
Author: Kaxil Naik 
AuthorDate: Thu Aug 6 20:19:35 2020 +0100

Pin fsspec<8.0.0 for Python <3.6 to fix Static Checks
---
 setup.py | 3 ++-
 1 file changed, 2 insertions(+), 1 deletion(-)

diff --git a/setup.py b/setup.py
index 327e157..af6f5ab 100644
--- a/setup.py
+++ b/setup.py
@@ -325,7 +325,8 @@ pagerduty = [
 papermill = [
 'papermill[all]>=1.0.0',
 'nteract-scrapbook[all]>=0.2.1',
-'pyarrow<1.0.0'
+'pyarrow<1.0.0',
+'fsspec<0.8.0;python_version<"3.6"'
 ]
 password = [
 'bcrypt>=2.0.0',



[airflow] branch v1-10-test updated: Pin fsspec<8.0.0 for Python <3.6 to fix Static Checks

2020-08-06 Thread kaxilnaik
This is an automated email from the ASF dual-hosted git repository.

kaxilnaik pushed a commit to branch v1-10-test
in repository https://gitbox.apache.org/repos/asf/airflow.git


The following commit(s) were added to refs/heads/v1-10-test by this push:
 new 2f4d872  Pin fsspec<8.0.0 for Python <3.6 to fix Static Checks
2f4d872 is described below

commit 2f4d8727e38e1a8a60a1c8889e74f60e387d3b73
Author: Kaxil Naik 
AuthorDate: Thu Aug 6 20:19:35 2020 +0100

Pin fsspec<8.0.0 for Python <3.6 to fix Static Checks
---
 setup.py | 3 ++-
 1 file changed, 2 insertions(+), 1 deletion(-)

diff --git a/setup.py b/setup.py
index 327e157..b4d1374 100644
--- a/setup.py
+++ b/setup.py
@@ -325,7 +325,8 @@ pagerduty = [
 papermill = [
 'papermill[all]>=1.0.0',
 'nteract-scrapbook[all]>=0.2.1',
-'pyarrow<1.0.0'
+'pyarrow<1.0.0',
+'fsspec<8.0.0;python_version<"3.6"'
 ]
 password = [
 'bcrypt>=2.0.0',



[airflow] branch v1-10-test updated (54d8aae -> 2f4d872)

2020-08-06 Thread kaxilnaik
This is an automated email from the ASF dual-hosted git repository.

kaxilnaik pushed a change to branch v1-10-test
in repository https://gitbox.apache.org/repos/asf/airflow.git.


from 54d8aae  Revert "Enable pretty output in mypy (#9785)"
 add 2f4d872  Pin fsspec<8.0.0 for Python <3.6 to fix Static Checks

No new revisions were added by this update.

Summary of changes:
 setup.py | 3 ++-
 1 file changed, 2 insertions(+), 1 deletion(-)



[GitHub] [airflow] fengsi commented on issue #10206: Make `webserver_config.py` location customizable

2020-08-06 Thread GitBox


fengsi commented on issue #10206:
URL: https://github.com/apache/airflow/issues/10206#issuecomment-670144119


   > @fengsi I like this idea. This can be standardized to have a similar 
interface to `logging_config_class`. Would you like to work on it?
   
   Sure, I'd be willing to make that happen.  I may also take care of other 
config paths together. 👍



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] jkbngl commented on pull request #9295: added mssql to oracle transfer operator

2020-08-06 Thread GitBox


jkbngl commented on pull request #9295:
URL: https://github.com/apache/airflow/pull/9295#issuecomment-670141704


   Hi @ephraimbuddy it seems like your suggestion worked, thanks a lot 👍 



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] mik-laj commented on issue #10206: Make `webserver_config.py` location customizable

2020-08-06 Thread GitBox


mik-laj commented on issue #10206:
URL: https://github.com/apache/airflow/issues/10206#issuecomment-670116878


   @fengsi  I like this idea. This can be standardized to have a similar 
interface to `logging_config_class`.  Would you like to work on it?
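   
   For illustration, a sketch of the `logging_config_class`-style lookup being
   proposed here (the option name `webserver_config_class` and the function are
   hypothetical):
   
   ```python
   # Illustrative only - mirrors how a dotted-path config option can be resolved.
   from airflow.configuration import conf
   from airflow.utils.module_loading import import_string


   def resolve_webserver_config():
       # Assumes the (hypothetical) option defaults to an empty string in airflow.cfg.
       dotted_path = conf.get("webserver", "webserver_config_class")
       if dotted_path:
           # e.g. "my_company.airflow_settings.webserver_config"
           return import_string(dotted_path)
       return None  # fall back to the default $AIRFLOW_HOME/webserver_config.py
   ```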



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[airflow] branch master updated (eff0f03 -> d79e722)

2020-08-06 Thread potiuk
This is an automated email from the ASF dual-hosted git repository.

potiuk pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/airflow.git.


from eff0f03   Update guide for Google Cloud Secret Manager Backend (#10172)
 add d79e722  Type annotation for Docker operator (#9733)

No new revisions were added by this update.

Summary of changes:
 airflow/providers/docker/hooks/docker.py   | 13 
 airflow/providers/docker/operators/docker.py   | 20 -
 airflow/providers/docker/operators/docker_swarm.py | 35 ++
 3 files changed, 42 insertions(+), 26 deletions(-)



[GitHub] [airflow] potiuk merged pull request #9733: Increasing typing coverage for Docker

2020-08-06 Thread GitBox


potiuk merged pull request #9733:
URL: https://github.com/apache/airflow/pull/9733


   



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] potiuk closed issue #8696: Skip task itself instead of all downstream tasks

2020-08-06 Thread GitBox


potiuk closed issue #8696:
URL: https://github.com/apache/airflow/issues/8696


   



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] mik-laj commented on a change in pull request #10081: Add CLI for connections export (#9856)

2020-08-06 Thread GitBox


mik-laj commented on a change in pull request #10081:
URL: https://github.com/apache/airflow/pull/10081#discussion_r466598778



##
File path: docs/howto/connection/index.rst
##
@@ -88,6 +88,31 @@ Alternatively you may specify each parameter individually:
 --conn-schema 'schema' \
 ...
 
+.. _connection/export:
+
+Exporting Connections from the CLI
+--
+
+You may export connections from the database using the CLI. The supported 
formats are ``json``, ``yaml`` and ``env``.

Review comment:
   Can you describe exactly what each of these file formats looks like? It 
is very important that a user can look at the documentation and know what 
such a file looks like. Here is an example of a similar description, so you can 
copy the common parts into your docs:
   
https://airflow.readthedocs.io/en/latest/howto/use-alternative-secrets-backend.html#local-filesystem-secrets-backend
   At this point, it also makes sense to add a reference to your section there:
   >  You can create the file by exporting the connections from the database; 
for more information see: "Exporting Connections from the CLI"
   
   Documentation is often a more difficult part than writing the 
implementation, but I will be very happy to help you with this.
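
   For illustration, the `env` format could look roughly like the sketch below, with one 
`<conn_id>=<connection URI>` pair per line (the connection ids and URIs here are made up, 
and the exact output depends on the final implementation):
   
   ```bash
   # Hypothetical output of: airflow connections export my_connections.env --format env
   MY_DB=postgres://db_user:db_password@db-host:5432/mydb
   MY_HTTP_API=http://api_user:api_password@api.example.com
   ```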





This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org







[airflow] branch master updated (0c77ea8 -> eff0f03)

2020-08-06 Thread kamilbregula
This is an automated email from the ASF dual-hosted git repository.

kamilbregula pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/airflow.git.


from 0c77ea8  Add type annotations to S3 hook module (#10164)
 add eff0f03   Update guide for Google Cloud Secret Manager Backend (#10172)

No new revisions were added by this update.

Summary of changes:
 .../google/cloud/secrets/secret_manager.py |   5 +-
 docs/howto/use-alternative-secrets-backend.rst | 109 -
 2 files changed, 89 insertions(+), 25 deletions(-)



[GitHub] [airflow] potiuk closed issue #10204: ModuleNotFoundError: No module named 'docker' in DockerOperator

2020-08-06 Thread GitBox


potiuk closed issue #10204:
URL: https://github.com/apache/airflow/issues/10204


   



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] potiuk edited a comment on issue #10204: ModuleNotFoundError: No module named 'docker' in DockerOperator

2020-08-06 Thread GitBox


potiuk edited a comment on issue #10204:
URL: https://github.com/apache/airflow/issues/10204#issuecomment-670088238


   Not really. See https://airflow.apache.org/docs/stable/installation.html
   
   Apache Airflow has a number of extras that you can choose when installing it. 
For example, `pip install airflow[kubernetes]` will install everything needed to run 
Kubernetes. The file you mentioned is a constraint file only (and we will be 
changing that mechanism slightly soon). You can read the installation 
docs, where you will find more details: 
https://airflow.apache.org/docs/stable/installation.html
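
   As a concrete illustration of the extras mechanism (a sketch; the version pin and the 
constraint file URL below follow the 1.10.11 example from this issue and may differ for 
other releases):
   
   ```bash
   # Install Airflow together with the "docker" extra so docker-py is
   # pulled in alongside the core package, respecting the pinned constraints.
   pip install "apache-airflow[docker]==1.10.11" \
       --constraint "https://raw.githubusercontent.com/apache/airflow/1.10.11/requirements/requirements-python3.6.txt"
   ```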



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] potiuk commented on issue #10204: ModuleNotFoundError: No module named 'docker' in DockerOperator

2020-08-06 Thread GitBox


potiuk commented on issue #10204:
URL: https://github.com/apache/airflow/issues/10204#issuecomment-670088238


   Not really. See https://airflow.apache.org/docs/stable/installation.html
   
   Apache has a number of extras that you can choose when installing it . For 
example pip install airflow[kubernetes] will install all the stuff to run 
kubernetes. The file you mentioned is a constraint file only (and we are 
changing the mechanism of those  slightly soon). You can read the installation 
doc where you find some details: 
https://airflow.apache.org/docs/stable/installation.html



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] mik-laj merged pull request #10172: Update guide for Google Cloud Secret Manager Backend

2020-08-06 Thread GitBox


mik-laj merged pull request #10172:
URL: https://github.com/apache/airflow/pull/10172


   



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] mik-laj commented on issue #10165: Add missing `labels` parameter to MLEngineTrainingOperator

2020-08-06 Thread GitBox


mik-laj commented on issue #10165:
URL: https://github.com/apache/airflow/issues/10165#issuecomment-670084435


   @coopergillan I assigned you to this ticket.



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] boring-cyborg[bot] commented on issue #10206: Make `webserver_config.py` location customizable

2020-08-06 Thread GitBox


boring-cyborg[bot] commented on issue #10206:
URL: https://github.com/apache/airflow/issues/10206#issuecomment-670083722


   Thanks for opening your first issue here! Be sure to follow the issue 
template!
   



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] fengsi opened a new issue #10206: Make `webserver_config.py` location customizable

2020-08-06 Thread GitBox


fengsi opened a new issue #10206:
URL: https://github.com/apache/airflow/issues/10206


   **Description**
   `webserver_config.py` location is 
[hard-coded](https://github.com/apache/airflow/blob/c2db0dfeb13ee679bf4d7b57874f0fcb39c0f0ed/airflow/configuration.py#L769)
 as follows:
   ```
   WEBSERVER_CONFIG = AIRFLOW_HOME + '/webserver_config.py'
   ```
   
   **Use case / motivation**
   It would be great if this path were customizable. Since `AIRFLOW_HOME/config` 
is already in `PYTHONPATH` (though that's also hard-coded), it's common for 
users to mount all config files under `AIRFLOW_HOME/config`. However, this 
`webserver_config.py` needs extra handling, either by mounting it using an explicit 
`subPath`, or in my case, by creating a symbolic link from 
`AIRFLOW_HOME/webserver_config.py` to `AIRFLOW_HOME/config/webserver_config.py` 
in the image.
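
   A minimal sketch of one possible approach (the environment variable name and fallback 
logic below are purely hypothetical, not an existing Airflow setting):
   
   ```python
   import os
   
   # Hypothetical resolution order: explicit override first, then the
   # conventional AIRFLOW_HOME/webserver_config.py location.
   AIRFLOW_HOME = os.environ.get("AIRFLOW_HOME", os.path.expanduser("~/airflow"))
   WEBSERVER_CONFIG = os.environ.get(
       "AIRFLOW__WEBSERVER__CONFIG_FILE",  # hypothetical override variable
       os.path.join(AIRFLOW_HOME, "webserver_config.py"),  # current default
   )
   ```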



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] ephraimbuddy opened a new pull request #10205: Add correct signature to all operators and sensors in providers package

2020-08-06 Thread GitBox


ephraimbuddy opened a new pull request #10205:
URL: https://github.com/apache/airflow/pull/10205


   This PR closes part of #9942. It enforces keyword-only arguments for all 
sensors and operators in the providers package.
   
   ---
   **^ Add meaningful description above**
   
   Read the **[Pull Request 
Guidelines](https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst#pull-request-guidelines)**
 for more information.
   In case of fundamental code change, Airflow Improvement Proposal 
([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals))
 is needed.
   In case of a new dependency, check compliance with the [ASF 3rd Party 
License Policy](https://www.apache.org/legal/resolved.html#category-x).
   In case of backwards incompatible changes please leave a note in 
[UPDATING.md](https://github.com/apache/airflow/blob/master/UPDATING.md).
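
   For context, the keyword-only pattern being enforced looks roughly like the sketch 
below (illustrative only; the operator name and its arguments are made up, not code from 
this PR):
   
   ```python
   from airflow.models import BaseOperator
   from airflow.utils.decorators import apply_defaults
   
   
   class ExampleCopyOperator(BaseOperator):
       """Illustrative operator with a keyword-only __init__ signature."""
   
       @apply_defaults
       def __init__(self, *, bucket_name: str, prefix: str = "", **kwargs) -> None:
           # The bare `*` forces bucket_name and prefix to be passed by keyword,
           # so positional call sites cannot silently bind the wrong argument.
           super().__init__(**kwargs)
           self.bucket_name = bucket_name
           self.prefix = prefix
   ```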
   



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] tambulkar opened a new issue #10204: ModuleNotFoundError: No module named 'docker' in DockerOperator

2020-08-06 Thread GitBox


tambulkar opened a new issue #10204:
URL: https://github.com/apache/airflow/issues/10204


   **Apache Airflow version**: 1.10.11
   
   **Kubernetes version (if you are using kubernetes)** (use `kubectl version`):
   
   **Environment**: Conda Python 3.6.11
   
   **What happened**:
   
   Trying to import the DockerOperator from `airflow.operators.docker_operator`, 
I get this error:
   `ModuleNotFoundError: No module named 'docker'`
   
   **What you expected to happen**:
   
   Shouldn't airflow automatically install all the dependencies listed 
[here](https://github.com/apache/airflow/blob/1.10.11/requirements/requirements-python3.6.txt)
 when I run `pip install apache-airflow`? It is easily fixed by manually 
running `pip install docker`, but is this the expected behavior? 
   
   **How to reproduce it**:
   ```
   conda create -n test python=3.6.11 --yes
   conda activate test
   pip install apache-airflow
   pip freeze
   ```
   [docker](https://github.com/docker/docker-py) doesn't appear in the list of 
installed packages 



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] j-y-matsubara commented on issue #8696: Skip task itself instead of all downstream tasks

2020-08-06 Thread GitBox


j-y-matsubara commented on issue #8696:
URL: https://github.com/apache/airflow/issues/8696#issuecomment-670063962


   @yuqian90 
   I am sorry to bother you, but is it possible to close this issue?
   The PR has been merged, but it didn't seem to get linked to this issue.



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] j-y-matsubara commented on issue #8696: Skip task itself instead of all downstream tasks

2020-08-06 Thread GitBox


j-y-matsubara commented on issue #8696:
URL: https://github.com/apache/airflow/issues/8696#issuecomment-670059961


   Thank you for your reply, @ccage-simp 
   I'm sorry, I misinterpreted your comment.
   
   Best regards,



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] rootcss edited a comment on pull request #10081: Add CLI for connections export (#9856)

2020-08-06 Thread GitBox


rootcss edited a comment on pull request #10081:
URL: https://github.com/apache/airflow/pull/10081#issuecomment-670043056


   > Users are asking for this command on Slack, so you're doing an important 
feature :-D
   > https://apache-airflow.slack.com/archives/CCR6P6JRL/p1596726375390300
   > I also thought today how to use it in another Airflow feature and I can't 
wait for it to be done.
   
   Nice. The docs will be ready today too. Code changes are done.
   
   Edit: Updated the docs too @mik-laj 
   



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] bellhea removed a comment on issue #10202: TaskHandlerWithCustomFormatter adds prefix twice

2020-08-06 Thread GitBox


bellhea removed a comment on issue #10202:
URL: https://github.com/apache/airflow/issues/10202#issuecomment-670059026


   -



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] bellhea closed issue #10202: TaskHandlerWithCustomFormatter adds prefix twice

2020-08-06 Thread GitBox


bellhea closed issue #10202:
URL: https://github.com/apache/airflow/issues/10202


   



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] bellhea commented on issue #10202: TaskHandlerWithCustomFormatter adds prefix twice

2020-08-06 Thread GitBox


bellhea commented on issue #10202:
URL: https://github.com/apache/airflow/issues/10202#issuecomment-670059026


   -



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] OmairK commented on pull request #9733: Increasing typing coverage for Docker

2020-08-06 Thread GitBox


OmairK commented on pull request #9733:
URL: https://github.com/apache/airflow/pull/9733#issuecomment-670052862


   > @OmairK Can you do a rebase?
   
   Done 



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] kaxil commented on a change in pull request #10084: Fix more PodMutationHook issues for backwards compatibility

2020-08-06 Thread GitBox


kaxil commented on a change in pull request #10084:
URL: https://github.com/apache/airflow/pull/10084#discussion_r466548177



##
File path: airflow/kubernetes/pod_generator.py
##
@@ -637,6 +640,37 @@ def extend_object_field(base_obj, client_obj, field_name):
 setattr(client_obj_cp, field_name, base_obj_field)
 return client_obj_cp
 
-appended_fields = base_obj_field + client_obj_field
+base_obj_set = get_dict_from_list(base_obj_field)
+client_obj_set = get_dict_from_list(client_obj_field)
+
+appended_fields = _merge_list_of_objects(base_obj_set, client_obj_set)
+
 setattr(client_obj_cp, field_name, appended_fields)
 return client_obj_cp
+
+
+def _merge_list_of_objects(base_obj_set, client_obj_set):
+for k, v in base_obj_set.items():
+if k not in client_obj_set:
+client_obj_set[k] = v
+else:
+client_obj_set[k] = merge_objects(v, client_obj_set[k])
+appended_field_keys = sorted(client_obj_set.keys())
+appended_fields = [client_obj_set[k] for k in appended_field_keys]
+return appended_fields
+
+
+def get_dict_from_list(base_list):
+"""
+:param base_list:
+:type base_list: list(Optional[dict,

Review comment:
   We need to fix this or remove the type and make it an internal method.
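
   For reference, a fully spelled-out annotation for such a helper might look like the 
sketch below (the element types are an assumption for illustration, not the final 
signature):
   
   ```python
   from typing import Dict, List, Union
   
   
   def get_dict_from_list(base_list: List[Union[dict, object]]) -> Dict[str, Union[dict, object]]:
       """
       Index a list of objects (plain dicts or Kubernetes API model objects) by name.
   
       :param base_list: the objects to index by their ``name``
       :type base_list: List[Union[dict, object]]
       :return: a mapping from object name to object
       :rtype: Dict[str, Union[dict, object]]
       """
   ```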





This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] rootcss commented on pull request #10081: Add CLI for connections export (#9856)

2020-08-06 Thread GitBox


rootcss commented on pull request #10081:
URL: https://github.com/apache/airflow/pull/10081#issuecomment-670043056


   > Users are asking for this command on Slack, so you're doing an important 
feature :-D
   > https://apache-airflow.slack.com/archives/CCR6P6JRL/p1596726375390300
   > I also thought today how to use it in another Airflow feature and I can't 
wait for it to be done.
   
   Nice. The docs will be ready today too. Code changes are done.
   



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] coopergillan commented on issue #10165: Add missing `labels` parameter to MLEngineTrainingOperator

2020-08-06 Thread GitBox


coopergillan commented on issue #10165:
URL: https://github.com/apache/airflow/issues/10165#issuecomment-670037712


   I can give a shot at picking this up.



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] potiuk commented on a change in pull request #10172: Update guide for Google Cloud Secret Manager Backend

2020-08-06 Thread GitBox


potiuk commented on a change in pull request #10172:
URL: https://github.com/apache/airflow/pull/10172#discussion_r466536026



##
File path: docs/howto/use-alternative-secrets-backend.rst
##
@@ -383,48 +383,75 @@ Note that the secret ``Key`` is ``value``, and secret 
``Value`` is ``world`` and
 
 .. _secret_manager_backend:
 
-GCP Secret Manager Backend
-^^
+Google Cloud Secret Manager Backend
+^^^
 
-To enable GCP Secrets Manager to retrieve connection/variables, specify 
:py:class:`~airflow.providers.google.cloud.secrets.secret_manager.CloudSecretManagerBackend`
-as the ``backend`` in  ``[secrets]`` section of ``airflow.cfg``.
+This topic describes how to configure Airflow to use `Secret Manager 
`__ as
+a secret bakcned and how to manage secrets.
 
-Available parameters to ``backend_kwargs``:
+Before you begin
+
 
-* ``connections_prefix``: Specifies the prefix of the secret to read to get 
Connections.
-* ``variables_prefix``: Specifies the prefix of the secret to read to get 
Variables.
-* ``gcp_key_path``: Path to GCP Credential JSON file
-* ``gcp_scopes``: Comma-separated string containing GCP scopes
-* ``sep``: separator used to concatenate connections_prefix and conn_id. 
Default: "-"
+`Configure Secret Manager and your local environment 
`__, 
once per project.
 
-Note: The full GCP Secrets Manager secret id should follow the pattern 
"[a-zA-Z0-9-_]".
+Enabling the secret backend
+"""
 
-Here is a sample configuration if you want to just retrieve connections:
+To enable the secret backend for Google Cloud Secrets Manager to retrieve 
connection/variables,
+specify 
:py:class:`~airflow.providers.google.cloud.secrets.secret_manager.CloudSecretManagerBackend`

Review comment:
   Backport Packages have their own documentation, and I already have a 
mechanism to incorporate some extra information into it. I will extract some of 
the useful GCP guides there with the next wave of backport packages.
   
   The documentation is here: 
https://github.com/apache/airflow/tree/master/airflow/providers/google
   
   And when released, it can be found on PyPI: 
https://pypi.org/project/apache-airflow-backport-providers-google/2020.6.24/ 
   





This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] gardnerdev opened a new issue #10203: Logs not visible in UI even though they exist

2020-08-06 Thread GitBox


gardnerdev opened a new issue #10203:
URL: https://github.com/apache/airflow/issues/10203


   **Apache Airflow version**:
   1.10.10
   
   **Kubernetes version (if you are using kubernetes)** (use `kubectl version`):
   1.18.3
   
   **Environment**:
   Kubernetes cluster 
   
   
   **What happened**:
   During DAG execution I am not able to see the logs: 
   
   > *** Log file does not exist: 
/opt/airflow/logs/example_dag/python_print_date_152/2019-01-01T00:00:00+00:00/1.log
   *** Fetching from: 
http://fi-airflow-worker-97bcfb76-jdwj7:8793/log/example_dag/python_print_date_152/2019-01-01T00:00:00+00:00/1.log
   *** Failed to fetch log file from worker. 
HTTPConnectionPool(host='fi-airflow-worker-97bcfb76-jdwj7', port=8793): Max 
retries exceeded with url: 
/log/example_dag/python_print_date_152/2019-01-01T00:00:00+00:00/1.log (Caused 
by NewConnectionError(': Failed to establish a new connection: [Errno -2] Name or 
service not known'))
   
   
   
   
   **What you expected to happen**:
   See logs from DAG execution in the UI, not only from the command line.



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[airflow] branch master updated (817e1ac -> 0c77ea8)

2020-08-06 Thread turbaszek
This is an automated email from the ASF dual-hosted git repository.

turbaszek pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/airflow.git.


from 817e1ac  Add thredup to list of Airflow users (#10198)
 add 0c77ea8  Add type annotations to S3 hook module (#10164)

No new revisions were added by this update.

Summary of changes:
 airflow/providers/amazon/aws/hooks/s3.py | 172 ++-
 1 file changed, 103 insertions(+), 69 deletions(-)



[GitHub] [airflow] codecov-commenter commented on pull request #9733: Increasing typing coverage for Docker

2020-08-06 Thread GitBox


codecov-commenter commented on pull request #9733:
URL: https://github.com/apache/airflow/pull/9733#issuecomment-670027989


   # [Codecov](https://codecov.io/gh/apache/airflow/pull/9733?src=pr&el=h1) 
Report
   > Merging 
[#9733](https://codecov.io/gh/apache/airflow/pull/9733?src=pr&el=desc) into 
[master](https://codecov.io/gh/apache/airflow/commit/817e1ac938667eef9cea4ae6d0502f3ec571bd4a&el=desc)
 will **decrease** coverage by `54.69%`.
   > The diff coverage is `50.00%`.
   
   [![Impacted file tree 
graph](https://codecov.io/gh/apache/airflow/pull/9733/graphs/tree.svg?width=650&height=150&src=pr&token=WdLKlKHOAU)](https://codecov.io/gh/apache/airflow/pull/9733?src=pr&el=tree)
   
   ```diff
   @@            Coverage Diff             @@
   ##           master    #9733       +/-   ##
   ===========================================
   - Coverage   89.42%   34.73%   -54.70%     
   ===========================================
     Files        1037     1037               
     Lines       49985    49709     -276      
   ===========================================
   - Hits        44699    17266   -27433      
   - Misses       5286    32443   +27157      
   ```
   
   | Flag | Coverage Δ | |
   |---|---|---|
   | #kubernetes-tests-3.6-9.6 | `?` | |
   | #kubernetes-tests-image-3.6-v1.16.9 | `?` | |
   | #kubernetes-tests-image-3.6-v1.17.5 | `?` | |
   | #kubernetes-tests-image-3.6-v1.18.6 | `?` | |
   | #kubernetes-tests-image-3.7-v1.16.9 | `?` | |
   | #kubernetes-tests-image-3.7-v1.17.5 | `?` | |
   | #kubernetes-tests-image-3.7-v1.18.6 | `?` | |
   | #mysql-tests-Core-3.7-5.7 | `?` | |
   | #mysql-tests-Core-3.8-5.7 | `?` | |
   | #mysql-tests-Integration-3.7-5.7 | `?` | |
   | #mysql-tests-Integration-3.8-5.7 | `?` | |
   | #postgres-tests-Core-3.6-10 | `?` | |
   | #postgres-tests-Core-3.6-9.6 | `?` | |
   | #postgres-tests-Core-3.7-10 | `?` | |
   | #postgres-tests-Core-3.7-9.6 | `?` | |
   | #postgres-tests-Integration-3.6-10 | `34.73% <50.00%> (-0.01%)` | 
:arrow_down: |
   | #postgres-tests-Integration-3.6-9.6 | `?` | |
   | #postgres-tests-Integration-3.7-10 | `?` | |
   | #postgres-tests-Integration-3.7-9.6 | `?` | |
   | #sqlite-tests-Core-3.6 | `?` | |
   | #sqlite-tests-Integration-3.6 | `?` | |
   | #sqlite-tests-Integration-3.8 | `?` | |
   
   Flags with carried forward coverage won't be shown. [Click 
here](https://docs.codecov.io/docs/carryforward-flags#carryforward-flags-in-the-pull-request-comment)
 to find out more.
   
   | [Impacted 
Files](https://codecov.io/gh/apache/airflow/pull/9733?src=pr&el=tree) | 
Coverage Δ | |
   |---|---|---|
   | 
[airflow/providers/docker/operators/docker\_swarm.py](https://codecov.io/gh/apache/airflow/pull/9733/diff?src=pr&el=tree#diff-YWlyZmxvdy9wcm92aWRlcnMvZG9ja2VyL29wZXJhdG9ycy9kb2NrZXJfc3dhcm0ucHk=)
 | `22.97% <36.84%> (-67.51%)` | :arrow_down: |
   | 
[airflow/providers/docker/operators/docker.py](https://codecov.io/gh/apache/airflow/pull/9733/diff?src=pr&el=tree#diff-YWlyZmxvdy9wcm92aWRlcnMvZG9ja2VyL29wZXJhdG9ycy9kb2NrZXIucHk=)
 | `19.62% <60.00%> (-75.52%)` | :arrow_down: |
   | 
[airflow/providers/docker/hooks/docker.py](https://codecov.io/gh/apache/airflow/pull/9733/diff?src=pr&el=tree#diff-YWlyZmxvdy9wcm92aWRlcnMvZG9ja2VyL2hvb2tzL2RvY2tlci5weQ==)
 | `24.39% <100.00%> (-68.11%)` | :arrow_down: |
   | 
[airflow/hooks/S3\_hook.py](https://codecov.io/gh/apache/airflow/pull/9733/diff?src=pr&el=tree#diff-YWlyZmxvdy9ob29rcy9TM19ob29rLnB5)
 | `0.00% <0.00%> (-100.00%)` | :arrow_down: |
   | 
[airflow/hooks/pig\_hook.py](https://codecov.io/gh/apache/airflow/pull/9733/diff?src=pr&el=tree#diff-YWlyZmxvdy9ob29rcy9waWdfaG9vay5weQ==)
 | `0.00% <0.00%> (-100.00%)` | :arrow_down: |
   | 
[airflow/hooks/hdfs\_hook.py](https://codecov.io/gh/apache/airflow/pull/9733/diff?src=pr&el=tree#diff-YWlyZmxvdy9ob29rcy9oZGZzX2hvb2sucHk=)
 | `0.00% <0.00%> (-100.00%)` | :arrow_down: |
   | 
[airflow/hooks/http\_hook.py](https://codecov.io/gh/apache/airflow/pull/9733/diff?src=pr&el=tree#diff-YWlyZmxvdy9ob29rcy9odHRwX2hvb2sucHk=)
 | `0.00% <0.00%> (-100.00%)` | :arrow_down: |
   | 
[airflow/hooks/jdbc\_hook.py](https://codecov.io/gh/apache/airflow/pull/9733/diff?src=pr&el=tree#diff-YWlyZmxvdy9ob29rcy9qZGJjX2hvb2sucHk=)
 | `0.00% <0.00%> (-100.00%)` | :arrow_down: |
   | 
[airflow/contrib/\_\_init\_\_.py](https://codecov.io/gh/apache/airflow/pull/9733/diff?src=pr&el=tree#diff-YWlyZmxvdy9jb250cmliL19faW5pdF9fLnB5)
 | `0.00% <0.00%> (-100.00%)` | :arrow_down: |
   | 
[airflow/hooks/druid\_hook.py](https://codecov.io/gh/apache/airflow/pull/9733/diff?src=pr&el=tree#diff-YWlyZmxvdy9ob29rcy9kcnVpZF9ob29rLnB5)
 | `0.00% <0.00%> (-100.00%)` | :arrow_down: |
   | ... and [921 
more](https://codecov.io/gh/apache/airflow/pull/9733/diff?src=pr&el=tree-more) 
| |
   
   --
   
   [Continue to review full report at 
Codecov](https://codecov.io/gh/apache/airflow/pull/9733?src=pr&el=continue).
   > **Legend** - [Click here to learn 
more](https://docs.codecov.io/docs/codecov-delta)
