[GitHub] [airflow] dstandish commented on a change in pull request #18447: add RedshiftStatementHook, RedshiftOperator

2021-09-27 Thread GitBox


dstandish commented on a change in pull request #18447:
URL: https://github.com/apache/airflow/pull/18447#discussion_r717216128



##
File path: tests/providers/amazon/aws/hooks/test_redshift_statement.py
##
@@ -0,0 +1,72 @@
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+#
+import json
+import unittest
+from unittest import mock
+
+from airflow.models import Connection
+from airflow.providers.amazon.aws.hooks.redshift_statement import RedshiftStatementHook
+
+
+class TestRedshiftStatementHookConn(unittest.TestCase):
+    def setUp(self):
+        super().setUp()
+
+        self.connection = Connection(login='login', password='password', host='host', port=5439, schema="dev")
+
+        class UnitTestRedshiftStatementHook(RedshiftStatementHook):
+            conn_name_attr = "redshift_conn_id"
+            conn_type = 'redshift+redshift_connector'
+
+        self.db_hook = UnitTestRedshiftStatementHook()
+        self.db_hook.get_connection = mock.Mock()
+        self.db_hook.get_connection.return_value = self.connection
+
+    def test_get_uri(self):
+        uri_shouldbe = 'redshift+redshift_connector://login:password@host:5439/dev'
+        x = self.db_hook.get_uri()
+        assert uri_shouldbe == x

Review comment:
   ```suggestion
   assert x == uri_shouldbe
   ```
   the convention with pytest is `assert actual == expected`
   
   A nitpick, but it would also be easier to read / more conventional to call it `expected` or `uri_expected` instead of `uri_shouldbe`.
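   For illustration, the convention reads like this in a minimal standalone test (the `get_uri` stub and the URI value below are illustrative stand-ins, not the hook's real implementation):

```python
def get_uri():
    # Stand-in for the hook method under test (illustrative only).
    return 'redshift+redshift_connector://login:password@host:5439/dev'


def test_get_uri():
    # pytest convention: `assert actual == expected`, with the
    # expected value named `expected` for readability.
    expected = 'redshift+redshift_connector://login:password@host:5439/dev'
    actual = get_uri()
    assert actual == expected
```

   Naming the expected value `expected` also makes pytest's assertion-failure diff easier to read at a glance.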




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] dstandish commented on a change in pull request #18447: add RedshiftStatementHook, RedshiftOperator

2021-09-27 Thread GitBox


dstandish commented on a change in pull request #18447:
URL: https://github.com/apache/airflow/pull/18447#discussion_r717131359



##
File path: airflow/providers/amazon/aws/hooks/redshift_statement.py
##
@@ -0,0 +1,131 @@
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+"""Execute statements against Amazon Redshift, using redshift_connector."""
+try:
+    from functools import cached_property
+except ImportError:
+    from cached_property import cached_property
+from typing import Dict, Union
+
+import redshift_connector
+from redshift_connector import Connection as RedshiftConnection
+
+from airflow.hooks.dbapi import DbApiHook
+
+
+class RedshiftStatementHook(DbApiHook):
+    """
+    Execute statements against Amazon Redshift, using redshift_connector
+
+    This hook requires the redshift_conn_id connection. This connection must
+    be initialized with the host, port, login, password. Additional connection
+    options can be passed to extra as a JSON string.
+
+    :param redshift_conn_id: reference to
+        :ref:`Amazon Redshift connection id`
+    :type redshift_conn_id: str
+
+    .. note::
+        get_sqlalchemy_engine() and get_uri() depend on sqlalchemy-amazon-redshift
+    """
+
+    conn_name_attr = 'redshift_conn_id'
+    default_conn_name = 'redshift_default'
+    conn_type = 'redshift+redshift_connector'
+    hook_name = 'Amazon Redshift'
+    supports_autocommit = True
+
+    @staticmethod
+    def get_ui_field_behavior() -> Dict:
+        """Returns custom field behavior"""
+        return {
+            "hidden_fields": [],
+            "relabeling": {'login': 'User', 'schema': 'Database'},
+        }
+
+    def __init__(self, *args, **kwargs) -> None:
+        super().__init__(*args, **kwargs)
+
+    @cached_property
+    def conn(self):
+        return self.get_connection(
+            self.redshift_conn_id  # type: ignore[attr-defined]  # pylint: disable=no-member
+        )
+
+    def _get_conn_params(self) -> Dict[str, Union[str, int]]:
+        """Helper method to retrieve connection args"""
+        conn = self.conn
+
+        conn_params: Dict[str, Union[str, int]] = {}
+
+        if conn.login:
+            conn_params['user'] = conn.login
+        if conn.password:
+            conn_params['password'] = conn.password
+        if conn.host:
+            conn_params['host'] = conn.host
+        if conn.port:
+            conn_params['port'] = conn.port
+        if conn.schema:
+            conn_params['database'] = conn.schema
+
+        return conn_params
+
+    def get_uri(self) -> str:
+        """
+        Override DbApiHook get_uri method for get_sqlalchemy_engine()
+
+        .. note::
+            Value passed to connection extra parameter will be excluded
+            from returned uri but passed to get_sqlalchemy_engine()
+            by default
+        """
+        from sqlalchemy.engine.url import URL
+
+        conn_params = self._get_conn_params()
+
+        conn = self.conn
+
+        conn_type = conn.conn_type or RedshiftStatementHook.conn_type
+
+        if 'user' in conn_params:
+            conn_params['username'] = conn_params.pop('user')
+
+        return URL(drivername=conn_type, **conn_params).__str__()

Review comment:
   ```suggestion
   return str(URL(drivername=conn_type, **conn_params))
   ```
   
   I believe `str()` is preferred to calling `__str__()` directly
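   A tiny sketch of the point, using a hypothetical `Url` class (a stand-in, not SQLAlchemy's `URL`): `str(obj)` dispatches to the type's `__str__`, so both spellings return the same value, and `str()` is the idiomatic one.

```python
class Url:
    """Hypothetical stand-in for an object with a custom __str__."""

    def __init__(self, drivername: str, host: str, port: int) -> None:
        self.drivername = drivername
        self.host = host
        self.port = port

    def __str__(self) -> str:
        return f"{self.drivername}://{self.host}:{self.port}"


url = Url("redshift+redshift_connector", "host", 5439)

# str() delegates to type(url).__str__; it is the conventional spelling
# and also handles edge cases (e.g. it works uniformly on any object).
assert str(url) == url.__str__()
```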

##
File path: airflow/providers/amazon/aws/operators/redshift.py
##
@@ -0,0 +1,73 @@
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the

[GitHub] [airflow] mik-laj commented on a change in pull request #18563: Test kubernetes refresh config

2021-09-27 Thread GitBox


mik-laj commented on a change in pull request #18563:
URL: https://github.com/apache/airflow/pull/18563#discussion_r717209660



##
File path: tests/kubernetes/test_refresh_config.py
##
@@ -35,3 +43,64 @@ def test_parse_timestamp_should_convert_regular_timezone_to_unix_timestamp(self)
     def test_parse_timestamp_should_throw_exception(self):
         with pytest.raises(ParserError):
             _parse_timestamp("foobar")
+
+    def test_get_kube_config_loader_for_yaml_file(self):
+        refresh_kube_config_loader = _get_kube_config_loader_for_yaml_file('./kube_config')
+
+        assert refresh_kube_config_loader is not None
+
+        assert refresh_kube_config_loader.current_context['name'] == 'federal-context'
+
+        context = refresh_kube_config_loader.current_context['context']
+        assert context is not None
+        assert context['cluster'] == 'horse-cluster'
+        assert context['namespace'] == 'chisel-ns'
+        assert context['user'] == 'green-user'
+
+    def test_get_api_key_with_prefix(self):
+
+        refresh_config = RefreshConfiguration()
+        refresh_config.api_key['key'] = '1234'
+        assert refresh_config is not None
+
+        api_key = refresh_config.get_api_key_with_prefix("key")
+
+        assert api_key == '1234'
+
+    def test_refresh_kube_config_loader(self):
+
+        current_context = _get_kube_config_loader_for_yaml_file('./kube_config').current_context
+
+        config_dict = {}
+        config_dict['current-context'] = 'federal-context'
+        config_dict['contexts'] = []
+        config_dict['contexts'].append(current_context)
+
+        config_dict['clusters'] = []
+
+        cluster_config = {}
+        cluster_config['api-version'] = 'v1'
+        cluster_config['server'] = 'http://cow.org:8080'
+        cluster_config['name'] = 'horse-cluster'
+        cluster_root_config = {}
+        cluster_root_config['cluster'] = cluster_config
+        cluster_root_config['name'] = 'horse-cluster'
+        config_dict['clusters'].append(cluster_root_config)
+
+        refresh_kube_config_loader = RefreshKubeConfigLoader(config_dict=config_dict)
+        refresh_kube_config_loader._user = {}
+        refresh_kube_config_loader._user['exec'] = 'test'
+
+        config_node = ConfigNode('command', 'test')
+        config_node.__dict__['apiVersion'] = '2.0'
+        config_node.__dict__['command'] = 'test'
+
+        ExecProvider.__init__ = Mock()

Review comment:
   Can you use `unittest.mock` here to avoid side-effects?
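   A minimal sketch of the suggestion, using a stand-in `ExecProvider` class (not the real kubernetes-client one): `mock.patch.object` as a context manager replaces the attribute only for the duration of the block and restores the original afterwards, whereas assigning `ExecProvider.__init__ = Mock()` mutates the class globally and leaks into later tests.

```python
from unittest import mock


class ExecProvider:
    """Illustrative stand-in for the real kubernetes-client ExecProvider."""

    def run(self) -> dict:
        # Imagine this shells out to an external credential helper.
        return {"token": "real-token"}


original_run = ExecProvider.run

# patch.object swaps the attribute for the duration of the with-block
# and restores the original on exit, so no side-effects escape.
with mock.patch.object(ExecProvider, "run", return_value={"token": "fake"}):
    assert ExecProvider().run() == {"token": "fake"}

# Outside the block the original implementation is back.
assert ExecProvider.run is original_run
assert ExecProvider().run() == {"token": "real-token"}
```

   The same effect is available as a decorator (`@mock.patch.object(...)`) for whole test methods.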








[GitHub] [airflow] josh-fell opened a new pull request #18565: Updating the Elasticsearch example DAG to use the TaskFlow API

2021-09-27 Thread GitBox


josh-fell opened a new pull request #18565:
URL: https://github.com/apache/airflow/pull/18565


   Related: #9415
   
   This was missed from #18278 which updated miscellaneous example DAGs in 
providers to use the TaskFlow API.  This PR updates the example DAG for 
Elasticsearch in the same manner.  Also there is a refactoring of 
`default_args` similar to previous example DAG PRs.
   
   ---
   **^ Add meaningful description above**
   
   Read the **[Pull Request 
Guidelines](https://github.com/apache/airflow/blob/main/CONTRIBUTING.rst#pull-request-guidelines)**
 for more information.
   In case of fundamental code change, Airflow Improvement Proposal 
([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals))
 is needed.
   In case of a new dependency, check compliance with the [ASF 3rd Party 
License Policy](https://www.apache.org/legal/resolved.html#category-x).
   In case of backwards incompatible changes please leave a note in 
[UPDATING.md](https://github.com/apache/airflow/blob/main/UPDATING.md).
   






[GitHub] [airflow] josh-fell opened a new pull request #18564: Adding `task_group` parameter to the `BaseOperator` docstring

2021-09-27 Thread GitBox


josh-fell opened a new pull request #18564:
URL: https://github.com/apache/airflow/pull/18564


   In the `BaseOperator` docstring the `task_group` parameter was missing and 
therefore missing from the documentation.
   
   ---
   **^ Add meaningful description above**
   
   Read the **[Pull Request 
Guidelines](https://github.com/apache/airflow/blob/main/CONTRIBUTING.rst#pull-request-guidelines)**
 for more information.
   In case of fundamental code change, Airflow Improvement Proposal 
([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals))
 is needed.
   In case of a new dependency, check compliance with the [ASF 3rd Party 
License Policy](https://www.apache.org/legal/resolved.html#category-x).
   In case of backwards incompatible changes please leave a note in 
[UPDATING.md](https://github.com/apache/airflow/blob/main/UPDATING.md).
   






[GitHub] [airflow] subkanthi opened a new pull request #18563: Test kubernetes refresh config

2021-09-27 Thread GitBox


subkanthi opened a new pull request #18563:
URL: https://github.com/apache/airflow/pull/18563


   
   
   ---
   **^ Add meaningful description above**
   
   Read the **[Pull Request 
Guidelines](https://github.com/apache/airflow/blob/main/CONTRIBUTING.rst#pull-request-guidelines)**
 for more information.
   In case of fundamental code change, Airflow Improvement Proposal 
([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals))
 is needed.
   In case of a new dependency, check compliance with the [ASF 3rd Party 
License Policy](https://www.apache.org/legal/resolved.html#category-x).
   In case of backwards incompatible changes please leave a note in 
[UPDATING.md](https://github.com/apache/airflow/blob/main/UPDATING.md).
   






[GitHub] [airflow] r-richmond commented on issue #14396: Make context less nebulous

2021-09-27 Thread GitBox


r-richmond commented on issue #14396:
URL: https://github.com/apache/airflow/issues/14396#issuecomment-928642301


   @kaxil Noticed this keeps getting pushed. Can you provide any additional 
context?






[GitHub] [airflow] josh-fell opened a new pull request #18562: Updating core example DAGs to use TaskFlow API where applicable

2021-09-27 Thread GitBox


josh-fell opened a new pull request #18562:
URL: https://github.com/apache/airflow/pull/18562


   Related: #9415
   
   This PR aims to replace the use of PythonOperator tasks for the TaskFlow API 
in several core example DAGs.
   
   Additionally, there are instances of replacing `trigger_rule` values with 
the appropriate `TriggerRule` attr instead of the literal string, removing an 
unnecessary `dag` arg in _example_skip_dag_, and replacing `PythonOperator` in 
_example_complex_ for `BashOperator` (since it was the only task that wasn't a 
`BashOperator` and using a `PythonOperator` task really wasn't adding any value 
to the example).
   
   ---
   **^ Add meaningful description above**
   
   Read the **[Pull Request 
Guidelines](https://github.com/apache/airflow/blob/main/CONTRIBUTING.rst#pull-request-guidelines)**
 for more information.
   In case of fundamental code change, Airflow Improvement Proposal 
([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals))
 is needed.
   In case of a new dependency, check compliance with the [ASF 3rd Party 
License Policy](https://www.apache.org/legal/resolved.html#category-x).
   In case of backwards incompatible changes please leave a note in 
[UPDATING.md](https://github.com/apache/airflow/blob/main/UPDATING.md).
   






[GitHub] [airflow] ShakaibKhan commented on issue #17487: Make gantt view to show also retries

2021-09-27 Thread GitBox


ShakaibKhan commented on issue #17487:
URL: https://github.com/apache/airflow/issues/17487#issuecomment-928564345


   I would like to try implementing this
   






[GitHub] [airflow] boring-cyborg[bot] commented on pull request #18561: Update s3_list.py

2021-09-27 Thread GitBox


boring-cyborg[bot] commented on pull request #18561:
URL: https://github.com/apache/airflow/pull/18561#issuecomment-928522307


   Congratulations on your first Pull Request and welcome to the Apache Airflow community! If you have any issues or are unsure about anything, please check our Contribution Guide (https://github.com/apache/airflow/blob/main/CONTRIBUTING.rst)
   Here are some useful points:
   - Pay attention to the quality of your code (flake8, mypy and type 
annotations). Our [pre-commits]( 
https://github.com/apache/airflow/blob/main/STATIC_CODE_CHECKS.rst#prerequisites-for-pre-commit-hooks)
 will help you with that.
   - In case of a new feature add useful documentation (in docstrings or in 
`docs/` directory). Adding a new operator? Check this short 
[guide](https://github.com/apache/airflow/blob/main/docs/apache-airflow/howto/custom-operator.rst)
 Consider adding an example DAG that shows how users should use it.
   - Consider using [Breeze 
environment](https://github.com/apache/airflow/blob/main/BREEZE.rst) for 
testing locally, it’s a heavy docker but it ships with a working Airflow and a 
lot of integrations.
   - Be patient and persistent. It might take some time to get a review or get 
the final approval from Committers.
   - Please follow [ASF Code of 
Conduct](https://www.apache.org/foundation/policies/conduct) for all 
communication including (but not limited to) comments on Pull Requests, Mailing 
list and Slack.
   - Be sure to read the [Airflow Coding style]( 
https://github.com/apache/airflow/blob/main/CONTRIBUTING.rst#coding-style-and-best-practices).
   Apache Airflow is a community-driven project and together we are making it better.
   In case of doubts contact the developers at:
   Mailing List: d...@airflow.apache.org
   Slack: https://s.apache.org/airflow-slack
   






[GitHub] [airflow] Xilorole opened a new pull request #18561: Update s3_list.py

2021-09-27 Thread GitBox


Xilorole opened a new pull request #18561:
URL: https://github.com/apache/airflow/pull/18561


   removed inappropriate character `{` from the error message
   






[GitHub] [airflow] GaoJiaChengPaul edited a comment on issue #16881: Re-deploy scheduler tasks failing with SIGTERM on K8s executor

2021-09-27 Thread GitBox


GaoJiaChengPaul edited a comment on issue #16881:
URL: https://github.com/apache/airflow/issues/16881#issuecomment-928504787


   Hi All,
   
   We are having the same problem as below discussion. In our case we are 
running 2 schedulers in two different clusters with Kubernetes Executor.
   
   #18455
   
   @ashb 
   Hi Ashb,
   Any suggestions about this?
   Do we need to reset a running task?







[GitHub] [airflow] Brooke-white commented on a change in pull request #18447: add RedshiftStatementHook, RedshiftOperator

2021-09-27 Thread GitBox


Brooke-white commented on a change in pull request #18447:
URL: https://github.com/apache/airflow/pull/18447#discussion_r717126873



##
File path: airflow/providers/amazon/aws/hooks/redshift_statement.py
##
@@ -0,0 +1,159 @@
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+"""Execute statements against Amazon Redshift, using redshift_connector."""
+
+from typing import Callable, Dict, Optional, Union
+
+import redshift_connector
+from redshift_connector import Connection as RedshiftConnection
+
+from airflow.hooks.dbapi import DbApiHook
+
+
+class RedshiftStatementHook(DbApiHook):
+    """
+    Execute statements against Amazon Redshift, using redshift_connector
+
+    This hook requires the redshift_conn_id connection. This connection must
+    be initialized with the host, port, login, password. Additional connection
+    options can be passed to extra as a JSON string.
+
+    :param redshift_conn_id: reference to
+        :ref:`Amazon Redshift connection id`
+    :type redshift_conn_id: str
+
+    .. note::
+        get_sqlalchemy_engine() and get_uri() depend on sqlalchemy-amazon-redshift
+    """
+
+    conn_name_attr = 'redshift_conn_id'
+    default_conn_name = 'redshift_default'
+    conn_type = 'redshift+redshift_connector'
+    hook_name = 'Amazon Redshift'
+    supports_autocommit = True
+
+    @staticmethod
+    def get_ui_field_behavior() -> Dict:
+        """Returns custom field behavior"""
+        return {
+            "hidden_fields": [],
+            "relabeling": {'login': 'User', 'schema': 'Database'},
+        }
+
+    def __init__(self, *args, **kwargs) -> None:
+        super().__init__(*args, **kwargs)
+
+    def _get_conn_params(self) -> Dict[str, Union[str, int]]:
+        """Helper method to retrieve connection args"""
+        conn = self.get_connection(
+            self.redshift_conn_id  # type: ignore[attr-defined]  # pylint: disable=no-member
+        )
+
+        conn_params: Dict[str, Union[str, int]] = {
+            "user": conn.login or '',
+            "password": conn.password or '',
+            "host": conn.host or '',
+            "port": conn.port or 5439,
+            "database": conn.schema or '',
+        }
+
+        return conn_params
+
+    def _get_conn_kwargs(self) -> Dict:
+        """Helper method to retrieve connection kwargs"""
+        conn = self.get_connection(
Review comment:
   cached property `conn` added, and `_get_conn_kwargs` removed in a1de44e
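   To show what the `cached_property` change buys, here is a minimal sketch with a hypothetical `Hook` class (not the real Airflow hook): the decorated `conn` is computed on first access and cached on the instance, so repeated lookups do not re-fetch the connection.

```python
from functools import cached_property


class Hook:
    """Hypothetical hook; `calls` counts how often the lookup runs."""

    calls = 0

    @cached_property
    def conn(self):
        # Computed once per instance, then cached in the instance dict;
        # stands in for an expensive get_connection() lookup.
        Hook.calls += 1
        return {"host": "host", "port": 5439}


hook = Hook()
assert hook.conn is hook.conn  # same cached object on repeat access
assert Hook.calls == 1         # the lookup ran exactly once
```

   Each new instance gets its own cached value; deleting `hook.conn` from the instance dict would force a recompute.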

##
File path: airflow/providers/amazon/aws/hooks/redshift_statement.py
##
@@ -0,0 +1,159 @@
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+"""Execute statements against Amazon Redshift, using redshift_connector."""
+
+from typing import Callable, Dict, Optional, Union
+
+import redshift_connector
+from redshift_connector import Connection as RedshiftConnection
+
+from airflow.hooks.dbapi import DbApiHook
+
+
+class RedshiftStatementHook(DbApiHook):
+    """
+    Execute statements against Amazon Redshift, using redshift_connector
+
+    This hook requires the redshift_conn_id connection. This connection must
+    be initialized with the host, port, login, password. Additional connection
+    options can be passed to extra as a JSON string.
+
+    :param redshift_conn_id: reference to
+        :ref:`Amazon Redshift connection id`
+    :type redshift_conn_id: str
+
+    .. note::
+        get_sqlalchemy_engine() and get_uri() depend on sqlalchemy-amazon-redshift
+    """
+
+   

[GitHub] [airflow] github-actions[bot] closed pull request #16647: Move FABs base Security Manager into Airflow.

2021-09-27 Thread GitBox


github-actions[bot] closed pull request #16647:
URL: https://github.com/apache/airflow/pull/16647


   






[GitHub] [airflow] github-actions[bot] closed pull request #16498: gitpodify Apache Airflow - online development workspace

2021-09-27 Thread GitBox


github-actions[bot] closed pull request #16498:
URL: https://github.com/apache/airflow/pull/16498


   






[GitHub] [airflow] github-actions[bot] closed pull request #17354: Add Docker Sensor

2021-09-27 Thread GitBox


github-actions[bot] closed pull request #17354:
URL: https://github.com/apache/airflow/pull/17354


   






[GitHub] [airflow] github-actions[bot] commented on pull request #16795: GCP Dataflow - Fixed getting job status by job id

2021-09-27 Thread GitBox


github-actions[bot] commented on pull request #16795:
URL: https://github.com/apache/airflow/pull/16795#issuecomment-928481553


   This pull request has been automatically marked as stale because it has not 
had recent activity. It will be closed in 5 days if no further activity occurs. 
Thank you for your contributions.






[GitHub] [airflow] t4n1o edited a comment on issue #18541: Error when running dag & something to do with parsing the start date

2021-09-27 Thread GitBox


t4n1o edited a comment on issue #18541:
URL: https://github.com/apache/airflow/issues/18541#issuecomment-928440208


   Would it help if I upgraded airflow to a newer version (2.14)?
   
   The dag I'm running that caused this has `start_date` set to:
   ```
   with DAG(
       'Archives Mirror',
       default_args=default_args,
       description='A simple tutorial DAG',
       schedule_interval=timedelta(days=1),
       start_date=days_ago(2),
       tags=['example'],
   ) as dag:
       t1 = BashOperator(
           task_id='mirror_csv_gz_archives',
           bash_command='cd /opt/repo/ && /opt/repo/venv/bin/python -m path.to.my.mirror_script',
       )
   ```
   
   I copied days_ago(2) from this sample tutorial: https://airflow.apache.org/docs/apache-airflow/stable/tutorial.html






[GitHub] [airflow] t4n1o edited a comment on issue #18541: Error when running dag & something to do with parsing the start date

2021-09-27 Thread GitBox


t4n1o edited a comment on issue #18541:
URL: https://github.com/apache/airflow/issues/18541#issuecomment-928440208


   Would it help if I upgraded airflow to a newer version (2.14)?
   
   The dag I'm running that caused this has `start_date` set to:
   ```
   with DAG(
   'Archives Mirror',
   default_args=default_args,
   description='A simple tutorial DAG',
   schedule_interval=timedelta(days=1),
   start_date=days_ago(2),
   tags=['example'],
   ) as dag:
   t1 = BashOperator(
   task_id='mirror_csv_gz_archives',
   bash_command='set -e; cd /opt/repo/; /opt/repo/venv/bin/python -m 
path.to.my.mirror_script',
   )
   
   ```
   
   I copied days_ago(2) from this sample tutorial: 
https://airflow.apache.org/docs/apache-airflow/stable/tutorial.html


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] ernest-kr commented on pull request #16666: Replace execution_date with run_id in airflow tasks run command

2021-09-27 Thread GitBox


ernest-kr commented on pull request #16666:
URL: https://github.com/apache/airflow/pull/16666#issuecomment-928439654


   @SamWheating 






[GitHub] [airflow] ernest-kr commented on pull request #16666: Replace execution_date with run_id in airflow tasks run command

2021-09-27 Thread GitBox


ernest-kr commented on pull request #16666:
URL: https://github.com/apache/airflow/pull/16666#issuecomment-928439191


   I see that the CLI help docs are not updated
   






[GitHub] [airflow] ChristianDavis opened a new pull request #18560: typo in docs

2021-09-27 Thread GitBox


ChristianDavis opened a new pull request #18560:
URL: https://github.com/apache/airflow/pull/18560


   
   
   ---
   **^ Add meaningful description above**
   
   Read the **[Pull Request 
Guidelines](https://github.com/apache/airflow/blob/main/CONTRIBUTING.rst#pull-request-guidelines)**
 for more information.
   In case of fundamental code change, Airflow Improvement Proposal 
([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals))
 is needed.
   In case of a new dependency, check compliance with the [ASF 3rd Party 
License Policy](https://www.apache.org/legal/resolved.html#category-x).
   In case of backwards incompatible changes please leave a note in 
[UPDATING.md](https://github.com/apache/airflow/blob/main/UPDATING.md).
   






[GitHub] [airflow] boring-cyborg[bot] commented on pull request #18560: typo in docs

2021-09-27 Thread GitBox


boring-cyborg[bot] commented on pull request #18560:
URL: https://github.com/apache/airflow/pull/18560#issuecomment-928400554


   Congratulations on your first Pull Request and welcome to the Apache Airflow 
community! If you have any issues or are unsure about anything, please check 
our Contribution Guide 
(https://github.com/apache/airflow/blob/main/CONTRIBUTING.rst)
   Here are some useful points:
   - Pay attention to the quality of your code (flake8, mypy and type 
annotations). Our [pre-commits]( 
https://github.com/apache/airflow/blob/main/STATIC_CODE_CHECKS.rst#prerequisites-for-pre-commit-hooks)
 will help you with that.
   - In case of a new feature add useful documentation (in docstrings or in 
`docs/` directory). Adding a new operator? Check this short 
[guide](https://github.com/apache/airflow/blob/main/docs/apache-airflow/howto/custom-operator.rst)
 Consider adding an example DAG that shows how users should use it.
   - Consider using [Breeze 
environment](https://github.com/apache/airflow/blob/main/BREEZE.rst) for 
testing locally, it’s a heavy docker but it ships with a working Airflow and a 
lot of integrations.
   - Be patient and persistent. It might take some time to get a review or get 
the final approval from Committers.
   - Please follow [ASF Code of 
Conduct](https://www.apache.org/foundation/policies/conduct) for all 
communication including (but not limited to) comments on Pull Requests, Mailing 
list and Slack.
   - Be sure to read the [Airflow Coding style]( 
https://github.com/apache/airflow/blob/main/CONTRIBUTING.rst#coding-style-and-best-practices).
   Apache Airflow is a community-driven project and together we are making it 
better.
   In case of doubts contact the developers at:
   Mailing List: d...@airflow.apache.org
   Slack: https://s.apache.org/airflow-slack
   






[GitHub] [airflow] rafaelwsan commented on issue #18558: Login failed in UI after setting Postgres external database in the helm chart

2021-09-27 Thread GitBox


rafaelwsan commented on issue #18558:
URL: https://github.com/apache/airflow/issues/18558#issuecomment-928360874


   All tables were created successfully after the chart upgrade, but UI login still 
fails






[GitHub] [airflow] rafaelwsan commented on issue #18558: Login failed in UI after setting Postgres external database in the helm chart

2021-09-27 Thread GitBox


rafaelwsan commented on issue #18558:
URL: https://github.com/apache/airflow/issues/18558#issuecomment-928356991


   NAME                                 READY   STATUS    RESTARTS   AGE
   airflow-pgbouncer-6df5c988f7-xbp7q   2/2     Running   0          26m
   airflow-scheduler-5dd5f7bf6b-g4q9g   3/3     Running   0          26m
   airflow-statsd-84f4f9898-2cb2k       1/1     Running   0          149m
   airflow-webserver-d6cb44b9d-ml4ws    1/1     Running   0          26m






[GitHub] [airflow] enriqueayala opened a new issue #18559: dag_run.start_date not available from SubDag

2021-09-27 Thread GitBox


enriqueayala opened a new issue #18559:
URL: https://github.com/apache/airflow/issues/18559


   ### Apache Airflow version
   
   2.1.4 (latest released)
   
   ### Operating System
   
   Ubuntu 20.04
   
   ### Versions of Apache Airflow Providers
   
   _No response_
   
   ### Deployment
   
   Virtualenv installation
   
   ### Deployment details
   
   _No response_
   
   ### What happened
   
   Access to dag_run.start_date attribute from subdag returns `None`. 
   
   ### What you expected to happen
   
   Return DagRun.start_date of parent_dag (working on 2.1.0)
   
   ### How to reproduce
   
   Modify example_dags/subdags/subdag.py to:
   ```
   from airflow import DAG
   from airflow.operators.bash import BashOperator
   from airflow.utils.dates import days_ago


   def subdag(parent_dag_name, child_dag_name, args):
       with DAG(
           dag_id=f'{parent_dag_name}.{child_dag_name}',
           default_args=args,
           start_date=days_ago(2),
           schedule_interval="@daily",
       ) as dag_subdag:

           t1 = BashOperator(
               task_id='echo_execution_date',
               bash_command='echo "{{dag_run.execution_date}}"',
           )
           t2 = BashOperator(
               task_id='echo_start_date',
               bash_command='echo "{{dag_run.start_date}}"',
           )

           t1 >> t2

       return dag_subdag
   ```
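   The symptom, `{{dag_run.start_date}}` echoing `None`, can be reproduced in miniature with plain string formatting (a crude stand-in for Airflow's Jinja rendering; it only shows how a `None` context value ends up inside the rendered command):

```python
# Simplified template context; in Airflow this would come from the DagRun
context = {"execution_date": "2021-09-27T00:00:00+00:00", "start_date": None}

def render(command: str) -> str:
    # Crude stand-in for Jinja templating of bash_command
    return command.format(**context)

rendered = render('echo "{start_date}"')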
   
![dag_run_start_date](https://user-images.githubusercontent.com/10963531/134992336-fa6035d4-8504-44f0-b5d2-890690b9ab5d.PNG)
   
   
![dag_run_exec_date](https://user-images.githubusercontent.com/10963531/134992299-4bd00a10-0b22-44aa-b9e0-8f903133eb04.PNG)
   
   
   
   ### Anything else
   
   It also occurs within a task through context. 
   
   ### Are you willing to submit PR?
   
   - [ ] Yes I am willing to submit a PR!
   
   ### Code of Conduct
   
   - [X] I agree to follow this project's [Code of 
Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
   






[GitHub] [airflow] rafaelwsan opened a new issue #18558: Login failed in UI after setting Postgres external database in the helm chart

2021-09-27 Thread GitBox


rafaelwsan opened a new issue #18558:
URL: https://github.com/apache/airflow/issues/18558


   ### Official Helm Chart version
   
   1.1.0 (latest released)
   
   ### Apache Airflow version
   
   2.1.2
   
   ### Kubernetes Version
   
   1.20.9
   
   ### Helm Chart configuration
   
   # Licensed to the Apache Software Foundation (ASF) under one
   # or more contributor license agreements.  See the NOTICE file
   # distributed with this work for additional information
   # regarding copyright ownership.  The ASF licenses this file
   # to you under the Apache License, Version 2.0 (the
   # "License"); you may not use this file except in compliance
   # with the License.  You may obtain a copy of the License at
   #
   #   http://www.apache.org/licenses/LICENSE-2.0
   #
   # Unless required by applicable law or agreed to in writing,
   # software distributed under the License is distributed on an
   # "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
   # KIND, either express or implied.  See the License for the
   # specific language governing permissions and limitations
   # under the License.
   ---
   # Default values for airflow.
   # This is a YAML-formatted file.
   # Declare variables to be passed into your templates.
   
   # Provide a name to substitute for the full names of resources
   fullnameOverride: ""
   
   # Provide a name to substitute for the name of the chart
   nameOverride: ""
   
   # User and group of airflow user
   uid: 5
   gid: 0
   
   # Airflow home directory
   # Used for mount paths
   airflowHome: /opt/airflow
   
   # Default airflow repository -- overrides all the specific images below
   defaultAirflowRepository: rafaelsan/airflow
   
   # Default airflow tag to deploy
   defaultAirflowTag: "airflow-custom-1.0.0"
   
   # Airflow version (Used to make some decisions based on Airflow Version 
being deployed)
   airflowVersion: "2.1.2"
   
   # Images
   images:
 airflow:
   repository: ~
   tag: ~
   pullPolicy: IfNotPresent
 pod_template:
   repository: ~
   tag: ~
   pullPolicy: IfNotPresent
 flower:
   repository: ~
   tag: ~
   pullPolicy: IfNotPresent
 statsd:
   repository: apache/airflow
   tag: airflow-statsd-exporter-2021.04.28-v0.17.0
   pullPolicy: IfNotPresent
 redis:
   repository: redis
   tag: 6-buster
   pullPolicy: IfNotPresent
 pgbouncer:
   repository: apache/airflow
   tag: airflow-pgbouncer-2021.04.28-1.14.0
   pullPolicy: IfNotPresent
 pgbouncerExporter:
   repository: apache/airflow
   tag: airflow-pgbouncer-exporter-2021.04.28-0.5.0
   pullPolicy: IfNotPresent
 gitSync:
   repository: k8s.gcr.io/git-sync/git-sync
   tag: v3.3.0
   pullPolicy: IfNotPresent
   
   # Select certain nodes for airflow pods.
   nodeSelector: {}
   affinity: {}
   tolerations: []
   
   # Add common labels to all objects and pods defined in this chart.
   labels: {}
   
   # Ingress configuration
   ingress:
 # Enable ingress resource
 enabled: false
   
 # Configs for the Ingress of the web Service
 web:
   # Annotations for the web Ingress
   annotations: {}
   
   # The path for the web Ingress
   path: ""
   
   # The hostname for the web Ingress
   host: ""
   
   # configs for web Ingress TLS
   tls:
 # Enable TLS termination for the web Ingress
 enabled: false
 # the name of a pre-created Secret containing a TLS private key and 
certificate
 secretName: ""
   
   # HTTP paths to add to the web Ingress before the default path
   precedingPaths: []
   
   # Http paths to add to the web Ingress after the default path
   succeedingPaths: []
   
 # Configs for the Ingress of the flower Service
 flower:
   # Annotations for the flower Ingress
   annotations: {}
   
   # The path for the flower Ingress
   path: ""
   
   # The hostname for the flower Ingress
   host: ""
   
   # configs for web Ingress TLS
   tls:
 # Enable TLS termination for the flower Ingress
 enabled: false
 # the name of a pre-created Secret containing a TLS private key and 
certificate
 secretName: ""
   
   # HTTP paths to add to the flower Ingress before the default path
   precedingPaths: []
   
   # Http paths to add to the flower Ingress after the default path
   succeedingPaths: []
   
   # Network policy configuration
   networkPolicies:
 # Enabled network policies
 enabled: false
   
   # Extra annotations to apply to all
   # Airflow pods
   airflowPodAnnotations: {}
   
   # Extra annotations to apply to
   # main Airflow configmap
   airflowConfigAnnotations: {}
   
   # `airflow_local_settings` file as a string (can be templated).
   airflowLocalSettings: ~
   
   # Enable RBAC (default on most clusters these days)
   rbac:
 # Specifies whether RBAC 

[GitHub] [airflow] boring-cyborg[bot] commented on issue #18558: Login failed in UI after setting Postgres external database in the helm chart

2021-09-27 Thread GitBox


boring-cyborg[bot] commented on issue #18558:
URL: https://github.com/apache/airflow/issues/18558#issuecomment-928345567


   Thanks for opening your first issue here! Be sure to follow the issue 
template!
   






[GitHub] [airflow] ShakaibKhan commented on issue #17962: Warn if robots.txt is accessed

2021-09-27 Thread GitBox


ShakaibKhan commented on issue #17962:
URL: https://github.com/apache/airflow/issues/17962#issuecomment-928292345


   started pr to address this: https://github.com/apache/airflow/pull/18557






[GitHub] [airflow] ShakaibKhan opened a new pull request #18557: Warning of public exposure of deployment in UI with on/off config

2021-09-27 Thread GitBox


ShakaibKhan opened a new pull request #18557:
URL: https://github.com/apache/airflow/pull/18557


   
   related:  #17962
   
   Added warning banner message for when /robots.txt is hit, with on/off config
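   A rough sketch of the banner logic described here; the config switch and function names are assumptions, not the PR's actual code:

```python
# Hypothetical state: a config switch plus a flag flipped on first access
warn_deployment_exposure = True   # the on/off config the PR adds (name assumed)
robots_txt_accessed = False

def handle_request(path: str) -> None:
    # Record that a crawler (or anyone) fetched /robots.txt
    global robots_txt_accessed
    if path == "/robots.txt":
        robots_txt_accessed = True

def banner_visible() -> bool:
    # Show the warning banner only when enabled and the file was fetched
    return warn_deployment_exposure and robots_txt_accessed
```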
   






[GitHub] [airflow] jedcunningham closed issue #18458: Airflow deployed on Kubernetes cluster NOT showing airflow app metrics in STATSD Exporter

2021-09-27 Thread GitBox


jedcunningham closed issue #18458:
URL: https://github.com/apache/airflow/issues/18458


   






[GitHub] [airflow] flolas edited a comment on pull request #17329: Split sql statements in DbApi run

2021-09-27 Thread GitBox


flolas edited a comment on pull request #17329:
URL: https://github.com/apache/airflow/pull/17329#issuecomment-928259116


   @potiuk Tests fail :( what's wrong?
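   For context, #17329 teaches the DbApi hook's `run` method to accept a multi-statement SQL string. A naive stdlib-only sketch of the splitting idea (it ignores semicolons inside string literals, which a real splitter must handle):

```python
def split_statements(sql: str) -> list:
    # Naive split on ';' -- does NOT handle semicolons inside quoted strings
    return [s.strip() for s in sql.split(";") if s.strip()]

statements = split_statements("CREATE TABLE t (x INT); INSERT INTO t VALUES (1);")
```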







[GitHub] [airflow] collinmcnulty opened a new pull request #18555: Add task information to logs about k8s pods

2021-09-27 Thread GitBox


collinmcnulty opened a new pull request #18555:
URL: https://github.com/apache/airflow/pull/18555


   Add annotations, which contain the task information, to log lines that 
reference a specific pod so that logs can be searched by task or DAG id. Also 
condenses a few more log elements into a single line to play better with 
Elastic.
   
   I did not have the confidence/time to go through every log line that 
references a pod name to add annotations, as some of them would require passing 
the annotations through several layers that I do not understand and do not want 
to break. I think I got the most common and critical log lines though.
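   The idea of attaching the pod's task annotations to each log line, so logs can be filtered by DAG or task id, can be sketched with stdlib `logging` (the annotation keys and names here are assumptions, not the PR's actual code):

```python
import logging

# Formatter that appends the task annotations carried on the LogRecord
formatter = logging.Formatter("%(message)s dag_id=%(dag_id)s task_id=%(task_id)s")

record = logging.LogRecord("kubernetes_executor", logging.WARNING, "k8s.py", 0,
                           "Pod %s deleted", ("pod-abc123",), None)
# Hypothetical annotations as found on an Airflow worker pod
record.__dict__.update({"dag_id": "example_dag", "task_id": "example_task"})

line = formatter.format(record)
```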
   
   closes: #18329 
   






[GitHub] [airflow] Brooke-white commented on a change in pull request #18447: add RedshiftStatementHook, RedshiftOperator

2021-09-27 Thread GitBox


Brooke-white commented on a change in pull request #18447:
URL: https://github.com/apache/airflow/pull/18447#discussion_r716999098



##
File path: airflow/providers/amazon/aws/hooks/redshift_statement.py
##
@@ -0,0 +1,144 @@
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+"""Interact with AWS Redshift, using the boto3 library."""
+
+from typing import Callable, Dict, Optional, Tuple, Union
+
+import redshift_connector
+from redshift_connector import Connection as RedshiftConnection
+
+from airflow.hooks.dbapi import DbApiHook
+
+
+class RedshiftStatementHook(DbApiHook):

Review comment:
   that sounds good to me, I'm happy to make this change if the others here 
agree with taking this route @josh-fell @JavierLopezT 








[GitHub] [airflow] zachliu commented on issue #15000: When an ECS Task fails to start, ECS Operator raises a CloudWatch exception

2021-09-27 Thread GitBox


zachliu commented on issue #15000:
URL: https://github.com/apache/airflow/issues/15000#issuecomment-928219338


   > @zachliu @kanga333 did remove retry for now #16150 solved the problem?
   
   @eladkal unfortunately no, #16150 was for another issue that's only somewhat 
related to this one






[GitHub] [airflow] ephraimbuddy commented on a change in pull request #18554: Bugfix: dag_bag.get_dag should not raise exception

2021-09-27 Thread GitBox


ephraimbuddy commented on a change in pull request #18554:
URL: https://github.com/apache/airflow/pull/18554#discussion_r716990684



##
File path: airflow/exceptions.py
##
@@ -150,10 +150,6 @@ class DuplicateTaskIdFound(AirflowException):
 """Raise when a Task with duplicate task_id is defined in the same DAG"""
 
 
-class SerializedDagNotFound(DagNotFound):

Review comment:
   Another option would be to use `try_except` in all the places we called 
`get_dag`
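   The alternative described here, keeping the exception and wrapping each call site, might look like this (names simplified; not Airflow's actual code):

```python
# Hypothetical illustration: get_dag keeps raising, and each call site
# catches the exception instead of the exception class being removed.
class SerializedDagNotFound(Exception):
    pass

DAGS = {"example_dag": object()}  # stand-in for the serialized-dag table

def get_dag(dag_id):
    if dag_id not in DAGS:
        raise SerializedDagNotFound(dag_id)
    return DAGS[dag_id]

def get_dag_at_call_site(dag_id):
    try:
        return get_dag(dag_id)
    except SerializedDagNotFound:
        return None  # the caller turns this into a 404
```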








[GitHub] [airflow] ephraimbuddy edited a comment on pull request #18554: Bugfix: dag_bag.get_dag should not raise exception

2021-09-27 Thread GitBox


ephraimbuddy edited a comment on pull request #18554:
URL: https://github.com/apache/airflow/pull/18554#issuecomment-928203949


   > The API has been returning 404 for quite some time (and IMO it's the 
correct behaviour). #18523 only refactored the implementation to raise 404 in a 
different way.
   
   I meant on the webserver. Sorry I didn't make that clear. If you change the 
URL for the dag on the tree/graph view such that the dag id is not in 
SerializedDagModel, you will get the above error, which I think should be 
reserved for the REST API.






[GitHub] [airflow] eladkal commented on issue #12680: SparkSubmitHook - allow log parsing

2021-09-27 Thread GitBox


eladkal commented on issue #12680:
URL: https://github.com/apache/airflow/issues/12680#issuecomment-928205522


   This feels like a very custom use case for your needs.
   You can do that with a custom operator.






[GitHub] [airflow] eladkal closed issue #12680: SparkSubmitHook - allow log parsing

2021-09-27 Thread GitBox


eladkal closed issue #12680:
URL: https://github.com/apache/airflow/issues/12680


   






[airflow] branch main updated (892c5fc -> 80ae70c)

2021-09-27 Thread kaxilnaik
This is an automated email from the ASF dual-hosted git repository.

kaxilnaik pushed a change to branch main
in repository https://gitbox.apache.org/repos/asf/airflow.git.


from 892c5fc  Add package filter info to Breeze build docs (#18550)
 add 80ae70c  Fix ``DetachedInstanceError`` when dag_run attrs are accessed 
from ti (#18499)

No new revisions were added by this update.

Summary of changes:
 airflow/models/taskinstance.py| 5 +
 tests/jobs/test_local_task_job.py | 2 +-
 tests/models/test_taskinstance.py | 2 +-
 3 files changed, 3 insertions(+), 6 deletions(-)


[GitHub] [airflow] kaxil merged pull request #18499: Fix DetachedInstanceError when dag_run attrs are accessed from ti

2021-09-27 Thread GitBox


kaxil merged pull request #18499:
URL: https://github.com/apache/airflow/pull/18499


   








[GitHub] [airflow] eladkal commented on issue #15000: When an ECS Task fails to start, ECS Operator raises a CloudWatch exception

2021-09-27 Thread GitBox


eladkal commented on issue #15000:
URL: https://github.com/apache/airflow/issues/15000#issuecomment-928203500


   @zachliu @kanga333 did remove retry for now #16150 solved the problem?






[GitHub] [airflow] ashb commented on pull request #18503: When calling `dr.get_task_instance` automatically set `dag_run` relationship

2021-09-27 Thread GitBox


ashb commented on pull request #18503:
URL: https://github.com/apache/airflow/pull/18503#issuecomment-928202842


   We don't need this now with Ephraim's latest pr






[GitHub] [airflow] ashb closed pull request #18503: When calling `dr.get_task_instance` automatically set `dag_run` relationship

2021-09-27 Thread GitBox


ashb closed pull request #18503:
URL: https://github.com/apache/airflow/pull/18503


   






[GitHub] [airflow] github-actions[bot] commented on pull request #18499: Fix DetachedInstanceError when dag_run attrs are accessed from ti

2021-09-27 Thread GitBox


github-actions[bot] commented on pull request #18499:
URL: https://github.com/apache/airflow/pull/18499#issuecomment-928201848


   The PR most likely needs to run full matrix of tests because it modifies 
parts of the core of Airflow. However, committers might decide to merge it 
quickly and take the risk. If they don't merge it quickly - please rebase it to 
the latest main at your convenience, or amend the last commit of the PR, and 
push it with --force-with-lease.






[GitHub] [airflow] eladkal closed issue #17314: Kerberos configuration to enable allow kinit -f -A

2021-09-27 Thread GitBox


eladkal closed issue #17314:
URL: https://github.com/apache/airflow/issues/17314


   






[GitHub] [airflow] eladkal commented on issue #17314: Kerberos configuration to enable allow kinit -f -A

2021-09-27 Thread GitBox


eladkal commented on issue #17314:
URL: https://github.com/apache/airflow/issues/17314#issuecomment-928201286


   solved in https://github.com/apache/airflow/pull/17816


[GitHub] [airflow] eladkal commented on issue #16919: error when using mysql_to_s3 (TypeError: cannot safely cast non-equivalent object to int64)

2021-09-27 Thread GitBox


eladkal commented on issue #16919:
URL: https://github.com/apache/airflow/issues/16919#issuecomment-928199673


   @SasanAhmadi are you working on this issue?


[GitHub] [airflow] dstandish commented on a change in pull request #18447: add RedshiftStatementHook, RedshiftOperator

2021-09-27 Thread GitBox


dstandish commented on a change in pull request #18447:
URL: https://github.com/apache/airflow/pull/18447#discussion_r716962976



##
File path: airflow/providers/amazon/aws/hooks/redshift_statement.py
##
@@ -0,0 +1,144 @@
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+"""Interact with AWS Redshift, using the boto3 library."""
+
+from typing import Callable, Dict, Optional, Tuple, Union
+
+import redshift_connector
+from redshift_connector import Connection as RedshiftConnection
+
+from airflow.hooks.dbapi import DbApiHook
+
+
+class RedshiftStatementHook(DbApiHook):

Review comment:
   It's a good point re IAM auth.
   
   What if we just call the module redshift_sql but the hook RedshiftHook
   
   Then there are two redshift hooks until 3.0, but in different modules
   
   After 3.0 the old one is renamed
   
   And I guess we rename the new module at 3.0 too?






[GitHub] [airflow] uranusjr commented on pull request #18554: Bugfix: dag_bag.get_dag should not raise exception

2021-09-27 Thread GitBox


uranusjr commented on pull request #18554:
URL: https://github.com/apache/airflow/pull/18554#issuecomment-928180413


   The API has been returning 404 for quite some time (and IMO it's the correct 
behaviour). #18523 only refactored the implementation to raise 404 in a 
different way.


[GitHub] [airflow] ephraimbuddy commented on pull request #18554: Bugfix: dag_bag.get_dag should not raise exception

2021-09-27 Thread GitBox


ephraimbuddy commented on pull request #18554:
URL: https://github.com/apache/airflow/pull/18554#issuecomment-928176400


   Also, the error for a missing DAG now returns a REST API "not found" error:
   
   ```
   {
     "detail": null,
     "status": 404,
     "title": "DAG 'dag_pod_operatorxco' not found in serialized_dag table",
     "type": "http://apache-airflow-docs.s3-website.eu-central-1.amazonaws.com/docs/apache-airflow/latest/stable-rest-api-ref.html#section/Errors/NotFound"
   }
   ```
   I think this PR https://github.com/apache/airflow/pull/18523 caused it


[GitHub] [airflow] mik-laj commented on pull request #17951: Refresh credentials for long-running pods on EKS

2021-09-27 Thread GitBox


mik-laj commented on pull request #17951:
URL: https://github.com/apache/airflow/pull/17951#issuecomment-928175875


   @potiuk Can you look at it? It is ready for review now. 


[GitHub] [airflow] ephraimbuddy opened a new pull request #18554: Bugfix: dag_bag.get_dag should not raise exception

2021-09-27 Thread GitBox


ephraimbuddy opened a new pull request #18554:
URL: https://github.com/apache/airflow/pull/18554


   get_dag raising an exception breaks many parts of the codebase.
   Its usage in code suggests that it should return None if a dag is not
   found: there are about 30 call sites expecting it to return None when a
   dag is not found. A missing dag currently errors out in the UI instead
   of returning a message that the DAG is missing.
   
   This PR adds a try/except and returns None when a dag is not found
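   The pattern described above can be sketched as follows. This is a toy illustration, not Airflow's actual DagBag implementation; `DagNotFound` and the internal loader are stand-ins:
   
   ```python
   class DagNotFound(Exception):
       """Stand-in for the exception the internal loader raises."""
   
   class DagBag:
       def __init__(self, dags):
           self._dags = dags
   
       def _load_dag(self, dag_id):
           # Internal lookup that raises when the DAG is missing.
           if dag_id not in self._dags:
               raise DagNotFound(dag_id)
           return self._dags[dag_id]
   
       def get_dag(self, dag_id):
           # Callers expect None for a missing DAG rather than an exception.
           try:
               return self._load_dag(dag_id)
           except DagNotFound:
               return None
   
   bag = DagBag({"example": object()})
   ```
   
   Callers can then branch on the return value instead of wrapping every call site in its own try/except.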
   
   
   ---
   **^ Add meaningful description above**
   
   Read the **[Pull Request 
Guidelines](https://github.com/apache/airflow/blob/main/CONTRIBUTING.rst#pull-request-guidelines)**
 for more information.
   In case of fundamental code change, Airflow Improvement Proposal 
([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals))
 is needed.
   In case of a new dependency, check compliance with the [ASF 3rd Party 
License Policy](https://www.apache.org/legal/resolved.html#category-x).
   In case of backwards incompatible changes please leave a note in 
[UPDATING.md](https://github.com/apache/airflow/blob/main/UPDATING.md).
   


[GitHub] [airflow] potiuk commented on pull request #18147: Allow airflow standard images to run in openshift utilising the official helm chart #18136

2021-09-27 Thread GitBox


potiuk commented on pull request #18147:
URL: https://github.com/apache/airflow/pull/18147#issuecomment-928152614


   Some static/helm unit tests failed 


[GitHub] [airflow] jedcunningham closed pull request #18553:  Extra debugging for helm tests

2021-09-27 Thread GitBox


jedcunningham closed pull request #18553:
URL: https://github.com/apache/airflow/pull/18553


   


[GitHub] [airflow] jedcunningham opened a new pull request #18553:  Extra debugging for helm tests

2021-09-27 Thread GitBox


jedcunningham opened a new pull request #18553:
URL: https://github.com/apache/airflow/pull/18553


   I'm just seeing if I can identify why public runners are failing on


[GitHub] [airflow] andrewgodwin opened a new pull request #18552: Allow core Triggerer loops to yield control

2021-09-27 Thread GitBox


andrewgodwin opened a new pull request #18552:
URL: https://github.com/apache/airflow/pull/18552


   In the case of having several hundred triggers, the core triggerer 
creation/deletion loops would block the main thread for several hundred 
milliseconds and bring the event loop to a halt.
   
   This change allows them to yield control after every trigger they process, 
preventing this. It's unfortunately not possible to unit test reliably, but I 
ran it through its paces with 500 triggers locally.
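   The yielding pattern described above can be sketched with plain asyncio. The trigger bookkeeping below is hypothetical; only the `await asyncio.sleep(0)` call reflects the change, letting other coroutines (e.g. a heartbeat) run between iterations of a long loop:
   
   ```python
   import asyncio
   
   async def create_triggers(triggers, processed):
       for trigger in triggers:
           processed.append(trigger)   # per-trigger bookkeeping (stand-in)
           await asyncio.sleep(0)      # yield control to the event loop each iteration
   
   async def heartbeat(beats):
       # A concurrent task that would starve if the loop above never yielded.
       for _ in range(10):
           beats.append("beat")
           await asyncio.sleep(0)
   
   async def main():
       processed, beats = [], []
       await asyncio.gather(
           create_triggers(range(500), processed),
           heartbeat(beats),
       )
       return processed, beats
   
   processed, beats = asyncio.run(main())
   ```
   
   Without the `sleep(0)`, a 500-iteration synchronous loop inside a coroutine runs to completion before any other task gets scheduled.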
   


[GitHub] [airflow] Brooke-white commented on a change in pull request #18447: add RedshiftStatementHook, RedshiftOperator

2021-09-27 Thread GitBox


Brooke-white commented on a change in pull request #18447:
URL: https://github.com/apache/airflow/pull/18447#discussion_r716879087



##
File path: airflow/providers/amazon/aws/hooks/redshift_statement.py
##
@@ -0,0 +1,144 @@
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+"""Interact with AWS Redshift, using the boto3 library."""
+
+from typing import Callable, Dict, Optional, Tuple, Union
+
+import redshift_connector
+from redshift_connector import Connection as RedshiftConnection
+
+from airflow.hooks.dbapi import DbApiHook
+
+
+class RedshiftStatementHook(DbApiHook):

Review comment:
   Would it be possible to include the hook under its current name, 
`RedshiftStatementHook`, so we don't need to wait for 3.0, and then add in some 
shim for backwards compatibility in the future? 
   
   i.e.
   ```python
   RedshiftHook = RedshiftStatementHook
   ```
   
   While RedshiftOperator would likely work with the Postgres hook, it would be 
lacking the primary feature requests in #16355, which include extended datatype 
support and authentication via IAM and Identity provider. 
   
   IMO, the authentication bits are especially important because they simplify 
an existing workflow (connecting to Redshift with temporary credentials), and 
open this integration up to a new group of users (those requiring 
authentication with Redshift via an identity provider (e.g. Okta, Azure)). 
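   The backwards-compatibility alias suggested above can be sketched as a small deprecation shim. The class names follow the discussion, but the constructor signature and warning machinery are assumptions for illustration, not what the provider actually ships (note also that a plain `RedshiftHook` name already exists for the cluster hook, so this only works after that rename):
   
   ```python
   import warnings
   
   class RedshiftStatementHook:
       """Stand-in for the new SQL hook (placeholder body)."""
       def __init__(self, redshift_conn_id="redshift_default"):
           self.redshift_conn_id = redshift_conn_id
   
   class RedshiftHook(RedshiftStatementHook):
       """Deprecated alias kept so old imports keep working."""
       def __init__(self, *args, **kwargs):
           warnings.warn(
               "RedshiftHook is deprecated; use RedshiftStatementHook instead",
               DeprecationWarning,
               stacklevel=2,
           )
           super().__init__(*args, **kwargs)
   ```
   
   Subclassing (rather than a bare `RedshiftHook = RedshiftStatementHook` assignment) keeps `isinstance` checks working while still surfacing a warning to users of the old name.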




[GitHub] [airflow] josh-fell commented on a change in pull request #18447: add RedshiftStatementHook, RedshiftOperator

2021-09-27 Thread GitBox


josh-fell commented on a change in pull request #18447:
URL: https://github.com/apache/airflow/pull/18447#discussion_r716876359



##
File path: airflow/providers/amazon/aws/hooks/redshift_statement.py
##
@@ -0,0 +1,144 @@
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+"""Interact with AWS Redshift, using the boto3 library."""
+
+from typing import Callable, Dict, Optional, Tuple, Union
+
+import redshift_connector
+from redshift_connector import Connection as RedshiftConnection
+
+from airflow.hooks.dbapi import DbApiHook
+
+
+class RedshiftStatementHook(DbApiHook):

Review comment:
   Love the idea. Personally I'd vote for renaming the existing hook to 
`RedshiftClusterHook`.
   
   Getting to a point where users don't need to cross-pollinate their 
environment with the Postgres provider just to simply execute SQL in Redshift 
would be very beneficial. I agree that transition will take some time but it 
most certainly will be worth the effort.




[GitHub] [airflow] mik-laj commented on a change in pull request #18494: Fix part of Google system tests

2021-09-27 Thread GitBox


mik-laj commented on a change in pull request #18494:
URL: https://github.com/apache/airflow/pull/18494#discussion_r716873050



##
File path: airflow/providers/google/cloud/example_dags/example_cloud_sql.py
##
@@ -48,8 +49,8 @@
 from airflow.utils.dates import days_ago
 
 GCP_PROJECT_ID = os.environ.get('GCP_PROJECT_ID', 'example-project')
-INSTANCE_NAME = os.environ.get('GCSQL_MYSQL_INSTANCE_NAME', 'test-mysql')
-INSTANCE_NAME2 = os.environ.get('GCSQL_MYSQL_INSTANCE_NAME2', 'test-mysql2')
+INSTANCE_NAME = os.environ.get('GCSQL_MYSQL_INSTANCE_NAME', 'test-mysql') + 
str(random.getrandbits(16))

Review comment:
   Here is an example project that we used to run system tests for Google 
Cloud on Google Cloud Build:
   
   https://github.com/politools/airflow-system-tests




[GitHub] [airflow] mik-laj commented on a change in pull request #18494: Fix part of Google system tests

2021-09-27 Thread GitBox


mik-laj commented on a change in pull request #18494:
URL: https://github.com/apache/airflow/pull/18494#discussion_r716870133



##
File path: airflow/providers/google/cloud/example_dags/example_cloud_sql.py
##
@@ -48,8 +49,8 @@
 from airflow.utils.dates import days_ago
 
 GCP_PROJECT_ID = os.environ.get('GCP_PROJECT_ID', 'example-project')
-INSTANCE_NAME = os.environ.get('GCSQL_MYSQL_INSTANCE_NAME', 'test-mysql')
-INSTANCE_NAME2 = os.environ.get('GCSQL_MYSQL_INSTANCE_NAME2', 'test-mysql2')
+INSTANCE_NAME = os.environ.get('GCSQL_MYSQL_INSTANCE_NAME', 'test-mysql') + 
str(random.getrandbits(16))

Review comment:
   @mnojek Airflow is a distributed application, which means that one DAG 
file is loaded multiple times on different nodes, so we have to make sure that 
the instance name resolves to the same value on all nodes. These examples are 
used in system tests, where this condition is not necessary because we have a 
common memory, but the examples are also an inspiration for novice users, who 
may use another executor, e.g. CeleryExecutor, in which case each DAG will be 
loaded on each node separately.
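   The problem with `random.getrandbits` in a DAG file, and one possible deterministic alternative, can be sketched as follows. Deriving the suffix from a stable seed (here, a hash of an illustrative DAG id; the function names are hypothetical) gives every node the same value on every parse:
   
   ```python
   import hashlib
   import random
   
   def instance_name_random(base):
       # Different on every parse / every node -- problematic when the same
       # DAG file is loaded independently by multiple workers.
       return base + str(random.getrandbits(16))
   
   def instance_name_deterministic(base, seed):
       # Same on every node: the suffix is derived from a stable seed
       # such as the DAG id, not from process-local randomness.
       suffix = hashlib.sha256(seed.encode()).hexdigest()[:8]
       return f"{base}-{suffix}"
   ```
   
   Two independent "parses" of the DAG file then agree on the instance name, which is what a distributed parser requires.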




[airflow] branch main updated: Add package filter info to Breeze build docs (#18550)

2021-09-27 Thread potiuk
This is an automated email from the ASF dual-hosted git repository.

potiuk pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/airflow.git


The following commit(s) were added to refs/heads/main by this push:
 new 892c5fc  Add package filter info to Breeze build docs (#18550)
892c5fc is described below

commit 892c5fcce4a07b61c580e78bef62ab13767edaf6
Author: JavierLopezT 
AuthorDate: Mon Sep 27 18:42:49 2021 +0200

Add package filter info to Breeze build docs (#18550)
---
 BREEZE.rst | 7 +++
 1 file changed, 7 insertions(+)

diff --git a/BREEZE.rst b/BREEZE.rst
index 7d999ab..e153aaf 100644
--- a/BREEZE.rst
+++ b/BREEZE.rst
@@ -750,6 +750,13 @@ extra ``--`` flag.
 
  ./breeze build-docs -- --spellcheck-only
 
+This process can take some time, so in order to make it shorter you can filter 
by package, using the flag
+``--package-filter <PACKAGE_NAME>``. The package name has to be one of the 
providers or ``apache-airflow``. For
+instance, for using it with Amazon, the command would be:
+
+.. code-block:: bash
+
+ ./breeze build-docs -- --package-filter apache-airflow-providers-amazon
 
 Often errors during documentation generation come from the docstrings of 
auto-api generated classes.
 During the docs building auto-api generated files are stored in the 
``docs/_api`` folder. This helps you


[GitHub] [airflow] potiuk merged pull request #18550: Add package filter info to Breeze build docs

2021-09-27 Thread GitBox


potiuk merged pull request #18550:
URL: https://github.com/apache/airflow/pull/18550


   


[GitHub] [airflow] potiuk commented on pull request #18550: Add package filter info to Breeze build docs

2021-09-27 Thread GitBox


potiuk commented on pull request #18550:
URL: https://github.com/apache/airflow/pull/18550#issuecomment-928060148


   Nice!


[airflow] branch main updated (a458fcc -> 2fadf3c)

2021-09-27 Thread potiuk
This is an automated email from the ASF dual-hosted git repository.

potiuk pushed a change to branch main
in repository https://gitbox.apache.org/repos/asf/airflow.git.


from a458fcc  Updating miscellaneous provider DAGs to use TaskFlow API 
where applicable (#18278)
 add 2fadf3c  Fix kubernetes engine system test (#18548)

No new revisions were added by this update.

Summary of changes:
 .../google/cloud/example_dags/example_kubernetes_engine.py  | 2 ++
 airflow/providers/google/common/hooks/base_google.py| 6 +++---
 2 files changed, 5 insertions(+), 3 deletions(-)


[GitHub] [airflow] potiuk merged pull request #18548: Fix kubernetes engine system test

2021-09-27 Thread GitBox


potiuk merged pull request #18548:
URL: https://github.com/apache/airflow/pull/18548


   


[airflow] branch main updated: Updating miscellaneous provider DAGs to use TaskFlow API where applicable (#18278)

2021-09-27 Thread ephraimanierobi
This is an automated email from the ASF dual-hosted git repository.

ephraimanierobi pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/airflow.git


The following commit(s) were added to refs/heads/main by this push:
 new a458fcc  Updating miscellaneous provider DAGs to use TaskFlow API 
where applicable (#18278)
a458fcc is described below

commit a458fcc573845ff65244a2dafd204ed70129f3e8
Author: Josh Fell <48934154+josh-f...@users.noreply.github.com>
AuthorDate: Mon Sep 27 12:19:23 2021 -0400

Updating miscellaneous provider DAGs to use TaskFlow API where applicable 
(#18278)
---
 .../amazon/aws/example_dags/example_s3_bucket.py   |  11 +-
 .../aws/example_dags/example_s3_to_redshift.py |  31 +++--
 .../hive/example_dags/example_twitter_README.md|   2 +-
 .../hive/example_dags/example_twitter_dag.py   | 134 -
 .../apache/kylin/example_dags/example_kylin_dag.py |  64 --
 .../google/cloud/example_dags/example_s3_to_gcs.py |   7 +-
 .../example_dags/example_jenkins_job_trigger.py|  22 ++--
 .../azure/example_dags/example_fileshare.py|   9 +-
 .../papermill/example_dags/example_papermill.py|   7 +-
 .../qubole/example_dags/example_qubole.py  |  49 +++-
 .../providers/sqlite/example_dags/create_table.sql |  24 
 .../sqlite/example_dags/example_sqlite.py  |  27 ++---
 12 files changed, 160 insertions(+), 227 deletions(-)

diff --git a/airflow/providers/amazon/aws/example_dags/example_s3_bucket.py 
b/airflow/providers/amazon/aws/example_dags/example_s3_bucket.py
index ceeb4b2..ca226bc 100644
--- a/airflow/providers/amazon/aws/example_dags/example_s3_bucket.py
+++ b/airflow/providers/amazon/aws/example_dags/example_s3_bucket.py
@@ -16,8 +16,8 @@
 # under the License.
 import os
 
+from airflow.decorators import task
 from airflow.models.dag import DAG
-from airflow.operators.python import PythonOperator
 from airflow.providers.amazon.aws.hooks.s3 import S3Hook
 from airflow.providers.amazon.aws.operators.s3_bucket import 
S3CreateBucketOperator, S3DeleteBucketOperator
 from airflow.utils.dates import days_ago
@@ -25,6 +25,7 @@ from airflow.utils.dates import days_ago
 BUCKET_NAME = os.environ.get('BUCKET_NAME', 'test-airflow-12345')
 
 
+@task(task_id="s3_bucket_dag_add_keys_to_bucket")
 def upload_keys():
 """This is a python callback to add keys into the s3 bucket"""
 # add keys to bucket
@@ -41,6 +42,7 @@ with DAG(
 dag_id='s3_bucket_dag',
 schedule_interval=None,
 start_date=days_ago(2),
+default_args={"bucket_name": BUCKET_NAME},
 max_active_runs=1,
 tags=['example'],
 ) as dag:
@@ -48,17 +50,14 @@ with DAG(
 # [START howto_operator_s3_bucket]
 create_bucket = S3CreateBucketOperator(
 task_id='s3_bucket_dag_create',
-bucket_name=BUCKET_NAME,
 region_name='us-east-1',
 )
 
-add_keys_to_bucket = PythonOperator(
-task_id="s3_bucket_dag_add_keys_to_bucket", python_callable=upload_keys
-)
+# Using a task-decorated function to add keys
+add_keys_to_bucket = upload_keys()
 
 delete_bucket = S3DeleteBucketOperator(
 task_id='s3_bucket_dag_delete',
-bucket_name=BUCKET_NAME,
 force_delete=True,
 )
 # [END howto_operator_s3_bucket]
diff --git 
a/airflow/providers/amazon/aws/example_dags/example_s3_to_redshift.py 
b/airflow/providers/amazon/aws/example_dags/example_s3_to_redshift.py
index 1a7c911..9cec527 100644
--- a/airflow/providers/amazon/aws/example_dags/example_s3_to_redshift.py
+++ b/airflow/providers/amazon/aws/example_dags/example_s3_to_redshift.py
@@ -22,7 +22,8 @@ This is an example dag for using `S3ToRedshiftOperator` to 
copy a S3 key into a
 from os import getenv
 
 from airflow import DAG
-from airflow.operators.python import PythonOperator
+from airflow.decorators import task
+from airflow.models.baseoperator import chain
 from airflow.providers.amazon.aws.hooks.s3 import S3Hook
 from airflow.providers.amazon.aws.transfers.s3_to_redshift import 
S3ToRedshiftOperator
 from airflow.providers.postgres.operators.postgres import PostgresOperator
@@ -35,12 +36,14 @@ REDSHIFT_TABLE = getenv("REDSHIFT_TABLE", "test_table")
 # [END howto_operator_s3_to_redshift_env_variables]
 
 
-def _add_sample_data_to_s3():
+@task(task_id='setup__add_sample_data_to_s3')
+def add_sample_data_to_s3():
 s3_hook = S3Hook()
 s3_hook.load_string("0,Airflow", f'{S3_KEY}/{REDSHIFT_TABLE}', S3_BUCKET, 
replace=True)
 
 
-def _remove_sample_data_from_s3():
+@task(task_id='teardown__remove_sample_data_from_s3')
+def remove_sample_data_from_s3():
 s3_hook = S3Hook()
 if s3_hook.check_for_key(f'{S3_KEY}/{REDSHIFT_TABLE}', S3_BUCKET):
 s3_hook.delete_objects(S3_BUCKET, f'{S3_KEY}/{REDSHIFT_TABLE}')
@@ -49,9 +52,8 @@ def _remove_sample_data_from_s3():
 with DAG(
 dag_id="example_s3_to_redshift", start_date=days_ago(1), 
schedule_interval=None, tags=['example']
 ) as dag:
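For readers unfamiliar with the pattern this diff adopts: the `@task` decorator defers execution — calling the decorated function builds a task rather than running the body immediately. A simplified pure-Python illustration of that deferred-call idea (hypothetical `FakeTask` class; this is not Airflow's actual implementation):

```python
from typing import Any, Callable


class FakeTask:
    """Stands in for an Airflow task: records the callable, runs it later."""

    def __init__(self, func: Callable[..., Any], task_id: str):
        self.func = func
        self.task_id = task_id

    def execute(self) -> Any:
        return self.func()


def task(task_id: str) -> Callable:
    """Simplified @task: calling the decorated function returns a task
    object instead of executing the function body immediately."""

    def decorator(func: Callable[..., Any]) -> Callable[[], FakeTask]:
        def factory() -> FakeTask:
            return FakeTask(func, task_id)

        return factory

    return decorator


@task(task_id='setup__add_sample_data')
def add_sample_data():
    return "0,Airflow"


t = add_sample_data()   # builds the task; the body has not run yet
print(t.task_id)        # setup__add_sample_data
print(t.execute())      # 0,Airflow
```

This is why the diff can replace the explicit `PythonOperator(...)` construction with a plain call like `add_keys_to_bucket = upload_keys()`.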

[GitHub] [airflow] ephraimbuddy merged pull request #18278: Updating miscellaneous provider DAGs to use TaskFlow API where applicable

2021-09-27 Thread GitBox


ephraimbuddy merged pull request #18278:
URL: https://github.com/apache/airflow/pull/18278


   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] potiuk closed issue #18546: Show code of other files than DAG one

2021-09-27 Thread GitBox


potiuk closed issue #18546:
URL: https://github.com/apache/airflow/issues/18546


   






[GitHub] [airflow] potiuk commented on issue #18546: Show code of other files than DAG one

2021-09-27 Thread GitBox


potiuk commented on issue #18546:
URL: https://github.com/apache/airflow/issues/18546#issuecomment-928022140


   That is unlikely to happen soon IMHO. 
   
   It's a difficult one to pull off if you consider how Python code is parsed; it would be next-to-impossible to find out which extra files to include, and specifying them manually misses the point. 
   
   I also think Airflow is NOT a good place to show code other than DAG code - it's not a "code browser"; there is little value and high complexity in building a UI that would let you navigate between files. I think a much better approach is to use another browser for your code and write a custom Airflow plugin that links to it. 
   
   For example, if you have the code in Git and use GitSync, you could add a plugin with a view that redirects to the GitHub or GitLab UI. Even now GitHub (and I believe GitLab too) has automated detection of linked code, even in Python, so if you keep the code in the same repo you will even be able to navigate between the DAG and imported code.
   
   Let me convert this one to a discussion, as I do not think we will ever want to make it a feature (but if others think otherwise we can always convert it back).






[GitHub] [airflow] potiuk commented on pull request #18516: Fix rendering nested task fields

2021-09-27 Thread GitBox


potiuk commented on pull request #18516:
URL: https://github.com/apache/airflow/pull/18516#issuecomment-928006395


   > Sorry house full of back to school colds here. Might get to it tomorrow 
but if @turbaszek is okay don't wait for me
   
   It can wait - I think it would be good to get it into 2.2, but this is still "default" so I am not in a hurry.






[GitHub] [airflow] jbarnettfreejazz commented on issue #18512: airflow deadlock trying to update rendered_task_instance_fields table (mysql)

2021-09-27 Thread GitBox


jbarnettfreejazz commented on issue #18512:
URL: https://github.com/apache/airflow/issues/18512#issuecomment-928004168


   Hi -- thanks for jumping on this -- this seems to occur primarily in the 
DELETE case. We're running MySQL version 8.0.23.
   
   Here's stack trace info 
   
   [2021-09-27 08:02:02,953] {taskinstance.py:1463} ERROR - Task failed with 
exception
   Traceback (most recent call last):
 File 
"/home/airflow/.local/lib/python3.8/site-packages/sqlalchemy/engine/base.py", 
line 1276, in _execute_context
   self.dialect.do_execute(
 File 
"/home/airflow/.local/lib/python3.8/site-packages/sqlalchemy/engine/default.py",
 line 608, in do_execute
   cursor.execute(statement, parameters)
 File 
"/home/airflow/.local/lib/python3.8/site-packages/MySQLdb/cursors.py", line 
206, in execute
   res = self._query(query)
 File 
"/home/airflow/.local/lib/python3.8/site-packages/MySQLdb/cursors.py", line 
319, in _query
   db.query(q)
 File 
"/home/airflow/.local/lib/python3.8/site-packages/MySQLdb/connections.py", line 
259, in query
   _mysql.connection.query(self, query)
   MySQLdb._exceptions.OperationalError: (1213, 'Deadlock found when trying to 
get lock; try restarting transaction')
   
   The above exception was the direct cause of the following exception:
   
   Traceback (most recent call last):
 File 
"/home/airflow/.local/lib/python3.8/site-packages/airflow/models/taskinstance.py",
 line 1165, in _run_raw_task
   self._prepare_and_execute_task_with_callbacks(context, task)
 File 
"/home/airflow/.local/lib/python3.8/site-packages/airflow/models/taskinstance.py",
 line 1248, in _prepare_and_execute_task_with_callbacks
   RenderedTaskInstanceFields.delete_old_records(self.task_id, self.dag_id)
 File 
"/home/airflow/.local/lib/python3.8/site-packages/airflow/utils/session.py", 
line 70, in wrapper
   return func(*args, session=session, **kwargs)
 File 
"/home/airflow/.local/lib/python3.8/site-packages/airflow/models/renderedtifields.py",
 line 173, in delete_old_records
   session.query(cls).filter(
 File 
"/home/airflow/.local/lib/python3.8/site-packages/sqlalchemy/orm/query.py", 
line 3926, in delete
   delete_op.exec_()
 File 
"/home/airflow/.local/lib/python3.8/site-packages/sqlalchemy/orm/persistence.py",
 line 1697, in exec_
   self._do_exec()
 File 
"/home/airflow/.local/lib/python3.8/site-packages/sqlalchemy/orm/persistence.py",
 line 1930, in _do_exec
   self._execute_stmt(delete_stmt)
 File 
"/home/airflow/.local/lib/python3.8/site-packages/sqlalchemy/orm/persistence.py",
 line 1702, in _execute_stmt
   self.result = self.query._execute_crud(stmt, self.mapper)
 File 
"/home/airflow/.local/lib/python3.8/site-packages/sqlalchemy/orm/query.py", 
line 3568, in _execute_crud
   return conn.execute(stmt, self._params)
 File 
"/home/airflow/.local/lib/python3.8/site-packages/sqlalchemy/engine/base.py", 
line 1011, in execute
   return meth(self, multiparams, params)
 File 
"/home/airflow/.local/lib/python3.8/site-packages/sqlalchemy/sql/elements.py", 
line 298, in _execute_on_connection
   return connection._execute_clauseelement(self, multiparams, params)
 File 
"/home/airflow/.local/lib/python3.8/site-packages/sqlalchemy/engine/base.py", 
line 1124, in _execute_clauseelement
   ret = self._execute_context(
 File 
"/home/airflow/.local/lib/python3.8/site-packages/sqlalchemy/engine/base.py", 
line 1316, in _execute_context
   self._handle_dbapi_exception(
 File 
"/home/airflow/.local/lib/python3.8/site-packages/sqlalchemy/engine/base.py", 
line 1510, in _handle_dbapi_exception
   util.raise_(
 File 
"/home/airflow/.local/lib/python3.8/site-packages/sqlalchemy/util/compat.py", 
line 182, in raise_
   raise exception
 File 
"/home/airflow/.local/lib/python3.8/site-packages/sqlalchemy/engine/base.py", 
line 1276, in _execute_context
   self.dialect.do_execute(
 File 
"/home/airflow/.local/lib/python3.8/site-packages/sqlalchemy/engine/default.py",
 line 608, in do_execute
   cursor.execute(statement, parameters)
 File 
"/home/airflow/.local/lib/python3.8/site-packages/MySQLdb/cursors.py", line 
206, in execute
   res = self._query(query)
 File 
"/home/airflow/.local/lib/python3.8/site-packages/MySQLdb/cursors.py", line 
319, in _query
   db.query(q)
 File 
"/home/airflow/.local/lib/python3.8/site-packages/MySQLdb/connections.py", line 
259, in query
   _mysql.connection.query(self, query)
   sqlalchemy.exc.OperationalError: (MySQLdb._exceptions.OperationalError) 
(1213, 'Deadlock found when trying to get lock; try restarting transaction')
   [SQL: DELETE FROM rendered_task_instance_fields WHERE 
rendered_task_instance_fields.dag_id = %s AND 
rendered_task_instance_fields.task_id = %s AND 
(rendered_task_instance_fields.dag_id, rendered_task_instance_fields.task_id, 

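MySQL's standard advice for error 1213 is to retry the failed transaction. Airflow does not do that at this call site, but a generic application-side retry wrapper (a hypothetical helper with a stand-in exception type, not Airflow or MySQLdb code) can be sketched as:

```python
import time


class DeadlockError(Exception):
    """Stand-in for MySQLdb's OperationalError with error code 1213."""


def run_with_retry(txn, retries=3, backoff=0.0):
    """Run a transactional callable, retrying when the DB reports a deadlock."""
    for attempt in range(retries):
        try:
            return txn()
        except DeadlockError:
            if attempt == retries - 1:
                raise  # out of attempts: surface the deadlock to the caller
            time.sleep(backoff * (2 ** attempt))  # simple exponential backoff


# Demo: fails twice with a deadlock, succeeds on the third attempt.
calls = {"n": 0}


def flaky_delete():
    calls["n"] += 1
    if calls["n"] < 3:
        raise DeadlockError("1213: Deadlock found when trying to get lock")
    return "deleted"


print(run_with_retry(flaky_delete))  # deleted
```

In a real deployment the retried unit must be the whole transaction, not a single statement, since the engine rolls the transaction back on deadlock.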
[GitHub] [airflow] ashb commented on pull request #18516: Fix rendering nested task fields

2021-09-27 Thread GitBox


ashb commented on pull request #18516:
URL: https://github.com/apache/airflow/pull/18516#issuecomment-928002133


   Sorry house full of back to school colds here. Might get to it tomorrow but 
if @turbaszek is okay don't wait for me






[GitHub] [airflow] potiuk commented on pull request #18516: Fix rendering nested task fields

2021-09-27 Thread GitBox


potiuk commented on pull request #18516:
URL: https://github.com/apache/airflow/pull/18516#issuecomment-927999280


   kind reminder @ashb :) 






[GitHub] [airflow] potiuk commented on pull request #18476: Give MSSQL container more time to start up

2021-09-27 Thread GitBox


potiuk commented on pull request #18476:
URL: https://github.com/apache/airflow/pull/18476#issuecomment-927998717


   I think we have only those few unstable scheduler test problems left. 






[GitHub] [airflow] potiuk edited a comment on pull request #18522: Coerce datetime to pendulum for timetable

2021-09-27 Thread GitBox


potiuk edited a comment on pull request #18522:
URL: https://github.com/apache/airflow/pull/18522#issuecomment-927983797


   > Grrgh, the diffrent `str()` representation between `datetime.datetime` and 
`pendulum.DateTime` is super annoying. And yes, let's also add a test when we 
fix the failing one as well.
   
   Been there, done that. I think I mentioned it elsewhere, but I find pendulum 
creates more problems than it solves.






[GitHub] [airflow] JavierLopezT commented on pull request #14415: SnowflakeToS3Operator

2021-09-27 Thread GitBox


JavierLopezT commented on pull request #14415:
URL: https://github.com/apache/airflow/pull/14415#issuecomment-927983491


   Closing. At some point I might open a PR with a SnowflakeCopyIntoLocationOperator.






[GitHub] [airflow] potiuk commented on pull request #18522: Coerce datetime to pendulum for timetable

2021-09-27 Thread GitBox


potiuk commented on pull request #18522:
URL: https://github.com/apache/airflow/pull/18522#issuecomment-927983797


   > Grrgh, the diffrent `str()` representation between `datetime.datetime` and 
`pendulum.DateTime` is super annoying. And yes, let's also add a test when we 
fix the failing one as well.
   
   Been there, done that. I think I mentioned it elsewhere, but I find pendulum 
creates more problems than it solves.






[GitHub] [airflow] JavierLopezT closed pull request #14415: SnowflakeToS3Operator

2021-09-27 Thread GitBox


JavierLopezT closed pull request #14415:
URL: https://github.com/apache/airflow/pull/14415


   






[GitHub] [airflow] JavierLopezT opened a new pull request #18550: Add package filter info to Breeze build docs

2021-09-27 Thread GitBox


JavierLopezT opened a new pull request #18550:
URL: https://github.com/apache/airflow/pull/18550


   This was mentioned in a GitHub Actions failure, but I haven't found it in the 
documentation.
   






[GitHub] [airflow] potiuk commented on pull request #18494: Fix part of Google system tests

2021-09-27 Thread GitBox


potiuk commented on pull request #18494:
URL: https://github.com/apache/airflow/pull/18494#issuecomment-927973836


   Some static checks failed :(






[GitHub] [airflow] potiuk commented on a change in pull request #18494: Fix part of Google system tests

2021-09-27 Thread GitBox


potiuk commented on a change in pull request #18494:
URL: https://github.com/apache/airflow/pull/18494#discussion_r716791580



##
File path: airflow/providers/google/cloud/example_dags/example_cloud_sql.py
##
@@ -48,8 +49,8 @@
 from airflow.utils.dates import days_ago
 
 GCP_PROJECT_ID = os.environ.get('GCP_PROJECT_ID', 'example-project')
-INSTANCE_NAME = os.environ.get('GCSQL_MYSQL_INSTANCE_NAME', 'test-mysql')
-INSTANCE_NAME2 = os.environ.get('GCSQL_MYSQL_INSTANCE_NAME2', 'test-mysql2')
+INSTANCE_NAME = os.environ.get('GCSQL_MYSQL_INSTANCE_NAME', 'test-mysql') + 
str(random.getrandbits(16))

Review comment:
   Yes. It is HIGHLY annoying with CloudSQL that a name cannot be reused 
for a day or so, even if the db is deleted. 
   
   However I agree random POSTFIX generation in the DAG is a bad idea.
   
   What we used to have in the past is a `variables.env` file in 
`files/airflow-breeze-config` where we sourced variables, plus a very simple 
script that generated the random postfix if it was missing.
   
   Then you could do step-by-step testing, keeping the randomly generated 
postfix even across Breeze restarts.
   When you needed to change the database name you'd simply remove the file and 
it would be regenerated automatically at Breeze entry.
   
   
https://github.com/apache/airflow/blob/main/BREEZE.rst#customize-your-environment
   
   Something like that might work (writing it from memory so I am not sure if 
it is correct) in variables.env:
   
   ```
   if [[ ! -f /files/random.env ]]; then 
  echo "export RANDOM_POSTFIX=${RANDOM}" > /files/random.env
   fi
   source /files/random.env
   
   export GCSQL_MYSQL_INSTANCE_NAME="test-mysql-${RANDOM_POSTFIX}"
   ```
   
   This has several nice properties:
* everyone has their own random value
* you keep it stable between runs or even between debug sessions - for 
example you could run tasks from the example DAG separately, one by one
* you can very easily regenerate the number by simply deleting 
/files/random.env
   
   In the past the whole `airflow-breeze-config` directory was actually a 
separately checked-out repository where we kept all the variables used by the 
team. This way users with access to the same repo could share variables, while 
each user still had a different postfix, as random.env was not part of the repo.
   
   Just an inspiration if you would like to optimize your development workflow.
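The generate-once pattern in the shell sketch above is language-agnostic; the same idea in Python (hypothetical temp-dir path used for illustration):

```python
import random
from pathlib import Path


def stable_postfix(state_file: Path) -> str:
    """Generate a random postfix once, then reuse it on every later call."""
    if not state_file.exists():
        state_file.write_text(str(random.getrandbits(16)))
    return state_file.read_text()


f = Path("/tmp/random_postfix.txt")
f.unlink(missing_ok=True)       # start clean for the demo (Python 3.8+)
first = stable_postfix(f)
second = stable_postfix(f)      # same value: read back from the file
print(first == second)          # True
instance_name = f"test-mysql-{first}"
```

Deleting the state file regenerates the postfix, mirroring the "remove /files/random.env to get a new name" workflow described above.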








[GitHub] [airflow] Echronix opened a new issue #18549: CloudWatch CreateLogGroup error on already existing LogGroup

2021-09-27 Thread GitBox


Echronix opened a new issue #18549:
URL: https://github.com/apache/airflow/issues/18549


   ### Apache Airflow version
   
   2.1.2
   
   ### Operating System
   
   Ubuntu Docker
   
   ### Versions of Apache Airflow Providers
   
   apache-airflow-providers-amazon==2.0.0
   apache-airflow-providers-celery==2.0.0
   apache-airflow-providers-cncf-kubernetes==2.0.0
   apache-airflow-providers-docker==2.0.0
   apache-airflow-providers-elasticsearch==2.0.2
   apache-airflow-providers-ftp==2.0.0
   apache-airflow-providers-google==4.0.0
   apache-airflow-providers-grpc==2.0.0
   apache-airflow-providers-hashicorp==2.0.0
   apache-airflow-providers-http==2.0.0
   apache-airflow-providers-imap==2.0.0
   apache-airflow-providers-microsoft-azure==3.0.0
   apache-airflow-providers-mysql==2.0.0
   apache-airflow-providers-neo4j==2.0.0
   apache-airflow-providers-postgres==2.0.0
   apache-airflow-providers-redis==2.0.0
   apache-airflow-providers-sendgrid==2.0.0
   apache-airflow-providers-sftp==2.0.0
   apache-airflow-providers-slack==4.0.0
   apache-airflow-providers-sqlite==2.0.0
   apache-airflow-providers-ssh==2.0.0
   
   ### Deployment
   
   Official Apache Airflow Helm Chart
   
   ### Deployment details
   
   ```
   config:
 core:
   executor: 'CeleryExecutor'
 logging:
   remote_logging: 'True'
   remote_base_log_folder: 
'cloudwatch://arn:aws:logs:eu-central-1::log-group:airflow'
   remote_log_conn_id: 'cloudwatch'
   ```
   
   The log group is already created in Terraform.
   
   ### What happened
   
   ```
   
   [2021-09-27 15:03:58,541: ERROR/ForkPoolWorker-15] Failed to execute task An 
error occurred (AccessDeniedException) when calling the CreateLogGroup 
operation: User: 
arn:aws:sts:::assumed-role/airflow-service-account-role/botocore-session-1632755037
 is not authorized to perform: logs:CreateLogGroup on resource: 
arn:aws:logs:eu-central-1::log-group:airflow:log-stream:.
   Traceback (most recent call last):
 File 
"/home/airflow/.local/lib/python3.8/site-packages/airflow/executors/celery_executor.py",
 line 117, in _execute_in_fork
   args.func(args)
 File 
"/home/airflow/.local/lib/python3.8/site-packages/airflow/cli/cli_parser.py", 
line 48, in command
   return func(*args, **kwargs)
 File 
"/home/airflow/.local/lib/python3.8/site-packages/airflow/utils/cli.py", line 
91, in wrapper
   return f(*args, **kwargs)
 File 
"/home/airflow/.local/lib/python3.8/site-packages/airflow/cli/commands/task_command.py",
 line 228, in task_run
   ti.init_run_context(raw=args.raw)
 File 
"/home/airflow/.local/lib/python3.8/site-packages/airflow/models/taskinstance.py",
 line 2031, in init_run_context
   self._set_context(self)
 File 
"/home/airflow/.local/lib/python3.8/site-packages/airflow/utils/log/logging_mixin.py",
 line 54, in _set_context
   set_context(self.log, context)
 File 
"/home/airflow/.local/lib/python3.8/site-packages/airflow/utils/log/logging_mixin.py",
 line 173, in set_context
   handler.set_context(value)
 File 
"/home/airflow/.local/lib/python3.8/site-packages/airflow/providers/amazon/aws/log/cloudwatch_task_handler.py",
 line 81, in set_context
   self.handler = watchtower.CloudWatchLogHandler(
 File 
"/home/airflow/.local/lib/python3.8/site-packages/watchtower/__init__.py", line 
148, in __init__
   _idempotent_create(self.cwl_client, "create_log_group", 
logGroupName=self.log_group)
 File 
"/home/airflow/.local/lib/python3.8/site-packages/watchtower/__init__.py", line 
15, in _idempotent_create
   method_callable(*args, **kwargs)
 File 
"/home/airflow/.local/lib/python3.8/site-packages/botocore/client.py", line 
386, in _api_call
   return self._make_api_call(operation_name, kwargs)
 File 
"/home/airflow/.local/lib/python3.8/site-packages/botocore/client.py", line 
705, in _make_api_call
   raise error_class(parsed_response, operation_name)
   botocore.exceptions.ClientError: An error occurred (AccessDeniedException) 
when calling the CreateLogGroup operation: User: 
arn:aws:sts:::assumed-role/airflow-service-account-role/botocore-session-1632755037
 is not authorized to perform: logs:CreateLogGroup on resource: 
arn:aws:logs:eu-central-1::log-group:airflow:log-stream:
   
   ```
   
   ### What you expected to happen
   
   I would expect Airflow to create a stream and write the logs, not to create 
a log-group inside a log-group.
   
   ### How to reproduce
   
   Run any dag with the chart config.
   
   ### Anything else
   
   _No response_
   
   ### Are you willing to submit PR?
   
   - [ ] Yes I am willing to submit a PR!
   
   ### Code of Conduct
   
   - [X] I agree to follow this project's [Code of 
Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
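The failure mode in the trace above: watchtower's `_idempotent_create` treats "already exists" as success, but the AWS API checks permissions first, so a role without `logs:CreateLogGroup` gets `AccessDeniedException` even when the log group already exists. A minimal illustration of that distinction (hypothetical exception classes, not watchtower's or botocore's actual types):

```python
class ResourceAlreadyExists(Exception):
    """Stand-in for the 'resource already exists' API error."""


class AccessDenied(Exception):
    """Stand-in for AccessDeniedException."""


def idempotent_create(create):
    """Treat 'already exists' as success; any other error still raises."""
    try:
        create()
    except ResourceAlreadyExists:
        pass  # fine: the resource is there, which is all we wanted


def create_existing():
    raise ResourceAlreadyExists("log group exists")


def create_without_permission():
    # Permissions are checked before existence, so a role lacking
    # CreateLogGroup is denied even for a pre-created group.
    raise AccessDenied("not authorized to perform: logs:CreateLogGroup")


idempotent_create(create_existing)  # swallowed: treated as success
try:
    idempotent_create(create_without_permission)
    outcome = "ok"
except AccessDenied:
    outcome = "denied"
print(outcome)  # denied
```

This is why pre-creating the log group in Terraform does not help unless the role is also granted create permission (or the handler skips the create call entirely).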
   



[GitHub] [airflow] boring-cyborg[bot] commented on issue #18549: CloudWatch CreateLogGroup error on already existing LogGroup

2021-09-27 Thread GitBox


boring-cyborg[bot] commented on issue #18549:
URL: https://github.com/apache/airflow/issues/18549#issuecomment-927968278


   Thanks for opening your first issue here! Be sure to follow the issue 
template!
   






[GitHub] [airflow] jj8huang commented on issue #16344: Tried to upgrade to Airflow 2.1.0

2021-09-27 Thread GitBox


jj8huang commented on issue #16344:
URL: https://github.com/apache/airflow/issues/16344#issuecomment-927963260


   Yes I did! And I ended up getting this error: `Can't locate revision 
identified by 'ccde3e26fe78'`. Also worth noting that I couldn't find any logs 
in AWS of the other migrations running...






[GitHub] [airflow] deedmitrij opened a new pull request #18548: Fix kubernetes engine system test

2021-09-27 Thread GitBox


deedmitrij opened a new pull request #18548:
URL: https://github.com/apache/airflow/pull/18548


   1. Add `in_cluster=False` argument to `GKEStartPodOperator`
   2. Change order of activating service account and setting project
   






[GitHub] [airflow] jensgrnb commented on issue #18495: apache-airflow-providers-sendgrid==2.0.1 doesn't show in the connections drop down UI

2021-09-27 Thread GitBox


jensgrnb commented on issue #18495:
URL: https://github.com/apache/airflow/issues/18495#issuecomment-927960288


   Thanks for the code pointer, I'll double check the pasted api key!
   






[airflow] branch main updated (387c43f -> 391da64)

2021-09-27 Thread potiuk
This is an automated email from the ASF dual-hosted git repository.

potiuk pushed a change to branch main
in repository https://gitbox.apache.org/repos/asf/airflow.git.


from 387c43f  Fix intermittent orphan test (#18530)
 add 391da64  Excludes rightfullhy unlicensed files from chart from RAT 
check (#18547)

No new revisions were added by this update.

Summary of changes:
 .rat-excludes | 6 ++
 1 file changed, 6 insertions(+)


[GitHub] [airflow] potiuk merged pull request #18547: Excludes rightfullhy unlicensed files from chart from RAT check

2021-09-27 Thread GitBox


potiuk merged pull request #18547:
URL: https://github.com/apache/airflow/pull/18547


   






[GitHub] [airflow] potiuk commented on issue #18495: apache-airflow-providers-sendgrid==2.0.1 doesn't show in the connections drop down UI

2021-09-27 Thread GitBox


potiuk commented on issue #18495:
URL: https://github.com/apache/airflow/issues/18495#issuecomment-927951133


   I did not use sendgrid so I am not sure but from the code it looks like 
API_KEY is taken from connection password. I do not think host is ever used. 
You just have to make sure your connection is named 'sendgrid_default". You can 
also try env variable to test if your key is correct (see below):
   
   The error 400 has nothing to do with password - it is `Bad request` - so if 
anything this might be a bad version of sendgrid dependency, wrong provider, or 
maybe wrong format of the KEY? You need to consult sendgrid docs:
   
   From: 
https://github.com/apache/airflow/blob/main/airflow/providers/sendgrid/utils/emailer.py
   
   ```
   def _post_sendgrid_mail(mail_data: Dict, conn_id: str = "sendgrid_default") 
-> None:
   api_key = None
   try:
   conn = BaseHook.get_connection(conn_id)
   api_key = conn.password
   except AirflowException:
   pass
   if api_key is None:
   warnings.warn(
   "Fetching Sendgrid credentials from environment variables will 
be deprecated in a future "
   "release. Please set credentials using a connection instead.",
   PendingDeprecationWarning,
   stacklevel=2,
   )
   api_key = os.environ.get('SENDGRID_API_KEY')
   sendgrid_client = sendgrid.SendGridAPIClient(api_key=api_key)
   response = sendgrid_client.client.mail.send.post(request_body=mail_data)
   # 2xx status code.
   if 200 <= response.status_code < 300:
   log.info(
   'Email with subject %s is successfully sent to recipients: %s',
   mail_data['subject'],
   mail_data['personalizations'],
   )
   else:
   log.error(
   'Failed to send out email with subject %s, status code: %s',
   mail_data['subject'],
   response.status_code,
   )
   ```
   
   






[GitHub] [airflow] github-actions[bot] commented on pull request #18547: Excludes rightfullhy unlicensed files from chart from RAT check

2021-09-27 Thread GitBox


github-actions[bot] commented on pull request #18547:
URL: https://github.com/apache/airflow/pull/18547#issuecomment-927945440


   The PR is likely ready to be merged. No tests are needed as no important 
environment files, nor python files were modified by it. However, committers 
might decide that full test matrix is needed and add the 'full tests needed' 
label. Then you should rebase it to the latest main or amend the last commit of 
the PR, and push it with --force-with-lease.






[GitHub] [airflow] potiuk opened a new pull request #18547: Excludes rightfullhy unlicensed files from chart from RAT check

2021-09-27 Thread GitBox


potiuk opened a new pull request #18547:
URL: https://github.com/apache/airflow/pull/18547


   
   
   ---
   **^ Add meaningful description above**
   
   Read the **[Pull Request 
Guidelines](https://github.com/apache/airflow/blob/main/CONTRIBUTING.rst#pull-request-guidelines)**
 for more information.
   In case of fundamental code change, Airflow Improvement Proposal 
([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals))
 is needed.
   In case of a new dependency, check compliance with the [ASF 3rd Party 
License Policy](https://www.apache.org/legal/resolved.html#category-x).
   In case of backwards incompatible changes please leave a note in 
[UPDATING.md](https://github.com/apache/airflow/blob/main/UPDATING.md).
   





