[GitHub] [airflow] alexbegg commented on issue #19506: Salesforce connections should not require all extras

2021-11-09 Thread GitBox


alexbegg commented on issue #19506:
URL: https://github.com/apache/airflow/issues/19506#issuecomment-964869874


   I am just about ready to submit a PR fix for this. I just want to first open 
this as an issue so I can provide more detail about the problem.


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] alexbegg opened a new issue #19506: Salesforce connections should not require all extras

2021-11-09 Thread GitBox


alexbegg opened a new issue #19506:
URL: https://github.com/apache/airflow/issues/19506


   ### Apache Airflow Provider(s)
   
   salesforce
   
   ### Versions of Apache Airflow Providers
   
   apache-airflow-providers-salesforce==1!3.2.0
   
   ### Apache Airflow version
   
   2.1.4
   
   ### Operating System
   
   Debian GNU/Linux 10 (buster)
   
   ### Deployment
   
   Astronomer
   
   ### Deployment details
   
   _No response_
   
   ### What happened
   
   Right now the `SalesforceHook` requires all of the extras for a 
Salesforce connection type in Airflow.
   
   I am assuming this issue was never brought up before because users of this 
hook have been using the Airflow UI to make connections (which presents all 
extra fields), however with things such as Secrets Backends, it should be 
possible to set up a Salesforce connection URI without having to explicitly 
provide all of the extras.
   
   The issue is that the hook's author designed `get_conn` to pass in 
every extra value using an improper method of defaulting to None: it uses 
`or None` but still references the extras key directly, so the subscript lookup 
raises before the `or` can apply. I tested this on versions as low as Python 2.7 
and as high as Python 3.7, and in both versions, if any one 
of these extras is not provided you will get a `KeyError`.
   
   
https://github.com/apache/airflow/blob/e9a72a4e95e6d23bae010ad92499cd7b06d50037/airflow/providers/salesforce/hooks/salesforce.py#L137-L149
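
For illustration, a minimal standalone sketch of the failing pattern (the dict and key names mirror the Salesforce extras, but this is not the hook's actual code):

```python
extras = {"extra__salesforce__security_token": "abc123"}

# The problematic pattern: `or None` cannot guard against a missing key,
# because the subscript lookup raises KeyError before `or` is evaluated.
try:
    domain = extras["extra__salesforce__domain"] or None
except KeyError as exc:
    print(f"KeyError: {exc}")

# The safe alternative: dict.get() returns None for missing keys.
domain = extras.get("extra__salesforce__domain")
print(domain)  # None
```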
   
   ### What you expected to happen
   
   Any extras value not provided should just default to None (or to 
`api.DEFAULT_API_VERSION` for the "version" extra). It should be possible to 
set up a Salesforce connection URI using a secrets backend without having to 
explicitly provide all of the extras.
   
   ### How to reproduce
   
   Set up a secrets backend (such as via environment variable) and pass in the 
minimum connection values needed for "Password" connection type:
   
   _(this is an example, no real passwords shown)_
   `export 
AIRFLOW_CONN_SALESFORCE_DEFAULT='http://your_username:your_password@https%3A%2F%2Fyour_host.lightning.force.com?extra__salesforce__security_token=your_token'`
   
   It will error with `KeyError: 'extra__salesforce__domain'` and keep 
resulting in key errors for each extras key until you finally provide all 
extras, like so:
   
   _(this is an example, no real passwords shown)_
   `export 
AIRFLOW_CONN_SALESFORCE_DEFAULT='http://your_username:your_password@https%3A%2F%2Fyour_host.lightning.force.com?extra__salesforce__security_token=your_token&extra__salesforce__domain=&extra__salesforce__instance=&extra__salesforce__instance_url=&extra__salesforce__organization_id=&extra__salesforce__version=&extra__salesforce__proxies=&extra__salesforce__client_id=Aiflow&extra__salesforce__consumer_key=&extra__salesforce__private_key_file_path=&extra__salesforce__private_key='`
   
   ### Anything else
   
   In addition to this, the `SalesforceHook` should also accept extras without 
the need for the `extra__salesforce__` prefix, like many other connections do.
   
   ### Are you willing to submit PR?
   
   - [X] Yes I am willing to submit a PR!
   
   ### Code of Conduct
   
   - [X] I agree to follow this project's [Code of 
Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
   






[GitHub] [airflow] Bowrna commented on pull request #19470: Add how-to Guide for MSSQL operators

2021-11-09 Thread GitBox


Bowrna commented on pull request #19470:
URL: https://github.com/apache/airflow/pull/19470#issuecomment-964848616


   thank you @eladkal and @potiuk 






[GitHub] [airflow] dstandish opened a new pull request #19505: XCom.serialize_value should have all params that set does

2021-11-09 Thread GitBox


dstandish opened a new pull request #19505:
URL: https://github.com/apache/airflow/pull/19505


   When implementing a custom XCom backend, in order to store XCom objects 
organized by dag_id, run_id, etc., we need to pass those params to 
`serialize_value`.
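
As a hedged illustration of why those params matter, here is a toy serializer (the class name and signature are illustrative, not Airflow's actual `BaseXCom` interface) that can only build a per-run storage path if `dag_id`, `task_id`, and `run_id` are forwarded by the caller:

```python
class PathOrganizedXComBackend:
    """Toy stand-in for a custom XCom backend (illustrative only)."""

    store: dict = {}

    @classmethod
    def serialize_value(cls, value, *, key=None, dag_id=None,
                        task_id=None, run_id=None):
        # Without dag_id/run_id/task_id being passed in, there is no way
        # to organize stored objects by run.
        path = f"{dag_id}/{run_id}/{task_id}/{key}"
        cls.store[path] = value
        return path  # the reference persisted in the metadata DB

ref = PathOrganizedXComBackend.serialize_value(
    {"rows": 3},
    key="result",
    dag_id="my_dag",
    task_id="extract",
    run_id="manual__2021-11-09",
)
print(ref)  # my_dag/manual__2021-11-09/extract/result
```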
   






[GitHub] [airflow] alexbegg commented on issue #16921: Add support for Salesforce Bulk API

2021-11-09 Thread GitBox


alexbegg commented on issue #16921:
URL: https://github.com/apache/airflow/issues/16921#issuecomment-964831154


   I just want to share that it is fairly simple to use a SalesforceHook for a 
SalesforceBulk connection; I am doing it at my company (you just have to `pip 
install salesforce-bulk`; see https://github.com/heroku/salesforce-bulk):
   
   ```python
   from airflow.providers.salesforce.hooks.salesforce import SalesforceHook
   from salesforce_bulk import SalesforceBulk
   
   ...
   
   # Reuse the session the standard SalesforceHook already established
   sf_hook = SalesforceHook()
   sf_conn = sf_hook.get_conn()
   bulk = SalesforceBulk(sessionId=sf_conn.session_id, host=sf_conn.sf_instance)
   ```
   
   So it should be fairly simple to make this into a `SalesforceBulkHook` (I 
think "Api" in the name is a bit unnecessary since we aren't calling the 
existing hook SalesforceApiHook).






[GitHub] [airflow] sharon2719 closed pull request #18930: Add Guide for Apache Pinot

2021-11-09 Thread GitBox


sharon2719 closed pull request #18930:
URL: https://github.com/apache/airflow/pull/18930


   






[GitHub] [airflow] dstandish commented on a change in pull request #19504: Do not check for key before attempting download

2021-11-09 Thread GitBox


dstandish commented on a change in pull request #19504:
URL: https://github.com/apache/airflow/pull/19504#discussion_r746281750



##
File path: airflow/providers/amazon/aws/hooks/s3.py
##
@@ -806,10 +806,15 @@ def download_file(
 """
 self.log.info('Downloading source S3 file from Bucket %s with path 
%s', bucket_name, key)
 
-if not self.check_for_key(key, bucket_name):
-raise AirflowException(f'The source file in Bucket {bucket_name} 
with path {key} does not exist')
-
-s3_obj = self.get_key(key, bucket_name)
+try:
+s3_obj = self.get_key(key, bucket_name)
+except ClientError as e:

Review comment:
   my preference would be to not catch this error at all, but catching it 
keeps this consistent with the existing behavior








[GitHub] [airflow] dstandish opened a new pull request #19504: Do not check for key before attempting download

2021-11-09 Thread GitBox


dstandish opened a new pull request #19504:
URL: https://github.com/apache/airflow/pull/19504


   When you download a key that exists, notice that it retrieves creds twice:
   
   ```
   [2021-11-09 22:25:18,736] {s3.py:807} INFO - Downloading source S3 file from 
Bucket oss-test-xcom with path 
test-dag/test-task/2021-01-01T00:00:00+00:00/hellodf
   [2021-11-09 22:25:18,736] {base_aws.py:401} INFO - Airflow Connection: 
aws_conn_id=aws_default
   [2021-11-09 22:25:18,752] {credentials.py:1224} INFO - Found credentials in 
shared credentials file: ~/.aws/credentials
   [2021-11-09 22:25:19,049] {base_aws.py:424} WARNING - Unable to use Airflow 
Connection for credentials.
   [2021-11-09 22:25:19,049] {base_aws.py:425} INFO - Fallback on boto3 
credential strategy
   [2021-11-09 22:25:19,049] {base_aws.py:428} INFO - Creating session using 
boto3 credential strategy region_name=None
   
/Users/dstandish/code/airflow/airflow/providers/amazon/aws/hooks/base_aws.py:494
 DeprecationWarning: client_type is deprecated. Set client_type from class 
attribute.
   [2021-11-09 22:25:19,066] {credentials.py:1224} INFO - Found credentials in 
shared credentials file: ~/.aws/credentials
   [2021-11-09 22:25:19,526] {base_aws.py:401} INFO - Airflow Connection: 
aws_conn_id=aws_default
   [2021-11-09 22:25:19,594] {base_aws.py:424} WARNING - Unable to use Airflow 
Connection for credentials.
   [2021-11-09 22:25:19,594] {base_aws.py:425} INFO - Fallback on boto3 
credential strategy
   [2021-11-09 22:25:19,594] {base_aws.py:428} INFO - Creating session using 
boto3 credential strategy region_name=None
   /Users/dstandish/code/airflow/airflow/providers/amazon/aws/hooks/s3.py:343 
DeprecationWarning: resource_type is deprecated. Set resource_type from class 
attribute.
   [2021-11-09 22:25:19,628] {credentials.py:1224} INFO - Found credentials in 
shared credentials file: ~/.aws/credentials
   ```
   
   The first is for checking existence and the second is retrieving the object.
   
   We don't need to check for existence.  We can just ask for the object, and if 
it's not there, the API will let us know.  And when the object _is_ there, 
we'll only have retrieved creds once.
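
The idea can be sketched with a stub client (`FakeS3` and `NoSuchKey` are stand-ins for boto3's resource and botocore's `ClientError`, not the real AWS API):

```python
class NoSuchKey(Exception):
    """Stand-in for botocore's ClientError with a 404 response code."""

class FakeS3:
    def __init__(self, objects):
        self._objects = objects

    def get_key(self, key):
        if key not in self._objects:
            raise NoSuchKey(key)
        return self._objects[key]

def download_file(client, key, bucket_name="my-bucket"):
    # EAFP: request the object directly. A missing key surfaces as an
    # exception from that single call, so no separate existence check
    # (and no second round of credential retrieval) is needed.
    try:
        return client.get_key(key)
    except NoSuchKey:
        raise FileNotFoundError(
            f"The source file in Bucket {bucket_name} with path {key} does not exist"
        )

client = FakeS3({"data.csv": b"a,b\n1,2\n"})
print(download_file(client, "data.csv"))
```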
   
   
   
   






[GitHub] [airflow] alexbegg edited a comment on issue #19500: Multiple SLA miss emails if multiple schedulers are used

2021-11-09 Thread GitBox


alexbegg edited a comment on issue #19500:
URL: https://github.com/apache/airflow/issues/19500#issuecomment-964817534


   > I'm wondering why you use two schedulers. Are you using multiple headers 
for high availability?
   
   I did that because Astronomer actually recommends it for production: "To 
increase the speed at which tasks are scheduled and ensure high-availability, 
we recommend provisioning 2 or more Airflow Schedulers for production 
environments," as stated under the "Airflow Scheduler" section of this 
documentation: 
https://www.astronomer.io/docs/enterprise/v0.25/deploy/configure-deployment#scale-core-resources
   
   However, I don't have another reason to use 2 schedulers; I might switch 
back to 1. Nevertheless, it still is a sort of bug.






[GitHub] [airflow] alexbegg commented on issue #19500: Multiple SLA miss emails if multiple schedulers are used

2021-11-09 Thread GitBox


alexbegg commented on issue #19500:
URL: https://github.com/apache/airflow/issues/19500#issuecomment-964817534


   > I'm wondering why you use two schedulers. Are you using multiple headers 
for high availability?
   
   I did that because Astronomer actually recommends it for production: "To 
increase the speed at which tasks are scheduled and ensure high-availability, 
we recommend provisioning 2 or more Airflow Schedulers for production 
environments," as stated under the "Airflow Scheduler" section of this 
documentation: 
https://www.astronomer.io/docs/enterprise/v0.25/deploy/configure-deployment#scale-core-resources
   
   However, I don't have another reason to use 2 schedulers; I might switch 
back to 1. Nevertheless, it still is a sort of bug.






[GitHub] [airflow] harryplumer opened a new issue #19503: SnowflakeHook opening connection twice

2021-11-09 Thread GitBox


harryplumer opened a new issue #19503:
URL: https://github.com/apache/airflow/issues/19503


   ### Apache Airflow Provider(s)
   
   snowflake
   
   ### Versions of Apache Airflow Providers
   
   apache-airflow-providers-snowflake==2.2.0 
   
   ### Apache Airflow version
   
   2.2.0
   
   ### Operating System
   
   Linux-5.4.141-67.229.amzn2.x86_64-x86_64-with-glibc2.31
   
   ### Deployment
   
   Astronomer
   
   ### Deployment details
   
   _No response_
   
   ### What happened
   
   The `get_connection()` method is called twice in the SnowflakeHook, 
initializing the connection twice and slowing the hook down. See the following 
example logs on a single `run` of the SnowflakeHook:
   
   
   (screenshot: https://user-images.githubusercontent.com/30101670/141052877-648a5543-cf9a-4c16-82a6-02f6b62b47ba.png)
   
   
   
   ### What you expected to happen
   
   The code is using the `get_connection` method on [line 
271](https://github.com/apache/airflow/blob/main/airflow/providers/snowflake/hooks/snowflake.py#L271)
 and then calling the method again on [line 
272](https://github.com/apache/airflow/blob/main/airflow/providers/snowflake/hooks/snowflake.py#L272)
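
A hedged sketch of the duplicated-call pattern (the counting `Hook` class below is a stand-in, not the actual SnowflakeHook code):

```python
class Hook:
    """Counting stand-in for a DB-backed connection hook (illustrative only)."""

    def __init__(self):
        self.calls = 0

    def get_connection(self, conn_id):
        self.calls += 1  # each call would fetch and open the connection again
        return {"conn_id": conn_id, "extra": {"warehouse": "wh"}}

# The reported pattern: two back-to-back calls initialize the connection twice.
h = Hook()
conn = h.get_connection("snowflake_default")
extra = h.get_connection("snowflake_default")["extra"]
print(h.calls)  # 2

# Reusing the first result avoids the duplicate lookup.
h2 = Hook()
conn = h2.get_connection("snowflake_default")
extra = conn["extra"]
print(h2.calls)  # 1
```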
   
   ### How to reproduce
   
   Run the SnowflakeHook
   
   ### Anything else
   
   _No response_
   
   ### Are you willing to submit PR?
   
   - [X] Yes I am willing to submit a PR!
   
   ### Code of Conduct
   
   - [X] I agree to follow this project's [Code of 
Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
   






[GitHub] [airflow] boring-cyborg[bot] commented on issue #19503: SnowflakeHook opening connection twice

2021-11-09 Thread GitBox


boring-cyborg[bot] commented on issue #19503:
URL: https://github.com/apache/airflow/issues/19503#issuecomment-964790956


   Thanks for opening your first issue here! Be sure to follow the issue 
template!
   






[GitHub] [airflow] kaustubhharapanahalli opened a new issue #19502: Issue while connecting to docker daemon from containerized Airflow

2021-11-09 Thread GitBox


kaustubhharapanahalli opened a new issue #19502:
URL: https://github.com/apache/airflow/issues/19502


   ### Apache Airflow Provider(s)
   
   docker
   
   ### Versions of Apache Airflow Providers
   
   apache-airflow-providers-amazon==2.3.0
   apache-airflow-providers-celery==2.1.0
   apache-airflow-providers-cncf-kubernetes==2.0.3
   apache-airflow-providers-docker==2.2.0
   apache-airflow-providers-elasticsearch==2.0.3
   apache-airflow-providers-ftp==2.0.1
   apache-airflow-providers-google==6.0.0
   apache-airflow-providers-grpc==2.0.1
   apache-airflow-providers-hashicorp==2.1.1
   apache-airflow-providers-http==2.0.1
   apache-airflow-providers-imap==2.0.1
   apache-airflow-providers-microsoft-azure==3.2.0
   apache-airflow-providers-mysql==2.1.1
   apache-airflow-providers-odbc==2.0.1
   apache-airflow-providers-postgres==2.3.0
   apache-airflow-providers-redis==2.0.1
   apache-airflow-providers-sendgrid==2.0.1
   apache-airflow-providers-sftp==2.1.1
   apache-airflow-providers-slack==4.1.0
   apache-airflow-providers-sqlite==2.0.1
   apache-airflow-providers-ssh==2.2.0
   
   ### Apache Airflow version
   
   2.2.1 (latest released)
   
   ### Operating System
   
   PRETTY_NAME="Debian GNU/Linux 10 (buster)"
   
   ### Deployment
   
   Docker-Compose
   
   ### Deployment details
   
   ```yaml
   version: "3.9"
   
   services:
     postgres:
       restart: always
       image: postgres:latest
       container_name: dpg_postgres
       environment:
         - POSTGRES_USER=airflow
         - POSTGRES_PASSWORD=airflow
         - POSTGRES_DB=airflow
       ports:
         - "5434:5432"
   
     webserver:
       container_name: dpg_airflow
       build:
         context: .
         args:
           PYTHON_BASE_IMAGE: "python:3.7-slim-buster"
           PYTHON_MAJOR_MINOR_VERSION: 3.7
           AIRFLOW_INSTALLATION_METHOD: "apache-airflow"
           AIRFLOW_VERSION: "2.2.1"
           AIRFLOW_INSTALL_VERSION: "==2.2.1"
           AIRFLOW_CONSTRAINTS_REFERENCE: "constraints-2-0"
           AIRFLOW_SOURCES_FROM: "empty"
           AIRFLOW_SOURCES_TO: "/empty"
           ADDITIONAL_AIRFLOW_EXTRAS: "apache-airflow-providers-amazon \
             apache-airflow-providers-http \
             apache-airflow-providers-hashicorp \
             apache-airflow-providers-docker \
             apache-airflow-providers-ftp \
             apache-airflow-providers-postgres \
             apache-airflow-providers-sftp \
             apache-airflow-providers-ssh"
           ADDITIONAL_DEV_APT_DEPS: "gcc g++"
       environment:
         key_name:
         project_group:
       hostname: webserver
       restart: always
       depends_on:
         - postgres
       env_file:
         - .env
       volumes:
         - ./dags:/opt/airflow/dags
         - ./plugins:/opt/airflow/plugins
         - data_volume:/data/project_meta
       ports:
         - "8090:8080"
       entrypoint: ./dags/scripts/airflow-entrypoint.sh
       healthcheck:
         test: ["CMD-SHELL", "[ -f ./opt/airflow/airflow-webserver.pid ]"]
         interval: 30s
         timeout: 30s
         retries: 32
   
   volumes:
     data_volume:
   ```
   
   ### What happened
   
   Hello, I was trying to connect to the docker daemon from a containerized Airflow 
setup. I saw this one issue: https://github.com/apache/airflow/issues/16803 
Looking at this, I tried to migrate my Airflow version to 2.2.1 and set 
`mount_tmp_dir=False`.
   
   I tried to run this using two approaches.
   1. By not setting a tcp docker url
   2. By setting a tcp docker url
   
   In the first approach, I got this error:
   ```bash
   *** Reading local file: 
/opt/airflow/logs/hello_world/echo_name/2021-11-10T04:14:45.866031+00:00/1.log
   [2021-11-10, 04:14:47 UTC] {taskinstance.py:1035} INFO - Dependencies all 
met for 
   [2021-11-10, 04:14:47 UTC] {taskinstance.py:1035} INFO - Dependencies all 
met for 
   [2021-11-10, 04:14:47 UTC] {taskinstance.py:1241} INFO - 
   

   [2021-11-10, 04:14:47 UTC] {taskinstance.py:1242} INFO - Starting attempt 1 
of 2
   [2021-11-10, 04:14:47 UTC] {taskinstance.py:1243} INFO - 
   

   [2021-11-10, 04:14:47 UTC] {taskinstance.py:1262} INFO - Executing 
 on 2021-11-10 04:14:45.866031+00:00
   [2021-11-10, 04:14:47 UTC] {standard_task_runner.py:52} INFO - Started 
process 5836 to run task
   [2021-11-10, 04:14:47 UTC] {standard_task_runner.py:76} INFO - Running: 
['***', 'tasks', 'run', 'hello_world', 'echo_name', 
'manual__2021-11-10T04:14:45.866031+00:00', '--job-id', '2', '--raw', 
'--subdir', 'DAGS_FOLDER/dag_creator.py', '--cfg-path', '/tmp/tmp39jdfg29', 
'--error-file', '/tmp/tmp2tmtb5n4']
   [2021-11-10, 04:14:47 UTC] {standard_task_runner.py:77} INFO - Job 2: 
Subtask echo_name
   [2021-11-10, 04:14:47 UTC] {logging_mixin.py:109} INFO - Running 
 on host webserver
   [2021-

[GitHub] [airflow] EricGao888 commented on issue #19500: Multiple SLA miss emails if multiple schedulers are used

2021-11-09 Thread GitBox


EricGao888 commented on issue #19500:
URL: https://github.com/apache/airflow/issues/19500#issuecomment-964753067


   I'm wondering why you use two schedulers. Are you using multiple headers for 
high availability?






[GitHub] [airflow] josh-fell opened a new pull request #19501: Update Operators and Hooks doc to reflect latest

2021-11-09 Thread GitBox


josh-fell opened a new pull request #19501:
URL: https://github.com/apache/airflow/pull/19501


   Updating paths to point to non-deprecated modules and alphabetizing the 
tables in each group.
   
   ---
   **^ Add meaningful description above**
   
   Read the **[Pull Request 
Guidelines](https://github.com/apache/airflow/blob/main/CONTRIBUTING.rst#pull-request-guidelines)**
 for more information.
   In case of fundamental code change, Airflow Improvement Proposal 
([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals))
 is needed.
   In case of a new dependency, check compliance with the [ASF 3rd Party 
License Policy](https://www.apache.org/legal/resolved.html#category-x).
   In case of backwards incompatible changes please leave a note in 
[UPDATING.md](https://github.com/apache/airflow/blob/main/UPDATING.md).
   






[GitHub] [airflow] alexbegg opened a new issue #19500: Multiple SLA miss emails if multiple schedulers are used

2021-11-09 Thread GitBox


alexbegg opened a new issue #19500:
URL: https://github.com/apache/airflow/issues/19500


   ### Apache Airflow version
   
   2.1.4
   
   ### Operating System
   
   Debian GNU/Linux 10 (buster)
   
   ### Versions of Apache Airflow Providers
   
   _No response_
   
   ### Deployment
   
   Astronomer
   
   ### Deployment details
   
   2 schedulers and CeleryExecutor with 2 workers
   
   ### What happened
   
   I am receiving 2 identical SLA miss emails for each SLA miss due to having 2 
schedulers running.
   
   I know the scheduler is what checks the SLA, so I can understand why both 
schedulers sent out an SLA miss, but if the 2 schedulers can know not to run the 
same task twice, it should also be robust enough to not send the SLA miss email 
twice.
   
   ### What you expected to happen
   
   I should only receive 1 SLA miss email
   
   ### How to reproduce
   
   Have 2 schedulers running and set up an SLA and set up a task to run past 
the SLA
   
   ### Anything else
   
   If I can get a better understanding of how multiple schedulers work 
side-by-side I can make a PR to fix this, but I come from the Airflow 1.10.x 
world and I am new to using 2 schedulers in Airflow.
   
   ### Are you willing to submit PR?
   
   - [X] Yes I am willing to submit a PR!
   
   ### Code of Conduct
   
   - [X] I agree to follow this project's [Code of 
Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
   






[GitHub] [airflow] josh-fell commented on a change in pull request #17421: Add ShortCircuitOperator configurability for respecting downstream trigger rules

2021-11-09 Thread GitBox


josh-fell commented on a change in pull request #17421:
URL: https://github.com/apache/airflow/pull/17421#discussion_r74617



##
File path: airflow/operators/python.py
##
@@ -219,17 +219,31 @@ def execute(self, context: Dict):
 
 class ShortCircuitOperator(PythonOperator, SkipMixin):
 """
-Allows a workflow to continue only if a condition is met. Otherwise, the
-workflow "short-circuits" and downstream tasks are skipped.
-
-The ShortCircuitOperator is derived from the PythonOperator. It evaluates a
-condition and short-circuits the workflow if the condition is False. Any
-downstream tasks are marked with a state of "skipped". If the condition is
-True, downstream tasks proceed as normal.
-
-The condition is determined by the result of `python_callable`.
+Allows a workflow to continue only if a condition is met. Otherwise, the 
workflow "short-circuits" and
+downstream tasks are skipped. The short-circuiting can be configured to 
either respect or ignore the
+``trigger_rule`` set for downstream tasks. If 
``ignore_downstream_trigger_rules`` is set to True, the
+default setting, all downstream tasks are skipped without considering the 
``trigger_rule`` defined for
+tasks.  However, if this parameter is set to False, the direct, downstream 
tasks are skipped but the
+specified ``trigger_rule`` for other subsequent downstream tasks are 
respected. In this mode,
+the operator assumes the direct, downstream tasks were purposely meant to 
be skipped but perhaps
+not other subsequent tasks.
+
+The ShortCircuitOperator is derived from the PythonOperator. It evaluates 
a condition and short-circuits
+the workflow if the condition is False. Any downstream tasks are marked 
with a state of "skipped" based
+on the short-circuiting mode configured. If the condition is True, 
downstream tasks proceed as normal.
+
+The condition is determined by the result of ``python_callable``.
+
+:param ignore_downstream_trigger_rules: If set to True, all downstream 
tasks from this operator task will
+be skipped. This is the default behavior. If set to False, the direct, 
downstream task(s) will be
+skipped but the ``trigger_rule`` defined for a other downstream tasks 
will be respected.
+:type ignore_downstream_trigger_rules: bool

Review comment:
   Oh yes, 100%. I'll add the necessary context to the docs.








[GitHub] [airflow] alexbegg commented on issue #18031: Alert if a task misses deadline

2021-11-09 Thread GitBox


alexbegg commented on issue #18031:
URL: https://github.com/apache/airflow/issues/18031#issuecomment-964690136


   Maybe I am not understanding the ticket, but I don't find this to be true:
   
   > 1. sla_miss_callback only fires after the task finishes. That means if the 
task is never finished in the first place due to it being blocked, or is still 
running, sla_miss_callback is not fired.
   
   If I have a DAG that runs hourly and, for example, I set a 1-hour SLA, I 
will get an SLA miss email if a single DAG run is still running past 1 hour. If 
a DAG finished and it failed it will use `on_failure_callback`, and if a DAG 
finished and it succeeded it will use `on_success_callback`. So that covers 
both cases of a DAG being complete (which is only possible if no tasks are 
running or blocked), so obviously the `sla_miss_callback` fires if the DAG is 
missing its SLA, even if it is still running.
   
   I do agree that there should be better control for task-specific deadlines, 
but this can also be partially accomplished today: put the part of the 
DAG that needs a deadline by itself in a separate DAG with an SLA in 
place, and have the remainder in a second DAG that uses 
`ExternalTaskSensor`.






[GitHub] [airflow] github-actions[bot] closed issue #10779: Scheduler can't creating DAG runs if some were externally triggered

2021-11-09 Thread GitBox


github-actions[bot] closed issue #10779:
URL: https://github.com/apache/airflow/issues/10779


   






[GitHub] [airflow] github-actions[bot] commented on issue #10779: Scheduler can't creating DAG runs if some were externally triggered

2021-11-09 Thread GitBox


github-actions[bot] commented on issue #10779:
URL: https://github.com/apache/airflow/issues/10779#issuecomment-964657351


   This issue has been closed because it has not received response from the 
issue author.






[GitHub] [airflow] github-actions[bot] commented on issue #11202: Airflow flower: Connection reset

2021-11-09 Thread GitBox


github-actions[bot] commented on issue #11202:
URL: https://github.com/apache/airflow/issues/11202#issuecomment-964657332


   This issue has been automatically marked as stale because it has been open 
for 30 days with no response from the author. It will be closed in next 7 days 
if no further activity occurs from the issue author.






[GitHub] [airflow] kaxil commented on a change in pull request #19355: Ensure the example DAGs are all working

2021-11-09 Thread GitBox


kaxil commented on a change in pull request #19355:
URL: https://github.com/apache/airflow/pull/19355#discussion_r746113056



##
File path: airflow/example_dags/example_kubernetes_executor_config.py
##
@@ -25,147 +25,133 @@
 from airflow import DAG
 from airflow.decorators import task
 from airflow.example_dags.libs.helper import print_stuff
-from airflow.settings import AIRFLOW_HOME
 
 log = logging.getLogger(__name__)
 
+
 try:
 from kubernetes.client import models as k8s
-
-with DAG(
-dag_id='example_kubernetes_executor_config',
-schedule_interval=None,
-start_date=datetime(2021, 1, 1),
-catchup=False,
-tags=['example3'],
-) as dag:
-# You can use annotations on your kubernetes pods!
-start_task_executor_config = {
-"pod_override": 
k8s.V1Pod(metadata=k8s.V1ObjectMeta(annotations={"test": "annotation"}))
-}
-
-@task(executor_config=start_task_executor_config)
-def start_task():
-print_stuff()
-
-start_task = start_task()
-
-# [START task_with_volume]
-executor_config_volume_mount = {
-"pod_override": k8s.V1Pod(
-spec=k8s.V1PodSpec(
-containers=[
-k8s.V1Container(
-name="base",
-volume_mounts=[
-k8s.V1VolumeMount(mount_path="/foo/", 
name="example-kubernetes-test-volume")
-],
-)
-],
-volumes=[
-k8s.V1Volume(
-name="example-kubernetes-test-volume",
-host_path=k8s.V1HostPathVolumeSource(path="/tmp/"),
-)
-],
-)
-),
-}
-
-@task(executor_config=executor_config_volume_mount)
-def test_volume_mount():
-"""
-Tests whether the volume has been mounted.
-"""
-with open('/foo/volume_mount_test.txt', 'w') as foo:
-foo.write('Hello')
-
-return_code = os.system("cat /foo/volume_mount_test.txt")
-if return_code != 0:
-raise ValueError(f"Error when checking volume mount. Return 
code {return_code}")
-
-volume_task = test_volume_mount()
-# [END task_with_volume]
-
-# [START task_with_template]
-executor_config_template = {
-"pod_template_file": os.path.join(AIRFLOW_HOME, 
"pod_templates/basic_template.yaml"),
-"pod_override": 
k8s.V1Pod(metadata=k8s.V1ObjectMeta(labels={"release": "stable"})),
-}
-
-@task(executor_config=executor_config_template)
-def task_with_template():
-print_stuff()
-
-task_with_template = task_with_template()
-# [END task_with_template]
-
-# [START task_with_sidecar]
-executor_config_sidecar = {
-"pod_override": k8s.V1Pod(
-spec=k8s.V1PodSpec(
-containers=[
-k8s.V1Container(
-name="base",
-
volume_mounts=[k8s.V1VolumeMount(mount_path="/shared/", 
name="shared-empty-dir")],
-),
-k8s.V1Container(
-name="sidecar",
-image="ubuntu",
-args=["echo \"retrieved from mount\" > 
/shared/test.txt"],
-command=["bash", "-cx"],
-
volume_mounts=[k8s.V1VolumeMount(mount_path="/shared/", 
name="shared-empty-dir")],
-),
-],
-volumes=[
-k8s.V1Volume(name="shared-empty-dir", 
empty_dir=k8s.V1EmptyDirVolumeSource()),
-],
-)
-),
-}
-
-@task(executor_config=executor_config_sidecar)
-def test_sharedvolume_mount():
-"""
-Tests whether the volume has been mounted.
-"""
-for i in range(5):
-try:
-return_code = os.system("cat /shared/test.txt")
-if return_code != 0:
-raise ValueError(f"Error when checking volume mount. 
Return code {return_code}")
-except ValueError as e:
-if i > 4:
-raise e
-
-sidecar_task = test_sharedvolume_mount()
-# [END task_with_sidecar]
-
-# Test that we can add labels to pods
-executor_config_non_root = {
-"pod_override": 
k8s.V1Pod(metadata=k8s.V1ObjectMeta(labels={"release": "stable"}))
-}
-
-@task(executor_config=executor_config_non_root)
-def non_root_task():
-print_st

[GitHub] [airflow] potiuk edited a comment on pull request #19499: Align runner selection rule to ci.yml

2021-11-09 Thread GitBox


potiuk edited a comment on pull request #19499:
URL: https://github.com/apache/airflow/pull/19499#issuecomment-964619644


   Why would that be better? What do you want to achieve this way?
   No-one can restart any runners - this is how GitHub Actions works.
   Having build images run on self-hosted runners was a deliberate decision, so 
I wonder why you'd like to change it? 






[GitHub] [airflow] potiuk commented on pull request #19499: Align runner selection rule to ci.yml

2021-11-09 Thread GitBox


potiuk commented on pull request #19499:
URL: https://github.com/apache/airflow/pull/19499#issuecomment-964619644


   Why would that be better? What do you want to achieve this way?
   No-one can restart them - having build images run on self-hosted runners was 
a deliberate decision. 






[GitHub] [airflow] jedcunningham commented on a change in pull request #19355: Ensure the example DAGs are all working

2021-11-09 Thread GitBox


jedcunningham commented on a change in pull request #19355:
URL: https://github.com/apache/airflow/pull/19355#discussion_r746107308



##
File path: airflow/example_dags/example_kubernetes_executor_config.py
##
@@ -25,147 +25,133 @@
 from airflow import DAG
 from airflow.decorators import task
 from airflow.example_dags.libs.helper import print_stuff
-from airflow.settings import AIRFLOW_HOME
 
 log = logging.getLogger(__name__)
 
+
 try:
 from kubernetes.client import models as k8s
-
-with DAG(
-dag_id='example_kubernetes_executor_config',
-schedule_interval=None,
-start_date=datetime(2021, 1, 1),
-catchup=False,
-tags=['example3'],
-) as dag:
-# You can use annotations on your kubernetes pods!
-start_task_executor_config = {
-"pod_override": 
k8s.V1Pod(metadata=k8s.V1ObjectMeta(annotations={"test": "annotation"}))
-}
-
-@task(executor_config=start_task_executor_config)
-def start_task():
-print_stuff()
-
-start_task = start_task()
-
-# [START task_with_volume]
-executor_config_volume_mount = {
-"pod_override": k8s.V1Pod(
-spec=k8s.V1PodSpec(
-containers=[
-k8s.V1Container(
-name="base",
-volume_mounts=[
-k8s.V1VolumeMount(mount_path="/foo/", 
name="example-kubernetes-test-volume")
-],
-)
-],
-volumes=[
-k8s.V1Volume(
-name="example-kubernetes-test-volume",
-host_path=k8s.V1HostPathVolumeSource(path="/tmp/"),
-)
-],
-)
-),
-}
-
-@task(executor_config=executor_config_volume_mount)
-def test_volume_mount():
-"""
-Tests whether the volume has been mounted.
-"""
-with open('/foo/volume_mount_test.txt', 'w') as foo:
-foo.write('Hello')
-
-return_code = os.system("cat /foo/volume_mount_test.txt")
-if return_code != 0:
-raise ValueError(f"Error when checking volume mount. Return 
code {return_code}")
-
-volume_task = test_volume_mount()
-# [END task_with_volume]
-
-# [START task_with_template]
-executor_config_template = {
-"pod_template_file": os.path.join(AIRFLOW_HOME, 
"pod_templates/basic_template.yaml"),
-"pod_override": 
k8s.V1Pod(metadata=k8s.V1ObjectMeta(labels={"release": "stable"})),
-}
-
-@task(executor_config=executor_config_template)
-def task_with_template():
-print_stuff()
-
-task_with_template = task_with_template()
-# [END task_with_template]
-
-# [START task_with_sidecar]
-executor_config_sidecar = {
-"pod_override": k8s.V1Pod(
-spec=k8s.V1PodSpec(
-containers=[
-k8s.V1Container(
-name="base",
-
volume_mounts=[k8s.V1VolumeMount(mount_path="/shared/", 
name="shared-empty-dir")],
-),
-k8s.V1Container(
-name="sidecar",
-image="ubuntu",
-args=["echo \"retrieved from mount\" > 
/shared/test.txt"],
-command=["bash", "-cx"],
-
volume_mounts=[k8s.V1VolumeMount(mount_path="/shared/", 
name="shared-empty-dir")],
-),
-],
-volumes=[
-k8s.V1Volume(name="shared-empty-dir", 
empty_dir=k8s.V1EmptyDirVolumeSource()),
-],
-)
-),
-}
-
-@task(executor_config=executor_config_sidecar)
-def test_sharedvolume_mount():
-"""
-Tests whether the volume has been mounted.
-"""
-for i in range(5):
-try:
-return_code = os.system("cat /shared/test.txt")
-if return_code != 0:
-raise ValueError(f"Error when checking volume mount. 
Return code {return_code}")
-except ValueError as e:
-if i > 4:
-raise e
-
-sidecar_task = test_sharedvolume_mount()
-# [END task_with_sidecar]
-
-# Test that we can add labels to pods
-executor_config_non_root = {
-"pod_override": 
k8s.V1Pod(metadata=k8s.V1ObjectMeta(labels={"release": "stable"}))
-}
-
-@task(executor_config=executor_config_non_root)
-def non_root_task():
-

[GitHub] [airflow] SamWheating commented on a change in pull request #18557: Warning of public exposure of deployment in UI with on/off config

2021-11-09 Thread GitBox


SamWheating commented on a change in pull request #18557:
URL: https://github.com/apache/airflow/pull/18557#discussion_r746106955



##
File path: airflow/www/views.py
##
@@ -702,6 +702,22 @@ def _iter_parsed_moved_data_table_names():
 # Second segment is a version marker that we don't need to 
show.
 yield segments[2], table_name
 
+warn_deployment_query = session.query(Log).filter(Log.event == 
"robots").count()

Review comment:
   Could we also just query for events less than one week old? Something like:
   
   ```python
   robots_file_access_count = (
   session.query(Log)
   .filter(Log.event == "robots")
   .filter(Log.dttm > (utcnow() - timedelta(days=7)))
   .count()
   )
   ```








[GitHub] [airflow] SamWheating commented on a change in pull request #18557: Warning of public exposure of deployment in UI with on/off config

2021-11-09 Thread GitBox


SamWheating commented on a change in pull request #18557:
URL: https://github.com/apache/airflow/pull/18557#discussion_r746106955



##
File path: airflow/www/views.py
##
@@ -702,6 +702,22 @@ def _iter_parsed_moved_data_table_names():
 # Second segment is a version marker that we don't need to 
show.
 yield segments[2], table_name
 
+warn_deployment_query = session.query(Log).filter(Log.event == 
"robots").count()

Review comment:
   Could we also just query for events less than one week old? Something like:
   
   ```python
   warn_deployment_query = (
   session.query(Log)
   .filter(Log.event == "robots")
   .filter(Log.dttm > (utcnow() - timedelta(days=7)))
   .count()
   )
   ```
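The week-window filter suggested above can be sketched without Airflow or SQLAlchemy. The plain-Python version below models the same cutoff logic; the `events` records, field names, and function name are hypothetical stand-ins for rows of the `Log` table, not the actual webserver code:

```python
from datetime import datetime, timedelta, timezone

def count_recent_robots_events(events, now=None, window_days=7):
    """Count 'robots' events newer than `window_days` days (models the
    suggested ``Log.dttm > utcnow() - timedelta(days=7)`` filter)."""
    now = now or datetime.now(timezone.utc)
    cutoff = now - timedelta(days=window_days)
    return sum(1 for e in events if e["event"] == "robots" and e["dttm"] > cutoff)

now = datetime.now(timezone.utc)
events = [
    {"event": "robots", "dttm": now - timedelta(days=1)},    # inside the window
    {"event": "robots", "dttm": now - timedelta(days=30)},   # too old
    {"event": "trigger", "dttm": now - timedelta(days=1)},   # different event
]
print(count_recent_robots_events(events, now=now))  # prints 1
```

Note the timezone-aware datetimes: the comparison only behaves like the SQL filter if both sides use the same (UTC) reference, which is what Airflow's `utcnow()` provides.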








[GitHub] [airflow] kaxil commented on a change in pull request #19355: Ensure the example DAGs are all working

2021-11-09 Thread GitBox


kaxil commented on a change in pull request #19355:
URL: https://github.com/apache/airflow/pull/19355#discussion_r746106743



##
File path: airflow/example_dags/example_kubernetes_executor_config.py
##
@@ -25,147 +25,133 @@
 from airflow import DAG
 from airflow.decorators import task
 from airflow.example_dags.libs.helper import print_stuff
-from airflow.settings import AIRFLOW_HOME
 
 log = logging.getLogger(__name__)
 
+
 try:
 from kubernetes.client import models as k8s
-
-with DAG(
-dag_id='example_kubernetes_executor_config',
-schedule_interval=None,
-start_date=datetime(2021, 1, 1),
-catchup=False,
-tags=['example3'],
-) as dag:
-# You can use annotations on your kubernetes pods!
-start_task_executor_config = {
-"pod_override": 
k8s.V1Pod(metadata=k8s.V1ObjectMeta(annotations={"test": "annotation"}))
-}
-
-@task(executor_config=start_task_executor_config)
-def start_task():
-print_stuff()
-
-start_task = start_task()
-
-# [START task_with_volume]
-executor_config_volume_mount = {
-"pod_override": k8s.V1Pod(
-spec=k8s.V1PodSpec(
-containers=[
-k8s.V1Container(
-name="base",
-volume_mounts=[
-k8s.V1VolumeMount(mount_path="/foo/", 
name="example-kubernetes-test-volume")
-],
-)
-],
-volumes=[
-k8s.V1Volume(
-name="example-kubernetes-test-volume",
-host_path=k8s.V1HostPathVolumeSource(path="/tmp/"),
-)
-],
-)
-),
-}
-
-@task(executor_config=executor_config_volume_mount)
-def test_volume_mount():
-"""
-Tests whether the volume has been mounted.
-"""
-with open('/foo/volume_mount_test.txt', 'w') as foo:
-foo.write('Hello')
-
-return_code = os.system("cat /foo/volume_mount_test.txt")
-if return_code != 0:
-raise ValueError(f"Error when checking volume mount. Return 
code {return_code}")
-
-volume_task = test_volume_mount()
-# [END task_with_volume]
-
-# [START task_with_template]
-executor_config_template = {
-"pod_template_file": os.path.join(AIRFLOW_HOME, 
"pod_templates/basic_template.yaml"),
-"pod_override": 
k8s.V1Pod(metadata=k8s.V1ObjectMeta(labels={"release": "stable"})),
-}
-
-@task(executor_config=executor_config_template)
-def task_with_template():
-print_stuff()
-
-task_with_template = task_with_template()
-# [END task_with_template]
-
-# [START task_with_sidecar]
-executor_config_sidecar = {
-"pod_override": k8s.V1Pod(
-spec=k8s.V1PodSpec(
-containers=[
-k8s.V1Container(
-name="base",
-
volume_mounts=[k8s.V1VolumeMount(mount_path="/shared/", 
name="shared-empty-dir")],
-),
-k8s.V1Container(
-name="sidecar",
-image="ubuntu",
-args=["echo \"retrieved from mount\" > 
/shared/test.txt"],
-command=["bash", "-cx"],
-
volume_mounts=[k8s.V1VolumeMount(mount_path="/shared/", 
name="shared-empty-dir")],
-),
-],
-volumes=[
-k8s.V1Volume(name="shared-empty-dir", 
empty_dir=k8s.V1EmptyDirVolumeSource()),
-],
-)
-),
-}
-
-@task(executor_config=executor_config_sidecar)
-def test_sharedvolume_mount():
-"""
-Tests whether the volume has been mounted.
-"""
-for i in range(5):
-try:
-return_code = os.system("cat /shared/test.txt")
-if return_code != 0:
-raise ValueError(f"Error when checking volume mount. 
Return code {return_code}")
-except ValueError as e:
-if i > 4:
-raise e
-
-sidecar_task = test_sharedvolume_mount()
-# [END task_with_sidecar]
-
-# Test that we can add labels to pods
-executor_config_non_root = {
-"pod_override": 
k8s.V1Pod(metadata=k8s.V1ObjectMeta(labels={"release": "stable"}))
-}
-
-@task(executor_config=executor_config_non_root)
-def non_root_task():
-print_st

[GitHub] [airflow] SamWheating commented on a change in pull request #18557: Warning of public exposure of deployment in UI with on/off config

2021-11-09 Thread GitBox


SamWheating commented on a change in pull request #18557:
URL: https://github.com/apache/airflow/pull/18557#discussion_r745083002



##
File path: airflow/www/views.py
##
@@ -702,6 +702,22 @@ def _iter_parsed_moved_data_table_names():
 # Second segment is a version marker that we don't need to 
show.
 yield segments[2], table_name
 
+warn_deployment_query = session.query(Log).filter(Log.event == 
"robots").count()
+if (
+permissions.ACTION_CAN_ACCESS_MENU,
+permissions.RESOURCE_ADMIN_MENU,
+) in user_permissions and warn_deployment_query > 0:
+flash(
+Markup(
+'Recent requests have been made to /robots.txt. '
+'This indicates that this deployment may be accessible to 
the public internet. '
+'This warning can be disabled by setting 
webserver.warn_deployment_exposure=False in '

Review comment:
   I don't think this warning can be disabled? I don't see the value of 
`warn_deployment_exposure` actually being referenced anywhere other than the 
config. 








[GitHub] [airflow] khalidmammadov opened a new pull request #19499: Align runner selection rule to ci.yml

2021-11-09 Thread GitBox


khalidmammadov opened a new pull request #19499:
URL: https://github.com/apache/airflow/pull/19499


   This is to align the runner selection rules with ci.yml so builds for 
non-contributors run on hosted runners rather than on self-hosted ones. 
   Non-contributors don't have access to self-hosted runners and can't 
influence them by any means (e.g. restart, increase, etc.).
   This will make the "build-image" job run on ubuntu, independent of the 
self-hosted runners.
   
   ---
   **^ Add meaningful description above**
   
   Read the **[Pull Request 
Guidelines](https://github.com/apache/airflow/blob/main/CONTRIBUTING.rst#pull-request-guidelines)**
 for more information.
   In case of fundamental code change, Airflow Improvement Proposal 
([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals))
 is needed.
   In case of a new dependency, check compliance with the [ASF 3rd Party 
License Policy](https://www.apache.org/legal/resolved.html#category-x).
   In case of backwards incompatible changes please leave a note in 
[UPDATING.md](https://github.com/apache/airflow/blob/main/UPDATING.md).
   






[airflow] annotated tag 2.2.2rc1 updated (eaead7d -> 4f9704b)

2021-11-09 Thread jedcunningham
This is an automated email from the ASF dual-hosted git repository.

jedcunningham pushed a change to annotated tag 2.2.2rc1
in repository https://gitbox.apache.org/repos/asf/airflow.git.


*** WARNING: tag 2.2.2rc1 was modified! ***

from eaead7d  (commit)
  to 4f9704b  (tag)
 tagging eaead7d721c743cc7ea2d826c11bc65b87b1121e (commit)
 replaces 2.2.1
  by Jed Cunningham
  on Tue Nov 9 14:38:49 2021 -0700

- Log -
Apache Airflow 2.2.2rc1
-BEGIN PGP SIGNATURE-

iQIzBAABCAAdFiEEceQtGocRZ+vS1BcnvVKfeQM9slAFAmGK6mkACgkQvVKfeQM9
slBb1Q//V3Su8bEHDcrzZjcI6ckqsl43I+6OtVskpgeaUdAwAGAJS+w3+OuKOzCj
xo2GHmeJrRFF58Nng0LGtXJXqpl+lf94j7NXZ7kJi19UVMz3lYpSA9h8N0c3xyVe
V9aZlvKOL2EYoLM563jFyF9HaDg/D61OFI8h6gAIpUbLUiOsfbeKqwsl5WvdQN4K
WGgpfKrbZLfPSnyZX0gn9S+jb05nCXoK/msReataWmJAjEP+d+TzEyXxBFyX2H1r
33VFiEiDJrw6VMuWgNXWOioYbqRVP/cqYY7lFNvpFbcCctDyekQzzJgiT+g1NxMG
ECdrz7zZwRU2o//tejHFxoLuWPDVFHJVbizMCBBs3VCJUCHMELXVPnVPcWgtbFJH
9KTywjLtBSeswXCsjki2D0gD3fBUBedIoe/TSSl/J3Wp/C3forcgiACQjg+2Oe2Q
D5+SWGrxzwSXETZjCwFRt9rCRmjeRYzq+47F6zLxtT4588OxwazydZAds9N0antE
dLiueKk/cyYrdx+SSDu8H7l3aWTMcCb9ZnD0Iew2zyvZq/qwIHUMTX1iDkzO/PJs
ggAuoOBEiIqXAAkQWi+XFYvZ6urhtjHHIme10yJlZMMvw8kWZXQAt18ajA1McISv
liwgeY072csndQjBXjS5IH/v72ZBNmL73KZG4jfMJCZHrAADz6I=
=sEeV
-END PGP SIGNATURE-
---


No new revisions were added by this update.

Summary of changes:


svn commit: r50857 - /dev/airflow/2.2.2rc1/

2021-11-09 Thread jedcunningham
Author: jedcunningham
Date: Tue Nov  9 22:06:40 2021
New Revision: 50857

Log:
Add artifacts for Airflow 2.2.2rc1

Added:
dev/airflow/2.2.2rc1/
dev/airflow/2.2.2rc1/apache-airflow-2.2.2-source.tar.gz   (with props)
dev/airflow/2.2.2rc1/apache-airflow-2.2.2-source.tar.gz.asc
dev/airflow/2.2.2rc1/apache-airflow-2.2.2-source.tar.gz.sha512
dev/airflow/2.2.2rc1/apache-airflow-2.2.2.tar.gz   (with props)
dev/airflow/2.2.2rc1/apache-airflow-2.2.2.tar.gz.asc
dev/airflow/2.2.2rc1/apache-airflow-2.2.2.tar.gz.sha512
dev/airflow/2.2.2rc1/apache_airflow-2.2.2-py3-none-any.whl   (with props)
dev/airflow/2.2.2rc1/apache_airflow-2.2.2-py3-none-any.whl.asc
dev/airflow/2.2.2rc1/apache_airflow-2.2.2-py3-none-any.whl.sha512

Added: dev/airflow/2.2.2rc1/apache-airflow-2.2.2-source.tar.gz
==
Binary file - no diff available.

Propchange: dev/airflow/2.2.2rc1/apache-airflow-2.2.2-source.tar.gz
--
svn:mime-type = application/octet-stream

Added: dev/airflow/2.2.2rc1/apache-airflow-2.2.2-source.tar.gz.asc
==
--- dev/airflow/2.2.2rc1/apache-airflow-2.2.2-source.tar.gz.asc (added)
+++ dev/airflow/2.2.2rc1/apache-airflow-2.2.2-source.tar.gz.asc Tue Nov  9 
22:06:40 2021
@@ -0,0 +1,11 @@
+-BEGIN PGP SIGNATURE-
+
+iQFJBAABCAAzFiEEzeFcbk06jsTs9LpLZnTgitfeQG8FAmGK74sVHGtheGlsbmFp
+a0BhcGFjaGUub3JnAAoJEGZ04IrX3kBvL+IIAIqb/Db+17r0K+qRXWP8VQi1CCAG
+uoICkc/aFr06DluN5ifx9JwWfzHFvgzeyZTcXW567YwmCbCgRP6DPPlgzFGXxZUj
+jrLIKp3o6TZQdynaKkz+CG2CZBmbbQju8yMMKQYluiRCzNItyUb5wmC9gm8oBr3O
++fa1KrvM8hnNmcB/wcPe0ncD+LXSbbq2zgRji6eDJltdSJO1J1vxTzvrByfZnr59
+kDdYBWzt6JOaJLfDTqkDSZJIvesAhwQdnxE0uIymgo35A4HI1xyi8Yq25PR+cR9s
+LPECZZdPi+gsOqeiJF1afKVx/rAscVUtbxdWrChzvgiR3FQVc2RLJf1d79M=
+=v2FP
+-END PGP SIGNATURE-

Added: dev/airflow/2.2.2rc1/apache-airflow-2.2.2-source.tar.gz.sha512
==
--- dev/airflow/2.2.2rc1/apache-airflow-2.2.2-source.tar.gz.sha512 (added)
+++ dev/airflow/2.2.2rc1/apache-airflow-2.2.2-source.tar.gz.sha512 Tue Nov  9 
22:06:40 2021
@@ -0,0 +1 @@
+a6d3da0e193f27c6196def9b121d4c830db4ee0615f01710d4dccb2978e8b491ff73e0baeede687678f1f88519c85bdb9801457c6cbc366f978f6eb79b63625f
  apache-airflow-2.2.2-source.tar.gz

Added: dev/airflow/2.2.2rc1/apache-airflow-2.2.2.tar.gz
==
Binary file - no diff available.

Propchange: dev/airflow/2.2.2rc1/apache-airflow-2.2.2.tar.gz
--
svn:mime-type = application/octet-stream

Added: dev/airflow/2.2.2rc1/apache-airflow-2.2.2.tar.gz.asc
==
--- dev/airflow/2.2.2rc1/apache-airflow-2.2.2.tar.gz.asc (added)
+++ dev/airflow/2.2.2rc1/apache-airflow-2.2.2.tar.gz.asc Tue Nov  9 22:06:40 
2021
@@ -0,0 +1,11 @@
+-BEGIN PGP SIGNATURE-
+
+iQFJBAABCAAzFiEEzeFcbk06jsTs9LpLZnTgitfeQG8FAmGK74wVHGtheGlsbmFp
+a0BhcGFjaGUub3JnAAoJEGZ04IrX3kBvCwgIALKy/YQerAJWH+U3P79YgkmHF8Q9
+valKFgZqbAoVBOvBpTFWfl47CDIqEZIItReN31ihTbMc/ov4tMwX1cvEGN9G0ihu
+FAfKfr5qjM51CxW2dZQOzMqb/536wuWd9W8BA15oBsQB3RBno7VAKsPBTEdMEyYB
+1Hd5ZTFss1OEAWYGyAXWQW3DwCbvVEueasvPlzf5m4YtDDh9ztuQViKX1L7py9JE
+1i3h2y1kmB3kcVJOR2UjxpsijFW9+y/7rG6JNUph11RtxY3+OgTKCnqAgUZ1lRp0
+LwfVCPly8G4lDCYRyVBoFGkR8ANGI8lZgTy13m53LzV/IdG0F0jS4JG2+mw=
+=YPtV
+-END PGP SIGNATURE-

Added: dev/airflow/2.2.2rc1/apache-airflow-2.2.2.tar.gz.sha512
==
--- dev/airflow/2.2.2rc1/apache-airflow-2.2.2.tar.gz.sha512 (added)
+++ dev/airflow/2.2.2rc1/apache-airflow-2.2.2.tar.gz.sha512 Tue Nov  9 22:06:40 
2021
@@ -0,0 +1 @@
+82aeed340faf0c5cc768df4d5153c7527103cc40c3e325fcd6ab9146436b81f25ef93e3b0b5398735f233ccc80d196b73196136e526ec8dd31228a3aa89972b3
  apache-airflow-2.2.2.tar.gz

Added: dev/airflow/2.2.2rc1/apache_airflow-2.2.2-py3-none-any.whl
==
Binary file - no diff available.

Propchange: dev/airflow/2.2.2rc1/apache_airflow-2.2.2-py3-none-any.whl
--
svn:mime-type = application/octet-stream

Added: dev/airflow/2.2.2rc1/apache_airflow-2.2.2-py3-none-any.whl.asc
==
--- dev/airflow/2.2.2rc1/apache_airflow-2.2.2-py3-none-any.whl.asc (added)
+++ dev/airflow/2.2.2rc1/apache_airflow-2.2.2-py3-none-any.whl.asc Tue Nov  9 
22:06:40 2021
@@ -0,0 +1,11 @@
+-BEGIN PGP SIGNATURE-
+
+iQFJBAABCAAzFiEEzeFcbk06jsTs9LpLZnTgitfeQG8FAmGK74wVHGtheGlsbmFp
+a0BhcGFjaGUub3JnAAoJEGZ04IrX3kBv7eMIAKT6YdFZADBmi/qUVELf

[airflow] annotated tag constraints-2.2.2rc1 updated (b67ed92 -> 8948aa5)

2021-11-09 Thread jedcunningham
This is an automated email from the ASF dual-hosted git repository.

jedcunningham pushed a change to annotated tag constraints-2.2.2rc1
in repository https://gitbox.apache.org/repos/asf/airflow.git.


*** WARNING: tag constraints-2.2.2rc1 was modified! ***

from b67ed92  (commit)
  to 8948aa5  (tag)
 tagging b67ed92a1831fa32e1d2ed03fec9c47334ef4209 (commit)
 replaces constraints-2.2.1
  by Jed Cunningham
  on Tue Nov 9 14:58:15 2021 -0700

- Log -
Constraints for Apache Airflow 2.2.2rc1
-BEGIN PGP SIGNATURE-

iQIzBAABCAAdFiEEceQtGocRZ+vS1BcnvVKfeQM9slAFAmGK7vcACgkQvVKfeQM9
slDxMg/7B9d5a0f3a6N+ZDcPow3ygTA1nmMjphpwhi8L5ZzGV+Y60gjJz0xFk7xe
aupoPbpeNnCS9wZhZ0Lwl1Oi/eNoWj9hpgaMXctRoHYItikbxFEo/tUqG9R7/niZ
JoZui0nauU1me8rLdIcWnGZpV2N3XgHXoJbAO883/P1YyIVKRvOpj1j7ritCcUgG
xohDFgkKG7WZ8C700mVt91a0wsrr+EIBgVafbxDQwNni3jK+ESMBgissdPwdk0EV
HZ1cKCgMk/aRY4ecgNTPJRoIq0pOQxdNWywh1oy4ObtfrxGubg39iwYt9JL6T8M0
RVfWecoCE+67E0Y0cMyzWVRpohZgBtfB7MhxinhIUGNd6Gu1lV9dj4HhlzvUO0nA
8H9Af2cadocboWX/J+BFSRKK+ML7lHYpACxutuRBXPJj5QJXp+oeJFgJBljMs0Id
47iJY07Ezxap6hs9nDeXXkhJd5MrWBCfpnSeRwXyQVly0mALFaFEDSfv9AbGs0ot
8R8I8U9p7QyptHhAeSh1fIifz/LUFz3Ly+7RsR1BmhfRb1wTQqx9ox3GwwXsrtBB
wVTIXX5gO5h2tXViu1CXvGXMFGSukRQcdGg5ylZkd7L14c5P/4kt6EckBoFEdw79
SzeES8EWjwfUsOI3N3A8l9bskOQESXs6iEWHJmqQXotWFEXhg7s=
=qYZ4
-END PGP SIGNATURE-
---


No new revisions were added by this update.

Summary of changes:


[GitHub] [airflow] uranusjr commented on pull request #19410: Remove inaccurate execution date from triggered dag extra link

2021-11-09 Thread GitBox


uranusjr commented on pull request #19410:
URL: https://github.com/apache/airflow/pull/19410#issuecomment-964583517


   > it just times out at 68% completion. Any ideas on how I can move forward 
here?
   
   Don't worry about those, the CI runner timed out.






[GitHub] [airflow] eladkal commented on a change in pull request #17421: Add ShortCircuitOperator configurability for respecting downstream trigger rules

2021-11-09 Thread GitBox


eladkal commented on a change in pull request #17421:
URL: https://github.com/apache/airflow/pull/17421#discussion_r746071223



##
File path: airflow/operators/python.py
##
@@ -219,17 +219,31 @@ def execute(self, context: Dict):
 
 class ShortCircuitOperator(PythonOperator, SkipMixin):
 """
-Allows a workflow to continue only if a condition is met. Otherwise, the
-workflow "short-circuits" and downstream tasks are skipped.
-
-The ShortCircuitOperator is derived from the PythonOperator. It evaluates a
-condition and short-circuits the workflow if the condition is False. Any
-downstream tasks are marked with a state of "skipped". If the condition is
-True, downstream tasks proceed as normal.
-
-The condition is determined by the result of `python_callable`.
+Allows a workflow to continue only if a condition is met. Otherwise, the 
workflow "short-circuits" and
+downstream tasks are skipped. The short-circuiting can be configured to 
either respect or ignore the
+``trigger_rule`` set for downstream tasks. If 
``ignore_downstream_trigger_rules`` is set to True, the
+default setting, all downstream tasks are skipped without considering the 
``trigger_rule`` defined for
+tasks.  However, if this parameter is set to False, the direct, downstream 
tasks are skipped but the
+specified ``trigger_rule`` for other subsequent downstream tasks are 
respected. In this mode,
+the operator assumes the direct, downstream tasks were purposely meant to 
be skipped but perhaps
+not other subsequent tasks.
+
+The ShortCircuitOperator is derived from the PythonOperator. It evaluates 
a condition and short-circuits
+the workflow if the condition is False. Any downstream tasks are marked 
with a state of "skipped" based
+on the short-circuiting mode configured. If the condition is True, 
downstream tasks proceed as normal.
+
+The condition is determined by the result of ``python_callable``.
+
+:param ignore_downstream_trigger_rules: If set to True, all downstream 
tasks from this operator task will
+be skipped. This is the default behavior. If set to False, the direct, 
downstream task(s) will be
+skipped but the ``trigger_rule`` defined for other downstream tasks 
will be respected.
+:type ignore_downstream_trigger_rules: bool

Review comment:
   Probably also worth mentioning this in the operator docs?
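The two short-circuiting modes described in the docstring above can be modeled with a small stand-alone sketch. This is not Airflow code: the task ids, the `downstream` structure, and the function name are hypothetical, and real Airflow evaluates each task's `trigger_rule` individually rather than using a boolean "direct child" flag:

```python
def tasks_to_skip(condition, downstream, ignore_downstream_trigger_rules=True):
    """Return the task ids a short-circuit would mark as skipped.

    `downstream` is an ordered list of (task_id, is_direct_child) pairs.
    """
    if condition:
        # Condition is True: nothing is short-circuited.
        return []
    if ignore_downstream_trigger_rules:
        # Default mode: every downstream task is skipped outright.
        return [task_id for task_id, _ in downstream]
    # Trigger-rule-respecting mode: only direct children are skipped; the
    # rest get to evaluate their own trigger_rule.
    return [task_id for task_id, direct in downstream if direct]

chain = [("direct_task", True), ("later_task", False)]
print(tasks_to_skip(False, chain))                                         # both skipped
print(tasks_to_skip(False, chain, ignore_downstream_trigger_rules=False))  # direct only
```

Under the `False` setting, `later_task` is left to its own `trigger_rule` (e.g. `none_failed_min_one_success`), which is the behavior the PR's docstring describes.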








[airflow] branch v2-2-test updated (c966a5e -> eaead7d)

2021-11-09 Thread kaxilnaik
This is an automated email from the ASF dual-hosted git repository.

kaxilnaik pushed a change to branch v2-2-test
in repository https://gitbox.apache.org/repos/asf/airflow.git.


omit c966a5e  Fix 2.2.2 release date in changelog
 add eaead7d  Fix 2.2.2 release date in changelog (#19498)

This update added new revisions after undoing existing revisions.
That is to say, some revisions that were in the old version of the
branch are not in the new version.  This situation occurs
when a user --force pushes a change and generates a repository
containing something like this:

 * -- * -- B -- O -- O -- O   (c966a5e)
\
 N -- N -- N   refs/heads/v2-2-test (eaead7d)

You should already have received notification emails for all of the O
revisions, and so the following emails describe only the N revisions
from the common base, B.

Any revisions marked "omit" are not gone; other references still
refer to them.  Any revisions marked "discard" are gone forever.

No new revisions were added by this update.

Summary of changes:


[airflow] branch main updated (d18e2b0 -> 5786340)

2021-11-09 Thread eladkal
This is an automated email from the ASF dual-hosted git repository.

eladkal pushed a change to branch main
in repository https://gitbox.apache.org/repos/asf/airflow.git.


from d18e2b0  ``KubernetesExecutor`` should default to template image if 
used (#19484)
 add 5786340  Minor grammar and sentence flow corrections in pip 
installation docs (#19468)

No new revisions were added by this update.

Summary of changes:
 docs/apache-airflow/installation/installing-from-pypi.rst | 12 ++--
 1 file changed, 6 insertions(+), 6 deletions(-)


[GitHub] [airflow] boring-cyborg[bot] commented on pull request #19468: Minor grammar and sentence flow corrections in pip installation docs

2021-11-09 Thread GitBox


boring-cyborg[bot] commented on pull request #19468:
URL: https://github.com/apache/airflow/pull/19468#issuecomment-964570500


   Awesome work, congrats on your first merged pull request!
   






[GitHub] [airflow] eladkal merged pull request #19468: Minor grammar and sentence flow corrections in pip installation docs

2021-11-09 Thread GitBox


eladkal merged pull request #19468:
URL: https://github.com/apache/airflow/pull/19468


   






[airflow] branch v2-2-stable updated: Fix 2.2.2 release date in changelog (#19498)

2021-11-09 Thread kaxilnaik
This is an automated email from the ASF dual-hosted git repository.

kaxilnaik pushed a commit to branch v2-2-stable
in repository https://gitbox.apache.org/repos/asf/airflow.git


The following commit(s) were added to refs/heads/v2-2-stable by this push:
 new eaead7d  Fix 2.2.2 release date in changelog (#19498)
eaead7d is described below

commit eaead7d721c743cc7ea2d826c11bc65b87b1121e
Author: Jed Cunningham <66968678+jedcunning...@users.noreply.github.com>
AuthorDate: Tue Nov 9 14:35:50 2021 -0700

Fix 2.2.2 release date in changelog (#19498)
---
 CHANGELOG.txt | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/CHANGELOG.txt b/CHANGELOG.txt
index 241be38..cd45331 100644
--- a/CHANGELOG.txt
+++ b/CHANGELOG.txt
@@ -1,4 +1,4 @@
-Airflow 2.2.2, 2021-11-11
+Airflow 2.2.2, 2021-11-12
 -
 
 Bug Fixes


[GitHub] [airflow] kaxil merged pull request #19498: Sync `v2-2-stable` with `v2-2-test` to release `2.2.2`

2021-11-09 Thread GitBox


kaxil merged pull request #19498:
URL: https://github.com/apache/airflow/pull/19498


   






[GitHub] [airflow] jedcunningham opened a new pull request #19498: Sync `v2-2-stable` with `v2-2-test` to release `2.2.2`

2021-11-09 Thread GitBox


jedcunningham opened a new pull request #19498:
URL: https://github.com/apache/airflow/pull/19498


   Missed updating the release date 🤦‍♂️






[airflow] branch v2-2-test updated: Fix 2.2.2 release date in changelog

2021-11-09 Thread jedcunningham
This is an automated email from the ASF dual-hosted git repository.

jedcunningham pushed a commit to branch v2-2-test
in repository https://gitbox.apache.org/repos/asf/airflow.git


The following commit(s) were added to refs/heads/v2-2-test by this push:
 new c966a5e  Fix 2.2.2 release date in changelog
c966a5e is described below

commit c966a5e75367ebaac70ed1b5f9b07c47f9aa27f4
Author: Jed Cunningham <66968678+jedcunning...@users.noreply.github.com>
AuthorDate: Tue Nov 9 14:29:05 2021 -0700

Fix 2.2.2 release date in changelog
---
 CHANGELOG.txt | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/CHANGELOG.txt b/CHANGELOG.txt
index 241be38..cd45331 100644
--- a/CHANGELOG.txt
+++ b/CHANGELOG.txt
@@ -1,4 +1,4 @@
-Airflow 2.2.2, 2021-11-11
+Airflow 2.2.2, 2021-11-12
 -
 
 Bug Fixes


[GitHub] [airflow] josh-fell commented on a change in pull request #19497: Add "access_key" to DEFAULT_SENSITIVE_FIELDS

2021-11-09 Thread GitBox


josh-fell commented on a change in pull request #19497:
URL: https://github.com/apache/airflow/pull/19497#discussion_r746055063



##
File path: airflow/utils/log/secrets_masker.py
##
@@ -34,6 +34,7 @@
 DEFAULT_SENSITIVE_FIELDS = frozenset(
 {
 'access_token',
+'access_key',

Review comment:
   Thinking if "key" should be added? This would be the 4th entry with 
"key" in the name and generic "secret" and "token" are also in this set. But 
perhaps it is _too_ general of a term.
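The trade-off in this review comment hinges on how the masker matches names. As a rough illustration (this is a hedged sketch, not Airflow's actual `secrets_masker` code), sensitive-field matching can be modeled as a case-insensitive substring check against a set of words:

```python
# Illustrative subset of sensitive-field names, modeled on the set under review.
DEFAULT_SENSITIVE_FIELDS = frozenset(
    {"access_token", "access_key", "api_key", "password", "secret", "token"}
)


def is_sensitive(field_name: str) -> bool:
    """Return True if any sensitive word occurs inside the field name."""
    name = field_name.strip().lower()
    return any(word in name for word in DEFAULT_SENSITIVE_FIELDS)
```

Under this model, adding `access_key` catches extras such as `extra__wasb__shared_access_key`, while a bare `key` entry would also match harmless names like `partition_key` — the over-matching concern the comment raises.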








[GitHub] [airflow] josh-fell commented on a change in pull request #19137: Add RedshiftDataHook

2021-11-09 Thread GitBox


josh-fell commented on a change in pull request #19137:
URL: https://github.com/apache/airflow/pull/19137#discussion_r744830614



##
File path: airflow/providers/amazon/aws/operators/redshift_data.py
##
@@ -0,0 +1,127 @@
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+from time import sleep
+from typing import Optional
+
+from airflow.exceptions import AirflowException
+from airflow.models import BaseOperator
+from airflow.providers.amazon.aws.hooks.redshift_data import RedshiftDataHook
+
+
+class RedshiftDataOperator(BaseOperator):
+"""
+Executes SQL Statements against an Amazon Redshift cluster using Redshift 
Data
+
+.. seealso::
+For more information on how to use this operator, take a look at the 
guide:
+:ref:`howto/operator:RedshiftDataOperator`
+
+:param sql: the sql code to be executed
+:type sql: Can receive a str representing a sql statement,
+or an iterable of str (sql statements)
+:param aws_conn_id: AWS connection id (default: aws_default)
+:type aws_conn_id: str
+:param parameters: (optional) the parameters to render the SQL query with.
+:type parameters: dict or iterable
+:param autocommit: if True, each command is automatically committed.
+(default value: False)
+:type autocommit: bool
+"""
+
+template_fields = ('sql',)

Review comment:
   Any other fields to add here that users might want to have flexibility 
to dynamically generate a value for? Maybe `parameters`, `cluster_identifier`, 
`database`, and/or `db_user`. The latter 3 could maybe be added to a Connection 
Extra and accessed with Jinja now that the `Connection` object is accessible in 
the template context like `"{{ conn.conn_id.extra_dejson. }}"` when 
calling the operator. No strong opinions though just something to think about.
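The `extra_dejson` pattern the reviewer mentions parses the connection's JSON `extra` blob. A standalone sketch of that access pattern (the `Connection` class here is an illustrative stand-in, not Airflow's model, and the field names are hypothetical):

```python
import json


class Connection:
    """Illustrative stand-in for a connection with a JSON 'extra' blob."""

    def __init__(self, conn_id: str, extra: str = ""):
        self.conn_id = conn_id
        self.extra = extra

    @property
    def extra_dejson(self) -> dict:
        # Deserialize the JSON "extra" field; an empty extra becomes {}.
        return json.loads(self.extra) if self.extra else {}


conn = Connection(
    "redshift_default",
    extra='{"cluster_identifier": "my-cluster", "database": "dev", "db_user": "analyst"}',
)
# A templated operator field referencing the connection's extras
# would resolve to a per-connection value like this:
db_user = conn.extra_dejson["db_user"]
```

This is why exposing `cluster_identifier`, `database`, or `db_user` via Connection Extras plus templating could substitute for adding them to `template_fields`.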








[airflow] branch v2-2-stable updated (360474f -> 1a790a6)

2021-11-09 Thread jedcunningham
This is an automated email from the ASF dual-hosted git repository.

jedcunningham pushed a change to branch v2-2-stable
in repository https://gitbox.apache.org/repos/asf/airflow.git.


from 360474f  Update changelog for 2.2.1rc2
 add 848ac3f  Bump version to 2.2.2
 add 8666bf0  Add explicit session parameter in PoolSlotsAvailableDep 
(#18875)
 add 46bf6b4  Fix typo in ``tutorial.rst`` (#18983)
 add 901901a  Use ``execution_date`` to check for existing ``DagRun`` for 
``TriggerDagRunOperator`` (#18968)
 add 9a60e62  Add Note to SLA regarding schedule_interval (#19173)
 add 7a3212d  Fix Toggle Wrap on DAG code page (#19211)
 add d291f76  sqlite_default has been hard-coded to /tmp, use gettempdir 
instead (#19255)
 add 7a14324  Fix hidden tooltip position (#19261)
 add 44caa7e  Fix MySQL db migration with default encoding/collation 
(#19268)
 add 06c1cea  Bugfix: Check next run exists before reading data interval 
(#19307)
 add f9b48bb  Switch default Python version to 3.7 (#19317)
 add a0934d1  Clarify dag-not-found error message (#19338)
 add fbb7fbd  Improve Kubernetes Executor docs (#19339)
 add ee532d9  Docs: Fix typo in ``dag-run.rst`` (#19340)
 add 34768a8  Fix message on "Mark as" confirmation page (#19363)
 add 9b01467  Only mark SchedulerJobs as failed, not any jobs (#19375)
 add 8151307  Fix downgrade for a DB Migration (#19390)
 add 8dc9541  Task should fail immediately when pod is unprocessable 
(#19359)
 add 36c7308  Check if job object is None before calling .is_alive() 
(#19380)
 add 75f1d2a  Add missing parameter documentation for "timetable" (#19282)
 add 157a864  Fix serialization of Params with set data type (#19267)
 add ea0c4bd  Fix task instance modal in gantt view (#19258)
 add d478f93  Fix moving of dangling TaskInstance rows for SQL Server 
(#19425)
 add 3e8782a  Fix Serialization when ``relativedelta`` is passed as 
``schedule_interval`` (#19418)
 add bef01d9  Fix bug when checking for existence of a Variable (#19395)
 add e9dffdd  FAB still requires WTForms < 3.0 (#19466)
 add 4b921da  Restored proper default branch and constraint branch
 add e5e2b5f  Update image used in docker docs
 add 43c9730  Fix whitespace error causing failing graphviz test (#19472)
 add 67af807  Disable React UI tests for non-main
 add 95c9505  Fix failing static check (#18890)
 add 562e7d2  Fix failing static check (#18891)
 add 1a790a6  Add 2.2.2 to `CHANGELOG` and `UPDATING`

No new revisions were added by this update.

Summary of changes:
 .github/workflows/ci.yml   |   4 +-
 .pre-commit-config.yaml|   1 +
 CHANGELOG.txt  |  34 +
 README.md  |  14 +-
 UPDATING.md|   5 +
 airflow/api/common/experimental/trigger_dag.py |   6 +-
 airflow/cli/commands/standalone_command.py |   5 +-
 airflow/dag_processing/processor.py|  25 ++--
 airflow/executors/kubernetes_executor.py   |  10 +-
 airflow/jobs/scheduler_job.py  |   1 +
 .../7b2661a43ba3_taskinstance_keyed_to_dagrun.py   |   9 +-
 airflow/models/dag.py  |  16 +-
 airflow/models/dagrun.py   |  65 +---
 airflow/models/param.py|   4 +-
 airflow/models/variable.py |   3 +-
 airflow/serialization/schema.json  |  22 ++-
 airflow/serialization/serialized_objects.py|  50 +--
 airflow/ti_deps/deps/pool_slots_available_dep.py   |   2 +-
 airflow/timetables/interval.py |   5 +-
 airflow/utils/cli.py   |   6 +-
 airflow/utils/db.py|  81 +++---
 airflow/www/security.py|   2 +-
 airflow/www/static/css/main.css|   9 ++
 airflow/www/static/js/dag_code.js  |  12 +-
 airflow/www/templates/airflow/_messages.html   |   4 +-
 airflow/www/templates/airflow/dags.html|  10 +-
 airflow/www/templates/airflow/main.html|   6 +-
 airflow/www/templates/appbuilder/flash.html|   2 +-
 airflow/www/views.py   |   2 +
 docs/apache-airflow/concepts/tasks.rst |   4 +
 docs/apache-airflow/dag-run.rst|   4 +-
 docs/apache-airflow/executor/kubernetes.rst| 166 +
 docs/apache-airflow/tutorial.rst   |   2 +-
 .../extending/add-apt-packages/Dockerfile  |   2 +-
 .../add-build-essential-extend/Dockerfile  |   2 +-
 .../extending/add-providers/Dockerfile |   2 +-
 .../extending/add-pypi-packages/Dockerfile |   2 +-
 .../extending/embedding-dags/Dockerfile|   2 +-
 .../extending/writ

[GitHub] [airflow] jedcunningham merged pull request #19481: Sync `v2-2-stable` with `v2-2-test` to release `2.2.2`

2021-11-09 Thread GitBox


jedcunningham merged pull request #19481:
URL: https://github.com/apache/airflow/pull/19481


   






[GitHub] [airflow] josh-fell commented on a change in pull request #19497: Add "access_key" to DEFAULT_SENSITIVE_FIELDS

2021-11-09 Thread GitBox


josh-fell commented on a change in pull request #19497:
URL: https://github.com/apache/airflow/pull/19497#discussion_r746055063



##
File path: airflow/utils/log/secrets_masker.py
##
@@ -34,6 +34,7 @@
 DEFAULT_SENSITIVE_FIELDS = frozenset(
 {
 'access_token',
+'access_key',

Review comment:
   Generally thinking if "key" should be added? This would be the 4th entry 
with "key" in the name and generic "secret" and "token" are also in this set. 
But perhaps it is _too_ general of a term.








[GitHub] [airflow] alexInhert commented on pull request #19482: [19458] Added column duration to DAG runs view

2021-11-09 Thread GitBox


alexInhert commented on pull request #19482:
URL: https://github.com/apache/airflow/pull/19482#issuecomment-964557364


   Should it have a tooltip explaining that the metric is in seconds?






[GitHub] [airflow] alexInhert removed a comment on issue #19458: DAG Run Views showing information of DAG duration

2021-11-09 Thread GitBox


alexInhert removed a comment on issue #19458:
URL: https://github.com/apache/airflow/issues/19458#issuecomment-964557144


   Should it have a tooltip explaining that the metric is in seconds?






[GitHub] [airflow] alexInhert commented on issue #19458: DAG Run Views showing information of DAG duration

2021-11-09 Thread GitBox


alexInhert commented on issue #19458:
URL: https://github.com/apache/airflow/issues/19458#issuecomment-964557144


   Should it have a tooltip explaining that the metric is in seconds?






[GitHub] [airflow] josh-fell opened a new pull request #19497: Add "access_key" to DEFAULT_SENSITIVE_FIELDS

2021-11-09 Thread GitBox


josh-fell opened a new pull request #19497:
URL: https://github.com/apache/airflow/pull/19497


   Users who call the `WasbHook` and authenticate with a Shared Access Key to 
Azure Blob Storage will see that key in plain text within the Task Instance 
logs. Unfortunately, since `Host` is not masked yet is required for Shared 
Access Key authentication, the entirety of the supplied credentials is 
written to the log.
   
   **Sample log entry before**
   ```shell
   [2021-11-09, 20:49:52 UTC] {base.py:79} INFO - Using connection to: id: 
wasb_default. Host: https://myAccountBlahBlah.blob.core.windows.net/, Port: 
None, Schema: , Login: , Password: None, extra: 
{'extra__wasb__connection_string': '', 'extra__wasb__sas_token': '***', 
'extra__wasb__shared_access_key': 
'KEa1QvMjxMNLuzTgVZFvS6PpQv087Ls0Oq+7Ic/fa9Lu3RQwunHi61yZTCJSCIo1gZBNLuzTgVZFvS6PpQv087Ls0Oq+7Ic/fa9Lu3RQwunHi61yZTCJSCIo1gZBH1cc/KZb3EAKXrqWXX==',
 'extra__wasb__tenant_id': ''}
   ```
   
   
   **Sample log entry after**
   ```shell
   [2021-11-09, 20:56:37 UTC] {base.py:79} INFO - Using connection to: id: 
wasb_default. Host: https://myAccountBlahBlah.blob.core.windows.net/, Port: 
None, Schema: , Login: , Password: None, extra: 
{'extra__wasb__connection_string': '', 'extra__wasb__sas_token': '***', 
'extra__wasb__shared_access_key': '***', 'extra__wasb__tenant_id': ''}
   ```
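The before/after log difference can be sketched with a small redaction helper (hypothetical names; this is not the `secrets_masker` implementation, just an illustration of the intended effect on the WASB extras shown above):

```python
# Words whose presence in a key should trigger masking of its value.
SENSITIVE_WORDS = ("access_key", "sas_token", "password", "secret", "token")


def redact_extras(extras: dict) -> dict:
    """Replace non-empty values of sensitive-looking keys with '***'."""
    return {
        key: "***" if value and any(w in key.lower() for w in SENSITIVE_WORDS) else value
        for key, value in extras.items()
    }


extras = {
    "extra__wasb__connection_string": "",
    "extra__wasb__sas_token": "sv=2020-08-04&sig=abc123",
    "extra__wasb__shared_access_key": "KEa1QvMjxMNLuzTgVZFvS6Pp...",
    "extra__wasb__tenant_id": "",
}
masked = redact_extras(extras)
# Populated sensitive values become '***'; empty fields are left untouched,
# matching the shape of the "after" log entry.
```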
   
   ---
   **^ Add meaningful description above**
   
   Read the **[Pull Request 
Guidelines](https://github.com/apache/airflow/blob/main/CONTRIBUTING.rst#pull-request-guidelines)**
 for more information.
   In case of fundamental code change, Airflow Improvement Proposal 
([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals))
 is needed.
   In case of a new dependency, check compliance with the [ASF 3rd Party 
License Policy](https://www.apache.org/legal/resolved.html#category-x).
   In case of backwards incompatible changes please leave a note in 
[UPDATING.md](https://github.com/apache/airflow/blob/main/UPDATING.md).
   






[GitHub] [airflow] john-jac commented on a change in pull request #19137: Add RedshiftDataHook

2021-11-09 Thread GitBox


john-jac commented on a change in pull request #19137:
URL: https://github.com/apache/airflow/pull/19137#discussion_r746044118



##
File path: docs/apache-airflow-providers-amazon/operators/redshift_data.rst
##
@@ -0,0 +1,51 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+
+.. _howto/operator:RedshiftDataOperator:
+
+RedshiftDataOperator
+
+
+.. contents::
+  :depth: 1
+  :local:
+
+Overview
+
+
+Use the :class:`RedshiftDataOperator 
` to execute
+statements against an Amazon Redshift cluster.
+
+
+example_redshift_data_execute_sql.py
+
+
+Purpose
+"""
+
+This is a basic example dag for using :class:`RedshiftDataOperator 
`

Review comment:
   Done








[GitHub] [airflow] john-jac commented on a change in pull request #19137: Add RedshiftDataHook

2021-11-09 Thread GitBox


john-jac commented on a change in pull request #19137:
URL: https://github.com/apache/airflow/pull/19137#discussion_r746042827



##
File path: airflow/providers/amazon/aws/operators/redshift_data.py
##
@@ -0,0 +1,127 @@
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+from time import sleep
+from typing import Optional
+
+from airflow.exceptions import AirflowException
+from airflow.models import BaseOperator
+from airflow.providers.amazon.aws.hooks.redshift_data import RedshiftDataHook
+
+
+class RedshiftDataOperator(BaseOperator):
+"""
+Executes SQL Statements against an Amazon Redshift cluster using Redshift 
Data
+
+.. seealso::
+For more information on how to use this operator, take a look at the 
guide:
+:ref:`howto/operator:RedshiftDataOperator`
+
+:param sql: the sql code to be executed
+:type sql: Can receive a str representing a sql statement,
+or an iterable of str (sql statements)
+:param aws_conn_id: AWS connection id (default: aws_default)
+:type aws_conn_id: str
+:param parameters: (optional) the parameters to render the SQL query with.
+:type parameters: dict or iterable
+:param autocommit: if True, each command is automatically committed.
+(default value: False)
+:type autocommit: bool
+"""
+
+template_fields = ('sql',)
+template_ext = ('.sql',)
+

Review comment:
   Done








[airflow] branch main updated (2590013 -> d18e2b0)

2021-11-09 Thread kaxilnaik
This is an automated email from the ASF dual-hosted git repository.

kaxilnaik pushed a change to branch main
in repository https://gitbox.apache.org/repos/asf/airflow.git.


from 2590013  Clarify that .asf.yml and codecov.yml should be changed in 
main (#19496)
 add d18e2b0  ``KubernetesExecutor`` should default to template image if 
used (#19484)

No new revisions were added by this update.

Summary of changes:
 airflow/kubernetes/kube_config.py  |  5 -
 tests/kubernetes/test_pod_generator.py | 22 ++
 tests/models/test_taskinstance.py  |  1 -
 3 files changed, 18 insertions(+), 10 deletions(-)


[GitHub] [airflow] kaxil merged pull request #19484: KubernetesExecutor should default to template image if used

2021-11-09 Thread GitBox


kaxil merged pull request #19484:
URL: https://github.com/apache/airflow/pull/19484


   






[airflow] branch main updated (4d14885 -> 2590013)

2021-11-09 Thread potiuk
This is an automated email from the ASF dual-hosted git repository.

potiuk pushed a change to branch main
in repository https://gitbox.apache.org/repos/asf/airflow.git.


from 4d14885  Fix docker "after entrypoint" custom script example (#19495)
 add 2590013  Clarify that .asf.yml and codecov.yml should be changed in 
main (#19496)

No new revisions were added by this update.

Summary of changes:
 dev/README_RELEASE_AIRFLOW.md | 15 ++-
 1 file changed, 10 insertions(+), 5 deletions(-)


[GitHub] [airflow] potiuk merged pull request #19496: Clarify that .asf.yml and codecov.yml should be changed in main

2021-11-09 Thread GitBox


potiuk merged pull request #19496:
URL: https://github.com/apache/airflow/pull/19496


   






[airflow] branch main updated (316632e -> 4d14885)

2021-11-09 Thread kaxilnaik
This is an automated email from the ASF dual-hosted git repository.

kaxilnaik pushed a change to branch main
in repository https://gitbox.apache.org/repos/asf/airflow.git.


from 316632e  Update helm chart release docs (#19494)
 add 4d14885  Fix docker "after entrypoint" custom script example (#19495)

No new revisions were added by this update.

Summary of changes:
 docs/docker-stack/entrypoint.rst | 10 +-
 1 file changed, 5 insertions(+), 5 deletions(-)


[GitHub] [airflow] kaxil merged pull request #19495: Fix docker "after entrypoint" custom script example

2021-11-09 Thread GitBox


kaxil merged pull request #19495:
URL: https://github.com/apache/airflow/pull/19495


   






[GitHub] [airflow] github-actions[bot] commented on pull request #19496: Clarify that .asf.yml and codecov.yml should be changed in main

2021-11-09 Thread GitBox


github-actions[bot] commented on pull request #19496:
URL: https://github.com/apache/airflow/pull/19496#issuecomment-964540340


   The PR is likely ready to be merged. No tests are needed as no important 
environment files, nor python files were modified by it. However, committers 
might decide that full test matrix is needed and add the 'full tests needed' 
label. Then you should rebase it to the latest main or amend the last commit of 
the PR, and push it with --force-with-lease.






[airflow] branch main updated: Update helm chart release docs (#19494)

2021-11-09 Thread kaxilnaik
This is an automated email from the ASF dual-hosted git repository.

kaxilnaik pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/airflow.git


The following commit(s) were added to refs/heads/main by this push:
 new 316632e  Update helm chart release docs (#19494)
316632e is described below

commit 316632e63bf1ef79446ed2cd9587b2d3d666bf1a
Author: Jed Cunningham <66968678+jedcunning...@users.noreply.github.com>
AuthorDate: Tue Nov 9 13:59:42 2021 -0700

Update helm chart release docs (#19494)
---
 dev/README_RELEASE_HELM_CHART.md | 97 
 docs/publish_docs.py |  2 +-
 2 files changed, 80 insertions(+), 19 deletions(-)

diff --git a/dev/README_RELEASE_HELM_CHART.md b/dev/README_RELEASE_HELM_CHART.md
index ad10324..54afbe1 100644
--- a/dev/README_RELEASE_HELM_CHART.md
+++ b/dev/README_RELEASE_HELM_CHART.md
@@ -38,6 +38,11 @@
   - [Publish documentation](#publish-documentation)
   - [Notify developers of release](#notify-developers-of-release)
   - [Update Announcements page](#update-announcements-page)
+  - [Create release on GitHub](#create-release-on-github)
+  - [Close the milestone](#close-the-milestone)
+  - [Announce the release on the community 
slack](#announce-the-release-on-the-community-slack)
+  - [Tweet about the release](#tweet-about-the-release)
+  - [Bump chart version in Chart.yaml](#bump-chart-version-in-chartyaml)
   - [Remove old releases](#remove-old-releases)
 
 
@@ -499,6 +504,7 @@ svn checkout 
https://dist.apache.org/repos/dist/release/airflow airflow-release
 
 # Create new folder for the release
 cd airflow-release/helm-chart
+export AIRFLOW_SVN_RELEASE_HELM=$(pwd)
 svn mkdir ${VERSION}
 cd ${VERSION}
 
@@ -518,7 +524,7 @@ Create and push the release tag:
 ```shell
 cd "${AIRFLOW_REPO_ROOT}"
 git checkout helm-chart/${RC}
-git tag -s helm-chart/${VERSION}
+git tag -s helm-chart/${VERSION} -m "Apache Airflow Helm Chart ${VERSION}"
 git push origin helm-chart/${VERSION}
 ```
 
@@ -529,11 +535,12 @@ In our cases, documentation for the released versions is 
published in a separate
 build tools are available in the `apache/airflow` repository, so you have to 
coordinate
 between the two repositories to be able to build the documentation.
 
-- First, copy the airflow-site repository and set the environment variable 
``AIRFLOW_SITE_DIRECTORY``.
+- First, copy the airflow-site repository, create branch, and set the 
environment variable ``AIRFLOW_SITE_DIRECTORY``.
 
 ```shell
 git clone https://github.com/apache/airflow-site.git airflow-site
 cd airflow-site
+git checkout -b helm-${VERSION}-docs
 export AIRFLOW_SITE_DIRECTORY="$(pwd)"
 ```
 
@@ -545,20 +552,6 @@ between the two repositories to be able to build the 
documentation.
 ./breeze build-docs -- --package-filter helm-chart --for-production
 ```
 
-- Update `index.yaml`
-
-  We upload `index.yaml` to the Airflow website to allow: `helm repo add 
https://airflow.apache.org`.
-
-```shell
-cd "${AIRFLOW_SITE_DIRECTORY}"
-curl 
https://dist.apache.org/repos/dist/dev/airflow/helm-chart/${RC}/index.yaml -o 
index.yaml
-https://dist.apache.org/repos/dist/dev/airflow/helm-chart/${VERSION}
-sed -i 
"s|https://dist.apache.org/repos/dist/dev/airflow/helm-chart/$RC|https://downloads.apache.org/airflow/helm-chart/$VERSION|"
 index.yaml
-
-git commit -m "Add documentation for Apache Airflow Helm Chart ${VERSION}"
-git push
-```
-
 - Now you can preview the documentation.
 
 ```shell
@@ -569,14 +562,33 @@ between the two repositories to be able to build the 
documentation.
 
 ```shell
 ./docs/publish_docs.py --package-filter helm-chart
+```
+
+- Update `index.yaml`
+
+  Regenerate `index.yaml` so it can be added to the Airflow website to allow: 
`helm repo add https://airflow.apache.org`.
+
+```shell
 cd "${AIRFLOW_SITE_DIRECTORY}"
+curl 
https://dist.apache.org/repos/dist/dev/airflow/helm-chart/$RC/index.yaml -o 
index.yaml
+cp ${AIRFLOW_SVN_RELEASE_HELM}/${VERSION}/airflow-${VERSION}.tgz .
+helm repo index --merge ./index.yaml . --url 
"https://downloads.apache.org/airflow/helm-chart/$VERSION";
+rm airflow-${VERSION}.tgz
+mv index.yaml landing-pages/site/static/index.yaml
+```
+
+- Commit new docs, push, and open PR
+
+```shell
+git add .
 git commit -m "Add documentation for Apache Airflow Helm Chart ${VERSION}"
 git push
+# and finally open a PR
 ```
 
 ## Notify developers of release
 
-- Notify us...@airflow.apache.org (cc'ing d...@airflow.apache.org and 
annou...@apache.org) that
+- Notify us...@airflow.apache.org (cc'ing d...@airflow.apache.org) that
 the artifacts have been published:
 
 Subject:
@@ -597,7 +609,7 @@ I am pleased to announce that we have released Apache 
Airflow Helm chart $VERSIO
 
 The source release, as well as the "binary" Helm Chart release, are available:
 
-📦   Official Sources: 
https://airflow.

[GitHub] [airflow] kaxil merged pull request #19494: Update helm chart release docs

2021-11-09 Thread GitBox


kaxil merged pull request #19494:
URL: https://github.com/apache/airflow/pull/19494


   






[GitHub] [airflow] potiuk opened a new pull request #19496: Clarify that .asf.yml and codecov.yml should be changed in main

2021-11-09 Thread GitBox


potiuk opened a new pull request #19496:
URL: https://github.com/apache/airflow/pull/19496


   
   
   ---
   **^ Add meaningful description above**
   
   Read the **[Pull Request 
Guidelines](https://github.com/apache/airflow/blob/main/CONTRIBUTING.rst#pull-request-guidelines)**
 for more information.
   In case of fundamental code change, Airflow Improvement Proposal 
([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals))
 is needed.
   In case of a new dependency, check compliance with the [ASF 3rd Party 
License Policy](https://www.apache.org/legal/resolved.html#category-x).
   In case of backwards incompatible changes please leave a note in 
[UPDATING.md](https://github.com/apache/airflow/blob/main/UPDATING.md).
   






[GitHub] [airflow] github-actions[bot] commented on pull request #19495: Fix docker "after entrypoint" custom script example

2021-11-09 Thread GitBox


github-actions[bot] commented on pull request #19495:
URL: https://github.com/apache/airflow/pull/19495#issuecomment-964522731


   The PR is likely ready to be merged. No tests are needed as no important 
environment files, nor python files were modified by it. However, committers 
might decide that full test matrix is needed and add the 'full tests needed' 
label. Then you should rebase it to the latest main or amend the last commit of 
the PR, and push it with --force-with-lease.






[GitHub] [airflow] potiuk closed pull request #19139: Changing password as it gets masked across logs and causes issues

2021-11-09 Thread GitBox


potiuk closed pull request #19139:
URL: https://github.com/apache/airflow/pull/19139


   






[GitHub] [airflow] khalidmammadov opened a new pull request #19139: Changing password as it gets masked across logs and causes issues

2021-11-09 Thread GitBox


khalidmammadov opened a new pull request #19139:
URL: https://github.com/apache/airflow/pull/19139


   The current Postgres password gets masked during DB engine init: it is 
added to the secret list and masked wherever it is mentioned in the logs. 
   Because it is currently set to "airflow", it is converted to *** everywhere 
"airflow" is printed in the logs, including folder paths etc. 
   
   This fixes one of the failing test cases from the quarantined list that 
expects an "airflow" path in the log but gets ***. 
   It should also be noted for future similar cases not to name the password 
"airflow" or another reserved/widely used word, as it will be masked in the 
logs.
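   The failure mode described above can be sketched in a few lines. This is a 
simplified stand-in for Airflow's secrets masker, not the actual implementation 
(`mask_secrets` here is a hypothetical helper), shown only to illustrate why a 
common word like "airflow" makes a poor test password:

   ```python
   def mask_secrets(line, secrets):
       # Replace every registered secret value with *** wherever it appears,
       # loosely mirroring what Airflow's log masking does to each log line.
       for secret in secrets:
           line = line.replace(secret, "***")
       return line

   # If the DB password is the common word "airflow", unrelated text is mangled:
   masked = mask_secrets("dag file: /opt/airflow/dags/example.py", {"airflow"})
   print(masked)  # dag file: /opt/***/dags/example.py
   ```

   The masking is a plain substring substitution, so any log line containing the 
password's characters is redacted, including folder paths.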
   
   ---
   **^ Add meaningful description above**
   
   Read the **[Pull Request 
Guidelines](https://github.com/apache/airflow/blob/main/CONTRIBUTING.rst#pull-request-guidelines)**
 for more information.
   In case of fundamental code change, Airflow Improvement Proposal 
([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals))
 is needed.
   In case of a new dependency, check compliance with the [ASF 3rd Party 
License Policy](https://www.apache.org/legal/resolved.html#category-x).
   In case of backwards incompatible changes please leave a note in 
[UPDATING.md](https://github.com/apache/airflow/blob/main/UPDATING.md).
   






[GitHub] [airflow] potiuk closed pull request #19410: Remove inaccurate execution date from triggered dag extra link

2021-11-09 Thread GitBox


potiuk closed pull request #19410:
URL: https://github.com/apache/airflow/pull/19410


   






[GitHub] [airflow] potiuk commented on pull request #19453: Improve message and documentation around moved data

2021-11-09 Thread GitBox


potiuk commented on pull request #19453:
URL: https://github.com/apache/airflow/pull/19453#issuecomment-964505596


   Thanks for the comments @jedcunningham @uranusjr @kaxil - I think it's much 
better now :)
   
   Proposal: if it won't make it into 2.2.2 (which I now doubt) - then we can 
cherry-pick it to v2-2-test and build the docs from it so that at least the 
docs are published with the 2.2.2 release :)
   






[GitHub] [airflow] ShakaibKhan commented on a change in pull request #18557: Warning of public exposure of deployment in UI with on/off config

2021-11-09 Thread GitBox


ShakaibKhan commented on a change in pull request #18557:
URL: https://github.com/apache/airflow/pull/18557#discussion_r745987996



##
File path: airflow/www/views.py
##
@@ -702,6 +702,22 @@ def _iter_parsed_moved_data_table_names():
 # Second segment is a version marker that we don't need to 
show.
 yield segments[2], table_name
 
+warn_deployment_query = session.query(Log).filter(Log.event == 
"robots").count()

Review comment:
   Wondering if it would be good enough to clear requests older than a week 
and only display the message on recent requests
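   The suggestion above can be sketched without the ORM. The row shape 
(`event`/`dttm` dicts) and the helper name are assumptions for illustration; 
the real check would filter Airflow's SQLAlchemy `Log` model instead of a list:

   ```python
   from datetime import datetime, timedelta

   def recent_robots_hits(log_rows, now, max_age_days=7):
       # Keep only /robots.txt access events newer than the cutoff, so the
       # exposure warning reflects recent traffic rather than old requests.
       cutoff = now - timedelta(days=max_age_days)
       return [
           row for row in log_rows
           if row["event"] == "robots" and row["dttm"] >= cutoff
       ]

   now = datetime(2021, 11, 9)
   rows = [
       {"event": "robots", "dttm": datetime(2021, 11, 8)},  # recent -> counted
       {"event": "robots", "dttm": datetime(2021, 10, 1)},  # stale -> ignored
       {"event": "cli", "dttm": datetime(2021, 11, 8)},     # wrong event type
   ]
   print(len(recent_robots_hits(rows, now)))  # 1
   ```

   With the SQLAlchemy model the same cutoff would become an extra 
`.filter(Log.dttm >= cutoff)` on the existing count query.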








[GitHub] [airflow] ShakaibKhan commented on a change in pull request #18557: Warning of public exposure of deployment in UI with on/off config

2021-11-09 Thread GitBox


ShakaibKhan commented on a change in pull request #18557:
URL: https://github.com/apache/airflow/pull/18557#discussion_r745987064



##
File path: airflow/www/views.py
##
@@ -702,6 +702,22 @@ def _iter_parsed_moved_data_table_names():
 # Second segment is a version marker that we don't need to 
show.
 yield segments[2], table_name
 
+warn_deployment_query = session.query(Log).filter(Log.event == 
"robots").count()
+if (
+permissions.ACTION_CAN_ACCESS_MENU,
+permissions.RESOURCE_ADMIN_MENU,
+) in user_permissions and warn_deployment_query > 0:
+flash(
+Markup(
+'Recent requests have been made to /robots.txt. '
+'This indicates that this deployment may be accessible to 
the public internet. '
+'This warning can be disabled by setting 
webserver.warn_deployment_exposure=False in '
+'airflow.cfg. Read more about web deployment security <a 
href="https://airflow.apache.org/docs/apache-airflow/stable/security/webserver.html">here</a>'
Review comment:
   I will add the config to documentation








[GitHub] [airflow] ShakaibKhan commented on a change in pull request #18557: Warning of public exposure of deployment in UI with on/off config

2021-11-09 Thread GitBox


ShakaibKhan commented on a change in pull request #18557:
URL: https://github.com/apache/airflow/pull/18557#discussion_r745986631



##
File path: airflow/www/views.py
##
@@ -702,6 +702,22 @@ def _iter_parsed_moved_data_table_names():
 # Second segment is a version marker that we don't need to 
show.
 yield segments[2], table_name
 
+warn_deployment_query = session.query(Log).filter(Log.event == 
"robots").count()
+if (
+permissions.ACTION_CAN_ACCESS_MENU,
+permissions.RESOURCE_ADMIN_MENU,
+) in user_permissions and warn_deployment_query > 0:
+flash(
+Markup(
+'Recent requests have been made to /robots.txt. '
+'This indicates that this deployment may be accessible to 
the public internet. '
+'This warning can be disabled by setting 
webserver.warn_deployment_exposure=False in '

Review comment:
   removed it during a merge, I will put it back in the if statement








[GitHub] [airflow] ShakaibKhan commented on a change in pull request #18557: Warning of public exposure of deployment in UI with on/off config

2021-11-09 Thread GitBox


ShakaibKhan commented on a change in pull request #18557:
URL: https://github.com/apache/airflow/pull/18557#discussion_r745986631



##
File path: airflow/www/views.py
##
@@ -702,6 +702,22 @@ def _iter_parsed_moved_data_table_names():
 # Second segment is a version marker that we don't need to 
show.
 yield segments[2], table_name
 
+warn_deployment_query = session.query(Log).filter(Log.event == 
"robots").count()
+if (
+permissions.ACTION_CAN_ACCESS_MENU,
+permissions.RESOURCE_ADMIN_MENU,
+) in user_permissions and warn_deployment_query > 0:
+flash(
+Markup(
+'Recent requests have been made to /robots.txt. '
+'This indicates that this deployment may be accessible to 
the public internet. '
+'This warning can be disabled by setting 
webserver.warn_deployment_exposure=False in '

Review comment:
   removed it during a merge, I will put it back








[GitHub] [airflow] potiuk commented on pull request #19453: Improve message and documentation around moved data

2021-11-09 Thread GitBox


potiuk commented on pull request #19453:
URL: https://github.com/apache/airflow/pull/19453#issuecomment-964501613


   > Hi, can we add some simple logging message to code ?
   
   `airflow db upgrade` does not have logging. It will print the output to 
stdout.






[GitHub] [airflow] potiuk commented on a change in pull request #19453: Improve message and documentation around moved data

2021-11-09 Thread GitBox


potiuk commented on a change in pull request #19453:
URL: https://github.com/apache/airflow/pull/19453#discussion_r745977846



##
File path: docs/apache-airflow/installation/upgrading.rst
##
@@ -0,0 +1,77 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+
+Upgrading Airflow to a newer version
+
+
+When you are upgrading Airflow you should always upgrade the metadata DB as 
part of the upgrade
+process. This might happen automatically in some cases.

Review comment:
   I like the idea of adding more sections. I refactored it quite a bit.








[GitHub] [airflow] john-jac commented on a change in pull request #19137: Add RedshiftDataHook

2021-11-09 Thread GitBox


john-jac commented on a change in pull request #19137:
URL: https://github.com/apache/airflow/pull/19137#discussion_r745973945



##
File path: airflow/providers/amazon/aws/operators/redshift_data.py
##
@@ -0,0 +1,127 @@
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+from time import sleep
+from typing import Optional
+
+from airflow.exceptions import AirflowException
+from airflow.models import BaseOperator
+from airflow.providers.amazon.aws.hooks.redshift_data import RedshiftDataHook
+
+
+class RedshiftDataOperator(BaseOperator):
+"""
+Executes SQL Statements against an Amazon Redshift cluster using Redshift 
Data
+
+.. seealso::
+For more information on how to use this operator, take a look at the 
guide:
+:ref:`howto/operator:RedshiftDataOperator`
+
+:param sql: the sql code to be executed
+:type sql: Can receive a str representing a sql statement,
+or an iterable of str (sql statements)

Review comment:
   Done

##
File path: docs/apache-airflow-providers-amazon/operators/redshift_data.rst
##
@@ -0,0 +1,51 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+
+.. _howto/operator:RedshiftDataOperator:
+
+RedshiftDataOperator
+
+
+.. contents::
+  :depth: 1
+  :local:
+
+Overview
+
+
+Use the :class:`RedshiftDataOperator 
` to execute
+statements against an Amazon Redshift cluster.
+

Review comment:
   Done








[GitHub] [airflow] john-jac commented on a change in pull request #19137: Add RedshiftDataHook

2021-11-09 Thread GitBox


john-jac commented on a change in pull request #19137:
URL: https://github.com/apache/airflow/pull/19137#discussion_r745972830



##
File path: airflow/providers/amazon/aws/hooks/redshift_data.py
##
@@ -0,0 +1,152 @@
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+"""Interact with AWS Redshift clusters."""
+
+from typing import Optional
+
+from airflow.providers.amazon.aws.hooks.base_aws import AwsBaseHook
+
+
+class RedshiftDataHook(AwsBaseHook):
+"""
+Interact with AWS Redshift Data, using the boto3 library
+
+Additional arguments (such as ``aws_conn_id``) may be specified and
+are passed down to the underlying AwsBaseHook.
+
+.. seealso::
+:class:`~airflow.providers.amazon.aws.hooks.base_aws.AwsBaseHook`
+
+:param aws_conn_id: The Airflow connection used for AWS credentials.
+:type aws_conn_id: str
+"""
+
+def __init__(self, *args, **kwargs) -> None:
+kwargs["client_type"] = "redshift-data"
+super().__init__(*args, **kwargs)
+
+def execute_statement(
+self,
+cluster_identifier: str,
+database: str,
+sql: str,
+db_user: Optional[str] = "",
+parameters: Optional[list] = None,
+secret_arn: Optional[str] = "",
+statement_name: Optional[str] = "",
+with_event: Optional[bool] = False,
+):
+"""
+Runs an SQL statement, which can be data manipulation language (DML)
+or data definition language (DDL)
+
+:param cluster_identifier: unique identifier of a cluster
+:type cluster_identifier: str
+:param database: the name of the database
+:type database: str
+:param sql: the SQL statement text to run
+:type sql: str
+:param db_user: the database user name
+:type db_user: str
+:param parameters: the parameters for the SQL statement
+:type parameters: list
+:param secret_arn: the name or ARN of the secret that enables db access
+:type secret_arn: str
+:param statement_name: the name of the SQL statement
+:type statement_name: str
+:param with_event: indicates whether to send an event to EventBridge
+:type with_event: bool
+
+"""
+"""only provide parameter argument if it is valid"""

Review comment:
   Done

##
File path: 
airflow/providers/amazon/aws/example_dags/example_redshift_data_execute_sql.py
##
@@ -0,0 +1,83 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+from datetime import datetime, timedelta
+from os import getenv
+
+from airflow.decorators import dag, task
+from airflow.providers.amazon.aws.hooks.redshift_data import RedshiftDataHook
+from airflow.providers.amazon.aws.operators.redshift_data import 
RedshiftDataOperator
+
+# [START howto_operator_redshift_data_env_variables]
+REDSHIFT_CLUSTER_IDENTIFIER = getenv("REDSHIFT_CLUSTER_IDENTIFIER", 
"test-cluster")
+REDSHIFT_DATABASE = getenv("REDSHIFT_DATABASE", "test-database")
+REDSHIFT_DATABASE_USER = getenv("REDSHIFT_DATABASE_USER", "awsuser")
+# [END howto_operator_redshift_data_env_variables]
+
+REDSHIFT_QUERY = """
+SELECT table_schema,
+   table_name
+FROM information_schema.tables
+WHERE table_schema NOT IN ('information_schema', 'pg_catalog')
+  AND table_type = 'BASE TABLE'
+ORDER BY table_schema,
+ table_name;
+"""
+POLL_INTERVAL = 10
+TIMEOUT = 600
+
+
+# [START howto_redshift_data]
+@dag(
+d

[GitHub] [airflow] jedcunningham opened a new pull request #19495: Fix docker "after entrypoint" custom script example

2021-11-09 Thread GitBox


jedcunningham opened a new pull request #19495:
URL: https://github.com/apache/airflow/pull/19495


   Fix both the tag used and the image used to run the custom script.






[GitHub] [airflow] john-jac commented on a change in pull request #19137: Add RedshiftDataHook

2021-11-09 Thread GitBox


john-jac commented on a change in pull request #19137:
URL: https://github.com/apache/airflow/pull/19137#discussion_r745972585



##
File path: airflow/providers/amazon/aws/hooks/redshift_data.py
##
@@ -0,0 +1,152 @@
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+"""Interact with AWS Redshift clusters."""
+
+from typing import Optional
+
+from airflow.providers.amazon.aws.hooks.base_aws import AwsBaseHook
+
+
+class RedshiftDataHook(AwsBaseHook):
+"""
+Interact with AWS Redshift Data, using the boto3 library
+
+Additional arguments (such as ``aws_conn_id``) may be specified and
+are passed down to the underlying AwsBaseHook.
+
+.. seealso::
+:class:`~airflow.providers.amazon.aws.hooks.base_aws.AwsBaseHook`
+
+:param aws_conn_id: The Airflow connection used for AWS credentials.
+:type aws_conn_id: str
+"""
+
+def __init__(self, *args, **kwargs) -> None:
+kwargs["client_type"] = "redshift-data"
+super().__init__(*args, **kwargs)
+
+def execute_statement(
+self,
+cluster_identifier: str,
+database: str,
+sql: str,
+db_user: Optional[str] = "",
+parameters: Optional[list] = None,
+secret_arn: Optional[str] = "",
+statement_name: Optional[str] = "",
+with_event: Optional[bool] = False,
+):
+"""
+Runs an SQL statement, which can be data manipulation language (DML)
+or data definition language (DDL)
+
+:param cluster_identifier: unique identifier of a cluster
+:type cluster_identifier: str
+:param database: the name of the database
+:type database: str
+:param sql: the SQL statement text to run
+:type sql: str
+:param db_user: the database user name
+:type db_user: str
+:param parameters: the parameters for the SQL statement
+:type parameters: list
+:param secret_arn: the name or ARN of the secret that enables db access
+:type secret_arn: str
+:param statement_name: the name of the SQL statement
+:type statement_name: str
+:param with_event: indicates whether to send an event to EventBridge
+:type with_event: bool
+
+"""
+"""only provide parameter argument if it is valid"""
+if parameters:
+response = self.get_conn().execute_statement(
+ClusterIdentifier=cluster_identifier,
+Database=database,
+Sql=sql,
+DbUser=db_user,
+WithEvent=with_event,
+SecretArn=secret_arn,
+StatementName=statement_name,
+Parameters=parameters,
+)
+else:
+response = self.get_conn().execute_statement(
+ClusterIdentifier=cluster_identifier,
+Database=database,
+Sql=sql,
+DbUser=db_user,
+WithEvent=with_event,
+SecretArn=secret_arn,
+StatementName=statement_name,
+)
+return response['Id'] if response['Id'] else None
+
+def describe_statement(
+self,
+id: str,
+):
+"""
+Describes the details about a specific instance when a query was run
+by the Amazon Redshift Data API
+
+:param id: the identifier of the SQL statement to describe.
+:type id: str
+
+"""
+response = self.get_conn().describe_statement(
+Id=id,
+)
+return response['Status']
+
+def get_statement_result(
+self,
+id: str,
+next_token: Optional[str] = "",
+):
+"""
+Fetches the temporarily cached result of an SQL statement, a token is
+returned to page through the statement results
+
+:param id: the identifier of the SQL statement to describe.
+:type id: str
+:param next_token: a value that indicates the starting point for the 
next set of response records
+:type next_token: str
+
+"""
+response = self.get_conn().get_statement_result(
+Id=id,
+NextToken=next_token,
+)
+  

[GitHub] [airflow] john-jac commented on a change in pull request #19137: Add RedshiftDataHook

2021-11-09 Thread GitBox


john-jac commented on a change in pull request #19137:
URL: https://github.com/apache/airflow/pull/19137#discussion_r745972405



##
File path: airflow/providers/amazon/aws/hooks/redshift_data.py
##
@@ -0,0 +1,152 @@
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+"""Interact with AWS Redshift clusters."""
+
+from typing import Optional
+
+from airflow.providers.amazon.aws.hooks.base_aws import AwsBaseHook
+
+
+class RedshiftDataHook(AwsBaseHook):
+"""
+Interact with AWS Redshift Data, using the boto3 library
+
+Additional arguments (such as ``aws_conn_id``) may be specified and
+are passed down to the underlying AwsBaseHook.
+
+.. seealso::
+:class:`~airflow.providers.amazon.aws.hooks.base_aws.AwsBaseHook`
+
+:param aws_conn_id: The Airflow connection used for AWS credentials.
+:type aws_conn_id: str
+"""
+
+def __init__(self, *args, **kwargs) -> None:
+kwargs["client_type"] = "redshift-data"
+super().__init__(*args, **kwargs)
+
+def execute_statement(
+self,
+cluster_identifier: str,
+database: str,
+sql: str,
+db_user: Optional[str] = "",
+parameters: Optional[list] = None,
+secret_arn: Optional[str] = "",
+statement_name: Optional[str] = "",
+with_event: Optional[bool] = False,
+):
+"""
+Runs an SQL statement, which can be data manipulation language (DML)
+or data definition language (DDL)
+
+:param cluster_identifier: unique identifier of a cluster
+:type cluster_identifier: str
+:param database: the name of the database
+:type database: str
+:param sql: the SQL statement text to run
+:type sql: str
+:param db_user: the database user name
+:type db_user: str
+:param parameters: the parameters for the SQL statement
+:type parameters: list
+:param secret_arn: the name or ARN of the secret that enables db access
+:type secret_arn: str
+:param statement_name: the name of the SQL statement
+:type statement_name: str
+:param with_event: indicates whether to send an event to EventBridge
+:type with_event: bool
+
+"""
+"""only provide parameter argument if it is valid"""
+if parameters:
+response = self.get_conn().execute_statement(
+ClusterIdentifier=cluster_identifier,
+Database=database,
+Sql=sql,
+DbUser=db_user,
+WithEvent=with_event,
+SecretArn=secret_arn,
+StatementName=statement_name,
+Parameters=parameters,
+)
+else:
+response = self.get_conn().execute_statement(
+ClusterIdentifier=cluster_identifier,
+Database=database,
+Sql=sql,
+DbUser=db_user,
+WithEvent=with_event,
+SecretArn=secret_arn,
+StatementName=statement_name,
+)
+return response['Id'] if response['Id'] else None
+
+def describe_statement(
+self,
+id: str,
+):
+"""
+Describes the details about a specific instance when a query was run
+by the Amazon Redshift Data API
+
+:param id: the identifier of the SQL statement to describe.
+:type id: str
+
+"""
+response = self.get_conn().describe_statement(
+Id=id,
+)
+return response['Status']

Review comment:
   Done
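   The two nearly identical `execute_statement` branches in the diff above can 
be collapsed by building the boto3 keyword arguments once and adding 
`Parameters` only when it is non-empty. A hedged sketch (a standalone helper 
for illustration, not the actual hook code):

   ```python
   def build_execute_statement_kwargs(cluster_identifier, database, sql,
                                      db_user="", parameters=None,
                                      secret_arn="", statement_name="",
                                      with_event=False):
       # Assemble the arguments for redshift-data execute_statement in one
       # place; Parameters is included only when a non-empty list is given,
       # since the API rejects an empty parameter list.
       kwargs = {
           "ClusterIdentifier": cluster_identifier,
           "Database": database,
           "Sql": sql,
           "DbUser": db_user,
           "WithEvent": with_event,
           "SecretArn": secret_arn,
           "StatementName": statement_name,
       }
       if parameters:
           kwargs["Parameters"] = parameters
       return kwargs
   ```

   The hook could then make a single call: 
`self.get_conn().execute_statement(**kwargs)`.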








[GitHub] [airflow] potiuk commented on a change in pull request #19453: Improve message and documentation around moved data

2021-11-09 Thread GitBox


potiuk commented on a change in pull request #19453:
URL: https://github.com/apache/airflow/pull/19453#discussion_r745971927



##
File path: airflow/www/templates/airflow/dags.html
##
@@ -57,7 +57,8 @@
   Airflow found incompatible data in the {{ original_table_name 
}} table in the
   metadatabase, and has moved them to {{ moved_table_name }} 
during the database migration
   to upgrade. Please inspect the moved data to decide whether you need to 
keep them, and manually drop
-  the {{ moved_table_name }} table to dismiss this warning.
+  the {{ moved_table_name }} table to dismiss this 
warning.
+ Read more about it in Upgrading

Review comment:
   In the end it's not that important I removed the paragraph








[GitHub] [airflow] potiuk edited a comment on pull request #19470: Add how-to Guide for MSSQL operators

2021-11-09 Thread GitBox


potiuk edited a comment on pull request #19470:
URL: https://github.com/apache/airflow/pull/19470#issuecomment-964480919


   Yeah @Bowrna . The doc errors are a bit misleading and require a bit of 
deeper understanding:
   
   You need to make sure that the link to "example_dags" is added to the index - 
otherwise no-one will be able to find it :). It's this file:
   
   ```
   /docs/apache-airflow-providers-microsoft-mssql/index.rst
   ```
   
   You can see `index.rst` in other providers that already have `example_dags` 
and do it similarly.






[GitHub] [airflow] potiuk commented on pull request #19470: Add how-to Guide for MSSQL operators

2021-11-09 Thread GitBox


potiuk commented on pull request #19470:
URL: https://github.com/apache/airflow/pull/19470#issuecomment-964480919


   Yeah @Bowrna . The Doc errors are a but misleading and require a bit of 
deeper understanding :
   
   You need to make sure that the link to "example_dags" is added to the index - otherwise no-one will be able to find it :):
   
   ```
   /docs/apache-airflow-providers-microsoft-mssql/index.rst
   ```
   
   You can see `index.rst` in other providers that already have `example_dags` and do it similarly.






[GitHub] [airflow] github-actions[bot] commented on pull request #19494: Update helm chart release docs

2021-11-09 Thread GitBox


github-actions[bot] commented on pull request #19494:
URL: https://github.com/apache/airflow/pull/19494#issuecomment-964477367


   The PR is likely ready to be merged. No tests are needed as no important 
environment files, nor python files were modified by it. However, committers 
might decide that full test matrix is needed and add the 'full tests needed' 
label. Then you should rebase it to the latest main or amend the last commit of 
the PR, and push it with --force-with-lease.






[airflow] branch v2-2-test updated (79e707b -> 1a790a6)

2021-11-09 Thread jedcunningham
This is an automated email from the ASF dual-hosted git repository.

jedcunningham pushed a change to branch v2-2-test
in repository https://gitbox.apache.org/repos/asf/airflow.git.


 discard 79e707b  Disable React UI tests for non-main
 discard 62fa057  Fix whitespace error causing failing graphviz test (#19472)
 discard 6cf67a1  Fix static checks
 discard 19edd2d  Restored proper default branch and constraint branch
 discard 0fd01e6  Apply suggestions from code review
 discard 51b0786  Add 2.2.2 to `CHANGELOG` and `UPDATING`
 add 4b921da  Restored proper default branch and constraint branch
 add e5e2b5f  Update image used in docker docs
 add 43c9730  Fix whitespace error causing failing graphviz test (#19472)
 add 67af807  Disable React UI tests for non-main
 add 95c9505  Fix failing static check (#18890)
 add 562e7d2  Fix failing static check (#18891)
 add 1a790a6  Add 2.2.2 to `CHANGELOG` and `UPDATING`

This update added new revisions after undoing existing revisions.
That is to say, some revisions that were in the old version of the
branch are not in the new version.  This situation occurs
when a user --force pushes a change and generates a repository
containing something like this:

 * -- * -- B -- O -- O -- O   (79e707b)
\
 N -- N -- N   refs/heads/v2-2-test (1a790a6)

You should already have received notification emails for all of the O
revisions, and so the following emails describe only the N revisions
from the common base, B.

Any revisions marked "omit" are not gone; other references still
refer to them.  Any revisions marked "discard" are gone forever.

No new revisions were added by this update.

Summary of changes:
 .pre-commit-config.yaml | 4 ++--
 CHANGELOG.txt   | 1 +
 airflow/www/security.py | 6 +++---
 3 files changed, 6 insertions(+), 5 deletions(-)


[GitHub] [airflow] jedcunningham opened a new pull request #19494: Update helm chart release docs

2021-11-09 Thread GitBox


jedcunningham opened a new pull request #19494:
URL: https://github.com/apache/airflow/pull/19494


   Flesh out the helm chart release docs.






[GitHub] [airflow] eladkal commented on pull request #19470: Add how-to Guide for MSSQL operators

2021-11-09 Thread GitBox


eladkal commented on pull request #19470:
URL: https://github.com/apache/airflow/pull/19470#issuecomment-964418802


   I can't say for sure because I can't see your code, but you can check other providers, for example Amazon:
   
https://github.com/apache/airflow/blob/main/airflow/providers/amazon/provider.yaml#L43
   You need to set similar `how-to-guide:` entry.
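   For illustration, a `how-to-guide:` entry in a provider `.yaml` might look like the sketch below. This is modelled on the Amazon provider entry linked above; the integration name, URL, and tag here are placeholders, not the actual MSSQL provider metadata:

   ```yaml
   integrations:
     - integration-name: Microsoft SQL Server (MSSQL)
       external-doc-url: https://www.microsoft.com/sql-server
       how-to-guide:
         - /docs/apache-airflow-providers-microsoft-mssql/operators.rst
       tags: [software]
   ```

   The path under `how-to-guide:` should match the new `.rst` file, which is what the `provider-yamls` pre-commit check compares against.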






[GitHub] [airflow] subkanthi commented on issue #19489: PostgresSqlHook needs to override DbApiHook.get_uri to pull in extra for client_encoding=utf-8 during create_engine

2021-11-09 Thread GitBox


subkanthi commented on issue #19489:
URL: https://github.com/apache/airflow/issues/19489#issuecomment-964356546


   Just copying the same get_uri to postgres.py seems like duplicated logic; should we have a common base class for both postgres and mysql instead of deriving from DbApiHook?
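   To make the issue concrete, here is a minimal sketch of building a SQLAlchemy URI that folds connection extras (such as `client_encoding`) into the query string. The function name and signature are hypothetical illustrations, not the actual `DbApiHook.get_uri` code:

   ```python
   from urllib.parse import quote_plus, urlencode

   def build_sqlalchemy_uri(conn_type, login, password, host, port, schema, extras=None):
       # Base URI in the usual SQLAlchemy form; extras (e.g. client_encoding)
       # are appended as query parameters so create_engine can pick them up.
       uri = f"{conn_type}://{quote_plus(login)}:{quote_plus(password)}@{host}:{port}/{schema}"
       if extras:
           uri += "?" + urlencode(extras)
       return uri
   ```

   A base URI without the query-string step is what the issue describes: the extras never reach `create_engine`.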






[GitHub] [airflow] Bowrna commented on pull request #19470: Add how-to Guide for MSSQL operators

2021-11-09 Thread GitBox


Bowrna commented on pull request #19470:
URL: https://github.com/apache/airflow/pull/19470#issuecomment-964330746


   > > @potiuk I have added the docs and i tried to commit the doc into 
repository and the pre-commit build fails
   > > ```
   > > - hook id: provider-yamls
   > > - exit code: 1
   > > 
   > > Checking integration duplicates
   > > Checking completeness of list of {sensors, hooks, operators}
   > >  -- {sensors, hooks, operators} - Expected modules(Left): Current 
Modules(Right)
   > > Checking for duplicates in list of {sensors, hooks, operators}
   > > Checking completeness of list of transfers
   > >  -- Expected transfers modules(Left): Current transfers Modules(Right)
   > > Checking for duplicates in list of transfers
   > > Checking connection classes belong to package
   > >  -- Checking providers: present in code(left), mentioned in 
/Users/sathishkannan/code/airflow/.github/ISSUE_TEMPLATE/airflow_providers_bug_report.yml
 (right)
   > > Checking doc files
   > >  -- Checking document urls: expected(left), current(right)
   > > -- Items in the left set but not the right:
   > >'/docs/apache-airflow-providers-microsoft-mssql/operators.rst'
   > > ```
   > > 
   > > I could not understand this failure message. Can you help me to 
understand this?
   > 
   > Note that it fails on `hook id: provider-yamls` so this gives you context 
into where the problem is. Something that is related to the provider yaml. 
Possibly you also edited a howto guide (`.rst` file) but did not add a 
corresponding entry in the provider `.yaml`.
   
   @eladkal thank you. I could understand the context of the problem in the provider yaml. I have edited the `.rst` file, but I am not sure what I have to add in the provider `.yaml` file. Do I have to increase the version or make any other changes to the `.yaml` file when adding the `.rst` file? If there is any help/guide about this, it would be useful for me to get started.






[GitHub] [airflow] kenfu1010 commented on issue #8375: CLI trigger_dag ignores subdir flag

2021-11-09 Thread GitBox


kenfu1010 commented on issue #8375:
URL: https://github.com/apache/airflow/issues/8375#issuecomment-964329743


   Was there ever a resolution on this?






[GitHub] [airflow] ephraimbuddy commented on pull request #19406: Shell edits

2021-11-09 Thread GitBox


ephraimbuddy commented on pull request #19406:
URL: https://github.com/apache/airflow/pull/19406#issuecomment-964320033


   > Hi @potiuk Thanks for your feedback. I'd like to know. I have already 
submitted the PR link on Outreachy, do I need to delete the entire PR or is 
there a way I can go about removing the unnecessary commits in the PR?
   
   You can remove the changes in your file, commit them, rebase and push






[GitHub] [airflow] jedcunningham commented on a change in pull request #19036: replaced '.' with '-' and adjusted trimmed_pod_id per ticket comments

2021-11-09 Thread GitBox


jedcunningham commented on a change in pull request #19036:
URL: https://github.com/apache/airflow/pull/19036#discussion_r745762360



##
File path: airflow/kubernetes/pod_generator.py
##
@@ -459,10 +459,14 @@ def make_unique_pod_id(pod_id: str) -> str:
 return None
 
  safe_uuid = uuid.uuid4().hex  # safe uuid will always be less than 63 chars
-# Strip trailing '-' and '.' as they can't be followed by '.'
-trimmed_pod_id = pod_id[:MAX_LABEL_LEN].rstrip('-.')
 
-safe_pod_id = f"{trimmed_pod_id}.{safe_uuid}"
+# Get prefix length after subtracting the uuid length. Clean up . and - from
+# end of podID because they can't be followed by '.'

Review comment:
   ```suggestion
    # Get prefix length after subtracting the uuid length. Clean up '.' and '-' from
    # end of podID ('.' can't be followed by '-').
   ```
   
   Technically we no longer need to remove any trailing `-`, but I think it makes sense to continue doing so.
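   A minimal sketch of the trimming scheme being discussed, assuming a 63-character Kubernetes label limit and a 32-character hex uuid suffix joined by `-`. This is an illustration of the review suggestions, not the exact Airflow implementation:

   ```python
   import uuid

   MAX_LABEL_LEN = 63  # Kubernetes limits a DNS label to 63 characters

   def make_unique_pod_id(pod_id):
       # Illustrative sketch: reserve room for the 32-char hex uuid plus the
       # '-' separator, then strip trailing '.'/'-' so the prefix stays a
       # valid name segment before appending the suffix.
       if not pod_id:
           return None
       safe_uuid = uuid.uuid4().hex  # always 32 hex characters
       trimmed_pod_id = pod_id[:MAX_LABEL_LEN - len(safe_uuid) - 1].rstrip('-.')
       return f"{trimmed_pod_id}-{safe_uuid}"
   ```

   With these numbers the prefix is capped at 63 - 33 = 30 characters, which is where the `63 - 33` and fixed-length suffix checks in the test suggestions come from.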

##
File path: tests/kubernetes/test_pod_generator.py
##
@@ -658,8 +658,8 @@ def test_deserialize_model_file(self):
 def test_pod_name_confirm_to_max_length(self, _, pod_id):
 name = PodGenerator.make_unique_pod_id(pod_id)
 assert len(name) <= 253
-parts = name.split(".")
-if len(pod_id) <= 63:
+parts = name.split("-")
+if len(pod_id) <= 63 - 33:
 assert len(parts[0]) == len(pod_id)
 else:
 assert len(parts[0]) <= 63

Review comment:
   ```suggestion
   parts = name.split("-")
   assert parts[0] == pod_id[:30]
   ```
   
   Also, the following line (line 666, which I can't attach a suggestion to) no longer makes sense imo. We should check it against a fixed length; I think this is right:
   
   ```
   assert len(parts[1]) == 32
   ```

##
File path: tests/kubernetes/test_pod_generator.py
##
@@ -684,7 +684,7 @@ def test_pod_name_is_valid(self, pod_id, expected_starts_with):
  len(name) <= 253 and all(ch.lower() == ch for ch in name) and re.match(regex, name)
  ), "pod_id is invalid - fails allowed regex check"
 
-assert name.rsplit(".")[0] == expected_starts_with
+assert name[:-33] == expected_starts_with

Review comment:
   ```suggestion
   assert name.rsplit("-")[0] == expected_starts_with
   ```
   
   Nit, I think the `rsplit` approach is easier to grok?

##
File path: tests/kubernetes/test_pod_generator.py
##
@@ -497,7 +497,7 @@ def test_ensure_max_label_length(self, mock_uuid):
 base_worker_pod=worker_config,
 )
 
-assert result.metadata.name == 'a' * 63 + '.' + self.static_uuid.hex
+assert result.metadata.name == ('a' * 63)[:-33] + '-' + self.static_uuid.hex

Review comment:
   ```suggestion
   assert result.metadata.name == 'a' * 30 + '-' + self.static_uuid.hex
   ```








[GitHub] [airflow] uranusjr commented on pull request #19491: Set X-Frame-Options header to DENY unless X_FRAME_ENABLED is set to true

2021-11-09 Thread GitBox


uranusjr commented on pull request #19491:
URL: https://github.com/apache/airflow/pull/19491#issuecomment-964279021


   I think the description is the other way around? The request (and the 
patch!) is to set `X-Frame-Options` to `deny` _unless_ `X_FRAME_ENABLED` is 
True, which makes sense because `deny` disables embedding.
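   A framework-agnostic sketch of the behaviour described above (the function name is hypothetical; Airflow's actual webserver wiring differs): the header is set to `DENY` unless embedding is explicitly enabled.

   ```python
   def apply_x_frame_policy(headers, x_frame_enabled):
       # When embedding is not enabled, deny framing entirely; when it is
       # enabled, leave the header unset so the page can be embedded.
       if not x_frame_enabled:
           headers["X-Frame-Options"] = "DENY"
       return headers
   ```

   In other words, `x_frame_enabled=True` is the permissive case, which is why the PR title reads backwards.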






[GitHub] [airflow] subkanthi opened a new pull request #19491: Set X-Frame-Options header to DENY only if X_FRAME_ENABLED is set to …

2021-11-09 Thread GitBox


subkanthi opened a new pull request #19491:
URL: https://github.com/apache/airflow/pull/19491


   Set X-Frame-Options header to DENY only if X_FRAME_ENABLED is set to true.
   
   closes: #17255
   
   
   ---
   **^ Add meaningful description above**
   
   Read the **[Pull Request 
Guidelines](https://github.com/apache/airflow/blob/main/CONTRIBUTING.rst#pull-request-guidelines)**
 for more information.
   In case of fundamental code change, Airflow Improvement Proposal 
([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals))
 is needed.
   In case of a new dependency, check compliance with the [ASF 3rd Party 
License Policy](https://www.apache.org/legal/resolved.html#category-x).
   In case of backwards incompatible changes please leave a note in 
[UPDATING.md](https://github.com/apache/airflow/blob/main/UPDATING.md).
   






[GitHub] [airflow] ashb commented on a change in pull request #18675: New Tree View

2021-11-09 Thread GitBox


ashb commented on a change in pull request #18675:
URL: https://github.com/apache/airflow/pull/18675#discussion_r745734163



##
File path: airflow/www/package.json
##
@@ -56,19 +60,29 @@
 "webpack-manifest-plugin": "^2.2.0"
   },
   "dependencies": {
+"@chakra-ui/react": "^1.6.6",
+"@emotion/cache": "^11.4.0",
+"@emotion/react": "^11.4.1",
+"@emotion/styled": "^11",
 "bootstrap-3-typeahead": "^4.0.2",
+"camelcase-keys": "^7.0.0",
 "codemirror": "^5.59.1",
 "d3": "^3.4.4",
 "d3-shape": "^2.1.0",
 "d3-tip": "^0.9.1",
 "dagre-d3": "^0.6.4",
 "datatables.net": "^1.10.23",
 "datatables.net-bs": "^1.10.23",
+"dayjs": "^1.10.6",
 "eonasdan-bootstrap-datetimepicker": "^4.17.47",
+"framer-motion": "^4",
 "jquery": ">=3.5.0",
 "jshint": "^2.12.0",
 "moment-timezone": "^0.5.28",
 "nvd3": "^1.8.6",
+"react": "^17.0.2",
+"react-dom": "^17.0.2",
+"react-icons": "^4.2.0",

Review comment:
   (I just picked that somewhat at random -- it may not actually do what we 
want or be the best option)








[GitHub] [airflow] ashb commented on a change in pull request #18675: New Tree View

2021-11-09 Thread GitBox


ashb commented on a change in pull request #18675:
URL: https://github.com/apache/airflow/pull/18675#discussion_r745710768



##
File path: airflow/www/package.json
##
@@ -56,19 +60,29 @@
 "webpack-manifest-plugin": "^2.2.0"
   },
   "dependencies": {
+"@chakra-ui/react": "^1.6.6",
+"@emotion/cache": "^11.4.0",
+"@emotion/react": "^11.4.1",
+"@emotion/styled": "^11",
 "bootstrap-3-typeahead": "^4.0.2",
+"camelcase-keys": "^7.0.0",
 "codemirror": "^5.59.1",
 "d3": "^3.4.4",
 "d3-shape": "^2.1.0",
 "d3-tip": "^0.9.1",
 "dagre-d3": "^0.6.4",
 "datatables.net": "^1.10.23",
 "datatables.net-bs": "^1.10.23",
+"dayjs": "^1.10.6",
 "eonasdan-bootstrap-datetimepicker": "^4.17.47",
+"framer-motion": "^4",
 "jquery": ">=3.5.0",
 "jshint": "^2.12.0",
 "moment-timezone": "^0.5.28",
 "nvd3": "^1.8.6",
+"react": "^17.0.2",
+"react-dom": "^17.0.2",
+"react-icons": "^4.2.0",

Review comment:
   Oh, we'll need to include the license for _anything_ that webpack pulls in and thus ship directly as part of our artefacts. That is... tedious.
   
   @bbovenzi Something like https://github.com/codepunkt/webpack-license-plugin 
might be what we need here - us trying to maintain it manually is just never 
going to work.








[GitHub] [airflow] ashb commented on a change in pull request #18675: New Tree View

2021-11-09 Thread GitBox


ashb commented on a change in pull request #18675:
URL: https://github.com/apache/airflow/pull/18675#discussion_r745710768



##
File path: airflow/www/package.json
##
@@ -56,19 +60,29 @@
 "webpack-manifest-plugin": "^2.2.0"
   },
   "dependencies": {
+"@chakra-ui/react": "^1.6.6",
+"@emotion/cache": "^11.4.0",
+"@emotion/react": "^11.4.1",
+"@emotion/styled": "^11",
 "bootstrap-3-typeahead": "^4.0.2",
+"camelcase-keys": "^7.0.0",
 "codemirror": "^5.59.1",
 "d3": "^3.4.4",
 "d3-shape": "^2.1.0",
 "d3-tip": "^0.9.1",
 "dagre-d3": "^0.6.4",
 "datatables.net": "^1.10.23",
 "datatables.net-bs": "^1.10.23",
+"dayjs": "^1.10.6",
 "eonasdan-bootstrap-datetimepicker": "^4.17.47",
+"framer-motion": "^4",
 "jquery": ">=3.5.0",
 "jshint": "^2.12.0",
 "moment-timezone": "^0.5.28",
 "nvd3": "^1.8.6",
+"react": "^17.0.2",
+"react-dom": "^17.0.2",
+"react-icons": "^4.2.0",

Review comment:
   Oh, we'll need to include the license for _anything_ that webpack pulls 
in. That is... sucky.
   
   @bbovenzi Something like https://github.com/codepunkt/webpack-license-plugin 
might be what we need here - us trying to maintain it manually is just never 
going to work.








[GitHub] [airflow] ashb closed issue #19397: Upgrading from 2.1.2 to 2.2.1 airflow db upgrade errors

2021-11-09 Thread GitBox


ashb closed issue #19397:
URL: https://github.com/apache/airflow/issues/19397


   






[GitHub] [airflow] ashb commented on issue #19397: Upgrading from 2.1.2 to 2.2.1 airflow db upgrade errors

2021-11-09 Thread GitBox


ashb commented on issue #19397:
URL: https://github.com/apache/airflow/issues/19397#issuecomment-964232816


   Just confirmed this works on MySQL with #19425 when going from 2.1.2 to the v2-2-test branch, so this will be fixed when 2.2.2 is out in a few days.






[GitHub] [airflow] github-actions[bot] commented on pull request #19482: [19458] Added column duration to DAG runs view

2021-11-09 Thread GitBox


github-actions[bot] commented on pull request #19482:
URL: https://github.com/apache/airflow/pull/19482#issuecomment-964225160


   The PR is likely OK to be merged with just subset of tests for default 
Python and Database versions without running the full matrix of tests, because 
it does not modify the core of Airflow. If the committers decide that the full 
tests matrix is needed, they will add the label 'full tests needed'. Then you 
should rebase to the latest main or amend the last commit of the PR, and push 
it with --force-with-lease.





