[GitHub] [airflow] ephraimbuddy commented on a change in pull request #15336: Fail task when containers inside a pod fails
ephraimbuddy commented on a change in pull request #15336: URL: https://github.com/apache/airflow/pull/15336#discussion_r612165116

## File path: tests/executors/test_kubernetes_executor.py

```diff
@@ -507,3 +506,113 @@ def test_process_status_catchall(self):
         self._run()
         self.watcher.watcher_queue.put.assert_not_called()
+
+    def test_container_status_of_terminating_fails_pod(self):
+        self.pod.status.phase = "Pending"
+        self.pod.status.container_statuses = [
+            k8s.V1ContainerStatus(
+                container_id=None,
+                image="apache/airflow:2.0.1-python3.8",
+                image_id="",
+                name="base",
+                ready="false",
+                restart_count=0,
+                state=k8s.V1ContainerState(
+                    terminated=k8s.V1ContainerStateTerminated(
+                        reason="Terminating", exit_code=1
```

Review comment: Hi Jed, I've changed the code to only fail the pod when there's an image pull error, as I was not able to reproduce the "Terminating" status while the pod is pending. To reproduce this issue, just change the `worker_container_tag` value to `201-python`; that reproduces the image pull error and tasks getting stuck in queued forever.

-- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: us...@infra.apache.org
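For readers following along, the behaviour under discussion can be sketched with a small standalone example. This is a hypothetical illustration, not the executor's actual code: plain dicts stand in for the `kubernetes` client models, and `should_fail_pending_pod` is an invented name.

```python
# Hypothetical sketch: a pending pod whose container is stuck on an image
# pull error should be treated as failed, so the task does not sit in
# "queued" forever. Dicts mimic V1ContainerStatus / V1ContainerState.
def should_fail_pending_pod(container_statuses):
    """Return True if any container is waiting due to an image pull error."""
    for status in container_statuses or []:
        waiting = (status.get("state") or {}).get("waiting") or {}
        if waiting.get("reason") in ("ErrImagePull", "ImagePullBackOff"):
            return True
    return False

statuses = [
    {
        "name": "base",
        "image": "apache/airflow:201-python",  # bad tag -> image pull error
        "state": {"waiting": {"reason": "ErrImagePull", "message": "not found"}},
    }
]
print(should_fail_pending_pod(statuses))  # True
print(should_fail_pending_pod([{"name": "base", "state": {"running": {}}}]))  # False
```

`ErrImagePull` and `ImagePullBackOff` are the waiting reasons Kubernetes reports for a bad image reference, which matches the scenario described in the comment.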
[GitHub] [airflow] patsevanton commented on issue #15340: helm install airflow in namespace get error: File "", line 32, in TimeoutError: There are still unapplied migrations after 60
patsevanton commented on issue #15340: URL: https://github.com/apache/airflow/issues/15340#issuecomment-818472351

**kubectl describe -n x pod airflow-scheduler-658d5d4454-r2sgl**
```
Name:         airflow-scheduler-658d5d4454-r2sgl
Namespace:    x
Priority:     0
Node:         ubuntu1804/192.168.22.7
Start Time:   Tue, 13 Apr 2021 05:54:59 +
Labels:       component=scheduler
              pod-template-hash=658d5d4454
              release=airflow
              tier=airflow
Annotations:  checksum/airflow-config: d84f720b402097e58a879efc896869845ec8bae56455470bf241221b2a016f19
              checksum/extra-configmaps: 2e44e493035e2f6a255d08f8104087ff10d30aef6f63176f1b18f75f73295598
              checksum/extra-secrets: bb91ef06ddc31c0c5a29973832163d8b0b597812a793ef911d33b622bc9d1655
              checksum/metadata-secret: a954626eab69d09b0c9bfd44128c793948c18d943d9e97431903985654b350c5
              checksum/pgbouncer-config-secret: da52bd1edfe820f0ddfacdebb20a4cc6407d296ee45bcb500a6407e2261a5ba2
              checksum/result-backend-secret: af25d110685219c9219e6a4f9b268566118a4b732de33192387a111d1f241c89
              cluster-autoscaler.kubernetes.io/safe-to-evict: true
Status:       Pending
IP:           10.1.78.6
IPs:
  IP:  10.1.78.6
Controlled By:  ReplicaSet/airflow-scheduler-658d5d4454
Init Containers:
  wait-for-airflow-migrations:
    Container ID:  containerd://ac2a25e781647e59aa341e5e308ebbef60408d69b1a2f6b5f2d83df808718ec2
    Image:         apache/airflow:2.0.0
    Image ID:      docker.io/apache/airflow@sha256:e973fef20d3be5b6ea328d2707ac87b90f680382790d1eb027bd7766699b2409
    Port:
    Host Port:
    Args:
      python
      -c
      import airflow
      import logging
      import os
      import time

      from alembic.config import Config
      from alembic.runtime.migration import MigrationContext
      from alembic.script import ScriptDirectory

      from airflow import settings

      package_dir = os.path.abspath(os.path.dirname(airflow.__file__))
      directory = os.path.join(package_dir, 'migrations')
      config = Config(os.path.join(package_dir, 'alembic.ini'))
      config.set_main_option('script_location', directory)
      config.set_main_option('sqlalchemy.url', settings.SQL_ALCHEMY_CONN.replace('%', '%%'))
      script_ = ScriptDirectory.from_config(config)

      timeout=60
      with settings.engine.connect() as connection:
          context = MigrationContext.configure(connection)
          ticker = 0
          while True:
              source_heads = set(script_.get_heads())
              db_heads = set(context.get_current_heads())
              if source_heads == db_heads:
                  break
              if ticker >= timeout:
                  raise TimeoutError("There are still unapplied migrations after {} seconds.".format(ticker))
              ticker += 1
              time.sleep(1)
              logging.info('Waiting for migrations... %s second(s)', ticker)

    State:          Waiting
      Reason:       CrashLoopBackOff
    Last State:     Terminated
      Reason:       Error
      Exit Code:    1
      Started:      Tue, 13 Apr 2021 06:15:15 +
      Finished:     Tue, 13 Apr 2021 06:16:24 +
    Ready:          False
    Restart Count:  7
    Environment:
      AIRFLOW__CORE__FERNET_KEY:        Optional: false
      AIRFLOW__CORE__SQL_ALCHEMY_CONN:  Optional: false
      AIRFLOW_CONN_AIRFLOW_DB:          Optional: false
    Mounts:
      /var/run/secrets/kubernetes.io/serviceaccount from airflow-scheduler-token-q6zfr (ro)
Containers:
  scheduler:
    Container ID:
    Image:         apache/airflow:2.0.0
    Image ID:
    Port:
    Host Port:
    Args:
      bash
      -c
      exec airflow scheduler
    State:          Waiting
      Reason:       PodInitializing
    Ready:          False
    Restart Count:  0
    Liveness:       exec [python -Wignore -c
                      import os
                      os.environ['AIRFLOW__CORE__LOGGING_LEVEL'] = 'ERROR'
                      os.environ['AIRFLOW__LOGGING__LOGGING_LEVEL'] = 'ERROR'

                      from airflow.jobs.scheduler_job import SchedulerJob
                      from airflow.utils.db import create_session
                      from airflow.utils.net import get_hostname
                      import sys

                      with create_session() as session:
                          job = session.query(SchedulerJob).filter_by(hostname=get_hostname()).order_by(
                              SchedulerJob.latest_heartbeat.desc()).limit(1).first()

                      sys.exit(0 if job.is_alive() else 1)
                    ] delay=10s timeout=5s period=30s #success=1 #failure=10
    Environment:
      AIRFLOW__CORE__FERNET_KEY:        Optional: false
      AIRFLOW__CORE__SQL_ALCHEMY_CONN:  Optional: false
```
[GitHub] [airflow] patsevanton commented on issue #15340: helm install airflow in namespace get error: File "", line 32, in TimeoutError: There are still unapplied migrations after 60
patsevanton commented on issue #15340: URL: https://github.com/apache/airflow/issues/15340#issuecomment-818469847

**kubectl logs -n x airflow-scheduler-658d5d4454-r2sgl**
`error: a container name must be specified for pod airflow-scheduler-658d5d4454-r2sgl, choose one of: [scheduler scheduler-gc] or one of the init containers: [wait-for-airflow-migrations]`
[GitHub] [airflow] patsevanton commented on issue #15340: helm install airflow in namespace get error: File "", line 32, in TimeoutError: There are still unapplied migrations after 60
patsevanton commented on issue #15340: URL: https://github.com/apache/airflow/issues/15340#issuecomment-818469574

**kubectl logs -n x airflow-postgresql-0**
```
postgresql 05:56:01.18
postgresql 05:56:01.18 Welcome to the Bitnami postgresql container
postgresql 05:56:01.18 Subscribe to project updates by watching https://github.com/bitnami/bitnami-docker-postgresql
postgresql 05:56:01.18 Submit issues and feature requests at https://github.com/bitnami/bitnami-docker-postgresql/issues
postgresql 05:56:01.18 Send us your feedback at contain...@bitnami.com
postgresql 05:56:01.19
postgresql 05:56:01.20 INFO  ==> ** Starting PostgreSQL setup **
postgresql 05:56:01.23 INFO  ==> Validating settings in POSTGRESQL_* env vars..
postgresql 05:56:01.24 INFO  ==> Loading custom pre-init scripts...
postgresql 05:56:01.24 INFO  ==> Initializing PostgreSQL database...
postgresql 05:56:01.25 INFO  ==> postgresql.conf file not detected. Generating it...
postgresql 05:56:01.25 INFO  ==> pg_hba.conf file not detected. Generating it...
postgresql 05:56:02.32 INFO  ==> Starting PostgreSQL in background...
postgresql 05:56:02.44 INFO  ==> Changing password of postgres
postgresql 05:56:02.45 INFO  ==> Configuring replication parameters
postgresql 05:56:02.47 INFO  ==> Configuring fsync
postgresql 05:56:02.47 INFO  ==> Loading custom scripts...
postgresql 05:56:02.48 INFO  ==> Enabling remote connections
postgresql 05:56:02.48 INFO  ==> Stopping PostgreSQL...
postgresql 05:56:03.49 INFO  ==> ** PostgreSQL setup finished! **
postgresql 05:56:03.52 INFO  ==> ** Starting PostgreSQL **
2021-04-13 05:56:03.537 GMT [1] LOG:  listening on IPv4 address "0.0.0.0", port 5432
2021-04-13 05:56:03.537 GMT [1] LOG:  listening on IPv6 address "::", port 5432
2021-04-13 05:56:03.556 GMT [1] LOG:  listening on Unix socket "/tmp/.s.PGSQL.5432"
2021-04-13 05:56:03.586 GMT [178] LOG:  database system was shut down at 2021-04-13 05:56:02 GMT
2021-04-13 05:56:03.596 GMT [1] LOG:  database system is ready to accept connections
2021-04-13 05:56:10.476 GMT [193] LOG:  incomplete startup packet
2021-04-13 05:56:12.106 GMT [194] LOG:  incomplete startup packet
2021-04-13 05:57:20.415 GMT [284] LOG:  incomplete startup packet
2021-04-13 05:57:22.399 GMT [286] LOG:  incomplete startup packet
2021-04-13 05:58:44.731 GMT [397] LOG:  incomplete startup packet
2021-04-13 05:58:45.741 GMT [398] LOG:  incomplete startup packet
2021-04-13 06:00:17.733 GMT [533] LOG:  incomplete startup packet
2021-04-13 06:00:18.752 GMT [534] LOG:  incomplete startup packet
2021-04-13 06:02:18.723 GMT [703] LOG:  incomplete startup packet
2021-04-13 06:02:21.740 GMT [714] LOG:  incomplete startup packet
2021-04-13 06:04:51.723 GMT [917] LOG:  incomplete startup packet
2021-04-13 06:05:01.784 GMT [933] LOG:  incomplete startup packet
2021-04-13 06:08:53.728 GMT [1248] LOG:  incomplete startup packet
2021-04-13 06:08:56.783 GMT [1256] LOG:  incomplete startup packet
2021-04-13 06:15:15.739 GMT [1773] LOG:  incomplete startup packet
2021-04-13 06:15:16.759 GMT [1780] LOG:  incomplete startup packet
```
[GitHub] [airflow] patsevanton commented on issue #15340: helm install airflow in namespace get error: File "", line 32, in TimeoutError: There are still unapplied migrations after 60
patsevanton commented on issue #15340: URL: https://github.com/apache/airflow/issues/15340#issuecomment-818469219

**kubectl logs -n sdpcc airflow-webserver-86857b5969-sqkv6**
`Error from server (BadRequest): container "webserver" in pod "airflow-webserver-86857b5969-sqkv6" is waiting to start: PodInitializing`
[GitHub] [airflow] patsevanton edited a comment on issue #15340: helm install airflow in namespace get error: File "", line 32, in TimeoutError: There are still unapplied migrations a
patsevanton edited a comment on issue #15340: URL: https://github.com/apache/airflow/issues/15340#issuecomment-818469219

**kubectl logs -n x airflow-webserver-86857b5969-sqkv6**
`Error from server (BadRequest): container "webserver" in pod "airflow-webserver-86857b5969-sqkv6" is waiting to start: PodInitializing`
[GitHub] [airflow] patsevanton opened a new issue #15340: helm install airflow in namespace get error: File "", line 32, in TimeoutError: There are still unapplied migrations after 60
patsevanton opened a new issue #15340: URL: https://github.com/apache/airflow/issues/15340

**Apache Airflow version**: master git

**Kubernetes version (if you are using kubernetes)** (use `kubectl version`):
```
Server Version: version.Info{Major:"1", Minor:"18", GitVersion:"v1.18.17",
```

**Environment**:
- **Cloud provider or hardware configuration**: Microk8s
- **OS** (e.g. from /etc/os-release): VERSION="18.04.3 LTS (Bionic Beaver)"
- **Kernel** (e.g. `uname -a`):
- **Install tools**:
- **Others**:

**What happened**:
```
git clone https://github.com/apache/airflow.git
cd airflow/chart/
helm dependency update
kubectl create namespace x
werf helm install --wait --set webserver.defaultUser.password=password,ingress.enabled=true,ingress.hosts[0]=airflow.192.168.22.7.xip.io --namespace x airflow ./
```

Log
```
│ ┌ deploy/airflow-webserver po/airflow-webserver-86857b5969-sqkv6 container/wait-for-airflow-migrations logs
│ │ [2021-04-13 05:57:20,571] {:35} INFO - Waiting for migrations... 60 second(s)
│ │ Traceback (most recent call last):
│ │   File "", line 32, in
│ │ TimeoutError: There are still unapplied migrations after 60 seconds.
│ └ deploy/airflow-webserver po/airflow-webserver-86857b5969-sqkv6 container/wait-for-airflow-migrations logs
```

Next line log
```
│ deploy/airflow-scheduler ERROR: po/airflow-scheduler-658d5d4454-r2sgl container/wait-for-airflow-migrations: CrashLoopBackOff: back-off 10s restarting failed container=wait-for-airflow-migrations ↵
│ pod=airflow-scheduler-658d5d4454-r2sgl_sdpcc(40e85057-2aa5-4e9e-a47d-e91530038c0c)
│ 1/1 allowed errors occurred for deploy/airflow-scheduler: continue tracking
```

Full log https://gist.github.com/patsevanton/0edd5571cf69aa539edcdb803c288061
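For context, the failing `wait-for-airflow-migrations` init container runs a small wait loop: it compares the Alembic revision heads shipped in the installed Airflow package against the heads recorded in the database, and raises `TimeoutError` after 60 seconds if they never match. Below is a dependency-free sketch of that loop; the real script uses `alembic` and `airflow.settings`, so the two callables here are stand-ins for illustration only.

```python
import time

def wait_for_migrations(get_source_heads, get_db_heads, timeout=60, sleep=time.sleep):
    """Poll until the code's migration heads match the database's, or time out.

    get_source_heads / get_db_heads are injected stand-ins for the Alembic
    ScriptDirectory and MigrationContext lookups the real init script performs.
    """
    ticker = 0
    while True:
        if set(get_source_heads()) == set(get_db_heads()):
            return ticker
        if ticker >= timeout:
            raise TimeoutError(
                "There are still unapplied migrations after {} seconds.".format(ticker)
            )
        ticker += 1
        sleep(1)

# With permanently mismatched heads the loop times out -- the error in this issue.
try:
    wait_for_migrations(lambda: {"abc123"}, lambda: set(), timeout=3, sleep=lambda s: None)
except TimeoutError as exc:
    print(exc)  # There are still unapplied migrations after 3 seconds.
```

In this issue the database was reachable but `airflow db upgrade` (the migration job) had evidently never completed, so the heads never converged and every pod with this init container crash-looped.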
[GitHub] [airflow] xinbinhuang commented on a change in pull request #14640: Allow ExternalTaskSensor to wait for taskgroup
xinbinhuang commented on a change in pull request #14640: URL: https://github.com/apache/airflow/pull/14640#discussion_r608782468

## File path: airflow/sensors/external_task.py

```diff
@@ -164,18 +184,23 @@ def poke(self, context, session=None):
         if self.failed_states:
             count_failed = self.get_count(dttm_filter, session, self.failed_states)
-            if count_failed == len(dttm_filter):
+            if count_failed > 0:
```

Review comment: I don't think this needs more than an entry in `UPDATING.md`. The only situation where you will have multiple counts is when `execution_date_fn` returns more than one execution date to wait for. However, the original behavior gets you into a weird state when only part of the TIs fail, i.e. one fails and one succeeds, resulting in a timeout. IMHO, this is more of a bug than intended behavior. WDYT?
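The partial-failure state described above can be shown with a toy model (this is not the sensor's real code, just the two conditions side by side): with `count_failed == len(dttm_filter)`, one failed plus one succeeded execution date satisfies neither the failure branch nor the all-succeeded branch, so the sensor keeps poking until it times out; with `count_failed > 0` it fails as soon as any waited-on run fails.

```python
# Toy comparison of the old and new failure conditions discussed above.
def old_should_fail(count_failed, num_dates):
    # fail only if EVERY waited-on execution date hit a failed state
    return count_failed == num_dates

def new_should_fail(count_failed, num_dates):
    # fail as soon as ANY waited-on execution date hit a failed state
    return count_failed > 0

# Two execution dates to wait for: one TI failed, one succeeded.
print(old_should_fail(1, 2))  # False -> sensor keeps poking, eventually times out
print(new_should_fail(1, 2))  # True  -> sensor fails immediately
```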
[GitHub] [airflow] jacobhjkim edited a comment on pull request #15266: Update Google Ads hook
jacobhjkim edited a comment on pull request #15266: URL: https://github.com/apache/airflow/pull/15266#issuecomment-818450741

> @jacobhjkim Is this a breaking change for anyone using v3 (or v4, if it exists)?

With the current Airflow release, the Google Ads SDK returns a deprecation error (not a warning) when you use `api_version=v3`, so I don't think this change will be breaking. People who are using `v4` won't be affected since they are already specifying `v4`.
[GitHub] [airflow] jacobhjkim commented on pull request #15266: Update Google Ads hook
jacobhjkim commented on pull request #15266: URL: https://github.com/apache/airflow/pull/15266#issuecomment-818450741

> @jacobhjkim Is this a breaking change for anyone using v3 (or v4, if it exists)?

With the current Airflow release, the Google Ads SDK returns a deprecation error (not a warning) when you use `api_version=v3`, so I don't think this change will be breaking.
[GitHub] [airflow] xinbinhuang edited a comment on pull request #15233: Update sqs.py
xinbinhuang edited a comment on pull request #15233: URL: https://github.com/apache/airflow/pull/15233#issuecomment-818435020

@apogre Can you also add some tests and fix the CI, thanks!
[GitHub] [airflow] xinbinhuang commented on pull request #15233: Update sqs.py
xinbinhuang commented on pull request #15233: URL: https://github.com/apache/airflow/pull/15233#issuecomment-818437152

As I understand it, `MessageAttributeNames` doesn't really help filter messages. Rather, it only selects which attributes are returned with the received messages. So if a message does not have the required attributes, it will still be returned, correct?
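The distinction raised here can be made concrete with a small simulation (this is not boto3 and not the sensor's code; `receive_simulated` is an invented name). Per the SQS documentation, `MessageAttributeNames` controls which message attributes are *included* on each returned message; it does not exclude messages that lack those attributes.

```python
# Simulate SQS ReceiveMessage attribute-name selection: every message is
# returned; the names only choose which attributes come back with each one.
def receive_simulated(messages, attribute_names):
    results = []
    for msg in messages:
        attrs = msg.get("MessageAttributes", {})
        if "All" in attribute_names or ".*" in attribute_names:
            selected = dict(attrs)  # include every attribute
        else:
            selected = {k: v for k, v in attrs.items() if k in attribute_names}
        results.append({"Body": msg["Body"], "MessageAttributes": selected})
    return results

msgs = [
    {"Body": "a", "MessageAttributes": {"env": {"StringValue": "prod"}}},
    {"Body": "b", "MessageAttributes": {}},  # no attributes at all
]
out = receive_simulated(msgs, ["env"])
print(len(out))  # 2 -- the attribute-less message is still returned
print(out[1]["MessageAttributes"])  # {}
```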
[GitHub] [airflow] xinbinhuang commented on pull request #15233: Update sqs.py
xinbinhuang commented on pull request #15233: URL: https://github.com/apache/airflow/pull/15233#issuecomment-818435020

@apogre Can you also add some tests, thanks!
[GitHub] [airflow] xinbinhuang commented on a change in pull request #15233: Update sqs.py
xinbinhuang commented on a change in pull request #15233: URL: https://github.com/apache/airflow/pull/15233#discussion_r612128944

## File path: airflow/providers/amazon/aws/sensors/sqs.py

```diff
@@ -50,13 +52,15 @@ def __init__(
         aws_conn_id: str = 'aws_default',
         max_messages: int = 5,
         wait_time_seconds: int = 1,
+        message_attribute_names: list = ['.*']
```

Review comment: Let's push it into the constructor:

```suggestion
        message_attribute_names: Optional[List[str]] = None
```

And then within the constructor:

```python
self.message_attribute_names = ['.*'] if message_attribute_names is None else message_attribute_names
```

The reason being Python has a gotcha around [mutable default arguments](https://docs.python-guide.org/writing/gotchas/#mutable-default-arguments).

## File path: airflow/providers/amazon/aws/sensors/sqs.py

```diff
@@ -38,6 +38,8 @@ class SQSSensor(BaseSensorOperator):
     :type max_messages: int
     :param wait_time_seconds: The time in seconds to wait for receiving messages (default: 1 second)
     :type wait_time_seconds: int
+    :param message_attribute_names: The message attribute names to filter receiving messages (default: All Messages)
+    :type wait_time_seconds: list
```

Review comment:

```suggestion
    :type message_attribute_names: list
```
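The gotcha motivating this suggestion: a mutable default like `['.*']` is created once, at function definition time, and shared across every call, so an in-place mutation by one call leaks into the next. The `None` sentinel idiom avoids that. A minimal demonstration (function names here are illustrative, not the sensor's API):

```python
def risky(names=['.*']):          # one list object shared by every call
    names.append('extra')
    return names

def safe(names=None):             # fresh list per call
    names = ['.*'] if names is None else names
    names.append('extra')
    return names

first, second = risky(), risky()
print(first is second)  # True -- both calls returned the same shared list
print(second)           # ['.*', 'extra', 'extra']  <- state leaked between calls

fresh_a, fresh_b = safe(), safe()
print(fresh_b)          # ['.*', 'extra']  <- no leak
```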
[GitHub] [airflow] thesuperzapper commented on issue #15259: Scheduler livenessprobe and k8s v1.20+
thesuperzapper commented on issue #15259: URL: https://github.com/apache/airflow/issues/15259#issuecomment-818403977

@kimyen while not explicitly stated in the docs of the `stable/airflow` chart, you can use `8.X.X` chart versions with an older airflow/python version.

For example, the values to use `airflow 2.0.1` with `python 3.6`:
```yaml
airflow:
  image:
    repository: apache/airflow
    tag: 2.0.1-python3.6
```

For example, the values to use `airflow 1.10.15` with `python 3.6`:
```yaml
airflow:
  # needed for airflow 1.10 to work
  legacyCommands: true
  image:
    repository: apache/airflow
    tag: 1.10.15-python3.6
```
[GitHub] [airflow] thesuperzapper edited a comment on issue #15259: Scheduler livenessprobe and k8s v1.20+
thesuperzapper edited a comment on issue #15259: URL: https://github.com/apache/airflow/issues/15259#issuecomment-818403977

@kimyen while not explicitly stated in the docs of the `stable/airflow` chart, you can use `8.X.X` chart versions with an older airflow/python version.

For example, the values to use `airflow 2.0.1` with `python 3.6`:
```yaml
airflow:
  image:
    repository: apache/airflow
    tag: 2.0.1-python3.6
```

For example, the values to use `airflow 1.10.15` with `python 3.6`:
```yaml
airflow:
  # needed for airflow 1.10 to work
  legacyCommands: true
  image:
    repository: apache/airflow
    tag: 1.10.15-python3.6
```

See the other values here: https://github.com/airflow-helm/charts/blob/main/charts/airflow/values.yaml
[GitHub] [airflow] jhtimmins commented on issue #15306: Support Serialized DAGs on CLI Commands
jhtimmins commented on issue #15306: URL: https://github.com/apache/airflow/issues/15306#issuecomment-818397269

@kaxil I'm not super familiar with DAG serialization, but I think this sounds reasonable. Are there any drawbacks to supporting this functionality?
[GitHub] [airflow] jhtimmins commented on issue #15318: Add CLI to delete roles
jhtimmins commented on issue #15318: URL: https://github.com/apache/airflow/issues/15318#issuecomment-818396684

@alexInhert can you share some info about why it's inconvenient to do this through the UI?
[GitHub] [airflow] jhtimmins commented on a change in pull request #15195: Fix dag sort in dag stats
jhtimmins commented on a change in pull request #15195: URL: https://github.com/apache/airflow/pull/15195#discussion_r612088046

## File path: airflow/utils/dag_processing.py

```diff
@@ -834,7 +834,7 @@ def _log_file_processing_stats(self, known_file_paths):
             rows.append((file_path, processor_pid, runtime, num_dags, num_errors, last_runtime, last_run))
         # Sort by longest last runtime. (Can't sort None values in python3)
```

Review comment: This seems to suggest that it's sorting the results by the length of each DAG's last runtime. Is your change sorting by the time of the last runtime? Or was it not sorting in the way the comment suggests, and you're just fixing it? If it's the former, can you update the comment to reflect the new sort order?
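The parenthetical in the quoted comment ("Can't sort None values in python3") refers to Python 3 raising `TypeError` when `None` is compared against a number during sorting. A common workaround, shown here purely as an illustration rather than the file's actual code, is a tuple sort key that pushes `None` entries to one end:

```python
# Last runtimes per file, where None means "never run yet".
runtimes = [12.5, None, 3.1, None, 47.0]

# sorted(runtimes, reverse=True) would raise TypeError in Python 3,
# because None cannot be compared with floats.
# Tuple key: non-None entries first (longest runtime first), None entries last.
ordered = sorted(runtimes, key=lambda r: (r is None, -(r or 0.0)))
print(ordered)  # [47.0, 12.5, 3.1, None, None]
```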
[airflow] tag nightly-master updated (5da8319 -> 1a85ba9)
This is an automated email from the ASF dual-hosted git repository.

github-bot pushed a change to tag nightly-master in repository https://gitbox.apache.org/repos/asf/airflow.git.

*** WARNING: tag nightly-master was modified! ***

from 5da8319 (commit)
  to 1a85ba9 (commit)

from 5da8319  Fix exception caused by missing keys in the ElasticSearch Record (#15163)
 add 62aa796  Chart: Add tests to check labels, kind and annotations (#15313)
 add aaa3bf6  Fix url generation for TriggerDagRunOperatorLink (#14990)
 add 16902d0  Ensure executors end method is called (#14085)
 add 1dfbb8d  Avoids error on pushing PROD image as cache (#15321)
 add da780fc  Fixes doc for SQSSensor (#15323)
 add b477072  Add links to new modules for deprecated modules (#15316)
 add d944f5a  Fix DAG last run link (#15327)
 add 925ef28  Adds description field in variable (#12413) (#15194)
 add 6f8ab9e  Remove python2 related handlings and dependencies (#15301)
 add 30c6300  Chart: Allow disabling `git-sync` for Webserver (#15314)
 add cb9b9b3  Fix `sendgrid` -> `google`. (#15334)
 add 18c5b8a  Standardize default fab perms (#14946)
 add 8b56629  Add Configurable LivenessProbe Values to Scheduler (#15333)
 add e4c0689  Fix Helm GitSync dag volume mount from pod-template-file (#15331)
 add 1a85ba9  Add dynamic connection fields to Azure Connection (#15159)

No new revisions were added by this update.
Summary of changes:
 .github/actions/cancel-workflow-runs               | 2 +-
 .../endpoints/role_and_permission_endpoint.py      | 12 +-
 airflow/api_connexion/endpoints/user_endpoint.py   | 4 +-
 airflow/config_templates/config.yml                | 3 +-
 airflow/config_templates/default_airflow.cfg       | 3 +-
 airflow/contrib/hooks/aws_athena_hook.py           | 2 +-
 airflow/contrib/hooks/aws_datasync_hook.py         | 2 +-
 airflow/contrib/hooks/aws_dynamodb_hook.py         | 2 +-
 airflow/contrib/hooks/aws_firehose_hook.py         | 2 +-
 airflow/contrib/hooks/aws_glue_catalog_hook.py     | 2 +-
 airflow/contrib/hooks/aws_hook.py                  | 7 +-
 airflow/contrib/hooks/aws_lambda_hook.py           | 5 +-
 airflow/contrib/hooks/aws_logs_hook.py             | 2 +-
 airflow/contrib/hooks/aws_sns_hook.py              | 2 +-
 airflow/contrib/hooks/aws_sqs_hook.py              | 2 +-
 .../contrib/hooks/azure_container_instance_hook.py | 6 +-
 .../contrib/hooks/azure_container_volume_hook.py   | 5 +-
 airflow/contrib/hooks/azure_cosmos_hook.py         | 2 +-
 airflow/contrib/hooks/azure_data_lake_hook.py      | 2 +-
 airflow/contrib/hooks/azure_fileshare_hook.py      | 2 +-
 airflow/contrib/hooks/bigquery_hook.py             | 2 +-
 airflow/contrib/hooks/cassandra_hook.py            | 2 +-
 airflow/contrib/hooks/cloudant_hook.py             | 2 +-
 airflow/contrib/hooks/databricks_hook.py           | 2 +-
 airflow/contrib/hooks/datadog_hook.py              | 2 +-
 airflow/contrib/hooks/datastore_hook.py            | 2 +-
 airflow/contrib/hooks/dingding_hook.py             | 2 +-
 airflow/contrib/hooks/discord_webhook_hook.py      | 2 +-
 airflow/contrib/hooks/emr_hook.py                  | 2 +-
 airflow/contrib/hooks/fs_hook.py                   | 2 +-
 airflow/contrib/hooks/ftp_hook.py                  | 2 +-
 airflow/contrib/hooks/gcp_api_base_hook.py         | 2 +-
 airflow/contrib/hooks/gcp_bigtable_hook.py         | 2 +-
 airflow/contrib/hooks/gcp_cloud_build_hook.py      | 2 +-
 airflow/contrib/hooks/gcp_compute_hook.py          | 4 +-
 airflow/contrib/hooks/gcp_container_hook.py        | 5 +-
 airflow/contrib/hooks/gcp_dataflow_hook.py         | 7 +-
 airflow/contrib/hooks/gcp_dataproc_hook.py         | 2 +-
 airflow/contrib/hooks/gcp_dlp_hook.py              | 2 +-
 airflow/contrib/hooks/gcp_function_hook.py         | 2 +-
 airflow/contrib/hooks/gcp_kms_hook.py              | 2 +-
 airflow/contrib/hooks/gcp_mlengine_hook.py         | 2 +-
 airflow/contrib/hooks/gcp_natural_language_hook.py | 2 +-
 airflow/contrib/hooks/gcp_pubsub_hook.py           | 2 +-
 airflow/contrib/hooks/gcp_spanner_hook.py          | 2 +-
 airflow/contrib/hooks/gcp_speech_to_text_hook.py   | 2 +-
 airflow/contrib/hooks/gcp_sql_hook.py              | 2 +-
 airflow/contrib/hooks/gcp_tasks_hook.py            | 2 +-
 airflow/contrib/hooks/gcp_text_to_speech_hook.py   | 2 +-
 airflow/contrib/hooks/gcp_translate_hook.py        | 2 +-
 .../contrib/hooks/gcp_video_intelligence_hook.py   | 2 +-
 airflow/contrib/hooks/gcp_vision_hook.py           | 2 +-
 airflow/contrib/hooks/gcs_hook.py                  | 2 +-
 airflow/contrib/hooks/gdrive_hook.py               | 2 +-
 airflow/contrib/hooks/grpc_hook.py                 | 2 +-
 airflow/contrib/hooks/imap_hook.py                 | 2 +-
 airflow/contrib/hooks/jenkins_hook.py              | 2 +-
 airflow/contrib/hooks/jira_ho
[GitHub] [airflow] github-actions[bot] commented on pull request #15266: Update Google Ads hook
github-actions[bot] commented on pull request #15266: URL: https://github.com/apache/airflow/pull/15266#issuecomment-818383483

The PR is likely OK to be merged with just a subset of tests for default Python and Database versions without running the full matrix of tests, because it does not modify the core of Airflow. If the committers decide that the full tests matrix is needed, they will add the label 'full tests needed'. Then you should rebase to the latest master or amend the last commit of the PR, and push it with --force-with-lease.
[GitHub] [airflow] jhtimmins commented on pull request #15266: Update Google Ads hook
jhtimmins commented on pull request #15266: URL: https://github.com/apache/airflow/pull/15266#issuecomment-818383219

@jacobhjkim Is this a breaking change for anyone using v3 (or v4, if it exists)?
[GitHub] [airflow] github-actions[bot] commented on pull request #15339: Updates provider release process
github-actions[bot] commented on pull request #15339: URL: https://github.com/apache/airflow/pull/15339#issuecomment-818381950

[The Workflow run](https://github.com/apache/airflow/actions/runs/743155658) is cancelling this PR. Building images for the PR has failed. Follow the workflow link to check the reason.
[GitHub] [airflow] jhtimmins commented on a change in pull request #15295: Prevent creating flask sessions on REST API requests
jhtimmins commented on a change in pull request #15295: URL: https://github.com/apache/airflow/pull/15295#discussion_r612078540

## File path: airflow/www/security.py

```diff
@@ -170,7 +190,11 @@ def __init__(self, appbuilder):
             if not view or not getattr(view, 'datamodel', None):
                 continue
             view.datamodel = CustomSQLAInterface(view.datamodel.obj)
+        app = self.appbuilder.get_app
         self.perms = None
+        # Custom cookie session interface
+        # Override to implement your custom cookie session interface
+        app.session_interface = DefaultSessionInterface()
```

Review comment: The security manager is already in "god class" territory, so I think we should avoid moving anything else into it. If we really want users to be able to customize everything (which I don't necessarily think is the case), we can easily allow customization of the init functions in other ways.
[airflow] branch constraints-master updated: Updating constraints. Build id:742954799
This is an automated email from the ASF dual-hosted git repository. github-bot pushed a commit to branch constraints-master in repository https://gitbox.apache.org/repos/asf/airflow.git The following commit(s) were added to refs/heads/constraints-master by this push: new d236ac6 Updating constraints. Build id:742954799 d236ac6 is described below commit d236ac64b4d68c0f629b69136ba9ef2a48b0775e Author: Automated GitHub Actions commit AuthorDate: Tue Apr 13 01:36:49 2021 + Updating constraints. Build id:742954799 This update in constraints is automatically committed by the CI 'constraints-push' step based on HEAD of 'refs/heads/master' in 'apache/airflow' with commit sha 1a85ba9e93d44601a322546e31814bd9ef11c125. All tests passed in this build so we determined we can push the updated constraints. See https://github.com/apache/airflow/blob/master/README.md#installing-from-pypi for details. --- constraints-3.6.txt | 46 ++-- constraints-3.7.txt | 50 constraints-3.8.txt | 50 constraints-no-providers-3.7.txt | 2 +- constraints-no-providers-3.8.txt | 2 +- 5 files changed, 75 insertions(+), 75 deletions(-) diff --git a/constraints-3.6.txt b/constraints-3.6.txt index 8e306c3..87047b9 100644 --- a/constraints-3.6.txt +++ b/constraints-3.6.txt @@ -41,49 +41,49 @@ alembic==1.5.8 amqp==2.6.1 analytics-python==1.2.9 ansiwrap==0.8.4 -apache-airflow-providers-airbyte==1.0.0rc1 -apache-airflow-providers-amazon==1.2.0 +apache-airflow-providers-airbyte==1.0.0 +apache-airflow-providers-amazon==1.3.0 apache-airflow-providers-apache-beam==1.0.1 apache-airflow-providers-apache-cassandra==1.0.1 apache-airflow-providers-apache-druid==1.1.0 apache-airflow-providers-apache-hdfs==1.0.1 -apache-airflow-providers-apache-hive==1.0.2 +apache-airflow-providers-apache-hive==1.0.3 apache-airflow-providers-apache-kylin==1.0.1 -apache-airflow-providers-apache-livy==1.0.1 +apache-airflow-providers-apache-livy==1.1.0 apache-airflow-providers-apache-pig==1.0.1 apache-airflow-providers-apache-pinot==1.0.1 
apache-airflow-providers-apache-spark==1.0.2 apache-airflow-providers-apache-sqoop==1.0.1 apache-airflow-providers-celery==1.0.1 apache-airflow-providers-cloudant==1.0.1 -apache-airflow-providers-cncf-kubernetes==1.0.2 +apache-airflow-providers-cncf-kubernetes==1.1.0 apache-airflow-providers-databricks==1.0.1 apache-airflow-providers-datadog==1.0.1 apache-airflow-providers-dingding==1.0.2 apache-airflow-providers-discord==1.0.1 -apache-airflow-providers-docker==1.0.2 +apache-airflow-providers-docker==1.1.0 apache-airflow-providers-elasticsearch==1.0.3 apache-airflow-providers-exasol==1.1.1 -apache-airflow-providers-facebook==1.0.1 +apache-airflow-providers-facebook==1.1.0 apache-airflow-providers-ftp==1.0.1 -apache-airflow-providers-google==2.1.0 -apache-airflow-providers-grpc==1.0.1 -apache-airflow-providers-hashicorp==1.0.1 +apache-airflow-providers-google==2.2.0 +apache-airflow-providers-grpc==1.1.0 +apache-airflow-providers-hashicorp==1.0.2 apache-airflow-providers-http==1.1.1 apache-airflow-providers-imap==1.0.1 apache-airflow-providers-jdbc==1.0.1 apache-airflow-providers-jenkins==1.1.0 apache-airflow-providers-jira==1.0.1 -apache-airflow-providers-microsoft-azure==1.2.0 +apache-airflow-providers-microsoft-azure==1.3.0 apache-airflow-providers-microsoft-mssql==1.0.1 -apache-airflow-providers-microsoft-winrm==1.0.1 +apache-airflow-providers-microsoft-winrm==1.1.0 apache-airflow-providers-mongo==1.0.1 -apache-airflow-providers-mysql==1.0.2 +apache-airflow-providers-mysql==1.1.0 apache-airflow-providers-neo4j==1.0.1 apache-airflow-providers-odbc==1.0.1 apache-airflow-providers-openfaas==1.1.1 -apache-airflow-providers-opsgenie==1.0.1 -apache-airflow-providers-oracle==1.0.1 +apache-airflow-providers-opsgenie==1.0.2 +apache-airflow-providers-oracle==1.1.0 apache-airflow-providers-pagerduty==1.0.1 apache-airflow-providers-papermill==1.0.2 apache-airflow-providers-plexus==1.0.1 @@ -91,19 +91,19 @@ apache-airflow-providers-postgres==1.0.1 
apache-airflow-providers-presto==1.0.2 apache-airflow-providers-qubole==1.0.2 apache-airflow-providers-redis==1.0.1 -apache-airflow-providers-salesforce==1.0.1 +apache-airflow-providers-salesforce==2.0.0 apache-airflow-providers-samba==1.0.1 apache-airflow-providers-segment==1.0.1 apache-airflow-providers-sendgrid==1.0.2 apache-airflow-providers-sftp==1.1.1 -apache-airflow-providers-singularity==1.0.1 +apache-airflow-providers-singularity==1.1.0 apache-airflow-providers-slack==3.0.0 -apache-airflow-providers-snowflake==1.1.1 +apache-airflow-providers-snowflake==1.2.0 apache-airflow-providers-sqlite==1.0.2 -apache-airflow-providers-ssh==1.2.0 +apache-airflow-providers-ssh==1.3.0 apache-airflow-providers-tableau==1.0.0 apache-airflow-providers-telegram==1.
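The diff above pins every provider to an exact version with `==`. As a quick illustration (not Airflow code — `parse_constraints` is a hypothetical helper), such a constraints file can be parsed into a mapping of pinned versions:

```python
import re

def parse_constraints(text):
    """Parse pip constraints lines like 'package==1.2.3' into a dict.

    Blank lines, comments, and anything that is not a simple '==' pin
    are skipped.
    """
    pins = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        m = re.match(r"^([A-Za-z0-9._-]+)==(\S+)$", line)
        if m:
            pins[m.group(1)] = m.group(2)
    return pins

sample = """\
apache-airflow-providers-amazon==1.3.0
apache-airflow-providers-google==2.2.0
# a comment line, ignored
"""
print(parse_constraints(sample)["apache-airflow-providers-amazon"])  # → 1.3.0
```

When installing, the same file is passed to pip via `--constraint`, as described in the README link in the commit message above.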
[GitHub] [airflow] potiuk edited a comment on pull request #15339: Updates provider release process
potiuk edited a comment on pull request #15339: URL: https://github.com/apache/airflow/pull/15339#issuecomment-818364588 Hey @ashb @kaxil - as discussed before, I updated the process of provider release to upload the SVN final releases to PyPI after renaming. I tested it on two occasions already and everything looks fine. -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: us...@infra.apache.org
[GitHub] [airflow] potiuk commented on pull request #15339: Updates provider release process
potiuk commented on pull request #15339: URL: https://github.com/apache/airflow/pull/15339#issuecomment-818364588 Hey @ashb @kaxil - as discussed before, I updated the process of provider release to upload the SVN final releases to PyPI after renaming. I tested it on two occasions already and everything looks fine. -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: us...@infra.apache.org
[GitHub] [airflow] potiuk opened a new pull request #15339: Updates provider release process
potiuk opened a new pull request #15339: URL: https://github.com/apache/airflow/pull/15339 Updates the provider release process with an approach where the final PyPI uploads are the same as the SVN uploads. This way we avoid re-building packages when we release the PyPI version, and the files uploaded to PyPI are the same as those stored in SVN. --- **^ Add meaningful description above** Read the **[Pull Request Guidelines](https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst#pull-request-guidelines)** for more information. In case of fundamental code change, Airflow Improvement Proposal ([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals)) is needed. In case of a new dependency, check compliance with the [ASF 3rd Party License Policy](https://www.apache.org/legal/resolved.html#category-x). In case of backwards incompatible changes please leave a note in [UPDATING.md](https://github.com/apache/airflow/blob/master/UPDATING.md). -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: us...@infra.apache.org
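The renaming step this PR relies on turns release-candidate artifacts into final ones by dropping the `rcN` suffix from the version in the filename. A minimal sketch of that rename (`final_name` is a hypothetical helper, not part of the actual release tooling):

```python
import re

def final_name(rc_filename):
    """Strip a trailing rcN suffix from the version component of a
    provider package filename (wheel or sdist), yielding the final
    release filename."""
    return re.sub(r"(\d+\.\d+\.\d+)rc\d+", r"\1", rc_filename)

print(final_name("apache_airflow_providers_presto-1.0.2rc1-py3-none-any.whl"))
# → apache_airflow_providers_presto-1.0.2-py3-none-any.whl
```

Because only the name changes, the bytes uploaded to PyPI stay identical to the release-candidate bytes voted on and stored in SVN.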
[GitHub] [airflow] github-actions[bot] commented on pull request #14152: Helm RBAC Best Practices
github-actions[bot] commented on pull request #14152: URL: https://github.com/apache/airflow/pull/14152#issuecomment-818354262 [The Workflow run](https://github.com/apache/airflow/actions/runs/743066973) is cancelling this PR. It has some failed jobs matching ^Pylint$,^Static checks,^Build docs$,^Spell check docs$,^Provider packages,^Checks: Helm tests$,^Test OpenAPI*. -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: us...@infra.apache.org
[GitHub] [airflow] jhtimmins commented on a change in pull request #15311: WIP: Sync DAG specific permissions when parsing
jhtimmins commented on a change in pull request #15311: URL: https://github.com/apache/airflow/pull/15311#discussion_r612046000 ## File path: airflow/www/security.py ## @@ -516,24 +515,25 @@ def _get_all_roles_with_permissions(self) -> Dict[str, Role]: def create_dag_specific_permissions(self) -> None: """ -Creates 'can_read' and 'can_edit' permissions for all active and paused DAGs. +Creates 'can_read' and 'can_edit' permissions for all active and paused DAGs, +along with any `access_control` permissions provided in the DAG. :return: None. """ perms = self.get_all_permissions() -rows = ( -self.get_session.query(models.DagModel.dag_id) -.filter(or_(models.DagModel.is_active, models.DagModel.is_paused)) Review comment: I believe `is_active` and `is_paused` are queried explicitly because a DAG could be in neither state, in which case we don't want to fetch it to create new permissions. IIRC this is for historical reasons: when a DAG is deleted, its DB record sticks around but is soft-deleted. We'll need to account for that when fetching results. ## File path: airflow/models/serialized_dag.py ## @@ -37,10 +37,25 @@ from airflow.utils import timezone from airflow.utils.session import provide_session from airflow.utils.sqlalchemy import UtcDateTime +from airflow.www.security import AirflowSecurityManager log = logging.getLogger(__name__) +class SimpleSecurityManager(AirflowSecurityManager): +"""Security Manager that doesn't need the whole flask app""" + +def __init__(self): # pylint: disable=super-init-not-called +self.session = None + +@property +def get_session(self): +return self.session + + +security_manager = SimpleSecurityManager() Review comment: Ok, after thinking more about this, I don't think we should extend the security manager into the `/airflow/models` directory. I'd much rather create a `sync-permissions` API endpoint if one doesn't exist, and hit that from the CLI via a separate HTTP request. 
-- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: us...@infra.apache.org
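The intent of the `or_(is_active, is_paused)` filter discussed in the review above can be sketched with plain dicts standing in for `DagModel` rows (illustrative only, not the real SQLAlchemy query):

```python
# Hypothetical stand-in rows for the DagModel table.
dags = [
    {"dag_id": "etl_daily",    "is_active": True,  "is_paused": False},
    {"dag_id": "paused_dag",   "is_active": False, "is_paused": True},
    {"dag_id": "soft_deleted", "is_active": False, "is_paused": False},
]

# Equivalent of .filter(or_(DagModel.is_active, DagModel.is_paused)):
# a soft-deleted DAG (neither active nor paused) is excluded, so no
# new permissions are created for it.
eligible = [d["dag_id"] for d in dags if d["is_active"] or d["is_paused"]]
print(eligible)  # → ['etl_daily', 'paused_dag']
```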
[GitHub] [airflow] jhtimmins commented on a change in pull request #15311: WIP: Sync DAG specific permissions when parsing
jhtimmins commented on a change in pull request #15311: URL: https://github.com/apache/airflow/pull/15311#discussion_r612039109 ## File path: airflow/models/serialized_dag.py ## @@ -37,10 +37,25 @@ from airflow.utils import timezone from airflow.utils.session import provide_session from airflow.utils.sqlalchemy import UtcDateTime +from airflow.www.security import AirflowSecurityManager log = logging.getLogger(__name__) +class SimpleSecurityManager(AirflowSecurityManager): Review comment: @jedcunningham Oof I need to think about this, because generally speaking we really don't want to extend the webserver-level controls into Airflow core. -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: us...@infra.apache.org
[airflow] tag providers-ssh/1.3.0 created (now 4e018a8)
This is an automated email from the ASF dual-hosted git repository. potiuk pushed a change to tag providers-ssh/1.3.0 in repository https://gitbox.apache.org/repos/asf/airflow.git. at 4e018a8 (commit) No new revisions were added by this update.
[airflow] tag providers-trino/1.0.0 created (now 4e018a8)
This is an automated email from the ASF dual-hosted git repository. potiuk pushed a change to tag providers-trino/1.0.0 in repository https://gitbox.apache.org/repos/asf/airflow.git. at 4e018a8 (commit) No new revisions were added by this update.
[airflow] tag providers-snowflake/1.2.0 created (now 4e018a8)
This is an automated email from the ASF dual-hosted git repository. potiuk pushed a change to tag providers-snowflake/1.2.0 in repository https://gitbox.apache.org/repos/asf/airflow.git. at 4e018a8 (commit) No new revisions were added by this update.
[airflow] tag providers-singularity/1.1.0 created (now 4e018a8)
This is an automated email from the ASF dual-hosted git repository. potiuk pushed a change to tag providers-singularity/1.1.0 in repository https://gitbox.apache.org/repos/asf/airflow.git. at 4e018a8 (commit) No new revisions were added by this update.
[airflow] tag providers-salesforce/2.0.0 created (now 4e018a8)
This is an automated email from the ASF dual-hosted git repository. potiuk pushed a change to tag providers-salesforce/2.0.0 in repository https://gitbox.apache.org/repos/asf/airflow.git. at 4e018a8 (commit) No new revisions were added by this update.
[airflow] tag providers-opsgenie/1.0.2 created (now 4e018a8)
This is an automated email from the ASF dual-hosted git repository. potiuk pushed a change to tag providers-opsgenie/1.0.2 in repository https://gitbox.apache.org/repos/asf/airflow.git. at 4e018a8 (commit) No new revisions were added by this update.
[airflow] tag providers-mysql/1.1.0 created (now 4e018a8)
This is an automated email from the ASF dual-hosted git repository. potiuk pushed a change to tag providers-mysql/1.1.0 in repository https://gitbox.apache.org/repos/asf/airflow.git. at 4e018a8 (commit) No new revisions were added by this update.
[airflow] tag providers-oracle/1.1.0 created (now 4e018a8)
This is an automated email from the ASF dual-hosted git repository. potiuk pushed a change to tag providers-oracle/1.1.0 in repository https://gitbox.apache.org/repos/asf/airflow.git. at 4e018a8 (commit) No new revisions were added by this update.
[airflow] tag providers-microsoft-winrm/1.1.0 created (now 4e018a8)
This is an automated email from the ASF dual-hosted git repository. potiuk pushed a change to tag providers-microsoft-winrm/1.1.0 in repository https://gitbox.apache.org/repos/asf/airflow.git. at 4e018a8 (commit) No new revisions were added by this update.
[airflow] tag providers-hashicorp/1.0.2 created (now 4e018a8)
This is an automated email from the ASF dual-hosted git repository. potiuk pushed a change to tag providers-hashicorp/1.0.2 in repository https://gitbox.apache.org/repos/asf/airflow.git. at 4e018a8 (commit) No new revisions were added by this update.
[airflow] tag providers-microsoft-azure/1.3.0 created (now 4e018a8)
This is an automated email from the ASF dual-hosted git repository. potiuk pushed a change to tag providers-microsoft-azure/1.3.0 in repository https://gitbox.apache.org/repos/asf/airflow.git. at 4e018a8 (commit) No new revisions were added by this update.
[airflow] tag providers-grpc/1.1.0 created (now 4e018a8)
This is an automated email from the ASF dual-hosted git repository. potiuk pushed a change to tag providers-grpc/1.1.0 in repository https://gitbox.apache.org/repos/asf/airflow.git. at 4e018a8 (commit) No new revisions were added by this update.
[airflow] tag providers-google/2.2.0 created (now 4e018a8)
This is an automated email from the ASF dual-hosted git repository. potiuk pushed a change to tag providers-google/2.2.0 in repository https://gitbox.apache.org/repos/asf/airflow.git. at 4e018a8 (commit) No new revisions were added by this update.
[airflow] tag providers-docker/1.1.0 created (now 4e018a8)
This is an automated email from the ASF dual-hosted git repository. potiuk pushed a change to tag providers-docker/1.1.0 in repository https://gitbox.apache.org/repos/asf/airflow.git. at 4e018a8 (commit) No new revisions were added by this update.
[GitHub] [airflow] jhtimmins commented on a change in pull request #15311: WIP: Sync DAG specific permissions when parsing
jhtimmins commented on a change in pull request #15311: URL: https://github.com/apache/airflow/pull/15311#discussion_r612039109 ## File path: airflow/models/serialized_dag.py ## @@ -37,10 +37,25 @@ from airflow.utils import timezone from airflow.utils.session import provide_session from airflow.utils.sqlalchemy import UtcDateTime +from airflow.www.security import AirflowSecurityManager log = logging.getLogger(__name__) +class SimpleSecurityManager(AirflowSecurityManager): Review comment: @jedcunningham Oof I need to think about this, because generally speaking we really don't want to extend the webserver-level controls into Airflow core. The ability to define access controls inside the DAG is a huge antipattern imo. -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: us...@infra.apache.org
[airflow] tag providers-facebook/1.1.0 created (now 4e018a8)
This is an automated email from the ASF dual-hosted git repository. potiuk pushed a change to tag providers-facebook/1.1.0 in repository https://gitbox.apache.org/repos/asf/airflow.git. at 4e018a8 (commit) No new revisions were added by this update.
[airflow] tag providers-cncf-kubernetes/1.1.0 created (now 4e018a8)
This is an automated email from the ASF dual-hosted git repository. potiuk pushed a change to tag providers-cncf-kubernetes/1.1.0 in repository https://gitbox.apache.org/repos/asf/airflow.git. at 4e018a8 (commit) No new revisions were added by this update.
[airflow] tag providers-apache-livy/1.1.0 created (now 4e018a8)
This is an automated email from the ASF dual-hosted git repository. potiuk pushed a change to tag providers-apache-livy/1.1.0 in repository https://gitbox.apache.org/repos/asf/airflow.git. at 4e018a8 (commit) No new revisions were added by this update.
[airflow] tag providers-apache-hive/1.0.3 created (now 4e018a8)
This is an automated email from the ASF dual-hosted git repository. potiuk pushed a change to tag providers-apache-hive/1.0.3 in repository https://gitbox.apache.org/repos/asf/airflow.git. at 4e018a8 (commit) No new revisions were added by this update.
[airflow] tag providers-amazon/1.3.0 created (now 4e018a8)
This is an automated email from the ASF dual-hosted git repository. potiuk pushed a change to tag providers-amazon/1.3.0 in repository https://gitbox.apache.org/repos/asf/airflow.git. at 4e018a8 (commit) No new revisions were added by this update.
[airflow] tag providers-airbyte/1.0.0 created (now 4e018a8)
This is an automated email from the ASF dual-hosted git repository. potiuk pushed a change to tag providers-airbyte/1.0.0 in repository https://gitbox.apache.org/repos/asf/airflow.git. at 4e018a8 (commit) No new revisions were added by this update.
[GitHub] [airflow-site] potiuk merged pull request #402: Add documentation for packages - 2021-04-07
potiuk merged pull request #402: URL: https://github.com/apache/airflow-site/pull/402 -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: us...@infra.apache.org
[jira] [Commented] (AIRFLOW-6786) Adding KafkaConsumerHook, KafkaProducerHook, and KafkaSensor
[ https://issues.apache.org/jira/browse/AIRFLOW-6786?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17319784#comment-17319784 ] ASF GitHub Bot commented on AIRFLOW-6786: - github-actions[bot] closed pull request #12388: URL: https://github.com/apache/airflow/pull/12388 -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: us...@infra.apache.org > Adding KafkaConsumerHook, KafkaProducerHook, and KafkaSensor > > > Key: AIRFLOW-6786 > URL: https://issues.apache.org/jira/browse/AIRFLOW-6786 > Project: Apache Airflow > Issue Type: New Feature > Components: contrib, hooks >Affects Versions: 1.10.9 >Reporter: Daniel Ferguson >Assignee: Daniel Ferguson >Priority: Minor > > Add the KafkaProducerHook. > Add the KafkaConsumerHook. > Add the KafkaSensor which listens to messages with a specific topic. > Related Issue: > #1311 (Pre-dates Jira Migration) > Reminder to contributors: > You must add an Apache License header to all new files > Please squash your commits when possible and follow the 7 rules of good Git > commits > I am new to the community, I am not sure the files are at the right place or > missing anything. > The sensor could be used as the first node of a dag where the second node can > be a TriggerDagRunOperator. The messages are polled in a batch and the dag > runs are dynamically generated. > Thanks! > Note, as per denied PR [#1415|https://github.com/apache/airflow/pull/1415], > it is important to mention these integrations are not suitable for > low-latency/high-throughput/streaming. For reference, [#1415 > (comment)|https://github.com/apache/airflow/pull/1415#issuecomment-484429806]. 
> Co-authored-by: Dan Ferguson > [dferguson...@gmail.com|mailto:dferguson...@gmail.com] > Co-authored-by: YuanfΞi Zhu -- This message was sent by Atlassian Jira (v8.3.4#803005)
[GitHub] [airflow] github-actions[bot] commented on pull request #11260: Add DAG permissions based on DAG tags
github-actions[bot] commented on pull request #11260: URL: https://github.com/apache/airflow/pull/11260#issuecomment-818324106 This pull request has been automatically marked as stale because it has not had recent activity. It will be closed in 5 days if no further activity occurs. Thank you for your contributions. -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: us...@infra.apache.org
[GitHub] [airflow] github-actions[bot] closed pull request #12388: [AIRFLOW-6786] Added Kafka components, 3rd time's the charm
github-actions[bot] closed pull request #12388: URL: https://github.com/apache/airflow/pull/12388 -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: us...@infra.apache.org
[GitHub] [airflow] github-actions[bot] closed issue #13393: Airflow 1.10 -> 2.0 DB upgrade broke
github-actions[bot] closed issue #13393: URL: https://github.com/apache/airflow/issues/13393 -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: us...@infra.apache.org
[GitHub] [airflow] github-actions[bot] commented on issue #13393: Airflow 1.10 -> 2.0 DB upgrade broke
github-actions[bot] commented on issue #13393: URL: https://github.com/apache/airflow/issues/13393#issuecomment-818324060 This issue has been closed because it has not received response from the issue author. -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: us...@infra.apache.org
svn commit: r47021 - /release/airflow/providers/
Author: potiuk Date: Mon Apr 12 23:57:51 2021 New Revision: 47021 Log: Proper provider versions added Modified: release/airflow/providers/apache-airflow-providers-salesforce-2.0.0.tar.gz release/airflow/providers/apache-airflow-providers-salesforce-2.0.0.tar.gz.asc release/airflow/providers/apache-airflow-providers-salesforce-2.0.0.tar.gz.sha512 release/airflow/providers/apache_airflow_providers_airbyte-1.0.0-py3-none-any.whl release/airflow/providers/apache_airflow_providers_airbyte-1.0.0-py3-none-any.whl.asc release/airflow/providers/apache_airflow_providers_airbyte-1.0.0-py3-none-any.whl.sha512 release/airflow/providers/apache_airflow_providers_amazon-1.3.0-py3-none-any.whl release/airflow/providers/apache_airflow_providers_amazon-1.3.0-py3-none-any.whl.asc release/airflow/providers/apache_airflow_providers_amazon-1.3.0-py3-none-any.whl.sha512 release/airflow/providers/apache_airflow_providers_apache_hive-1.0.3-py3-none-any.whl release/airflow/providers/apache_airflow_providers_apache_hive-1.0.3-py3-none-any.whl.asc release/airflow/providers/apache_airflow_providers_apache_hive-1.0.3-py3-none-any.whl.sha512 release/airflow/providers/apache_airflow_providers_apache_livy-1.1.0-py3-none-any.whl release/airflow/providers/apache_airflow_providers_apache_livy-1.1.0-py3-none-any.whl.asc release/airflow/providers/apache_airflow_providers_apache_livy-1.1.0-py3-none-any.whl.sha512 release/airflow/providers/apache_airflow_providers_cncf_kubernetes-1.1.0-py3-none-any.whl release/airflow/providers/apache_airflow_providers_cncf_kubernetes-1.1.0-py3-none-any.whl.asc release/airflow/providers/apache_airflow_providers_cncf_kubernetes-1.1.0-py3-none-any.whl.sha512 release/airflow/providers/apache_airflow_providers_docker-1.1.0-py3-none-any.whl release/airflow/providers/apache_airflow_providers_docker-1.1.0-py3-none-any.whl.asc release/airflow/providers/apache_airflow_providers_docker-1.1.0-py3-none-any.whl.sha512 
release/airflow/providers/apache_airflow_providers_facebook-1.1.0-py3-none-any.whl release/airflow/providers/apache_airflow_providers_facebook-1.1.0-py3-none-any.whl.asc release/airflow/providers/apache_airflow_providers_facebook-1.1.0-py3-none-any.whl.sha512 release/airflow/providers/apache_airflow_providers_google-2.2.0-py3-none-any.whl release/airflow/providers/apache_airflow_providers_google-2.2.0-py3-none-any.whl.asc release/airflow/providers/apache_airflow_providers_google-2.2.0-py3-none-any.whl.sha512 release/airflow/providers/apache_airflow_providers_grpc-1.1.0-py3-none-any.whl release/airflow/providers/apache_airflow_providers_grpc-1.1.0-py3-none-any.whl.asc release/airflow/providers/apache_airflow_providers_grpc-1.1.0-py3-none-any.whl.sha512 release/airflow/providers/apache_airflow_providers_hashicorp-1.0.2-py3-none-any.whl release/airflow/providers/apache_airflow_providers_hashicorp-1.0.2-py3-none-any.whl.asc release/airflow/providers/apache_airflow_providers_hashicorp-1.0.2-py3-none-any.whl.sha512 release/airflow/providers/apache_airflow_providers_microsoft_azure-1.3.0-py3-none-any.whl release/airflow/providers/apache_airflow_providers_microsoft_azure-1.3.0-py3-none-any.whl.asc release/airflow/providers/apache_airflow_providers_microsoft_azure-1.3.0-py3-none-any.whl.sha512 release/airflow/providers/apache_airflow_providers_microsoft_winrm-1.1.0-py3-none-any.whl release/airflow/providers/apache_airflow_providers_microsoft_winrm-1.1.0-py3-none-any.whl.asc release/airflow/providers/apache_airflow_providers_microsoft_winrm-1.1.0-py3-none-any.whl.sha512 release/airflow/providers/apache_airflow_providers_mysql-1.1.0-py3-none-any.whl release/airflow/providers/apache_airflow_providers_mysql-1.1.0-py3-none-any.whl.asc release/airflow/providers/apache_airflow_providers_mysql-1.1.0-py3-none-any.whl.sha512 release/airflow/providers/apache_airflow_providers_opsgenie-1.0.2-py3-none-any.whl 
release/airflow/providers/apache_airflow_providers_opsgenie-1.0.2-py3-none-any.whl.asc release/airflow/providers/apache_airflow_providers_opsgenie-1.0.2-py3-none-any.whl.sha512 release/airflow/providers/apache_airflow_providers_oracle-1.1.0-py3-none-any.whl release/airflow/providers/apache_airflow_providers_oracle-1.1.0-py3-none-any.whl.asc release/airflow/providers/apache_airflow_providers_oracle-1.1.0-py3-none-any.whl.sha512 release/airflow/providers/apache_airflow_providers_salesforce-2.0.0-py3-none-any.whl release/airflow/providers/apache_airflow_providers_salesforce-2.0.0-py3-none-any.whl.asc release/airflow/providers/apache_airflow_providers_salesforce-2.0.0-py3-none-any.whl.sha512 release/airflow/providers/apache_airflow_providers_singularity-1.1.0-py3-none-any.whl release/airflow/providers/apache_airflow_providers_singularity-1.1.0-py3-none-any.whl.asc re
svn commit: r47020 - /release/airflow/providers/
Author: potiuk Date: Mon Apr 12 23:55:55 2021 New Revision: 47020 Log: remove old provider releases Removed: release/airflow/providers/apache_airflow_providers_amazon-1.1.0-py3-none-any.whl release/airflow/providers/apache_airflow_providers_amazon-1.1.0-py3-none-any.whl.asc release/airflow/providers/apache_airflow_providers_amazon-1.1.0-py3-none-any.whl.sha512 release/airflow/providers/apache_airflow_providers_amazon-1.2.0-py3-none-any.whl release/airflow/providers/apache_airflow_providers_amazon-1.2.0-py3-none-any.whl.asc release/airflow/providers/apache_airflow_providers_amazon-1.2.0-py3-none-any.whl.sha512 release/airflow/providers/apache_airflow_providers_apache_beam-1.0.0-py3-none-any.whl.asc release/airflow/providers/apache_airflow_providers_apache_beam-1.0.0-py3-none-any.whl.sha512 release/airflow/providers/apache_airflow_providers_apache_druid-1.0.1-py3-none-any.whl.asc release/airflow/providers/apache_airflow_providers_apache_druid-1.0.1-py3-none-any.whl.sha512 release/airflow/providers/apache_airflow_providers_apache_hive-1.0.1-py3-none-any.whl.asc release/airflow/providers/apache_airflow_providers_apache_hive-1.0.1-py3-none-any.whl.sha512 release/airflow/providers/apache_airflow_providers_apache_hive-1.0.2-py3-none-any.whl release/airflow/providers/apache_airflow_providers_apache_hive-1.0.2-py3-none-any.whl.asc release/airflow/providers/apache_airflow_providers_apache_hive-1.0.2-py3-none-any.whl.sha512 release/airflow/providers/apache_airflow_providers_apache_livy-1.0.1-py3-none-any.whl release/airflow/providers/apache_airflow_providers_apache_livy-1.0.1-py3-none-any.whl.asc release/airflow/providers/apache_airflow_providers_apache_livy-1.0.1-py3-none-any.whl.sha512 release/airflow/providers/apache_airflow_providers_apache_spark-1.0.1-py3-none-any.whl release/airflow/providers/apache_airflow_providers_apache_spark-1.0.1-py3-none-any.whl.asc release/airflow/providers/apache_airflow_providers_apache_spark-1.0.1-py3-none-any.whl.sha512 
release/airflow/providers/apache_airflow_providers_cncf_kubernetes-1.0.1-py3-none-any.whl release/airflow/providers/apache_airflow_providers_cncf_kubernetes-1.0.1-py3-none-any.whl.asc release/airflow/providers/apache_airflow_providers_cncf_kubernetes-1.0.1-py3-none-any.whl.sha512 release/airflow/providers/apache_airflow_providers_cncf_kubernetes-1.0.2-py3-none-any.whl release/airflow/providers/apache_airflow_providers_cncf_kubernetes-1.0.2-py3-none-any.whl.asc release/airflow/providers/apache_airflow_providers_cncf_kubernetes-1.0.2-py3-none-any.whl.sha512 release/airflow/providers/apache_airflow_providers_dingding-1.0.1-py3-none-any.whl release/airflow/providers/apache_airflow_providers_dingding-1.0.1-py3-none-any.whl.asc release/airflow/providers/apache_airflow_providers_dingding-1.0.1-py3-none-any.whl.sha512 release/airflow/providers/apache_airflow_providers_docker-1.0.1-py3-none-any.whl release/airflow/providers/apache_airflow_providers_docker-1.0.1-py3-none-any.whl.asc release/airflow/providers/apache_airflow_providers_docker-1.0.1-py3-none-any.whl.sha512 release/airflow/providers/apache_airflow_providers_docker-1.0.2-py3-none-any.whl release/airflow/providers/apache_airflow_providers_docker-1.0.2-py3-none-any.whl.asc release/airflow/providers/apache_airflow_providers_docker-1.0.2-py3-none-any.whl.sha512 release/airflow/providers/apache_airflow_providers_exasol-1.1.0-py3-none-any.whl release/airflow/providers/apache_airflow_providers_exasol-1.1.0-py3-none-any.whl.asc release/airflow/providers/apache_airflow_providers_exasol-1.1.0-py3-none-any.whl.sha512 release/airflow/providers/apache_airflow_providers_facebook-1.0.1-py3-none-any.whl release/airflow/providers/apache_airflow_providers_facebook-1.0.1-py3-none-any.whl.asc release/airflow/providers/apache_airflow_providers_facebook-1.0.1-py3-none-any.whl.sha512 release/airflow/providers/apache_airflow_providers_google-2.0.0-py3-none-any.whl 
release/airflow/providers/apache_airflow_providers_google-2.0.0-py3-none-any.whl.asc release/airflow/providers/apache_airflow_providers_google-2.0.0-py3-none-any.whl.sha512 release/airflow/providers/apache_airflow_providers_google-2.1.0-py3-none-any.whl release/airflow/providers/apache_airflow_providers_google-2.1.0-py3-none-any.whl.asc release/airflow/providers/apache_airflow_providers_google-2.1.0-py3-none-any.whl.sha512 release/airflow/providers/apache_airflow_providers_grpc-1.0.1-py3-none-any.whl release/airflow/providers/apache_airflow_providers_grpc-1.0.1-py3-none-any.whl.asc release/airflow/providers/apache_airflow_providers_grpc-1.0.1-py3-none-any.whl.sha512 release/airflow/providers/apache_airflow_providers_hashicorp-1.0.1-py3-none-any.whl release/airflow/providers/apache_airflow_providers_hashicorp-1.0.1-py3
svn commit: r47019 [2/2] - /release/airflow/providers/
Added: release/airflow/providers/apache_airflow_providers_presto-1.0.2-py3-none-any.whl.asc == --- release/airflow/providers/apache_airflow_providers_presto-1.0.2-py3-none-any.whl.asc (added) +++ release/airflow/providers/apache_airflow_providers_presto-1.0.2-py3-none-any.whl.asc Mon Apr 12 23:55:13 2021 @@ -0,0 +1,11 @@ +-BEGIN PGP SIGNATURE- + +iQFGBAABCgAwFiEEXBQ//T0ZRV5n99CUq9WryVrDFVYFAmA6XF0SHHBvdGl1a0Bh +cGFjaGUub3JnAAoJEKvVq8lawxVWea8IALKO44mezoh+7q9NJaIV2UmreWYZqRuc +hTTDIBh2iMKAIBldWjD7H4k2bt1yF5nyjgLETUJ74d2QHQ1mLTVtt+qmkcyJYH1u +Sh1+VYS60WatafoylxBAiKD0rsHaCdInxpPZlthFO41zfUhEbd9DIxW7IYvqa3B1 +bPhyWiTgXZWvteCBd8HUthjP9jW6ufWq8DOyoHhTguBByCLAg1dlWDwg+LoS1ki8 +HvSFPJINLfzgcybZDIqOq0jW0yZJh48dsQCwYB5K9T36Y/FV3ft5K276Tkh3ToDI +W76GDRhmHupxBG6llM137YwxEeI5T//lt9AbKyFCEFeEl8/nTB75FhA= +=KmCE +-END PGP SIGNATURE- Added: release/airflow/providers/apache_airflow_providers_presto-1.0.2-py3-none-any.whl.sha512 == --- release/airflow/providers/apache_airflow_providers_presto-1.0.2-py3-none-any.whl.sha512 (added) +++ release/airflow/providers/apache_airflow_providers_presto-1.0.2-py3-none-any.whl.sha512 Mon Apr 12 23:55:13 2021 @@ -0,0 +1 @@ +e33a865a2e147477228a59bf44b080e4b9f9cbbcb15a64320c69ebdd47693db897b5b971b08fe4ce92ac6baf4ea76c8971acfceae14aceb33c1b34bf1ae92a36 apache_airflow_providers_presto-1.0.2rc1-py3-none-any.whl Added: release/airflow/providers/apache_airflow_providers_qubole-1.0.2-py3-none-any.whl == Binary file - no diff available. 
Propchange: release/airflow/providers/apache_airflow_providers_qubole-1.0.2-py3-none-any.whl -- svn:mime-type = application/octet-stream Added: release/airflow/providers/apache_airflow_providers_qubole-1.0.2-py3-none-any.whl.asc == --- release/airflow/providers/apache_airflow_providers_qubole-1.0.2-py3-none-any.whl.asc (added) +++ release/airflow/providers/apache_airflow_providers_qubole-1.0.2-py3-none-any.whl.asc Mon Apr 12 23:55:13 2021 @@ -0,0 +1,11 @@ +-BEGIN PGP SIGNATURE- + +iQFGBAABCgAwFiEEXBQ//T0ZRV5n99CUq9WryVrDFVYFAmA6XF4SHHBvdGl1a0Bh +cGFjaGUub3JnAAoJEKvVq8lawxVWhlAIALG2PgEt+vpMh4HFY7oEZpZXnFST0DEn +BkunloQTVwg0fJeFkLghhOAHq1Zh3qL4mFHGw7bTT0ZgO/KqocU0iOc+y9CIIyB3 +D1WzuK2rTJV5NvxF8TKXQn17BMQ5ZcDjrvaPkP/cHrxKdaWBN+gVc1EaunYfVcJ1 +jc3uQBJdVY29w2w4bxaujkxTySVcuoRCKXb1cL4mRqKsjWzrZ8eJjzYHh2jUu7Dv +BkpFk6+yaeFf3CW9nJL+b0naVJQnR+xZlsdgh7e6HKwYa5sRjUhrChhMWSDM0g2t +WBvVrFy92TNSoXwS4pTxlgLyUISGEXz4ZvLCDmUaMe1HunqLyHwG2yY= +=wgJY +-END PGP SIGNATURE- Added: release/airflow/providers/apache_airflow_providers_qubole-1.0.2-py3-none-any.whl.sha512 == --- release/airflow/providers/apache_airflow_providers_qubole-1.0.2-py3-none-any.whl.sha512 (added) +++ release/airflow/providers/apache_airflow_providers_qubole-1.0.2-py3-none-any.whl.sha512 Mon Apr 12 23:55:13 2021 @@ -0,0 +1 @@ +8d48a512deb97c5927f5fe80e2ce47a498164425e55e587d472e10c75da42e918a1b831dd8b0e87d4d2f97b2475ad6bd8b4c80b642e7a900c4f3c451e0c97776 apache_airflow_providers_qubole-1.0.2rc1-py3-none-any.whl Added: release/airflow/providers/apache_airflow_providers_sendgrid-1.0.2-py3-none-any.whl == Binary file - no diff available. 
Propchange: release/airflow/providers/apache_airflow_providers_sendgrid-1.0.2-py3-none-any.whl -- svn:mime-type = application/octet-stream Added: release/airflow/providers/apache_airflow_providers_sendgrid-1.0.2-py3-none-any.whl.asc == --- release/airflow/providers/apache_airflow_providers_sendgrid-1.0.2-py3-none-any.whl.asc (added) +++ release/airflow/providers/apache_airflow_providers_sendgrid-1.0.2-py3-none-any.whl.asc Mon Apr 12 23:55:13 2021 @@ -0,0 +1,11 @@ +-BEGIN PGP SIGNATURE- + +iQFGBAABCgAwFiEEXBQ//T0ZRV5n99CUq9WryVrDFVYFAmA6XF4SHHBvdGl1a0Bh +cGFjaGUub3JnAAoJEKvVq8lawxVWz68H/1LtxzRPa/VPzhLioGwS7d85RGsbs9R3 +H9KLTQoCliTFdRgYpy7bCgbOU9RRrFmcw1/E33EZXBH7y785SiAjW+t4jQ5vod1t +YPmDG2Ypb8A9QxENMlO3o9H3/DiQWy4awH6dKIOVlObPglgNB6RukuYhE4HOmPia +YSD77rR7toEq8cI3UALQVQ4uxzdR1Jce/+C64BIpPYOWCnEcTqF0nDxYa7UCEeM5 +owp+X0xeLLeukDGgvTO8znIjiURMqfSJXigJy42oCzS7taU0vUZSX/HahN8yID1T +wjzNqYPzoyJcexU7PiIJV3s9H6aAautOHMlgZ0h8isnrGOU9oVp3Omo= +=d8zu +-END PGP SIGNATURE- Added: release/airflow/providers/apache_airflow_providers_sendgrid-1.0.2-py3-none-any.whl.sha512 ===
svn commit: r47019 [1/2] - /release/airflow/providers/
Author: potiuk Date: Mon Apr 12 23:55:13 2021 New Revision: 47019 Log: rename wrongly named files Added: release/airflow/providers/apache_airflow_providers_airbyte-1.0.0-py3-none-any.whl (with props) release/airflow/providers/apache_airflow_providers_airbyte-1.0.0-py3-none-any.whl.asc release/airflow/providers/apache_airflow_providers_airbyte-1.0.0-py3-none-any.whl.sha512 release/airflow/providers/apache_airflow_providers_amazon-1.2.0-py3-none-any.whl (with props) release/airflow/providers/apache_airflow_providers_amazon-1.2.0-py3-none-any.whl.asc release/airflow/providers/apache_airflow_providers_amazon-1.2.0-py3-none-any.whl.sha512 release/airflow/providers/apache_airflow_providers_amazon-1.3.0-py3-none-any.whl (with props) release/airflow/providers/apache_airflow_providers_amazon-1.3.0-py3-none-any.whl.asc release/airflow/providers/apache_airflow_providers_amazon-1.3.0-py3-none-any.whl.sha512 release/airflow/providers/apache_airflow_providers_apache_beam-1.0.1-py3-none-any.whl (with props) release/airflow/providers/apache_airflow_providers_apache_beam-1.0.1-py3-none-any.whl.asc release/airflow/providers/apache_airflow_providers_apache_beam-1.0.1-py3-none-any.whl.sha512 release/airflow/providers/apache_airflow_providers_apache_druid-1.1.0-py3-none-any.whl (with props) release/airflow/providers/apache_airflow_providers_apache_druid-1.1.0-py3-none-any.whl.asc release/airflow/providers/apache_airflow_providers_apache_druid-1.1.0-py3-none-any.whl.sha512 release/airflow/providers/apache_airflow_providers_apache_hive-1.0.2-py3-none-any.whl (with props) release/airflow/providers/apache_airflow_providers_apache_hive-1.0.2-py3-none-any.whl.asc release/airflow/providers/apache_airflow_providers_apache_hive-1.0.2-py3-none-any.whl.sha512 release/airflow/providers/apache_airflow_providers_apache_hive-1.0.3-py3-none-any.whl (with props) release/airflow/providers/apache_airflow_providers_apache_hive-1.0.3-py3-none-any.whl.asc 
release/airflow/providers/apache_airflow_providers_apache_hive-1.0.3-py3-none-any.whl.sha512 release/airflow/providers/apache_airflow_providers_apache_livy-1.1.0-py3-none-any.whl (with props) release/airflow/providers/apache_airflow_providers_apache_livy-1.1.0-py3-none-any.whl.asc release/airflow/providers/apache_airflow_providers_apache_livy-1.1.0-py3-none-any.whl.sha512 release/airflow/providers/apache_airflow_providers_apache_spark-1.0.2-py3-none-any.whl (with props) release/airflow/providers/apache_airflow_providers_apache_spark-1.0.2-py3-none-any.whl.asc release/airflow/providers/apache_airflow_providers_apache_spark-1.0.2-py3-none-any.whl.sha512 release/airflow/providers/apache_airflow_providers_cncf_kubernetes-1.0.2-py3-none-any.whl (with props) release/airflow/providers/apache_airflow_providers_cncf_kubernetes-1.0.2-py3-none-any.whl.asc release/airflow/providers/apache_airflow_providers_cncf_kubernetes-1.0.2-py3-none-any.whl.sha512 release/airflow/providers/apache_airflow_providers_cncf_kubernetes-1.1.0-py3-none-any.whl (with props) release/airflow/providers/apache_airflow_providers_cncf_kubernetes-1.1.0-py3-none-any.whl.asc release/airflow/providers/apache_airflow_providers_cncf_kubernetes-1.1.0-py3-none-any.whl.sha512 release/airflow/providers/apache_airflow_providers_dingding-1.0.2-py3-none-any.whl (with props) release/airflow/providers/apache_airflow_providers_dingding-1.0.2-py3-none-any.whl.asc release/airflow/providers/apache_airflow_providers_dingding-1.0.2-py3-none-any.whl.sha512 release/airflow/providers/apache_airflow_providers_docker-1.0.2-py3-none-any.whl (with props) release/airflow/providers/apache_airflow_providers_docker-1.0.2-py3-none-any.whl.asc release/airflow/providers/apache_airflow_providers_docker-1.0.2-py3-none-any.whl.sha512 release/airflow/providers/apache_airflow_providers_docker-1.1.0-py3-none-any.whl (with props) release/airflow/providers/apache_airflow_providers_docker-1.1.0-py3-none-any.whl.asc 
release/airflow/providers/apache_airflow_providers_docker-1.1.0-py3-none-any.whl.sha512 release/airflow/providers/apache_airflow_providers_exasol-1.1.1-py3-none-any.whl (with props) release/airflow/providers/apache_airflow_providers_exasol-1.1.1-py3-none-any.whl.asc release/airflow/providers/apache_airflow_providers_exasol-1.1.1-py3-none-any.whl.sha512 release/airflow/providers/apache_airflow_providers_facebook-1.1.0-py3-none-any.whl (with props) release/airflow/providers/apache_airflow_providers_facebook-1.1.0-py3-none-any.whl.asc release/airflow/providers/apache_airflow_providers_facebook-1.1.0-py3-none-any.whl.sha512 release/airflow/providers/apache_airflow_providers_google-2.1.0-py3-none-any.whl (with props) release/airflow/providers/apache_airflow_providers_google-2.
[GitHub] [airflow] kaxil commented on pull request #14152: Helm RBAC Best Practices
kaxil commented on pull request #14152: URL: https://github.com/apache/airflow/pull/14152#issuecomment-818314692

> @ashb, @mik-laj, @kaxil This has been rebased with master again. Let me know if there's anything that needs to be addressed.

Thanks, I will take a look in the coming days.

-- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: us...@infra.apache.org
[airflow] branch master updated (e4c0689 -> 1a85ba9)
This is an automated email from the ASF dual-hosted git repository.

kaxilnaik pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/airflow.git.

from e4c0689  Fix Helm GitSync dag volume mount from pod-template-file (#15331)
 add 1a85ba9  Add dynamic connection fields to Azure Connection (#15159)

No new revisions were added by this update.

Summary of changes:
 airflow/providers/microsoft/azure/hooks/adx.py     | 68 +++---
 .../providers/microsoft/azure/hooks/azure_batch.py | 35 ++-
 .../azure/hooks/azure_container_instance.py        | 46 ++-
 .../azure/hooks/azure_container_registry.py        | 24
 .../microsoft/azure/hooks/azure_cosmos.py          | 43 +-
 .../microsoft/azure/hooks/azure_data_factory.py    | 56 +-
 .../microsoft/azure/hooks/azure_data_lake.py       | 44 --
 .../providers/microsoft/azure/hooks/base_azure.py  | 56 --
 airflow/providers/microsoft/azure/hooks/wasb.py    | 61 ---
 airflow/providers/microsoft/azure/provider.yaml    |  1 +
 .../run_install_and_test_provider_packages.sh      |  6 +-
 tests/core/test_providers_manager.py               | 27 +
 12 files changed, 431 insertions(+), 36 deletions(-)
[GitHub] [airflow] kaxil merged pull request #15159: Add dynamic connection fields to Azure Connection
kaxil merged pull request #15159: URL: https://github.com/apache/airflow/pull/15159
[GitHub] [airflow] ephraimbuddy commented on a change in pull request #15336: Fail task when containers inside a pod fails
ephraimbuddy commented on a change in pull request #15336: URL: https://github.com/apache/airflow/pull/15336#discussion_r612022098

## File path: tests/executors/test_kubernetes_executor.py

@@ -507,3 +506,113 @@ def test_process_status_catchall(self):
         self._run()
         self.watcher.watcher_queue.put.assert_not_called()
+
+    def test_container_status_of_terminating_fails_pod(self):
+        self.pod.status.phase = "Pending"
+        self.pod.status.container_statuses = [
+            k8s.V1ContainerStatus(
+                container_id=None,
+                image="apache/airflow:2.0.1-python3.8",
+                image_id="",
+                name="base",
+                ready="false",
+                restart_count=0,
+                state=k8s.V1ContainerState(
+                    terminated=k8s.V1ContainerStateTerminated(
+                        reason="Terminating", exit_code=1

Review comment:

> Have you seen, or can you recreate a `phase=Pending` and `state.terminated` pod? I don't see how it is possible to have both.
>
> I've tried a few scenarios with both init containers and sidecars and every case has resulted in the watcher marking it as failed (though maybe not immediately, because `phase=Running`) - however the TI still gets marked as success.
>
> Said another way, I think there are bugs around here, but I don't think looking at stuff in `phase=Pending` will help?

The `state.terminated` here belongs to a container inside the pod, not the pod itself, and it does happen. This is how you can reproduce it:

1. Check out this PR.
2. Go to values.yaml and set `worker_container_repository: apache/airflow` and `worker_container_tag: 2.0.1-python3.8`.
3. Use breeze to start the cluster: `./breeze kind-cluster start`, then `./breeze kind-cluster deploy`.
4. Monitor the pods in another terminal with k9s: `./breeze kind-cluster k9s`.
5. Check the scheduler logs; the 'event' object is printed at each watcher run. Inspect the object and you'll see the pod phase and container state.
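For illustration, the shape of the status involved can be sketched with plain-Python stand-ins (`SimpleNamespace`) in place of the real kubernetes client models — attribute names mirror `V1PodStatus`/`V1ContainerStatus`, but nothing below is the actual executor code:

```python
from types import SimpleNamespace

# Stand-ins for k8s.V1PodStatus / V1ContainerStatus / V1ContainerState /
# V1ContainerStateTerminated, mimicking a pod whose phase is still Pending
# while a container inside it has already terminated.
terminated = SimpleNamespace(reason="Terminating", exit_code=1, message=None)
state = SimpleNamespace(terminated=terminated, waiting=None)
container = SimpleNamespace(name="base", ready=False, restart_count=0, state=state)
pod_status = SimpleNamespace(phase="Pending", container_statuses=[container])

def has_terminated_container(status) -> bool:
    """True if any container in the pod has already terminated."""
    for cs in status.container_statuses or []:
        if cs.state and cs.state.terminated:
            return True
    return False

print(pod_status.phase, has_terminated_container(pod_status))  # Pending True
```

The `or []` guard matters because `container_statuses` is `None` until the kubelet first reports on the pod.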
[GitHub] [airflow] DerekHeldtWerle commented on pull request #14152: Helm RBAC Best Practices
DerekHeldtWerle commented on pull request #14152: URL: https://github.com/apache/airflow/pull/14152#issuecomment-818310728 @ashb, @mik-laj, @kaxil This has been rebased with master again. Let me know if there's anything that needs to be addressed.
[GitHub] [airflow] ephraimbuddy commented on pull request #15295: Prevent creating flask sessions on REST API requests
ephraimbuddy commented on pull request #15295: URL: https://github.com/apache/airflow/pull/15295#issuecomment-818309241

> > if user decides to configure it, then the user should also configure session interface
>
> Sounds good to me!
>
> Instead of detecting the `/api/` prefix, would it be better to rely on Flask's routing for detection instead? IIRC API views are grouped in Blueprints already, so maybe `request.blueprint` would be useful. I'm not strong on this though; hard-coding URLs is usually not a good idea, but Airflow is already doing that in many places anyway (eh).

I have used `request.blueprint`. Maybe we should refactor the code later and not hard-code the URL. Let me know if this is what you expected.
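As a minimal sketch of the approach under discussion — a custom session interface that skips cookie writes for requests routed to an API blueprint — the blueprint name `"api"` and the route below are illustrative, not the actual Airflow implementation:

```python
from flask import Blueprint, Flask, request, session
from flask.sessions import SecureCookieSessionInterface

class SkipApiSessionInterface(SecureCookieSessionInterface):
    """Don't emit a session cookie for requests handled by the API blueprint."""

    def save_session(self, app, sess, response):
        # request.blueprint is the name of the blueprint the current
        # request was routed to, or None for non-blueprint views.
        if request.blueprint == "api":
            return  # skip writing the Set-Cookie header entirely
        super().save_session(app, sess, response)

api = Blueprint("api", __name__, url_prefix="/api/v1")

@api.route("/ping")
def ping():
    session["touched"] = True  # even if a view modifies the session...
    return "pong"

app = Flask(__name__)
app.secret_key = "test-only"
app.session_interface = SkipApiSessionInterface()
app.register_blueprint(api)

client = app.test_client()
resp = client.get("/api/v1/ping")
print("Set-Cookie" in resp.headers)  # False: no session cookie for API calls
```

Relying on the routing layer this way avoids the brittleness of matching on the literal `/api/` path prefix.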
[GitHub] [airflow] github-actions[bot] commented on pull request #15330: Add a Docker Taskflow decorator
github-actions[bot] commented on pull request #15330: URL: https://github.com/apache/airflow/pull/15330#issuecomment-818305808 [The Workflow run](https://github.com/apache/airflow/actions/runs/742835368) is cancelling this PR. It has some failed jobs matching ^Pylint$,^Static checks,^Build docs$,^Spell check docs$,^Provider packages,^Checks: Helm tests$,^Test OpenAPI*.
[GitHub] [airflow] kaxil commented on a change in pull request #15336: Fail task when containers inside a pod fails
kaxil commented on a change in pull request #15336: URL: https://github.com/apache/airflow/pull/15336#discussion_r612014040

## File path: airflow/executors/kubernetes_executor.py

@@ -218,6 +239,34 @@ def process_status(
             resource_version,
         )

+    def process_container_statuses(
+        self,
+        pod_id: str,
+        statuses: List[Any],
+        namespace: str,
+        annotations: Dict[str, str],
+        resource_version: str,
+    ):
+        """Monitor pod container statuses"""
+        for container_status in statuses:
+            terminated = container_status.state.terminated
+            waiting = container_status.state.waiting
+            if terminated:
+                self.log.debug(
+                    "A container in the pod %s has terminated, reason: %s, message: %s",
+                    pod_id,
+                    terminated.reason,
+                    terminated.message,
+                )
+                self.watcher_queue.put((pod_id, namespace, State.FAILED, annotations, resource_version))

Review comment: So probably:

```python
any(
    (container_status.state.terminated and container_status.state.terminated.exit_code == 1)
    for container_status in statuses
)
```
[GitHub] [airflow] jedcunningham commented on a change in pull request #15336: Fail task when containers inside a pod fails
jedcunningham commented on a change in pull request #15336: URL: https://github.com/apache/airflow/pull/15336#discussion_r611981896

## File path: airflow/executors/kubernetes_executor.py

@@ -218,6 +239,34 @@ def process_status(
             resource_version,
         )

+    def process_container_statuses(
+        self,
+        pod_id: str,
+        statuses: List[Any],
+        namespace: str,
+        annotations: Dict[str, str],
+        resource_version: str,
+    ):
+        """Monitor pod container statuses"""
+        for container_status in statuses:
+            terminated = container_status.state.terminated
+            waiting = container_status.state.waiting
+            if terminated:
+                self.log.debug(
+                    "A container in the pod %s has terminated, reason: %s, message: %s",
+                    pod_id,
+                    terminated.reason,
+                    terminated.message,
+                )
+                self.watcher_queue.put((pod_id, namespace, State.FAILED, annotations, resource_version))

Review comment: I don't think we should be adding more than once to `watcher_queue`, right? It might be better to leave the queue handling to `process_status` and just return a bool; less to cart around then, too.

Maybe something like this:

```
def _has_terminated_containers(self, status: V1PodStatus) -> bool:
```

## File path: airflow/executors/kubernetes_executor.py

@@ -187,25 +188,45 @@ def process_status(
         self,
         pod_id: str,
         namespace: str,
-        status: str,
+        status: Any,

Review comment:
```suggestion
        status: k8s.V1PodStatus,
```

## File path: tests/executors/test_kubernetes_executor.py

@@ -507,3 +506,113 @@ def test_process_status_catchall(self):
         self._run()
         self.watcher.watcher_queue.put.assert_not_called()
+
+    def test_container_status_of_terminating_fails_pod(self):
+        self.pod.status.phase = "Pending"
+        self.pod.status.container_statuses = [
+            k8s.V1ContainerStatus(
+                container_id=None,
+                image="apache/airflow:2.0.1-python3.8",
+                image_id="",
+                name="base",
+                ready="false",
+                restart_count=0,
+                state=k8s.V1ContainerState(
+                    terminated=k8s.V1ContainerStateTerminated(
+                        reason="Terminating", exit_code=1

Review comment: Have you seen, or can you recreate a `phase=Pending` and `state.terminated` pod? I don't see how it is possible to have both.

I've tried a few scenarios with both init containers and sidecars and every case has resulted in the watcher marking it as failed (though maybe not immediately, because `phase=Running`) - however the TI still gets marked as success.

Said another way, I think there are bugs around here, but I don't think looking at stuff in `phase=Pending` will help?
[GitHub] [airflow] kaxil commented on a change in pull request #15336: Fail task when containers inside a pod fails
kaxil commented on a change in pull request #15336: URL: https://github.com/apache/airflow/pull/15336#discussion_r612010640

## File path: airflow/executors/kubernetes_executor.py

@@ -218,6 +239,34 @@ def process_status(
             resource_version,
         )

+    def process_container_statuses(
+        self,
+        pod_id: str,
+        statuses: List[Any],
+        namespace: str,
+        annotations: Dict[str, str],
+        resource_version: str,
+    ):
+        """Monitor pod container statuses"""
+        for container_status in statuses:
+            terminated = container_status.state.terminated
+            waiting = container_status.state.waiting
+            if terminated:
+                self.log.debug(
+                    "A container in the pod %s has terminated, reason: %s, message: %s",
+                    pod_id,
+                    terminated.reason,
+                    terminated.message,
+                )
+                self.watcher_queue.put((pod_id, namespace, State.FAILED, annotations, resource_version))

Review comment: You should probably check `exit_code` too. Code: https://github.com/kubernetes-client/python/blob/v11.0.0/kubernetes/client/models/v1_container_state_terminated.py#L67
[GitHub] [airflow] github-actions[bot] commented on pull request #15338: Change default of `[kubernetes] enable_tcp_keepalive` to `True`
github-actions[bot] commented on pull request #15338: URL: https://github.com/apache/airflow/pull/15338#issuecomment-818298005 The PR most likely needs to run full matrix of tests because it modifies parts of the core of Airflow. However, committers might decide to merge it quickly and take the risk. If they don't merge it quickly - please rebase it to the latest master at your convenience, or amend the last commit of the PR, and push it with --force-with-lease.
[GitHub] [airflow] kaxil commented on a change in pull request #15336: Fail task when containers inside a pod fails
kaxil commented on a change in pull request #15336: URL: https://github.com/apache/airflow/pull/15336#discussion_r612007522

## File path: airflow/executors/kubernetes_executor.py

@@ -218,6 +239,34 @@ def process_status(
             resource_version,
         )

+    def process_container_statuses(
+        self,
+        pod_id: str,
+        statuses: List[Any],
+        namespace: str,
+        annotations: Dict[str, str],
+        resource_version: str,
+    ):
+        """Monitor pod container statuses"""
+        for container_status in statuses:
+            terminated = container_status.state.terminated
+            waiting = container_status.state.waiting
+            if terminated:
+                self.log.debug(
+                    "A container in the pod %s has terminated, reason: %s, message: %s",
+                    pod_id,
+                    terminated.reason,
+                    terminated.message,
+                )
+                self.watcher_queue.put((pod_id, namespace, State.FAILED, annotations, resource_version))

Review comment: Should we short-circuit and return here, since we want to mark a task as failed as soon as any container in the pod fails? We could also probably do

```python
any(container_status.state.terminated for container_status in statuses)
```

However, "a terminated container" != "failed container":

> A container in the Terminated state began execution and then either ran to completion or failed for some reason. When you use kubectl to query a Pod with a container that is Terminated, you see a reason, an exit code, and the start and finish time for that container's period of execution.

From: https://kubernetes.io/docs/concepts/workloads/pods/pod-lifecycle/#container-state-terminated
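The distinction kaxil draws here can be demonstrated with plain-Python stand-ins (not the real kubernetes client models): a container terminated with exit code 0 ran to completion, so testing `state.terminated` alone over-reports failures, while testing the exit code does not. The `any_failed` predicate below generalizes kaxil's `exit_code == 1` check to any non-zero exit code, which is an assumption, not the PR's final code:

```python
from types import SimpleNamespace

def _term(exit_code):
    """Build a stand-in container status whose container has terminated."""
    return SimpleNamespace(
        state=SimpleNamespace(
            terminated=SimpleNamespace(exit_code=exit_code), waiting=None
        )
    )

succeeded = [_term(0)]           # ran to completion
failed = [_term(0), _term(1)]    # one container completed, one failed

def any_terminated(statuses):
    # naive check: counts successful completions as "terminated" too
    return any(cs.state.terminated for cs in statuses)

def any_failed(statuses):
    # refined check: only non-zero exit codes count as failures
    return any(
        cs.state.terminated and cs.state.terminated.exit_code != 0
        for cs in statuses
    )

print(any_terminated(succeeded), any_failed(succeeded))  # True False
print(any_terminated(failed), any_failed(failed))        # True True
```

The `succeeded` case is exactly the false positive the quoted Kubernetes docs warn about.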
[GitHub] [airflow] boring-cyborg[bot] commented on pull request #15331: Fix Helm GitSync dag volume mount
boring-cyborg[bot] commented on pull request #15331: URL: https://github.com/apache/airflow/pull/15331#issuecomment-818293231 Awesome work, congrats on your first merged pull request!
[airflow] branch master updated (8b56629 -> e4c0689)
This is an automated email from the ASF dual-hosted git repository.

kaxilnaik pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/airflow.git.

from 8b56629  Add Configurable LivenessProbe Values to Scheduler (#15333)
 add e4c0689  Fix Helm GitSync dag volume mount from pod-template-file (#15331)

No new revisions were added by this update.

Summary of changes:
 chart/files/pod-template-file.kubernetes-helm-yaml | 1 -
 1 file changed, 1 deletion(-)
[GitHub] [airflow] kaxil merged pull request #15331: Fix Helm GitSync dag volume mount
kaxil merged pull request #15331: URL: https://github.com/apache/airflow/pull/15331
[GitHub] [airflow] kaxil merged pull request #15333: Add Configurable LivenessProbe Values to Scheduler
kaxil merged pull request #15333: URL: https://github.com/apache/airflow/pull/15333
[GitHub] [airflow] kaxil closed issue #15259: Scheduler livenessprobe and k8s v1.20+
kaxil closed issue #15259: URL: https://github.com/apache/airflow/issues/15259
[airflow] branch master updated: Add Configurable LivenessProbe Values to Scheduler (#15333)
This is an automated email from the ASF dual-hosted git repository.

kaxilnaik pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/airflow.git

The following commit(s) were added to refs/heads/master by this push:
     new 8b56629  Add Configurable LivenessProbe Values to Scheduler (#15333)
8b56629 is described below

commit 8b56629ecd44d346e35c146779e2bb5422af1b5d
Author: Ian Stanton
AuthorDate: Mon Apr 12 18:46:59 2021 -0400

    Add Configurable LivenessProbe Values to Scheduler (#15333)

    - Add configurable livenessProbe values to scheduler component in helm chart
    - Increase default values to avoid livenessProbe failure
    - Closes: #15259
---
 .../templates/scheduler/scheduler-deployment.yaml |  8 +++----
 chart/tests/test_scheduler.py                     | 26 ++++++++++++++
 chart/values.schema.json                          | 23 +++++++++++++
 chart/values.yaml                                 |  7 ++++++
 4 files changed, 60 insertions(+), 4 deletions(-)

diff --git a/chart/templates/scheduler/scheduler-deployment.yaml b/chart/templates/scheduler/scheduler-deployment.yaml
index 702e514..7e90442 100644
--- a/chart/templates/scheduler/scheduler-deployment.yaml
+++ b/chart/templates/scheduler/scheduler-deployment.yaml
@@ -116,11 +116,11 @@ spec:
           env:
 {{- include "custom_airflow_environment" . | indent 10 }}
 {{- include "standard_airflow_environment" . | indent 10 }}
-          # If the scheduler stops heartbeating for 5 minutes (10*30s) kill the
-          # scheduler and let Kubernetes restart it
           livenessProbe:
-            failureThreshold: 10
-            periodSeconds: 30
+            initialDelaySeconds: {{ .Values.scheduler.livenessProbe.initialDelaySeconds }}
+            timeoutSeconds: {{ .Values.scheduler.livenessProbe.timeoutSeconds }}
+            failureThreshold: {{ .Values.scheduler.livenessProbe.failureThreshold }}
+            periodSeconds: {{ .Values.scheduler.livenessProbe.periodSeconds }}
             exec:
               command:
               - python
diff --git a/chart/tests/test_scheduler.py b/chart/tests/test_scheduler.py
index 57173fb..c65a81d 100644
--- a/chart/tests/test_scheduler.py
+++ b/chart/tests/test_scheduler.py
@@ -128,3 +128,29 @@ class SchedulerTest(unittest.TestCase):
             "spec.template.spec.tolerations[0].key",
             docs[0],
         )
+
+    def test_livenessprobe_values_are_configurable(self):
+        docs = render_chart(
+            values={
+                "scheduler": {
+                    "livenessProbe": {
+                        "initialDelaySeconds": 111,
+                        "timeoutSeconds": 222,
+                        "failureThreshold": 333,
+                        "periodSeconds": 444,
+                    }
+                },
+            },
+            show_only=["templates/scheduler/scheduler-deployment.yaml"],
+        )
+
+        assert 111 == jmespath.search(
+            "spec.template.spec.containers[0].livenessProbe.initialDelaySeconds", docs[0]
+        )
+        assert 222 == jmespath.search(
+            "spec.template.spec.containers[0].livenessProbe.timeoutSeconds", docs[0]
+        )
+        assert 333 == jmespath.search(
+            "spec.template.spec.containers[0].livenessProbe.failureThreshold", docs[0]
+        )
+        assert 444 == jmespath.search("spec.template.spec.containers[0].livenessProbe.periodSeconds", docs[0])
diff --git a/chart/values.schema.json b/chart/values.schema.json
index c09bef2..b1060fd 100644
--- a/chart/values.schema.json
+++ b/chart/values.schema.json
@@ -682,6 +682,29 @@
             "type": "object",
             "additionalProperties": false,
             "properties": {
+                "livenessProbe": {
+                    "description": "Liveness probe configuration.",
+                    "type": "object",
+                    "additionalProperties": false,
+                    "properties": {
+                        "initialDelaySeconds": {
+                            "description": "Scheduler Liveness probe initial delay.",
+                            "type": "integer"
+                        },
+                        "timeoutSeconds": {
+                            "description": "Scheduler Liveness probe timeout seconds.",
+                            "type": "integer"
+                        },
+                        "failureThreshold": {
+                            "description": "Scheduler Liveness probe failure threshold.",
+                            "type": "integer"
+                        },
+                        "periodSeconds": {
+                            "description": "Webserver Liveness probe period seconds.",
+                            "type": "integer"
+                        }
+                    }
+                },
                 "replicas": {
                     "description": "Airflow 2.0 allows
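The four chart options introduced by this commit map directly onto a user's values override. A minimal sketch (the numbers are illustrative, not recommended defaults):

```yaml
# values.yaml override for the scheduler liveness probe (illustrative values)
scheduler:
  livenessProbe:
    initialDelaySeconds: 10
    timeoutSeconds: 20
    failureThreshold: 5
    periodSeconds: 60
```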
[GitHub] [airflow] github-actions[bot] commented on pull request #15159: Add dynamic connection fields to Azure Connection
github-actions[bot] commented on pull request #15159: URL: https://github.com/apache/airflow/pull/15159#issuecomment-818291028 The PR most likely needs to run full matrix of tests because it modifies parts of the core of Airflow. However, committers might decide to merge it quickly and take the risk. If they don't merge it quickly - please rebase it to the latest master at your convenience, or amend the last commit of the PR, and push it with --force-with-lease.
svn commit: r47018 - /dev/airflow/providers/
Author: potiuk
Date: Mon Apr 12 22:25:15 2021
New Revision: 47018

Log: Release Airflow Providers on Tue 13 Apr 00:24:59 CEST 2021

Removed:
    dev/airflow/providers/apache-airflow-providers-airbyte-1.0.0rc1.tar.gz
    dev/airflow/providers/apache-airflow-providers-airbyte-1.0.0rc1.tar.gz.asc
    dev/airflow/providers/apache-airflow-providers-airbyte-1.0.0rc1.tar.gz.sha512
    dev/airflow/providers/apache-airflow-providers-amazon-1.3.0rc1.tar.gz
    dev/airflow/providers/apache-airflow-providers-amazon-1.3.0rc1.tar.gz.asc
    dev/airflow/providers/apache-airflow-providers-amazon-1.3.0rc1.tar.gz.sha512
    dev/airflow/providers/apache-airflow-providers-apache-hive-1.0.3rc1.tar.gz
    dev/airflow/providers/apache-airflow-providers-apache-hive-1.0.3rc1.tar.gz.asc
    dev/airflow/providers/apache-airflow-providers-apache-hive-1.0.3rc1.tar.gz.sha512
    dev/airflow/providers/apache-airflow-providers-apache-livy-1.1.0rc1.tar.gz
    dev/airflow/providers/apache-airflow-providers-apache-livy-1.1.0rc1.tar.gz.asc
    dev/airflow/providers/apache-airflow-providers-apache-livy-1.1.0rc1.tar.gz.sha512
    dev/airflow/providers/apache-airflow-providers-cncf-kubernetes-1.1.0rc1.tar.gz
    dev/airflow/providers/apache-airflow-providers-cncf-kubernetes-1.1.0rc1.tar.gz.asc
    dev/airflow/providers/apache-airflow-providers-cncf-kubernetes-1.1.0rc1.tar.gz.sha512
    dev/airflow/providers/apache-airflow-providers-docker-1.1.0rc1.tar.gz
    dev/airflow/providers/apache-airflow-providers-docker-1.1.0rc1.tar.gz.asc
    dev/airflow/providers/apache-airflow-providers-docker-1.1.0rc1.tar.gz.sha512
    dev/airflow/providers/apache-airflow-providers-elasticsearch-1.0.3rc1.tar.gz
    dev/airflow/providers/apache-airflow-providers-elasticsearch-1.0.3rc1.tar.gz.asc
    dev/airflow/providers/apache-airflow-providers-elasticsearch-1.0.3rc1.tar.gz.sha512
    dev/airflow/providers/apache-airflow-providers-facebook-1.1.0rc1.tar.gz
    dev/airflow/providers/apache-airflow-providers-facebook-1.1.0rc1.tar.gz.asc
    dev/airflow/providers/apache-airflow-providers-facebook-1.1.0rc1.tar.gz.sha512
    dev/airflow/providers/apache-airflow-providers-google-2.2.0rc1.tar.gz
    dev/airflow/providers/apache-airflow-providers-google-2.2.0rc1.tar.gz.asc
    dev/airflow/providers/apache-airflow-providers-google-2.2.0rc1.tar.gz.sha512
    dev/airflow/providers/apache-airflow-providers-grpc-1.1.0rc1.tar.gz
    dev/airflow/providers/apache-airflow-providers-grpc-1.1.0rc1.tar.gz.asc
    dev/airflow/providers/apache-airflow-providers-grpc-1.1.0rc1.tar.gz.sha512
    dev/airflow/providers/apache-airflow-providers-hashicorp-1.0.2rc1.tar.gz
    dev/airflow/providers/apache-airflow-providers-hashicorp-1.0.2rc1.tar.gz.asc
    dev/airflow/providers/apache-airflow-providers-hashicorp-1.0.2rc1.tar.gz.sha512
    dev/airflow/providers/apache-airflow-providers-microsoft-azure-1.3.0rc1.tar.gz
    dev/airflow/providers/apache-airflow-providers-microsoft-azure-1.3.0rc1.tar.gz.asc
    dev/airflow/providers/apache-airflow-providers-microsoft-azure-1.3.0rc1.tar.gz.sha512
    dev/airflow/providers/apache-airflow-providers-microsoft-winrm-1.1.0rc1.tar.gz
    dev/airflow/providers/apache-airflow-providers-microsoft-winrm-1.1.0rc1.tar.gz.asc
    dev/airflow/providers/apache-airflow-providers-microsoft-winrm-1.1.0rc1.tar.gz.sha512
    dev/airflow/providers/apache-airflow-providers-mysql-1.1.0rc1.tar.gz
    dev/airflow/providers/apache-airflow-providers-mysql-1.1.0rc1.tar.gz.asc
    dev/airflow/providers/apache-airflow-providers-mysql-1.1.0rc1.tar.gz.sha512
    dev/airflow/providers/apache-airflow-providers-opsgenie-1.0.2rc1.tar.gz
    dev/airflow/providers/apache-airflow-providers-opsgenie-1.0.2rc1.tar.gz.asc
    dev/airflow/providers/apache-airflow-providers-opsgenie-1.0.2rc1.tar.gz.sha512
    dev/airflow/providers/apache-airflow-providers-oracle-1.1.0rc1.tar.gz
    dev/airflow/providers/apache-airflow-providers-oracle-1.1.0rc1.tar.gz.asc
    dev/airflow/providers/apache-airflow-providers-oracle-1.1.0rc1.tar.gz.sha512
    dev/airflow/providers/apache-airflow-providers-salesforce-2.0.0rc1.tar.gz
    dev/airflow/providers/apache-airflow-providers-salesforce-2.0.0rc1.tar.gz.asc
    dev/airflow/providers/apache-airflow-providers-salesforce-2.0.0rc1.tar.gz.sha512
    dev/airflow/providers/apache-airflow-providers-salesforce-2.0.0rc2.tar.gz
    dev/airflow/providers/apache-airflow-providers-singularity-1.1.0rc1.tar.gz
    dev/airflow/providers/apache-airflow-providers-singularity-1.1.0rc1.tar.gz.asc
    dev/airflow/providers/apache-airflow-providers-singularity-1.1.0rc1.tar.gz.sha512
    dev/airflow/providers/apache-airflow-providers-snowflake-1.2.0rc1.tar.gz
    dev/airflow/providers/apache-airflow-providers-snowflake-1.2.0rc1.tar.gz.asc
    dev/airflow/providers/apache-airflow-providers-snowflake-1.2.0rc1.tar.gz.sha512
    dev/airflow/providers/apache-airflow-providers-ssh-1.3.0rc1.tar.gz
    dev/airflow/providers/apache-airflow-providers-ssh-1.3.0rc1.tar.gz.asc
svn commit: r47017 - /release/airflow/providers/
Author: potiuk
Date: Mon Apr 12 22:23:49 2021
New Revision: 47017

Log: Release Airflow Providers on Tue 13 Apr 00:23:24 CEST 2021

Added:
    release/airflow/providers/apache-airflow-providers-airbyte-1.0.0.tar.gz
      - copied unchanged from r46980, dev/airflow/providers/apache-airflow-providers-airbyte-1.0.0rc1.tar.gz
    release/airflow/providers/apache-airflow-providers-airbyte-1.0.0.tar.gz.asc
      - copied unchanged from r46980, dev/airflow/providers/apache-airflow-providers-airbyte-1.0.0rc1.tar.gz.asc
    release/airflow/providers/apache-airflow-providers-airbyte-1.0.0.tar.gz.sha512
      - copied unchanged from r46980, dev/airflow/providers/apache-airflow-providers-airbyte-1.0.0rc1.tar.gz.sha512
    release/airflow/providers/apache-airflow-providers-amazon-1.3.0.tar.gz
      - copied unchanged from r46980, dev/airflow/providers/apache-airflow-providers-amazon-1.3.0rc1.tar.gz
    release/airflow/providers/apache-airflow-providers-amazon-1.3.0.tar.gz.asc
      - copied unchanged from r46980, dev/airflow/providers/apache-airflow-providers-amazon-1.3.0rc1.tar.gz.asc
    release/airflow/providers/apache-airflow-providers-amazon-1.3.0.tar.gz.sha512
      - copied unchanged from r46980, dev/airflow/providers/apache-airflow-providers-amazon-1.3.0rc1.tar.gz.sha512
    release/airflow/providers/apache-airflow-providers-apache-hive-1.0.3.tar.gz
      - copied unchanged from r46980, dev/airflow/providers/apache-airflow-providers-apache-hive-1.0.3rc1.tar.gz
    release/airflow/providers/apache-airflow-providers-apache-hive-1.0.3.tar.gz.asc
      - copied unchanged from r46980, dev/airflow/providers/apache-airflow-providers-apache-hive-1.0.3rc1.tar.gz.asc
    release/airflow/providers/apache-airflow-providers-apache-hive-1.0.3.tar.gz.sha512
      - copied unchanged from r46980, dev/airflow/providers/apache-airflow-providers-apache-hive-1.0.3rc1.tar.gz.sha512
    release/airflow/providers/apache-airflow-providers-apache-livy-1.1.0.tar.gz
      - copied unchanged from r46980, dev/airflow/providers/apache-airflow-providers-apache-livy-1.1.0rc1.tar.gz
    release/airflow/providers/apache-airflow-providers-apache-livy-1.1.0.tar.gz.asc
      - copied unchanged from r46980, dev/airflow/providers/apache-airflow-providers-apache-livy-1.1.0rc1.tar.gz.asc
    release/airflow/providers/apache-airflow-providers-apache-livy-1.1.0.tar.gz.sha512
      - copied unchanged from r46980, dev/airflow/providers/apache-airflow-providers-apache-livy-1.1.0rc1.tar.gz.sha512
    release/airflow/providers/apache-airflow-providers-cncf-kubernetes-1.1.0.tar.gz
      - copied unchanged from r46980, dev/airflow/providers/apache-airflow-providers-cncf-kubernetes-1.1.0rc1.tar.gz
    release/airflow/providers/apache-airflow-providers-cncf-kubernetes-1.1.0.tar.gz.asc
      - copied unchanged from r46980, dev/airflow/providers/apache-airflow-providers-cncf-kubernetes-1.1.0rc1.tar.gz.asc
    release/airflow/providers/apache-airflow-providers-cncf-kubernetes-1.1.0.tar.gz.sha512
      - copied unchanged from r46980, dev/airflow/providers/apache-airflow-providers-cncf-kubernetes-1.1.0rc1.tar.gz.sha512
    release/airflow/providers/apache-airflow-providers-docker-1.1.0.tar.gz
      - copied unchanged from r46980, dev/airflow/providers/apache-airflow-providers-docker-1.1.0rc1.tar.gz
    release/airflow/providers/apache-airflow-providers-docker-1.1.0.tar.gz.asc
      - copied unchanged from r46980, dev/airflow/providers/apache-airflow-providers-docker-1.1.0rc1.tar.gz.asc
    release/airflow/providers/apache-airflow-providers-docker-1.1.0.tar.gz.sha512
      - copied unchanged from r46980, dev/airflow/providers/apache-airflow-providers-docker-1.1.0rc1.tar.gz.sha512
    release/airflow/providers/apache-airflow-providers-facebook-1.1.0.tar.gz
      - copied unchanged from r46980, dev/airflow/providers/apache-airflow-providers-facebook-1.1.0rc1.tar.gz
    release/airflow/providers/apache-airflow-providers-facebook-1.1.0.tar.gz.asc
      - copied unchanged from r46980, dev/airflow/providers/apache-airflow-providers-facebook-1.1.0rc1.tar.gz.asc
    release/airflow/providers/apache-airflow-providers-facebook-1.1.0.tar.gz.sha512
      - copied unchanged from r46980, dev/airflow/providers/apache-airflow-providers-facebook-1.1.0rc1.tar.gz.sha512
    release/airflow/providers/apache-airflow-providers-google-2.2.0.tar.gz
      - copied unchanged from r46980, dev/airflow/providers/apache-airflow-providers-google-2.2.0rc1.tar.gz
    release/airflow/providers/apache-airflow-providers-google-2.2.0.tar.gz.asc
      - copied unchanged from r46980, dev/airflow/providers/apache-airflow-providers-google-2.2.0rc1.tar.gz.asc
    release/airflow/providers/apache-airflow-providers-google-2.2.0.tar.gz.sha512
      - copied unchanged from r46980, dev/airflow/providers/apache-airflow-providers-google-2.2.0rc1.tar.gz.sha512
    release/airflow/providers/apache-airflow-providers-grpc-1.1.0.tar.gz
      - copied unchanged from r46980, dev/airflow/
[GitHub] [airflow] potiuk commented on issue #15241: Testing Providers prepared 2021.04.07
potiuk commented on issue #15241: URL: https://github.com/apache/airflow/issues/15241#issuecomment-818261951 Thanks everyone. Preparing official release now. -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: us...@infra.apache.org
[GitHub] [airflow] potiuk closed issue #15241: Testing Providers prepared 2021.04.07
potiuk closed issue #15241: URL: https://github.com/apache/airflow/issues/15241
[GitHub] [airflow] potiuk commented on issue #15241: Testing Providers prepared 2021.04.07
potiuk commented on issue #15241: URL: https://github.com/apache/airflow/issues/15241#issuecomment-818261717

Just adding a few more comments:
- I looked at microsoft.azure and the changes are fairly trivial - I tested that the azure_data_factory hook is there.
- The salesforce provider includes the tableau separation - it had been tested before, but we had not released the salesforce provider back then (copy&paste mistake).
[GitHub] [airflow] jedcunningham opened a new pull request #15338: Change default of `[kubernetes] enable_tcp_keepalive` to `True`
jedcunningham opened a new pull request #15338: URL: https://github.com/apache/airflow/pull/15338

We've seen instances of connection resets happening, particularly in Azure, that are remedied by enabling tcp_keepalive. Enabling it by default should be safe and sane regardless of where we are running. Related: https://github.com/apache/airflow/issues/14261#issuecomment-784619202
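For context on what this flag turns on, here is a rough stdlib sketch (not the kubernetes client's actual code; the interval values are arbitrary illustrations) of enabling TCP keepalive on a socket:

```python
import socket

# Once SO_KEEPALIVE is set, the kernel sends periodic probes on an idle
# connection, so half-dead connections (e.g. silently reset by a cloud
# load balancer) get detected instead of hanging forever.
sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
sock.setsockopt(socket.SOL_SOCKET, socket.SO_KEEPALIVE, 1)

# Linux-specific tuning knobs; guarded because they don't exist on every platform.
if hasattr(socket, "TCP_KEEPIDLE"):
    sock.setsockopt(socket.IPPROTO_TCP, socket.TCP_KEEPIDLE, 120)  # idle seconds before first probe
    sock.setsockopt(socket.IPPROTO_TCP, socket.TCP_KEEPINTVL, 30)  # seconds between probes
    sock.setsockopt(socket.IPPROTO_TCP, socket.TCP_KEEPCNT, 6)     # failed probes before dropping

keepalive_enabled = sock.getsockopt(socket.SOL_SOCKET, socket.SO_KEEPALIVE)
sock.close()
```

This is why enabling it by default is low-risk: it only changes how dead idle connections are detected, not how healthy ones behave.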
[GitHub] [airflow] BenoitHanotte closed pull request #15337: Calendar view MVP
BenoitHanotte closed pull request #15337: URL: https://github.com/apache/airflow/pull/15337
[GitHub] [airflow] BenoitHanotte opened a new pull request #15337: Calendar view MVP
BenoitHanotte opened a new pull request #15337: URL: https://github.com/apache/airflow/pull/15337 --- **^ Add meaningful description above** Read the **[Pull Request Guidelines](https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst#pull-request-guidelines)** for more information. In case of fundamental code change, Airflow Improvement Proposal ([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals)) is needed. In case of a new dependency, check compliance with the [ASF 3rd Party License Policy](https://www.apache.org/legal/resolved.html#category-x). In case of backwards incompatible changes please leave a note in [UPDATING.md](https://github.com/apache/airflow/blob/master/UPDATING.md).
[GitHub] [airflow] boring-cyborg[bot] commented on pull request #15337: Calendar view MVP
boring-cyborg[bot] commented on pull request #15337: URL: https://github.com/apache/airflow/pull/15337#issuecomment-818240010

Congratulations on your first Pull Request and welcome to the Apache Airflow community! If you have any issues or are unsure about anything, please check our Contribution Guide (https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst). Here are some useful points:
- Pay attention to the quality of your code (flake8, pylint and type annotations). Our [pre-commits](https://github.com/apache/airflow/blob/master/STATIC_CODE_CHECKS.rst#prerequisites-for-pre-commit-hooks) will help you with that.
- In case of a new feature, add useful documentation (in docstrings or in the `docs/` directory). Adding a new operator? Check this short [guide](https://github.com/apache/airflow/blob/master/docs/apache-airflow/howto/custom-operator.rst). Consider adding an example DAG that shows how users should use it.
- Consider using the [Breeze environment](https://github.com/apache/airflow/blob/master/BREEZE.rst) for testing locally; it's a heavy docker image, but it ships with a working Airflow and a lot of integrations.
- Be patient and persistent. It might take some time to get a review or the final approval from Committers.
- Please follow the [ASF Code of Conduct](https://www.apache.org/foundation/policies/conduct) for all communication, including (but not limited to) comments on Pull Requests, the Mailing list and Slack.
- Be sure to read the [Airflow Coding style](https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst#coding-style-and-best-practices).

Apache Airflow is a community-driven project and together we are making it better 🚀. In case of doubts, contact the developers at: Mailing List: d...@airflow.apache.org Slack: https://s.apache.org/airflow-slack
[GitHub] [airflow] jhtimmins merged pull request #14946: Standardize default fab perms
jhtimmins merged pull request #14946: URL: https://github.com/apache/airflow/pull/14946
[airflow] branch master updated (cb9b9b3 -> 18c5b8a)
This is an automated email from the ASF dual-hosted git repository. jhtimmins pushed a change to branch master in repository https://gitbox.apache.org/repos/asf/airflow.git.

    from cb9b9b3  Fix `sendgrid` -> `google`. (#15334)
     add 18c5b8a  Standardize default fab perms (#14946)

No new revisions were added by this update.

Summary of changes:
 .github/actions/cancel-workflow-runs               |   2 +-
 .../endpoints/role_and_permission_endpoint.py      |  12 +-
 airflow/api_connexion/endpoints/user_endpoint.py   |   4 +-
 ...ad25_resource_based_permissions_for_default_.py | 172 ++
 airflow/security/permissions.py                    |  28 +--
 airflow/www/security.py                            |  54 +++--
 airflow/www/views.py                               | 217 +
 docs/apache-airflow/security/access-control.rst    |   2 +
 .../endpoints/test_role_and_permission_endpoint.py |  12 +-
 .../api_connexion/endpoints/test_user_endpoint.py  |   3 +-
 tests/www/test_security.py                         |  21 +-
 tests/www/test_views.py                            | 261 -
 12 files changed, 723 insertions(+), 65 deletions(-)
 create mode 100644 airflow/migrations/versions/a13f7613ad25_resource_based_permissions_for_default_.py
[GitHub] [airflow] SevakAvet commented on issue #15001: S3MultipleKeysSensor operator
SevakAvet commented on issue #15001: URL: https://github.com/apache/airflow/issues/15001#issuecomment-818188105

I'd lean more towards modifying the existing operator and making it accept a list of prefixes. Also, the expected behaviour is that poke() will return True only if all keys exist, right? Would it be useful to have a sort of any(..) (when at least one key exists) instead of all(..)? Also, this can be implemented as:

```
return all(hook.check_for_key(key, self.bucket_name) for key in key_list)
```
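A sketch of the all()/any() poke semantics being discussed; the function names and the fake hook here are illustrative stand-ins, not the actual Airflow S3Hook/sensor API, so the logic can be exercised without AWS:

```python
def poke_all(hook, bucket_name, key_list):
    """Return True only when every key exists (the all() semantics)."""
    return all(hook.check_for_key(key, bucket_name) for key in key_list)

def poke_any(hook, bucket_name, key_list):
    """Return True when at least one key exists (the any() variant)."""
    return any(hook.check_for_key(key, bucket_name) for key in key_list)

class _FakeS3Hook:
    """Stand-in for S3Hook.check_for_key, backed by an in-memory set."""
    def __init__(self, existing):
        self.existing = set(existing)
    def check_for_key(self, key, bucket_name):
        return key in self.existing

hook = _FakeS3Hook({"a.csv", "b.csv"})
print(poke_all(hook, "my-bucket", ["a.csv", "b.csv"]))  # True: both keys exist
print(poke_all(hook, "my-bucket", ["a.csv", "c.csv"]))  # False: c.csv is missing
print(poke_any(hook, "my-bucket", ["a.csv", "c.csv"]))  # True: a.csv exists
```

One design note: both variants short-circuit, so poke_all stops at the first missing key and poke_any at the first found key, which keeps the number of S3 calls down.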
[GitHub] [airflow] SevakAvetGD removed a comment on issue #15001: S3MultipleKeysSensor operator
SevakAvetGD removed a comment on issue #15001: URL: https://github.com/apache/airflow/issues/15001#issuecomment-818151138
[GitHub] [airflow] ephraimbuddy opened a new pull request #15336: Fail task when containers inside a pod fails
ephraimbuddy opened a new pull request #15336: URL: https://github.com/apache/airflow/pull/15336

Currently, when a container inside a pod terminates, Airflow doesn't know about it and the task remains queued. The Kubernetes job watcher does not watch the status of containers inside pods; it only watches the pod and reports the pod's status to Airflow. Per the Kubernetes docs, the Pending phase of a pod includes the time a Pod spends waiting to be scheduled as well as the time spent downloading container images over the network. A network failure can crash a container while the pod remains Pending for some time before it's deleted. This PR fixes this by having the Kubernetes job watcher also watch container statuses. This should close https://github.com/apache/airflow/issues/13542 and https://github.com/apache/airflow/issues/15218 hopefully. And I prefer it to timing out. cc: @jedcunningham
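A simplified sketch of the idea, using plain dicts in place of the kubernetes client's V1Pod/V1ContainerStatus objects; the waiting-reason names follow kubelet conventions (e.g. ErrImagePull), and the helper itself is illustrative rather than the PR's actual code:

```python
# Waiting reasons from which a Pending pod will not recover on its own.
FATAL_WAITING_REASONS = {"ErrImagePull", "ImagePullBackOff", "InvalidImageName"}

def should_fail_pending_pod(pod):
    """Return True if a Pending pod has a container stuck in a fatal waiting state."""
    if pod["status"]["phase"] != "Pending":
        return False
    for status in pod["status"].get("container_statuses") or []:
        waiting = (status.get("state") or {}).get("waiting")
        if waiting and waiting.get("reason") in FATAL_WAITING_REASONS:
            return True
    return False

# A pod stuck pulling a non-existent image tag, as in the issues linked above.
stuck_pod = {
    "status": {
        "phase": "Pending",
        "container_statuses": [
            {"name": "base", "state": {"waiting": {"reason": "ErrImagePull"}}}
        ],
    }
}
print(should_fail_pending_pod(stuck_pod))  # True
```

Without a check like this, the pod-level phase alone stays "Pending" and the scheduler keeps the task queued indefinitely.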
[GitHub] [airflow] dsynkov commented on issue #12985: PythonVirtualenvOperator with provide_context=True does not have 'ti' keyword
dsynkov commented on issue #12985: URL: https://github.com/apache/airflow/issues/12985#issuecomment-818158798

Sharing a few notes as I've run into this same issue:
* The [docs](https://airflow.apache.org/docs/apache-airflow/stable/howto/operator/python.html#pythonvirtualenvoperator) for `PythonVirtualenvOperator` address this:
  > You can use the op_args and op_kwargs arguments the same way you use it in the PythonOperator. Unfortunately we currently do not support to serialize var and ti / task_instance **due to incompatibilities with the underlying library**.
* I'm not sure what the exact "incompatibilities" are, but searching for open issues mentioning `PythonVirtualenvOperator` there is https://github.com/apache/airflow/issues/7870, which addresses using `dill` vs. `cloudpickle` (but doesn't mention how/if this would enable serialization of `ti`, etc.). Does anyone have any insight into whether https://github.com/apache/airflow/issues/7870 is a pre-req to this current issue?
* Also found this Jira story (https://issues.apache.org/jira/browse/AIRFLOW-2738) that's still open and seems to address the same problem.
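A small stdlib illustration of the underlying constraint (this mimics, rather than uses, Airflow objects): op_args/op_kwargs have to be serialized to cross into the separate virtualenv process, and objects that carry live runtime state, as a TaskInstance does, cannot be serialized with standard pickle:

```python
import pickle
import threading

class FakeTaskInstance:
    """Hypothetical stand-in for ti: holds a live, unpicklable resource."""
    def __init__(self):
        self._lock = threading.Lock()  # like DB sessions/locks inside a real TaskInstance

# Plain values round-trip through pickle without trouble...
plain_kwargs = {"run_date": "2021-04-13", "retries": 2}
roundtripped = pickle.loads(pickle.dumps(plain_kwargs))

# ...but objects with live state do not, which is the kind of
# incompatibility the docs allude to.
try:
    pickle.dumps({"ti": FakeTaskInstance()})
except TypeError as exc:
    print("cannot serialize:", exc)
```

This is why the practical workaround is to pass the specific values you need (execution date, XCom results, etc.) through op_kwargs rather than the task instance itself.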
[GitHub] [airflow] SevakAvetGD commented on issue #15001: S3MultipleKeysSensor operator
SevakAvetGD commented on issue #15001: URL: https://github.com/apache/airflow/issues/15001#issuecomment-818151138
[GitHub] [airflow] SevakAvetGD removed a comment on issue #15001: S3MultipleKeysSensor operator
SevakAvetGD removed a comment on issue #15001: URL: https://github.com/apache/airflow/issues/15001#issuecomment-818150046
[GitHub] [airflow] SevakAvetGD commented on issue #15001: S3MultipleKeysSensor operator
SevakAvetGD commented on issue #15001: URL: https://github.com/apache/airflow/issues/15001#issuecomment-818150046
[GitHub] [airflow] kaxil merged pull request #15334: Fix `sendgrid` -> `google`.
kaxil merged pull request #15334: URL: https://github.com/apache/airflow/pull/15334