kadu-vido opened a new issue, #27504: URL: https://github.com/apache/airflow/issues/27504
### Apache Airflow Provider(s)

cncf-kubernetes

### Versions of Apache Airflow Providers

```
apache-airflow-providers-amazon==6.0.0
apache-airflow-providers-apache-hive==4.0.1
apache-airflow-providers-cncf-kubernetes==4.4.0
apache-airflow-providers-common-sql==1.2.0
apache-airflow-providers-databricks==3.3.0
apache-airflow-providers-ftp==3.1.0
apache-airflow-providers-http==4.0.0
apache-airflow-providers-imap==3.0.0
apache-airflow-providers-jdbc==3.2.1
apache-airflow-providers-sftp==4.1.0
apache-airflow-providers-slack==5.1.0
apache-airflow-providers-sqlite==3.2.1
apache-airflow-providers-ssh==3.2.0
```

### Apache Airflow version

apache-airflow==2.4.1

### Operating System

Debian GNU/Linux 11 (bullseye)

### Deployment

Official Apache Airflow Helm Chart

### Deployment details

- Helm v3.9.4
- Kubernetes Server v1.22.15-eks-fb459a0 / Client v1.25.0
- Docker version 20.10.17
- Docker Compose version v2.10.2

### What happened

**Context**

I am using the KubernetesPodOperator + `airflow.kubernetes.secret.Secret` to pass secrets into the container as environment variables (as per [Google's suggested approach for Composer](https://cloud.google.com/composer/docs/how-to/using/using-kubernetes-pod-operator#kubernetespodoperator_configuration), even though I'm not using Composer).

**What happened**

When a pod fails to launch (`airflow.providers.cncf.kubernetes.utils.pod_manager.PodLaunchFailedException`), a dictionary describing the pod is printed.
Inside it (in `.spec.containers[].env[]`), the secret is exposed in plaintext:

```
'env': [{'name': 'AIRFLOW_MACROS_SECRET_VARIABLE',
         'value': '***',                                  <------------
         'value_from': None},                                         |
        {'name': 'AIRFLOW_MACROS_MY_ENV_SECRET',                      |
         'value': None,                                               |
         'value_from': {'config_map_key_ref': None,                compare
                        'field_ref': None,                            |
                        'resource_field_ref': None,                   |
                        'secret_key_ref': {'key': 'my_key',           |
                                           'name': 'my-secret', <-----
                                           'optional': None}}}],
```

### What you think should happen instead

Secrets passed with this method should be masked, just like Airflow Variables containing secrets.

### How to reproduce

This requires an Airflow instance with the KubernetesPodOperator and a Kubernetes cluster.

1. Configure a secret in the cluster namespace running Airflow, e.g. `kubectl create secret generic my-secret --from-literal=my_key=my_value`
2. (Optional, for comparison) configure a variable in the Airflow UI, named e.g. `my_secret_var` (with any value)
3. Create a dagfile similar to [this one](https://gist.github.com/kadu-vido/123f72b574876a737f95265a451d59e5) (add an image; anything works, as nothing will actually run)
4. Make a change that causes a pod launch error (e.g. change the namespace to one where Airflow cannot create pods)
5. In the logs, look for the environment variables created

### Anything else

_No response_

### Are you willing to submit PR?

- [ ] Yes I am willing to submit a PR!

### Code of Conduct

- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)

--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]
