dada-engineer opened a new issue, #40095:
URL: https://github.com/apache/airflow/issues/40095

   ### Official Helm Chart version
   
   1.13.1 (latest released)
   
   ### Apache Airflow version
   
   2.9.1
   
   ### Kubernetes Version
   
   1.28
   
   ### Helm Chart configuration
   
   Hi everyone,
   
   Some of my Airflow config parameters are provided via Kubernetes Secrets / ConfigMaps (e.g. by another IT department during provisioning), for example the remote base log folder (an S3 URI created via OpenTofu and provided as a Kubernetes Secret).
   
   Currently I am not able to bring those into the cleanup CronJob, because it renders neither the global `extraEnv` nor `extraEnvFrom` values, nor does it allow custom ones that can reference them ([you can only specify a custom env as a `name`/`value` pair, not `valueFrom`](https://github.com/apache/airflow/blob/45bf7b972121828483f930acac60aa0751e2716f/chart/templates/_helpers.yaml#L1013), or use the standard ones built [here](https://github.com/apache/airflow/blob/1a613030e669ec8e8b0be893038da3a3ca1de9c9/chart/templates/_helpers.yaml#L54)).
   
   As the docs state, I can work around this by creating a custom Helm chart (we actually do anyway) and adding the CronJob I need (I will also do this for now), but I think this could be native Airflow Helm chart functionality.
   
   [This is the CronJob](https://github.com/apache/airflow/blob/helm-chart/1.13.1/chart/templates/cleanup/cleanup-cronjob.yaml) I am talking about.
   
   ### Docker Image customizations
   
   Yes, but I think that is unrelated to this issue.
   
   ### What happened
   
   I tried to add the following env config (before actually checking the code):
   
   ```
   cleanup:
     enabled: true
     env:
       - name: AIRFLOW__LOGGING__REMOTE_BASE_LOG_FOLDER
         valueFrom:
           secretKeyRef:
             name: secret-name
             key: AIRFLOW__LOGGING__REMOTE_BASE_LOG_FOLDER
       - name: AIRFLOW__DATABASE__SQL_ALCHEMY_CONN
         valueFrom:
           secretKeyRef:
             name: secret-name
             key: AIRFLOW__DATABASE__SQL_ALCHEMY_CONN
   ```
   
   Since this format is not allowed by the chart's values schema, it failed with the following error:
   
   ```
   helm.go:84: [debug] values don't meet the specifications of the schema(s) in 
the following chart(s):
   airflow:
   - cleanup.env.0: value is required
   - cleanup.env.0: Additional property valueFrom is not allowed
   - cleanup.env.1: value is required
   - cleanup.env.1: Additional property valueFrom is not allowed
   ```
   
   This is clear once you look at how `custom_container_env` is built, so no surprise here.
   
   ### What you think should happen instead
   
   _No response_
   
   ### How to reproduce
   
   Try to add a custom env to the cleanup job that references an already existing Secret or ConfigMap via `valueFrom`.
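
   Concretely, with the official chart repo added (`helm repo add apache-airflow https://airflow.apache.org`), rendering with a values file like the one below fails schema validation; `secret-name` is a placeholder:
   
   ```
   # values-cleanup.yaml
   cleanup:
     enabled: true
     env:
       - name: AIRFLOW__LOGGING__REMOTE_BASE_LOG_FOLDER
         valueFrom:
           secretKeyRef:
             name: secret-name
             key: AIRFLOW__LOGGING__REMOTE_BASE_LOG_FOLDER
   
   # Then:
   #   helm template airflow apache-airflow/airflow --version 1.13.1 -f values-cleanup.yaml
   # fails with "Additional property valueFrom is not allowed"
   ```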
   
   ### Anything else
   
   _No response_
   
   ### Are you willing to submit PR?
   
   - [X] Yes I am willing to submit a PR!
   
   ### Code of Conduct
   
   - [X] I agree to follow this project's [Code of 
Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
   

