dacamposol opened a new issue, #47963:
URL: https://github.com/apache/airflow/issues/47963

   ### Official Helm Chart version
   
   1.15.0 (latest released)
   
   ### Apache Airflow version
   
   2.9.3
   
   ### Kubernetes Version
   
   1.31.4
   
   ### Helm Chart configuration
   
   ```yaml
   createUserJob:
     applyCustomEnv: false
     useHelmHooks: false
   dags:
     gitSync:
       branch: main
       credentialsSecret: airflow-git-credentials
       enabled: true
       repo: https://github.com/dacamposol/kg-infra.git
       subPath: dags
   extraEnv: |
     - name: AIRFLOW__API__AUTH_BACKENDS
       value: 'airflow.api.auth.backend.basic_auth'
   migrateDatabaseJob:
     applyCustomEnv: false
     jobAnnotations:
       'argocd.argoproj.io/hook': Sync
     useHelmHooks: false
   useStandardNaming: true
   ```
   
   ### Docker Image customizations
   
   _No response_
   
   ### What happened
   
   > [!IMPORTANT] 
   > This is an instance managed via ArgoCD.
   
   1. I created a connection of type HTTP directly from the Web Server UI.
   2. I modified a field in the Argo Application, adding the `.extraEnv` shown above to allow REST API calls.
   3. I accessed the Web Server UI again.
   4. I cannot open the **Connections** section (all other sections work fine).
   
   Error in the `webserver` Pod:
   
   ```python
     File "/home/airflow/.local/lib/python3.12/site-packages/sqlalchemy/orm/mapper.py", line 3702, in _event_on_load
       instrumenting_mapper._reconstructor(state.obj())
     File "/home/airflow/.local/lib/python3.12/site-packages/airflow/models/connection.py", line 213, in on_db_load
       if self.password:
          ^^^^^^^^^^^^^
     File "/home/airflow/.local/lib/python3.12/site-packages/sqlalchemy/orm/attributes.py", line 606, in __get__
       retval = self.descriptor.__get__(instance, owner)
                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
     File "/home/airflow/.local/lib/python3.12/site-packages/airflow/models/connection.py", line 340, in get_password
       return fernet.decrypt(bytes(self._password, "utf-8")).decode()
              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
     File "/home/airflow/.local/lib/python3.12/site-packages/cryptography/fernet.py", line 211, in decrypt
       raise InvalidToken
   cryptography.fernet.InvalidToken
   ```
   
   When I go to my Argo CD deployment, I can indeed see that, for the Airflow application, the `airflow-fernet-key` Secret got recreated at the same time as the sync happened.

   I am assuming the Secret got recreated and the rotation process somehow failed, so the **Connections** are still encrypted with the old key and can no longer be decrypted with the new one.
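
   A possible workaround (an assumption on my side, based on the chart's documented `fernetKeySecretName` value) would be to pin the fernet key to a Secret created once outside the ArgoCD-synced chart, so that syncs can never regenerate it. The Secret name below is hypothetical:

   ```yaml
   # Hypothetical values.yaml fragment: reference a pre-created Secret that you
   # manage yourself (it must contain a `fernet-key` entry), so the chart does
   # not generate and re-generate its own airflow-fernet-key Secret on sync.
   fernetKeySecretName: my-airflow-fernet-key
   ```

   If the old key were still recoverable, Airflow's documented rotation path (setting `AIRFLOW__CORE__FERNET_KEY` to `new_key,old_key` and running `airflow rotate-fernet-key`) could then re-encrypt the existing connections.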
   
   ### What you think should happen instead
   
   _No response_
   
   ### How to reproduce
   
   1. Have an Airflow cluster managed by Argo CD.
   2. Create a connection in that cluster.
   3. Modify settings in the Application, and wait for the Sync.
   4. Attempt to access the Connections in the aforementioned cluster again.
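
   The failure mode can also be reproduced in isolation with the `cryptography` package, outside Airflow entirely: a token encrypted under one Fernet key raises `InvalidToken` when decrypted under a different one (a minimal sketch of what happens when the Secret is regenerated):

   ```python
   from cryptography.fernet import Fernet, InvalidToken

   # Encrypt with the "old" key (what the original airflow-fernet-key held).
   old_key = Fernet.generate_key()
   token = Fernet(old_key).encrypt(b"my-connection-password")

   # Decrypt with a freshly generated "new" key, as after the Secret recreation.
   new_key = Fernet.generate_key()
   try:
       Fernet(new_key).decrypt(token)
   except InvalidToken:
       print("InvalidToken")  # the same exception seen in the webserver traceback
   ```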
   
   ### Anything else
   
   _No response_
   
   ### Are you willing to submit PR?
   
   - [ ] Yes I am willing to submit a PR!
   
   ### Code of Conduct
   
   - [x] I agree to follow this project's [Code of 
Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]
