IgV52 opened a new issue, #50421:
URL: https://github.com/apache/airflow/issues/50421

   ### Apache Airflow version
   
   3.0.0
   
   ### If "Other Airflow 2 version" selected, which one?
   
   _No response_
   
   ### What happened?
   
   `MetastoreBackend.get_connection` and `MetastoreBackend.get_variable` can raise
   `RuntimeError` even though the requested object is fetched successfully.
   
   ### What you think should happen instead?
   
   `MetastoreBackend` only issues a SELECT to fetch the object, but because it is
   wrapped in `@provide_session`, the session is committed on exit. When that commit
   happens inside the scheduler's commit guard, it raises `RuntimeError` with the
   message `UNEXPECTED COMMIT - THIS WILL BREAK HA LOCKS!`.
   Should I give the error a more specific name and skip it in certain places?
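   To illustrate the interaction, here is a minimal, self-contained sketch. This is
   not actual Airflow code: `Session`, `provide_session`, and `prohibit_commit`
   below are simplified stand-ins for the real SQLAlchemy session, the
   `@provide_session` decorator, and the scheduler's HA-lock commit guard.

   ```python
   import contextlib
   import functools

   class Session:
       """Stand-in for a SQLAlchemy session whose commit can be guarded."""
       def __init__(self):
           self.commit_allowed = True

       def commit(self):
           if not self.commit_allowed:
               raise RuntimeError("UNEXPECTED COMMIT - THIS WILL BREAK HA LOCKS!")

   SESSION = Session()

   def provide_session(func):
       """Like @provide_session: supplies a session and commits on exit."""
       @functools.wraps(func)
       def wrapper(*args, **kwargs):
           result = func(*args, session=SESSION, **kwargs)
           SESSION.commit()  # commits even though the call was read-only
           return result
       return wrapper

   @contextlib.contextmanager
   def prohibit_commit(session):
       """Like the scheduler's HA-lock guard: any commit inside raises."""
       session.commit_allowed = False
       try:
           yield
       finally:
           session.commit_allowed = True

   @provide_session
   def get_variable(key, session=None):
       # Stand-in for a pure SELECT lookup.
       return {"foo": "bar"}.get(key)

   # Outside the guard the lookup works fine.
   print(get_variable("foo"))

   # Inside the guard, the trailing commit raises even though the
   # object itself was fetched successfully.
   try:
       with prohibit_commit(SESSION):
           get_variable("foo")
   except RuntimeError as e:
       print(e)
   ```

   A more specific exception type (instead of a bare `RuntimeError`) would let
   read-only callers like `MetastoreBackend` catch or skip it deliberately.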
   
   ### How to reproduce
   
   Run Docker Compose with the Airflow services plus MinIO (for S3 remote logging):
   
   AIRFLOW__LOGGING__LOGGING_LEVEL=INFO
   AIRFLOW__LOGGING__REMOTE_LOGGING=true
   AIRFLOW__LOGGING__REMOTE_LOG_CONN_ID=minio
   AIRFLOW__LOGGING__REMOTE_BASE_LOG_FOLDER=s3://<...>/
   
   ### Operating System
   
   debian:stable-slim
   
   ### Versions of Apache Airflow Providers
   
   apache-airflow-providers-amazon==9.6.1
   
   ### Deployment
   
   Docker-Compose
   
   ### Deployment details
   
    
       networks:
         test:
           name: test
       
       x-airflow--common: &airflow__common
         image: test_airflow
         build:
           context: .
           dockerfile: Dockerfile
         env_file: !override
           - airflow_env
         volumes:
             - ./dags/:/app/dags/
             - ./manager.py:/app/manager.py # simple script init
         depends_on: &airflow__common_depends_on
           airflow_postgres:
             condition: service_healthy
           minio:
             condition: service_healthy
         networks:
           - test
       
       volumes:
         airflow_postgres:
         minio_data:
       
       services:
         airflow_webserver:
           <<: *airflow__common
           hostname: airflow_webserver
           container_name: airflow_webserver
           command: [ "airflow", "api-server" ]
           ports:
             - 8080:8080
           healthcheck:
              test: ["CMD", "curl", "--fail", "http://localhost:8080/api/v2/version"]
             interval: 30s
             timeout: 10s
             retries: 5
             start_period: 30s
           depends_on:
             <<: *airflow__common_depends_on
             airflow_manager:
                 condition: service_completed_successfully
       
         airflow_dags:
           <<: *airflow__common
           hostname: airflow_dags
           container_name: airflow_dags
           command: [ "airflow", "dag-processor" ]
           healthcheck:
              test: ["CMD-SHELL", 'airflow jobs check --job-type DagProcessorJob --hostname "$${HOSTNAME}"']
             interval: 30s
             timeout: 10s
             retries: 5
             start_period: 30s
           depends_on:
             <<: *airflow__common_depends_on
             airflow_manager:
                 condition: service_completed_successfully
       
         airflow_scheduler:
           <<: *airflow__common
           container_name: airflow_scheduler
           hostname: airflow_scheduler
           command: [ "airflow", "scheduler" ]
           healthcheck:
              test: ["CMD", "curl", "--fail", "http://localhost:8974/health"]
             interval: 30s
             timeout: 10s
             retries: 5
             start_period: 30s
           depends_on:
             <<: *airflow__common_depends_on
             airflow_manager:
                 condition: service_completed_successfully
       
         airflow_triggerer:
           <<: *airflow__common
           container_name: airflow_triggerer
           hostname: airflow_triggerer
           command: [ "airflow", "triggerer" ]
           healthcheck:
              test: ["CMD-SHELL", 'airflow jobs check --job-type TriggererJob --hostname "$${HOSTNAME}"']
             interval: 30s
             timeout: 10s
             retries: 5
             start_period: 30s
           depends_on:
             <<: *airflow__common_depends_on
             airflow_manager:
                 condition: service_completed_successfully
       
         airflow_manager:
           <<: *airflow__common
           container_name: airflow_manager
           hostname: airflow_manager
           command: python /app/manager.py
           volumes:
               - ./dags:/app/dags
               - ./manager.py:/app/manager.py
       
         airflow_postgres:
           image: bitnami/postgresql:16.1.0-debian-11-r4
           container_name: airflow_postgres
           hostname: airflow_postgres
           ports:
             - 5432:5432
           healthcheck:
             test: ["CMD-SHELL", "pg_isready -U $${POSTGRESQL_USERNAME}"]
             interval: 10s
             retries: 5
             start_period: 5s
           environment:
             - POSTGRESQL_USERNAME=postgres
             - POSTGRESQL_PASSWORD=postgres
             - POSTGRESQL_DATABASE=postgres
             - POSTGRESQL_POSTGRES_PASSWORD=postgres
           volumes:
             - airflow_postgres:/bitnami/postgresql
           networks:
             - test
       
         minio:
           image: bitnami/minio:2024.9.9
           container_name: minio
           hostname: minio
           ports:
             - 9000:9000
             - 9001:9001
           healthcheck:
             test: ["CMD", "mc", "ready", "local"]
             interval: 5s
             timeout: 5s
             retries: 5
           volumes:
             - minio_data:/bitnami/minio/data
           environment:
             - MINIO_ROOT_USER=admin123
             - MINIO_ROOT_PASSWORD=admin123
           networks:
             - test
   
   ### Anything else?
   
   _No response_
   
   ### Are you willing to submit PR?
   
   - [x] Yes I am willing to submit a PR!
   
   ### Code of Conduct
   
   - [x] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
   

