jordi-crespo opened a new issue, #35817: URL: https://github.com/apache/airflow/issues/35817
### Apache Airflow version

Other Airflow 2 version (please specify below)

### What happened

airflow_version: 2.6.2
Deployed in: Azure Kubernetes Service

I am encountering a persistent and unusual issue with the StatsD configuration in Airflow. Despite following the guidelines in the [Datadog Integration Documentation](https://docs.datadoghq.com/integrations/airflow/?tab=containerized) and the [Airflow Helm Chart Production Guide](https://airflow.apache.org/docs/helm-chart/1.9.0/production-guide.html), I consistently see an XML error message in place of the `statsd_host` value.

### Error Message

The `statsd_host` variable is set as follows in my configuration:

```yaml
- name: AIRFLOW__METRICS__STATSD_HOST
  value: '10.0.32.180'
```

However, regardless of the configuration I use (including different host settings), the Airflow UI shows the following error under the "Configuration" tab in the "Metrics" section for `statsd_host`:

```xml
<?xml version="1.0" encoding="utf-8"?>
<Error xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:xsd="http://www.w3.org/2001/XMLSchema">
  <Code>ResourceNotFound</Code>
  <Message>The specified resource does not exist.</Message>
  <Details>'latest' isn't a valid resource name.</Details>
</Error>
```

<img width="1495" alt="image" src="https://github.com/apache/airflow/assets/43036171/ff4f3122-552d-4896-911c-28723d3ce790">

If I exec into one of the pods and echo the `AIRFLOW__METRICS__STATSD_HOST` env variable, I get `10.0.32.180`.

## Attempts to Resolve

I've tried various configurations for the `statsd_host` variable, including the service name, the IP address, and even `localhost`. I've verified network connectivity and confirmed that there are no apparent network or firewall issues blocking communication. The error persists regardless of the method used to set `statsd_host`.
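To rule out the network path independently of Airflow, one sanity check is to hand-craft a StatsD datagram (plain UDP, `name:value|type` wire format) and send it to the configured host. The sketch below is illustrative only: it binds a local listener to stand in for the StatsD/Datadog agent, since the real agent address (`10.0.32.180:8125` in my setup) is an assumption of this example.

```python
import socket

def send_statsd_metric(host, port, name, value, metric_type="c"):
    """Send a single StatsD-formatted metric ("name:value|type") over UDP."""
    payload = f"{name}:{value}|{metric_type}".encode()
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.sendto(payload, (host, port))
    return payload

# Local UDP listener standing in for the agent (swap in the real host:port to test in-cluster).
listener = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
listener.bind(("127.0.0.1", 0))            # ephemeral port
port = listener.getsockname()[1]

send_statsd_metric("127.0.0.1", port, "airflow.test_counter", 1)
data, _ = listener.recvfrom(1024)
print(data.decode())                       # airflow.test_counter:1|c
listener.close()
```

Since StatsD rides on UDP, a successful `sendto` alone proves nothing (UDP is fire-and-forget); to verify delivery you need the receiving side, e.g. the Datadog agent's own status output.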
## Seeking Help

I am looking for guidance on what might be causing this issue and how to resolve it. The configuration appears correct according to the documentation, but the error suggests a deeper problem, possibly in how Airflow interprets or fetches the environment variable. Any help or insights from the community would be greatly appreciated.

### What you think should happen instead

When configuring the `AIRFLOW__METRICS__STATSD_HOST` environment variable, I expect Airflow to correctly interpret and use the provided host value (in this case, `10.0.32.180`) for sending metrics to StatsD/Datadog. The expected outcomes are:

- **Correct resolution of the host value:** Airflow should resolve the `statsd_host` value without errors and use it to establish a connection to the StatsD server or the Datadog agent.
- **No XML error messages:** The configuration should not produce XML error messages like `ResourceNotFound`. It should either connect successfully to the StatsD/Datadog service or report a clear, relevant error if the connection fails.
- **Visible confirmation in the Airflow UI:** Under the "Configuration" tab in the "Metrics" section, the `statsd_host` setting should reflect the entered value without displaying unrelated XML error messages.
- **Successful metrics transmission:** Metrics collected by Airflow should be transmitted to the specified StatsD/Datadog host, allowing monitoring and analysis through the Datadog platform.

In summary, I expect a seamless integration in which Airflow sends metrics to the specified StatsD/Datadog service without XML-related errors or misinterpretation of the configuration settings.

### How to reproduce

**Deploy Airflow in Kubernetes:** Use the Airflow Helm chart to deploy Airflow in a Kubernetes cluster.
Make sure to include the following environment variable configuration in your Helm chart's `values.yaml` file or as part of your deployment configuration:

```yaml
env:
  - name: AIRFLOW__METRICS__STATSD_HOST
    value: '10.0.32.180'
```

**Access the Airflow UI:** Once Airflow is deployed, open the Airflow web UI.

**Navigate to the Configuration tab:** In the Airflow UI, open the 'Admin' menu and select 'Configuration'.

**Observe the error:** In the 'Metrics' section under the 'Configuration' tab, observe the XML error message displayed for the `statsd_host` setting.

### Operating System

Docker, building the image from `python:3.10-slim-buster`

### Versions of Apache Airflow Providers

```
apache-airflow-providers-http==4.3.0
apache-airflow-providers-jdbc==3.3.0
apache-airflow-providers-celery==3.1.0
apache-airflow-providers-snowflake==4.0.5
apache-airflow-providers-google==10.0.0
apache-airflow-providers-mysql==5.3.1
apache-airflow-providers-common-sql==1.4.0
apache-airflow-providers-ftp==3.3.1
apache-airflow-providers-imap==3.1.1
apache-airflow-providers-sqlite==3.3.2
apache-airflow-providers-microsoft-azure==7.0.0
```

### Deployment

Official Apache Airflow Helm Chart

### Deployment details

_No response_

### Anything else

_No response_

### Are you willing to submit PR?

- [ ] Yes I am willing to submit a PR!

### Code of Conduct

- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
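For context on why the env var above should be picked up at all: Airflow's documented convention maps `AIRFLOW__{SECTION}__{KEY}` environment variables onto `[section] key` config options, so `AIRFLOW__METRICS__STATSD_HOST` should resolve to `statsd_host` in `[metrics]`. The snippet below is only a minimal illustration of that naming convention (not Airflow's actual parser):

```python
import os

def airflow_env_to_option(env_name):
    """Split an AIRFLOW__SECTION__KEY env var name into (section, key)."""
    prefix = "AIRFLOW__"
    if not env_name.startswith(prefix):
        raise ValueError(f"not an Airflow config env var: {env_name}")
    section, _, key = env_name[len(prefix):].partition("__")
    return section.lower(), key.lower()

# The value I set in the pod spec, echoed back exactly as configured.
os.environ["AIRFLOW__METRICS__STATSD_HOST"] = "10.0.32.180"
section, key = airflow_env_to_option("AIRFLOW__METRICS__STATSD_HOST")
print(section, key, os.environ["AIRFLOW__METRICS__STATSD_HOST"])
# → metrics statsd_host 10.0.32.180
```

That the raw env var echoes correctly inside the pod suggests the mapping itself is not the problem, which is why the XML error in the UI is so surprising.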