wolvery opened a new issue, #55148:
URL: https://github.com/apache/airflow/issues/55148
### Apache Airflow version
3.0.6
### If "Other Airflow 2 version" selected, which one?
_No response_
### What happened?
After upgrading from 2.9.3 → 3.0.6, worker pods fail with connection errors
when running tasks that previously succeeded.
The error appears shortly after task start:
```
{"timestamp":"2025-09-01T14:23:58.100611Z","level":"info","event":"Connecting to server:","server":"http://workflow-api-server:8080/execution/","logger":"__main__"}
{"timestamp":"2025-09-01T14:24:00.609317Z","level":"warning","event":"Starting call to 'airflow.sdk.api.client.Client.request', this is the 3rd time calling it.","logger":"airflow.sdk.api.client"}
...
httpx.ConnectError: [Errno 111] Connection refused
```
Eventually, the worker process is killed:
```
{"timestamp":"2025-09-01T14:24:11.946331Z","level":"info","event":"Process exited","pid":18,"exit_code":-9,"signal_sent":"SIGKILL","logger":"supervisor"}
```
Full stack trace (truncated):
```
[2025-09-01T14:23:57.703+0000] {settings.py:345} DEBUG - Setting up DB connection pool (PID 7)
[2025-09-01T14:23:57.705+0000] {settings.py:422} DEBUG - settings.prepare_engine_args(): Using NullPool
[2025-09-01T14:23:58.038+0000] {configuration.py:861} DEBUG - Could not retrieve value from section database, for key sql_alchemy_engine_args. Skipping redaction of this conf.
[2025-09-01T14:23:58.042+0000] {configuration.py:861} DEBUG - Could not retrieve value from section core, for key asset_manager_kwargs. Skipping redaction of this conf.
{"timestamp":"2025-09-01T14:23:58.099056Z","level":"info","event":"Executing workload","workload":"ExecuteTask(token='', ti=TaskInstance(id=UUID('019905a9-a838-76a4-8103-2d729244d294'), task_id='pip_freeze', dag_id='system__pip_freeze', run_id='manual__2025-09-01T14:23:48.637858+00:00', try_number=1, map_index=-1, pool_slots=1, queue='default', priority_weight=1, executor_config=None, parent_context_carrier={}, context_carrier={}, queued_dttm=None), dag_rel_path=PurePosixPath('system_dags/pip_freeze.py'), bundle_info=BundleInfo(name='dags-folder', version=None), log_path='dag_id=system__pip_freeze/run_id=manual__2025-09-01T14:23:48.637858+00:00/task_id=pip_freeze/attempt=1.log', type='ExecuteTask')","logger":"__main__"}
{"timestamp":"2025-09-01T14:23:58.100611Z","level":"info","event":"Connecting to server:","server":"http://workflow-api-server:8080/execution/","logger":"__main__"}
{"timestamp":"2025-09-01T14:23:58.155082Z","level":"info","event":"Secrets backends loaded for worker","count":1,"backend_classes":["EnvironmentVariablesBackend"],"logger":"supervisor"}
{"timestamp":"2025-09-01T14:23:58.269015Z","level":"warning","event":"Starting call to 'airflow.sdk.api.client.Client.request', this is the 1st time calling it.","logger":"airflow.sdk.api.client"}
{"timestamp":"2025-09-01T14:23:59.272105Z","level":"warning","event":"Starting call to 'airflow.sdk.api.client.Client.request', this is the 2nd time calling it.","logger":"airflow.sdk.api.client"}
{"timestamp":"2025-09-01T14:24:00.609317Z","level":"warning","event":"Starting call to 'airflow.sdk.api.client.Client.request', this is the 3rd time calling it.","logger":"airflow.sdk.api.client"}
{"timestamp":"2025-09-01T14:24:02.361164Z","level":"warning","event":"Starting call to 'airflow.sdk.api.client.Client.request', this is the 4th time calling it.","logger":"airflow.sdk.api.client"}
{"timestamp":"2025-09-01T14:24:11.946331Z","level":"info","event":"Process exited","pid":18,"exit_code":-9,"signal_sent":"SIGKILL","logger":"supervisor"}
Traceback (most recent call last):
  File "/home/airflow/.local/lib/python3.10/site-packages/httpx/_transports/default.py", line 101, in map_httpcore_exceptions
    yield
  File "/home/airflow/.local/lib/python3.10/site-packages/httpx/_transports/default.py", line 250, in handle_request
    resp = self._pool.handle_request(req)
  File "/home/airflow/.local/lib/python3.10/site-packages/httpcore/_sync/connection_pool.py", line 256, in handle_request
    raise exc from None
  File "/home/airflow/.local/lib/python3.10/site-packages/httpcore/_sync/connection_pool.py", line 236, in handle_request
    response = connection.handle_request(
  File "/home/airflow/.local/lib/python3.10/site-packages/httpcore/_sync/connection.py", line 101, in handle_request
    raise exc
  File "/home/airflow/.local/lib/python3.10/site-packages/httpcore/_sync/connection.py", line 78, in handle_request
    stream = self._connect(request)
  File "/home/airflow/.local/lib/python3.10/site-packages/httpcore/_sync/connection.py", line 124, in _connect
    stream = self._network_backend.connect_tcp(**kwargs)
  File "/home/airflow/.local/lib/python3.10/site-packages/httpcore/_backends/sync.py", line 207, in connect_tcp
    with map_exceptions(exc_map):
  File "/usr/local/lib/python3.10/contextlib.py", line 153, in __exit__
    self.gen.throw(typ, value, traceback)
  File "/home/airflow/.local/lib/python3.10/site-packages/httpcore/_exceptions.py", line 14, in map_exceptions
    raise to_exc(exc) from exc
httpcore.ConnectError: [Errno 111] Connection refused

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.10/runpy.py", line 196, in _run_module_as_main
    return _run_code(code, main_globals, None,
  File "/usr/local/lib/python3.10/runpy.py", line 86, in _run_code
    exec(code, run_globals)
  File "/home/airflow/.local/lib/python3.10/site-packages/airflow/sdk/execution_time/execute_workload.py", line 125, in <module>
    main()
  File "/home/airflow/.local/lib/python3.10/site-packages/airflow/sdk/execution_time/execute_workload.py", line 121, in main
    execute_workload(workload)
  File "/home/airflow/.local/lib/python3.10/site-packages/airflow/sdk/execution_time/execute_workload.py", line 66, in execute_workload
    supervise(
  File "/home/airflow/.local/lib/python3.10/site-packages/airflow/sdk/execution_time/supervisor.py", line 1829, in supervise
    process = ActivitySubprocess.start(
  File "/home/airflow/.local/lib/python3.10/site-packages/airflow/sdk/execution_time/supervisor.py", line 933, in start
    proc._on_child_started(ti=what, dag_rel_path=dag_rel_path, bundle_info=bundle_info)
  File "/home/airflow/.local/lib/python3.10/site-packages/airflow/sdk/execution_time/supervisor.py", line 944, in _on_child_started
    ti_context = self.client.task_instances.start(ti.id, self.pid, start_date)
  File "/home/airflow/.local/lib/python3.10/site-packages/airflow/sdk/api/client.py", line 152, in start
    resp = self.client.patch(f"task-instances/{id}/run", content=body.model_dump_json())
  File "/home/airflow/.local/lib/python3.10/site-packages/httpx/_client.py", line 1218, in patch
    return self.request(
  File "/home/airflow/.local/lib/python3.10/site-packages/tenacity/__init__.py", line 338, in wrapped_f
    return copy(f, *args, **kw)
  File "/home/airflow/.local/lib/python3.10/site-packages/tenacity/__init__.py", line 477, in __call__
    do = self.iter(retry_state=retry_state)
  File "/home/airflow/.local/lib/python3.10/site-packages/tenacity/__init__.py", line 378, in iter
    result = action(retry_state)
  File "/home/airflow/.local/lib/python3.10/site-packages/tenacity/__init__.py", line 420, in exc_check
    raise retry_exc.reraise()
  File "/home/airflow/.local/lib/python3.10/site-packages/tenacity/__init__.py", line 187, in reraise
    raise self.last_attempt.result()
  File "/usr/local/lib/python3.10/concurrent/futures/_base.py", line 451, in result
    return self.__get_result()
  File "/usr/local/lib/python3.10/concurrent/futures/_base.py", line 403, in __get_result
    raise self._exception
  File "/home/airflow/.local/lib/python3.10/site-packages/tenacity/__init__.py", line 480, in __call__
    result = fn(*args, **kwargs)
  File "/home/airflow/.local/lib/python3.10/site-packages/airflow/sdk/api/client.py", line 735, in request
    return super().request(*args, **kwargs)
  File "/home/airflow/.local/lib/python3.10/site-packages/httpx/_client.py", line 825, in request
    return self.send(request, auth=auth, follow_redirects=follow_redirects)
  File "/home/airflow/.local/lib/python3.10/site-packages/httpx/_client.py", line 914, in send
    response = self._send_handling_auth(
  File "/home/airflow/.local/lib/python3.10/site-packages/httpx/_client.py", line 942, in _send_handling_auth
    response = self._send_handling_redirects(
  File "/home/airflow/.local/lib/python3.10/site-packages/httpx/_client.py", line 979, in _send_handling_redirects
    response = self._send_single_request(request)
  File "/home/airflow/.local/lib/python3.10/site-packages/httpx/_client.py", line 1014, in _send_single_request
    response = transport.handle_request(request)
  File "/home/airflow/.local/lib/python3.10/site-packages/httpx/_transports/default.py", line 249, in handle_request
    with map_httpcore_exceptions():
  File "/usr/local/lib/python3.10/contextlib.py", line 153, in __exit__
    self.gen.throw(typ, value, traceback)
  File "/home/airflow/.local/lib/python3.10/site-packages/httpx/_transports/default.py", line 118, in map_httpcore_exceptions
    raise mapped_exc(message) from exc
httpx.ConnectError: [Errno 111] Connection refused
```
### What you think should happen instead?
The worker should be able to communicate with the Airflow API server and
execute the task.
### How to reproduce
We run Airflow 3.0.6 with the KubernetesExecutor. The failing task is a
`BashOperator`:
```python
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

default_args = {
    "owner": "data_platform",
    "depends_on_past": False,
    "start_date": datetime(2023, 4, 27),
}

with DAG(
    "system__pip_freeze",
    catchup=False,
    default_args=default_args,
    schedule=None,
    tags=["system", "utils", "debug"],
):
    BashOperator(
        task_id="pip_freeze",
        bash_command="pip freeze",
    )
```
We are using this image as base:
https://hub.docker.com/layers/apache/airflow/3.0.6-python3.10/images/sha256-7f0989cb20b7e2bf5cb59b73be86ce7a1681693578ab26e236dd85601e464ac0
### Operating System
Debian GNU/Linux 12 (bookworm)
### Versions of Apache Airflow Providers
apache-airflow==3.0.6
apache-airflow-core==3.0.6
apache-airflow-providers-amazon==9.12.0
apache-airflow-providers-celery==3.12.2
apache-airflow-providers-cncf-kubernetes==10.7.0
apache-airflow-providers-common-compat==1.7.3
apache-airflow-providers-common-io==1.6.2
apache-airflow-providers-common-messaging==1.0.5
apache-airflow-providers-common-sql==1.27.5
apache-airflow-providers-docker==4.4.2
apache-airflow-providers-elasticsearch==6.3.2
apache-airflow-providers-fab==2.4.1
apache-airflow-providers-ftp==3.13.2
apache-airflow-providers-git==0.0.6
apache-airflow-providers-google==17.1.0
apache-airflow-providers-grpc==3.8.2
apache-airflow-providers-hashicorp==4.3.2
apache-airflow-providers-http==5.3.3
apache-airflow-providers-microsoft-azure==12.6.1
apache-airflow-providers-mysql==6.3.3
apache-airflow-providers-odbc==4.10.2
apache-airflow-providers-openlineage==2.6.1
apache-airflow-providers-postgres==6.2.3
apache-airflow-providers-redis==4.2.0
apache-airflow-providers-sendgrid==4.1.3
apache-airflow-providers-sftp==5.3.4
apache-airflow-providers-slack==9.1.4
apache-airflow-providers-smtp==2.2.0
apache-airflow-providers-snowflake==6.5.2
apache-airflow-providers-ssh==4.1.3
apache-airflow-providers-standard==1.6.0
apache-airflow-task-sdk==1.0.6
### Deployment
Official Apache Airflow Helm Chart
### Deployment details
Helm chart version 1.18.0.
### Anything else?
Could this be related to environment variables? I saw that HTTPx reads proxy
settings from the environment by default:
https://www.python-httpx.org/environment_variables/
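For what it's worth, a plain TCP probe run from inside a failing worker pod could help separate a DNS/network problem from an API server that simply isn't listening. This is only a debugging sketch (the `can_connect` helper is hypothetical; the host and port are taken from the "Connecting to server" log line above). It also dumps the proxy variables that httpx honors by default unless the client is created with `trust_env=False`:

```python
import os
import socket


def can_connect(host: str, port: int, timeout: float = 5.0) -> bool:
    """Return True if a plain TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:  # covers ConnectionRefusedError and DNS failures alike
        return False


if __name__ == "__main__":
    # Proxy-related variables that httpx picks up when trust_env=True (default).
    proxies = {
        v: os.environ[v]
        for v in ("HTTP_PROXY", "HTTPS_PROXY", "ALL_PROXY", "NO_PROXY",
                  "http_proxy", "https_proxy", "all_proxy", "no_proxy")
        if v in os.environ
    }
    print("proxy env vars:", proxies or "none")
    # Host/port from the "Connecting to server" log line.
    print("api-server reachable:", can_connect("workflow-api-server", 8080))
```

If the probe succeeds but httpx still fails, a proxy variable intercepting the in-cluster hostname would be a plausible culprit; if the probe also fails, it points at DNS or the service itself.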
### Are you willing to submit PR?
- [x] Yes I am willing to submit a PR!
### Code of Conduct
- [x] I agree to follow this project's [Code of
Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]