Re: [I] Error in worker logs: KeyError: 'task_name' [airflow]
kiros19 commented on issue #49966:
URL: https://github.com/apache/airflow/issues/49966#issuecomment-381212
Hello, @nailo2c
No, the error doesn't cause the task to fail. The worker is spamming the logs shown above, and RabbitMQ reports that the worker closes the connection shortly after opening it.
Could you please share your Celery config so that I can try your setup (since you don't see the same issue, I believe)? Thank you!
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]
Re: [I] Error in worker logs: KeyError: 'task_name' [airflow]
nailo2c commented on issue #49966:
URL: https://github.com/apache/airflow/issues/49966#issuecomment-3875298296
Hi, I found a bug in the Celery worker log when setting `celery_stdout_stderr_separation=True`, and I opened a PR to fix it. However, I can't reproduce @kiros19's issue since their `celery_stdout_stderr_separation` is set to `False`. Could you tell me more about what happens when this error occurs? Does it cause your task to fail?
Re: [I] Error in worker logs: KeyError: 'task_name' [airflow]
kiros19 commented on issue #49966:
URL: https://github.com/apache/airflow/issues/49966#issuecomment-3852224693
> (quoting #issuecomment-3845726832, which is reproduced in full in the next message)
Re: [I] Error in worker logs: KeyError: 'task_name' [airflow]
kiros19 commented on issue #49966:
URL: https://github.com/apache/airflow/issues/49966#issuecomment-3845726832
I've got the same error on this setup:
Airflow 3.1.6 + apache-airflow-providers-fab 3.1.2 (trying to solve the
apiserver UI not opening after some time)
CeleryExecutor + rabbitMQ:3-management
Docker
I've seen the error before, but I believe some reloads and version changes
made it disappear somehow.
The problem as I see it:
Celery has some built-in log formatting setting that's failing.
That's causing the worker to close its AMQP connection to RabbitMQ.
I believe that logging config is built-in, as my config is default and has
'taskname' everywhere, not 'task_name'.
And the logs were not modified by scripts or anything; it's clean.
The most suspicious thing is that Airflow jobs are still executed OK. Nothing
is failing. But how could that be OK if the AMQP connection is closed
instantly on opening, due to that Celery logging error?
Another suspicious thing: the error does not appear on my prod setup:
Airflow 3.1.5 + airflow-providers-fab 3.0.3
And I'm not sure I've seen it on my previous test setup (not 100% sure here):
Airflow 3.1.6 + apache-airflow-providers-fab 3.1.1
Below are my RabbitMQ and Celery logs and my logging config.
RABBITMQ:
--ok
2026-02-04 06:46:26.518060+00:00 [info] <0.3105.0> accepting AMQP connection
<0.3105.0> (172.18.0.4:46058 -> 172.18.0.2:5672)
2026-02-04 06:46:26.520354+00:00 [info] <0.3105.0> connection <0.3105.0>
(172.18.0.4:46058 -> 172.18.0.2:5672): user 'airflow' authenticated and granted
access to vhost 'airflow_queue'
2026-02-04 06:46:26.528031+00:00 [info] <0.3122.0> accepting AMQP connection
<0.3122.0> (172.18.0.4:46064 -> 172.18.0.2:5672)
2026-02-04 06:46:26.529746+00:00 [info] <0.3122.0> connection <0.3122.0>
(172.18.0.4:46064 -> 172.18.0.2:5672): user 'airflow' authenticated and granted
access to vhost 'airflow_queue'
--instantly closing
2026-02-04 06:46:26.93+00:00 [warning] <0.3105.0> closing AMQP
connection <0.3105.0> (172.18.0.4:46058 -> 172.18.0.2:5672, vhost:
'airflow_queue', user: 'airflow'):
2026-02-04 06:46:26.93+00:00 [warning] <0.3105.0> client unexpectedly
closed TCP connection
2026-02-04 06:46:26.922550+00:00 [warning] <0.3122.0> closing AMQP
connection <0.3122.0> (172.18.0.4:46064 -> 172.18.0.2:5672, vhost:
'airflow_queue', user: 'airflow'):
2026-02-04 06:46:26.922550+00:00 [warning] <0.3122.0> client unexpectedly
closed TCP connection
WORKER:
--- Logging error ---
Traceback (most recent call last):
File "/usr/python/lib/python3.12/logging/__init__.py", line 464, in format
return self._format(record)
File "/usr/python/lib/python3.12/logging/__init__.py", line 460, in _format
return self._fmt % values
~~^~~~
KeyError: 'task_name'
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/usr/python/lib/python3.12/logging/__init__.py", line 1160, in emit
msg = self.format(record)
^^^
File "/usr/python/lib/python3.12/logging/__init__.py", line 999, in format
return fmt.format(record)
^^
File "/usr/python/lib/python3.12/logging/__init__.py", line 706, in format
s = self.formatMessage(record)
^^
File "/usr/python/lib/python3.12/logging/__init__.py", line 675, in
formatMessage
return self._style.format(record)
^^
File "/usr/python/lib/python3.12/logging/__init__.py", line 466, in format
raise ValueError('Formatting field not found in record: %s' % e)
ValueError: Formatting field not found in record: 'task_name'
Call stack:
File "/home/airflow/.local/bin/airflow", line 7, in
sys.exit(main())
File
"/home/airflow/.local/lib/python3.12/site-packages/airflow/__main__.py", line
55, in main
args.func(args)
File
"/home/airflow/.local/lib/python3.12/site-packages/airflow/cli/cli_config.py",
line 49, in command
return func(*args, **kwargs)
File
"/home/airflow/.local/lib/python3.12/site-packages/airflow/utils/cli.py", line
114, in wrapper
return f(*args, **kwargs)
File
"/home/airflow/.local/lib/python3.12/site-packages/airflow/providers/celery/cli/celery_command.py",
line 66, in wrapper
providers_configuration_loaded(func)(*args, **kwargs)
File
"/home/airflow/.local/lib/python3.12/site-packages/airflow/utils/providers_configuration_loader.py",
line 54, in wrapped_function
return func(*args, **kwargs)
File
"/home/airflow/.local/lib/python3.12/site-packages/airflow/providers/celery/cli/celery_command.py",
line 293, in worker
_run_command_with_daemon_option(
File
"/home/airflow/.local/lib/python3.12/site-packages/airflow/providers/celery/cli/celery_comm
Re: [I] Error in worker logs: KeyError: 'task_name' [airflow]
potiuk commented on issue #49966:
URL: https://github.com/apache/airflow/issues/49966#issuecomment-3660009847
I think there is something terribly wrong in your logging configuration. I recommend removing your custom configuration and then gradually adding back the modifications and customisations you have made. There is no way for us to know, or debug, your logging setup; you likely inherited it from someone in the past, and now it's unfortunately your task to debug it and, if needed, bisect it by intelligently guessing which part of the configuration is causing it. Look at your local_settings and any special customisations you have on top of the standard Airflow configuration. My best guess is that somewhere some of your logging adds (in a custom way) and expects "task_name" to be present when a record is logged, but only inspecting your customisations can tell what it is and how.
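While bisecting a configuration like the one suggested above, one stopgap is a `logging.Filter` that injects a placeholder value so a formatter referencing `%(task_name)s` cannot raise. This is only an illustrative sketch, not an official Airflow fix; the class name and the `"-"` placeholder are made up here:

```python
import logging

class DefaultTaskNameFilter(logging.Filter):
    """Illustrative workaround (NOT an official Airflow fix): inject a
    default 'task_name' so a formatter that references %(task_name)s
    cannot raise on records that lack the attribute."""

    def filter(self, record: logging.LogRecord) -> bool:
        if not hasattr(record, "task_name"):
            record.task_name = "-"
        return True  # never drop the record, only enrich it

handler = logging.StreamHandler()
handler.setFormatter(logging.Formatter("[%(task_name)s] %(message)s"))
handler.addFilter(DefaultTaskNameFilter())

logger = logging.getLogger("demo")
logger.addHandler(handler)
logger.propagate = False
logger.warning("no task context here")  # emits "[-] no task context here"
```

Attaching such a filter to the affected handler keeps the worker logs readable while the offending formatter is tracked down.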
Re: [I] Error in worker logs: KeyError: 'task_name' [airflow]
nicolamarangoni commented on issue #49966:
URL: https://github.com/apache/airflow/issues/49966#issuecomment-3655192489
Hello, in version 3.1.5, this behavior is still present at startup in the worker logs (see attachment).
Regards, Nicola
[worker-logs-315.csv](https://github.com/user-attachments/files/24164645/worker-logs-315.csv)
Re: [I] Error in worker logs: KeyError: 'task_name' [airflow]
nicolamarangoni commented on issue #49966:
URL: https://github.com/apache/airflow/issues/49966#issuecomment-3154513733
[log-events-viewer-result.csv](https://github.com/user-attachments/files/21596518/log-events-viewer-result.csv)
I see an additional error in the worker logs when I execute a task, even if the task is successful (see attached logs).
Re: [I] Error in worker logs: KeyError: 'task_name' [airflow]
nicolamarangoni commented on issue #49966:
URL: https://github.com/apache/airflow/issues/49966#issuecomment-3153965058
@amoghrajesh I'm able to run tasks now. However, the error in the worker logs at startup is still there. Wouldn't it be better to wait for version 3.0.4?
Re: [I] Error in worker logs: KeyError: 'task_name' [airflow]
amoghrajesh commented on issue #49966:
URL: https://github.com/apache/airflow/issues/49966#issuecomment-3153633060
@nicolamarangoni are we good to close this issue?
Re: [I] Error in worker logs: KeyError: 'task_name' [airflow]
nicolamarangoni commented on issue #49966:
URL: https://github.com/apache/airflow/issues/49966#issuecomment-3150952881
@amoghrajesh thanks! I'm looking forward to version 3.0.4!
Re: [I] Error in worker logs: KeyError: 'task_name' [airflow]
amoghrajesh commented on issue #49966:
URL: https://github.com/apache/airflow/issues/49966#issuecomment-3149481077
If it's the queue issue, it has been fixed by: https://github.com/apache/airflow/pull/52871
Something worth checking out.
Re: [I] Error in worker logs: KeyError: 'task_name' [airflow]
nicolamarangoni commented on issue #49966:
URL: https://github.com/apache/airflow/issues/49966#issuecomment-3145063396
FYI, I found the root cause of https://github.com/apache/airflow/issues/42136
In my setup with AWS SQS (not Redis), AIRFLOW__OPERATORS__DEFAULT_QUEUE has been ignored by the scheduler since version 3.0.x; the scheduler only uses the queue named `default`. If I set AIRFLOW__OPERATORS__DEFAULT_QUEUE=default I can run tasks, but the logging error at the workers' startup persists. Maybe the two issues are unrelated, but who knows...
Re: [I] Error in worker logs: KeyError: 'task_name' [airflow]
nicolamarangoni commented on issue #49966:
URL: https://github.com/apache/airflow/issues/49966#issuecomment-3132902940
@amoghrajesh I did a fresh install of Airflow. There are currently no DAGs at all in the instance, and I still see the same error at startup in the worker logs. This error is not related to any DAG/task run; it appears at startup.
Re: [I] Error in worker logs: KeyError: 'task_name' [airflow]
amoghrajesh commented on issue #49966:
URL: https://github.com/apache/airflow/issues/49966#issuecomment-3130704867
@nicolamarangoni looking more carefully, this could be a logging error. Your logging formatter is trying to substitute a field `%(task_name)s`, but the log record being emitted doesn't always have that attribute. Check your task code for potential usage of `task_name`, or share your DAG code if you can.
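The failure mode described above can be reproduced in a few lines of plain Python, independent of Airflow or Celery. The format string here is only a stand-in for whatever formatter is active in the affected setup:

```python
import logging

# Stand-in for a formatter that references a custom %(task_name)s field.
formatter = logging.Formatter("%(asctime)s %(task_name)s %(message)s")

# A plain LogRecord, like the ones emitted during worker startup,
# carries no 'task_name' attribute.
record = logging.LogRecord(
    name="celery.worker", level=logging.INFO, pathname="worker.py",
    lineno=1, msg="celery@host ready.", args=(), exc_info=None,
)

try:
    formatter.format(record)
except ValueError as exc:
    # The %-style formatter turns the underlying KeyError into:
    # "Formatting field not found in record: 'task_name'"
    print(exc)
```

This matches the traceback shown earlier in the thread: the inner `self._fmt % values` raises `KeyError: 'task_name'`, which the style object re-raises as a `ValueError`.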
Re: [I] Error in worker logs: KeyError: 'task_name' [airflow]
nicolamarangoni commented on issue #49966:
URL: https://github.com/apache/airflow/issues/49966#issuecomment-3128580148
@amoghrajesh I have this in my config:
```
$ airflow config list
base_url = https://dev.myinternal.company.domain.tech
```
It's the URL of the web UI.
Re: [I] Error in worker logs: KeyError: 'task_name' [airflow]
amoghrajesh commented on issue #49966:
URL: https://github.com/apache/airflow/issues/49966#issuecomment-3125670185
@nicolamarangoni could you check what's set as config for `base_url`?
Re: [I] Error in worker logs: KeyError: 'task_name' [airflow]
nicolamarangoni commented on issue #49966:
URL: https://github.com/apache/airflow/issues/49966#issuecomment-3113830991
@amoghrajesh Thank you for the hint. I downloaded and substituted the supervisor.py file into my image and restarted the ECS services. I still see the error (see logs in attachment). However, I think it's only a false positive, because the ECS service of the worker is running. Currently my issue is this one: https://github.com/apache/airflow/issues/42136#issuecomment-3097260505
I don't know if the two issues are related.
Re: [I] Error in worker logs: KeyError: 'task_name' [airflow]
amoghrajesh commented on issue #49966:
URL: https://github.com/apache/airflow/issues/49966#issuecomment-3101243612
@nicolamarangoni I merged a recent fix related to this: [Handle invalid execution API urls in supervisor #53082](https://github.com/apache/airflow/pull/53082)
Could you pull that commit into your deployment and try again?
Re: [I] Error in worker logs: KeyError: 'task_name' [airflow]
nicolamarangoni commented on issue #49966:
URL: https://github.com/apache/airflow/issues/49966#issuecomment-3097235884
These are the logs:
[extract-2025-07-21T15_23_57.822Z.csv](https://github.com/user-attachments/files/21350268/extract-2025-07-21T15_23_57.822Z.csv)
Re: [I] Error in worker logs: KeyError: 'task_name' [airflow]
nicolamarangoni commented on issue #49966:
URL: https://github.com/apache/airflow/issues/49966#issuecomment-3097233842
After setting AIRFLOW__CORE__EXECUTION_API_SERVER_URL I still have the same
error, but in another place:
https://github.com/celery/celery/blob/main/celery/worker/consumer/connection.py
line 22:
```python
def start(self, c):
    c.connection = c.connect()
    info('Connected to %s', c.connection.as_uri())
```
Re: [I] Error in worker logs: KeyError: 'task_name' [airflow]
nicolamarangoni commented on issue #49966:
URL: https://github.com/apache/airflow/issues/49966#issuecomment-3097030854
My installation steps consist of a huge Terraform deployment.
However, I think the error happens here:
`celery/apps/worker.py`, line 176:
https://github.com/celery/celery/blob/main/celery/apps/worker.py
This is the code:
```python
def on_consumer_ready(self, consumer):
    signals.worker_ready.send(sender=consumer)
    logger.info('%s ready.', safe_str(self.hostname))
```
It's related to this issue:
https://github.com/apache/airflow/issues/42136#issuecomment-3052053161
Re: [I] Error in worker logs: KeyError: 'task_name' [airflow]
nailo2c commented on issue #49966:
URL: https://github.com/apache/airflow/issues/49966#issuecomment-3001404721
I can't reproduce it either. Could you share your Celery version?
Re: [I] Error in worker logs: KeyError: 'task_name' [airflow]
amoghrajesh commented on issue #49966:
URL: https://github.com/apache/airflow/issues/49966#issuecomment-2861894496
@nicolamarangoni can you help with your installation steps?
