poilade-de-coulemelle opened a new issue, #63736:
URL: https://github.com/apache/airflow/issues/63736

   ### Apache Airflow Provider(s)
   
   elasticsearch
   
   ### Versions of Apache Airflow Providers
   
   ```
   apache-airflow-providers-elasticsearch==6.5.0
   elastic-transport==8.17.1
   elasticsearch==8.19.3
   ```
   
   ### Apache Airflow version
   
   3.1.8
   
   ### Operating System
   
   debian
   
   ### Deployment
   
   Official Apache Airflow Helm Chart
   
   ### Deployment details
   
   Configuration of the Elasticsearch section in the Helm chart used:
   
   ```
             elasticsearch:
               write_stdout: 'False'
               write_to_es: 'True'
               json_format: 'True'
               host_field: 'host.name'
               index_patterns: 'airflow-logs'
                log_id_template: '{dag_id}-{task_id}-{run_id}-{map_index}-{try_number}'
   ```
   
   Resulting in the Airflow configuration below:
   
   ```
   [elasticsearch]
   host = https://##########/elastic
   log_id_template = {dag_id}-{task_id}-{run_id}-{map_index}-{try_number}
   end_of_log_mark = end_of_log
   frontend = 
   write_stdout = False
   write_to_es = True
   target_index = airflow-logs
   json_format = True
   json_fields = asctime, filename, lineno, levelname, message
   host_field = host.name
   offset_field = offset
   index_patterns = airflow-logs
   index_patterns_callable = 
   max_lines_per_page = 1000
   [elasticsearch_configs]
   http_compress = False
   verify_certs = True
   max_retries = 3
   retry_timeout = True
   timeout = 30
   use_ssl = True
   ```
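   For reference, the `log_id_template` above determines the `log_id` the provider uses to look up a task's log documents. A minimal illustration of how it resolves (the field values are hypothetical, taken from the DAG example and run ids in this report; the composition itself is plain `str.format`):

    ```python
    # Resolve the configured log_id_template into a concrete log_id.
    # Values below are illustrative, not from a real run.
    log_id_template = "{dag_id}-{task_id}-{run_id}-{map_index}-{try_number}"

    log_id = log_id_template.format(
        dag_id="failing_dag_airflow_3",
        task_id="log_and_raise",
        run_id="manual__2026-03-16T14:16:02.853480+00:00",
        map_index=-1,
        try_number=1,
    )
    print(log_id)
    ```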
   
   
   ### What happened
   
   When using Elasticsearch to store Airflow task logs and viewing them in the Airflow task logs tab, exceptions raised by a task are not visible.
   Only the information in the JSON field `event` appears to be displayed; the information in the field `error_detail` and its nested fields is missing.
   
   If I check on the Elasticsearch side, I can see the field `error_detail` in my document, so the information itself is not missing. See below:
   
   ```
     "error_detail": [
         {
           "is_cause": false,
           "frames": [
             {
               "filename": 
"/home/airflow/.local/lib/python3.11/site-packages/airflow/sdk/execution_time/task_runner.py",
               "lineno": 1112,
               "name": "run"
             },
             {
               "filename": 
"/home/airflow/.local/lib/python3.11/site-packages/airflow/sdk/execution_time/task_runner.py",
               "lineno": 1528,
               "name": "_execute_task"
             },
             {
               "filename": 
"/home/airflow/.local/lib/python3.11/site-packages/airflow/sdk/bases/operator.py",
               "lineno": 417,
               "name": "wrapper"
             },
             {
               "filename": 
"/home/airflow/.local/lib/python3.11/site-packages/airflow/providers/standard/operators/python.py",
               "lineno": 228,
               "name": "execute"
             },
             {
               "filename": 
"/home/airflow/.local/lib/python3.11/site-packages/airflow/providers/standard/operators/python.py",
               "lineno": 251,
               "name": "execute_callable"
             },
             {
               "filename": 
"/home/airflow/.local/lib/python3.11/site-packages/airflow/sdk/execution_time/callback_runner.py",
               "lineno": 82,
               "name": "run"
             },
             {
               "filename": 
"/opt/airflow/dags/repo/airflow/src/dags/###/build_dag_fail.py",
               "lineno": 13,
               "name": "log_and_raise"
             }
           ],
           "exc_notes": [],
           "syntax_error": null,
           "exc_type": "RuntimeError",
           "exc_value": "Woopsie. Something went wrong. This is an exception 
statement.",
           "exceptions": [],
           "is_group": false
         }
      ],
    ```
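   The structured `error_detail` above contains everything needed to reconstruct a classic traceback. A minimal sketch of that rendering, assuming the field layout shown in the document above (`render_error_detail` is a hypothetical helper for illustration, not provider code):

    ```python
    def render_error_detail(error_detail):
        """Render an error_detail list (as stored in Elasticsearch)
        back into a human-readable traceback string."""
        lines = []
        for exc in error_detail:
            lines.append("Traceback (most recent call last):")
            for frame in exc["frames"]:
                lines.append(
                    f'  File "{frame["filename"]}", line {frame["lineno"]}, in {frame["name"]}'
                )
            lines.append(f'{exc["exc_type"]}: {exc["exc_value"]}')
        return "\n".join(lines)


    # Example with a single hypothetical frame:
    detail = [
        {
            "frames": [
                {"filename": "build_dag_fail.py", "lineno": 13, "name": "log_and_raise"}
            ],
            "exc_type": "RuntimeError",
            "exc_value": "Woopsie. Something went wrong. This is an exception statement.",
        }
    ]
    print(render_error_detail(detail))
    ```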
   
   
   
   ### What you think should happen instead
   
   I would expect the Airflow 3 task logs tab to show the same information as Airflow 2.
   
   Below is the log visible in Airflow 2:
   
   ```
   [2026-03-16, 10:32:25 UTC] {__init__.py:54} DEBUG - Loading core task 
runner: StandardTaskRunner
   [2026-03-16, 10:32:25 UTC] {base_task_runner.py:73} DEBUG - Planning to run 
as the  user
   [2026-03-16, 10:32:25 UTC] {local_task_job_runner.py:123} ▶ Pre task 
execution logs
   [2026-03-16, 10:32:25 UTC] {build_dag_fail.py:8} DEBUG - this is a debugging 
statement.
   [2026-03-16, 10:32:25 UTC] {build_dag_fail.py:9} INFO - this is an info 
statement.
   [2026-03-16, 10:32:25 UTC] {build_dag_fail.py:10} WARNING - this is a 
warning statement.
   [2026-03-16, 10:32:25 UTC] {build_dag_fail.py:11} ERROR - this is an error 
statement.
   [2026-03-16, 10:32:25 UTC] {taskinstance.py:3313} ERROR - Task failed with 
exception
   Traceback (most recent call last):
     File 
"/home/airflow/.local/lib/python3.11/site-packages/airflow/models/taskinstance.py",
 line 768, in _execute_task
       result = _execute_callable(context=context, **execute_callable_kwargs)
                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
     File 
"/home/airflow/.local/lib/python3.11/site-packages/airflow/models/taskinstance.py",
 line 734, in _execute_callable
       return ExecutionCallableRunner(
              ^^^^^^^^^^^^^^^^^^^^^^^^
     File 
"/home/airflow/.local/lib/python3.11/site-packages/airflow/utils/operator_helpers.py",
 line 252, in run
       return self.func(*args, **kwargs)
              ^^^^^^^^^^^^^^^^^^^^^^^^^^
     File 
"/home/airflow/.local/lib/python3.11/site-packages/airflow/models/baseoperator.py",
 line 424, in wrapper
       return func(self, *args, **kwargs)
              ^^^^^^^^^^^^^^^^^^^^^^^^^^^
     File 
"/home/airflow/.local/lib/python3.11/site-packages/airflow/operators/python.py",
 line 238, in execute
       return_value = self.execute_callable()
                      ^^^^^^^^^^^^^^^^^^^^^^^
     File 
"/home/airflow/.local/lib/python3.11/site-packages/airflow/operators/python.py",
 line 256, in execute_callable
       return runner.run(*self.op_args, **self.op_kwargs)
              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
     File 
"/home/airflow/.local/lib/python3.11/site-packages/airflow/utils/operator_helpers.py",
 line 252, in run
       return self.func(*args, **kwargs)
              ^^^^^^^^^^^^^^^^^^^^^^^^^^
     File "/opt/airflow/dags/repo/airflow/src/dags/####/build_dag_fail.py", 
line 13, in log_and_raise
       raise RuntimeError("Woopsie. Something went wrong. This is an exception 
statement.")
   RuntimeError: Woopsie. Something went wrong. This is an exception statement.
   [2026-03-16, 10:32:25 UTC] {taskinstance.py:907} DEBUG - Task Duration set 
to 0.249458
   [2026-03-16, 10:32:25 UTC] {taskinstance.py:929} DEBUG - Clearing 
next_method and next_kwargs.
   [2026-03-16, 10:32:25 UTC] {taskinstance.py:1226} INFO - Marking task as 
FAILED. dag_id=failing_dag_airflow_2, task_id=log_and_raise, 
run_id=manual__2026-03-16T10:32:06.339293+00:00, 
execution_date=20260316T103206, start_date=20260316T103225, 
end_date=20260316T103225
   [2026-03-16, 10:32:25 UTC] {taskinstance.py:341} ▶ Post task execution logs
   ```
   In Airflow 3, the information after `Task failed with exception` is missing: the stack trace details should appear there.
   
   ```
   [2026-03-16 15:16:35] DEBUG - Loading plugins source=airflow.plugins_manager
   [2026-03-16 15:16:35] INFO - Stats instance was created in PID 7 but 
accessed in PID 22. Re-initializing. source=airflow.stats
   [2026-03-16 15:16:35] DEBUG - Loading plugins from directory: 
/opt/airflow/plugins source=airflow.plugins_manager
   [2026-03-16 15:16:35] DEBUG - Loading plugins from entrypoints 
source=airflow.plugins_manager
   [2026-03-16 15:16:35] DEBUG - Importing entry_point plugin openlineage 
source=airflow.plugins_manager
   [2026-03-16 15:16:35] DEBUG - Loading 1 plugin(s) took 47.44 ms 
source=airflow.plugins_manager
   [2026-03-16 15:16:35] DEBUG - Calling 'on_starting' with {'component': 
<airflow.sdk.execution_time.task_runner.TaskRunnerMarker object at 
0x7755924515d0>} source=airflow.listeners.listener
   [2026-03-16 15:16:35] DEBUG - Hook impls: [] 
source=airflow.listeners.listener
   [2026-03-16 15:16:35] DEBUG - Result from 'on_starting': [] 
source=airflow.listeners.listener
   [2026-03-16 15:16:36] INFO - DAG bundles loaded: dags-folder 
source=airflow.dag_processing.bundles.manager.DagBundlesManager
   [2026-03-16 15:16:36] INFO - Filling up the DagBag from 
/opt/airflow/dags/repo/airflow/src/dags/####/build_dag_fail.py 
source=airflow.models.dagbag.BundleDagBag
   [2026-03-16 15:16:36] DEBUG - Importing 
/opt/airflow/dags/repo/airflow/src/dags/####/build_dag_fail.py 
source=airflow.models.dagbag.BundleDagBag
   [2026-03-16 15:16:36] WARNING - The 
`airflow.operators.python.PythonOperator` attribute is deprecated. Please use 
`'airflow.providers.standard.operators.python.PythonOperator'`. 
source=py.warnings
   [2026-03-16 15:16:36] DEBUG - Loaded DAG <DAG: failing_dag_airflow_2> 
source=airflow.models.dagbag.BundleDagBag
   [2026-03-16 15:16:36] DEBUG - Dag file parsed source=task
   [2026-03-16 15:16:36] DEBUG - Plugins are already loaded. Skipping. 
source=airflow.plugins_manager
   [2026-03-16 15:16:36] DEBUG - Integrate Macros plugins 
source=airflow.plugins_manager
   [2026-03-16 15:16:36] DEBUG - Calling 'on_task_instance_running' with 
{'previous_state': <TaskInstanceState.QUEUED: 'queued'>, 'task_instance': 
RuntimeTaskInstance(id=UUID('019cf700-fc40-7da0-985a-73cb8ef37137'), 
task_id='log_and_raise', dag_id='failing_dag_airflow_2', 
run_id='manual__2026-03-16T14:16:02.853480+00:00', try_number=1, 
dag_version_id=UUID('019cf6c4-5535-788d-9bc9-8d77c72f7e90'), map_index=-1, 
hostname='failing-dag-airflow-2-log-and-raise-atvt8s2a', context_carrier={}, 
task=<Task(PythonOperator): log_and_raise>, 
bundle_instance=LocalDagBundle(name=dags-folder), max_tries=0, 
start_date=datetime.datetime(2026, 3, 16, 14, 16, 34, 808436, 
tzinfo=datetime.timezone.utc), end_date=None, state=<TaskInstanceState.RUNNING: 
'running'>, is_mapped=False, rendered_map_index=None)} 
source=airflow.listeners.listener
   [2026-03-16 15:16:36] DEBUG - Hook impls: [] 
source=airflow.listeners.listener
   [2026-03-16 15:16:36] DEBUG - Result from 'on_task_instance_running': [] 
source=airflow.listeners.listener
   [2026-03-16 15:16:36] DEBUG - this is a debugging statement. source=root
   [2026-03-16 15:16:36] INFO - this is an info statement. source=root
   [2026-03-16 15:16:36] WARNING - this is a warning statement. source=root
   [2026-03-16 15:16:36] ERROR - this is an error statement. source=root
   [2026-03-16 15:16:36] ERROR - Task failed with exception source=task
   [2026-03-16 15:16:36] DEBUG - Running finalizers source=task
   [2026-03-16 15:16:36] DEBUG - Calling 'on_task_instance_failed' with 
{'previous_state': <TaskInstanceState.RUNNING: 'running'>, 'task_instance': 
RuntimeTaskInstance(id=UUID('019cf700-fc40-7da0-985a-73cb8ef37137'), 
task_id='log_and_raise', dag_id='failing_dag_airflow_2', 
run_id='manual__2026-03-16T14:16:02.853480+00:00', try_number=1, 
dag_version_id=UUID('019cf6c4-5535-788d-9bc9-8d77c72f7e90'), map_index=-1, 
hostname='failing-dag-airflow-2-log-and-raise-atvt8s2a', context_carrier={}, 
task=<Task(PythonOperator): log_and_raise>, 
bundle_instance=LocalDagBundle(name=dags-folder), max_tries=0, 
start_date=datetime.datetime(2026, 3, 16, 14, 16, 34, 808436, 
tzinfo=datetime.timezone.utc), end_date=datetime.datetime(2026, 3, 16, 14, 16, 
36, 206566, tzinfo=datetime.timezone.utc), state=<TaskInstanceState.FAILED: 
'failed'>, is_mapped=False, rendered_map_index=None), 'error': 
RuntimeError('Woopsie. Something went wrong. This is an exception statement.')} 
source=airflow.listeners.listener
   [2026-03-16 15:16:36] DEBUG - Hook impls: [] 
source=airflow.listeners.listener
   [2026-03-16 15:16:36] DEBUG - Result from 'on_task_instance_failed': [] 
source=airflow.listeners.listener
   [2026-03-16 15:16:36] DEBUG - Calling 'before_stopping' with {'component': 
<airflow.sdk.execution_time.task_runner.TaskRunnerMarker object at 
0x77559201c4d0>} source=airflow.listeners.listener
   [2026-03-16 15:16:36] DEBUG - Hook impls: [] 
source=airflow.listeners.listener
   [2026-03-16 15:16:36] DEBUG - Result from 'before_stopping': [] 
source=airflow.listeners.listener
   ```
   
   ### How to reproduce
   
   - Deploy two Airflow instances, one at version 2.10.5 and the other at version 3.1.8, using the apache-airflow-providers-elasticsearch configuration given above.
   - Run a task that throws an exception (see the DAG examples below).
   - Compare the two console outputs.
   
   
   DAG example for Airflow 2:
   
   ```
   from airflow import DAG
   import pendulum
   from airflow.operators.python import PythonOperator
   import logging
   
   
   def log_and_raise(**context):
       logging.debug("this is a debugging statement.")
       logging.info("this is an info statement.")
       logging.warning("this is a warning statement.")
       logging.error("this is an error statement.")
   
       raise RuntimeError("Woopsie. Something went wrong. This is an exception 
statement.")
   
   
   dag = DAG(
       "failing_dag_airflow_2",
       start_date=pendulum.datetime(2026, 3, 12, 11, 30, tz="Europe/Paris"),
       is_paused_upon_creation=True,
       catchup=False,
       tags=["FAIL"],
       schedule=None,
   )
   
   with dag:
       op = PythonOperator(
           task_id="log_and_raise",
           python_callable=log_and_raise,
       )
   ```
   
   
   DAG example for Airflow 3:
   
   ```
   from airflow import DAG
   import pendulum
   from airflow.operators.python import PythonOperator
   import logging
   
   
   def log_and_raise(**context):
       logging.debug("this is a debugging statement.")
       logging.info("this is an info statement.")
       logging.warning("this is a warning statement.")
       logging.error("this is an error statement.")
   
       raise RuntimeError("Woopsie. Something went wrong. This is an exception 
statement.")
   
   
   dag = DAG(
       "failing_dag_airflow_3",
       start_date=pendulum.datetime(2026, 3, 12, 11, 30, tz="Europe/Paris"),
       is_paused_upon_creation=True,
       catchup=False,
       tags=["FAIL"],
       schedule=None,
   )
   
   with dag:
       op = PythonOperator(
           task_id="log_and_raise",
           python_callable=log_and_raise,
       )
   
   ```
   
   
   ### Anything else
   
   _No response_
   
   ### Are you willing to submit PR?
   
   - [ ] Yes I am willing to submit a PR!
   
   ### Code of Conduct
   
   - [x] I agree to follow this project's [Code of 
Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
   

