vatsrahul1001 opened a new issue, #48993:
URL: https://github.com/apache/airflow/issues/48993

   ### Apache Airflow version
   
   3.0.0
   
   ### If "Other Airflow 2 version" selected, which one?
   
   _No response_
   
   ### What happened?
   
   When asset listeners are used, their output does not show up in the task logs.
   
   
   
   
   ### What you think should happen instead?
   
   _No response_
   
   ### How to reproduce
   
   1. Create a file named `listener_code.py` and place it in the `/files/dags` folder
   
   ```
   from airflow.sdk import Asset
   from airflow.listeners import hookimpl


   @hookimpl
   def on_asset_changed(asset: Asset):
       """Execute when an asset change is registered."""
       print("I am always listening for any Dataset changes and I heard that!")
       print("Done!")


   @hookimpl
   def on_asset_created(asset: Asset):
       """Execute when an asset is created."""
       print("I am always listening for any Dataset created")
       print("dataset created")


   @hookimpl
   def on_task_instance_success(previous_state, task_instance):
       """Execute when task state changes to SUCCESS. previous_state can be None."""
       print("I am always listening for any TaskInstance to succeed and I heard that!")
       print("Now I fail on purpose...")


   @hookimpl
   def on_task_instance_running(previous_state, task_instance):
       """Execute when task state changes to RUNNING. previous_state can be None."""
       print("task running listener")
       print("testing")
   ```
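
   As a possible workaround sketch (an assumption, not the confirmed fix): routing listener output through the `logging` module instead of `print()` gives Airflow's log handlers a chance to pick it up. The `@hookimpl` decorator from `airflow.listeners` is omitted here so the sketch stays self-contained; the logger name is illustrative.

   ```python
   # Hypothetical variant of the listener above that uses logging instead of
   # print(). Whether Airflow routes these records into the task log depends
   # on its handler configuration; this only shows the logging-based shape.
   import logging

   log = logging.getLogger("listener_code")


   def on_asset_changed(asset):
       """Execute when an asset change is registered."""
       log.info("Asset changed: %s", asset)


   def on_asset_created(asset):
       """Execute when an asset is created."""
       log.info("Asset created: %s", asset)
   ```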
   
   2. Create a file named `my_listener_plugin.py` and place it in the `/files/plugin` folder
   
   ```
   from airflow.plugins_manager import AirflowPlugin
   from plugins import listener_code

   class MyListenerPlugin(AirflowPlugin):
       name = "my_listener_plugin"
       listeners = [listener_code]
   ```
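
   For context, Airflow listeners are pluggy hookimpls under the hood. A minimal standalone sketch of that mechanism, with no Airflow dependency (the `example` project name and `AssetSpec` hookspec class are illustrative, not Airflow's real hookspecs):

   ```python
   # Sketch of the pluggy hookspec/hookimpl mechanism that Airflow listeners
   # build on. Names here are hypothetical stand-ins for Airflow's internals.
   import pluggy

   hookspec = pluggy.HookspecMarker("example")
   hookimpl = pluggy.HookimplMarker("example")


   class AssetSpec:
       @hookspec
       def on_asset_changed(self, asset):
           """Called when an asset changes."""


   class MyListener:
       @hookimpl
       def on_asset_changed(self, asset):
           return f"heard change for {asset}"


   pm = pluggy.PluginManager("example")
   pm.add_hookspecs(AssetSpec)
   pm.register(MyListener())

   # Every registered hookimpl is invoked; non-None results are collected.
   results = pm.hook.on_asset_changed(asset="file://include/bears")
   ```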
   
   3. Put the DAG below in the `files` folder
   
   ```
   from airflow.sdk import Asset
   from airflow.decorators import dag, task

   from pendulum import datetime

   URI = "file://include/bears"
   MY_DATASET = Asset(URI)


   @dag(
       start_date=datetime(2023, 12, 1),
       schedule=None,
       catchup=False,
       tags=["Listeners"],
   )
   def listener_test():
       @task(
           outlets=[MY_DATASET],
       )
       def get_bear():
           print("hi")

       get_bear()


   listener_test()
   ```
   
   4. Run `breeze start-airflow --executor CeleryExecutor --backend postgres --dev-mode`
   5. Execute the DAG `listener_test`
   6. Verify that the print statement `"I am always listening for any Dataset changes and I heard that!"` is not visible in the task logs. It is only visible in the API server logs, alongside a `PydanticSerializationUnexpectedValue` warning:
   
   ```
   I am always listening for any Dataset changes and I heard that!
   Done!
   [2025-04-09T07:15:39.180+0000] {task_instances.py:398} INFO - TI 01961967-87af-7ed4-923a-63bd7cf04494 state updated to success: 1 row(s) affected
         INFO   127.0.0.1:54020 - "PATCH /execution/task-instances/01961967-87af… HTTP/1.1" 204
   [2025-04-09T07:15:46.866+0000] {manager.py:120} INFO - DAG bundles loaded: dags-folder
         INFO   192.168.97.1:57408 - "GET /ui/grid/listener_test?limit=10&order_b… HTTP/1.1" 200
   [2025-04-09T07:15:46.870+0000] {manager.py:120} INFO - DAG bundles loaded: dags-folder
         INFO   192.168.97.1:57390 - "GET /ui/dags/recent_dag_runs?dag_ids=listen… HTTP/1.1" 200
         INFO   192.168.97.1:57392 - "GET /api/v2/dags/listener_test/dagRuns?limi… HTTP/1.1" 200
   [2025-04-09T07:15:48.108+0000] {manager.py:120} INFO - DAG bundles loaded: dags-folder
         INFO   192.168.97.1:57390 - "GET /api/v2/dags/listener_test/tasks/get_be… HTTP/1.1" 200
   [2025-04-09T07:15:48.112+0000] {manager.py:120} INFO - DAG bundles loaded: dags-folder
         INFO   192.168.97.1:57408 - "GET /api/v2/dags/listener_test/dagRuns/manu… HTTP/1.1" 200
         INFO   192.168.97.1:57392 - "GET /api/v2/dags/listener_test/dagRuns/manu… HTTP/1.1" 200
   /usr/local/lib/python3.9/site-packages/pydantic/type_adapter.py:572 UserWarning: Pydantic serializer warnings:
     PydanticSerializationUnexpectedValue(Expected `StructuredLogMessage` - serialized value may not be as expected [input_value=StructuredLogMessage(time...et_bear/attempt=1.log']), input_type=StructuredLogMessage])
     PydanticSerializationUnexpectedValue(Expected `str` - serialized value may not be as expected [input_value=StructuredLogMessage(time...et_bear/attempt=1.log']), input_type=StructuredLogMessage])
         INFO   192.168.97.1:57392 - "GET /api/v2/dags/listener_test/dagRuns/manu… HTTP/1.1" 200
   ```
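
   For what it's worth, `PydanticSerializationUnexpectedValue` is the generic Pydantic v2 warning emitted whenever a serializer receives a type it did not expect. A minimal sketch of that behavior, independent of Airflow's actual `StructuredLogMessage` schema (the `Msg` model below is hypothetical):

   ```python
   # Minimal reproduction of the generic Pydantic v2 serializer warning seen
   # in the API server log. `Msg` is an illustrative model, not Airflow's
   # StructuredLogMessage.
   import warnings

   from pydantic import BaseModel, TypeAdapter


   class Msg(BaseModel):
       text: str


   adapter = TypeAdapter(list[str])  # serializer expects plain strings

   with warnings.catch_warnings(record=True) as caught:
       warnings.simplefilter("always")
       # Passing a model instance where `str` is expected triggers the
       # "serialized value may not be as expected" UserWarning.
       adapter.dump_python([Msg(text="hi")])
   ```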
   
   
   ### Operating System
   
   Linux
   
   ### Versions of Apache Airflow Providers
   
   _No response_
   
   ### Deployment
   
   Official Apache Airflow Helm Chart
   
   ### Deployment details
   
   _No response_
   
   ### Anything else?
   
   _No response_
   
   ### Are you willing to submit PR?
   
   - [ ] Yes I am willing to submit a PR!
   
   ### Code of Conduct
   
   - [x] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
   

