vatsrahul1001 opened a new issue, #47767:
URL: https://github.com/apache/airflow/issues/47767

   ### Apache Airflow version
   
   main (development)
   
   ### If "Other Airflow 2 version" selected, which one?
   
   _No response_
   
   ### What happened?
   
Extra info is not stored in the `asset_event` table when it is set from a 
classic operator. This works fine with Airflow 2.10.5.
   
   <img width="1211" alt="Image" 
src="https://github.com/user-attachments/assets/c0e097a0-c560-4394-b2de-2dcbe8156954"
 />
   
   ### What you think should happen instead?
   
   The extra info should also be stored in the `asset_event` table when it is 
set from a classic operator.
   
   
   
   ### How to reproduce
   
   1. Below are three DAGs (`dataset_with_extra_by_yield`, 
`dataset_with_extra_by_context`, `dataset_with_extra_from_classic_operator`).
   2. Run all three and check the `extra` column of the `asset_event` table: 
it is empty for `dataset_with_extra_from_classic_operator`, while for 
`dataset_with_extra_by_yield` and `dataset_with_extra_by_context` the extra 
is stored.
   
   
   
   
    ```python
   from __future__ import annotations
   
   import datetime
   
   from airflow.datasets import Dataset
   from airflow.sdk.definitions.asset.metadata import Metadata
   from airflow.decorators import task
   from airflow.models.dag import DAG
   from airflow.providers.standard.operators.bash import BashOperator
   
   ds = Dataset("s3://output/1.txt")
   
   with DAG(
       dag_id="dataset_with_extra_by_yield",
       catchup=False,
       start_date=datetime.datetime.min,
       schedule="@daily",
       tags=["datasets"],
   ):
   
       @task(outlets=[ds])
       def dataset_with_extra_by_yield():
           yield Metadata(ds, {"hi": "bye"})
   
       dataset_with_extra_by_yield()
   
   with DAG(
       dag_id="dataset_with_extra_by_context",
       catchup=False,
       start_date=datetime.datetime.min,
       schedule="@daily",
       tags=["dataset"],
   ):
   
       @task(outlets=[ds])
       def dataset_with_extra_by_context(*, outlet_events=None):
           outlet_events[ds].extra = {"hi": "bye"}
   
       dataset_with_extra_by_context()
   
   with DAG(
       dag_id="dataset_with_extra_from_classic_operator",
       catchup=False,
       start_date=datetime.datetime.min,
       schedule="@daily",
       tags=["dataset"],
   ):
   
        def _dataset_with_extra_from_classic_operator_post_execute(context, result):
           context["outlet_events"][ds].extra = {"hi": "bye"}
   
       BashOperator(
           task_id="dataset_with_extra_from_classic_operator",
           outlets=[ds],
           bash_command=":",
           post_execute=_dataset_with_extra_from_classic_operator_post_execute,
       )
   
   ```
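For reference, the expected semantics can be sketched without Airflow at all. The accessor classes below are illustrative stand-ins (not Airflow's actual `OutletEventAccessor` implementation): assigning `.extra` on `context["outlet_events"][ds]` inside a classic operator's `post_execute` hook should leave the extra available for persistence into `asset_event.extra`, exactly as it does for the two taskflow variants.

```python
from collections import defaultdict


class OutletEventAccessor:
    """Illustrative stand-in: holds the extra dict for one outlet asset."""

    def __init__(self):
        self.extra = {}


class OutletEventAccessors:
    """Maps an asset key to its accessor, like context["outlet_events"]."""

    def __init__(self):
        self._accessors = defaultdict(OutletEventAccessor)

    def __getitem__(self, asset):
        return self._accessors[asset]

    def items(self):
        return self._accessors.items()


# Simulate a classic operator's post_execute hook setting an extra:
context = {"outlet_events": OutletEventAccessors()}
context["outlet_events"]["s3://output/1.txt"].extra = {"hi": "bye"}

# Whatever collects outlet events after task execution should see the extra
# and write it to asset_event.extra:
for asset, accessor in context["outlet_events"].items():
    print(asset, accessor.extra)
```

In the failing case, the extra assigned in `post_execute` never reaches the row written to `asset_event`, even though the event itself is created.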
   
   ### Operating System
   
   linux
   
   ### Versions of Apache Airflow Providers
   
   _No response_
   
   ### Deployment
   
   Other
   
   ### Deployment details
   
   _No response_
   
   ### Anything else?
   
   _No response_
   
   ### Are you willing to submit PR?
   
   - [ ] Yes I am willing to submit a PR!
   
   ### Code of Conduct
   
   - [x] I agree to follow this project's [Code of 
Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
   

