lifnaja opened a new issue, #60579:
URL: https://github.com/apache/airflow/issues/60579

   ### Apache Airflow version
   
   3.1.6
   
   ### If "Other Airflow 3 version" selected, which one?
   
   _No response_
   
   ### What happened?
   
   The documentation for Airflow 3.1.6 states that users can access `source_dag_run` through the `triggering_asset_events` template variable. However, this attribute is missing from the actual 3.1.6 release tag, so using it in a template raises an error.
   
   Documentation: https://airflow.apache.org/docs/apache-airflow/3.1.6/authoring-and-scheduling/asset-scheduling.html#accessing-triggering-asset-events-with-jinja
   
   
   
   - In the [v3.1.6 tag](https://github.com/apache/airflow/blob/3.1.6/task-sdk/src/airflow/sdk/execution_time/comms.py#L325), the `source_dag_run` attribute is not present in the codebase.
   - In the [main branch](https://github.com/apache/airflow/blob/main/task-sdk/src/airflow/sdk/execution_time/comms.py#L347), the code for `source_dag_run` exists (a quick runtime check is sketched below).
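   
   A possible way to confirm which version of the module is installed (a rough diagnostic sketch; the module path is taken from the links above):
   
   ```python
   import inspect
   
   # Check whether the installed task SDK's comms module mentions
   # source_dag_run anywhere in its source; on a 3.1.6 install this is
   # expected to print False, on a main-branch install True.
   from airflow.sdk.execution_time import comms
   
   print("source_dag_run" in inspect.getsource(comms))
   ```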
   
   Attempting to use this in a DAG on 3.1.6 results in an AttributeError / undefined-attribute error during Jinja template rendering, as shown in the screenshot and the sketch below.
   
   <img width="965" height="39" alt="Image" src="https://github.com/user-attachments/assets/bcf32b1b-9144-4321-80b0-1c93a38db87c" />
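   
   For comparison, a minimal sketch of the difference (an assumption based on the links above: only the `source_dag_run` attribute is affected, and the event object itself still renders):
   
   ```python
   # Presumably renders on 3.1.6: the asset event object itself is exposed
   # through the triggering_asset_events template variable.
   works = "{{ triggering_asset_events.values() | first | first }}"
   
   # Fails on 3.1.6: source_dag_run only exists in the main-branch comms.py,
   # so rendering raises the error shown above.
   fails = "{{ (triggering_asset_events.values() | first | first).source_dag_run.data_interval_end }}"
   ```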
   
   ### What you think should happen instead?
   
   _No response_
   
   ### How to reproduce
   
   ```python
   import logging
   
   import pendulum
   from airflow.sdk import DAG, Asset, task
   
   
   def _say_hello():
       print("Hello!")
   
   
   asset_1 = Asset("asset_1")
   asset_2 = Asset("asset_2")
   
   with DAG(
       dag_id="example_asset_producer_1",
       schedule=None,
       start_date=pendulum.datetime(2024, 1, 1),
       catchup=False,
       tags=["example", "asset"],
   ) as dag:
   
       @task(outlets=[asset_1])
       def say_hello():
           _say_hello()
   
       say_hello()
   
   
   with DAG(
       dag_id="example_asset_producer_2",
       schedule=None,
       start_date=pendulum.datetime(2024, 1, 1),
       catchup=False,
       tags=["example", "asset"],
   ) as dag:
   
       @task(outlets=[asset_2])
       def say_hello():
           _say_hello()
   
       say_hello()
   
   
   with DAG(
       dag_id="example_asset_consumer",
       schedule=[asset_1, asset_2],
       start_date=pendulum.datetime(2024, 1, 1),
       catchup=False,
       tags=["example", "asset"],
   ) as dag:
   
       @task()
       def get_asset_detail(x):
           logging.info(x)
           logging.info(type(x))
           logging.info(x.__dict__)
           logging.info(x.name)
   
       # Documented access pattern; on 3.1.6 this fails during template
       # rendering because the event object has no source_dag_run attribute.
       get_asset_detail(x='{{ (triggering_asset_events.values() | first | first).source_dag_run.data_interval_end }}')
   ```
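   
   A possible way to see what the installed version actually exposes on the event objects (a rough sketch, assuming `triggering_asset_events` is also available through the task context via `get_current_context`, not only in Jinja):
   
   ```python
   import logging
   
   from airflow.sdk import get_current_context, task
   
   
   @task
   def inspect_triggering_events():
       # Log whatever attributes the installed task SDK puts on each asset
       # event; on 3.1.6 the output should show that source_dag_run is absent.
       context = get_current_context()
       for asset, events in context["triggering_asset_events"].items():
           for event in events:
               logging.info("%s -> %s", asset, vars(event))
   ```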
   
   ### Operating System
   
   docker
   
   ### Versions of Apache Airflow Providers
   
   _No response_
   
   ### Deployment
   
   Docker-Compose
   
   ### Deployment details
   
   _No response_
   
   ### Anything else?
   
   _No response_
   
   ### Are you willing to submit PR?
   
   - [ ] Yes I am willing to submit a PR!
   
   ### Code of Conduct
   
   - [x] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
   

