jablecherman opened a new issue, #61558:
URL: https://github.com/apache/airflow/issues/61558

   ### Apache Airflow version
   
   Other Airflow 3 version (please specify below)
   
   ### If "Other Airflow 3 version" selected, which one?
   
   3.1.2
   
   ### What happened?
   
   `Cannot expand <SerializedMappedTask(PythonOperator): process_item> for run 
manual__2026-02-06T19:06:43.772890+00:00; missing upstream values: ['op_args']`
   
   This occurs whenever you run `dag.test()` with a `mark_success_pattern` that matches the task upstream of the dynamically mapped operator. Because the upstream task is marked success without actually running, the mapped task has no input values to expand over, and it predictably fails.
   
   What could be considered a bug is that the same error still occurs when the `mark_success_pattern` ALSO matches the dynamic task itself.
   
   ### What you think should happen instead?
   
   I would prefer it if the test function were smart enough to check for tasks it should auto-succeed, **including dynamic tasks**, before checking for "missing upstream values."
   
   One difficulty is that a task instance still needs to be generated, but it should be simple enough to generate a single mapped task instance and mark it successful.
   
   (The more I think about this, the more I think it is a feature request and 
not a bug, but hey 🤷 ).
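   
   To make the request concrete, here is a rough, hypothetical sketch of the ordering I have in mind. `should_auto_succeed` is an illustrative helper, not Airflow's actual internals, and full-match semantics for the pattern are an assumption on my part:
   
   ```python
   import re
   
   def should_auto_succeed(task_id: str, mark_success_pattern: str | None) -> bool:
       """Illustrative helper: does this task match the mark-success pattern?"""
       # Assumption: full-match semantics, mirroring the pattern used in the repro below.
       return bool(mark_success_pattern and re.fullmatch(mark_success_pattern, task_id))
   
   # Desired ordering inside dag.test(), sketched as comments:
   #   if should_auto_succeed(ti.task_id, mark_success_pattern):
   #       generate a single mapped task instance and mark it SUCCESS
   #   else:
   #       expand the mapped task as usual (which is where the current error is raised)
   ```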
   
   
   ### How to reproduce
   
   ```python
   from airflow import DAG
   from airflow.providers.standard.operators.python import PythonOperator
   
   def generate_list():
       return [["item1"], ["item2"], ["item3"]]
   
   def process_item(item):
       print(f"Processing {item}")
   
   with DAG(
       dag_id="test_dynamic_mapping_dag",
       default_args={"owner": "airflow"},
       schedule=None,
   ) as dag:
   
       generate_task = PythonOperator(
           task_id="generate_list",
           python_callable=generate_list,
       )
   
       process_task = PythonOperator.partial(
           task_id="process_item",
           python_callable=process_item,
       ).expand(op_args=generate_task.output)
   
   if __name__ == "__main__":
       # running the below will understandably fail
       # dag.test(mark_success_pattern="generate_list")
   
       # running the below will fail, but arguably should succeed
       dag.test(mark_success_pattern="generate_list|process_item")
   
   ```
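   
   As an aside (an untested sketch on my part, reusing the `dag` from the reproduction above): if it is acceptable for the upstream task to actually run during the test, matching only the dynamic task sidesteps the expansion problem, since real `op_args` values then exist. I have not verified how the pattern is applied to the individual expanded instances.
   
   ```python
   if __name__ == "__main__":
       # Untested sketch (assumption): let generate_list run for real so expand()
       # has XCom values to map over, and only mark process_item as success.
       dag.test(mark_success_pattern="process_item")
   ```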
   
   ### Operating System
   
   Debian GNU/Linux
   
   ### Versions of Apache Airflow Providers
   
   _No response_
   
   ### Deployment
   
   Official Apache Airflow Helm Chart
   
   ### Deployment details
   
   _No response_
   
   ### Anything else?
   
   _No response_
   
   ### Are you willing to submit PR?
   
   - [ ] Yes I am willing to submit a PR!
   
   ### Code of Conduct
   
   - [x] I agree to follow this project's [Code of 
Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
   

