MaksYermak opened a new issue #22360:
URL: https://github.com/apache/airflow/issues/22360


   ### Apache Airflow version
   
   main (development)
   
   ### What happened
   
   If you pass an object in an operator's arguments and put a Jinja template inside one of that object's properties, the template is never rendered.
   Code example:
   ```python
   create_batch = DataprocCreateBatchOperator(
           task_id="create_batch",
           project_id=PROJECT_ID,
           region=REGION,
           batch=Batch(
               pyspark_batch=PySparkBatch(
                   main_python_file_uri="main_python_file_uri",
                python_file_uris=["files"],
                   args=[
                       "--start-date",
                       "{{ data_interval_start }}", # it doesn't work
                       "--end-date",
                       "{{ ds }}", # it doesn't work
                       "--table",
                       "table_name",
                       "--source-bucket",
                       "source_bucket",
                       "--source-prefix",
                       "source_prefix",
                       "--output-path",
                       "path",
                   ],
               ),
           ),
        batch_id=f"pyspark-table-{uuid4()}",
       )
   ```
   Result:
   
![image](https://user-images.githubusercontent.com/12415200/158992837-46726d08-1bd5-4bf1-a089-441954b183cc.png)
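   For context, here is a minimal sketch (assumed behavior, not Airflow's actual implementation) of why the nested template is skipped: the field renderer recurses into strings, lists, and dicts, but returns any other object type, such as a proto-plus `Batch`, unchanged, so strings buried inside it are never templated.

```python
# Minimal sketch (NOT Airflow's actual code) of recursive template rendering
# that stops at unknown object types.

def render(value, context):
    if isinstance(value, str):
        # Stand-in for Jinja rendering: naive "{{ key }}" substitution.
        for key, val in context.items():
            value = value.replace("{{ %s }}" % key, str(val))
        return value
    if isinstance(value, list):
        return [render(v, context) for v in value]
    if isinstance(value, dict):
        return {k: render(v, context) for k, v in value.items()}
    # Any other object is returned untouched, so strings nested inside it
    # (e.g. PySparkBatch.args) are never templated.
    return value

class FakeBatch:
    """Hypothetical stand-in for google.cloud.dataproc_v1.Batch."""
    def __init__(self, args):
        self.args = args

context = {"ds": "2022-03-18"}
print(render({"args": ["--end-date", "{{ ds }}"]}, context))        # rendered
print(render(FakeBatch(["--end-date", "{{ ds }}"]), context).args)  # not rendered
```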
   
   
   ### What you think should happen instead
   
   _No response_
   
   ### How to reproduce
   
   To reproduce, run this code:
   
   ```python
   import os
   from datetime import datetime
   from uuid import uuid4
   
    from airflow import models
    from airflow.providers.google.cloud.operators.dataproc import (
        DataprocCreateBatchOperator,
    )
    from google.cloud.dataproc_v1 import Batch, PySparkBatch
   
   PROJECT_ID = os.environ.get("GCP_PROJECT_ID", "an-id")
   REGION = os.environ.get("GCP_LOCATION", "europe-west1")
   
   with models.DAG(
       "example_gcp_batch_dataproc",
       schedule_interval='@once',
       start_date=datetime(2021, 1, 1),
       catchup=False,
   ) as dag_batch:
       create_batch = DataprocCreateBatchOperator(
           task_id="create_batch",
           project_id=PROJECT_ID,
           region=REGION,
           batch=Batch(
               pyspark_batch=PySparkBatch(
                   main_python_file_uri="main_python_file_uri",
                python_file_uris=["files"],
                   args=[
                       "--start-date",
                       "{{ data_interval_start }}", # it doesn't work
                       "--end-date",
                       "{{ ds }}", # it doesn't work
                       "--table",
                       "table_name",
                       "--source-bucket",
                       "source_bucket",
                       "--source-prefix",
                       "source_prefix",
                       "--output-path",
                       "path",
                   ],
               ),
           ),
            batch_id=f"pyspark-table-{uuid4()}",
       )
   
   ```
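   A possible workaround, assuming the operator accepts a plain dict for `batch` (the provider's type hint is `Union[Dict, Batch]`) and that `batch` is listed in its `template_fields`: express the batch as nested dicts and lists, which the template renderer does recurse into. The field names below mirror the `Batch` proto schema and are illustrative, not verified:

```python
from uuid import uuid4

# Workaround sketch: plain dicts/lists instead of Batch/PySparkBatch proto
# objects, so the nested Jinja strings are reachable by the renderer.
batch = {
    "pyspark_batch": {
        "main_python_file_uri": "main_python_file_uri",
        "python_file_uris": ["files"],
        "args": ["--end-date", "{{ ds }}"],
    },
}
batch_id = f"pyspark-table-{uuid4()}"

# Passed to the operator in place of the Batch(...) object:
# create_batch = DataprocCreateBatchOperator(
#     task_id="create_batch",
#     project_id=PROJECT_ID,
#     region=REGION,
#     batch=batch,
#     batch_id=batch_id,
# )
```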
   
   ### Operating System
   
   Debian GNU/Linux 11 (bullseye)
   
   ### Versions of Apache Airflow Providers
   
   _No response_
   
   ### Deployment
   
   Other Docker-based deployment
   
   ### Deployment details
   
   _No response_
   
   ### Anything else
   
   _No response_
   
   ### Are you willing to submit PR?
   
   - [ ] Yes I am willing to submit a PR!
   
   ### Code of Conduct
   
   - [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
   

