[ https://issues.apache.org/jira/browse/AIRFLOW-1156?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16430007#comment-16430007 ]

John Cheng edited comment on AIRFLOW-1156 at 4/9/18 4:04 AM:
-------------------------------------------------------------

Same problem with version 1.9:
{code:python}
#!/usr/bin/python3

from airflow.models import DAG
from airflow.operators.dummy_operator import DummyOperator
from datetime import datetime, timedelta

args = {
    'owner': 'airflow',
    'depends_on_past': False,
    'email_on_failure': False,
    'email_on_retry': False,
}

dag = DAG(
    dag_id='dummy',
    schedule_interval='*/5 * * * *',
    start_date=datetime(2018, 3, 16, 5, 1),
    default_args=args,
    catchup=False,
    dagrun_timeout=timedelta(minutes=2))

dummy = DummyOperator(
    task_id='success_exit',
    dag=dag,
)
{code}
Run the following command at 2018-04-09 02:57:
{code:bash}
echo "airflow unpause dummy" | at 03:00
{code}
Two DAG runs are triggered:
||Execution Date||Operator||Start Date||
|04-09T02:55:00|DummyOperator|04-09T03:00:14.232328|
|04-09T02:50:00|DummyOperator|04-09T03:00:08.060602|
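
To double-check which runs the scheduler actually created, here is a minimal sketch (not part of the original report; it assumes the script runs on the Airflow 1.9 host with access to the usual metadata database) that lists the DAG runs for the dummy DAG:
{code:python}
# Sketch only: list the DagRun rows the scheduler created for dag_id 'dummy'.
# Assumes the default metadata DB configuration of this Airflow install.
from airflow import settings
from airflow.models import DagRun

session = settings.Session()
runs = (session.query(DagRun)
        .filter(DagRun.dag_id == 'dummy')
        .order_by(DagRun.execution_date.desc())
        .all())
for run in runs:
    # With catchup=False, only the latest schedule before the unpause would be
    # expected here, not two back-to-back runs as in the table above.
    print(run.execution_date, run.start_date, run.state)
session.close()
{code}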

  
  

 

 


> Using a timedelta object as a Schedule Interval with catchup=False causes the 
> start_date to no longer be honored.
> -----------------------------------------------------------------------------------------------------------------
>
>                 Key: AIRFLOW-1156
>                 URL: https://issues.apache.org/jira/browse/AIRFLOW-1156
>             Project: Apache Airflow
>          Issue Type: Bug
>    Affects Versions: Airflow 1.8
>            Reporter: Zachary Lawson
>            Priority: Minor
>
> Currently, in Airflow v1.8, if you set your schedule_interval to a timedelta 
> object and set catchup=False, the start_date is no longer honored and the DAG 
> is scheduled immediately upon unpausing the DAG. It is then scheduled on the 
> schedule interval from that point onward. Example below:
> {code}
> from airflow import DAG
> from datetime import datetime, timedelta
> import logging
> from airflow.operators.python_operator import PythonOperator
>
> default_args = {
>     'owner': 'airflow',
>     'depends_on_past': False,
>     'start_date': datetime(2015, 6, 1),
> }
>
> dag = DAG('test', default_args=default_args,
>           schedule_interval=timedelta(seconds=5), catchup=False)
>
> def context_test(ds, **context):
>     logging.info('testing')
>
> test_context = PythonOperator(
>     task_id='test_context',
>     provide_context=True,
>     python_callable=context_test,
>     dag=dag
> )
> {code}
> If you switch the above over to a CRON expression, the scheduling behavior 
> returns to what is expected.
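
For comparison, here is a minimal sketch of the cron-based variant described in the quoted report (not from the original issue; since cron cannot express a 5-second interval, a per-minute expression is used as the closest stand-in):
{code:python}
# Sketch of the cron-based equivalent of the quoted DAG. A 5-second timedelta
# has no cron equivalent, so '* * * * *' (every minute) is assumed instead.
from datetime import datetime
import logging

from airflow import DAG
from airflow.operators.python_operator import PythonOperator

default_args = {
    'owner': 'airflow',
    'depends_on_past': False,
    'start_date': datetime(2015, 6, 1),
}

dag = DAG('test', default_args=default_args,
          schedule_interval='* * * * *',  # cron expression instead of timedelta
          catchup=False)

def context_test(ds, **context):
    logging.info('testing')

test_context = PythonOperator(
    task_id='test_context',
    provide_context=True,
    python_callable=context_test,
    dag=dag,
)
{code}
This is the variant that reportedly honors start_date as expected.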



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
