Hey Frank,

Just as an FYI, you shouldn't have to include the time regardless of whether you're backfilling a dag scheduled through a cron expression - I backfill dags all the time using just start/end dates, and those dags are scheduled through cron expressions. I think it may have to do with your dynamic start date (days_ago(1)) and the date range the backfill was looking for.
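For what it's worth, here's a rough, untested sketch of what I mean, reusing the structure from your snippet - the only real change is pinning start_date to a fixed datetime instead of days_ago(1), so the scheduler and the backfill agree on which 01:30 intervals exist (DAG_NAME here is just a placeholder):

from datetime import datetime, timedelta

from airflow import DAG

DAG_NAME = 'myjob'  # placeholder dag_id for the sketch

args = {
    'owner': 'airflow',
    'depends_on_past': False,
    'provide_context': True,
    'start_date': datetime(2019, 7, 1),  # static start date instead of days_ago(1)
    # 'on_failure_callback': slack_failure_callback,  # your callback here
}

dag = DAG(
    DAG_NAME,
    default_args=args,
    dagrun_timeout=timedelta(hours=2),
    schedule_interval="30 1 * * *",
)

With a static start_date, a date-only backfill like

airflow backfill -s 2019-07-10 -e 2019-07-11 myjob

should pick up the 01:30 runs in that window without the time in -s. I can't say for certain that this is what's causing your "No run dates were found" message, but pinning the start date generally makes backfills behave more predictably.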
On Thu, Jul 11, 2019 at 1:07 PM Frank Maritato <fmarit...@opentable.com.invalid> wrote:

> Ah never mind, I figured it out. The backfill command has to include the
> time. In this case,
>
> airflow backfill -s '2019-07-10T01:30:00' myjob
>
> On Thu, Jul 11, 2019 at 12:07 PM Frank Maritato <fmarit...@opentable.com>
> wrote:
>
> > Hi All,
> >
> > I have a dag with a schedule_interval that is a cron entry:
> >
> > args = {
> >     'owner': 'airflow',
> >     'depends_on_past': False,
> >     'provide_context': True,
> >     'start_date': airflow.utils.dates.days_ago(1),
> >     'on_failure_callback': slack_failure_callback,
> > }
> >
> > dag = DAG(
> >     DAG_NAME,
> >     default_args=args,
> >     dagrun_timeout=timedelta(hours=2),
> >     schedule_interval="30 1 * * *",
> > )
> >
> > and when I try to run
> >
> > airflow backfill -s '2019-07-10' myjob
> >
> > I get the following message:
> >
> > [2019-07-11 12:00:28,213] {jobs.py:2447} INFO - No run dates were found
> > for the given dates and dag interval.
> >
> > If my job's schedule_interval is @daily or something like that, I'm able
> > to run the backfills as I expect. Is there a way to do this or am I going
> > about this wrong?
> >
> > --
> > Frank Maritato
>
> --
> Frank Maritato

--
Austin Weaver
Software Engineer
FLYR, Inc.
www.flyrlabs.com