I'm having trouble tracking down an issue where running a DAG with a
single task via `airflow test dag_id task_id` works fine, but running
`airflow backfill dag_id -s 2017-04-20 -e 2017-04-21` fails.

The error I'm receiving from the backfill command is:
root@8ece999d3084:/etc/airflow# airflow backfill dfp-executive-report -s
2017-05-01 -e 2017-05-01
[2017-05-02 18:59:17,131] {__init__.py:57} INFO - Using executor
CeleryExecutor
[2017-05-02 18:59:17,190] {driver.py:120} INFO - Generating grammar tables
from /usr/lib/python2.7/lib2to3/Grammar.txt
[2017-05-02 18:59:17,210] {driver.py:120} INFO - Generating grammar tables
from /usr/lib/python2.7/lib2to3/PatternGrammar.txt
[2017-05-02 18:59:17,377] {models.py:167} INFO - Filling up the DagBag from
/etc/airflow/dags
[2017-05-02 18:59:17,670] {connectionpool.py:203} INFO - Starting new HTTP
connection (1): 169.254.169.254
[2017-05-02 18:59:17,672] {connectionpool.py:203} INFO - Starting new HTTP
connection (1): 169.254.169.254
[2017-05-02 18:59:18,029] {models.py:1126} INFO - Dependencies all met for
<TaskInstance: dfp-executive-report.dfp_to_s3_etl 2017-05-01 00:00:00
[scheduled]>
[2017-05-02 18:59:18,033] {base_executor.py:50} INFO - Adding to queue:
airflow run dfp-executive-report dfp_to_s3_etl 2017-05-01T00:00:00 --pickle
7 --local
[2017-05-02 18:59:22,832] {celery_executor.py:78} INFO - [celery] queuing
(u'dfp-executive-report', u'dfp_to_s3_etl', datetime.datetime(2017, 5, 1,
0, 0)) through celery, queue=default
[2017-05-02 18:59:23,122] {models.py:4024} INFO - Updating state for
<DagRun dfp-executive-report @ 2017-05-01 00:00:00:
backfill_2017-05-01T00:00:00, externally triggered: False> considering 1
task(s)
[2017-05-02 18:59:23,138] {jobs.py:1978} INFO - [backfill progress] |
finished run 0 of 1 | tasks waiting: 0 | succeeded: 0 | kicked_off: 1 |
failed: 0 | skipped: 0 | deadlocked: 0 | not ready: 0
[2017-05-02 18:59:27,830] {jobs.py:1725} ERROR - Executor reports task
instance <TaskInstance: dfp-executive-report.dfp_to_s3_etl 2017-05-01
00:00:00 [queued]> finished (failed) although the task says its queued. Was
the task killed externally?
[2017-05-02 18:59:27,830] {models.py:1417} ERROR - Executor reports task
instance <TaskInstance: dfp-executive-report.dfp_to_s3_etl 2017-05-01
00:00:00 [queued]> finished (failed) although the task says its queued. Was
the task killed externally?

It seems like the task is being killed while it's still queued. I've also
cleared the DB of any DAG runs and task instances, but I'm still getting the
same error. Any idea why the error occurs on a backfill or a scheduled DAG
run, but not when I do `airflow test`?


It also fails immediately when the task is scheduled via `airflow run`.

Additional info:
Airflow version 1.8
CeleryExecutor
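
For context, the DAG itself is minimal, roughly along these lines (a sketch only; the `dfp_to_s3_etl` callable and the dates are placeholders for the real ETL code):

```python
# dfp-executive-report DAG -- minimal sketch; the real dfp_to_s3_etl
# logic and start_date are placeholders, not the actual code.
from datetime import datetime

from airflow import DAG
from airflow.operators.python_operator import PythonOperator


def dfp_to_s3_etl(**context):
    # placeholder for the real DFP -> S3 extract/load logic
    pass


dag = DAG(
    dag_id='dfp-executive-report',
    start_date=datetime(2017, 4, 20),
    schedule_interval='@daily',
)

task = PythonOperator(
    task_id='dfp_to_s3_etl',
    python_callable=dfp_to_s3_etl,
    provide_context=True,
    dag=dag,
)
```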