podhornyi opened a new issue #15077:
URL: https://github.com/apache/airflow/issues/15077
**Apache Airflow version**: 2.0.0
**Kubernetes version (if you are using kubernetes)** (use `kubectl
version`): v1.15.10
**Environment**:
```
TZ: Etc/UTC
## ----------------
## Airflow
## ----------------
AIRFLOW__CORE__DAGS_FOLDER: "/opt/airflow/git-sync"
AIRFLOW__CORE__EXECUTOR: "KubernetesExecutor"
AIRFLOW__CORE__FERNET_KEY: "{{ .Values.fernet.key }}"
AIRFLOW__CORE__LOAD_EXAMPLES: "False"
AIRFLOW__SCHEDULER__CHILD_PROCESS_LOG_DIRECTORY:
"/opt/airflow/logs/scheduler"
AIRFLOW__SCHEDULER__DAG_DIR_LIST_INTERVAL: "15"
AIRFLOW__SCHEDULER__MIN_FILE_PROCESS_INTERVAL: "0"
AIRFLOW__SCHEDULER__CATCHUP_BY_DEFAULT: "False"
AIRFLOW__WEBSERVER__BASE_URL: "http://localhost:8080"
AIRFLOW__WEBSERVER__WEB_SERVER_PORT: "8080"
AIRFLOW__WEBSERVER__AUTHENTICATE: "True"
AIRFLOW__WEBSERVER__AUTH_BACKEND:
"airflow.contrib.auth.backends.password_auth"
AIRFLOW__WEBSERVER__COOKIE_SAMESITE: "Lax"
## ----------------
## Airflow - User Configs
## ----------------
AIRFLOW__API__AUTH_BACKEND: "airflow.api.auth.backend.deny_all"
AIRFLOW__WEBSERVER__EXPOSE_CONFIG: "True"
AIRFLOW__WEBSERVER__DEFAULT_DAG_RUN_DISPLAY_NUMBER: "50"
AIRFLOW__CORE__DAG_CONCURRENCY: "1"
AIRFLOW__CORE__MAX_ACTIVE_RUNS_PER_DAG: "1"
AIRFLOW__SCHEDULER__MAX_DAGRUNS_PER_LOOP_TO_SCHEDULE: "1"
AIRFLOW__CORE__DAG_RUN_CONF_OVERRIDES_PARAMS: "True"
# Kubernetes section
AIRFLOW__KUBERNETES__NAMESPACE: "default"
AIRFLOW__KUBERNETES__DELETE_WORKER_PODS_ON_FAILURE: "False"
AIRFLOW__KUBERNETES__DELETE_WORKER_PODS: "True"
# LOGGING
AIRFLOW__LOGGING__LOGGING_CONFIG_CLASS:
"airflow_service.loggers.log_config.LOGGING_CONFIG"
AIRFLOW__LOGGING__BASE_LOG_FOLDER: "/opt/airflow/logs"
AIRFLOW__LOGGING__DAG_PROCESSOR_MANAGER_LOG_LOCATION:
"/opt/airflow/logs/dag_processor_manager/dag_processor_manager.log"
AIRFLOW__LOGGING__REMOTE_LOGGING: "True"
AIRFLOW__LOGGING__COLORED_CONSOLE_LOG: "False"
# ELK
AIRFLOW__ELASTICSEARCH__JSON_FORMAT: "True"
AIRFLOW__ELASTICSEARCH__WRITE_STDOUT: "True"
AIRFLOW__ELASTICSEARCH__HOST: "http://elk.logging.svc.cluster.local:8080"
AIRFLOW__ELASTICSEARCH__JSON_FIELDS: "asctime, name, levelname, filename,
lineno, message"
```
- **Kernel** (e.g. `uname -a`): 4.9.0-11-amd64 #1 SMP Debian
4.9.189-3+deb9u2 (2019-11-11) x86_64 GNU/Linux
**What happened**:
After the scheduler creates a k8s pod and the node that pod runs on goes down, the scheduler deletes the pod, sets the task to the `up_for_reschedule` state, and the task is stuck in the `queued` state forever.
**What you expected to happen**:
Since the task is defined with `retries=0`, the task should be marked failed,
or
the scheduler should recreate the pod in case of `retries > 0`.
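The expected handling can be sketched in plain Python (the function and state names here are illustrative, not Airflow's actual scheduler code):

```python
# Hypothetical sketch of the expected behavior when a worker pod's node is
# lost. Names are illustrative only; this is not Airflow's real API.
def resolve_lost_task(state: str, try_number: int, retries: int) -> str:
    """Decide what should happen to a task whose pod was lost."""
    if state != "queued":
        return state  # nothing to recover
    if try_number > retries:
        # retries=0 means the first lost attempt already exhausts retries,
        # so the expected outcome is a failed task, not a stuck queued one.
        return "failed"
    # Otherwise the scheduler is expected to re-queue and recreate the pod.
    return "scheduled"

print(resolve_lost_task("queued", try_number=1, retries=0))  # -> failed
print(resolve_lost_task("queued", try_number=1, retries=2))  # -> scheduled
```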
**How to reproduce it**:
Delete the k8s node while the scheduler is creating a task; in other words, right after the scheduler sends the pod-creation API call to k8s and receives an OK status.
**Anything else we need to know**:
After a scheduler restart, it sees the task in the `queued` state and executes it via a regular k8s call:
```
{"asctime": "2021-03-29 09:37:57,815", "processName": "MainProcess",
"module": "kubernetes_executor", "filename": "kubernetes_executor.py",
"lineno": 462, "levelname": "INFO", "message": "When executor started up, found
1 queued task instances"}
{"asctime": "2021-03-29 09:37:57,846", "processName": "MainProcess",
"module": "kubernetes_executor", "filename": "kubernetes_executor.py",
"lineno": 480, "levelname": "INFO", "message": "TaskInstance: <TaskInstance:
test_reschedule.up-for-reschedule.task_1 2021-03-29 09:34:01.780510+00:00
[queued]> found in queued state but was not launched, rescheduling"}
...
{"asctime": "2021-03-29 09:37:58,153", "processName": "MainProcess",
"module": "scheduler_job", "filename": "scheduler_job.py", "lineno": 938,
"levelname": "INFO", "message": "1 tasks up for execution:\n\t<TaskInstance:
test_reschedule.up-for-reschedule.task_1 2021-03-29 09:34:01.780510+00:00
[scheduled]>"}
{"asctime": "2021-03-29 09:37:58,157", "processName": "MainProcess",
"module": "scheduler_job", "filename": "scheduler_job.py", "lineno": 967,
"levelname": "INFO", "message": "Figuring out tasks to run in
Pool(name=default_pool) with 128 open slots and 1 task instances ready to be
queued"}
{"asctime": "2021-03-29 09:37:58,157", "processName": "MainProcess",
"module": "scheduler_job", "filename": "scheduler_job.py", "lineno": 995,
"levelname": "INFO", "message": "DAG test_reschedule.up-for-reschedule has 0/1
running and queued tasks"}
{"asctime": "2021-03-29 09:37:58,157", "processName": "MainProcess",
"module": "scheduler_job", "filename": "scheduler_job.py", "lineno": 1060,
"levelname": "INFO", "message": "Setting the following tasks to queued
state:\n\t<TaskInstance: test_reschedule.up-for-reschedule.task_1 2021-03-29
09:34:01.780510+00:00 [scheduled]>"}
{"asctime": "2021-03-29 09:37:58,161", "processName": "MainProcess",
"module": "scheduler_job", "filename": "scheduler_job.py", "lineno": 1102,
"levelname": "INFO", "message": "Sending
TaskInstanceKey(dag_id='test_reschedule.up-for-reschedule', task_id='task_1',
execution_date=datetime.datetime(2021, 3, 29, 9, 34, 1, 780510,
tzinfo=Timezone('UTC')), try_number=1) to executor with priority 3 and queue
default"}
```
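What the restart recovery in the logs above amounts to can be sketched in plain Python (hypothetical data structures, not the executor's real code): on startup the executor compares queued task instances against its set of launched pods, which is empty after a restart, and resets any it never launched so the scheduler can send them again.

```python
# Hypothetical sketch of the startup recovery seen in the logs above:
# queued task instances with no matching worker pod are reset to "scheduled".
def recover_queued_tasks(task_states: dict, launched_task_ids: set) -> list:
    rescheduled = []
    for task_id, state in task_states.items():
        if state == "queued" and task_id not in launched_task_ids:
            # "found in queued state but was not launched, rescheduling"
            task_states[task_id] = "scheduled"
            rescheduled.append(task_id)
    return rescheduled

tasks = {"test_reschedule.up-for-reschedule.task_1": "queued"}
print(recover_queued_tasks(tasks, launched_task_ids=set()))
# -> ['test_reschedule.up-for-reschedule.task_1']
```

Without a restart, this recovery never runs, which is why the task stays queued forever.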
Scheduler logs
Task stuck in the `queued` state: `test_reschedule.up-for-reschedule.task_1`
Pod name that ended up in `up_for_reschedule`:
`testrescheduleupforrescheduletask1-90fe939148314774bfcc61079113f76b`
```
[2021-03-26 16:34:07,585] [INFO] 1 tasks up for execution:
<TaskInstance: test_reschedule.up-for-reschedule.task_1 2021-03-26
16:34:06.628707+00:00 [scheduled]> {scheduler_job.py:938}
[2021-03-26 16:34:07,604] [INFO] Figuring out tasks to run in
Pool(name=default_pool) with 128 open slots and 1 task instances ready to be
queued {scheduler_job.py:967}
[2021-03-26 16:34:07,604] [INFO] DAG test_reschedule.up-for-reschedule has
0/1 running and queued tasks {scheduler_job.py:995}
[2021-03-26 16:34:07,604] [INFO] Setting the following tasks to queued state:
<TaskInstance: test_reschedule.up-for-reschedule.task_1 2021-03-26
16:34:06.628707+00:00 [scheduled]> {scheduler_job.py:1060}
[2021-03-26 16:34:07,610] [INFO] Sending
TaskInstanceKey(dag_id='test_reschedule.up-for-reschedule', task_id='task_1',
execution_date=datetime.datetime(2021, 3, 26, 16, 34, 6, 628707,
tzinfo=Timezone('UTC')), try_number=1) to executor with priority 3 and queue
default {scheduler_job.py:1102}
[2021-03-26 16:34:07,611] [INFO] Adding to queue: ['airflow', 'tasks',
'run', 'test_reschedule.up-for-reschedule', 'task_1',
'2021-03-26T16:34:06.628707+00:00', '--local', '--pool', 'default_pool',
'--subdir',
'/opt/airflow/git-sync/data-import/reschedule/deployment/airflow_dags/dag.py']
{base_executor.py:79}
[2021-03-26 16:34:07,613] [DEBUG] 0 running task instances
{base_executor.py:147}
[2021-03-26 16:34:07,613] [DEBUG] 1 in queue {base_executor.py:148}
[2021-03-26 16:34:07,613] [DEBUG] 32 open slots {base_executor.py:149}
[2021-03-26 16:34:07,613] [INFO] Add task
TaskInstanceKey(dag_id='test_reschedule.up-for-reschedule', task_id='task_1',
execution_date=datetime.datetime(2021, 3, 26, 16, 34, 6, 628707,
tzinfo=Timezone('UTC')), try_number=1) with command ['airflow', 'tasks', 'run',
'test_reschedule.up-for-reschedule', 'task_1',
'2021-03-26T16:34:06.628707+00:00', '--local', '--pool', 'default_pool',
'--subdir',
'/opt/airflow/git-sync/data-import/reschedule/deployment/airflow_dags/dag.py']
with executor_config {...} {kubernetes_executor.py:510}
[2021-03-26 16:34:07,629] [DEBUG] Calling the <class
'airflow.executors.kubernetes_executor.KubernetesExecutor'> sync method
{base_executor.py:158}
[2021-03-26 16:34:07,629] [DEBUG] self.running:
{TaskInstanceKey(dag_id='test_reschedule.up-for-reschedule', task_id='task_1',
execution_date=datetime.datetime(2021, 3, 26, 16, 34, 6, 628707,
tzinfo=Timezone('UTC')), try_number=1)} {kubernetes_executor.py:524}
[2021-03-26 16:34:07,630] [DEBUG] Syncing KubernetesExecutor
{kubernetes_executor.py:337}
[2021-03-26 16:34:07,630] [DEBUG] KubeJobWatcher alive, continuing
{kubernetes_executor.py:263}
[2021-03-26 16:34:07,642] [INFO] Kubernetes job is
(TaskInstanceKey(dag_id='test_reschedule.up-for-reschedule', task_id='task_1',
execution_date=datetime.datetime(2021, 3, 26, 16, 34, 6, 628707,
tzinfo=Timezone('UTC')), try_number=1), ['airflow', 'tasks', 'run',
'test_reschedule.up-for-reschedule', 'task_1',
'2021-03-26T16:34:06.628707+00:00', '--local', '--pool', 'default_pool',
'--subdir',
'/opt/airflow/git-sync/data-import/reschedule/deployment/airflow_dags/dag.py'],
{...}, None) {kubernetes_executor.py:277}
[2021-03-26 16:34:07,716] [DEBUG] Kubernetes running for command ['airflow',
'tasks', 'run', 'test_reschedule.up-for-reschedule', 'task_1',
'2021-03-26T16:34:06.628707+00:00', '--local', '--pool', 'default_pool',
'--subdir',
'/opt/airflow/git-sync/data-import/reschedule/deployment/airflow_dags/dag.py']
{kubernetes_executor.py:306}
[2021-03-26 16:34:07,716] [DEBUG] Kubernetes launching image
XXXXXXXXXXX.dkr.ecr.eu-central-1.amazonaws.com/airflow:2.0.0
{kubernetes_executor.py:307}
[2021-03-26 16:34:07,718] [DEBUG] Pod Creation Request:
{...} {pod_launcher.py:79}
[2021-03-26 16:34:07,750] [DEBUG] response body:
{"kind":"Pod","apiVersion":"v1","metadata":{"name":"testrescheduleupforrescheduletask1-90fe939148314774bfcc61079113f76b",
...,"status":{"phase":"Pending","qosClass":"Burstable"}}
{rest.py:230}
[2021-03-26 16:34:07,751] [DEBUG] Disposing DB connection pool (PID 20086)
{settings.py:290}
[2021-03-26 16:34:07,760] [INFO] Event:
testrescheduleupforrescheduletask1-90fe939148314774bfcc61079113f76b had an
event of type ADDED {kubernetes_executor.py:147}
[2021-03-26 16:34:07,761] [INFO] Event:
testrescheduleupforrescheduletask1-90fe939148314774bfcc61079113f76b Pending
{kubernetes_executor.py:202}
[2021-03-26 16:34:07,808] [INFO] Event:
testrescheduleupforrescheduletask1-90fe939148314774bfcc61079113f76b had an
event of type MODIFIED {kubernetes_executor.py:147}
[2021-03-26 16:34:07,808] [INFO] Event:
testrescheduleupforrescheduletask1-90fe939148314774bfcc61079113f76b Pending
{kubernetes_executor.py:202}
[2021-03-26 16:34:07,810] [DEBUG] Disposing DB connection pool (PID 20072)
{settings.py:290}
[2021-03-26 16:34:07,760] [DEBUG] Pod Creation Response: {...}
{pod_launcher.py:84}
[2021-03-26 16:34:07,812] [DEBUG] Kubernetes Job created!
{kubernetes_executor.py:311}
[2021-03-26 16:34:07,813] [INFO] Executor reports execution of
test_reschedule.up-for-reschedule.task_1 execution_date=2021-03-26
16:34:06.628707+00:00 exited with status queued for try_number 1
{scheduler_job.py:1193}
[2021-03-26 16:34:07,822] [INFO] Setting external_id for <TaskInstance:
test_reschedule.up-for-reschedule.task_1 2021-03-26 16:34:06.628707+00:00
[queued]> to 9 {scheduler_job.py:1220}
[2021-03-26 16:34:07,906] [DEBUG] number of tis tasks for <DagRun
test_reschedule.up-for-reschedule @ 2021-03-26 16:34:06.628707+00:00:
manual__2021-03-26T16:34:06.628707+00:00, externally triggered: True>: 3
task(s) {dagrun.py:490}
[2021-03-26 16:34:07,906] [DEBUG] number of scheduleable tasks for <DagRun
test_reschedule.up-for-reschedule @ 2021-03-26 16:34:06.628707+00:00:
manual__2021-03-26T16:34:06.628707+00:00, externally triggered: True>: 2
task(s) {dagrun.py:498}
[2021-03-26 16:34:07,910] [DEBUG] <TaskInstance:
test_reschedule.up-for-reschedule.task_3 2021-03-26 16:34:06.628707+00:00
[None]> dependency 'Previous Dagrun State' PASSED: True, The task did not have
depends_on_past set. {taskinstance.py:836}
[2021-03-26 16:34:07,910] [DEBUG] <TaskInstance:
test_reschedule.up-for-reschedule.task_3 2021-03-26 16:34:06.628707+00:00
[None]> dependency 'Not In Retry Period' PASSED: True, The task instance was
not marked for retrying. {taskinstance.py:836}
[2021-03-26 16:34:07,911] [DEBUG] <TaskInstance:
test_reschedule.up-for-reschedule.task_3 2021-03-26 16:34:06.628707+00:00
[None]> dependency 'Trigger Rule' PASSED: False, Task's trigger rule
'all_success' requires all upstream tasks to have succeeded, but found 1
non-success(es). upstream_tasks_state={'total': 1, 'successes': 0, 'skipped':
0, 'failed': 0, 'upstream_failed': 0, 'done': 0}, upstream_task_ids={'task_2'}
{taskinstance.py:836}
[2021-03-26 16:34:07,911] [DEBUG] Dependencies not met for <TaskInstance:
test_reschedule.up-for-reschedule.task_3 2021-03-26 16:34:06.628707+00:00
[None]>, dependency 'Trigger Rule' FAILED: Task's trigger rule 'all_success'
requires all upstream tasks to have succeeded, but found 1 non-success(es).
upstream_tasks_state={'total': 1, 'successes': 0, 'skipped': 0, 'failed': 0,
'upstream_failed': 0, 'done': 0}, upstream_task_ids={'task_2'}
{taskinstance.py:816}
[2021-03-26 16:34:07,911] [DEBUG] <TaskInstance:
test_reschedule.up-for-reschedule.task_2 2021-03-26 16:34:06.628707+00:00
[None]> dependency 'Previous Dagrun State' PASSED: True, The task did not have
depends_on_past set. {taskinstance.py:836}
[2021-03-26 16:34:07,911] [DEBUG] <TaskInstance:
test_reschedule.up-for-reschedule.task_2 2021-03-26 16:34:06.628707+00:00
[None]> dependency 'Not In Retry Period' PASSED: True, The task instance was
not marked for retrying. {taskinstance.py:836}
[2021-03-26 16:34:07,911] [DEBUG] <TaskInstance:
test_reschedule.up-for-reschedule.task_2 2021-03-26 16:34:06.628707+00:00
[None]> dependency 'Trigger Rule' PASSED: False, Task's trigger rule
'all_success' requires all upstream tasks to have succeeded, but found 1
non-success(es). upstream_tasks_state={'total': 1, 'successes': 0, 'skipped':
0, 'failed': 0, 'upstream_failed': 0, 'done': 0}, upstream_task_ids={'task_1'}
{taskinstance.py:836}
[2021-03-26 16:34:07,912] [DEBUG] Dependencies not met for <TaskInstance:
test_reschedule.up-for-reschedule.task_2 2021-03-26 16:34:06.628707+00:00
[None]>, dependency 'Trigger Rule' FAILED: Task's trigger rule 'all_success'
requires all upstream tasks to have succeeded, but found 1 non-success(es).
upstream_tasks_state={'total': 1, 'successes': 0, 'skipped': 0, 'failed': 0,
'upstream_failed': 0, 'done': 0}, upstream_task_ids={'task_1'}
{taskinstance.py:816}
[2021-03-26 16:34:07,916] [DEBUG] <TaskInstance:
test_reschedule.up-for-reschedule.task_1 2021-03-26 16:34:06.628707+00:00
[queued]> dependency 'Previous Dagrun State' PASSED: True, The task did not
have depends_on_past set. {taskinstance.py:836}
[2021-03-26 16:34:07,916] [DEBUG] <TaskInstance:
test_reschedule.up-for-reschedule.task_1 2021-03-26 16:34:06.628707+00:00
[queued]> dependency 'Not In Retry Period' PASSED: True, The context specified
that being in a retry period was permitted. {taskinstance.py:836}
[2021-03-26 16:34:07,917] [DEBUG] <TaskInstance:
test_reschedule.up-for-reschedule.task_1 2021-03-26 16:34:06.628707+00:00
[queued]> dependency 'Trigger Rule' PASSED: True, The task instance did not
have any upstream tasks. {taskinstance.py:836}
[2021-03-26 16:34:07,917] [DEBUG] Dependencies all met for <TaskInstance:
test_reschedule.up-for-reschedule.task_1 2021-03-26 16:34:06.628707+00:00
[queued]> {taskinstance.py:826}
[2021-03-26 16:34:07,917] [DEBUG] Skipping SLA check for <DAG:
test_reschedule.up-for-reschedule> because no tasks in DAG have SLAs
{scheduler_job.py:1720}
[2021-03-26 16:34:07,930] [DEBUG] No tasks to consider for execution.
{scheduler_job.py:933}
[2021-03-26 16:34:07,931] [DEBUG] 1 running task instances
{base_executor.py:147}
[2021-03-26 16:34:07,932] [DEBUG] 0 in queue {base_executor.py:148}
[2021-03-26 16:34:07,932] [DEBUG] 31 open slots {base_executor.py:149}
[2021-03-26 16:34:07,932] [DEBUG] Calling the <class
'airflow.executors.kubernetes_executor.KubernetesExecutor'> sync method
{base_executor.py:158}
[2021-03-26 16:34:07,932] [DEBUG] self.running:
{TaskInstanceKey(dag_id='test_reschedule.up-for-reschedule', task_id='task_1',
execution_date=datetime.datetime(2021, 3, 26, 16, 34, 6, 628707,
tzinfo=Timezone('UTC')), try_number=1)} {kubernetes_executor.py:524}
[2021-03-26 16:34:07,932] [DEBUG] Syncing KubernetesExecutor
{kubernetes_executor.py:337}
[2021-03-26 16:34:07,932] [DEBUG] KubeJobWatcher alive, continuing
{kubernetes_executor.py:263}
....
[2021-03-26 16:34:08,977] [DEBUG] number of tis tasks for <DagRun
test_reschedule.up-for-reschedule @ 2021-03-26 16:34:06.628707+00:00:
manual__2021-03-26T16:34:06.628707+00:00, externally triggered: True>: 3
task(s) {dagrun.py:490}
[2021-03-26 16:34:08,978] [DEBUG] number of scheduleable tasks for <DagRun
test_reschedule.up-for-reschedule @ 2021-03-26 16:34:06.628707+00:00:
manual__2021-03-26T16:34:06.628707+00:00, externally triggered: True>: 2
task(s) {dagrun.py:498}
[2021-03-26 16:34:08,977] [DEBUG] Started a process (PID: 20286) to generate
tasks for
/opt/airflow/git-sync/airflow/up-for-reschedule/airflow_service/loggers/log_config.py
{dag_processing.py:1000}
[2021-03-26 16:34:08,978] [DEBUG] 2/2 DAG parsing processes running
{dag_processing.py:983}
[2021-03-26 16:34:08,978] [DEBUG] <TaskInstance:
test_reschedule.up-for-reschedule.task_3 2021-03-26 16:34:06.628707+00:00
[None]> dependency 'Previous Dagrun State' PASSED: True, The task did not have
depends_on_past set. {taskinstance.py:836}
[2021-03-26 16:34:08,978] [DEBUG] 3 file paths queued for processing
{dag_processing.py:985}
[2021-03-26 16:34:08,978] [DEBUG] <TaskInstance:
test_reschedule.up-for-reschedule.task_3 2021-03-26 16:34:06.628707+00:00
[None]> dependency 'Not In Retry Period' PASSED: True, The task instance was
not marked for retrying. {taskinstance.py:836}
[2021-03-26 16:34:08,979] [DEBUG] <TaskInstance:
test_reschedule.up-for-reschedule.task_3 2021-03-26 16:34:06.628707+00:00
[None]> dependency 'Trigger Rule' PASSED: False, Task's trigger rule
'all_success' requires all upstream tasks to have succeeded, but found 1
non-success(es). upstream_tasks_state={'total': 1, 'successes': 0, 'skipped':
0, 'failed': 0, 'upstream_failed': 0, 'done': 0}, upstream_task_ids={'task_2'}
{taskinstance.py:836}
[2021-03-26 16:34:08,979] [DEBUG] Dependencies not met for <TaskInstance:
test_reschedule.up-for-reschedule.task_3 2021-03-26 16:34:06.628707+00:00
[None]>, dependency 'Trigger Rule' FAILED: Task's trigger rule 'all_success'
requires all upstream tasks to have succeeded, but found 1 non-success(es).
upstream_tasks_state={'total': 1, 'successes': 0, 'skipped': 0, 'failed': 0,
'upstream_failed': 0, 'done': 0}, upstream_task_ids={'task_2'}
{taskinstance.py:816}
[2021-03-26 16:34:08,979] [DEBUG] <TaskInstance:
test_reschedule.up-for-reschedule.task_2 2021-03-26 16:34:06.628707+00:00
[None]> dependency 'Previous Dagrun State' PASSED: True, The task did not have
depends_on_past set. {taskinstance.py:836}
[2021-03-26 16:34:08,979] [DEBUG] <TaskInstance:
test_reschedule.up-for-reschedule.task_2 2021-03-26 16:34:06.628707+00:00
[None]> dependency 'Not In Retry Period' PASSED: True, The task instance was
not marked for retrying. {taskinstance.py:836}
[2021-03-26 16:34:08,980] [DEBUG] <TaskInstance:
test_reschedule.up-for-reschedule.task_2 2021-03-26 16:34:06.628707+00:00
[None]> dependency 'Trigger Rule' PASSED: False, Task's trigger rule
'all_success' requires all upstream tasks to have succeeded, but found 1
non-success(es). upstream_tasks_state={'total': 1, 'successes': 0, 'skipped':
0, 'failed': 0, 'upstream_failed': 0, 'done': 0}, upstream_task_ids={'task_1'}
{taskinstance.py:836}
[2021-03-26 16:34:08,980] [DEBUG] Dependencies not met for <TaskInstance:
test_reschedule.up-for-reschedule.task_2 2021-03-26 16:34:06.628707+00:00
[None]>, dependency 'Trigger Rule' FAILED: Task's trigger rule 'all_success'
requires all upstream tasks to have succeeded, but found 1 non-success(es).
upstream_tasks_state={'total': 1, 'successes': 0, 'skipped': 0, 'failed': 0,
'upstream_failed': 0, 'done': 0}, upstream_task_ids={'task_1'}
{taskinstance.py:816}
[2021-03-26 16:34:08,984] [DEBUG] <TaskInstance:
test_reschedule.up-for-reschedule.task_1 2021-03-26 16:34:06.628707+00:00
[queued]> dependency 'Previous Dagrun State' PASSED: True, The task did not
have depends_on_past set. {taskinstance.py:836}
[2021-03-26 16:34:08,985] [DEBUG] <TaskInstance:
test_reschedule.up-for-reschedule.task_1 2021-03-26 16:34:06.628707+00:00
[queued]> dependency 'Not In Retry Period' PASSED: True, The context specified
that being in a retry period was permitted. {taskinstance.py:836}
[2021-03-26 16:34:08,985] [DEBUG] <TaskInstance:
test_reschedule.up-for-reschedule.task_1 2021-03-26 16:34:06.628707+00:00
[queued]> dependency 'Trigger Rule' PASSED: True, The task instance did not
have any upstream tasks. {taskinstance.py:836}
[2021-03-26 16:34:08,985] [DEBUG] Dependencies all met for <TaskInstance:
test_reschedule.up-for-reschedule.task_1 2021-03-26 16:34:06.628707+00:00
[queued]> {taskinstance.py:826}
[2021-03-26 16:34:08,985] [DEBUG] Skipping SLA check for <DAG:
test_reschedule.up-for-reschedule> because no tasks in DAG have SLAs
{scheduler_job.py:1720}
[2021-03-26 16:34:08,998] [DEBUG] No tasks to consider for execution.
{scheduler_job.py:933}
[2021-03-26 16:34:09,000] [DEBUG] 1 running task instances
{base_executor.py:147}
[2021-03-26 16:34:09,000] [DEBUG] 0 in queue {base_executor.py:148}
[2021-03-26 16:34:09,000] [DEBUG] 31 open slots {base_executor.py:149}
[2021-03-26 16:34:09,000] [DEBUG] Calling the <class
'airflow.executors.kubernetes_executor.KubernetesExecutor'> sync method
{base_executor.py:158}
[2021-03-26 16:34:09,000] [DEBUG] self.running:
{TaskInstanceKey(dag_id='test_reschedule.up-for-reschedule', task_id='task_1',
execution_date=datetime.datetime(2021, 3, 26, 16, 34, 6, 628707,
tzinfo=Timezone('UTC')), try_number=1)} {kubernetes_executor.py:524}
[2021-03-26 16:34:09,000] [DEBUG] Syncing KubernetesExecutor
{kubernetes_executor.py:337}
...
[2021-03-26 16:34:10,048] [DEBUG] number of tis tasks for <DagRun
test_reschedule.up-for-reschedule @ 2021-03-26 16:34:06.628707+00:00:
manual__2021-03-26T16:34:06.628707+00:00, externally triggered: True>: 3
task(s) {dagrun.py:490}
[2021-03-26 16:34:10,048] [DEBUG] number of scheduleable tasks for <DagRun
test_reschedule.up-for-reschedule @ 2021-03-26 16:34:06.628707+00:00:
manual__2021-03-26T16:34:06.628707+00:00, externally triggered: True>: 2
task(s) {dagrun.py:498}
[2021-03-26 16:34:10,048] [DEBUG] <TaskInstance:
test_reschedule.up-for-reschedule.task_3 2021-03-26 16:34:06.628707+00:00
[None]> dependency 'Previous Dagrun State' PASSED: True, The task did not have
depends_on_past set. {taskinstance.py:836}
[2021-03-26 16:34:10,048] [DEBUG] <TaskInstance:
test_reschedule.up-for-reschedule.task_3 2021-03-26 16:34:06.628707+00:00
[None]> dependency 'Not In Retry Period' PASSED: True, The task instance was
not marked for retrying. {taskinstance.py:836}
[2021-03-26 16:34:10,049] [DEBUG] <TaskInstance:
test_reschedule.up-for-reschedule.task_3 2021-03-26 16:34:06.628707+00:00
[None]> dependency 'Trigger Rule' PASSED: False, Task's trigger rule
'all_success' requires all upstream tasks to have succeeded, but found 1
non-success(es). upstream_tasks_state={'total': 1, 'successes': 0, 'skipped':
0, 'failed': 0, 'upstream_failed': 0, 'done': 0}, upstream_task_ids={'task_2'}
{taskinstance.py:836}
[2021-03-26 16:34:10,049] [DEBUG] Dependencies not met for <TaskInstance:
test_reschedule.up-for-reschedule.task_3 2021-03-26 16:34:06.628707+00:00
[None]>, dependency 'Trigger Rule' FAILED: Task's trigger rule 'all_success'
requires all upstream tasks to have succeeded, but found 1 non-success(es).
upstream_tasks_state={'total': 1, 'successes': 0, 'skipped': 0, 'failed': 0,
'upstream_failed': 0, 'done': 0}, upstream_task_ids={'task_2'}
{taskinstance.py:816}
[2021-03-26 16:34:10,049] [DEBUG] <TaskInstance:
test_reschedule.up-for-reschedule.task_2 2021-03-26 16:34:06.628707+00:00
[None]> dependency 'Previous Dagrun State' PASSED: True, The task did not have
depends_on_past set. {taskinstance.py:836}
[2021-03-26 16:34:10,049] [DEBUG] <TaskInstance:
test_reschedule.up-for-reschedule.task_2 2021-03-26 16:34:06.628707+00:00
[None]> dependency 'Not In Retry Period' PASSED: True, The task instance was
not marked for retrying. {taskinstance.py:836}
[2021-03-26 16:34:10,049] [DEBUG] <TaskInstance:
test_reschedule.up-for-reschedule.task_2 2021-03-26 16:34:06.628707+00:00
[None]> dependency 'Trigger Rule' PASSED: False, Task's trigger rule
'all_success' requires all upstream tasks to have succeeded, but found 1
non-success(es). upstream_tasks_state={'total': 1, 'successes': 0, 'skipped':
0, 'failed': 0, 'upstream_failed': 0, 'done': 0}, upstream_task_ids={'task_1'}
{taskinstance.py:836}
[2021-03-26 16:34:10,050] [DEBUG] Dependencies not met for <TaskInstance:
test_reschedule.up-for-reschedule.task_2 2021-03-26 16:34:06.628707+00:00
[None]>, dependency 'Trigger Rule' FAILED: Task's trigger rule 'all_success'
requires all upstream tasks to have succeeded, but found 1 non-success(es).
upstream_tasks_state={'total': 1, 'successes': 0, 'skipped': 0, 'failed': 0,
'upstream_failed': 0, 'done': 0}, upstream_task_ids={'task_1'}
{taskinstance.py:816}
[2021-03-26 16:34:10,055] [DEBUG] <TaskInstance:
test_reschedule.up-for-reschedule.task_1 2021-03-26 16:34:06.628707+00:00
[queued]> dependency 'Previous Dagrun State' PASSED: True, The task did not
have depends_on_past set. {taskinstance.py:836}
[2021-03-26 16:34:10,056] [DEBUG] <TaskInstance:
test_reschedule.up-for-reschedule.task_1 2021-03-26 16:34:06.628707+00:00
[queued]> dependency 'Not In Retry Period' PASSED: True, The context specified
that being in a retry period was permitted. {taskinstance.py:836}
[2021-03-26 16:34:10,056] [DEBUG] <TaskInstance:
test_reschedule.up-for-reschedule.task_1 2021-03-26 16:34:06.628707+00:00
[queued]> dependency 'Trigger Rule' PASSED: True, The task instance did not
have any upstream tasks. {taskinstance.py:836}
[2021-03-26 16:34:10,056] [DEBUG] Dependencies all met for <TaskInstance:
test_reschedule.up-for-reschedule.task_1 2021-03-26 16:34:06.628707+00:00
[queued]> {taskinstance.py:826}
[2021-03-26 16:34:10,056] [DEBUG] Skipping SLA check for <DAG:
test_reschedule.up-for-reschedule> because no tasks in DAG have SLAs
{scheduler_job.py:1720}
[2021-03-26 16:34:10,066] [DEBUG] Processor for
/opt/airflow/git-sync/airflow/up-for-reschedule/airflow_service/loggers/log_config.py
finished {dag_processing.py:949}
[2021-03-26 16:34:10,066] [DEBUG] Waiting for <ForkProcess
name='DagFileProcessor2911-Process' pid=20464 parent=82 stopped exitcode=0>
{scheduler_job.py:309}
[2021-03-26 16:34:10,071] [DEBUG] No tasks to consider for execution.
{scheduler_job.py:933}
[2021-03-26 16:34:10,071] [DEBUG] Started a process (PID: 20478) to generate
tasks for
/opt/airflow/git-sync/data-import/reschedule/deployment/airflow_dags/dag.py
{dag_processing.py:1000}
[2021-03-26 16:34:10,072] [DEBUG] 2/2 DAG parsing processes running
{dag_processing.py:983}
[2021-03-26 16:34:10,072] [DEBUG] 1 file paths queued for processing
{dag_processing.py:985}
[2021-03-26 16:34:10,073] [DEBUG] 1 running task instances
{base_executor.py:147}
[2021-03-26 16:34:10,073] [DEBUG] 0 in queue {base_executor.py:148}
[2021-03-26 16:34:10,073] [DEBUG] 31 open slots {base_executor.py:149}
[2021-03-26 16:34:10,074] [DEBUG] Calling the <class
'airflow.executors.kubernetes_executor.KubernetesExecutor'> sync method
{base_executor.py:158}
[2021-03-26 16:34:10,074] [DEBUG] self.running:
{TaskInstanceKey(dag_id='test_reschedule.up-for-reschedule', task_id='task_1',
execution_date=datetime.datetime(2021, 3, 26, 16, 34, 6, 628707,
tzinfo=Timezone('UTC')), try_number=1)} {kubernetes_executor.py:524}
[2021-03-26 16:34:10,074] [DEBUG] Syncing KubernetesExecutor
{kubernetes_executor.py:337}
[2021-03-26 16:34:10,074] [DEBUG] KubeJobWatcher alive, continuing
{kubernetes_executor.py:263}
...
[2021-03-26 16:34:10,315] [DEBUG] number of tis tasks for <DagRun
test_reschedule.up-for-reschedule @ 2021-03-26 16:34:06.628707+00:00:
manual__2021-03-26T16:34:06.628707+00:00, externally triggered: True>: 3
task(s) {dagrun.py:490}
[2021-03-26 16:34:10,315] [DEBUG] number of scheduleable tasks for <DagRun
test_reschedule.up-for-reschedule @ 2021-03-26 16:34:06.628707+00:00:
manual__2021-03-26T16:34:06.628707+00:00, externally triggered: True>: 2
task(s) {dagrun.py:498}
[2021-03-26 16:34:10,316] [DEBUG] <TaskInstance:
test_reschedule.up-for-reschedule.task_3 2021-03-26 16:34:06.628707+00:00
[None]> dependency 'Previous Dagrun State' PASSED: True, The task did not have
depends_on_past set. {taskinstance.py:836}
[2021-03-26 16:34:10,316] [DEBUG] <TaskInstance:
test_reschedule.up-for-reschedule.task_3 2021-03-26 16:34:06.628707+00:00
[None]> dependency 'Not In Retry Period' PASSED: True, The task instance was
not marked for retrying. {taskinstance.py:836}
[2021-03-26 16:34:10,316] [DEBUG] <TaskInstance:
test_reschedule.up-for-reschedule.task_3 2021-03-26 16:34:06.628707+00:00
[None]> dependency 'Trigger Rule' PASSED: False, Task's trigger rule
'all_success' requires all upstream tasks to have succeeded, but found 1
non-success(es). upstream_tasks_state={'total': 1, 'successes': 0, 'skipped':
0, 'failed': 0, 'upstream_failed': 0, 'done': 0}, upstream_task_ids={'task_2'}
{taskinstance.py:836}
[2021-03-26 16:34:10,316] [DEBUG] Dependencies not met for <TaskInstance:
test_reschedule.up-for-reschedule.task_3 2021-03-26 16:34:06.628707+00:00
[None]>, dependency 'Trigger Rule' FAILED: Task's trigger rule 'all_success'
requires all upstream tasks to have succeeded, but found 1 non-success(es).
upstream_tasks_state={'total': 1, 'successes': 0, 'skipped': 0, 'failed': 0,
'upstream_failed': 0, 'done': 0}, upstream_task_ids={'task_2'}
{taskinstance.py:816}
[2021-03-26 16:34:10,316] [DEBUG] <TaskInstance:
test_reschedule.up-for-reschedule.task_2 2021-03-26 16:34:06.628707+00:00
[None]> dependency 'Previous Dagrun State' PASSED: True, The task did not have
depends_on_past set. {taskinstance.py:836}
[2021-03-26 16:34:10,317] [DEBUG] <TaskInstance:
test_reschedule.up-for-reschedule.task_2 2021-03-26 16:34:06.628707+00:00
[None]> dependency 'Not In Retry Period' PASSED: True, The task instance was
not marked for retrying. {taskinstance.py:836}
[2021-03-26 16:34:10,317] [DEBUG] <TaskInstance:
test_reschedule.up-for-reschedule.task_2 2021-03-26 16:34:06.628707+00:00
[None]> dependency 'Trigger Rule' PASSED: False, Task's trigger rule
'all_success' requires all upstream tasks to have succeeded, but found 1
non-success(es). upstream_tasks_state={'total': 1, 'successes': 0, 'skipped':
0, 'failed': 0, 'upstream_failed': 0, 'done': 0}, upstream_task_ids={'task_1'}
{taskinstance.py:836}
[2021-03-26 16:34:10,317] [DEBUG] Dependencies not met for <TaskInstance:
test_reschedule.up-for-reschedule.task_2 2021-03-26 16:34:06.628707+00:00
[None]>, dependency 'Trigger Rule' FAILED: Task's trigger rule 'all_success'
requires all upstream tasks to have succeeded, but found 1 non-success(es).
upstream_tasks_state={'total': 1, 'successes': 0, 'skipped': 0, 'failed': 0,
'upstream_failed': 0, 'done': 0}, upstream_task_ids={'task_1'}
{taskinstance.py:816}
[2021-03-26 16:34:10,323] [DEBUG] <TaskInstance:
test_reschedule.up-for-reschedule.task_1 2021-03-26 16:34:06.628707+00:00
[queued]> dependency 'Previous Dagrun State' PASSED: True, The task did not
have depends_on_past set. {taskinstance.py:836}
[2021-03-26 16:34:10,323] [DEBUG] <TaskInstance:
test_reschedule.up-for-reschedule.task_1 2021-03-26 16:34:06.628707+00:00
[queued]> dependency 'Not In Retry Period' PASSED: True, The context specified
that being in a retry period was permitted. {taskinstance.py:836}
[2021-03-26 16:34:10,324] [DEBUG] <TaskInstance:
test_reschedule.up-for-reschedule.task_1 2021-03-26 16:34:06.628707+00:00
[queued]> dependency 'Trigger Rule' PASSED: True, The task instance did not
have any upstream tasks. {taskinstance.py:836}
[2021-03-26 16:34:10,324] [DEBUG] Dependencies all met for <TaskInstance:
test_reschedule.up-for-reschedule.task_1 2021-03-26 16:34:06.628707+00:00
[queued]> {taskinstance.py:826}
[2021-03-26 16:34:10,324] [DEBUG] Skipping SLA check for <DAG:
test_reschedule.up-for-reschedule> because no tasks in DAG have SLAs
{scheduler_job.py:1720}
[2021-03-26 16:34:10,342] [DEBUG] No tasks to consider for execution.
{scheduler_job.py:933}
[2021-03-26 16:34:10,344] [DEBUG] 1 running task instances
{base_executor.py:147}
[2021-03-26 16:34:10,344] [DEBUG] 0 in queue {base_executor.py:148}
[2021-03-26 16:34:10,344] [DEBUG] 31 open slots {base_executor.py:149}
[2021-03-26 16:34:10,344] [DEBUG] Calling the <class
'airflow.executors.kubernetes_executor.KubernetesExecutor'> sync method
{base_executor.py:158}
[2021-03-26 16:34:10,344] [DEBUG] self.running:
{TaskInstanceKey(dag_id='test_reschedule.up-for-reschedule', task_id='task_1',
execution_date=datetime.datetime(2021, 3, 26, 16, 34, 6, 628707,
tzinfo=Timezone('UTC')), try_number=1)} {kubernetes_executor.py:524}
[2021-03-26 16:34:10,345] [DEBUG] Syncing KubernetesExecutor
{kubernetes_executor.py:337}
[2021-03-26 16:34:10,345] [DEBUG] KubeJobWatcher alive, continuing
{kubernetes_executor.py:263}
...
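(For context on the repeated `Trigger Rule` entries above: a minimal, illustrative sketch of how the logged `upstream_tasks_state` summary maps to the PASSED/FAILED verdict under `all_success`. This is a reduction of the logged fields only, not Airflow's actual `TriggerRuleDep` implementation.)

```python
# Hedged sketch: evaluate the 'all_success' trigger rule from the
# upstream_tasks_state dict that appears in the log lines above.
# Not Airflow's real code; field names follow the logged output.

def all_success_met(upstream_tasks_state: dict) -> bool:
    """Return True only when every upstream task has succeeded."""
    total = upstream_tasks_state["total"]
    successes = upstream_tasks_state["successes"]
    # Anything other than successes == total (running, skipped, failed,
    # upstream_failed) counts as a "non-success" and blocks the task.
    return successes == total

# The state logged for task_2 while task_1 is still running:
state = {"total": 1, "successes": 0, "skipped": 0,
         "failed": 0, "upstream_failed": 0, "done": 0}
print(all_success_met(state))  # False: found 1 non-success among 1 upstream
```

This matches the log: `task_2` and `task_3` stay blocked as long as their single upstream task shows `successes: 0`.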
[2021-03-26 16:34:11,408] [DEBUG] number of tis tasks for <DagRun
test_reschedule.up-for-reschedule @ 2021-03-26 16:34:06.628707+00:00:
manual__2021-03-26T16:34:06.628707+00:00, externally triggered: True>: 3
task(s) {dagrun.py:490}
[2021-03-26 16:34:11,408] [DEBUG] number of scheduleable tasks for <DagRun
test_reschedule.up-for-reschedule @ 2021-03-26 16:34:06.628707+00:00:
manual__2021-03-26T16:34:06.628707+00:00, externally triggered: True>: 2
task(s) {dagrun.py:498}
[2021-03-26 16:34:11,409] [DEBUG] <TaskInstance:
test_reschedule.up-for-reschedule.task_3 2021-03-26 16:34:06.628707+00:00
[None]> dependency 'Previous Dagrun State' PASSED: True, The task did not have
depends_on_past set. {taskinstance.py:836}
[2021-03-26 16:34:11,409] [DEBUG] <TaskInstance:
test_reschedule.up-for-reschedule.task_3 2021-03-26 16:34:06.628707+00:00
[None]> dependency 'Not In Retry Period' PASSED: True, The task instance was
not marked for retrying. {taskinstance.py:836}
[2021-03-26 16:34:11,409] [DEBUG] <TaskInstance:
test_reschedule.up-for-reschedule.task_3 2021-03-26 16:34:06.628707+00:00
[None]> dependency 'Trigger Rule' PASSED: False, Task's trigger rule
'all_success' requires all upstream tasks to have succeeded, but found 1
non-success(es). upstream_tasks_state={'total': 1, 'successes': 0, 'skipped':
0, 'failed': 0, 'upstream_failed': 0, 'done': 0}, upstream_task_ids={'task_2'}
{taskinstance.py:836}
[2021-03-26 16:34:11,409] [DEBUG] Dependencies not met for <TaskInstance:
test_reschedule.up-for-reschedule.task_3 2021-03-26 16:34:06.628707+00:00
[None]>, dependency 'Trigger Rule' FAILED: Task's trigger rule 'all_success'
requires all upstream tasks to have succeeded, but found 1 non-success(es).
upstream_tasks_state={'total': 1, 'successes': 0, 'skipped': 0, 'failed': 0,
'upstream_failed': 0, 'done': 0}, upstream_task_ids={'task_2'}
{taskinstance.py:816}
[2021-03-26 16:34:11,409] [DEBUG] <TaskInstance:
test_reschedule.up-for-reschedule.task_2 2021-03-26 16:34:06.628707+00:00
[None]> dependency 'Previous Dagrun State' PASSED: True, The task did not have
depends_on_past set. {taskinstance.py:836}
[2021-03-26 16:34:11,409] [DEBUG] <TaskInstance:
test_reschedule.up-for-reschedule.task_2 2021-03-26 16:34:06.628707+00:00
[None]> dependency 'Not In Retry Period' PASSED: True, The task instance was
not marked for retrying. {taskinstance.py:836}
[2021-03-26 16:34:11,410] [DEBUG] <TaskInstance:
test_reschedule.up-for-reschedule.task_2 2021-03-26 16:34:06.628707+00:00
[None]> dependency 'Trigger Rule' PASSED: False, Task's trigger rule
'all_success' requires all upstream tasks to have succeeded, but found 1
non-success(es). upstream_tasks_state={'total': 1, 'successes': 0, 'skipped':
0, 'failed': 0, 'upstream_failed': 0, 'done': 0}, upstream_task_ids={'task_1'}
{taskinstance.py:836}
[2021-03-26 16:34:11,410] [DEBUG] Dependencies not met for <TaskInstance:
test_reschedule.up-for-reschedule.task_2 2021-03-26 16:34:06.628707+00:00
[None]>, dependency 'Trigger Rule' FAILED: Task's trigger rule 'all_success'
requires all upstream tasks to have succeeded, but found 1 non-success(es).
upstream_tasks_state={'total': 1, 'successes': 0, 'skipped': 0, 'failed': 0,
'upstream_failed': 0, 'done': 0}, upstream_task_ids={'task_1'}
{taskinstance.py:816}
[2021-03-26 16:34:11,412] [DEBUG] Started a process (PID: 20702) to generate
tasks for
/opt/airflow/git-sync/airflow/up-for-reschedule/airflow_service/dag.py
{dag_processing.py:1000}
[2021-03-26 16:34:11,413] [DEBUG] 2/2 DAG parsing processes running
{dag_processing.py:983}
[2021-03-26 16:34:11,414] [DEBUG] <TaskInstance:
test_reschedule.up-for-reschedule.task_1 2021-03-26 16:34:06.628707+00:00
[queued]> dependency 'Previous Dagrun State' PASSED: True, The task did not
have depends_on_past set. {taskinstance.py:836}
[2021-03-26 16:34:11,415] [DEBUG] <TaskInstance:
test_reschedule.up-for-reschedule.task_1 2021-03-26 16:34:06.628707+00:00
[queued]> dependency 'Not In Retry Period' PASSED: True, The context specified
that being in a retry period was permitted. {taskinstance.py:836}
[2021-03-26 16:34:11,415] [DEBUG] <TaskInstance:
test_reschedule.up-for-reschedule.task_1 2021-03-26 16:34:06.628707+00:00
[queued]> dependency 'Trigger Rule' PASSED: True, The task instance did not
have any upstream tasks. {taskinstance.py:836}
[2021-03-26 16:34:11,415] [DEBUG] Dependencies all met for <TaskInstance:
test_reschedule.up-for-reschedule.task_1 2021-03-26 16:34:06.628707+00:00
[queued]> {taskinstance.py:826}
[2021-03-26 16:34:11,413] [DEBUG] 2 file paths queued for processing
{dag_processing.py:985}
[2021-03-26 16:34:11,415] [DEBUG] Skipping SLA check for <DAG:
test_reschedule.up-for-reschedule> because no tasks in DAG have SLAs
{scheduler_job.py:1720}
[2021-03-26 16:34:11,416] [DEBUG] Disposing DB connection pool (PID 20695)
{settings.py:290}
[2021-03-26 16:34:11,428] [DEBUG] No tasks to consider for execution.
{scheduler_job.py:933}
[2021-03-26 16:34:11,429] [DEBUG] 1 running task instances
{base_executor.py:147}
[2021-03-26 16:34:11,430] [DEBUG] 0 in queue {base_executor.py:148}
[2021-03-26 16:34:11,430] [DEBUG] 31 open slots {base_executor.py:149}
[2021-03-26 16:34:11,430] [DEBUG] Calling the <class
'airflow.executors.kubernetes_executor.KubernetesExecutor'> sync method
{base_executor.py:158}
[2021-03-26 16:34:11,430] [DEBUG] self.running:
{TaskInstanceKey(dag_id='test_reschedule.up-for-reschedule', task_id='task_1',
execution_date=datetime.datetime(2021, 3, 26, 16, 34, 6, 628707,
tzinfo=Timezone('UTC')), try_number=1)} {kubernetes_executor.py:524}
[2021-03-26 16:34:11,430] [DEBUG] Syncing KubernetesExecutor
{kubernetes_executor.py:337}
[2021-03-26 16:34:11,431] [DEBUG] KubeJobWatcher alive, continuing
{kubernetes_executor.py:263}
...
[2021-03-26 16:34:12,486] [DEBUG] number of tis tasks for <DagRun
test_reschedule.up-for-reschedule @ 2021-03-26 16:34:06.628707+00:00:
manual__2021-03-26T16:34:06.628707+00:00, externally triggered: True>: 3
task(s) {dagrun.py:490}
[2021-03-26 16:34:12,486] [DEBUG] number of scheduleable tasks for <DagRun
test_reschedule.up-for-reschedule @ 2021-03-26 16:34:06.628707+00:00:
manual__2021-03-26T16:34:06.628707+00:00, externally triggered: True>: 2
task(s) {dagrun.py:498}
[2021-03-26 16:34:12,486] [DEBUG] <TaskInstance:
test_reschedule.up-for-reschedule.task_3 2021-03-26 16:34:06.628707+00:00
[None]> dependency 'Previous Dagrun State' PASSED: True, The task did not have
depends_on_past set. {taskinstance.py:836}
[2021-03-26 16:34:12,486] [DEBUG] <TaskInstance:
test_reschedule.up-for-reschedule.task_3 2021-03-26 16:34:06.628707+00:00
[None]> dependency 'Not In Retry Period' PASSED: True, The task instance was
not marked for retrying. {taskinstance.py:836}
[2021-03-26 16:34:12,487] [DEBUG] <TaskInstance:
test_reschedule.up-for-reschedule.task_3 2021-03-26 16:34:06.628707+00:00
[None]> dependency 'Trigger Rule' PASSED: False, Task's trigger rule
'all_success' requires all upstream tasks to have succeeded, but found 1
non-success(es). upstream_tasks_state={'total': 1, 'successes': 0, 'skipped':
0, 'failed': 0, 'upstream_failed': 0, 'done': 0}, upstream_task_ids={'task_2'}
{taskinstance.py:836}
[2021-03-26 16:34:12,487] [DEBUG] Dependencies not met for <TaskInstance:
test_reschedule.up-for-reschedule.task_3 2021-03-26 16:34:06.628707+00:00
[None]>, dependency 'Trigger Rule' FAILED: Task's trigger rule 'all_success'
requires all upstream tasks to have succeeded, but found 1 non-success(es).
upstream_tasks_state={'total': 1, 'successes': 0, 'skipped': 0, 'failed': 0,
'upstream_failed': 0, 'done': 0}, upstream_task_ids={'task_2'}
{taskinstance.py:816}
[2021-03-26 16:34:12,487] [DEBUG] <TaskInstance:
test_reschedule.up-for-reschedule.task_2 2021-03-26 16:34:06.628707+00:00
[None]> dependency 'Previous Dagrun State' PASSED: True, The task did not have
depends_on_past set. {taskinstance.py:836}
[2021-03-26 16:34:12,488] [DEBUG] Processor for
/opt/airflow/git-sync/airflow/up-for-reschedule/airflow_service/dag.py finished
{dag_processing.py:949}
[2021-03-26 16:34:12,487] [DEBUG] <TaskInstance:
test_reschedule.up-for-reschedule.task_2 2021-03-26 16:34:06.628707+00:00
[None]> dependency 'Not In Retry Period' PASSED: True, The task instance was
not marked for retrying. {taskinstance.py:836}
[2021-03-26 16:34:12,489] [DEBUG] <TaskInstance:
test_reschedule.up-for-reschedule.task_2 2021-03-26 16:34:06.628707+00:00
[None]> dependency 'Trigger Rule' PASSED: False, Task's trigger rule
'all_success' requires all upstream tasks to have succeeded, but found 1
non-success(es). upstream_tasks_state={'total': 1, 'successes': 0, 'skipped':
0, 'failed': 0, 'upstream_failed': 0, 'done': 0}, upstream_task_ids={'task_1'}
{taskinstance.py:836}
[2021-03-26 16:34:12,489] [DEBUG] Waiting for <ForkProcess
name='DagFileProcessor2969-Process' pid=20873 parent=82 started>
{scheduler_job.py:309}
[2021-03-26 16:34:12,495] [DEBUG] Started a process (PID: 20884) to generate
tasks for
/opt/airflow/git-sync/airflow/up-for-reschedule/airflow_service/utils.py
{dag_processing.py:1000}
[2021-03-26 16:34:12,490] [DEBUG] Dependencies not met for <TaskInstance:
test_reschedule.up-for-reschedule.task_2 2021-03-26 16:34:06.628707+00:00
[None]>, dependency 'Trigger Rule' FAILED: Task's trigger rule 'all_success'
requires all upstream tasks to have succeeded, but found 1 non-success(es).
upstream_tasks_state={'total': 1, 'successes': 0, 'skipped': 0, 'failed': 0,
'upstream_failed': 0, 'done': 0}, upstream_task_ids={'task_1'}
{taskinstance.py:816}
[2021-03-26 16:34:12,497] [DEBUG] 2/2 DAG parsing processes running
{dag_processing.py:983}
[2021-03-26 16:34:12,497] [DEBUG] 0 file paths queued for processing
{dag_processing.py:985}
[2021-03-26 16:34:12,509] [DEBUG] <TaskInstance:
test_reschedule.up-for-reschedule.task_1 2021-03-26 16:34:06.628707+00:00
[queued]> dependency 'Previous Dagrun State' PASSED: True, The task did not
have depends_on_past set. {taskinstance.py:836}
[2021-03-26 16:34:12,509] [DEBUG] <TaskInstance:
test_reschedule.up-for-reschedule.task_1 2021-03-26 16:34:06.628707+00:00
[queued]> dependency 'Not In Retry Period' PASSED: True, The context specified
that being in a retry period was permitted. {taskinstance.py:836}
[2021-03-26 16:34:12,509] [DEBUG] <TaskInstance:
test_reschedule.up-for-reschedule.task_1 2021-03-26 16:34:06.628707+00:00
[queued]> dependency 'Trigger Rule' PASSED: True, The task instance did not
have any upstream tasks. {taskinstance.py:836}
[2021-03-26 16:34:12,509] [DEBUG] Dependencies all met for <TaskInstance:
test_reschedule.up-for-reschedule.task_1 2021-03-26 16:34:06.628707+00:00
[queued]> {taskinstance.py:826}
[2021-03-26 16:34:12,510] [DEBUG] Skipping SLA check for <DAG:
test_reschedule.up-for-reschedule> because no tasks in DAG have SLAs
{scheduler_job.py:1720}
[2021-03-26 16:34:12,529] [DEBUG] No tasks to consider for execution.
{scheduler_job.py:933}
[2021-03-26 16:34:12,531] [DEBUG] 1 running task instances
{base_executor.py:147}
[2021-03-26 16:34:12,531] [DEBUG] 0 in queue {base_executor.py:148}
[2021-03-26 16:34:12,534] [DEBUG] 31 open slots {base_executor.py:149}
[2021-03-26 16:34:12,534] [DEBUG] Disposing DB connection pool (PID 20884)
{settings.py:290}
[2021-03-26 16:34:12,534] [DEBUG] Calling the <class
'airflow.executors.kubernetes_executor.KubernetesExecutor'> sync method
{base_executor.py:158}
[2021-03-26 16:34:12,534] [DEBUG] self.running:
{TaskInstanceKey(dag_id='test_reschedule.up-for-reschedule', task_id='task_1',
execution_date=datetime.datetime(2021, 3, 26, 16, 34, 6, 628707,
tzinfo=Timezone('UTC')), try_number=1)} {kubernetes_executor.py:524}
[2021-03-26 16:34:12,534] [DEBUG] Syncing KubernetesExecutor
{kubernetes_executor.py:337}
[2021-03-26 16:34:12,534] [DEBUG] KubeJobWatcher alive, continuing
{kubernetes_executor.py:263}
...
[2021-03-26 16:34:48,998] [INFO] Event:
testrescheduleupforrescheduletask1-90fe939148314774bfcc61079113f76b had an
event of type MODIFIED {kubernetes_executor.py:147}
[2021-03-26 16:34:49,004] [INFO] Event:
testrescheduleupforrescheduletask1-90fe939148314774bfcc61079113f76b Pending
{kubernetes_executor.py:202}
...
[2021-03-26 16:34:49,024] [INFO] Event:
testrescheduleupforrescheduletask1-90fe939148314774bfcc61079113f76b had an
event of type DELETED {kubernetes_executor.py:147}
[2021-03-26 16:34:49,024] [INFO] Event: Failed to start pod
testrescheduleupforrescheduletask1-90fe939148314774bfcc61079113f76b, will
reschedule {kubernetes_executor.py:197}
...
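(The key sequence above is: the pod reports `Pending`, then a `DELETED` event arrives before it ever runs, and the watcher logs "Failed to start pod ..., will reschedule". The sketch below is an illustrative reduction of that decision, not Airflow's `KubernetesJobWatcher` code.)

```python
# Hedged sketch: how a watcher might classify the pod-event sequence seen
# above. A pod deleted while still Pending never started its task, so the
# only safe outcome is to reschedule it.

def classify_pod_event(event_type: str, pod_phase: str) -> str:
    # DELETED before leaving Pending: the container never launched.
    if event_type == "DELETED" and pod_phase == "Pending":
        return "reschedule"
    if pod_phase == "Succeeded":
        return "success"
    if pod_phase == "Failed":
        return "failed"
    return "pending"

# The logged sequence: MODIFIED while Pending, then DELETED while Pending.
print(classify_pod_event("MODIFIED", "Pending"))  # pending
print(classify_pod_event("DELETED", "Pending"))   # reschedule
```

In the log this "reschedule" outcome is what later surfaces as the `up_for_reschedule` state handed back to the scheduler.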
[2021-03-26 16:34:49,372] [DEBUG] DAG test_reschedule.up-for-reschedule not
changed structure, skipping dagrun.verify_integrity {scheduler_job.py:1692}
[2021-03-26 16:34:49,380] [DEBUG] number of tis tasks for <DagRun
test_reschedule.up-for-reschedule @ 2021-03-26 16:34:06.628707+00:00:
manual__2021-03-26T16:34:06.628707+00:00, externally triggered: True>: 3
task(s) {dagrun.py:490}
[2021-03-26 16:34:49,380] [DEBUG] number of scheduleable tasks for <DagRun
test_reschedule.up-for-reschedule @ 2021-03-26 16:34:06.628707+00:00:
manual__2021-03-26T16:34:06.628707+00:00, externally triggered: True>: 2
task(s) {dagrun.py:498}
[2021-03-26 16:34:49,380] [DEBUG] <TaskInstance:
test_reschedule.up-for-reschedule.task_3 2021-03-26 16:34:06.628707+00:00
[None]> dependency 'Previous Dagrun State' PASSED: True, The task did not have
depends_on_past set. {taskinstance.py:836}
[2021-03-26 16:34:49,381] [DEBUG] <TaskInstance:
test_reschedule.up-for-reschedule.task_3 2021-03-26 16:34:06.628707+00:00
[None]> dependency 'Not In Retry Period' PASSED: True, The task instance was
not marked for retrying. {taskinstance.py:836}
[2021-03-26 16:34:49,381] [DEBUG] <TaskInstance:
test_reschedule.up-for-reschedule.task_3 2021-03-26 16:34:06.628707+00:00
[None]> dependency 'Trigger Rule' PASSED: False, Task's trigger rule
'all_success' requires all upstream tasks to have succeeded, but found 1
non-success(es). upstream_tasks_state={'total': 1, 'successes': 0, 'skipped':
0, 'failed': 0, 'upstream_failed': 0, 'done': 0}, upstream_task_ids={'task_2'}
{taskinstance.py:836}
[2021-03-26 16:34:49,381] [DEBUG] Dependencies not met for <TaskInstance:
test_reschedule.up-for-reschedule.task_3 2021-03-26 16:34:06.628707+00:00
[None]>, dependency 'Trigger Rule' FAILED: Task's trigger rule 'all_success'
requires all upstream tasks to have succeeded, but found 1 non-success(es).
upstream_tasks_state={'total': 1, 'successes': 0, 'skipped': 0, 'failed': 0,
'upstream_failed': 0, 'done': 0}, upstream_task_ids={'task_2'}
{taskinstance.py:816}
[2021-03-26 16:34:49,381] [DEBUG] <TaskInstance:
test_reschedule.up-for-reschedule.task_2 2021-03-26 16:34:06.628707+00:00
[None]> dependency 'Previous Dagrun State' PASSED: True, The task did not have
depends_on_past set. {taskinstance.py:836}
[2021-03-26 16:34:49,382] [DEBUG] <TaskInstance:
test_reschedule.up-for-reschedule.task_2 2021-03-26 16:34:06.628707+00:00
[None]> dependency 'Not In Retry Period' PASSED: True, The task instance was
not marked for retrying. {taskinstance.py:836}
[2021-03-26 16:34:49,382] [DEBUG] <TaskInstance:
test_reschedule.up-for-reschedule.task_2 2021-03-26 16:34:06.628707+00:00
[None]> dependency 'Trigger Rule' PASSED: False, Task's trigger rule
'all_success' requires all upstream tasks to have succeeded, but found 1
non-success(es). upstream_tasks_state={'total': 1, 'successes': 0, 'skipped':
0, 'failed': 0, 'upstream_failed': 0, 'done': 0}, upstream_task_ids={'task_1'}
{taskinstance.py:836}
[2021-03-26 16:34:49,382] [DEBUG] Dependencies not met for <TaskInstance:
test_reschedule.up-for-reschedule.task_2 2021-03-26 16:34:06.628707+00:00
[None]>, dependency 'Trigger Rule' FAILED: Task's trigger rule 'all_success'
requires all upstream tasks to have succeeded, but found 1 non-success(es).
upstream_tasks_state={'total': 1, 'successes': 0, 'skipped': 0, 'failed': 0,
'upstream_failed': 0, 'done': 0}, upstream_task_ids={'task_1'}
{taskinstance.py:816}
[2021-03-26 16:34:49,388] [DEBUG] Disposing DB connection pool (PID 27073)
{settings.py:290}
[2021-03-26 16:34:49,390] [DEBUG] Processor for
/opt/airflow/git-sync/airflow/up-for-reschedule/airflow_service/final_task.py
finished {dag_processing.py:949}
[2021-03-26 16:34:49,390] [DEBUG] <TaskInstance:
test_reschedule.up-for-reschedule.task_1 2021-03-26 16:34:06.628707+00:00
[queued]> dependency 'Previous Dagrun State' PASSED: True, The task did not
have depends_on_past set. {taskinstance.py:836}
[2021-03-26 16:34:49,391] [DEBUG] Waiting for <ForkProcess
name='DagFileProcessor3853-Process' pid=27062 parent=82 stopped exitcode=0>
{scheduler_job.py:309}
[2021-03-26 16:34:49,391] [DEBUG] <TaskInstance:
test_reschedule.up-for-reschedule.task_1 2021-03-26 16:34:06.628707+00:00
[queued]> dependency 'Not In Retry Period' PASSED: True, The context specified
that being in a retry period was permitted. {taskinstance.py:836}
[2021-03-26 16:34:49,391] [DEBUG] <TaskInstance:
test_reschedule.up-for-reschedule.task_1 2021-03-26 16:34:06.628707+00:00
[queued]> dependency 'Trigger Rule' PASSED: True, The task instance did not
have any upstream tasks. {taskinstance.py:836}
[2021-03-26 16:34:49,391] [DEBUG] Dependencies all met for <TaskInstance:
test_reschedule.up-for-reschedule.task_1 2021-03-26 16:34:06.628707+00:00
[queued]> {taskinstance.py:826}
[2021-03-26 16:34:49,395] [DEBUG] Started a process (PID: 27082) to generate
tasks for
/opt/airflow/git-sync/airflow/up-for-reschedule/airflow_service/dag.py
{dag_processing.py:1000}
[2021-03-26 16:34:49,403] [DEBUG] Skipping SLA check for <DAG:
test_reschedule.up-for-reschedule> because no tasks in DAG have SLAs
{scheduler_job.py:1720}
[2021-03-26 16:34:49,404] [DEBUG] 2/2 DAG parsing processes running
{dag_processing.py:983}
[2021-03-26 16:34:49,404] [DEBUG] 2 file paths queued for processing
{dag_processing.py:985}
[2021-03-26 16:34:49,411] [DEBUG] Processor for
/opt/airflow/git-sync/airflow/up-for-reschedule/airflow_service/loggers/log_config.py
finished {dag_processing.py:949}
[2021-03-26 16:34:49,412] [DEBUG] Waiting for <ForkProcess
name='DagFileProcessor3855-Process' pid=27073 parent=82 stopped exitcode=0>
{scheduler_job.py:309}
[2021-03-26 16:34:49,416] [DEBUG] Started a process (PID: 27087) to generate
tasks for
/opt/airflow/git-sync/data-import/reschedule/deployment/airflow_dags/dag.py
{dag_processing.py:1000}
[2021-03-26 16:34:49,417] [DEBUG] 2/2 DAG parsing processes running
{dag_processing.py:983}
[2021-03-26 16:34:49,417] [DEBUG] 1 file paths queued for processing
{dag_processing.py:985}
[2021-03-26 16:34:49,419] [DEBUG] No tasks to consider for execution.
{scheduler_job.py:933}
[2021-03-26 16:34:49,420] [DEBUG] 1 running task instances
{base_executor.py:147}
[2021-03-26 16:34:49,420] [DEBUG] 0 in queue {base_executor.py:148}
[2021-03-26 16:34:49,420] [DEBUG] 31 open slots {base_executor.py:149}
[2021-03-26 16:34:49,420] [DEBUG] Calling the <class
'airflow.executors.kubernetes_executor.KubernetesExecutor'> sync method
{base_executor.py:158}
[2021-03-26 16:34:49,421] [DEBUG] self.running:
{TaskInstanceKey(dag_id='test_reschedule.up-for-reschedule', task_id='task_1',
execution_date=datetime.datetime(2021, 3, 26, 16, 34, 6, 628707,
tzinfo=Timezone('UTC')), try_number=1)} {kubernetes_executor.py:524}
[2021-03-26 16:34:49,421] [DEBUG] Syncing KubernetesExecutor
{kubernetes_executor.py:337}
[2021-03-26 16:34:49,421] [DEBUG] KubeJobWatcher alive, continuing
{kubernetes_executor.py:263}
[2021-03-26 16:34:49,421] [DEBUG] Processing task
('testrescheduleupforrescheduletask1-90fe939148314774bfcc61079113f76b',
'et-airflow-qa', 'up_for_reschedule', {'dag_id':
'test_reschedule.up-for-reschedule', 'task_id': 'task_1', 'execution_date':
'2021-03-26T16:34:06.628707+00:00', 'try_number': '1'}, '137947551')
{kubernetes_executor.py:343}
[2021-03-26 16:34:49,421] [INFO] Attempting to finish pod; pod_id:
testrescheduleupforrescheduletask1-90fe939148314774bfcc61079113f76b; state:
up_for_reschedule; annotations: {'dag_id': 'test_reschedule.up-for-reschedule',
'task_id': 'task_1', 'execution_date': '2021-03-26T16:34:06.628707+00:00',
'try_number': '1'} {kubernetes_executor.py:353}
[2021-03-26 16:34:49,421] [DEBUG] Creating task key for annotations
{'dag_id': 'test_reschedule.up-for-reschedule', 'task_id': 'task_1',
'execution_date': '2021-03-26T16:34:06.628707+00:00', 'try_number': '1'}
{kubernetes_executor.py:362}
[2021-03-26 16:34:49,422] [DEBUG] finishing job
TaskInstanceKey(dag_id='test_reschedule.up-for-reschedule', task_id='task_1',
execution_date=datetime.datetime(2021, 3, 26, 16, 34, 6, 628707,
tzinfo=tzlocal()), try_number=1) - up_for_reschedule
(testrescheduleupforrescheduletask1-90fe939148314774bfcc61079113f76b)
{kubernetes_executor.py:358}
[2021-03-26 16:34:49,423] [INFO] Changing state of
(TaskInstanceKey(dag_id='test_reschedule.up-for-reschedule', task_id='task_1',
execution_date=datetime.datetime(2021, 3, 26, 16, 34, 6, 628707,
tzinfo=tzlocal()), try_number=1), 'up_for_reschedule',
'testrescheduleupforrescheduletask1-90fe939148314774bfcc61079113f76b',
'et-airflow-qa', '137947551') to up_for_reschedule {kubernetes_executor.py:546}
[2021-03-26 16:34:49,424] [INFO] Executor reports execution of
test_reschedule.up-for-reschedule.task_1 execution_date=2021-03-26
16:34:06.628707+00:00 exited with status up_for_reschedule for try_number 1
{scheduler_job.py:1193}
...
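(The "Creating task key for annotations" and "Changing state of (TaskInstanceKey...)" entries above show the executor rebuilding a task key from the pod's annotations so it can report `up_for_reschedule` back to the scheduler. A minimal sketch of that mapping, using a `NamedTuple` as a stand-in for Airflow's `TaskInstanceKey`; field names follow the logged output.)

```python
# Hedged sketch: reconstruct a task key from the pod annotations logged
# above. Illustrative only; not Airflow's kubernetes_executor code.
from datetime import datetime
from typing import NamedTuple

class TaskInstanceKey(NamedTuple):
    dag_id: str
    task_id: str
    execution_date: datetime
    try_number: int

def key_from_annotations(annotations: dict) -> TaskInstanceKey:
    return TaskInstanceKey(
        dag_id=annotations["dag_id"],
        task_id=annotations["task_id"],
        # The annotation stores an ISO-8601 timestamp with a UTC offset.
        execution_date=datetime.fromisoformat(annotations["execution_date"]),
        try_number=int(annotations["try_number"]),
    )

# The annotations exactly as they appear in the log:
annotations = {"dag_id": "test_reschedule.up-for-reschedule",
               "task_id": "task_1",
               "execution_date": "2021-03-26T16:34:06.628707+00:00",
               "try_number": "1"}
key = key_from_annotations(annotations)
print(key.task_id, key.try_number)  # task_1 1
```

The reconstructed key is what the executor uses in the subsequent "Changing state of (...) to up_for_reschedule" entry.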
[2021-03-26 16:34:49,465] [DEBUG] DAG test_reschedule.up-for-reschedule not
changed structure, skipping dagrun.verify_integrity {scheduler_job.py:1692}
[2021-03-26 16:34:49,473] [DEBUG] number of tis tasks for <DagRun
test_reschedule.up-for-reschedule @ 2021-03-26 16:34:06.628707+00:00:
manual__2021-03-26T16:34:06.628707+00:00, externally triggered: True>: 3
task(s) {dagrun.py:490}
[2021-03-26 16:34:49,474] [DEBUG] number of scheduleable tasks for <DagRun
test_reschedule.up-for-reschedule @ 2021-03-26 16:34:06.628707+00:00:
manual__2021-03-26T16:34:06.628707+00:00, externally triggered: True>: 2
task(s) {dagrun.py:498}
[2021-03-26 16:34:49,474] [DEBUG] <TaskInstance:
test_reschedule.up-for-reschedule.task_3 2021-03-26 16:34:06.628707+00:00
[None]> dependency 'Previous Dagrun State' PASSED: True, The task did not have
depends_on_past set. {taskinstance.py:836}
[2021-03-26 16:34:49,474] [DEBUG] <TaskInstance:
test_reschedule.up-for-reschedule.task_3 2021-03-26 16:34:06.628707+00:00
[None]> dependency 'Not In Retry Period' PASSED: True, The task instance was
not marked for retrying. {taskinstance.py:836}
[2021-03-26 16:34:49,474] [DEBUG] <TaskInstance:
test_reschedule.up-for-reschedule.task_3 2021-03-26 16:34:06.628707+00:00
[None]> dependency 'Trigger Rule' PASSED: False, Task's trigger rule
'all_success' requires all upstream tasks to have succeeded, but found 1
non-success(es). upstream_tasks_state={'total': 1, 'successes': 0, 'skipped':
0, 'failed': 0, 'upstream_failed': 0, 'done': 0}, upstream_task_ids={'task_2'}
{taskinstance.py:836}
[2021-03-26 16:34:49,474] [DEBUG] Dependencies not met for <TaskInstance:
test_reschedule.up-for-reschedule.task_3 2021-03-26 16:34:06.628707+00:00
[None]>, dependency 'Trigger Rule' FAILED: Task's trigger rule 'all_success'
requires all upstream tasks to have succeeded, but found 1 non-success(es).
upstream_tasks_state={'total': 1, 'successes': 0, 'skipped': 0, 'failed': 0,
'upstream_failed': 0, 'done': 0}, upstream_task_ids={'task_2'}
{taskinstance.py:816}
[2021-03-26 16:34:49,475] [DEBUG] <TaskInstance:
test_reschedule.up-for-reschedule.task_2 2021-03-26 16:34:06.628707+00:00
[None]> dependency 'Previous Dagrun State' PASSED: True, The task did not have
depends_on_past set. {taskinstance.py:836}
[2021-03-26 16:34:49,475] [DEBUG] <TaskInstance:
test_reschedule.up-for-reschedule.task_2 2021-03-26 16:34:06.628707+00:00
[None]> dependency 'Not In Retry Period' PASSED: True, The task instance was
not marked for retrying. {taskinstance.py:836}
[2021-03-26 16:34:49,475] [DEBUG] <TaskInstance:
test_reschedule.up-for-reschedule.task_2 2021-03-26 16:34:06.628707+00:00
[None]> dependency 'Trigger Rule' PASSED: False, Task's trigger rule
'all_success' requires all upstream tasks to have succeeded, but found 1
non-success(es). upstream_tasks_state={'total': 1, 'successes': 0, 'skipped':
0, 'failed': 0, 'upstream_failed': 0, 'done': 0}, upstream_task_ids={'task_1'}
{taskinstance.py:836}
[2021-03-26 16:34:49,475] [DEBUG] Dependencies not met for <TaskInstance:
test_reschedule.up-for-reschedule.task_2 2021-03-26 16:34:06.628707+00:00
[None]>, dependency 'Trigger Rule' FAILED: Task's trigger rule 'all_success'
requires all upstream tasks to have succeeded, but found 1 non-success(es).
upstream_tasks_state={'total': 1, 'successes': 0, 'skipped': 0, 'failed': 0,
'upstream_failed': 0, 'done': 0}, upstream_task_ids={'task_1'}
{taskinstance.py:816}
[2021-03-26 16:34:49,475] [DEBUG] Processor for
/opt/airflow/git-sync/airflow/up-for-reschedule/airflow_service/dag.py finished
{dag_processing.py:949}
[2021-03-26 16:34:49,476] [DEBUG] Waiting for <ForkProcess
name='DagFileProcessor3856-Process' pid=27082 parent=82 stopped exitcode=0>
{scheduler_job.py:309}
[2021-03-26 16:34:49,480] [DEBUG] <TaskInstance:
test_reschedule.up-for-reschedule.task_1 2021-03-26 16:34:06.628707+00:00
[queued]> dependency 'Previous Dagrun State' PASSED: True, The task did not
have depends_on_past set. {taskinstance.py:836}
[2021-03-26 16:34:49,480] [DEBUG] Started a process (PID: 27094) to generate
tasks for
/opt/airflow/git-sync/airflow/up-for-reschedule/airflow_service/utils.py
{dag_processing.py:1000}
[2021-03-26 16:34:49,480] [DEBUG] <TaskInstance:
test_reschedule.up-for-reschedule.task_1 2021-03-26 16:34:06.628707+00:00
[queued]> dependency 'Not In Retry Period' PASSED: True, The context specified
that being in a retry period was permitted. {taskinstance.py:836}
[2021-03-26 16:34:49,480] [DEBUG] <TaskInstance:
test_reschedule.up-for-reschedule.task_1 2021-03-26 16:34:06.628707+00:00
[queued]> dependency 'Trigger Rule' PASSED: True, The task instance did not
have any upstream tasks. {taskinstance.py:836}
[2021-03-26 16:34:49,480] [DEBUG] Dependencies all met for <TaskInstance:
test_reschedule.up-for-reschedule.task_1 2021-03-26 16:34:06.628707+00:00
[queued]> {taskinstance.py:826}
[2021-03-26 16:34:49,480] [DEBUG] 2/2 DAG parsing processes running
{dag_processing.py:983}
[2021-03-26 16:34:49,481] [DEBUG] 0 file paths queued for processing
{dag_processing.py:985}
[2021-03-26 16:34:49,481] [DEBUG] Skipping SLA check for <DAG:
test_reschedule.up-for-reschedule> because no tasks in DAG have SLAs
{scheduler_job.py:1720}
[2021-03-26 16:34:49,512] [DEBUG] No tasks to consider for execution.
{scheduler_job.py:933}
[2021-03-26 16:34:49,513] [DEBUG] 0 running task instances
{base_executor.py:147}
[2021-03-26 16:34:49,514] [DEBUG] 0 in queue {base_executor.py:148}
[2021-03-26 16:34:49,514] [DEBUG] 32 open slots {base_executor.py:149}
[2021-03-26 16:34:49,514] [DEBUG] Calling the <class
'airflow.executors.kubernetes_executor.KubernetesExecutor'> sync method
{base_executor.py:158}
[2021-03-26 16:34:49,514] [DEBUG] Syncing KubernetesExecutor
{kubernetes_executor.py:337}
[2021-03-26 16:34:49,515] [DEBUG] KubeJobWatcher alive, continuing
{kubernetes_executor.py:263}
...
[2021-03-26 16:34:50,558] [DEBUG] DAG test_reschedule.up-for-reschedule not
changed structure, skipping dagrun.verify_integrity {scheduler_job.py:1692}
[2021-03-26 16:34:50,567] [DEBUG] number of tis tasks for <DagRun
test_reschedule.up-for-reschedule @ 2021-03-26 16:34:06.628707+00:00:
manual__2021-03-26T16:34:06.628707+00:00, externally triggered: True>: 3
task(s) {dagrun.py:490}
[2021-03-26 16:34:50,567] [DEBUG] number of scheduleable tasks for <DagRun
test_reschedule.up-for-reschedule @ 2021-03-26 16:34:06.628707+00:00:
manual__2021-03-26T16:34:06.628707+00:00, externally triggered: True>: 2
task(s) {dagrun.py:498}
[2021-03-26 16:34:50,568] [DEBUG] <TaskInstance:
test_reschedule.up-for-reschedule.task_3 2021-03-26 16:34:06.628707+00:00
[None]> dependency 'Previous Dagrun State' PASSED: True, The task did not have
depends_on_past set. {taskinstance.py:836}
[2021-03-26 16:34:50,568] [DEBUG] <TaskInstance:
test_reschedule.up-for-reschedule.task_3 2021-03-26 16:34:06.628707+00:00
[None]> dependency 'Not In Retry Period' PASSED: True, The task instance was
not marked for retrying. {taskinstance.py:836}
[2021-03-26 16:34:50,569] [DEBUG] <TaskInstance:
test_reschedule.up-for-reschedule.task_3 2021-03-26 16:34:06.628707+00:00
[None]> dependency 'Trigger Rule' PASSED: False, Task's trigger rule
'all_success' requires all upstream tasks to have succeeded, but found 1
non-success(es). upstream_tasks_state={'total': 1, 'successes': 0, 'skipped':
0, 'failed': 0, 'upstream_failed': 0, 'done': 0}, upstream_task_ids={'task_2'}
{taskinstance.py:836}
[2021-03-26 16:34:50,569] [DEBUG] Dependencies not met for <TaskInstance:
test_reschedule.up-for-reschedule.task_3 2021-03-26 16:34:06.628707+00:00
[None]>, dependency 'Trigger Rule' FAILED: Task's trigger rule 'all_success'
requires all upstream tasks to have succeeded, but found 1 non-success(es).
upstream_tasks_state={'total': 1, 'successes': 0, 'skipped': 0, 'failed': 0,
'upstream_failed': 0, 'done': 0}, upstream_task_ids={'task_2'}
{taskinstance.py:816}
[2021-03-26 16:34:50,569] [DEBUG] <TaskInstance:
test_reschedule.up-for-reschedule.task_2 2021-03-26 16:34:06.628707+00:00
[None]> dependency 'Previous Dagrun State' PASSED: True, The task did not have
depends_on_past set. {taskinstance.py:836}
[2021-03-26 16:34:50,570] [DEBUG] <TaskInstance:
test_reschedule.up-for-reschedule.task_2 2021-03-26 16:34:06.628707+00:00
[None]> dependency 'Not In Retry Period' PASSED: True, The task instance was
not marked for retrying. {taskinstance.py:836}
[2021-03-26 16:34:50,570] [DEBUG] <TaskInstance:
test_reschedule.up-for-reschedule.task_2 2021-03-26 16:34:06.628707+00:00
[None]> dependency 'Trigger Rule' PASSED: False, Task's trigger rule
'all_success' requires all upstream tasks to have succeeded, but found 1
non-success(es). upstream_tasks_state={'total': 1, 'successes': 0, 'skipped':
0, 'failed': 0, 'upstream_failed': 0, 'done': 0}, upstream_task_ids={'task_1'}
{taskinstance.py:836}
[2021-03-26 16:34:50,570] [DEBUG] Dependencies not met for <TaskInstance:
test_reschedule.up-for-reschedule.task_2 2021-03-26 16:34:06.628707+00:00
[None]>, dependency 'Trigger Rule' FAILED: Task's trigger rule 'all_success'
requires all upstream tasks to have succeeded, but found 1 non-success(es).
upstream_tasks_state={'total': 1, 'successes': 0, 'skipped': 0, 'failed': 0,
'upstream_failed': 0, 'done': 0}, upstream_task_ids={'task_1'}
{taskinstance.py:816}
[2021-03-26 16:34:50,575] [DEBUG] Disposing DB connection pool (PID 27269)
{settings.py:290}
[2021-03-26 16:34:50,575] [DEBUG] <TaskInstance:
test_reschedule.up-for-reschedule.task_1 2021-03-26 16:34:06.628707+00:00
[queued]> dependency 'Previous Dagrun State' PASSED: True, The task did not
have depends_on_past set. {taskinstance.py:836}
[2021-03-26 16:34:50,575] [DEBUG] <TaskInstance:
test_reschedule.up-for-reschedule.task_1 2021-03-26 16:34:06.628707+00:00
[queued]> dependency 'Not In Retry Period' PASSED: True, The context specified
that being in a retry period was permitted. {taskinstance.py:836}
[2021-03-26 16:34:50,575] [DEBUG] <TaskInstance:
test_reschedule.up-for-reschedule.task_1 2021-03-26 16:34:06.628707+00:00
[queued]> dependency 'Trigger Rule' PASSED: True, The task instance did not
have any upstream tasks. {taskinstance.py:836}
[2021-03-26 16:34:50,575] [DEBUG] Dependencies all met for <TaskInstance:
test_reschedule.up-for-reschedule.task_1 2021-03-26 16:34:06.628707+00:00
[queued]> {taskinstance.py:826}
[2021-03-26 16:34:50,576] [DEBUG] Skipping SLA check for <DAG:
test_reschedule.up-for-reschedule> because no tasks in DAG have SLAs
{scheduler_job.py:1720}
[2021-03-26 16:34:50,584] [DEBUG] Processor for
/opt/airflow/git-sync/airflow/up-for-reschedule/airflow_service/loggers/log_config.py
finished {dag_processing.py:949}
[2021-03-26 16:34:50,584] [DEBUG] Waiting for <ForkProcess
name='DagFileProcessor3883-Process' pid=27269 parent=82 started>
{scheduler_job.py:309}
[2021-03-26 16:34:50,590] [DEBUG] No tasks to consider for execution.
{scheduler_job.py:933}
[2021-03-26 16:34:50,604] [DEBUG] 0 running task instances
{base_executor.py:147}
[2021-03-26 16:34:50,604] [DEBUG] 0 in queue {base_executor.py:148}
[2021-03-26 16:34:50,604] [DEBUG] 32 open slots {base_executor.py:149}
[2021-03-26 16:34:50,604] [DEBUG] Calling the <class
'airflow.executors.kubernetes_executor.KubernetesExecutor'> sync method
{base_executor.py:158}
[2021-03-26 16:34:50,604] [DEBUG] Syncing KubernetesExecutor
{kubernetes_executor.py:337}
[2021-03-26 16:34:50,605] [DEBUG] KubeJobWatcher alive, continuing
{kubernetes_executor.py:263}
...
[2021-03-26 16:34:52,281] [DEBUG] DAG test_reschedule.up-for-reschedule not
changed structure, skipping dagrun.verify_integrity {scheduler_job.py:1692}
[2021-03-26 16:34:52,290] [DEBUG] number of tis tasks for <DagRun
test_reschedule.up-for-reschedule @ 2021-03-26 16:34:06.628707+00:00:
manual__2021-03-26T16:34:06.628707+00:00, externally triggered: True>: 3
task(s) {dagrun.py:490}
[2021-03-26 16:34:52,290] [DEBUG] number of scheduleable tasks for <DagRun
test_reschedule.up-for-reschedule @ 2021-03-26 16:34:06.628707+00:00:
manual__2021-03-26T16:34:06.628707+00:00, externally triggered: True>: 2
task(s) {dagrun.py:498}
[2021-03-26 16:34:52,290] [DEBUG] <TaskInstance:
test_reschedule.up-for-reschedule.task_3 2021-03-26 16:34:06.628707+00:00
[None]> dependency 'Previous Dagrun State' PASSED: True, The task did not have
depends_on_past set. {taskinstance.py:836}
[2021-03-26 16:34:52,290] [DEBUG] <TaskInstance:
test_reschedule.up-for-reschedule.task_3 2021-03-26 16:34:06.628707+00:00
[None]> dependency 'Not In Retry Period' PASSED: True, The task instance was
not marked for retrying. {taskinstance.py:836}
[2021-03-26 16:34:52,290] [DEBUG] <TaskInstance:
test_reschedule.up-for-reschedule.task_3 2021-03-26 16:34:06.628707+00:00
[None]> dependency 'Trigger Rule' PASSED: False, Task's trigger rule
'all_success' requires all upstream tasks to have succeeded, but found 1
non-success(es). upstream_tasks_state={'total': 1, 'successes': 0, 'skipped':
0, 'failed': 0, 'upstream_failed': 0, 'done': 0}, upstream_task_ids={'task_2'}
{taskinstance.py:836}
[2021-03-26 16:34:52,291] [DEBUG] Dependencies not met for <TaskInstance:
test_reschedule.up-for-reschedule.task_3 2021-03-26 16:34:06.628707+00:00
[None]>, dependency 'Trigger Rule' FAILED: Task's trigger rule 'all_success'
requires all upstream tasks to have succeeded, but found 1 non-success(es).
upstream_tasks_state={'total': 1, 'successes': 0, 'skipped': 0, 'failed': 0,
'upstream_failed': 0, 'done': 0}, upstream_task_ids={'task_2'}
{taskinstance.py:816}
[2021-03-26 16:34:52,291] [DEBUG] <TaskInstance:
test_reschedule.up-for-reschedule.task_2 2021-03-26 16:34:06.628707+00:00
[None]> dependency 'Previous Dagrun State' PASSED: True, The task did not have
depends_on_past set. {taskinstance.py:836}
[2021-03-26 16:34:52,291] [DEBUG] <TaskInstance:
test_reschedule.up-for-reschedule.task_2 2021-03-26 16:34:06.628707+00:00
[None]> dependency 'Not In Retry Period' PASSED: True, The task instance was
not marked for retrying. {taskinstance.py:836}
[2021-03-26 16:34:52,291] [DEBUG] <TaskInstance:
test_reschedule.up-for-reschedule.task_2 2021-03-26 16:34:06.628707+00:00
[None]> dependency 'Trigger Rule' PASSED: False, Task's trigger rule
'all_success' requires all upstream tasks to have succeeded, but found 1
non-success(es). upstream_tasks_state={'total': 1, 'successes': 0, 'skipped':
0, 'failed': 0, 'upstream_failed': 0, 'done': 0}, upstream_task_ids={'task_1'}
{taskinstance.py:836}
[2021-03-26 16:34:52,304] [DEBUG] Disposing DB connection pool (PID 27542)
{settings.py:290}
[2021-03-26 16:34:52,306] [DEBUG] Dependencies not met for <TaskInstance:
test_reschedule.up-for-reschedule.task_2 2021-03-26 16:34:06.628707+00:00
[None]>, dependency 'Trigger Rule' FAILED: Task's trigger rule 'all_success'
requires all upstream tasks to have succeeded, but found 1 non-success(es).
upstream_tasks_state={'total': 1, 'successes': 0, 'skipped': 0, 'failed': 0,
'upstream_failed': 0, 'done': 0}, upstream_task_ids={'task_1'}
{taskinstance.py:816}
[2021-03-26 16:34:52,311] [DEBUG] <TaskInstance:
test_reschedule.up-for-reschedule.task_1 2021-03-26 16:34:06.628707+00:00
[queued]> dependency 'Previous Dagrun State' PASSED: True, The task did not
have depends_on_past set. {taskinstance.py:836}
[2021-03-26 16:34:52,311] [DEBUG] <TaskInstance:
test_reschedule.up-for-reschedule.task_1 2021-03-26 16:34:06.628707+00:00
[queued]> dependency 'Not In Retry Period' PASSED: True, The context specified
that being in a retry period was permitted. {taskinstance.py:836}
[2021-03-26 16:34:52,311] [DEBUG] <TaskInstance:
test_reschedule.up-for-reschedule.task_1 2021-03-26 16:34:06.628707+00:00
[queued]> dependency 'Trigger Rule' PASSED: True, The task instance did not
have any upstream tasks. {taskinstance.py:836}
[2021-03-26 16:34:52,311] [DEBUG] Dependencies all met for <TaskInstance:
test_reschedule.up-for-reschedule.task_1 2021-03-26 16:34:06.628707+00:00
[queued]> {taskinstance.py:826}
[2021-03-26 16:34:52,312] [DEBUG] Skipping SLA check for <DAG:
test_reschedule.up-for-reschedule> because no tasks in DAG have SLAs
{scheduler_job.py:1720}
[2021-03-26 16:34:52,323] [DEBUG] Processor for
/opt/airflow/git-sync/data-import/reschedule/deployment/airflow_dags/dag.py
finished {dag_processing.py:949}
[2021-03-26 16:34:52,324] [DEBUG] Waiting for <ForkProcess
name='DagFileProcessor3922-Process' pid=27542 parent=82 stopped exitcode=0>
{scheduler_job.py:309}
[2021-03-26 16:34:52,324] [DEBUG] No tasks to consider for execution.
{scheduler_job.py:933}
[2021-03-26 16:34:52,325] [DEBUG] 0 running task instances
{base_executor.py:147}
[2021-03-26 16:34:52,326] [DEBUG] 0 in queue {base_executor.py:148}
[2021-03-26 16:34:52,326] [DEBUG] 32 open slots {base_executor.py:149}
[2021-03-26 16:34:52,326] [DEBUG] Calling the <class
'airflow.executors.kubernetes_executor.KubernetesExecutor'> sync method
{base_executor.py:158}
[2021-03-26 16:34:52,326] [DEBUG] Syncing KubernetesExecutor
{kubernetes_executor.py:337}
[2021-03-26 16:34:52,326] [DEBUG] KubeJobWatcher alive, continuing
{kubernetes_executor.py:263}
```