And I think I've found another issue.

With `store_dag_code` enabled, the scheduler throws the following error:

```
Traceback (most recent call last):
  File "/usr/local/lib/python3.7/multiprocessing/process.py", line
297, in _bootstrap
    self.run()
  File "/usr/local/lib/python3.7/multiprocessing/process.py", line 99, in run
    self._target(*self._args, **self._kwargs)
  File "/usr/local/lib/python3.7/site-packages/airflow/jobs/scheduler_job.py",
line 158, in _run_file_processor
    pickle_dags)
  File "/usr/local/lib/python3.7/site-packages/airflow/utils/db.py",
line 74, in wrapper
    return func(*args, **kwargs)
  File "/usr/local/lib/python3.7/site-packages/airflow/jobs/scheduler_job.py",
line 1582, in process_file
    dag.sync_to_db()
  File "/usr/local/lib/python3.7/site-packages/airflow/utils/db.py",
line 74, in wrapper
    return func(*args, **kwargs)
  File "/usr/local/lib/python3.7/site-packages/airflow/models/dag.py",
line 1519, in sync_to_db
    DagCode.bulk_sync_to_db([dag.fileloc for dag in orm_dag])
TypeError: 'DagModel' object is not iterable
```

This looks like it was caused by a mistake while backporting
https://github.com/apache/airflow/commit/e146518#diff-e5cbc8f771ec50ccb79ad8505f6f5697R1533
as
https://github.com/apache/airflow/commit/eb308e9#diff-e5cbc8f771ec50ccb79ad8505f6f5697R1519

In the 1.10 branch, the `orm_dag` variable is produced from a `DagModel`
query with `.first()`, which returns a single model instance, while on
master `orm_dags` is produced with `.all()`, which returns a list; hence
the "object is not iterable" error.
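
For illustration, here's a minimal stand-in (not actual Airflow code;
the `DagModel` below is a dummy class) showing why the comprehension
blows up on a single instance:

```python
class DagModel:
    """Dummy stand-in for Airflow's DagModel ORM class."""
    fileloc = "/dags/example_dag.py"

orm_dags = [DagModel()]  # what query(...).all() returns: a list
orm_dag = DagModel()     # what query(...).first() returns: one instance

# Fine on master -- iterating a list of model instances:
print([dag.fileloc for dag in orm_dags])

# Blows up on 1.10 -- a single DagModel instance is not iterable:
print([dag.fileloc for dag in orm_dag])
# TypeError: 'DagModel' object is not iterable
```

Presumably the fix on 1.10 is either to query with `.all()` as master
does, or to pass the single DAG's `fileloc` as `[orm_dag.fileloc]`; I
haven't checked which of the two the 1.10 code intends.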

Best,
Kostya
