[ https://issues.apache.org/jira/browse/AIRFLOW-182?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Ash Berlin-Taylor closed AIRFLOW-182.
-------------------------------------
    Resolution: Cannot Reproduce

Airflow 1.7 is now quite old. If this is still happening on the latest version, please open another issue and we'd be happy to help solve it.

> CLI command `airflow backfill` fails while CLI `airflow run` succeeds
> ---------------------------------------------------------------------
>
>                 Key: AIRFLOW-182
>                 URL: https://issues.apache.org/jira/browse/AIRFLOW-182
>             Project: Apache Airflow
>          Issue Type: Bug
>          Components: celery
>    Affects Versions: Airflow 1.7.0
>         Environment: Heroku Cedar 14, Heroku Redis as Celery broker
>            Reporter: Hariharan Mohanraj
>            Priority: Minor
>
> When I run the backfill command, I get an error claiming there is no DAG
> in my DAG folder named "unusual_prefix_dag1", although my DAG is actually
> named dag1. However, when I use the run command, the task is scheduled
> and runs flawlessly.
> {code}
> $ airflow backfill -t task1 -s 2016-05-01 -e 2016-05-07 dag1
> 2016-05-26T23:22:28.816908+00:00 app[worker.1]: [2016-05-26 23:22:28,816] {__init__.py:36} INFO - Using executor CeleryExecutor
> 2016-05-26T23:22:29.214006+00:00 app[worker.1]: Traceback (most recent call last):
> 2016-05-26T23:22:29.214083+00:00 app[worker.1]:   File "/app/.heroku/python/bin/airflow", line 15, in <module>
> 2016-05-26T23:22:29.214121+00:00 app[worker.1]:     args.func(args)
> 2016-05-26T23:22:29.214151+00:00 app[worker.1]:   File "/app/.heroku/python/lib/python2.7/site-packages/airflow/bin/cli.py", line 174, in run
> 2016-05-26T23:22:29.214207+00:00 app[worker.1]:     DagPickle).filter(DagPickle.id == args.pickle).first()
> 2016-05-26T23:22:29.214230+00:00 app[worker.1]:   File "/app/.heroku/python/lib/python2.7/site-packages/sqlalchemy/orm/query.py", line 2634, in first
> 2016-05-26T23:22:29.214616+00:00 app[worker.1]:     ret = list(self[0:1])
> 2016-05-26T23:22:29.214626+00:00 app[worker.1]:   File "/app/.heroku/python/lib/python2.7/site-packages/sqlalchemy/orm/query.py", line 2457, in __getitem__
> 2016-05-26T23:22:29.214984+00:00 app[worker.1]:     return list(res)
> 2016-05-26T23:22:29.214992+00:00 app[worker.1]:   File "/app/.heroku/python/lib/python2.7/site-packages/sqlalchemy/orm/loading.py", line 86, in instances
> 2016-05-26T23:22:29.215053+00:00 app[worker.1]:     util.raise_from_cause(err)
> 2016-05-26T23:22:29.215074+00:00 app[worker.1]:   File "/app/.heroku/python/lib/python2.7/site-packages/sqlalchemy/util/compat.py", line 200, in raise_from_cause
> 2016-05-26T23:22:29.215121+00:00 app[worker.1]:     reraise(type(exception), exception, tb=exc_tb, cause=cause)
> 2016-05-26T23:22:29.215142+00:00 app[worker.1]:   File "/app/.heroku/python/lib/python2.7/site-packages/sqlalchemy/orm/loading.py", line 71, in instances
> 2016-05-26T23:22:29.215175+00:00 app[worker.1]:     rows = [proc(row) for row in fetch]
> 2016-05-26T23:22:29.215200+00:00 app[worker.1]:   File "/app/.heroku/python/lib/python2.7/site-packages/sqlalchemy/orm/loading.py", line 428, in _instance
> 2016-05-26T23:22:29.215274+00:00 app[worker.1]:     loaded_instance, populate_existing, populators)
> 2016-05-26T23:22:29.215282+00:00 app[worker.1]:   File "/app/.heroku/python/lib/python2.7/site-packages/sqlalchemy/orm/loading.py", line 486, in _populate_full
> 2016-05-26T23:22:29.215369+00:00 app[worker.1]:     dict_[key] = getter(row)
> 2016-05-26T23:22:29.215406+00:00 app[worker.1]:   File "/app/.heroku/python/lib/python2.7/site-packages/sqlalchemy/sql/sqltypes.py", line 1253, in process
> 2016-05-26T23:22:29.215574+00:00 app[worker.1]:     return loads(value)
> 2016-05-26T23:22:29.215595+00:00 app[worker.1]:   File "/app/.heroku/python/lib/python2.7/site-packages/dill/dill.py", line 260, in loads
> 2016-05-26T23:22:29.215657+00:00 app[worker.1]:     return load(file)
> 2016-05-26T23:22:29.215678+00:00 app[worker.1]:   File "/app/.heroku/python/lib/python2.7/site-packages/dill/dill.py", line 250, in load
> 2016-05-26T23:22:29.215738+00:00 app[worker.1]:     obj = pik.load()
> 2016-05-26T23:22:29.215758+00:00 app[worker.1]:   File "/app/.heroku/python/lib/python2.7/pickle.py", line 858, in load
> 2016-05-26T23:22:29.215895+00:00 app[worker.1]:     dispatch[key](self)
> 2016-05-26T23:22:29.215902+00:00 app[worker.1]:   File "/app/.heroku/python/lib/python2.7/pickle.py", line 1090, in load_global
> 2016-05-26T23:22:29.216069+00:00 app[worker.1]:     klass = self.find_class(module, name)
> 2016-05-26T23:22:29.216077+00:00 app[worker.1]:   File "/app/.heroku/python/lib/python2.7/site-packages/dill/dill.py", line 406, in find_class
> 2016-05-26T23:22:29.216181+00:00 app[worker.1]:     return StockUnpickler.find_class(self, module, name)
> 2016-05-26T23:22:29.216190+00:00 app[worker.1]:   File "/app/.heroku/python/lib/python2.7/pickle.py", line 1124, in find_class
> 2016-05-26T23:22:29.216360+00:00 app[worker.1]:     __import__(module)
> 2016-05-26T23:22:29.216412+00:00 app[worker.1]: ImportError: No module named unusual_prefix_dag1
>
> # runs flawlessly:
> $ airflow run dag1 task1 2016-05-07
> {code}
>
> Apologies if the format is wrong or if I haven't provided enough information;
> this is the first time I've ever submitted an issue ... anywhere!

--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
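For readers who land on this ticket later: the `ImportError: No module named unusual_prefix_dag1` at the bottom of the traceback is the signature of unpickling an object whose class was defined in a module loaded under a synthetic name — the scheduler pickles the DAG after loading the file under a mangled module name, and a worker that never loaded the file the same way cannot resolve that name. A minimal, self-contained sketch of the failure mode in plain Python (the module name mirrors the traceback; the mangling itself is illustrative, not Airflow's actual loader code):

```python
import pickle
import sys
import types

# Simulate one process loading a DAG file under a mangled module name.
# "unusual_prefix_dag1" echoes the error above; the mangling here is a
# stand-in, not how Airflow actually builds the name.
mod = types.ModuleType("unusual_prefix_dag1")
exec("class Dag(object): pass", mod.__dict__)
sys.modules["unusual_prefix_dag1"] = mod

# pickle records the class by module + name, so the payload refers to
# "unusual_prefix_dag1.Dag" rather than to anything called "dag1".
payload = pickle.dumps(mod.Dag())

# A second process (the Celery worker) never imported the file under the
# mangled name, so the reference cannot be resolved at unpickle time.
del sys.modules["unusual_prefix_dag1"]
caught = None
try:
    pickle.loads(payload)
except ImportError as err:
    caught = err

print(caught)  # Python 3 reports: No module named 'unusual_prefix_dag1'
```

This would be consistent with `airflow run` succeeding here: without a pickle to load, the worker parses the DAG file itself instead of resolving a class reference recorded by another process.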