cccs-cat001 opened a new issue #7872: New dags fail to run
URL: https://github.com/apache/airflow/issues/7872
 
 
   
   **Apache Airflow version**: 1.10.9
   
   
   **Kubernetes version (if you are using kubernetes)** (use `kubectl version`): N/A
   
   **Environment**: [JupyterLab docker image](https://github.com/jupyter/docker-stacks/tree/master/pyspark-notebook), Ubuntu 18.04 VM
   
   - **Cloud provider or hardware configuration**: Microsoft Azure
   - **OS** (e.g. from /etc/os-release): Ubuntu 18.04
   - **Kernel** (e.g. `uname -a`): Linux <name> 5.0.0-1032-azure #34-Ubuntu SMP Mon Feb 10 19:37:25 UTC 2020 x86_64 x86_64 x86_64 GNU/Linux
   
   - **Install tools**: apache-airflow
   - **Others**:
   
   **What happened**:
   On a fresh install of Airflow, I run `airflow initdb` and then create a DAG file (bash.py):
   ```
   from datetime import datetime, timedelta

   from airflow.models import DAG
   from airflow.operators.bash_operator import BashOperator
   from airflow.operators.dummy_operator import DummyOperator

   start_date = datetime(2020, 1, 1)

   args = {'owner': 'cccs-cat001', 'start_date': start_date}

   dag = DAG(
       dag_id='date',
       default_args=args,
       schedule_interval=timedelta(minutes=5),
       dagrun_timeout=timedelta(minutes=60),
   )

   # One BashOperator fanning out to five DummyOperator tasks.
   run_dag = BashOperator(task_id='show_date', bash_command='date', dag=dag)

   for i in range(5):
       task = DummyOperator(task_id='dummy_' + str(i), dag=dag)
       task.set_upstream(run_dag)
   ```
   Then `airflow list_dags` shows that the DAG exists, but running `airflow trigger_dag date` fails with the following error:
   ```
   Traceback (most recent call last):
     File "/home/artifactory/.local/bin/airflow", line 37, in <module>
       args.func(args)
     File "/home/artifactory/.local/lib/python3.6/site-packages/airflow/utils/cli.py", line 75, in wrapper
       return f(*args, **kwargs)
     File "/home/artifactory/.local/lib/python3.6/site-packages/airflow/bin/cli.py", line 237, in trigger_dag
       execution_date=args.exec_date)
     File "/home/artifactory/.local/lib/python3.6/site-packages/airflow/api/client/local_client.py", line 34, in trigger_dag
       execution_date=execution_date)
     File "/home/artifactory/.local/lib/python3.6/site-packages/airflow/api/common/experimental/trigger_dag.py", line 124, in trigger_dag
       raise DagNotFound("Dag id {} not found in DagModel".format(dag_id))
   airflow.exceptions.DagNotFound: Dag id bash not found in DagModel
   ```
   So the DAG won't run. Re-run `airflow initdb` and then trigger the DAG again, and it works fine.
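   
   As a sanity check (an untested sketch of mine, not part of the original repro), querying the metadata DB directly should show whether the row that `trigger_dag` is looking for exists:
   ```
   # Hedged sketch: peek at the `dag` table to see whether a DagModel row for
   # this dag_id exists yet. Uses the dag_id from bash.py above.
   from airflow import settings
   from airflow.models import DagModel

   session = settings.Session()
   row = session.query(DagModel).filter(DagModel.dag_id == 'date').one_or_none()
   print(row)  # None until `airflow initdb` is re-run, a DagModel row afterwards
   session.close()
   ```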
   
   **What you expected to happen**: I should be able to trigger a new DAG without re-initializing the database.
   
   **How to reproduce it**: see above
   
   
   **Anything else we need to know**: The other version we're running is 1.10.5, and this issue doesn't seem to happen there. On 1.10.9, once you run `airflow initdb` after creating the DAG, that DAG triggers fine from then on, but adding another new DAG and triggering it hits the same error.
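   
   If it helps with triaging: since the failing lookup is against the DagModel table, an untested sketch of registering a freshly parsed DAG by hand (assuming `DAG.sync_to_db()` in 1.10.x is the call that writes that row; I haven't verified this) would look like:
   ```
   # Untested sketch: parse the dags folder and push the new DAG's metadata row
   # by hand instead of re-running `airflow initdb`. Assumes DAG.sync_to_db()
   # (present in 1.10.x as far as I can tell) creates the DagModel row that
   # trigger_dag looks up.
   from airflow.models import DagBag

   dagbag = DagBag()              # parses the configured dags folder
   dag = dagbag.get_dag('date')   # the dag_id from bash.py above
   dag.sync_to_db()               # write/update the row in the `dag` table
   ```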
   
   
